Oracle further raises the bar on cloud for enterprise AI: in its most recent market communications, the company indicated that cloud infrastructure revenues could reach approximately $144 billion by fiscal year 2030, pushing the stock significantly higher and contributing to an increase of roughly $100 billion in the net worth of co-founder Larry Ellison.
In this context, the new contract metrics and the AI-related pipeline have captured investors' attention, as reported by Reuters and in Oracle's investor materials (the official Oracle Investor Release of September 9, 2025), and framed within the infrastructure investment scenario highlighted by McKinsey's industry analysis.
According to data collected from Oracle's Investor Relations and quarterly notes (Q1 FY2026, statement of September 9, 2025), RPOs were reported at approximately $455 billion, an increase of 359% year-over-year. The industry analysts we follow note that this combination of contractual backlog and capacity agreements with model providers and hyperscalers creates unprecedented medium-term revenue visibility for the AI cloud sector. In our comparative analyses, the dynamics of capacity contracts and delivery timelines emerge as determining factors in the conversion of RPOs into actual revenue.
In brief (updated as of September 11, 2025)
- RPO (remaining performance obligations) at approximately $455 billion, up 359% year-over-year.
- Cloud infrastructure revenue: from approximately $10 billion in the last fiscal year to $144 billion by FY2030 (internal projection).
- Stock up approximately 36% on the announcement; Ellison's net worth has increased by over $100 billion, according to Reuters estimates.
- AI Clients/Partners mentioned: OpenAI, xAI, Meta, Nvidia, AMD.
Key Figures Reported by Oracle
In its latest quarterly report, Oracle highlighted the growth of Remaining Performance Obligations (RPO) to about $455 billion, a year-over-year increase of 359%. RPO represents contracted revenue not yet recognized in the income statement, distributed across multi-year contracts. The figure therefore provides a snapshot of committed demand over the medium term.
At the same time, the company shared forecasts indicating that cloud infrastructure revenues could grow from approximately $10 billion in the last fiscal year to $144 billion by fiscal year 2030, driven by growing demand for computing power for generative AI and by capacity contracts signed with foundation model developers and large enterprises. That said, the trajectory remains tied to hardware availability and delivery timelines.
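As a rough sanity check on these two headline figures, the short sketch below derives the prior-year RPO base implied by a 359% increase and the compound annual growth rate implied by the $10 billion to $144 billion trajectory; the five-year horizon is our assumption based on the FY2030 target, not a figure from the release.

```python
# Rough checks on the two headline figures (illustrative arithmetic only).

# 1) Prior-year RPO base implied by a 359% year-over-year increase.
rpo_now = 455.0                 # reported RPO, $ billions
yoy_growth = 3.59               # 359% increase, expressed as a fraction
rpo_prior = rpo_now / (1 + yoy_growth)
print(f"Implied prior-year RPO: ~${rpo_prior:.0f}B")    # ~$99B

# 2) Growth rate implied by the cloud infrastructure projection.
# Assumption (not from the release): the ~$10B baseline is the last full
# fiscal year and the $144B target sits roughly five fiscal years later.
base, target, years = 10.0, 144.0, 5
cagr = (target / base) ** (1 / years) - 1
print(f"Implied CAGR to FY2030: ~{cagr:.0%}")           # ~70% per year
```

The arithmetic only shows the scale of the bet: sustaining a compound growth rate on the order of 70% per year for five years is the scenario against which the risks listed later should be read.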
Why These Numbers Matter for Enterprise AI Cloud
- Revenue visibility: a high RPO makes future cash flows on multi-year contracts more predictable.
- Access to accelerators: AI workloads concentrate on infrastructure equipped with latest-generation GPUs and accelerators.
- Scale effect: increased capacity allows for training and inference on larger models, offering significant advantages for businesses.
- Competitive repositioning: the valuations and capital expenditures (capex) of other providers might adjust to the new pace of demand.
Impact on Market and Valuations
The communications triggered a stock rally of about 36%, while Larry Ellison's net worth has recently grown by over $100 billion, as highlighted by Reuters. Oracle also announced agreements with some of the leaders in the AI sector, including OpenAI, xAI, and Meta, and has strengthened technological collaborations with Nvidia and AMD for the supply of accelerators. These partnerships are intertwined with the company's capacity expansion plans.
What are RPOs and how to read them
Remaining Performance Obligations represent the amount of future revenue from services agreed upon in signed contracts but not yet recognized. A portion is expected within 12 months, while the rest extends beyond that horizon. These figures do not equate to guaranteed revenue, as they may be subject to change due to terminations, rescheduling, or variations in delivery times. For an accurate reading, it is therefore advisable to consider the average contract duration, the service mix, and the activation rate of the purchased capacity.
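To make these reading criteria concrete, the following minimal sketch shows how an analyst might translate an RPO backlog into an indicative revenue run rate; the near-term share, average contract duration, and activation rate are purely hypothetical placeholders, not disclosed figures.

```python
# Illustrative conversion of an RPO backlog into an indicative revenue run rate.
# All parameters are hypothetical placeholders; Oracle has not disclosed this
# breakdown, so the point is the mechanics, not the specific numbers.

rpo_total = 455.0          # total RPO, $ billions
short_term_share = 0.25    # assumed share expected to convert within 12 months
avg_contract_years = 6     # assumed average duration of the longer-dated backlog
activation_rate = 0.9      # assumed share of contracted capacity actually consumed

# Indicative revenue conversion over the next 12 months.
near_term = rpo_total * short_term_share * activation_rate

# Average yearly conversion of the longer-dated portion, spread evenly.
long_term = rpo_total * (1 - short_term_share) * activation_rate / avg_contract_years

print(f"Indicative 12-month conversion: ~${near_term:.0f}B")
print(f"Indicative yearly conversion of the rest: ~${long_term:.0f}B")
```

The takeaway is the sensitivity rather than the specific figures: the same $455 billion backlog implies very different revenue profiles depending on the duration and activation assumptions.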
Corporate Strategies to Accelerate AI on Cloud
- Hybrid architectures to balance costs, latency, and data localization.
- Partnerships with providers that guarantee priority access to AI hardware and high-performance networking.
- Development of internal MLOps skills to effectively orchestrate model training, deployment, and monitoring.
- Negotiation of flexible contracts offering scalability and SLAs suited to production needs.
- Implementation of strict data governance in terms of security, privacy, and compliance, especially in regulated sectors.
Growth Scenarios and Investments in Cloud AI Infrastructure
The outlined scenario requires massive investments in data centers, networking, energy, and cooling, as well as an accelerator supply chain that remains under pressure. Factors such as energy efficiency, reliability, and the availability of "dedicated" computing capacity will be decisive in the competition for enterprise customers. Operational sustainability and deployment times also remain key variables.
From Experimentation to Production: Operational Effects
- Large-scale training with reserved compute windows and predictable costs (an illustrative cost sketch follows this list).
- Implementation of production services with controlled latency and measurable service level objectives (SLOs).
- Certified integrations with existing enterprise stacks and volume discounts on consumption.
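As an illustration of the first point, the sketch below compares a hypothetical reserved-capacity commitment with on-demand pricing to show why reserved windows make costs more predictable; the hourly rate and discount are invented for the example and do not reflect any provider's price list.

```python
# Hypothetical comparison of reserved training capacity vs. on-demand usage.
# The hourly rate and discount are invented for illustration; they are not
# actual provider prices.

gpu_hours_needed = 200_000    # assumed GPU-hours for a large training run
on_demand_rate = 4.0          # assumed $ per GPU-hour, on demand
reserved_discount = 0.40      # assumed discount for a committed capacity window

on_demand_cost = gpu_hours_needed * on_demand_rate
reserved_cost = on_demand_cost * (1 - reserved_discount)

print(f"On-demand estimate: ${on_demand_cost:,.0f}")       # $800,000
print(f"Reserved-window estimate: ${reserved_cost:,.0f}")  # $480,000
```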
Strategic Context and Role of Larry Ellison
Larry Ellison continues to serve as chairman and chief technology officer of Oracle. In recent years, the company has strengthened its presence in key sectors, such as healthcare with the acquisition of Cerner, and has maintained buyback programs and shareholder return policies that have supported its valuation. Against this backdrop, the recent appreciation of the stock is directly reflected in the net worth of its executives.
Trends in Cloud for Enterprise AI in the Next 24 Months
- Increase in dedicated capacity for high-intensity training and inference.
- Greater vertical integration between cloud providers and model suppliers.
- More long-term contracts with minimum commitments, and rising deferred revenue.
- Development of vertical solutions for regulated sectors and “sovereign” cloud.
- Improvement in efficiency and cost optimization for large-scale AI implementations.
Risks and Variables to Monitor
- Availability of chips and delivery times of GPUs/accelerators.
- Cost of energy and infrastructure constraints of data centers.
- Evolution of regulation on data and AI security in key markets.
- Competition among hyperscalers and the risk of technological lock-in.
- Execution of migration programs and transition to production adoption.
Methodological note: the $144 billion projection for cloud infrastructure revenue is an internal estimate communicated by Oracle (Q1 FY2026 release, September 9, 2025), based on assumptions about utilization rates, pricing, and the mix between training and inference services. RPO includes components expected to be recognized within 12 months and beyond, although the detailed breakdown has not been disclosed in the available documents.
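For readers who want to reproduce the logic of such a projection, the sketch below outlines a simple capacity-based revenue model of the kind described in the note; every parameter is a hypothetical assumption rather than a figure disclosed by Oracle.

```python
# Minimal capacity-based revenue model along the lines described in the note.
# Every input is a hypothetical assumption; Oracle's actual model is not public.

def projected_revenue(capacity_gpu_hours: float,
                      utilization: float,
                      training_share: float,
                      training_price: float,
                      inference_price: float) -> float:
    """Return annual revenue in $ billions from sold GPU-hours."""
    used_hours = capacity_gpu_hours * utilization
    training_rev = used_hours * training_share * training_price
    inference_rev = used_hours * (1 - training_share) * inference_price
    return (training_rev + inference_rev) / 1e9  # dollars -> $ billions

# Example with invented parameters.
revenue = projected_revenue(
    capacity_gpu_hours=50e9,   # assumed annual GPU-hours of installed capacity
    utilization=0.8,           # assumed average utilization rate
    training_share=0.6,        # assumed share of hours devoted to training
    training_price=2.5,        # assumed $ per GPU-hour for training workloads
    inference_price=1.8,       # assumed $ per GPU-hour for inference workloads
)
print(f"Projected annual revenue: ~${revenue:.0f}B")   # ~$89B
```

A model of this kind makes the sensitivities explicit: small changes in utilization or in the training/inference price mix move the projected figure substantially, which is why hardware availability and delivery timelines remain the variables to watch.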