Oracle Corporation (NYSE:ORCL) Q3 2026 Earnings Call Transcript

March 10, 2026

Oracle Corporation beats earnings expectations. Reported EPS is $1.79, expectations were $1.70.

Operator: Hello, and thank you for standing by. My name is Regina, and I will be your conference operator today. At this time, I would like to welcome everyone to the Oracle Corporation Third Quarter Fiscal Year 2026 Earnings Conference Call. All lines have been placed on mute to prevent any background noise. After the speakers’ remarks, there will be a question-and-answer session. If you would like to ask a question during this time, simply press star then the number 1 on your telephone keypad. To withdraw your question, press star 1 again. We kindly ask that you please limit yourself to one question. I would now like to turn the conference over to Ken Bond, Head of Investor Relations. Please go ahead.

Ken Bond: Thank you, Regina, and good afternoon, everyone. Welcome to Oracle Corporation’s third quarter fiscal year 2026 earnings conference call. On the call today are Chairman and Chief Technology Officer, Lawrence Ellison; Chief Executive Officer, Clay Magouyrk; Chief Executive Officer, Mike Sicilia; and Principal Financial Officer, Doug Kehring. A copy of the press release and financial tables, which includes supplemental financial details on our most recent quarter, guidance for our future results, a GAAP to non-GAAP reconciliation, and a selected list of customers who purchased Oracle Cloud Services or went live on Oracle Cloud recently, will be available from our investor relations website. As a reminder, today’s discussion will include forward-looking statements, and we will discuss some important factors relating to our business.

These forward-looking statements are also subject to risks and uncertainties that may cause actual results to differ materially from the statements being made today. As a result, we caution you against placing undue reliance on these forward-looking statements, and we encourage you to review our most recent reports, including our 10-K and 10-Q filings and any applicable amendments. Finally, we are not obligating ourselves to revise our results or these forward-looking statements in light of new information or future events. Before we go to the Q&A portion of the call, we will begin with a few prepared remarks. I will now turn the call over to Doug.

Doug Kehring: Thanks, Ken. Let me start by highlighting the changes we are making to our earnings press release and this call. In the press release, we have laid out clearly and explicitly the supplemental financial metrics that we otherwise would have provided on the earnings call, so that each of you has the information in writing and in advance. Then, as it relates to our approach to the earnings call itself, I will be very brief and then turn it over to Mike and Clay to provide more substantial thoughts on our business, after which, all of us, including Larry, will be available to take questions. In terms of the results for Q3, we had a tremendous quarter that exceeded expectations across the board. Our momentum continues to accelerate, with Q3 being the first quarter in over 15 years where both organic total revenue and organic non-GAAP EPS grew at 20% or better in USD.

As we highlighted in the press release, I will quickly mention a couple of things and then hand the call over to our CEOs. First, in January, TikTok US completed the separation of its US data operations from ByteDance into an independent company in which Oracle Corporation now holds a 15% equity stake along with a seat on the board. In terms of impact to our financials, there is no impact to the revenue related to the services we have been providing as their technology vendor; that revenue is continuing. As to the equity investment, we will be accounting for it under the equity method, and we will recognize our share of the new company’s earnings for the period from the close of the investment in late January to March 31 in our Q4 results, as there is a two-month reporting period time lag.

It will be recorded as nonoperating income or loss on our income statement and is incremental and additive to our financials. Second, in February, we announced our intent to raise up to $50 billion in debt and equity financing along with the statement that we do not expect to issue any additional bonds beyond this amount in calendar year 2026. Within days of the announcement, we raised $30 billion through a combination of investment-grade bonds and mandatory convertible preferred stock, with a record order book that was substantially oversubscribed. As noted in our release, we have not yet initiated the at-the-market equity portion of the financing program. Finally, I would be remiss not to remind everyone we are reporting our financial results just 10 days after the last day of the quarter, despite the increasing size and complexity of our business.

Using Oracle Fusion, we continue to close and file our financial results faster than any other company in the S&P 500, providing us with a significant strategic advantage as well as an opportunity to help our Fusion customers do the same with their businesses. With that, let me now turn the call over to Mike.

Mike Sicilia: Thanks, Doug. And as Doug just detailed, we really had an excellent quarter across the board. We continue to see strong execution, so let me say a few words about our applications business. Oracle Corporation has the fastest growing, most complete suite of cloud applications in the market, full stop. Our SaaS solutions are industry-complete platforms—highly scalable, trusted, secure, and regulatory-compliant systems and processes; our customers trust us to run the systems that run their businesses. In constant currency, cloud applications revenue was up 11% in the quarter, reaching an annualized run rate of $16.1 billion. Within that, Fusion ERP was up 14%, Fusion SCM up 15%, Fusion HCM up 15%, Fusion CX up 6%, and NetSuite up 11%.

Industry SaaS solutions for hospitality, construction, retail, banking, restaurants, local governments, and telecommunications, combined, were up 19%. So, certainly, very happy with the applications growth in the quarter. In that context, I will say a few words about the reported SaaS apocalypse. You have all heard the thesis, or theory, that new companies coding quickly using AI will spell the end of SaaS. I do not agree with that at all. I do think that AI tools and their coding capabilities would be a threat if we were not adopting them, but we are—and very rapidly. Oracle Corporation is using the best AI coding tools and the best developers not only to accelerate our SaaS business, but to deliver solutions that enable entire ecosystems across numerous industries.

The use of AI coding tools inside Oracle Corporation is enabling smaller engineering teams to deliver more complete solutions to our customers more quickly. We are building brand new SaaS products using AI and also embedding AI agents right into our existing applications and suites. Embracing AI with small engineering teams, we have just built three brand new CX applications: lead generation and qualification, sales orchestration and automated selling, and our new website generator. In fact, we just used the website generator to build and launch the new oracle.com. We have built these new CX products to help our customers sell, not simply to administer a forecast or generate email opens. These are three products that salesforce.com does not have.

And, of course, salesforce.com also does not have OCI, the AI data platform, Fusion ERP, and complete industry suites. AI-powered end-to-end ecosystem automation platforms are quite unique to Oracle Corporation. In addition to that, we have already delivered well over 1,000 agents right inside our horizontal back-office and industry applications. This does not even include the agents that our customers are building themselves or the fleet of agents that we are using internally. These are AI features built right into our applications and existing processes. And a great example, I think, is in health care: our brand-new AI-powered EHR—electronic health record—system is live in the market and the results are quite clear. We are reducing administrative overhead, we are allowing clinicians to see more patients, we are improving access to care, and we are increasing provider satisfaction.

In another example, in banking, we provide a comprehensive AI-powered SaaS platform spanning commercial banking, retail banking, investment banking, anti-money laundering, financial crimes and compliance, payments, supply chain financing, CX, ERP, and HCM. That banking suite alone contains hundreds of embedded AI agents, all available at no additional cost to our customers. In retail, our AI-enabled solutions span merchandising, assortment planning, supply chain management, point of sale, commerce, and, of course, ERP, CX, and HCM. In summary, these are not systems that can be replaced by a small collection of features cobbled together and bolted on in the name of AI. So, yes, some smaller or single-focused SaaS players may well be disrupted.

But Oracle Corporation will not be among them. Now let me focus on a few key wins in Q3 in the application space—and, to be clear, this is a very short list, not an exhaustive one. Memorial Hermann Health System selected Fusion ERP, SCM, and HCM. This was a win over Workday. University of New South Wales also selected Fusion ERP and HCM—also a win over Workday. Gray Media selected Fusion EPM and ERP—a win again over Workday and also over SAP. Investec Bank selected Fusion EPM and ERP over SAP. HID Global Corporation also selected Fusion ERP and SCM over SAP. Ethiopian Shipping and Logistics Services Enterprise selected Fusion ERP, SCM, and HCM—again, over SAP. A major Wall Street bank elected to standardize on Fusion ERP for the entirety of their business, all of their business units, replacing SAP, full stop.


Loudoun County Public Schools selected Fusion ERP, EPM, HCM, and SCM. The JM Smucker Company selected Fusion ERP and EPM. Westfield Insurance picked Fusion ERP, EPM, HCM, and Procurement. Mitsubishi UFJ Financial Group, an existing cloud customer and database customer, is now moving into both our Fusion ERP and industry SaaS applications. Zain KSA Kuwait, an existing major tech customer, is moving EBS to the cloud to support their growth. So that is just a very small list of major applications wins in the quarter. We had over 2,000 customers go live in Q3. When you think about our industry applications and our Fusion applications put together, over 2,000 went live, and, more importantly, we continue to see the median time to go-live decrease.

A very small sample of go-lives in the quarter: Hearst expanded their ERP with EPM as well as HCM. JM Huber Company is now live across Fusion ERP and SCM. Emirates Health Services went live with HCM, which enabled a comprehensive HR, payroll, and talent suite to elevate their workforce management. Niagara Bottling went live on SCM, moving from on-premises ERP to Fusion. Seadrill is now live across ERP, HCM, SCM, and EPM. Again, with 2,000 go-lives in the quarter, that is just a very, very short list, but you can see, hopefully, not only momentum, but multi-pillar momentum with these customers. I also have an equally short list of key tech wins in Q3. Lockheed Martin selected OCI high-performance compute to scale AI across their environments efficiently.

Rhombus selected OCI Compute, Networking, and Storage for AI video and security across all of their workloads. Lucid Motors selected OCI core services for data and connectivity in order to expand into European markets. Infomart in Japan selected OCI for their mission-critical B2B platform. Claro Brazil selected OCI Alloy for Sovereign AI. Air France-KLM was a multicloud win featuring the Oracle Database, delivering a 13x performance improvement at a significantly lower cost. Activision Blizzard, an existing Oracle E-Business Suite customer, was also an Oracle Database at Azure win. Oracle Corporation’s embrace of AI across our strategic applications is leading to broader enterprise conversations with our customers involving our full stack: OCI, AI Data Platform, Fusion applications, and industry suites.

These conversations are about ecosystem automation. They are not about single apps. They are about automating the entire ecosystem, and they are further enabled by our simplified go-to-market model, which we spoke about on our last earnings call. This is allowing us to close more multiproduct deals with more customers, combining the power of the Oracle Database, our OCI platform, our AI tooling, and our complete applications suites. In constant currency, cloud applications deferred revenue was up 14% versus in-quarter cloud applications revenue growth of 11%, which further supports our acceleration thesis. Clay, I will turn it over to you.

Clay Magouyrk: Thank you, Mike. Okay. So I am going to talk about two segments of our business: our multicloud database and AI infrastructure. Both are growing extremely quickly. Multicloud database revenue grew 531% year over year. AI infrastructure revenue grew 243% year over year. Both also have demand that exceeds supply and a clear execution plan from Oracle Corporation that will rapidly turn that demand into profitable recurring revenue. Oracle Database has run on any hardware and operating system for decades. Until recently, Oracle Database cloud services were only available in a single cloud: OCI. We created our multicloud partnerships with first Microsoft, then Google, and finally Amazon to bring the best database platform to all clouds.

Those partnerships unlock an enormous backlog of demand—our database customers who want to use our database in other clouds. This quarter, we achieved an important milestone: we have global region coverage in all of our partner clouds. We now have 33 regions live with Microsoft and 14 live with Google. We delivered significant growth with AWS, beginning Q3 with two AWS regions live and exiting Q3 with eight AWS regions live; we will exit Q4 with 22 AWS regions live. AI is also accelerating the adoption of our database cloud services. The rapid improvement in model coding skills and agentic abilities pushes customers to move their most valuable data into our cloud services. They need access to the latest AI features: support for vector embeddings, MCP server access, and advanced security controls.

Customers also need their data to be colocated with the agents themselves, and our multicloud database makes that easy. Our multicloud architecture brings the best of Oracle Cloud to our partner regions. This ensures that we will rapidly turn billions of dollars of pipeline into highly profitable database service revenue. Demand for AI infrastructure, both GPU and CPU, continues to exceed supply. This is directly visible in our $553 billion RPO. I want to share a model for how that RPO turns into profitable recurring revenue as well as some operational metrics that are early indicators of our progress. AI infrastructure begins with data centers and power generation. Through our partners, we have secured more than 10 gigawatts of power and data center capacity coming online over the next three years.

Those infrastructure investments also need funding, and greater than 90% of that capacity is fully funded through our partners, with the remainder planned to finish this month. Once the data center is secured, several things must come together. The data center and on-site power generation have to be constructed. Compute, networking, and storage have to be designed, manufactured, delivered, and installed. All the capacity inside the data center also has to be funded. We continue to innovate across each of these steps. We optimize our data center construction through standardized design. Our supply chain has improved with more suppliers and deeper relationships. We have tripled our manufacturing sites and increased rack output by 4x all in the last year.

We have scaled our installation processes to enable multiple phases of delivery in parallel. Time from rack delivery to revenue has been reduced in the past several months. We also continue to innovate on our business models. On our last earnings call, I shared multiple ideas for how we can incrementally grow our AI infrastructure without Oracle Corporation raising more debt or issuing equity. We have signed more than $29 billion of contracts since then, across multiple customers using that new model. A combination of bring-your-own-hardware and upfront customer payments enables us to continue expanding without any negative cash flow impact to Oracle Corporation. Of course, this $29 billion is in addition to other deals we signed this quarter.

Ultimately, all of this results in capacity delivered to customers and revenue to Oracle Corporation. In Q3, we delivered more than 400 megawatts to customers. 90% of that committed capacity was delivered on or ahead of schedule, as we have consistently done over several quarters. This is why customers continue to choose Oracle Corporation for their infrastructure needs. Investing in AI infrastructure is capital intensive, but our operating model is optimized to ensure profitability. Flexible infrastructure design, high utilization, and handover, combined with a diversified customer base, create an incredible business. Increased scale spreads our fixed costs over a larger base, increasing profitability. It is unprecedented to scale a capital-intensive business so quickly while also increasing profitability.

Looking at the AI capacity we delivered in Q3, our gross margin for that remained above our 30% guidance, at 32%. Now combine that with the other segments of OCI, which have much higher margins, like our database services, and you can see why Oracle Corporation is growing so quickly and profitably. Our numbers speak for themselves. We are overdelivering on FY26 revenue and earnings, and we are consistently raising our FY27 forecast. This is made possible by Oracle Corporation’s transition from a predominantly seasonal license business into a highly predictable recurring revenue business. Demand for AI and advanced compute will continue to expand broadly across the economy. There will be many successful models, agentic platforms, and businesses that emerge.

We support hundreds of the most advanced AI customers today, and more continually want to work with us. We build infrastructure that is flexible, fungible, and can support the smallest workloads up to the largest. We continually offer the latest in accelerators, from the most recent NVIDIA and AMD options to emerging designs from companies like Cerebras and PowerCharm. Altogether, we are confident that the investments we make now in data centers, compute capacity, and customer relationships will only grow more valuable with time. Back to Ken for questions.

Ken Bond: Thank you, Clay. Regina, if you could please poll the audience for questions.

Q&A Session


Operator: We will now begin the question-and-answer session. To ask a question, press star then the number 1 on your telephone keypad. We ask that you please limit your questions to one. Our first question will come from the line of John DiFucci with Guggenheim. Please go ahead.

John DiFucci: Thank you. Wow. A lot going on here. So, listen, I am going to let others ask about the AI infrastructure question, but we have heard Doug talk about a halo effect that the AI infrastructure business is having on the rest of your business. This quarter was strong, and you said that the RPO increase was from large-scale AI contracts. At the same time, we are hearing from the field now that that halo effect is actually turning into business. Outside of AI infrastructure, it sounds like the go-lives are steady, but the business activity and especially the pipeline are up materially for more traditional cloud workloads, including, you know, Dedicated Region, sovereign clouds, even Alloy deals, which we hear you are starting to see, in addition to what Mike started talking about with the often-related apps deals.

I realize these types of deals are not the scale of these AI deals. But can you talk about what seems to be an underlying momentum building in these businesses? Am I right to be thinking of this? And if I could, on a sort of related topic, can you give us any visibility into CapEx for fiscal 2027?

Mike Sicilia: Okay, John. This is Mike. I will take the question. So, yes, we absolutely are seeing a halo effect. And let me add a little bit of color on that. As far as the apps business, the fact that we are training so many models on OCI, provisioned in close proximity to our applications, allows us to embed very high-quality AI services right into our applications as features. So not only are we serving the model vendors for training, but we are also embedding a lot of the output right into our application cores. We are doing prompt engineering and things like that to make it relevant to the business. But the fact that we are the custodian, in our applications business, of so much of the world’s mission-critical data, and that we have very close proximity to these models—putting those two things together allows customers to get value from AI very, very quickly.

And if you have heard any criticism of AI in the world, it is, “well, you cannot get value quickly enough.” Well, actually, when you bundle it up as a service and expose the private data that we are the custodian of in the applications to AI, we have seen terrific wins. I mentioned some of the verticals you heard about there, but I think that is true across the board. The other piece that is a very interesting halo effect is leveraging our infrastructure—just OCI infrastructure—as a budget creator for customers. You have heard us say it before: we are faster and cheaper than everybody else. And when customers are thinking about these large-scale application or large-scale infrastructure transformations, we can also help them get to a position of budget creation to be able to fund that transformation simply by moving their workloads to OCI, because we can run them more quickly, more efficiently, and less expensively than our competitors.

And then, finally, the other halo effect, before I turn it over to Doug for your question on CapEx, is around Sovereign AI. Our sovereign story is not new, and it is not a knee-jerk reaction to things that are happening in the world. Combined with our Alloy story, we are really seeing increasing pipeline across the world. We are so differentiated in our form factor: we can deliver not just a smaller form factor, but complete OCI services on top of that form factor no matter how many racks are involved, whether it is three racks or 500 racks. We think that is a huge differentiator in the market. So you put apps together, you put OCI AI services together, you put sovereignty together, and yes, it is a pretty big halo effect.

Doug Kehring: Yeah, and, John, let me start by acknowledging the creativity in getting two questions in at the same time. That is always fascinating to watch. So on CapEx, I think we will get back to everyone after the end of the fiscal year and talk about next year’s CapEx at that point in time. But I will state a couple of things. Obviously, from what Clay has gone through, the most interesting thing that you should start thinking about is the uncoupling of CapEx from capital requirements at Oracle Corporation. Obviously, when we have these additional funding mechanisms, there may be additional CapEx, but it does not require out-of-pocket cash from Oracle Corporation, which is quite interesting. Underlying that is that we remain committed to what we talked about last quarter, which is maintaining the investment-grade rating at Oracle Corporation as well as staying within the financing envelope that we talked about, of which we have announced $50 billion this calendar year.

So more to come, John, on the CapEx after next quarter.

John DiFucci: Very much appreciate the color on that, Doug. And, Mike, your prepared remarks on AI and how Oracle approaches it—everybody should use that because it is a logical approach. So thanks, and nice job.

Operator: Our next question will come from the line of Mark Murphy with JPMorgan. Please go ahead.

Mark Murphy: Thank you. Congrats on the acceleration. Clay, as Oracle Corporation transitions to higher levels of AI inferencing, what do you view as the right strategy for trying to optimize the location of your data centers? For instance, if you have these huge centralized data centers in Texas and Wyoming, they are very close to power, but they are pretty far from the population centers and the fiber routes that are out there on the seaboard. So it crosses our minds that the users and the devices are a long distance away. So as you make a move more into inferencing, are you seeing any reason to try to pivot those locations a little closer to where the users and the traffic are?

Clay Magouyrk: Sure. Great question, Mark. So let me start by highlighting our perspective on inferencing and then how that impacts data center deployment. First thing I would say is, for a while, there was a lot of training going on. Inferencing is now growing very rapidly, everywhere and anywhere. I think it is because of higher and higher utilization of the models themselves and also new use cases—as anyone who has been using Claude or Codex recently in the software space knows. These are incredible tools. They are changing how we do everything. So inferencing is going to have a huge amount of demand. Now, you talk about data center location—you mentioned latency as the reason. Realistically, there are several reasons you might care about the location.

It might be for the cost. It might be overall availability. It might be for sovereignty. So there are different reasons to pick a location. But to home in on your point about latency, the thing to understand is that latency is all proportional. Meaning, if what you are trying to do is a very low-latency trade on the stock market, waiting for the 100 millisecond round trip from coast to coast is a bad idea. If what you are doing is asking a question of your business that is going to take an AI model several seconds to think about, an extra 40 milliseconds of latency from New York to Wyoming is not going to hurt you. And so when you actually talk to customers about use cases where they need lower latency, the latency problem right now is not actually the location of the hardware, it is the type of hardware that is being deployed.

And that is why you are seeing so much innovation going on around these AI accelerators. If you look at what Groq does, or Cerebras, or Positron—all of these different types of companies are asking, well, not only how do we reduce the cost of inferencing, but also how can we significantly reduce the latency of it? And I think, if you look forward to GTC from NVIDIA next week, you will see an announcement from them. But across the board, I think the way that, as an industry, we are going to consolidate and reduce latency has to first start with a different architecture for that inferencing. And, thankfully, data center location is actually a very tiny part of that. So it makes it much more flexible for us to go out and put data centers where power is abundant, land is plentiful, and we can actually optimize for what is available to meet this ever-increasing demand.

Mark Murphy: Thank you very much.

Operator: Our next question comes from the line of Siti Panigrahi with Mizuho. Please go ahead.

Siti Panigrahi: Great. Thanks for taking my question. I want to ask about the opportunity with your AI Database and AI Data Platform. So with the recent excitement around AI and enterprises now adopting tools from frontier LLMs, what are you hearing from customers about training on their private data and building their private LLMs? And how confident are you in seeing the inflection in your AI Database growth that you talked about at the Analyst Day in October?

Clay Magouyrk: Yeah. Thanks. So, look, I think there are two parts to that question. One is how much adoption we are seeing of private LLMs, and then how much we are seeing of using AI with private data. I think in the early days, a lot of people thought that most customers would be doing very specific training of their own large language model. I think that has largely proven to not be the case. Instead, what I think is incredibly popular and growing in popularity is people taking the best models and wanting to combine that in a private way with their private data. And we are seeing a lot of demand for that. If you listen to Mike earlier talk about how we are embedding these AI models into our applications, that is one use case.

But, obviously, not everything, unfortunately, runs inside of an Oracle Corporation application, and lots of custom applications are written. So we added a lot of functionality to our Oracle AI Database to make it easy to connect—whether it be through MCP servers or natural language SQL—so that you can put these models to work against your data. But, also, we have our AI Data Platform product. This is really about solving this exact problem. You have a lot of data—it may be application data, it may be custom data in different data lakes and lakehouses, it may be data in a structured database. All of that together gives you an agentic platform to quickly build applications on, as well as access to all of the greatest models from multiple providers. Across the stack, we are seeing a lot of momentum.

And that is why, in my prepared remarks, I talked about the growth that we are seeing with our multicloud database. What we see is that for customers to take advantage of the latest and greatest AI, they first have to be in the cloud. There is still a lot of data that is not in the cloud. And so we see acceleration of moving that most important private data to cloud environments so they can then take advantage of the latest and greatest AI with that data.

Siti Panigrahi: Great. Thanks for the color.

Operator: Our next question comes from the line of Mark Moerdler with Sanford Bernstein. Please go ahead.

Mark Moerdler: Congratulations on what is a really good quarter. Really great work. I am going to change over a little bit and discuss the financial side. Now that you have completed your major debt raise, can you explain, given the blend of the cost of building out the AI data centers and the cost of raising capital to fund them, how comfortable are you with the value you are creating from the AI data center business itself? And then as an adjacency, if you do not mind, can you talk a little bit more about the Sovereign Cloud? Can you discuss how you parlay the AI data center business into being the AI provider for sovereign clouds, and how that should impact the value of that work to Oracle Corporation? Thanks.

Clay Magouyrk: Sure. I think we are going to split this one up. I will take the first half, and then I am going to throw it to Mike to talk about some of the Sovereign Cloud stuff. So, look, when you think about the overall profitability of these AI data centers, there are two pieces. One is how profitable it is purely on the accelerators themselves. We gave guidance in the past that we see gross margin in the 30% to 40% range on that. That continues to hold for us. And we continue to get better and better at running these data centers and delivering them more cheaply—the amount of cost in networking and hardware spend as well as power—we see that continuing to incrementally improve. So we are very pleased with that. The other thing to understand is that in these AI data centers, whether it be for inferencing or for training work, AI accelerators are not the only thing being procured.

There is a lot of general-purpose compute. There is a lot of storage, whether high-performance or large-scale blob storage. There is load balancing. There are identity and security products, and so on. Typically, on the order of 10% to 20% of the total spend ends up going to adjacent services. And when you factor that in, with the higher margins those services carry depending on the mix, overall profitability continues to improve. And that is without taking into account, as I mentioned earlier, our multicloud database business, which is a much higher-margin business—more in the 60% to 80% range—and growing very, very rapidly. So when you combine all of these pieces together, the overall margin profile of OCI continues to strengthen and grows rapidly.
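As a purely illustrative sketch of how these pieces blend, the arithmetic below combines the margin ranges cited above under hypothetical revenue-mix weights. The weights and the margin assumed for adjacent services are assumptions for illustration only; the actual mix is not disclosed on the call.

```python
# Blended gross-margin sketch using the ranges cited on the call.
# The revenue-mix weights and the 55% adjacent-services margin are
# hypothetical assumptions for illustration, not disclosed figures.

def blended_margin(components):
    """components: list of (revenue_share, gross_margin) pairs; shares sum to 1."""
    total_share = sum(share for share, _ in components)
    assert abs(total_share - 1.0) < 1e-9, "revenue shares must sum to 1"
    return sum(share * margin for share, margin in components)

mix = [
    (0.80, 0.35),  # AI accelerator capacity at ~35% (midpoint of 30%-40%)
    (0.15, 0.55),  # adjacent services (compute, storage, networking): assumed
    (0.05, 0.70),  # multicloud database at ~70% (midpoint of 60%-80%)
]
print(f"blended gross margin: {blended_margin(mix):.1%}")
```

Under these assumed weights the blend lands near 40%, above the accelerator-only range, which is the direction of the point being made: higher-margin adjacent services and database revenue pull the overall OCI margin up.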

The thing I would say—the question that I think underlies this and that maybe people do not understand—is that the limitation on profitability is not the capacity we have already delivered. Let us say I am building a data center with four data halls, and I deliver the first data hall. That one is profitable. The reason we are not even more profitable right now, even as we continue to grow EPS, is that we have so much under construction at one time, and we carry some expenses for those projects. Now, we are really good at that. We are very, very good at minimizing the time that construction takes, and very, very good at reducing those costs during that period. But they are not zero.

And so as our business is going through this hypergrowth phase, that is the only drag on profitability. But, thankfully, we are very good and getting better at delivering that capacity. That capacity, when we deliver it, is all already contracted for at a very profitable rate. So when you combine those things together, we are extremely confident in both the capacity we delivered and the continuing increase in profitability of our AI business.

Clay Magouyrk: Mike, want to talk about sovereignty?

Mike Cecilia: Yeah. So sovereignty, as I mentioned earlier, I think we are very well positioned. A year ago, sovereignty was about data sovereignty, and there were some faux solutions in the market where the primary data was sovereign but the backup was somewhere else, maybe in another country. Of course, that is no longer acceptable. Sovereignty is about sovereign data, sovereign operations, and even sovereign contracting. Our Alloy model is perfectly positioned to deliver on all three of those things. And by delivering full-stack solutions—again, the big difference between what we are doing with sovereignty and what some of our competitors are doing—we are not simply putting in an edge sovereign zone.

We are putting in full-stack OCI, which has all of our OCI services and, as you mentioned on margin mix, also allows us to run our entire applications suite and our AI Data Platform in that sovereign zone. Of course, the margins on some of those differ from our infrastructure margins. So I think we are in a unique position to deliver everything Oracle Corporation has in a sovereign zone, and that sovereign zone can be as small or as large as a customer wants it to be. The other piece is that we have full flexibility as to where we draw the line of sovereignty. We often think about sovereignty along country borders, but we have also been talking with enterprise customers who operate across multiple countries, say in Europe or in Africa, and who want a sovereign zone that they control and operate in their own data centers, serving customers in a particular vertical industry such as health care or retail.

Their sovereign zone is drawn in their Alloy across those countries. We can accommodate all of that. We have the most flexibility—we think we have the most flexibility in contract and the most flexibility in delivering—and, again, the most important thing is that we deliver all that Oracle Corporation has in these sovereign zones. It is not a subset. It is not a few edge devices. It is all of OCI.

Mark Moerdler: Extremely helpful, both the answers. I much appreciate it, and congrats again.

Operator: Our next question will come from the line of Raimo Lenschow with Barclays. Please go ahead.

Raimo Lenschow: Perfect. Thank you. Congrats from me as well. I wanted to ask about something we struggle a lot with when we talk to investors, and that is the theme of SaaS and application software: "Is AI going to kill it?" I just wanted to hear what you are hearing when you talk with customers. Is that just one of these investor things, or is it getting discussed on the customer side as well? And how do you explain it? I am thinking that what you do is largely deterministic rather than probabilistic, so that might be the explanation, but I just wanted to hear your perspective. Thank you.

Mike Cecilia: This is Mike. I will take the question. Among the customers I have spoken with, I have not yet met one who tells me they are ready to give away their retail merchandising system, their core banking system, their demand deposit account systems, or their electronic health records systems, or that some cobbling together of niche AI features is going to replace all of that overnight. In fact, you hear quite the opposite from customers. What they are asking is: how can we consume as much AI out of the box as you are putting into your applications across the board, and how can we get that up and live as quickly as we can? Because we think that is the best way to actually realize value. These systems we are running at Oracle Corporation are, as you know, highly complex and mission-critical, with decades of industry experience and decades of regulatory compliance, and they are the systems our customers use to run their business, their government agency, their health care organization, whatever the case is.

I really like our position here. As I said, we are leaning very heavily into AI ourselves. We have a thousand AI agents already live in Fusion. Our banking suite alone has hundreds of AI agents just inside our banking solution. So, yes, we think AI is disruptive—we do—but we think we are the disruptor because we are actually embedding the AI right into our applications, full stop, again at no additional cost. These are features that come in the application suite as part of quarterly upgrades, as part of a regular cadence. So I am actually—rather than thinking that AI spells the death of SaaS, at least for Oracle Corporation—I think it actually helps our SaaS position and helps us get to market even more quickly. We are thrilled with the results that we have and expect to have a lot more color on this as we go forward.

Raimo Lenschow: Okay. Thank you.

Operator: Our final question will come from the line of Brad Zelnick with Deutsche Bank. Please go ahead.

Brad Zelnick: Great. Thank you very much, and I will echo my congrats and also just say that the messaging is very, very clear and very helpful. My question is for Mike and perhaps Larry, and it extends on what Raimo asked. You have introduced AI Agent Studio inside of Fusion, and we all know that the crown jewels within an enterprise live inside of Oracle Database and Oracle apps. But I am curious, how do you see Oracle Corporation’s role evolving in a world where many other players are vying to be the AI interaction layer across multiple different enterprise systems and workflows?

Mike Cecilia: So, Brad, it is Mike. I will start. Look, I think data gravity matters here, and I think mission-critical data gravity matters even more. So, as we said, we have announced the AI Agent Studio inside of Fusion. Fusion is a system inside our customers that is the custodian of their operational data, their mission-critical data. If you are going to build a bunch of AI agents—or your system integrator is going to build a bunch of AI agents—the question I would have is where would you start? Well, you would start inside the system of record. You would start inside the system of gravity because that is the data, from an inferencing standpoint and from a retrieval-augmented generation standpoint, that is going to be highly relevant and highly specific and add a bunch of context to AI.

Now, the AI Agent Studio that we have released in Fusion is not specific to Fusion data alone. You can build AI agents across our industry applications and across third-party applications, and third parties can build AI agents in there. So the fact that we are delivering an all-in-one solution—full-scale, AI-powered SaaS applications—and giving you the ability to create your own AI agents either on top of that or alongside it, on a standard platform upgraded on a quarterly release schedule, is going to be quite attractive, I think. Because this AI Agent Studio we built in Fusion is part of our quarterly upgrades and part of our regular security patching. So you are getting, we think, the best of both worlds. You are getting packaged SaaS applications.

You are getting an agent studio which is very, very close to the most mission-critical, germane data that the enterprise possesses, and you are getting the ability to create your own custom bespoke agents if you would like to as well.

Lawrence Ellison: I will just end with: we provide a bunch of prebuilt agents for all of our applications. But in addition, we provide a development environment—the AI Data Platform development environment—that allows our customers to easily add their own agents to what we built. We do not think we can build all the application agents for a banking system or all the application agents for a health care system. A lot of our partners are going to do that. A lot of our customers are going to do that. What the AI Data Platform does is it provides a complete integrated development environment where you can build your own agents using any AI model that is in the Oracle Cloud. And that is basically all of the popular AI models.

You can use it for coding the agent. You can use it to do multistep reasoning for queries. In our Fusion accounting system, for example, we plan to have a complex agent that does something called the close. So when you close your books with Fusion in the not-too-distant future, it will be an autonomous agent, with no human beings involved. You will close your books simply by telling the AI agent to go ahead and close the books, and then you will get your results. We provide a lot of AI capability built into our applications, but they are open. They are open so that our customers and our partners can add to that portfolio of agents, and we are building an entire ecosystem that automates health care, automates financial services, automates retail.

That is what AI is allowing us to do: expand the scope of the SaaS suites we are building to automate entire ecosystems. Let me talk about health care. In health care, Epic automates hospitals—acute care hospitals—and, in some cases, clinics, but primarily acute care hospitals. We automate acute care hospitals. We automate clinics. We automate laboratories. We automate the payers, the people who actually pay the bills. We automate the insurance companies. We automate the HCM system that trains their nurses and schedules their radiologists, so the right radiologist is available when an MRI is given. We automate the hospital's financials. We automate the FDA and the regulators that approve the latest drugs, and we deal with the pharmaceutical companies.

That is the health care ecosystem. It is enormous. And thank God we have these coding tools now that allow us to build a comprehensive set of software—agent-based software—to automate an ecosystem like health care or financial services. That is what we are doing at Oracle Corporation. That is why we think we are a disruptor. That is why we think the SaaS apocalypse applies to others, but not to us.

Brad Zelnick: Really great stuff. Thank you, Larry. Thanks, Mike, and congrats.

Ken Bond: Thank you, Brad. A telephone replay of the conference call will be available for 24 hours on our investor relations website. Thank you for joining us today. And with that, I will turn the call back to Regina for closing.

Operator: This will conclude today’s call. Thank you all for joining. You may now disconnect.
