Oracle Corporation (NYSE:ORCL) Q3 2023 Earnings Call Transcript

This is a cancer AI system. But we’re also doing wellness, heart disease, et cetera, down the road. Our platform runs AI very, very well because we create these clusters of GPUs that can attack big problems very quickly. We do it economically, then we build the applications on top of that. We provide the service to a lot of the startups in the AI world. This is one example of where we’re just way ahead of the other hyperscalers in terms of our network and our ability to do AI. Let me point out one last AI thing. The Oracle Autonomous Database doesn’t have any database administrators. It’s completely self-driving, because the thing driving it is an AI module that acts as the DBA.

We’ve replaced the DBAs with AI inside of our own cloud. The Oracle Autonomous Database actually runs all the databases inside the administrative part of our cloud: it keeps track of all of our users, our billing, our recovery data sets. All of that is now done using AI and our autonomous database. So we’re a huge consumer of AI, and we’re a huge vendor of AI in the form of GPU capacity and clustered capacity, and we build AI modules in health care. And people are coming to us. NVIDIA is often recommending us as the best cloud for AI, and this is a good time to be there.

Operator: Your next question comes from the line of Mark Murphy with JPMorgan.

Mark Murphy: Thank you, Larry. My question was very much related to that, but maybe from a slightly different angle. I’m wondering if you could drill into the opportunity that you see on the generative AI side. We’re repeatedly hearing that companies are running those kinds of models on OCI, and that NVIDIA is moving some of those workloads to the Oracle Cloud. The other concept is that these AI models are so data hungry, and you already have all the data contained in the Fusion applications. I am curious if that piece of it, the generative AI piece, is something that you see lining up as a growth driver that is material to the overall business.

Larry Ellison: The answer is absolutely yes. There’s actually more demand for AI processing than there is available capacity, and we’re the only ones, again, that can allocate that capacity dynamically. And by the way, we’re short. We are expanding as fast as we can. It’s an exciting opportunity, but it’s challenging when there’s more demand than supply. The difference with us is that our standard network allows us to group together these GPUs and have them attack these problems, whether it’s a medical diagnostic problem or a generative language problem, a la ChatGPT. So we have a lot of ISVs seeking us out, because not only do we have the most cost-effective solution, we can make that solution available to them very quickly because it runs on our standard network.

So we can create a cluster for them, they run their workload, and the moment their workload is through running, we can reallocate that cluster, or break that cluster up and allocate it to other users. The other guys can’t do that; they can’t do it dynamically.
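The allocate-run-reclaim cycle described here can be sketched in a few lines. This is purely an illustrative model of the idea, not Oracle's actual scheduler; the class name `GpuPool`, its methods, and the pool size of 512 GPUs are all hypothetical:

```python
# Hypothetical sketch of dynamic cluster allocation: a shared pool of GPUs
# is carved into a cluster for one workload, then returned to the pool the
# moment the workload finishes, making the capacity reusable by another
# tenant. All names and numbers here are illustrative assumptions.

class GpuPool:
    def __init__(self, total_gpus):
        self.free = total_gpus          # GPUs not currently assigned to anyone

    def allocate_cluster(self, size):
        """Carve `size` GPUs out of the pool, or fail if demand exceeds supply."""
        if size > self.free:
            raise RuntimeError("more demand than available capacity")
        self.free -= size
        return {"size": size}

    def release_cluster(self, cluster):
        """Break the cluster up; its GPUs go back to the shared pool."""
        self.free += cluster.pop("size")


pool = GpuPool(total_gpus=512)
job_a = pool.allocate_cluster(384)      # large training cluster for one tenant
# ... workload runs to completion ...
pool.release_cluster(job_a)             # reclaim the GPUs as soon as it's done
job_b = pool.allocate_cluster(512)      # the whole pool is now usable by another tenant
```

The point being illustrated is the release step: without it, capacity stays pinned to an idle tenant, which is the static allocation Ellison attributes to competitors.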

Operator: Your next question comes from the line of Derrick Wood with TD Cowen.