NetApp, Inc. (NASDAQ:NTAP) Q2 2024 Earnings Call Transcript

And then, of course, pricing discipline is in there as well. So, from a sequential perspective, I would rank them in the order that I did in my prepared remarks: mix shift, which is both product and capacity, then favorable COGS, and then pricing discipline. So hopefully, that helps.

Wamsi Mohan: Yes, thanks.

Operator: Thank you. And our next question comes from Nehal Chokshi with Northland Capital Markets. Please go ahead.

Nehal Chokshi: Great. Thanks for taking my question. You guys mentioned 30% growth in first-party storage services. What’s the NRR underlying that?

George Kurian: We don’t break those out. I think we feel very good about — we have broadened the number of workloads that we serve. We have broadened the number of hyperscalers, now with Google coming online. We have brought lower price points for Azure and Google, and we have brought higher price points for Amazon. So I feel really, really good about the momentum in our first-party cloud storage services.

Nehal Chokshi: Maybe phrasing it a different way: is the driver of that growth expansion or new lands?

George Kurian: A combination of both new customer adds and new workload use cases within existing customers, as well as expansions.

Nehal Chokshi: Great. And then a follow-up, quick question. George, you talked about how you had a $16 million win with the C-Series. And you mentioned four drivers behind that. Which of those four drivers was really probably the biggest element of that win?

George Kurian: Listen, I think we have a really strong operating system capability for performance and simplification at scale. Many of the other vendors that start simple run into real trouble when customers try to build a large enterprise environment, and we have a really good portfolio to do that. I think that was probably the number one reason. And the number two reason is, now that we have the C-Series, we have a price point to deliver to customers that we did not have before.

Nehal Chokshi: Awesome. Thank you.

Operator: Thank you. And our next question comes from Simon Leopold with Raymond James. Please go ahead.

Simon Leopold: Great. Thanks for taking the question. First, just a quick clarification. On the strategic review update, I just want to confirm: it sounds like you’ve concluded that review, apart from sort of regular business kind of reviews. I just want to confirm that that’s the case. And then really, the element I’m trying to sort of tease out here is you’ve taken out $55 million of ARR, so roughly $15 million of revenue, yet your outlook is higher. What is informing the higher outlook? What’s been the biggest surprise and the biggest delta contributing to the higher outlook? Thank you.

George Kurian: So first of all, let me hit that in three parts, right? First, we have concluded the strategic review. We have made a set of good decisions that we now need to go implement, which will result in a more focused cloud business and a healthier, albeit smaller, subscription base to build off of. We believe that these actions should allow us to get back to growth in fiscal year ’25 off of a healthier business mix in cloud. We will always do reviews of various aspects of our portfolio as an ongoing part of our business, but the focused strategic review, I would say, is mostly complete. I think the second is with regard to the confidence we have. Listen, we said that when we guided the second half of the year, we took up the overall guide by close to $100 million.

That is mostly based on the momentum of our all-flash Hybrid Cloud storage portfolio. We’ve raised the second half guide by substantially more than we beat in the second quarter. And it also accounts for the fact that we will have some headwinds through the rest of fiscal year ’24 in our cloud subscription business, which will only be partially offset by growth in our cloud consumption business.

Simon Leopold: Thank you.

Operator: Thank you. And our next question then comes from Aaron Rakers with Wells Fargo. Please go ahead.

Aaron Rakers: Yes. Thanks for taking the question. A lot of those have been asked and answered. But I wanted to go back to some prior discussion around this notion of AI. We hear a lot about AI large language models becoming smaller and being implemented more pervasively over time in traditional enterprise environments. We’ve even heard more about inferencing and how that might evolve in enterprises. I’m just curious, are you seeing any signs of that pulling either discussions or early signs of demand? And if so, is it a prerequisite that that has to pull all-flash storage with that kind of footprint? And the reason I ask is there’s a lot of discussion about a lot of this existing infrastructure having to be upgraded to support this acceleration of AI in infrastructure. Sorry for the long-winded question.

George Kurian: Yes. No problem. I’ll address that in three steps. I think the first is the use of smaller models as opposed to the very, very large models. Yes, the term is distillation. We do see that going on in customers, where, as they run these different models, they begin to realize that you can get as effective an outcome, with much faster results, from a smaller number of parameters. For example, the live demonstration that we showed at NetApp INSIGHT actually was a distillation: we started with a much larger LLM and brought it down to a much smaller number of parameters because you get the same benefit. So that’s going on. The second is with regard to the training environment, which is the part of the data life cycle in AI where you aggregate a data set and train the algorithm or the language model for better answers, to be able to either predict a good outcome or generate a relevant outcome.