Equinix, Inc. (NASDAQ:EQIX) Q4 2023 Earnings Call Transcript

Charles Meyers: Thanks, David. So on private AI, I do think there are strong similarities and some differences between private AI and what we think about as private cloud or hybrid cloud. But I think the dynamic is quite similar. And in fact, I was looking at an industry survey that was recently given to me that showed, based on their discussion with respondents that were implementing Gen AI, that about 32% of those firms were doing that in public clouds exclusively. About 32% were doing it in private cloud exclusively. And about 36% were doing it in a hybrid between some private cloud and some public cloud. And of the folks who were doing it in public, many of them were doing it in more than one public cloud. And so that dynamic is sort of how we saw cloud at large play out over the last several years.

And I think the end state is going to be that that 36% is going to be a much bigger number. In other words, a much larger number of people are going to be prosecuting their AI agenda through both public and private infrastructure. But I would say, when we talk about where private AI happens, a lot of it is really focused on where people want to place their data. It is sometimes about the proprietary nature of that data and controlling it, etcetera; it is sometimes about the cost of moving data in and out of public clouds; and it is sometimes about other factors, including performance. And so with private AI, what we're seeing is people saying, look, I want to maintain my control over my enterprise data, and I want to place it somewhere that is cloud adjacent, because the hyperscalers are innovating at such a rapid rate that I want to use their models, their tools. And then you have this broader ecosystem outside of just the hyperscalers that is also evolving and that people want to connect to.

And so cloud-adjacent storage, Equinix Fabric and Fabric Cloud Router are incredible tools, and then you mix that with the colo opportunity, where they might need to place GPU infrastructure and that kind of thing, and that's really what we see as the essence of the private AI opportunity. And it does, I think, seem to be taking shape in a way that's really positive for us. And then, go ahead on the second piece; I know that we had that question before, David, and we figured that one might be coming.

Keith Taylor: Yes, David. So as it relates to some new metrics, we're continuing to review the data sets. The team and I are not yet clear exactly what needs to be presented that we can comfortably put out to the market on a consistent basis. But one of the things we're thinking about, just to give you a sense, and we're not ready for prime time yet, is looking at density, or a threshold: to the extent that there's a certain amount of density over some required threshold, we modify the cabinet count. Again, as you all know, we report on a cabinet-equivalent basis. So that's what we're thinking about, because we think the cabinet is probably the best representation for you to get a sense of how we're utilizing the asset. That all said, I think we have to continue to be quite transparent about the overall density of the cabinets sold so that you can see sort of a trend line.

We spent some energy thinking about power as a metric; it just doesn't feel like the right one, to be sure, given the nature of our business model relative to others. Again, as you know, we're a retail player, and it's just a different type of metric, and we're not sure that it is a valuable one. So looking forward, we're going to continue to work it, and we'll absolutely be ready to go, I think, sometime in the first half of this year with either adjusted metrics or a different view on how we're going to represent our fill rates.

Operator: Our next question is from Michael Elias with TD Cowen.

Michael Elias: Great. Two, if I may. One of the questions we get from investors is whether GPU-based compute is disintermediating CPU-based compute, and if so, how legacy data centers designed at lower cabinet densities will be able to handle that. Are you seeing customers swapping CPUs for GPUs in their existing data center deployments? If so, how do you essentially mitigate against obsolescence in existing facilities? That's the first question. And then the second question is along a similar vein, for AI inference. The thought is that the models need to sit proximate to the data, which candidly lives within your facilities, although I think there's also a question of whether that's CPU-based or GPU-based. As you look to capture demand for inference, how is the standard data center design for you guys evolving, from both a power density perspective and a cooling architecture standpoint? Any color there would be helpful.

Charles Meyers: There's a lot there; all things that are obviously topics of discussion in various places around Equinix. Thanks for the question, Michael. I do think that, look, GPUs are something much more special purpose, dedicated compute that goes beyond the traditional CPU realm. I mean, I think that is a very, very clear trend. That said, I don't think it's a world where all things compute and all things AI are necessarily done by GPUs. And I think there is going to be a range of players that continue to evolve on the compute side of things to provide chips that meet various needs and various purposes in the AI realm. And so, in terms of AI, what we aren't seeing is this massive shift from CPU to GPU.

What we're typically seeing is people adopting GPUs in parallel. And I think that even some things that are currently GPU-centric, we think, over time, may actually be well served by either current or future generations of CPU. And so we're not seeing that as a big obsolescence trend. And that relates a little bit to the second part of your question, I think both on inference and training, because I would say that the evolution of the data center design needs to respond to both of those things. I would say the more acute near-term evolution is on the training side, because it's substantially more power dense and does require, I think, different thinking around that power density and the cooling to support it. And so I think the much higher average density design that we would probably put forward in an xScale build-out would be the more acute representation of the near-term change.

On the inference side, and I think broadly on the retail side, we are seeing power densities rise, but at a slower rate. And as long as we have access to a chilled water loop, our ability to get liquid cooling into the facility to support high-density implementations is quite high. In fact, we announced that we can do that in a large number of markets around the world. So I think we're in a good position. I don't think we face a situation where we're going to have meaningful obsolescence, even of our significantly more dated assets, especially as we can implement liquid cooling inside of those facilities. But those are things we continue to track, and I do think they're going to have to be very top of mind for us.

And the overall pace of change in our design is probably going to be greater in this next decade than it was in the one prior, for sure.

Operator: The next caller is Matt Niknam with Deutsche Bank.

Matt Niknam: I will keep it brief; it's two follow-ups. Number one, what guides your expectation for churn to, I guess, improve slightly in the second half? Is there anything you're seeing in terms of visibility, or anything guiding that expectation for improvement? And then secondly, in terms of the macro, you talked about some, I guess, deal slippage in 4Q and dynamics that resemble maybe 1Q of '23. We're now, I guess, halfway into 1Q; have those deals closed? Are they still out there? Just any color from what you see in the first six weeks of this year.

Charles Meyers: Yes. Let me take that one first, Matt. Some of that business closed very quickly, immediately after the quarter, and that's just sort of a natural turn of events. Some of it is in our commit for Q1, and some of it has rolled into Q2 or quarters forward from that. So very little was lost. We did lose some of it, but very little, and so it was really primarily pushed forward. Again, as I said, Q4 did unfortunately look a little more like Q1; we would have preferred it to look a lot more like Q2 and Q3. But that is the dynamic, and it's hard to fully predict. But again, the sentiment we hear from customers is: yes, we have tighter budgets.

Yes, we're continuing to optimize. But boy, we sure are committed to what we're doing on the digital side of things. And yes, we want to talk to you about what we're doing in AI. And yes, we want to figure out where to place our data. But I think those things take a little time to translate into a firm bookings trajectory. And then as to why we feel a comfort level around churn moderating, we do have good visibility into our pipeline; in fact, we've gotten very good at forecasting our large deal churn. We saw a little bit more midsized churn in Q4, which contributed some to the elevation, so I think we have to keep our eye very closely on that, and I don't have a ton more to offer you on that particular view. But we do think it is realistic, based on what we're hearing in terms of appetite from customers, that we would see some abatement in churn in the second half.

I'll add one more comment. It used to be, I would say, in '21 and most of '22, that there was a scarcity mindset relative to data center capacity. '23 really changed that pretty meaningfully. The macro conditions changed, and this sort of desire to tighten budgets, the desire to kind of offset the impact of PPI, I think all played into what we were hearing, which was this very different appetite and a higher degree of optimization. That said, I think we're seeing the front end of some of our customers who have at least talked to us about turning back some capacity sort of coming back and saying, yes, don't put that back on the market yet, because we're not sure we want to give up capacity in this market. And so that's the first time, I think, in a while that we've heard that kind of mindset.

It's typically from larger service providers, but I think we're starting to see the front end of that. And so again, if the macro does what we think it will do, in which case we would probably see some improving interest rates over the course of the year, I think we would see a generally improved macro environment, and I think that's sort of informing our guidance.

Operator: Our next caller is Richard Choe with JPMorgan.

Richard Choe: I wanted to follow up on the competitive environment. Are you seeing deals go to competitors or…