Advanced Micro Devices, Inc. (NASDAQ:AMD) Q1 2024 Earnings Call Transcript

Lisa Su: Yes, so I think again, I think we’re pretty excited about the AI PC, both the opportunity in, let’s call it, the near term and even more so in the medium term. I think the client business is performing well, both on the channel and on the MNC side. We expect the client business to be up sequentially in the second quarter. And as we go into the second half of the year, to your question about units versus ASPs, I think we expect some increase in units as well as ASPs. The AI PC products, when we look at the Strix products, they’re really well suited for the premium segments of the market. And I think that’s where you’re going to see some of the AI PC content strongest in the beginning. And then as we go into 2025, you would see it more across the rest of the portfolio.

Ross Seymore: Thank you.

Lisa Su: Thanks, Ross.

Operator: Our next question is from Matt Ramsay with TD Cowen. Please proceed with your question.

Matt Ramsay: Yes, thank you very much. Good afternoon, everybody. Lisa, I have sort of a longer-term question and then a shorter-term follow-up. One of the questions that I’ve been getting from folks a lot is, obviously, your primary competitor has announced a multiyear road map, and we continue to hear more and more from other folks about internal ASIC programs at some of your primary customers, whether they be for inference or training. I guess it would be really helpful if you could talk to us about how your conversations go with those customers: how committed they are to your long-term road map, multigeneration, as you described it; how they juxtapose investing in their internal silicon versus using a merchant supplier like yourselves; and maybe what advantages the experience across a large footprint of customers can give your company that those doing internal ASICs might not get.

Lisa Su: Yes. Sure, Matt. Thanks for the question. So look, I think one of the things that we see, and we’ve said, is that the TAM for AI compute is growing extremely quickly. And we see that continuing to be the case in all conversations. We had highlighted a TAM of, let’s call it, $400 billion in 2027. I think some people thought that was aggressive at the time. But the overall need for AI compute, as we talk to customers, is very, very strong. And you’ve seen that in some of the announcements even recently with some of the largest cloud guys. From my view, there are several aspects of it. First of all, we have great relationships with all of sort of the top AI companies. And the idea there is we want to innovate together. When you look at these large language models and everything that you need for training and inferencing, there will be many solutions.

I don’t think there’s just one solution that will fit all. The GPU is still the preferred architecture, especially as the algorithms and the models are continuing to evolve over time. And that favors our architecture and also our ability to really optimize CPU with GPU. So from my standpoint, I think we’re very happy with the partnerships that we have. I think this is a huge opportunity for all of us to really innovate together. And we see that there’s a very strong commitment to working together over multiple years going forward. And that’s, I think, a testament to some of the work that we’ve done in the past, and that very much is what happened with the EPYC road map as well.

Matt Ramsay: I appreciate that, Lisa. As my follow-up, a little bit shorter term. And I guess having followed the company super closely for a long time, I think there’s always been noise in the system, whether the stock price is $2 a share or $200; there’s been kind of always consistent noise one way or the other. But the last month and a half has been extreme in that sense. And so I wanted to just — I get random reports in my inbox about changes in demand from some of your MI300 customers, or planned demand for consuming your product. I think you answered earlier about the supply situation and how you’re working with your partners there. But has there been any change from the customers that you’re in ramp with now, or that you soon will be, in terms of their intentions for demand? Or in fact, has that maybe strengthened rather than gone down in recent periods, because I keep getting questions about it? Thanks.

Lisa Su: Sure, Matt. Look, I think I might have said it earlier, but maybe I’ll repeat it again. I think the demand side is actually really strong. And what we see with our customers, and what we are tracking very closely, is customers moving from, let’s call it, initial POCs to pilots to full-scale production to deployment across multiple workloads. And we’re moving through that sequence very well. I feel very good about the deployments and ramps that we have ongoing right now. And I also feel very good about new customers who are sort of earlier on in that process. So from a demand standpoint, we continue to build backlog, as well as build engagements going forward. And similarly, from a supply standpoint, we’re continuing to build supply momentum. And from a speed-of-ramp standpoint, I’m actually really pleased with the progress.

Matt Ramsay: All right, thank you very much.

Operator: Thank you. Our next question is from Aaron Rakers with Wells Fargo. Please proceed with your question.

Aaron Rakers: Yes, thanks for taking the question, and I apologize if I missed this earlier, but I know last quarter you talked about having secured enough capacity to support significant upside to the ramp of the MI300. I know that you’ve upped your guide now to $4 billion. I’m curious how you would characterize the supply relative to that context offered last quarter, as we think about that new target. Would you characterize it as still having supply capacity upside potential? Thank you.

Lisa Su: Yes, Aaron. So we’ve said before that our goal is to ensure that we have supply that exceeds the current guidance, and that is true. So as we’ve upped our guidance from $3.5 billion to $4 billion, we still have supply visibility significantly beyond that.

Aaron Rakers: Yes. Okay, thank you. And then as a quick follow-up, going back to an earlier question on server demand, more traditional server. As you see the ramp of maybe share opportunities in more traditional enterprise, I’m curious how you would characterize the growth that you expect to see in the more traditional server CPU market as we move through ‘24, or even longer term, how you’d characterize that growth trend?