TE Connectivity Ltd. (NYSE:TEL) Q2 2024 Earnings Call Transcript

And that’s what we get excited about: that destocking is over. Destocking has masked some of our content growth as we worked through it, but I also think that as we work through this year and get to ’25, it’ll be a nice tailwind that it’s finally behind us.

Operator: Your next question is from the line of Amit Daryanani with Evercore Partners.

Amit Daryanani: I guess, Terrence, I’d love to get your perspective. AI is clearly a very big focus for everyone, and you just talked about the fairly strong revenue trajectory you expect going forward. Can you maybe spend some time talking to us about what exactly you’re providing when it comes to AI solutions, any sense of where the opportunities are right now across processor companies like NVIDIA versus cloud providers that might be running their own infrastructure, and what the conversion to revenue looks like as you go forward from here? Thank you.

Terrence Curtin: Yes, and I made some comments on this, Amit, so I do appreciate the question. The one thing that’s a little bit different from what we talked about today in the prepared comments is that we typically talk to you about design win momentum, and that’s continued, but we did give you more highlights about where revenue goes from here. And we told you all year, as we were seeing the ramp in AI, that we were going to do about $200 million of revenue this year in AI applications, and certainly 60% of that would be in the back half. That really hasn’t changed. But when we look at the design momentum that we have, and also the expectation of what we’re hearing from our customers, like I said on the call, we expect that $200 million to essentially double next year to $400 million, and we actually see a path that could get up to $1 billion a year a few years after that.

So, we’re actually seeing the traction with the design wins. Our teams and the investments we’re making to ramp it, both in engineering and operations, are in place to drive it. And like you said, when you have to serve this area, there’s an ecosystem here. And that ecosystem, when you look at our engagements, they’re with hyperscale customers, some of whom are developing their own AI solutions. We also have to work closely with semiconductor companies, including both the processor companies and the other semi players that make acceleration chips and other silicon solutions. So, we have to play with everybody in that ecosystem, and our teams are doing a nice job. And then, just as important, as you work with them, is how you get on the reference designs that become ready-to-deploy offerings, which allow further cloud customer deployments.

And our sales are across the entire ecosystem, not with just one player. So, I like the breadth that we have, and that’s really driving the momentum that we have. From a product perspective, it starts with the socket that’s right up against the GPU. You can have things that are on the board, and then you also get into things like the cable backplane, where it’s very important to make sure you don’t add latency and you keep the high speed going, so that the cluster can really crank at the speeds they need to run the LLM. So, net-net, it’s pretty broad in the products where we play, and in many cases it’s what we did in the cloud, moved up to the next level of performance in this application.

Operator: Your next question is from the line of Wamsi Mohan with Bank of America.

Wamsi Mohan: Terrence, appreciate all the comments on the prior question around AI. If I could just ask a quick clarification on that: how are you thinking about your share in these high-speed, low-latency applications? And for my question, I was wondering if you could comment a little bit beyond fiscal ’24. It seems like destocking is coming to an end, your orders are bouncing back a fair amount, and you mentioned stability in some of the end markets as well. Any early thoughts on how fiscal ’25 is shaping up from a growth and margin perspective?

Terrence Curtin: Yes. So, when you think about share in the AI applications, it would be similar to the share we had in cloud applications. So, I think when you look at that, and we talked about it a lot, the momentum we had when we went through the cloud build during COVID and the momentum we had across a broad set of customers, I think you can expect a similar share view for the AI applications, once again being broad. When you look at 2025, I guess the first thing is it will be nice not to talk about destocking. That has been a headwind, and I know we’re not the only ones who dealt with it, but it is something that is turning, you’re starting to see it, and that will help us. So I do think that will turn from a headwind to, honestly, a tailwind as we get rid of it.

And what’s nice is we’re going to start seeing that in communications already later this year, though I do think we have to wait toward the end of our fiscal year to get there with the industrial equipment business. The other thing as we look at 2025: if you just start with transportation, you’re going to continue to have electric vehicles, the broad category that we put everything in, continue to grow around the world, and I think you can continue to look at 4% to 6% growth on top of that. I wouldn’t say we view auto production as booming in ’25, but I think you’re going to continue to see auto below peak, where we are today, and then we’re going to get the content benefit on top of it that I think we’ve proven to you.