Adobe Inc. (NASDAQ:ADBE) Q1 2024 Earnings Call Transcript

Page 2 of 2

And as the community starts looking at places to work, if I’m a PhD that wants to go work in a particular environment, I start to ask myself which environment do I want to pick. And a lot of people want to do AI in a responsible way. And that has been a very, very good opportunity for us to bring in amazing talent. So we are investing. We do believe that we have the best — one of the best, if not the best, research labs around imaging, around video, around audio, around 3D, and we’re going to continue to attract that talent very quickly. We’ve already talked about how we have the broadest set of creative models for imaging, for vector, for design, for audio, for 3D, for video, for fonts and text effects. And so this gives us a broad surface area to bring people in.

And that momentum that starts with people coming in has been great. The second part of this too is managing access to GPUs while maintaining our margins. We’ve been able to sort of manage our cost structure in a way that brings in this talent and gives them the necessary GPUs to do their best work.

Anil Chakravarthy: And regarding the sales positions in Enterprise. In Enterprise, we’re in a strong position because this area of customer experience management remains a clear imperative for enterprise customers. Everybody is investing in personalization at scale and the content supply chain. These help drive both growth and profitability. So when you look at these areas from an enterprise perspective, these are a must-have, not a nice-to-have. And that’s helping us really attract the right kind of talent. We just onboarded this week a VP of Sales who has prior experience, and a lot of experience, at Cisco and Salesforce, etc. So that’s an example of how we’re really bringing on some excellent enterprise sales talent.

David Wadhwani: And I don’t know, Jay, if you were asking for building your model, or if you were looking for a job? But if you’re interested in any of these positions, let us know.

Jay Vleeschhouwer: More the former.

David Wadhwani: Okay.

Jonathan Vaas: Hey, operator, we’re coming up on the hour. Let’s try to squeeze in two more questions. Thanks.

Operator: Thank you. We will take our next question from Kash Rangan with Goldman Sachs.

Kash Rangan: Hey, thank you very much. Looks like there is more trust in AI in Excel models than what you’re actually saying qualitatively on this call. I just wanted to give you an opportunity to debunk this hypothesis that is going around: that AI is generating videos and pictures today, but the next step is it’s going to do the actual editing and displace Premiere Pro use or whatnot. So that is probably the existential threat that people are debating. So why don’t we take a shot at why that scenario is very unlikely: that right now it’s about generation of images, and then your tools pick up where the generation stops and do the processing, right? So help us understand why your tools can coexist with AI. That’s a philosophical question.

And Dan, one for you. Besides the net new ARR that you’ve already reported on Creative and DM, what are the other indicators, such as new business bookings that you don’t necessarily quantify, that you qualitatively saw in Q1 that make you feel good about the year? Thank you so much.

David Wadhwani: Great. So maybe I’ll take your first part, and Dan obviously, can take the second. So as it relates to generated content, I’m going to sort of break it up into two parts. One is around the tooling and how you create the content, and the second is around automation associated with the content. I think if you take a step back before we even get into either one of those, there is no question that there’s a huge appetite because of personalization at scale, the need to engage users, the need to build your personal brand online. That content is going to explode in terms of the amount of content being created, and it’s going to explode because of one of two things. The first is around the ability to create audio clips, video clips, images, vectors.

These are things that get users started. It’s great for ideation. You’ll see in a few weeks some of the incredible work the team has been doing around ideation on Firefly.com. And once those things are created, they do flow into our tools for the production work and process. We’re clearly seeing a huge benefit from that because the more content that gets created, the more editing that’s required, and that’s what’s driving more commercial CC subscribers this quarter than any other Q1 before. So that’s the foundation of it. The second part of this, though, from an editing perspective, is the controllability and editability of not just pixels and vectors and timelines, but also the editability of the latent space itself. The latent space being the core model capabilities that actually generate the output.

We have a lot of research that we’ve already started releasing, the first of which was Style Match, and you’ll start to see more and more of that coming out at Summit and beyond. But we are, in my mind, very, very clearly the leader in terms of creating models that can also be tooled on top of. So that combination of the models getting better and the controllability of those models puts us in a remarkable position. So we benefit from that. The second part, Kash, is around automation. So as people are generating more content, you clearly need to be able to automate how that content is created. And that’s really where Firefly Services comes in. First, it’s built on the strength of our Firefly models, safe for commercial use and integrated into our tools. But it also adds the ability to have custom models, so you can control what kinds of information, brand, and content they’re trained on for both brand styles and product replicas. And it’s also part of an ecosystem of API services: not just generation, which is a core part of it, like Text to Image, Generative Fill, or Generative Expand, but also processing.

So once you generate some images through APIs and automation, you want to be able to remove the background, you want to be able to blur the depth, you want to auto tone, you want to apply actions to that image. And then the last is you want to be able to assemble that for delivery. Firefly Services don’t just generate something; they let you have that entire ecosystem, and then you can embed that using low-code, no-code environments into your flows, and we are already embedding it into GenStudio and all of the capabilities that we’re shipping. So I think the core part of this is that as more of this content gets created, you need more tooling. The best models are going to be the models that are safe to use and have control built in from the ground up.

And I think we have the best controls of anyone in the industry, and they need to be able to be done in an automated fashion that can embed into your workflow. So I think all three of those vectors point to benefits for Adobe.

Kash Rangan: That’s very convincing.

Dan Durn: And then as we think about the forward-looking picture, a couple of points I would turn to. Just think about cash flow in Q1. The strength of our cash flow, once you normalize for the $1 billion termination payment, is up 28% year-over-year. When you think about RPO, a 3-point acceleration sequentially, and when I break that up into deferred revenue and unbilled backlog, you saw that acceleration in each of those subcomponents, which underscores both the near-term strength of the business and the longer-term momentum of the business. When I think about individual product commentary, we talked about it a lot on this call. You see record commercial subscriptions in the Creative business.

In Q1, you see engagement going up on the products. Usage of Firefly capabilities in Photoshop was at an all-time high in Q1. You see Express exports more than doubling with the introduction of Express Mobile, in beta now and going to GA in the coming months. AI Assistant in Acrobat, same fact pattern. You can see that momentum as we look into the back half of the year. And from an enterprise standpoint, the performance in the business was really, really superb in Q1, the strongest Q1 ever in the enterprise. So there are a lot of fundamental components that we’re seeing around performance of the business that give us confidence as we look into the back half of the year.

Kash Rangan: Super. Thank you so much, Dan.

Operator: We’ll take our next question from Keith Weiss with Morgan Stanley. Please go ahead.

Keith Weiss: Excellent. Thank you, guys, and I appreciate you squeezing me in. I’m going to take one last crack at this. Shantanu and team, we definitely hear your confidence in the business, but obviously, the stock market reaction reflects that investors are worried about something. And the two things that worry investors more than anything are uncertainty, number one, and number two, back-half ramps, right? And so I think what investors would really love to hear is Dan Durn actually say we still expect to do $1.9 billion in net new Digital Media ARR, and get some certainty there, and then also have a little bit more certainty, or a little bit more explanation, of what are the building blocks to that second-half ramp. Which products are expected to go GA?

Are we including stuff like document intelligence? Are there new monetization avenues that we’re putting into the back half? Or is there just some mechanism within sort of the Creative Cloud pricing that’s going to turn on in the back half of the year? Any further specifics there, I think, would help close the gap between your confidence and sort of the lack of confidence exhibited by the after-hours reaction.

Shantanu Narayen: And let me tackle that, Keith, and maybe I’ll just tackle it by taking a couple of the other questions as well, summarizing those, and ending with your question associated with the financial results. I think the first question that I hear across many folks is, hey, with the advent of AI and the increase in the number of models that people are seeing, whether they be image models or video models, does the number of seats, both for Adobe and in the world, increase or decrease? To me, there’s no question in my mind that when you talk about the models and interfaces that people will use to do creative content, the number of interfaces will increase. So Adobe has to go leverage that massive opportunity.

But big picture, models will only create more opportunity for interfaces, and I think we’re uniquely qualified to engage in that. So that’s the first one. The second one, I would say, is: when Adobe innovates, do we only leverage the Adobe model, or is there a way in which we can leverage every other model that exists out there? Much like we did with plugins across all of our Creative applications, any other model that’s out there, we will certainly provide ways to integrate into our applications. So anybody who is using our applications benefits not just from our model creation, but from any other model creation that’s out there. The way we first started to execute against that is in the enterprise, because for us, the enterprise and the ability to create custom models, so people can tweak their models to do things within Photoshop that are specific to a retailer or a financial services company, was where we focused.

But long term, certainly, as I’ve said with our partnerships, we will have the ability for Adobe in our interfaces to leverage any other model that’s out there, which again, further expands our opportunity. I think as we play out the year, when we gave our targets for the $1.9 billion in ARR and the $410 million in Digital Media ARR for Q1, it factored in both our product roadmap and how things would evolve in the year. All of the product roadmaps we knew whether it was Acrobat, whether it was Express, whether it was Firefly, whether it was Creative Cloud, or whether it’s GenStudio that brings all of these together, we knew the product roadmap, which we’re executing against. In the first half of the year, a lot of that was beta, and in the second half of the year, a lot of that’s monetization.

It’s playing out as expected. If anything, I would say the excitement around that, in particular in the enterprise, is faster than expected. And so I think our ability to monetize it, not just through new seats but also through these new Firefly Services, has expanded in terms of what we are doing. And then as it relates to your question around financial results and the go-forward execution: we gave a Q1 target, we beat the Q1 target, and that gives us confidence in the financial target that we gave at the beginning of the year; we’re ahead of that. And that’s how I’d play it out. You’re right, you have to model it. You can look at last year’s model and say, hey, they got to $1.913 billion. If they’re ahead, does that fundamentally change Adobe’s thesis on why we get to $1.9 billion and beyond? In my mind, it doesn’t.

And so that’s the way I would answer that question. We have to go execute against the opportunity that we have. I look forward to seeing those who are at Summit; I’m sure we’ll have a little bit more conversation. But Q1 was a strong start. It was a strong start against product execution. It was a strong start against the financial metrics that we outlined, and we’re going to go do it again, Keith. So that’s how I’d answer your question. And thank you all for joining.

Keith Weiss: Excellent. That’s super helpful.

Shantanu Narayen: And with that, I’ll hand it back to Jonathan.

Jonathan Vaas: All right. Thanks, everybody. I look forward to speaking with many of you soon. And this concludes the call. We look forward to seeing you at Summit.

