Globant S.A. (NYSE:GLOB) Q1 2026 Earnings Call Transcript May 14, 2026
Globant S.A. reported earnings in line with expectations. Reported EPS was $1.50; expectations were $1.50.
Arturo Langa: Good afternoon, and welcome to Globant’s First Quarter 2026 Earnings Conference Call. I am Arturo Langa, Investor Relations Officer at Globant. [Operator Instructions] Please note, this event is being recorded and streamed live on YouTube. By now, you should have received a copy of the earnings release. If you have not, a copy is available on our website, investors.globant.com. We will begin with remarks by our Chief Executive Officer, Martin Migoya; our Chief Technology Officer, Diego Tartara; and our Chief Financial Officer, Juan Urthiague, followed by a Q&A, where they will be joined by our Chief Revenue Officer, Fernando Matzkin. Before we begin, I would like to remind you that some of the comments on our call today may be deemed forward-looking statements.
This includes our business and financial outlook and the answers to some of your questions. Such statements are subject to the risks and uncertainties as described in the company’s earnings release and other filings with the SEC. Please note that we follow IFRS accounting rules in our financial statements. During our call today, we will report non-IFRS or adjusted measures, which is how we track performance internally and the easiest way to compare Globant to our peers in the industry. You will find a reconciliation of IFRS and non-IFRS measures at the end of the press release we published on our Investor Relations website announcing this quarter’s results. I will now turn the call over to Martin Migoya.

Martín Migoya: Good afternoon, everyone, and thank you for joining us. We are standing at the beginning of the most important transition the technology services industry has lived through. This nearly $2 trillion industry is being rewired in front of us and the signals coming from the AI ecosystem and from the largest software companies are all pointing in the same direction. The influx of capital, the need for the right talent and the need to deploy cost-effective AI solutions quickly has never been greater. We have been providing AI solutions to some of the world’s most important companies for more than a decade. Globant was built for this moment. For 23 years, we have been building deep engineering capabilities and a profound understanding of our clients’ businesses across more than 1,200 customers.
We have built a strong AI-native services practice on top of that foundation. Enterprises do not need just models. They need AI-native services delivered by AI agents, supervised by humans driving their agentic transformation. That is exactly what Globant has been executing since 2025, and it is exactly what the market is now validating. This is our moment, and we are entering it from a position of strength. As I meet with our customers around the world, their need to deploy AI solutions that add value and transform their business has never been greater and strengthens my conviction in our innovative business model. This is further supported by the convergence of the most influential voices in technology. Sequoia Capital through Julian Bek is calling services the new software, noting that for every $1 spent on software, roughly $6 are spent on services, and that AI lets enterprises buy outcomes instead of tools with autopilots replacing copilots and meaningfully different unit economics.
Satya Nadella is calling 2026 the year agentic systems start to reshape how enterprises consume software with the boundary between software and services gradually narrowing. And capital is following the thesis. This Monday, OpenAI launched a $4 billion deployment company, and 1 week earlier, Anthropic launched a $1.5 billion Enterprise AI venture with a different group of PE firms. The 2 most valuable AI companies in the world are putting capital behind delivery, not just behind models. We have been a proud OpenAI partner since 2025, and every dollar of capital flowing into this layer expands the market for the model we have already built. Q1 2026 revenue came in at $607.1 million, above the high end of our guidance. We are reaffirming the midpoint of our full year revenue outlook while raising the lower end of the range, narrowing our guidance with greater confidence in our trajectory.
Importantly, Q2 guidance returns to sequential growth with the upper end of our guided range translating into year-over-year growth. Q1 appears to mark the trough of this cycle, and we see Q2 as a meaningful step toward a healthier trajectory. Free cash flow was strong and operating margin held within our guided range. Our pipeline remains healthy and continues to build with strategic AI-native opportunities we expect to convert through the rest of 2026. On capital allocation, our original share repurchase program announced last September was completed during Q2, and our Board has now authorized a new share repurchase program of up to $125 million over the following 6 quarters, representing close to 7.5% of today’s market cap and close to 15% of market cap in aggregate between the 2 plans.
The program will be executed at management’s discretion, balanced against our investment priorities, including the continued build-out of our AI Pods business. We are committed to returning capital to shareholders because we believe Globant is undervalued relative to the trajectory we see in our pipeline and in AI Pods. A year ago, we announced the shift toward AI-native technology services on top of 2 decades of engineering and industry expertise. Our core business remains the foundation of the company, and the AI-native layer is its natural evolution. Our AI studios are progressing nicely, layering on nonlinear revenue, incorporating talent, sophisticating our offering and getting closer to what our customers need. For each industry we serve, we are getting deeper.
Our global delivery capabilities are helping our clients run the AI transformation they need with greater leverage, better unit economics and a more strategic seat at the table. Critical differentiators of our enterprise solutions are model independence and token sovereignty. Our clients are never locked to a single AI provider. Our platform routes intelligently across more than 140 models, giving enterprises the freedom to adopt the best available technology as the landscape evolves. And every token consumed stays within the enterprise’s own governance. Their data does not train third-party models. Their institutional knowledge remains entirely their own. This is particularly beneficial for many of our innovative clients concerned with their intellectual property in highly competitive environments.
Our answer to the agentic transformation is the combination of 2 things: our AI Pods and our forward-deployed engineers. The AI Pods are AI-native service units, each specialized by task and by industry that deliver outcomes instead of effort. The forward-deployed engineers are the human layer that lands inside our customers, embeds AI Pods into their reality and drives the agentic transformation from the inside. The annual recurring revenue of our AI Pods has reached $32.8 million as of March, with strong growth versus Q4. AI Pods is a young, fast compounding practice in our portfolio, and the percentage growth rates we are seeing at this stage reflect early base dynamics that will naturally moderate as scale builds. We have already incorporated the AI Pods business model in 40% of our top 20 revenue-generating accounts, up from 30% from last quarter.
The AI Pods pipeline stands at $352 million, including a clear path to coverage in 70% of our top 20 clients. Gross margins on this model continue to run materially above our blended company gross margin. As AI Pods grow as a share of our revenue mix, their structurally higher margin can begin to contribute to our blended margin profile over time. AI Pods and AI-native services are an important evolution of how we deliver value and a meaningful growth lever going forward. Within our digital studios, data and AI, which includes core AI deployments such as NLP, analytics, facial recognition, machine learning and data science, is now our second largest studio by revenue, right after engineering. Both studios are growing markedly above the company average, with data and AI growing north of 25% year-over-year and engineering in the mid-single digits.
This growth mix reflects the priorities shaping enterprise technology investment today. AI is present in 100% of our pipeline. Every project, without exception, incorporates it either as a core element or as a satellite component. Within that, 26% of opportunities are AI core, a figure that rises to 32% when looking only at deals originated in 2026, reflecting the accelerating shift in how clients are coming to us. We are particularly proud of the evolution of our revenue per Globant. As of Q1 2026, we are run-rating a level north of $90,000, representing 8% year-over-year growth. This has been supported by the steady expansion of our AI Pods business and the AI tooling we have embedded into our delivery. This is the structural signature of a company moving up the value chain.
What we are seeing in the market reinforces every part of this picture. Our big deals with large customers continue to gain momentum and AI transformation programs are now the predominant pattern across our pipeline. Three large pools of demand are converging on AI-native delivery, technical debt and core modernization estimated at $1.5 trillion to $2 trillion across the world’s 2,000 largest public companies; interface and experience debt across customer-facing surfaces; and agentic process transformation, the redesign of business processes and operating models around agents. Globant is positioned across all 3. We are winning the modernization pool as an AI-native partner as the IGT relationship with Apollo demonstrates and private equity is becoming a structural channel for us.
The largest prize, however, is agentic process transformation. AI is now adopted across the vast majority of our projects, and the real gains come not from layering AI on top of unchanged processes, but from reengineering the business around agents and rewiring the organizational chart itself. Across our studios, our 100 Squared accounts continue to gain traction and maintain momentum. Our top 50 clients grew over 5% year-over-year in Q1 with our top 10 and our 2 to 5 cohort growing at similar rates, all meaningfully above the company average. AI Pods are now showing up in concrete ways across every studio. A quick tour. In financial services, our work with Banco Galicia is scaling. They have validated the productivity, speed and quality of our AI Pods, and we are now successfully deploying an operating model to manage the high use case backlog.
We are now focused on implementing a new AI Pod dedicated to the definition and construction of data products, with an initial use case targeting the reduction of the customer contact rate. We are also applying AI Pods in our work with a leading student loan company in the U.S. to modernize its loan management systems at scale and speed. In the health care, Life Sciences and Private Equity Studio, we are migrating 5 clients to the AI Pods model. At EmployBridge, the first pilot was executed in Q1, and we expect the remaining scope to convert during Q2 into a full AI Pods engagement. Johnson & Johnson continues to be the largest account in this studio with continued growth. In media, entertainment, sports and hospitality, our 15-year relationship with the Walt Disney Company continues to expand, with several discussions under way for AI Pods.
We are keen to grow with the company by delivering an interconnected experience anchored by Disney+ and across all core businesses. The LA Clippers, one of our most visible partnerships through our work on the Intuit Dome, is transitioning their full operation to AI Pods. Our partnership with FIFA is now in its fifth year. Diego Tartara will share how we are progressing our support across FIFA’s business digital initiatives. At the same time, we have expanded our role as their technology partner for the 2026 and 2027 World Cups, enhancing their digital platforms, delivering a new fan engagement mobile application and bringing our AI-native capabilities to one of the world’s most watched sporting events. The Mexican Football Federation has chosen sports and performance to build the most advanced football intelligence ecosystem, including new Physical AI applications to further improve sports performance.
The agentic solution implemented in LaLiga exemplifies the characteristics of a modern sports organization and will serve as a lighthouse for the industry. We now work with all of the big 3 cruise lines and look forward to growing our work in this space based on our expertise in customer experience. In gaming, we continue to develop our AI project with Riot. Since booking the largest deal in this sector with them last year, we are now integrating our AI Pods model into quality assurance. In retail, we introduced AI Pods into our dialogue with one of the largest retailers in the United States, with whom we have worked for 4 years. The retailer was previously considering a global capability center, but exposure to AI Pods shifted the conversation toward an agentic-first solution, unlocking more value and efficiency in building a new mobile app and loyalty program.
In the technology space, we are happy to announce that we are strengthening our strategic partnership and 360-degree relationship with Google, where we are projecting strong growth into new areas. In the energy space, we continue our 3-year relationship with the U.S. Green Building Council, migrating our work on their primary certification, LEED version 5, to AI Pods. Although the change has been recent, we have already seen significant improvements in process efficiency, and we look forward to seeing this growth continue. The aviation space has been going through some turbulence, with volatility following the sudden rise in fuel prices. This has further reinforced the efficiency gains delivered through our AI Pods model. Two of our largest airline clients will be transitioning from a traditional delivery model to AI Pods as part of a multiyear commercial and digital transformation.
This shift will allow them to increase throughput, reduce cycle times and operate with a more flexible cost structure, helping offset fuel-driven pressure while continuing to modernize core systems, improve direct channels and enable more dynamic retailing capabilities. In our new markets region, although our clients face many challenges due to the currently volatile situation in the Middle East, Globant continues steadfast in being a strong and stable partner, focusing on long-term growth. Our partnership with Qiddiya City has centered on major projects, including Six Flags Qiddiya City and Aquarabia, the largest water park in the Middle East, with Globant building the end-to-end digital backbone for the guest journey. We are also partnering with Saudi Arabia’s local organizing committee to reinvent the football experience centered on the fan ID ecosystem, focusing on a unified AI-powered platform connecting identity, venue access, safety and fan engagement at scale.
The Enterprise AI Studio anchors the platform layer that powers AI Pods with multi-cloud integration across Azure, AWS, Oracle and Google Cloud. Our partnerships with the major hyperscalers, AWS, Google, Microsoft and Oracle Cloud Infrastructure all expanded this quarter, and we were named Google Cloud Country Partner of the Year in Argentina for the fourth time. With AWS, this quarter, we surpassed the original KPIs from our strategic cooperation agreement signed last September, aimed at several ambitious indicators, including annual recurring revenue and new solution development. We also continued to deepen our relationship with NVIDIA, which is central to how we deliver AI-native services at the compute and infrastructure layer. Together, these alliances make Globant the AI-native orchestration partner that connects model providers, hyperscalers and the enterprise.
And finally, our AI-powered network, which elevates advertising, marketing strategy and media. GUT kicked off the year with incredible momentum, adding 18 new client logos and delivering groundbreaking work: a surreal celebrity-driven music video for Cheetos that became a full-blown cultural event; the Stella Artois FIFA World Cup 2026 campaign featuring David Beckham in the U.S.; a special project for Bancolombia; and Pura Magia, a new campaign in partnership with Disney that reimagines the meaning of transformation at Walt Disney World. Q1 2026 delivered above the high end of our revenue guidance, advanced AI Pods into a clearly strategic position in our portfolio, and gave us additional evidence that the macro shift toward AI-native services is being underwritten by the most credible investors in the technology space.
In summary, we are reaffirming the midpoint of our full year revenue outlook, which implies quarter-over-quarter improvement throughout the rest of the year. We also maintain a strong focus on capital allocation and returns, and on accelerating the highest-margin product in our portfolio. We are confident in 2026. I want to thank our clients for their trust, our partners for their collaboration and our Globers around the world for the work they do every day. With that, I will hand it over to Diego, our CTO, who will walk you through the technology and delivery layer. Thank you.
Diego Tartara: Hello. Globant is no longer preparing for the AI era. We are operating within it. We are systematically reinventing our delivery model so that every solution we deliver is secure, scalable and AI-native from day 1. This is what's driving the $352 million pipeline in AI Pods. The technology layer underneath is what makes this work, and that is where I want to focus today. We have evolved our signature delivery framework built on hundreds of autonomous units to embed AI into every dimension of execution. The 3 pillars of our delivery model have been overhauled to meet this moment. Our Globers are not just using AI. They are augmenting their technical domains to become multidisciplinary orchestrators. We are also expanding to new roles such as our forward-deployed engineering team.
We have redefined project management with AI-powered observability and agentic workflows that drive measurable efficiency gains. AI readiness and accountability are now mandatory across all offerings, evolving our agile DNA into a truly AI-native model. We are also building what we call our agentic economy, an inner source ecosystem of more than 20 validated cross-industry agentic solutions that we package as deployable assets directly into AI Pods engagements, whether it is an IT root cause analysis tool for airlines or a supply chain agent for oil and gas. These assets are now being replicated across media, pharma and tech in weeks rather than months. We put special focus on the major demand pools that Martin mentioned, AI delivery, modernization and technical debt.
Our forward-deployed engineers are prototyping these solutions in 14 to 21 days. This is the practical mechanism that lets Globant participate in what Sequoia and others have called the services as the new software era. We are building compounding IP that generates recurring value and positions us as a long-term strategic partner. AI Pods are how that IP gets monetized. As we approach the 2026 World Cup, our work with FIFA is accelerating. AI agent networks with human supervision will power key FIFA platforms, enabling more consistent fan engagement across competitions, smarter activation of partnerships and faster deployment of new digital experiences. In Latin American football, Deportivo Toluca, the current Liga MX’s champion has launched a new engagement platform developed by Globant and our sports products division, Sportian.
The platform offers supporters live in-match services, ticketing, e-commerce, statistics and exclusive content, helping the club personalize the fan experience. In consumer goods, we are working with Grupo Mariposa, one of the region’s leading CPG companies to transform their marketing model around the consumer. The initiative integrates data, AI, martech and agile methodologies to enable smarter, faster, more precise decisions. Marketing evolves into a continuously learning system in which creativity and technology adapt together to consumer behavior. Globant is supporting CMPC, a global leader in sustainable pulp and paper to deploy an AI-powered solution that enhances supply chain traceability and compliance, end-to-end visibility, regulatory adherence and a clear sustainability narrative.
Our ecosystem continued to deepen in Q1. We announced a strategic partnership with Adyen for next-generation merchant payment experiences. We obtained the GenAI competency from AWS and achieved expert status for SAP Business Data Cloud specialization. Our collaboration with Adobe expanded as we became the first customer experience orchestration partner in Lat Am. Our partnership with Autodesk now includes integration with Tandem digital twin technology, unlocking new efficiencies in design and operations. Combined with the partnership recognitions Martin shared earlier, this reinforces our position as the AI-native delivery layer. In Q1, we published 2 new reports through our research arm, one guiding financial institutions on adopting real-time AI-driven operations, and another helping airlines transition to modern retail models.
Both are available at reports.globant.com. Our role in this market is straightforward. We turn AI from a tool into a delivery model, and we package the IP we generate into assets that our clients can deploy. The technology layer behind AI Pods is now compounding, and that is what gives us conviction in the trajectory Martin described today. With that in mind, I will now turn it over to Juan. Thank you very much.
Juan Urthiague: Hello, and good afternoon, everyone. I am pleased to discuss our results for the first quarter of 2026. We have begun the year with a focus on stability and execution. We are operating in a discerning client environment, and we are seeing buyers concentrating on high-impact agentic AI projects and digital transformation, which is exactly where we are positioned. We are executing on the financial front, protecting the bottom line, improving working capital, increasing CapEx efficiency and repurchasing our shares. In the first quarter, our revenue stood at $607.1 million, representing a 0.7% decrease on a reported basis, coming in above the high end of our guidance and reflecting a 400 basis point improvement in year-over-year trajectory compared to last quarter.
Q1 revenues included 200 basis points of FX tailwind. The improvement is most visible in our top accounts. Our top 50 clients grew 5.2% year-over-year. Our top 10 grew 4% and our 2 to 5 cohort grew 8.2%, all materially above the company average. Many of our top 20 clients returned to positive year-over-year growth this quarter. This is aligned with our 100 Squared strategy. Our revenue per employee also increased again this quarter, driven by our pivot into platform and AI-led delivery, which allows us to maintain our revenues with a slightly lower headcount. Our adjusted gross profit margin for the quarter was 37%. Gross margins continue to be impacted by the relative strength of Lat Am currencies, primarily the Mexican peso, the Colombian peso and the Brazilian real compared to the prior year, alongside statutory cost increases in our delivery centers.
Over time, as AI Pods grow as a share of our revenue mix, their structurally higher margin profile can begin to contribute to our blended company margins. This is the longer-term margin opportunity we are building toward. Our adjusted operating margin came in at 14.1% for the quarter, with SG&A at 18.5% of revenues. The effective tax rate for the quarter stood at 23.5% within our guided range. Our adjusted net income for the quarter was $65.2 million, representing an adjusted net income margin of 10.7%. Q1 adjusted diluted EPS came in at $1.50, above the midpoint of our guidance. This number absorbed meaningful FX headwinds, primarily from the Mexican peso, the Colombian peso and the Brazilian real. On an FX-neutral basis, adjusted EPS would have been higher.
The underlying operating performance was consistent with our internal plan. Our balance sheet remains strong, ending the quarter with $200.5 million in cash and short-term investments or $161.2 million in net debt. During the first quarter, we invested $50 million to repurchase shares as per the plan announced in October 2025. Our original share repurchase program was completed during Q2. In Q1 2026, we generated $36.1 million of free cash flow, achieving a free cash flow to adjusted net income ratio exceeding 55% compared to negative $5.7 million of free cash flow in Q1 2025. This is the first time Globant has generated free cash flow in the first quarter since 2019. We expect to continue generating strong organic free cash flow for the full year 2026.
We will continue to allocate capital with discipline across 2 priorities: returning capital to shareholders through the newly authorized repurchase program and investing in high-return growth initiatives, principally the continued build-out of our AI Pods business. Now let me move to our outlook for Q2 and for the remainder of the year. For the second quarter of 2026, based on current visibility, we expect revenue to be between $610 million and $616 million. The Q2 year-over-year guidance implies at the midpoint, a positive FX tailwind of 100 basis points. We expect a non-IFRS adjusted operating margin between 14% and 15% and the IFRS effective income tax rate is expected to be in the 22% to 24% range. Non-IFRS adjusted diluted EPS is expected to be between $1.45 and $1.55 per share, assuming an average of 43.6 million diluted shares outstanding during the second quarter.
With respect to the full year, at the midpoint, we are maintaining our 2026 revenue guidance unchanged. We expect revenues in the range of $2.462 billion to $2.508 billion, implying 0.3% to 2.2% year-over-year growth, with approximately 100 basis points of FX tailwind. Both Q2 and subsequent quarters imply sequential growth and a healthy exit rate more aligned with industry growth averages. In terms of profitability, we continue to expect our adjusted operating margin for the full year to be between 14% and 15%. Our margins continue to be pressured by the strength of Lat Am currencies relative to the dollar. The IFRS effective income tax rate is expected to be in the 21% to 23% range. For the full year, we are also reiterating an adjusted diluted EPS range of $6.10 to $6.50, assuming an average of 44.1 million diluted shares outstanding for the full year.
To conclude, Q1 was a quarter of steady execution. We are seeing improvements across our top clients. We are executing on the things that we can control, and our balance sheet remains a source of strength. Our focus on embedding AI into the core of our value proposition is clearly resonating with our most strategic partners. Thank you for your continued support.
Arturo Langa: [Operator Instructions] So with that in mind, we will take the first question from the line of Bryan Bergin from TD Cowen.
Bryan Bergin: Maybe my first one, just as it relates to demand conversion, can you just talk about what you’ve been seeing in the broader conversation? So it’s good to hear the traction on the AI Pods. But when you just think about the broader conversation, have you seen anything shifting in more recent weeks, April and May? How is that compared to the first quarter as it relates to pipeline conversion?
Martín Migoya: Yes. The pipeline remains in a very healthy state. Conversion is quite good. We're seeing large deals that we closed during last year and also during this first quarter that will gain traction moving forward. So it's shaping up as a space in which we'll have several big deals starting to yield results, and we are very happy about that. Of course, there are some concerns around what's happening in the Middle East. However, we see the business healthy there. We have some concerns around airlines and the price of fuel, which is changing the landscape for some trips. But in general, we see a quite positive environment in terms of bookings and long-term deals, which are extremely important and are coming back and gaining traction.
And one remarkable thing that I would like to mention is that growth in the main accounts is way above what we are seeing across the full company, right? This is the result of the focus we are putting on the 100 Squared accounts and on those large customers that need us more than ever. When we talk about AI Pods, we're talking about a way to deliver the traditional services. We're not talking about a specific offer; we're talking about the broader conversation, too, and a new way of delivering on that broader conversation that is AI-native. So I cannot split the AI Pods conversation from the broader conversation. But yes, I can say that in general terms, conversations are going in the right direction.
Bryan Bergin: Okay. Okay. That's clear. And then maybe on the margin, Juan, can you quantify just how much FX pressure there is within the gross margin in the first quarter and in your outlook for this year? And in your gross margin assumption for the year, are you including any tailwind from the AI Pods structures?
Juan Urthiague: Sure. So in the first quarter, compared to the last quarter, we see about 1 percentage point of FX headwind coming mainly from Colombia and Brazil, and a little bit from Mexico. For the rest of the year, so far, the assumption is the same. I mean, we cannot predict what's going to happen in terms of FX, but we will definitely work to be as efficient as possible and to increase utilization levels as ways to offset part of that. And we do have very little positive impact assumed toward the last part of the year. As you know, last quarter, we mentioned an expected run rate of $60 million to $100 million of AI Pods, and that's going to be towards the end of the year. So even though it's growing very fast, it is still a small part of our business.
But what is more interesting is that if this continues to scale the way it is scaling, it's a good place to be looking into the future, right? Because this model is proving to deliver better margins overall relative to the rest of our business and the rest of our delivery models.
Arturo Langa: The next question comes from the line of Maggie Nolan from William Blair.
Margaret Nolan: Maybe to follow up on what Bryan was just talking about. The AI Pod margin is obviously quite strong. Do you expect that to be sustainable into the future? Or what are your expectations for competitive pricing pressure there?
Juan Urthiague: Yes. I mean, as always, there is competition. But at the same time, we see that as we scale projects with AI Pods, we improve our agents significantly. We improve how we use the different models and the tokens. And the agents that we build get more efficient as well, because as they learn and evolve within the project, they get more efficient. So I think that's going to help us offset whatever pressure might come from other places. So the expectation for us is that the AI Pods delivery model will overall deliver higher margins than the other, more traditional…
Martín Migoya: We have been on a trend of growing revenue per head for many, many years already, right? So this constantly reflects the way we understand how to scale our business, always looking for margin, looking for new practices and innovation, and customers are reacting quite well to that, as the numbers demonstrate.
Margaret Nolan: Martin, maybe to build on that revenue per head comment, I would imagine that the forward-deployed engineers are contributing to that growth as well. And maybe you could just comment on how those capabilities differ from your historical workforce and what additional changes do you need to make to the workforce to continue to capture market share?
Martín Migoya: Yes. I will answer the first part. The second one will be for Diego. Forward-deployed engineers are something we have been doing for many, many years. We didn't call it that. But our engineers have been working at our customers' premises, understanding the processes and proposing new ways of doing things. And now they have become agents of transformation, but instead of using just one platform, they can use pretty much any platform. And so they are very welcome within our customers. This is something we have been doing forever. And now I think the next generation of understanding, of knowledge, has to do with changing full processes. I mean processes that before were impossible to change.
Now it's possible to automate them, or at least to think about them in a totally different manner. And that mindset is what we are embedding in our teams in front of our customers, our forward-deployed engineers in front of our customers. I don't know, Diego, if you want to complete.
Diego Tartara: No, I think Martin captured the whole idea pretty well. So Maggie, just to give you an idea and a similarity, this is the current version of what an enterprise architect used to be. The thing is they're called forward-deployed because now there's a platform involved that you need to deploy and then implement. And implementation is exactly what Martin said, which is having the knowledge to map the company's data structure, architecture and different components, connect that, and build and create the platform for building the solutions on top. So that typically accounts for the top-notch engineers, the most senior-tier engineers. And like you said, they tend to contribute to that revenue per head uplift.
Arturo Langa: Next up is Tien-Tsin Huang from JPMorgan.
Tien-Tsin Huang: Martin, I like your comments in the prepared remarks about the rewiring of the industry and things like that. So I'm just curious, to focus on that, with these LLMs investing in services, how do you see the competitive landscape changing? I know we've seen this a little bit before in software and enterprise software. Do you see it differently here? It is a validation of services, of course, but I'm not sure exactly what their long-term intent can be without domain knowledge like what Globant has. So how do you see the competitive dynamics evolving here?
Martín Migoya: That’s a great question, Tien-Tsin. Great to see you back. So first, I believe that the massive change that is happening and the things that must be done moving forward are so large, but so large that there’s no way that all those things can be captured by just 1 or 2 companies, which, by the way, wouldn’t be independent. So I think the value of being an independent company, the value of being able to create and facilitate — can you hear me?
Arturo Langa: Yes.
Martín Migoya: Okay. Sorry — so the value of creating and saving the tokens that the customers are producing, the value of advising your customer and reengineering the processes without any bias, and being able to use the best possible technology, is still there. I've been in this business for many, many years, and we have seen many times companies moving from product back to services. And I believe this is a huge stamp of approval for what we have been talking about for the last 2 years. I mean the big prize is deployment. It's not just the model. And we are playing the game of deploying it. And our relationships with our customers, and the trust and confidence and level of innovation, are absolutely there. So I see this as very exciting news. So I don't know, Diego, if you want to…
Diego Tartara: I want to make a little comment. We are actually in conversations with our partners, the same companies that are putting together their own services. And this is actually the model that the industry has had for 20 years. It's what Amazon does, it's what Google does. They all have services capabilities, Microsoft as well. And that's the vehicle to capture business and route it to the partner network. So a ton of the work that we do for AWS, as an example, comes from AWS themselves. And that's why their teams are actually so small in scale. The idea is that the major lift of this type of work should be done through their partner networks, but they need to be able to capture it. If they don't provide services, they are not the go-to for capturing that.
Martín Migoya: That’s a full demonstration of what we do, 100%.
Tien-Tsin Huang: Yes. No, I’m glad you guys answered it that way. It feels like a validation, and I think the market, hopefully, will appreciate that. Maybe just my quick follow-up, just to ask a model question for you, Juan. Just thinking about — I think you talked about it earlier, Bryan’s question, but just thinking about Q1 being the trough, 2Q seeing a little bit of sequential growth. Should we continue to assume it builds from there based on the backlog of work and what you expect in terms of closed sales? I mean should we walk into faster accelerating growth and then we exit the fourth quarter a little bit faster, assuming there are no other surprises from the macro?
Juan Urthiague: Yes, definitely. When we look at the fourth quarter last year, we closed at minus 4.7%. This quarter, it was minus 0.7%. And between Q4 and Q1, there was some sequential decrease. Now when we look at Q1 to Q2, we're going to see sequential growth. And depending on where we land within the range, there's a chance that we end up with some year-over-year growth. When we look at the second part of the year, the combination of more working days, which are still relevant, plus what Martin explained at the very beginning, some large contracts that have already been signed and will start to generate incremental revenues, is what should help us to have sequential growth in both Q3 and Q4 and probably end the year in a much better position on year-over-year growth.
And then on top of that, we will hopefully see the acceleration of our AI Pods, which is important because it's going to start helping us not just on growth, but also on margin going forward.
Arturo Langa: The next question comes from the line of Bryan Keane from Citi.
Bryan Keane: Can you just talk a little bit about those larger clients? Your top clients are growing faster with you guys. Maybe is that just a sales focus and a little bit about maybe what those clients are doing in particular that could be the start of something bigger as we go forward this year and into next?
Fernando Matzkin: Yes. Thank you, Bryan. I'll take that one. So we are seeing some very strong growth in our top 10 clients, around 4% quarter-over-quarter. And that is a result of the 100-Squared strategy, where we've been investing so much focus and effort. Whatever sample you take from our top 20 clients, you'll see that the growth is much higher than in the rest of the lineup of clients. And the kind of work that we are doing largely focuses on AI-infused projects. We are advancing conversations around migrating existing operations or starting new development with AI Pods with the vast majority of our top customers. So a lot of focus on using AI to gain efficiency, to develop faster and better, to get to market in better shape, to deliver more features and to test these products with consumers in real-life production in shorter iterations.
One very good example of that is Disney, where we're also seeing a very nice recovery this year compared to 2025, and where the focus of the new CEO, Josh D'Amaro, is really on interconnecting the Disney guest and customer experience, with Disney+ at the center. And because Josh comes from Parks and Experiences, he knows our work with Disney over so many years very well, and we are very well positioned to capitalize on his strategy moving forward. At the same time, obviously, we are also having a ton of conversations with both Disney Parks and Disney media around migrating their operations to AI Pods as well.
Diego Tartara: I want to add something to that because I think it's important. You talked about large customers. And with every single large customer, operational efficiency and getting the most out of money is still the conversation. It's there. And those large accounts, those top accounts, typically belong to heavily regulated markets, like airlines and banks. They have very high standards of security. And even so, we have deployed AI Pods in 40% of the top 10, correct me if I'm wrong, 40%, and more are on the way. That not only validates the AI Pods concept with top clients, but also speaks to the fact that most of those companies are pretty advanced when it comes to AI. It's not like they haven't tested it, like they haven't done it. And still, they found a lot of value there. So I just wanted to add that because I think it connects with what they needed and what Fernando mentioned before.
Bryan Keane: No, that’s really helpful. And then just a quick follow-up. Middle East exposure in general, how much do you guys have? And what’s the outlook for Middle East kind of going forward? Will it — I mean have you built in some cushion for potential disruption there?
Juan Urthiague: Yes, I’ll take that one. So when we look at new markets, that accounts for about 6% of revenues. Middle East is about 2/3 of that. In the numbers that we provided, the midpoint basically is assuming that things will continue more or less the way they are. We are seeing deals getting closed. We are seeing some deals actually starting, hopefully very, very soon.
Fernando Matzkin: There's a strong pipeline as well.
Juan Urthiague: The pipeline is very solid. So the way we built the guidance for the year, the lower end would basically imply a significant deterioration of that business. That's the main assumption on the low end, a deterioration from today that, so far, we are not seeing in our numbers.
Arturo Langa: The next question comes from the line of Arvind Ramnani from Truist Securities.
Arvind Ramnani: Good set of results. I wanted to follow up on the question Tien-Tsin asked earlier. Certainly kind of validation sort of these Anthropic and OpenAI making these investments, it validates the services model in terms of like last-mile delivery. I think that’s quite clear. Do you all kind of view them as a little bit of competition because they’re going to be out there grabbing some resources, competing with your clients? And how do you view the competitive environment? And just on that as well, if you can also comment on sort of the relationship you have with Palantir.
Martín Migoya: Arvind, thanks for your question. This is an extremely large market. I mean, as I was saying before, every day there's a new company trying to compete there, and these guys have very good distribution. But they have some things that are, I would say, structural. For example, neither of those 2 companies that are being formed right now can offer model independence, or can say, "I will provide you with a piece of advice which is absolutely the best for you." And that's something that is difficult to cope with. But of course, we see them as a competitor. We see them also as a partner, because we do things and they will do things through us. We have built in these years, in these 23 years, exactly what they want to build right now.
This is massive support for our long-term business. And the other thing is, this is not a business in which one winner takes all. There are other businesses in which one winner takes all; in this business, the relationships with the customers count, the whole corporate relationships and MSAs you have and people who know people, things like that count a lot. And I would say this is a scenario in which we have a lot of assets and a lot of things that we developed over the years to cope with that. So again, this is a massive market. I said $2 trillion, only for technology. If you expand that into BPO, into core processes, all the processes that you have out there, it is much larger than that.
So I don't know. This is something that is just so large. The change is that big. We are innovating. We are bringing the latest way of thinking to our customers. The customers are loving it. I feel extremely confident. I feel extremely confident that for us, this is only a great signal of what we're building, as I said in my opening remarks.
Arvind Ramnani: Yes. Yes. Martin, I would agree with that. My own research also suggests that enterprises are looking for multi-model approaches, because if you just go with a single model, whether it's an Anthropic or OpenAI model, you're locked into that model, right? And then you're stuck with their pricing and their development. But if you have this multi-model orchestration layer on top, then you have options, and not just these 2, right, you can even use some Chinese models. So what you are saying makes a ton of sense.
Diego Tartara: Not only that, Arvind, but also hybrid models where you…
Martín Migoya: You can switch.
Diego Tartara: You can actually use your own locally run, cheaper models for simple things. So yes, that is totally correct.
Arvind Ramnani: Right, right. And if I can just also maybe reask the question on Palantir, right? Because, I mean are you all seeing Palantir on certain deals? Do you look at them as a partner? Or do you compete with them? Kind of what’s your approach on Palantir?
Martín Migoya: No. We see them as a partner. I mean, we like them. They're a great company. I think that in certain places, in certain specific situations, we cooperate. So again, we are a company that needs to solve the problems of our customers in the best possible way. And that commitment that we have, and that we will continue to have, is independent of any of the products that we use in the back. Sometimes we have things of our own that can do a good portion of what our customers need, and we use them. Other times, we need to go with a partner, and we go with a partner. But what is not compromised in any of our decisions is that what we are proposing to the customer is the best for them. So I see Palantir as a company that we can cooperate with, that we can expand our relationship with, and we respect them a lot. They have a great product. And we believe that in certain places, we will cooperate with them.
Arvind Ramnani: Perfect. And then just kind of a quick follow-up, right? I mean certainly, a good set of results. When I think of like guidance, which is reaffirmed — was that reaffirm mostly because you’re being a little bit conservative? Like why not raise guidance in line with the beat and the momentum and all of that?
Juan Urthiague: Arvind, there is — as you know, we have part of our business in the Middle East. The situation there, as you know, changes all the time. And we need — we wanted to be in a safe place and not take unnecessary risk at this point in time. There is nothing to gain from taking too much risk on the guidance.
Arturo Langa: The next question comes from the line of James Friedman from Susquehanna.
James Friedman: I'll just ask about Slide 2 upfront. So Martin, I'd be interested in your perspective on Globant GUT. It seems like it's kind of the higher end of strategy consulting, and I'm wondering how you can use Globant GUT to generate incremental revenue downstream. And then, Juan, if I could just ask: last year, macro was under pressure in Lat Am. If you could give us the quick notes on macro, macro for dummies, in Latin America, that would be great.
Martín Migoya: I would take the first, and I will ask Diego to complement. GUT for us is a very good way of entering customers, as marketing and technology and AI get connected more deeply every day. And the caliber of ideas that these guys are having is really impressive. In the reviews that we do every month with the teams, when the guys from GUT come, it's very rewarding to see the caliber of campaigns, the caliber of ideas, the caliber of customers that they are getting. And this is for us like an open door to then come in with all the rest of the innovations we have on the technology side and keep on selling. I see this as a magnificent vehicle to expand that relationship, because every day, marketing, technology and operations will be more integrated into the same thing. And we, as a company, can tackle those 3 things with our offering.
Juan Urthiague: When we look at Latin America, we are seeing a more stable scenario. Argentina is the country that is driving most of the growth in the region right now, followed by Brazil. But in general, I think it’s a more stable scenario relative to other years.
Arturo Langa: The next question comes from the line of Jonathan Lee from Guggenheim.
Yu Lee: I appreciate the commentary earlier about the positive bookings environment. Is there any way to quantify what bookings were in the quarter, how that momentum may have trended from January through March and what you saw in April into May?
Martín Migoya: In the second quarter, no?
Juan Urthiague: No. I mean, basically, when you look at the first quarter, April and the first couple of weeks of May, the situation is similar. We are not seeing big changes there. Overall, we had a very strong Q4. As you remember, it was a record quarter. That is helping us a little bit with what happened in Q1, where we were able to slightly exceed our initial guidance. And it's also helping us to reaffirm the full year. The level of bookings that we've seen in Q1 and in what has passed of Q2 is also allowing us to maintain the guidance, even knowing that the new markets region is under a little bit of stress. But so far, no big changes, and that's enough for the guidance that we have today.
Yu Lee: Got it. Appreciate that color. And what in your customer conversations gives you confidence around that back half ramp, particularly around sequential growth in the fourth quarter? And how much go-get is potentially needed there?
Juan Urthiague: So I mean, as we were discussing before, the second part of the year has 2 main components. One is a higher number of working days in the second half. And the second is that there are basically 4 large customers with whom we have already signed contracts, and those contracts should help us because they are scaling as we speak, right? One of them is a professional services company that has been one of the drivers of the lack of growth in that sector and that is coming back. There is a big tech company where we recently became a preferred vendor, which is also going to help us in the second part of the year. And then there is the gaming company we spoke about last quarter, and a PE-backed company as well. So those 4 contracts that are scaling as we speak, combined with the additional days of the second half, are the drivers of the second half of the year.
Arturo Langa: The next question comes from the line of Sean Kennedy from Mizuho.
Sean Kennedy: Congrats on resilient results, really great to see. So I was wondering if you could discuss a bit more about the early customer feedback from AI Pods. And specifically, what are they saying are the greatest benefits from the program? And if there are specific industries that are seeing more traction than others currently?
Diego Tartara: So like I mentioned before, one of the good things is that the data comes from our top accounts, which are the main drivers of the growth of AI Pods. So 2 different things. The first one is the benefits of the model, both in terms of efficiency and in terms of the quality of the outcome. We've not just proven that we can produce faster, but that the quality of the full product and release is top notch, right, according to the standards. So feedback has been very positive. In fact, the way we do AI Pods is we have an experimentation phase and then a scaling phase. You typically don't scale if you haven't passed a test, a POC, with only one part of the operation under this model, and then they do the switch.
So it is important to understand that when we are reporting revenue, it has actually been tested. A couple of things that I think are super important and make a big difference. First of all, the process is baked into the tool. And this is actually great. This is the first time we can do that. Before, the process was a manual, if you wish, and you trained people to follow that process. Now it's baked into the tool. So it's a lot more resilient to people changes, and it holds your knowledge. The second one is token consumption. This is something that we haven't mentioned, but it's becoming extremely important. A lot of the companies that have implemented tools such as Claude Code, Copilot, et cetera, are working with them and not getting the most out of them.
But on the other hand, they have increasing spend on token consumption that starts to be an important part of their P&L. And part of that consumption comes from misuse of the tool, which means incorrect prompting and data preparation, incorrect description of the process you must follow, quality gates, et cetera, et cetera. A task well executed typically takes a third, a quarter or even fewer of the tokens of the same task with reprompting, corrections, et cetera. So that's another interesting and important data point, because you can actually see the tool in action and you pay for the output it actually produces. So feedback so far has been great on every single implementation. The efficiencies have been shown, and that's why it is a keeper.
We have not had a single client actually going back from AI Pods, which I think is a solid statement of the benefits it has.
Martín Migoya: Yes. And also 2 more things. The fact that you can have an enterprise-ready process baked into that thing. I mean, there are plenty of people now vibe coding solutions inside corporations. But those are local on their machines, not scalable, not secure, not ready for the enterprise. And with our AI Pods, we're able to take those things, those ideas, to the next level in terms of enterprise readiness. So that's extremely important: the process is documented all along the way, and we have an enterprise-ready kind of solution, right? And then the rest is just benefits, right? You consume fewer tokens, you have a corporate process in there, you can repeat the process and improve it every time with your customer, you can customize the process with the data of your customer.
So it's much better than the traditional services. This is what Diego was describing: a real AI-native service as opposed to the traditional service. And this kind of scaled supervision is more similar to an assembly line than to a massive matrix of projects that you need to fulfill. So the current business is not going anywhere. As you see and as we have demonstrated, it will keep on existing. But AI Pods is on a different level of execution, totally AI-native, totally AI-driven, plus human supervision, and it's charged per consumption. You don't need to interview people. You don't need to do anything. You just submerge it there and everything gets executed with enterprise class and human supervision to ensure that you are doing the right thing.
So it's really a different value proposition, and our customers really welcome it, because it's also a totally different way of being transparent, right? There is full transparency in each asset that we create: each artifact gets connected to a number of tokens, to a certain consumption. So it's really paying by the output rather than paying for hours. So it's really a new model.
Arturo Langa: The next question comes from the line of Gustavo Farias from UBS.
Gustavo Farias: My question is on capital allocation. So buybacks, of course, suggest the confidence that you have in the stock. But on the other hand, it could limit potentially strategic M&A. So if you could please share how do you see the need for M&A in the short to medium term to remain competitive?
Martín Migoya: We are always looking for things and for opportunities. That won't go away. And the other thing on the buyback, right, is that it's at management's disposal as we see an opportunity to buy the company that we know, which is Globant, that we trust, and that we think is undervalued. But if we find a better opportunity or a better use for that capital, of course, in our AI Pods or in a company that we like and want to acquire, so on and so forth, we are not limited in doing so. So a constant assessment of the priorities for our capital is being carried out, but we want to be extremely disciplined in how we increase our return on equity or return on invested capital moving forward. I don't know, Juan, if you want to add.
Juan Urthiague: No, definitely. We have the firepower to do M&A if we want to. But of course, there is one stock that today is trading at a very low multiple, and it's a company that we know very, very well, and that's our company. So between that and buying something else, an acquisition really needs to be very, very strategic. It really has to meet a very clear need for us at this point for us to go and do that, because at the current valuation, there is a mismatch between Globant and some companies in the private sector.
Gustavo Farias: Very clear, guys. If I may follow up. I think last quarter, you guys seemed a little bit conservative on tackling fixed price contracts, mainly on margin concerns. And we see fixed price as a percentage of revenues roughly stable versus last quarter. What’s your current view on that?
Juan Urthiague: Yes. The market today pushes for fixed price. It's a reality across the industry. Interestingly, now we have tools that allow us to deliver in a more efficient way. We can use our AI Pods to deliver on many of those contracts. So we are more confident on fixed price than we would have been a year or 2 ago. It's something that we need to look at, because the market is pushing for fixed price, and the tools that we have in front of us today allow us to do well in fixed price contracts too.
Arturo Langa: The last question that we have time for today, unfortunately, comes from the line of Surinder Thind from Jefferies.
Surinder Thind: I guess, Martin, you’ve been very clear about the ambitions for the AI Pods model. Is 2026 effectively the year that you’re gathering, I guess, data points to kind of really push forward with the strategy at this point? How do we think about where the longer-term ambition is? How much of your revenue can you get there? And then what is the current friction in terms of why clients aren’t adopting it faster given all of the data points that Diego kind of highlighted?
Martín Migoya: I didn’t get the last part of your question, Surinder.
Surinder Thind: The last part is just that Diego highlighted a lot of really positive data points about the AI Pods model. And so how do we think about that in terms of why clients wouldn’t be adopting it faster? Is there a natural balance within your business ultimately in the long run between the AI Pods model, maybe fixed price contracts? Or how do you think about where all this is heading and where you would like it to ideally be?
Martín Migoya: Sure. AI Pods is a way of delivering which is absolutely different from before, an AI-native service in which we deliver in a totally different manner than before. Now our forward-deployed engineers are there doing the agentic transformation, the AI agentic transformation, of our customers, helping them reinvent and rethink processes: marketing processes, human resources processes, accounts receivable, accounts payable, finance processes. So there's a constant connection between how you create a future project with our forward-deployed engineers, how ambitious that project is, and how you deliver it. Both things must be extremely innovative. And this is what we're bringing to the market.
First, the idea: how to make an agentic transformation of your business. And then, when that is ready, we don't deliver in the traditional manner, we deliver with AI-native services. So long term, my ambition is that, although our current business as we deliver it today will keep on existing, because for many of our customers it is the preferred way of engagement, this new way will start to, number one, slowly gain market share from other competitors; and number two, transform the way we deliver our current services for our customers. So the ambitions are large. I prefer to refrain from giving you any number, but I have it in mind as a way of transforming our current business into something totally different. And this gives you a place to think about many other things, like how to create those tokens that we will use in the AI models, how to distribute them, how to route them. I mean, it's a pretty different game when you start thinking about charging per output or per consumption in the case of the software development life cycle.
But then, when you are talking about AI-powered operations, you may charge per streaming monitoring hour or per ticket sold. I mean, there are hundreds of different delivery units that we may use in our AI Pods that will be extremely tangible for our customers, and not in the same way it used to be. So this technology is giving us an opportunity to change the whole business, and this is the aspiration we have. Now, how fast we can do it will depend on our customers and on how well our AI Pods work, so on and so forth, which is what Diego was describing. I don't know, Juan or…
Fernando Matzkin: Listen, we are seeing like many, many very interesting conversations progressing with all of our top customers, right, most of our top customers. And our top customers are really marquee customers, and they are leaders in the space and where they are. And without exception, they are either in the piloting phase or scaling phase and those who are about to start are assessing the technology. But the interest we are seeing in the top customers is very strong. There’s no structural friction with the clients whatsoever in the model. I think it’s just a matter of our clients understanding the technology and progressing the processes and so on and so forth.
Surinder Thind: That’s helpful. And then just as a quick follow-up on the commentary about the forward-deployed engineer. How much of that based on your earlier comments, is what I would call just a rebranding of the way that work used to be done in certain instances? Or how much of that is potentially structurally different that you guys might have to do, meaning does a forward-deployed engineer have to be on-site to a larger extent? Does it change some of the economics? How does that impact you guys given that you guys have optimized for mostly delivery offshore?
Diego Tartara: I think it’s actually an interesting question, so I’ll tackle it in two different stages. First, the formation: what does it mean, and what’s the delta between a very good engineer and a forward-deployed engineer? First of all, it is the platform and knowledge of the platform. Any forward-deployed engineer deploys a platform and has to have profound knowledge of that platform. The second one has to do with the work that most enterprise architects used to do, and that’s the key: understanding the architecture components, data, security, et cetera, of the target client and what the best integration strategy is. The third one builds a little bit on top of that: actually building the solution.
So that’s the enablement of the forward-deployed engineer; then comes the solutioning. And the deployment is actually functional to that solution. So it’s also important that that person understands the problem they are trying to address. So there’s a conversion and reskilling: while conserving a lot of the learnings of that type of engineer, you need to build something on top of it. So we needed to retrain most of our top engineers to make them forward-deployed engineers.
Surinder Thind: And do they have to do more work on-site? Because the on-site economics are very different from offshore.
Diego Tartara: Yes, to the second part of the question. We can actually execute a lot remotely, but typically what’s called the discovery is a lot better when it happens on-site, because a lot of it has to do with interviewing people who are related not only to the platforms, but also to the areas and the processes you need to convert. So typically, if you do it on-site, it’s a lot better; it can still be done offshore, and we have done it.
Arturo Langa: So with that in mind, that’s the last question we have time for, unfortunately. I will now turn it back to Martin Migoya for some closing remarks. Martin, please go ahead.
Martín Migoya: Thank you, Arturo, and thank you, everyone, for being here today and for your continuous support. Thank you so much. See you in the next quarter.
Arturo Langa: Thank you.