Elastic - Q3 2026
February 26, 2026
Transcript
Operator (participant)
Good afternoon, welcome to the Elastic Third Quarter Fiscal 2026 Earnings Results conference call. All participants will be in listen-only mode. Should you need assistance, please signal a conference specialist by pressing the star key followed by zero. After today's presentation, there will be an opportunity to ask questions. To ask a question, you may press star then one on your telephone keypad. To withdraw your question, please press star then two. Please note, this event is being recorded. I would now like to turn the conference over to Eric Prengel, Global Vice President of Finance. Please go ahead.
Eric Prengel (Global VP of Finance)
Thank you. Good afternoon. Thank you for joining us on today's conference call to discuss Elastic's Third Quarter Fiscal 2026 Financial Results. On the call, we have Ash Kulkarni, Chief Executive Officer, and Navam Welihinda, Chief Financial Officer. Following their prepared remarks, we will take questions. Our press release was issued today after the close of market and is posted on our website. Slides, which are supplemental to the call, can also be found on the Elastic Investor Relations website at ir.elastic.co. Our discussion will include forward-looking statements, which may include predictions, estimates, and our expectations regarding demand for our products and solutions, and our future revenue and other information. These forward-looking statements are based on factors currently known to us, speak only as of the date of this call, and are subject to risks and uncertainties that could cause actual results to differ materially.
We disclaim any obligation to update or revise these forward-looking statements unless required by law. Please refer to the risks and uncertainties included in the press release that we issued earlier today, included in the slides posted on the investor relations website and those more fully described in our filings with the Securities and Exchange Commission. We will also discuss certain non-GAAP financial measures. Disclosures regarding non-GAAP measures, including reconciliations with the most comparable GAAP measures, can be found in the press release and slides. Unless specifically noted otherwise, all results and comparisons are on a fiscal year-over-year basis. The webcast replay of this call will be available on our company website under the Investor Relations link. Our fourth quarter fiscal 2026 quiet period begins at the close of business on Thursday, April 16th, 2026.
We will be participating in the Morgan Stanley Technology, Media & Telecom Conference on March 2nd. With that, I'll turn it over to Ash.
Ash Kulkarni (CEO)
Thank you, Eric. Good afternoon, everyone. Thank you for joining today's call. Elastic delivered yet another outstanding quarter, beating the high end of guidance across all key metrics and showcasing the power of the Elastic platform and our business model. Sustained platform demand, strong sales execution, and our relentless focus on customers drove Q3 momentum. As LLMs rapidly evolve their capabilities around inference and reasoning, it is becoming increasingly clear that context is the most important ingredient in making these models useful within an enterprise. Against that backdrop, in Q3 we continued to see enterprises choose Elastic to power context for their most critical AI needs. This success translated directly into our performance: we achieved 18% total revenue growth and an 18.6% non-GAAP operating margin.
Sales-led subscription revenue accelerated to 21% alongside our growing cohort of $100,000 ACV customers, which now exceeds 1,660. Q3 marked our sixth consecutive quarter of strong field execution, driving solid customer commitments and supporting healthy CRPO growth. That execution is also translating into a strong pipeline as we head into Q4. The lifeblood of organizations is the proprietary data that they create, manage, and analyze every day to drive business decisions and operations. This data is massive, often many petabytes in scale, and simply cannot be moved for cost and security reasons outside of the organization's control. For businesses to use agentic AI, the LLM needs to come to the data. This is where Elastic comes in.
We help organizations store and manage all of their data in very cost-effective ways, and we provide accurate context to AI by searching through all of this organizational data in real time. Furthermore, Elastic is capable of doing this consistently across cloud and Self-Managed environments. This hybrid flexibility allows sensitive data and workloads to remain in their preferred environments, eliminating the need for costly re-platforming. This unique flexibility is why we continue to displace legacy vendors and niche cloud-native players alike. The results are clear. The number of commitments for over $1 million in annual commitment value signed this quarter grew over 30% compared to the same period last year, driven by new logos and customer expansion. Consolidation and AI are powerful tailwinds.
As organizations manage exploding data volumes, they are turning to Elastic to drive both innovation and efficiency across their search, observability, and security needs. For example, we signed a seven-figure new logo deal with a Fortune 100 insurance institution for Elastic Security. Seeking to modernize their security operations, the company initiated a competitive process to replace a legacy SIEM solution that was plagued by slow query speeds, inefficient data retention, and rigid SOC workflows. By leveraging features like logsdb and searchable snapshots, they are consolidating data into a single cyber data lake with integrated AI-powered SIEM workflows, all powered by Elastic and its capabilities, including AI Assistant, Attack Discovery, and AI-driven orchestration. This transition enables their analysts to achieve markedly faster cybersecurity detection and remediation outcomes while meeting strict regulatory requirements.
In another large deal from the quarter, a global leader in data resiliency software chose Elastic Observability to power the monitoring layer for its new cloud offering. As they migrate their vast user base to the cloud, they are leveraging our full observability suite, including AI Assistant and logsdb, to transform from reactive troubleshooting to intelligent semantic-aware analysis. By integrating OpenTelemetry and our vector search capabilities, the customer is now able to proactively detect anomalies and remediate issues using natural language queries, significantly reducing mean time to resolution. They chose Elastic over incumbents due to our deep integration flexibility, superior handling of unstructured data, and the ability to provide a single source of truth across the organization. Crucially, as companies navigate their cloud migrations, they require a platform that doesn't force them to choose between their existing data centers and the cloud.
Our asymmetric advantage in supporting modern cloud and hybrid environments drove a significant win with a global financial group. During the quarter, we closed a seven-figure expansion deal for Elasticsearch, which serves as the core of their online banking application for tens of millions of users. They needed a central data repository capable of supporting both cloud and Self-Managed architectures, allowing them to run mission-critical workloads in their preferred environment without compromising performance. Elastic succeeded where their existing MongoDB implementation failed to provide the scalable retrieval and precision necessary to move beyond simple search into production-grade context engineering. Moving forward, they are integrating semantic search and advanced AI features to further personalize the user experience through faster, more accurate retrieval. Central to these enterprise engagements is the rise of agentic AI. Customers are moving from passive Q&A to active agents that drive workflows. Precise action requires precise data.
The conversation has shifted from which model to use to how to feed it the most accurate context. Enterprises realize that to unlock the value of AI, they must bridge the gap between their LLMs and their proprietary unstructured and structured data. Elastic makes this AI work. We are the engine that allows enterprises to build production-grade AI systems that are actually worthy of their business. While others offer simple vector databases, we know that vectors alone are not enough. We deliver the full retrieval toolkit from hybrid search to advanced re-ranking, ensuring that agents have the relevant context they need to take precise actions. This ability to bridge enterprise data to the LLM with our platform is directly translating into expanded AI adoption.
In Q3, new customer commitments with AI continued to grow. We now have over 2,700 customers on Elastic Cloud using us as a vector database, with additional customers using us for broader AI capabilities, including Agent Builder and Attack Discovery, bringing our total count of AI customers to over 3,000. We now have over 470 customers with an ACV of $100,000 or greater using us for AI. This includes more than 410 using us as a vector database. Cumulatively, AI use cases have now penetrated over a quarter of our $100,000 ACV customer cohort. We are seeing sustained demand from the largest companies in the world, alongside interest from the new wave of AI-native companies.
During the quarter, we closed multiple new logo and expansion deals with AI-first innovators, validating that our platform is the standard for both established enterprises and disruptors. A leading AI recruiting platform used by large enterprises and startups alike chose Elastic's vector database to power their core customer-facing software because our search performance at scale was better than competitors. An AI-enabled driver and fleet safety company expanded their use of Elasticsearch in Q3 as they scale into new global regions. Elastic provides the real-time retrieval necessary to power their platform, ensuring they can manage increasing data volumes without sacrificing performance. A leading AI-native cybersecurity company focused on AI automated penetration testing has integrated our SIEM solution into their product. Elastic centralizes all of their logs without complication, allowing them to effortlessly scale through their massive growth trajectory.
At the heart of these wins is the performance of our Search AI Platform. We aren't just adding features, we are aggressively optimizing our engine, focusing our development efforts on delivering market-leading relevance, speed, and efficiency. In the last 18 months, we have reduced the RAM required for vector search by two orders of magnitude through innovations like Better Binary Quantization, or BBQ, DiskBBQ, and our ACORN filtering algorithm, among other things. This investment makes Elasticsearch vector search up to 8x faster than OpenSearch. Our superior performance led to one seven-figure deal with a global heavy equipment manufacturer. The customer continues to migrate mission-critical workloads over to Elastic Cloud from OpenSearch to improve scalability and performance. They are relying on our platform to power their high-speed search for telemetry data collected via the Starlink network.
By leveraging logsdb, they have achieved a significant reduction in cloud costs while managing seven years of historical customer data. Our focus on performance extends to our partnership with NVIDIA as well, where together we help enterprises deploy AI applications faster without draining IT infrastructure. We recently announced the technical preview of our Elasticsearch GPU plugin for a GPU-accelerated vector database, which allows for 12x faster indexing. Additionally, the Dell AI Data Platform, now with NVIDIA and Elastic, delivers a tightly integrated AI stack that streamlines the ability to build, deploy, and scale AI. By making Elasticsearch a core component of the Dell and NVIDIA AI factories, we are meeting the critical demand for building AI on customer-controlled infrastructure. As we deepen these technical advantages, we strengthen our technical moat while removing friction from scaling AI.
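The memory savings attributed above to binary quantization can be illustrated with a minimal Python sketch. This is an illustration of the general sign-based quantization idea only, not Elastic's BBQ implementation, and all names here are hypothetical:

```python
def binary_quantize(vec):
    """Collapse each float component to a single bit (1 if positive, else 0).

    A float32 component occupies 32 bits, so one bit per component is a
    32x reduction in raw vector memory before any further packing.
    """
    return [1 if x > 0 else 0 for x in vec]


def hamming_distance(a, b):
    """Cheap dissimilarity measure between two quantized vectors."""
    return sum(x != y for x, y in zip(a, b))


# Similar float vectors map to nearby bit patterns, so approximate
# nearest-neighbor search can run over the compact binary form.
q1 = binary_quantize([0.9, -0.2, 0.4, -0.7])   # [1, 0, 1, 0]
q2 = binary_quantize([0.8, -0.1, 0.5, -0.6])   # [1, 0, 1, 0]
q3 = binary_quantize([-0.9, 0.3, -0.4, 0.6])   # [0, 1, 0, 1]

assert hamming_distance(q1, q2) < hamming_distance(q1, q3)
```

Production schemes such as BBQ are far more sophisticated (per-dimension corrections, rescoring of top candidates against the original floats), but the memory arithmetic rests on the same idea of trading float precision for compact bit patterns.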
This quarter, we reached several product milestones designed to simplify the path from data to action for our customers. We are providing an end-to-end framework for building the next generation of intelligent applications. First, we officially launched the general availability of Agent Builder. Agent Builder allows developers to build secure, context-driven AI agents in minutes. Unlike consumer apps that surf the web, our focus is on internal business applications using company data. We piloted Agent Builder with a global 100 financial group to investigate and troubleshoot its production infrastructure, demonstrating an order of magnitude improvement in performance for complex issues and democratizing the specialized expertise necessary for rapid troubleshooting. An international entertainment and media company created a chat interface for customer interactions. They found the Agent Builder results to be significantly more reliable and accurate than the other LLM-centric approaches they had tried.
Building an agent is only half the battle. The other half is ensuring that agent has the most relevant information at its fingertips. This quarter, we expanded our Elastic Inference Service to include Jina AI's multilingual re-ranking models. Jina AI delivers a best-in-class model for search accuracy with Jina v3, currently the number one re-ranker in its model size category on the MTEB English retrieval benchmark, a gold standard for search and RAG relevance. Jina AI's v5 Nano and v5 small models continue to outpace peers as well, scoring high in retrieval, re-ranking, and other tasks. By making these models available natively, we are allowing our customers to tune their AI applications for maximum precision and recall. Re-ranking is the critical next step in a context engineering pipeline, ensuring that the most relevant data is presented to the LLM. Jina's state-of-the-art models deliver superior performance across over 80 languages.
While AI provides the reasoning, enterprises still require the reliability of rule-based automation for critical business tasks. This is why we introduced Elastic Workflows in technical preview. Workflows adds automation capability directly into our platform, allowing agents to orchestrate actions across internal and external systems like Slack or ServiceNow. It moves Elastic from being a search box to a complete system of action. Finally, we are delivering on our promise of hybrid flexibility with Cloud Connect for Self-Managed customers. We recognize that many of our largest customers, particularly in financial services and government, maintain data on-premises for regulatory or sovereignty reasons. However, procuring and managing GPU hardware for AI is a massive hurdle for these teams. Cloud Connect allows customers to keep their data local while securely bursting to Elastic Cloud to leverage NVIDIA GPUs for high-performance inference.
This ability to bridge modern AI capabilities with rigorous enterprise requirements is exactly why we are winning large-scale displacements against legacy providers. As organizations prioritize both innovation and operational efficiency, they're moving away from fragmented legacy tools in favor of Elastic's unified Search AI Platform. The results of this quarter, accelerating growth, large deal momentum, and major competitive displacements, confirm that our strategy is resonating and that we are winning the race to become the essential infrastructure for the next generation of AI-powered businesses. I want to thank our customers for their partnership, our shareholders for their trust, and most importantly, our employees for their tireless spirit of innovation. With that, I will turn the call over to Navam to review our financial results in more detail.
Navam Welihinda (CFO)
Thank you, Ash. Good afternoon, everyone. We delivered yet another outstanding quarter. We outperformed the high end of revenue and profitability guidance ranges, driven by another quarter of consistent execution, strong consumption, and strong customer commitments across search, security, and observability. The momentum in our performance throughout this fiscal year is a testament to our team's ability to deliver rapid innovation and sales execution consistently quarter-over-quarter. The ongoing market demand we see is translating to total revenue growth, sales-led subscription revenue growth, and healthy increases in pipeline generation to support our future growth. These factors together underscore our increasingly strategic value as a critical data platform in the age of AI. Our total revenue in the third quarter was $450 million, representing growth of approximately 18% as reported and 16% on a constant currency basis.
Sales-led subscription revenue in the third quarter was $376 million, growing 21% as reported and 19% on a constant currency basis. We saw commitment contribution from both our Self-Managed and cloud offerings, and aggregate consumption trends in the third quarter remained strong. Our current remaining performance obligations, or CRPO, which is the portion of RPO that we expect to recognize as revenue within the next 12 months, crossed the billion-dollar mark for the first time in Q3. CRPO accelerated to approximately $1.06 billion, growing 19% as reported and 15% on a constant currency basis. In our consumption business, we structure customer contracts based on their annual usage. Our CRPO gives us a very clear view into the revenue we will recognize in the next 12 months, giving us visibility and confidence in our business.
As Ash mentioned, we saw deal momentum continue in Q3. This quarter's strength was balanced across all geographies, and we continued to see customers make multi-year commitments this quarter, which serves as a clear indicator of how our customers view the Elastic platform as a critical foundational element in their long-term data architectures. The positive momentum was reflected in our RPO, which saw strong growth of 22% in the quarter as reported and 18% on a constant currency basis. Our deal momentum is also evident in the growth of the count of customers with over $100,000 in annual contract value. We ended the third quarter with over 1,660 customers with ACV of more than $100,000, growing 14%. Quarter-over-quarter, we added approximately 60 net new $100,000 ACV customers.
We saw strong field execution and healthy growth across our solutions, where search continues to see ongoing momentum from AI. This demand is benefiting both cloud and Self-Managed, where both form factors are relevant for AI use cases. We continue to see customers taking a Self-Managed license and deploying Elastic into their own modern cloud and hybrid environments. The demand reflects customers' preference for Elastic, which uniquely provides the necessary control and cost efficiency for AI initiatives. AI also continues to be a powerful catalyst for customer expansion. 28% of our greater than $100,000 cohort now utilizes Elastic for AI, which includes incremental AI capabilities like Attack Discovery and Agent Builder.
Today, we are still in the early stages of expansion, and we see considerable opportunity for ongoing upside as both new and existing customers accelerate their AI adoption in the years ahead, particularly as they scale into and within our $100,000 ACV cohort. Now turning to third quarter margins and profitability. I will discuss all measures on a non-GAAP basis. Our commitment to balancing growth with disciplined spending translated into robust operating leverage and strong bottom-line results. We continue to focus on costs and efficiency in our business. We recorded subscription gross margins of 82% and total gross margins of 78%, delivering an operating margin of 18.6%. The outperformance on Q3 operating margin was the result of our strong revenue performance, the sustained leverage in our model, as well as some Q3 expenses moving into Q4.
Due to this outperformance, we now expect our full year margins to come in slightly ahead of what we previously anticipated, with updated FY 2026 operating margin guidance now at 16.3%. Regarding cash flow, adjusted free cash flow was approximately $54 million in Q3, representing a margin of approximately 12%. Our cash flows are expected to fluctuate on a quarterly basis based on the timing of bookings and collections related to enterprise booking seasonality. We continue to manage cash flow on a full year basis. For fiscal 2026, we do not see any change in our full year outlook, where we continue to expect to sustain the level of adjusted free cash flow margins that we achieved in fiscal 2025. We have made significant progress on the $500 million share repurchase program that we announced in October.
During the third quarter, we returned approximately $186 million to shareholders, representing purchases of approximately 2.4 million shares. Cumulatively, we have repurchased 3.8 million shares. I mentioned at our Financial Analyst Day in October that we expect to use more than 50% of the $500 million authorized amount in fiscal 2026. We have already exceeded this goal. As of the end of Q3, we have completed 60% of our repurchase program. We are continuing our repurchase program here in Q4. Let's move to our outlook for the fourth quarter and the remainder of fiscal 2026.
For the fourth quarter of fiscal 2026, we expect total revenue in the range of $445 million-$447 million, representing 15% growth at the midpoint or 13% constant currency growth at the midpoint. We expect sales-led subscription revenue in the range of $371 million-$373 million, representing 18% growth at the midpoint or 15% constant currency growth at the midpoint. We expect non-GAAP operating margins to be approximately 14.5%. We expect non-GAAP diluted earnings per share in the range of $0.55-$0.57, using between 105.5 million and 106.5 million diluted weighted average ordinary shares outstanding.
Based on our fourth quarter guidance, we are raising our full year total revenue and sales-led subscription revenue targets as well. We expect total revenue in the range of $1.734 billion-$1.736 billion, representing approximately 17% growth at the midpoint or 15% constant currency growth at the midpoint. We expect sales-led subscription revenue in the range of $1.434 billion-$1.436 billion, representing 20% growth at the midpoint or 18% constant currency growth at the midpoint. We expect non-GAAP operating margin for full fiscal 2026 to be approximately 16.3%.
We expect non-GAAP diluted earnings per share in the range of $2.50-$2.54, using between 107 million and 108 million diluted weighted average ordinary shares outstanding. A few other financial modeling points to keep in mind. The diluted weighted average shares outstanding reflect only share buybacks completed as of January 31st, 2026. As you consider comparing sequential quarters, keep in mind that Q4 has three fewer days than we had in each of the first three quarters of the year, which creates a sequential headwind to revenue, which we have accounted for in our guidance. As is typical with prior Q4 periods, we expect to see seasonally higher expenses related to the timing of employee benefit costs.
These expenses were already part of the guidance that we had initially laid out for the year. As in past years, we finalize our plans for the upcoming fiscal year during the fourth quarter, and we will provide our initial FY 2027 guide during our earnings call in May. In summary, Q3 was another very strong quarter at Elastic. Consistent sales execution throughout FY 2026 continues to drive our sales-led subscription revenue growth expectations higher for the year, validating the durability of this business motion. As I said last quarter, while quarterly revenue can naturally vary in a consumption model, our strong customer commitments drive strong annual growth. Fueled by a highly differentiated platform and the expanding value we deliver to our customers, we remain on track to achieve our medium-term targets for both sales-led subscription revenue growth and adjusted free cash flow.
Looking forward, we're confident in our ability to continue to drive profitable growth. We are the critical technology that accelerates data discovery, secures infrastructure, and maximizes application performance. With that, I'll open it up for Q&A.
Operator (participant)
We will now begin the question and answer session. To ask a question, you may press star, then one on your telephone keypad. If you are using a speakerphone, please pick up your handset before pressing the keys. To withdraw your question, please press star then two. Our first question today is from Sanjit Singh with Morgan Stanley. Please go ahead.
Sanjit Singh (Executive Director and Senior Equity Analyst)
Thank you for taking the questions, and congrats on the stability that we're seeing across the business. Navam, I wanted to go back to some of the themes from the Investor Day a couple of months ago. There was a data point that you provided around the AI-native customers, or the AI customers, being a relatively small amount of the customers in fiscal year 2024, but driving an outsized degree of expansion, that sort of year one to year two expansion. The gist of this question is that as, you know, we get to like 25% penetration of your $100,000 customer cohort, is there an opportunity here for growth to not just be stable, but actually to accelerate on a more sustained basis as we hit those critical tipping points, if you will?
Navam Welihinda (CFO)
Thanks for the question, Sanjit. The trends that we laid out during the Financial Analyst Day for the generative AI cohort remain the same. We continue to perform well, and the growth on those generative AI cohorts is as strong today as it was when we disclosed it to you during the Financial Analyst Day. We're seeing these tailwinds right now, and we're seeing more of our customers reach the $100,000 mark. Now, remember that each of these customers at the $100,000 mark is also early in their journey. There's this other dimension of additional penetration and maturation in their own AI journey, which will drive faster growth as well. We're seeing the tailwinds right now.
We've seen tailwinds that average to 5%, but obviously some customers have higher growth than that. To answer your question, yeah, absolutely. The art of the possible is there for us to actually accelerate beyond that 5% that we laid out during the Financial Analyst Day. The trends remain positive.
Ash Kulkarni (CEO)
Sanjit, just to add to that, this is exactly why we are so focused on the penetration of AI within our customer base. Right now, every quarter, you're seeing us increase that penetration. The penetration, you know, initially starts with them using us in some small way. As that usage grows, as you rightly pointed out, that's gonna add to the consumption, it's gonna add to the overall revenue, and that's gonna show in, you know, the continued strength and acceleration of the business.
Sanjit Singh (Executive Director and Senior Equity Analyst)
Understood. Ash, maybe a question for you. You made the point in your script that vector search and vector databases are not enough in terms of building a resilient and powerful AI application. I think a lot of people would agree with that statement. As you brand the company as a context engine, what are the core pieces that are mandatory to secure your status as the leading provider of context for AI applications?
Ash Kulkarni (CEO)
That's a great question. I think the most important thing to keep in mind is, you know, context is gonna change from task to task. The data platform, the context engineering platform that you provide, needs to be able to do a whole bunch of things all together in a very consistent way. The first is the ability to bring in any and all kinds of data. As you know, we have some unique capabilities in our ability to bring in not just structured information, but also unstructured, really, really messy information. The second is to then take that data and convert it into vectors for vector search, which is a very powerful technique, especially in the AI world for semantic search, but then also to be able to mix it with hybrid search techniques.
That includes textual search, and then being able to re-rank across multiple techniques to get the most accurate context. The Jina AI embedding models and the Jina AI re-ranker models are a key part of our overall platform infrastructure. On top of that, you then need something that'll allow you to assemble agents using all of these capabilities. That's what Agent Builder is all about. As you know, it's a relatively new feature from us and a relatively new capability, but we are seeing great traction and adoption within our customer base. On top of that, you need workflows, because agents are not just about chat anymore. They're not just about conversations. They're about taking precise actions, and that's where the workflow functionality that we released becomes really important.
Lastly, the ability to monitor all of this, and that's where our LLM observability functionality becomes key. We believe that it's all of these capabilities, Sanjit, that taken together make the platform a very compelling platform for context engineering. On top of that, we've also added our Elastic Inference Service, so you don't need to bring your own LLM. We can help you proxy to any LLM of choice that you might want to use. We integrate with pretty much all of them.
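The retrieval pipeline described above, lexical search plus vector search fused together and then re-ranked, can be sketched as a toy Python example. The scoring functions below are simple stand-ins (term overlap instead of BM25, cosine similarity over tiny hypothetical embeddings, reciprocal rank fusion to blend the ranked lists); Elastic's actual implementations differ:

```python
import math

# Toy corpus: (text, embedding) pairs; the embeddings are made-up 2-d vectors.
DOCS = {
    "d1": ("reset a user password", [0.9, 0.1]),
    "d2": ("rotate api keys quarterly", [0.2, 0.8]),
    "d3": ("password rotation policy", [0.6, 0.6]),
}


def lexical_rank(query):
    # Stand-in for BM25: rank documents by query-term overlap.
    terms = set(query.split())
    scores = {d: len(terms & set(text.split())) for d, (text, _) in DOCS.items()}
    return sorted(scores, key=scores.get, reverse=True)


def vector_rank(query_vec):
    # Stand-in for semantic search: rank documents by cosine similarity.
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.hypot(*a) * math.hypot(*b))
    scores = {d: cosine(query_vec, v) for d, (_, v) in DOCS.items()}
    return sorted(scores, key=scores.get, reverse=True)


def rrf(rankings, k=60):
    # Reciprocal rank fusion: blend ranked lists without score calibration.
    fused = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            fused[doc] = fused.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(fused, key=fused.get, reverse=True)


def hybrid_search(query, query_vec, top_n=2):
    candidates = rrf([lexical_rank(query), vector_rank(query_vec)])
    # A re-ranker model (for example a cross-encoder) would rescore the
    # fused candidates against the query here; we simply truncate to top_n.
    return candidates[:top_n]


print(hybrid_search("password rotation", [0.5, 0.5]))  # d3 ranks first
```

Rank fusion is useful precisely because lexical and vector scores live on incompatible scales; blending by rank sidesteps score calibration, and the re-ranking stage then restores fine-grained ordering over a small candidate set.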
Sanjit Singh (Executive Director and Senior Equity Analyst)
Appreciate the color, Ash. Thank you.
Operator (participant)
The next question is Rob Owens with Piper Sandler. Please go ahead.
Rob Owens (Managing Director and Senior Research Analyst)
Great. Thank you very much for taking my question. I'll apologize up front for the flurry of questions in one here, but I will keep it to one question, maybe with three parts. I really want to focus on the outperformance in other subscription. I understand you're gonna meet customers where they want to buy. I guess, up front, was some of that strength potentially push-outs that you saw in the prior quarter? Then, if I look at your sales-led subscription forecast for Q4 and the fact that it's down quarter-over-quarter, which you haven't seen historically, it's usually a little bit up, is that really a function of the strength you saw here in the January quarter, or something else to be read into that?
Lastly, when we think about monetization of Self-Managed versus cloud customers and your ability to expand them over the coming years, can you maybe articulate the difference between the two if there's much there? Again, apologize for the three questions, but hopefully they're brief answers.
Ash Kulkarni (CEO)
Rob, let me start. This is Ash. Let me start and then pass it on to Navam. You know, in terms of our strength in Self-Managed, this is not just about push outs or anything of that sort. We are continuing to see a lot of strength in our Self-Managed business. You know, at the end of the day, what we are seeing now, especially with AI, is a lot of customers are applying AI on data that they consider to be extremely critical, extremely sensitive. This is not just with government customers. This is also in other regulated industries. For that reason, they're choosing or they're preferring to keep the data where it's within their control, within their environment. That doesn't always mean in their own data centers. It might also mean within their own cloud VPCs.
We give them the flexibility to be able to do that. These are modern workloads that continue to grow as their usage of AI grows, and we are gonna continue to benefit from it, which is why we believe it is really important to not just look at cloud, but to look at the whole picture and take into account the strong growth that we are seeing even on Self-Managed. I'll let Navam address the other questions.
Navam Welihinda (CFO)
Yeah, Rob, I'll address your quarterly sequential question. You know, overall, I'll start with how the business is doing. We're continuing to execute very well on the sales-led motion. This is another quarter of good execution from the sales side, and we saw that play out in our CRPO and RPO numbers accelerating as well. If you're looking at commitments, we're seeing good commitment volume, and there's no deceleration on that. On top of that, the pipeline is very healthy and growing each quarter. Overall, from a business perspective, very happy with where the quarter turned out and very positive about the future quarters as well. That leads us to the guide.
When you think about the guide, we always guide with an appropriate amount of prudence on what we can achieve and outperform every quarter. When you look at historical numbers versus actuals and guidance, you're comparing an actual number against a guidance number, and the guidance number has risk incorporated into that forward-looking projection. I'll first point to that. The second point I'd make is that the fourth quarter has three fewer days, which translates to a 3% headwind, or a $14 million-$15 million headwind for us on a revenue basis, because there are simply fewer days of revenue to recognize. All of that is incorporated in the guide.
If you look at Q4 guides in past years, there have been occasions where we've guided lower than the current quarter, so just keep that in mind. We remain well on track to achieve our midterm targets, and we feel very positive about the strength of the business itself.
Rob Owens (Managing Director and Senior Research Analyst)
Great. Thank you.
Operator (participant)
The next question is from Matt Hedberg with RBC. Please go ahead.
Matt Hedberg (Managing Director and Software Research Analyst)
Great. Thanks for taking my question, guys. You know, Ash, you know, I wanted to ask you about AI. Obviously we've all seen the pressure of the software market. I appreciate your comments at the start of the call. I thought it was really helpful to kinda get your perspective on AI, and it seems like there's a lot of great momentum from a customer perspective. I guess my question is, you know, when we're looking at these frontier models, do you see them as future competition or more of a partnership opportunity?
Ash Kulkarni (CEO)
You know, really we don't. In our opinion, AI doesn't displace us. It really depends on us, because if you think about these frontier models, they're amazing reasoning engines. The way I think about them is they are gonna be the operating systems of tomorrow. Just as operating systems today require data systems to feed them appropriate data and context to actually, you know, build applications with, you're gonna need the same thing going forward. Our role in this whole ecosystem is to make sure that we can very quickly, in real time, across all of the petabytes of data that every organization holds, give the right context to these LLMs so they can do their job. That's the reason why I believe that, you know, in the world of tomorrow, you're gonna have agents talking to each other.
You're gonna have agents that you build with Elastic Agent Builder that are talking to Claude Cowork, that are talking to, you know, things that you build with OpenAI Frontier. We already support the MCP and A2A protocols that allow for that kind of communication. This is a world where we feel that we have a tremendous position, with the capabilities of our vector database and our entire context engineering platform, to become a critical part of the infrastructure going forward. We're already partnering with hyperscalers, and we already integrate with all of these frontier-class models today.
Matt Hedberg (Managing Director and Software Research Analyst)
It's a great perspective. Thank you for that. Then maybe just one quick follow-up about Elastic using AI internally. You know, how are you seeing some of the tangible benefits, and how might that impact headcount in the future?
Ash Kulkarni (CEO)
Yeah, look, we are all in on AI, not just in terms of what we are doing externally in terms of providing the platform that we are building, but also in terms of how we are using AI internally. You know, just to give you some context on this, a couple of years ago, we built out our first agent, our first support agent within the company, and that's been, you know, in production for a long time now. It's what our customers first hit when they have support questions. The amount of queries it's able to answer, and the number of support tickets it's able to deflect has not only improved the overall performance, the overall experience for our customers when they come to us for support, but it has also significantly reduced the demand on headcount from our side.
In the last two years, even as our business has been growing, and as you can imagine, typically support workloads grow with the business, we have been able to manage that workload growth without adding any headcount to that support team. You know, in other parts of the business, whether it's in HR, whether it's in finance, in legal, we are heavily using AI tools. Some of these are built on our stack, some of them might be external, products that we are leveraging. Even in engineering, we are finding tremendous value in using multiple different, you know, code generation tools that we use within the company.
Overall, we believe that this is gonna definitely help us not just accelerate the pace of innovation, which we are already seeing now, but also, you know, improve the productivity and improve the overall efficiency of the business. And that's what's exciting about this. You know, we are able to help our customers with this, but we are also able to benefit from it ourselves.
Matt Hedberg (Managing Director and Software Research Analyst)
Got it. Thanks.
Operator (participant)
The next question is from Brian Essex with JPMorgan. Please go ahead.
Brian Essex (Executive Director and U.S. Software Equity Research Analyst)
Hi. Good afternoon. Thank you for taking the question. I appreciate your response to Matt's question with regard to your vector database capabilities and context engineering platform. I guess, as you look at the changing landscape and you look at different approaches, different ways to think about things, how do we think about the platform and its ability to adhere to some of those approaches? Like, for example, the page index approach to RAG. You know, if they solve the cost and latency issues involved with that approach, are you well-positioned to benefit from something like that and pivot with your approach?
Ash Kulkarni (CEO)
Yeah. Look, RAG, you know, retrieval augmented generation itself has progressed a lot over the last several years since it was first introduced as a concept. Fundamentally, you know, this comes down to finding the most appropriate context that is relevant for the LLM to do its job. Sometimes that requires you to understand, you know, specific data relationships that might exist. Sometimes it requires you to just search through all of your data. Sometimes it requires you to understand specific things, like preferences and so on, that you might have captured in other data systems. It's an amalgamation of all of this. As RAG continues to evolve, as these techniques become more and more sophisticated, we are actually at the leading edge of incorporating more than one single technique into our platform.
You know, we were one of the first to adopt hybrid search. We were the first to talk about it. Since then we have continued with that kind of momentum. Absolutely, I feel very confident that we are gonna be on the bleeding edge. You know, this is, at the end of the day, what Elastic was born to do. We've always been in the business of relevance. Without relevance, you don't get good search. Without relevance, you don't get good, accurate AI.
Brian Essex (Executive Director and U.S. Software Equity Research Analyst)
Great. That's super helpful. Maybe just one quick follow-up. Any traction from the recent CISA win that you had? Are any Fed agencies leveraging that for SIEM reference ability? Are you seeing any better activity on the back of that win?
Ash Kulkarni (CEO)
It's been a great success. Yeah. Thank you. It's been a great success for us already. I think we mentioned it in our press release as well. That, you know, SIEM as a service with CISA continues to grow. We saw additional agencies coming on board even in Q3. I would expect that CISA win to be just the beginning of, you know, multiple agencies coming onto that service over the next several quarters. Fundamentally, CISA is considered to be, you know, the primary agency responsible for cybersecurity in the civilian government in the United States. That kind of endorsement is something that goes a long way. It's a very exciting win. Like I said, you know, we are gonna benefit from it for many quarters and many years to come.
Brian Essex (Executive Director and U.S. Software Equity Research Analyst)
Great. Super helpful. Thank you very much, and congrats.
Ash Kulkarni (CEO)
Thank you.
Operator (participant)
The next question is from Brent Thill with Jefferies. Please go ahead.
Brent Thill (Managing Director and Senior Equity Research Analyst)
Thanks. Hey, Ash. Just on the CRPO, 15% in constant currency, 15% last quarter. I guess, I mean, good mid-teens growth, but I think everyone is asking, you know, why aren't we seeing a faster inflection? I know you have a true north of 20%. It seems like the numbers support that you can accelerate to 20%. Just curious how you bridge to 20%, and perhaps why you're not seeing a little bit stronger AI tailwind in the near term.
Navam Welihinda (CFO)
Yeah, thanks for the question, Brent. I'll start. You know, CRPO crossed over $1 billion. We're at 19% growth right now. RPO is at 22% growth. That's the best we've seen in two years, and we're very happy with the progress that we're making. If you just look at the absolute dollar additions that we added in the quarter, it's progressing very, very well. That's all pointing to the core things that are driving that CRPO growth, which is strong customer commitments, which we've now been talking about for a couple of quarters, and it's been yet another quarter of very good sales execution leading to strong customer commitments.
The AI tailwinds we talked about during Financial Analyst Day, we're seeing them right now, and they are continuing to grow as more and more of our $100,000+ customers adopt AI workloads from us. We think that there is a good, strong trajectory from this point ahead as we see more AI penetration among our $100,000+ customer base.
Ash Kulkarni (CEO)
The other thing that I will say to this, Brent, is that if you look at the full year guide for sales-led subscription revenue, you can see that the strength in our business, you know, continues. Look, for us, the midterm guide that we laid out is not where we expect to end up; we believe we can go beyond that. If you remember, we talked about 20%+. Really, that's the way we see it. As more and more customers adopt our AI functionality, given that those cohorts tend to grow and expand faster, we feel very, very good about how we are tracking to that midterm guide. As that traction continues, we feel good about even exceeding what we've talked about in the past.
Operator (participant)
The next question is from Howard Ma with Guggenheim. Please go ahead.
Howard Ma (Director and Equity Research Analyst)
Hey. Great. Thanks. I wanted to ask about cloud, and I guess this one's for Navam. I wanna throw out a caveat first, which is that I appreciate your deployment agnosticism and the fewer days in Q4. When I look at cloud revenue in Q4 versus Q3 in FY 2022 and earlier, there was more of a sequential step up than in FY 2023 through FY 2025, which were obviously impacted by industry-wide cloud optimization, and Elastic also had company-specific go-to-market issues. Now the go-to-market execution has improved significantly, and you have visibility into how large customers ramp consumption relative to their commits, including some of the $10+ million TCV contracts that you signed last quarter.
The question is: is there any reason why the sequential cloud growth in Q4 would not be more in line with the earlier years?
Navam Welihinda (CFO)
I'll start off with what I always start off on, which is that sales-led subscription revenue growth is the right metric to focus on as the barometer of the company's success. I talked about this during our prepared remarks as well. There are multiple examples, including this quarter, of AI workloads being sold as Self-Managed and deployed either in the customer's cloud or in their hybrid environments. Sales-led subscription grew a healthy 21% this year. If you look at just cloud, and the number there again is the sales-led cloud number, that grew 27% year-over-year this quarter. We're seeing very good traction on the metric that matters to us, which is sales-led subscription revenue.
The annual cloud number this quarter was very good as well, at 27%. On the forward quarter, number one, you have three fewer days. Number two, the forward quarter is a risk-adjusted number, so you can't really compare an actual to a guidance number. The point I'd like to make is that we're seeing very strong commitments and very strong performance on sales-led.
Howard Ma (Director and Equity Research Analyst)
Okay, thanks.
Operator (participant)
The next question is from Ryan MacWilliams with Wells Fargo. Please go ahead.
Speaker 13
Hi, team. This is [Zeeshan] on for Ryan MacWilliams. I wanted to ask, it really seems that, based on some of the work we've been doing, that the number of agents and AI services in production have really increased over the past couple of months. I wanted to hear from you what you're seeing within your customers. Like, are you seeing the types of AI use cases broaden out compared to what you were seeing maybe two quarters ago and how that's impacting, you know, usage and spend amongst those customers?
Ash Kulkarni (CEO)
Yes, we are seeing the usage broaden out in the sense that we are seeing more and more variety of use cases that involve AI. You know, eight quarters ago, the bulk of what we were seeing was only around vector databases, vector search, hybrid search, semantic search. It was mostly around the chat style interface kind of work. Now we are seeing, you know, agentic workflows being put together, not just around, you know, what you would typically think of as search-related workflows, but also around security workflows, around observability workflows. That was the reason why we gave the stat around our total count of customers using us for various AI use cases beyond just vector database. You know, that includes things like Agent Builder, that includes things like Attack Discovery.
In these kinds of scenarios, people are trying to automate their SOC workflows, their cybersecurity workflows, you know, for detection, for remediation. They're trying to do the same for SRE workflows around observability. The variety of use cases is growing, and as that grows, you know, we see an opportunity not just in our core search business, but also in the work that we are doing in security and observability.
Speaker 13
Thanks, guys. Appreciate it.
Operator (participant)
The next question is from Miller Jump with Truist Securities. Please go ahead.
Miller Jump (VP and Equity Research Analyst)
Hey, great. Thank you for taking the question and congrats on the sales-led momentum. Ash, you mentioned a MongoDB competitive win in the prepared remarks. We hadn't heard as much about this head-to-head between the two of you until fairly recently. Are you seeing MongoDB increasingly in bake-offs as customers look to build AI apps, or was that more of a one-off?
Ash Kulkarni (CEO)
No, this was a situation where the customer had started to use, you know, that technology for a basic search application. They had some issues scaling it, and as they were trying to build a more scalable solution, especially for hybrid search, they realized that they needed something that could perform, and that was the customer win that I talked about. You know, at the end of the day, where we tend to typically play is in the area of unstructured data. We don't tend to see them as much, but from time to time, you know, you do see these kinds of situations.
Miller Jump (VP and Equity Research Analyst)
Thanks. If I could just ask a quick follow-up for Navam. As large deals are becoming more of a contributor in your go-to-market strategy moving up market, can you just remind us how you're handling those large deals in your guidance process and any considerations around seasonality there? Thanks.
Navam Welihinda (CFO)
Yeah. Seasonality-wise, I think it just follows the normal, typical enterprise seasonality pattern, where deals end up being more tail-end weighted in Q3 and Q4. You know, we talked about large deals last quarter. They happen every quarter; it's just that the volume of bookings is bigger toward the tail end of the year. In terms of how we handle it, this is a natural byproduct of being successful with our customers, particularly the larger customers within the G2K, so we welcome it. When we look at our guidance and what we expect the full year to be, we naturally take a haircut on specific deals that could move from one quarter to another. That's how we incorporate it into our guidance: a risk-adjusted number that doesn't count on everything going our way.
Miller Jump (VP and Equity Research Analyst)
Thanks very much.
Operator (participant)
The next question is from Koji Ikeda with Bank of America. Please go ahead.
George McGreehan (Equity Research Analyst)
Hi, this is George McGreehan on for Koji. I appreciate you guys taking our questions today. I wanted to ask, just in the conversations that you guys have with customers and their strategy around adopting AI, how would you say that the tone and the conversations differ versus a year ago? What inning are they in today versus maybe a year ago in their adoption journey with Elastic? Thank you.
Ash Kulkarni (CEO)
You know, the general tone is definitely one of greater enthusiasm for AI. I think there have been enough proof points now of AI helping in all kinds of use cases, whether it be around, you know, code development, customer support, legal e-discovery, lots and lots of use cases across all functions. We are seeing the conversations be less about evangelism and more about helping them put together these kinds of sophisticated agentic applications. There's definitely been maturation. In terms of the total number of agents that people have within their organizations, you know, that number is still in the early days.
Like, if you think about the total number of business processes and workflows that can be automated by AI, I think you have to be realistic that we are still in the early days, because, you know, AI is just a pretty powerful and transformative capability. You know, what you can do with these LLMs in terms of reasoning can be applied to many different functions and different work processes. We believe that the opportunity is still, you know, very significant and still ahead of us.
Operator (participant)
The next question is from Mike Cikos with Needham. Please go ahead.
Matt Calitri (VP of Equity Research)
Hey, guys. This is Matt Calitri on for Mike Cikos over at Needham. Thanks for taking our questions. With all the advancements you're making to search with things like the Jina reranking models, are you able to charge customers more, or is the improved speed and accuracy more of an acquisition vehicle?
Ash Kulkarni (CEO)
We do charge in terms of consumption, right? We have a consumption model, as you know. Pretty much everything that you do on our platform is metered, effectively based on compute, storage and so on. For anything that's LLM- or model-related, it's based on tokens. All of our pricing is public on our pricing pages. Yes, with these newer models, we are monetizing everything. As usage continues to grow, as customers do more and more on our platform, that is what drives revenue for us.
Matt Calitri (VP of Equity Research)
Got it. Very helpful. Thanks. Then maybe just taking a different slice at the guidance question here. You beat the 3Q guide in constant currency, and then you raised the constant-currency guide for sales-led subscription revenue, but you left the constant-currency full year guide unchanged. I can appreciate the three fewer days and the risk adjustment, but those would have been baked into the prior guide. Can you just help walk through the mechanics of why that wouldn't have increased?
Navam Welihinda (CFO)
Yeah, I mean, it's quite simple. The number that we care about is sales-led subscription revenue. We handily beat that number this quarter. We raised more than we beat. That's a reflection of what we think is happening with the business and the positive momentum that we're seeing on the sales line. Overall, you know, we're not thinking about it much more than that: we feel good about the forward momentum of sales-led subscription revenue, we beat the number, and we're raising more than we beat.
Matt Calitri (VP of Equity Research)
Got it. Thank you.
Operator (participant)
The next question is from Eric Heath with KeyBanc Capital Markets. Please go ahead. Mr. Heath, your line is open on our end. Perhaps it's muted on yours. Showing no further questions, this concludes our question and answer session. I would like to turn the conference back over to Ash Kulkarni for any closing remarks.
Ash Kulkarni (CEO)
Thank you all for joining us today. We here at Elastic are very proud of our business results and excited about the opportunity ahead. Thank you.
Operator (participant)
The conference is now concluded. Thank you for attending today's presentation. You may now disconnect.