Datadog - Earnings Call - Q2 2025
August 7, 2025
Executive Summary
- Delivered strong Q2: revenue $827M (+28% YoY) and non-GAAP EPS $0.46; revenue exceeded the high end of guidance, and both revenue and EPS were above Wall Street consensus for the quarter.
- Mix and durability: AI-native cohort represented ~11% of Q2 revenue and contributed ~10 points of YoY growth; trailing 12‑month NRR was ~120% and gross revenue retention remained mid–high 90s, underscoring mission-critical stickiness.
- Guidance raised across the board: FY25 revenue to $3.312–$3.322B (from $3.215–$3.235B), non‑GAAP operating income to $684–$694M (from $625–$645M), and non‑GAAP EPS to $1.80–$1.83 (from $1.67–$1.71).
- Margin/cash: Non‑GAAP operating margin 20% (impacted by ~$13M DASH cost and ~$6M negative FX); free cash flow $165M (20% margin). Management expects further gross margin improvement in 2H on cloud efficiency programs.
- Catalysts: >125 innovations unveiled at DASH (AI agents, data/AI observability); S&P 500 inclusion effective July 9, 2025; expanding FedRAMP path—each expands TAM and institutional interest.
What Went Well and What Went Wrong
What Went Well
- Beat and raise: Revenue grew 28% YoY to $827M and was above the high end of guidance; FY25 guidance was raised across revenue, operating income, and EPS.
- AI momentum: AI-native customers reached ~11% of revenue (up from 8% in Q1 and ~4% YoY), contributing ~10 points of Q2 YoY growth; management reiterated long-term AI tailwind.
- Product velocity and platform adoption: >125 innovations at DASH across AI agents, AI/data observability, security; CEO highlighted multi-product traction (52% using ≥4 products, 29% using ≥6, 14% using ≥8).
What Went Wrong
- GAAP profitability and margin mix: GAAP operating loss $(36)M (−4% margin); non‑GAAP operating margin slipped to 20% (vs 22% in Q1, 24% in Q4) given DASH costs and FX headwind.
- Cohort concentration risk: Management cautioned on potential near‑term volatility from AI-native revenue concentration and contract renewals despite strong growth.
- Free cash flow margin normalized: FCF $165M (20% margin) vs 32% in Q1 and 33% in Q4’24; management still targets capex+capitalized software at 4–5% of revenue for FY25.
Transcript
Speaker 3
Good day, and thank you for standing by. Welcome to the Q2 2025 Datadog Earnings Conference Call. At this time, all participants are in a listen-only mode. After the speakers' presentation, there will be a question and answer session. To ask a question during the session, you will need to press *11 on your telephone. You will then hear an automated message advising that your hand is raised. To withdraw your question, please press *11 again. Please be advised that today's conference is being recorded. I would now like to hand the conference over to your speaker today, Yuka Broderick, SVP of Investor Relations. Please go ahead.
Speaker 0
Thank you, Deedi. Good morning, and thank you for joining us to review Datadog's second quarter 2025 financial results, which we announced in our press release issued this morning. Joining me on the call today are Olivier Pomel, Datadog's Co-founder and CEO, and David Obstler, Datadog's CFO. During this call, we will make forward-looking statements, including statements related to our future financial performance, our outlook for the third quarter and the fiscal year 2025 and related notes and assumptions, our gross margins and operating margins, our product capabilities, and our ability to capitalize on market opportunities. The words "anticipate," "believe," "continue," "estimate," "expect," "intend," "will," and similar expressions are intended to identify forward-looking statements or similar indications of future expectations. These statements reflect our views only as of today and are subject to a variety of risks and uncertainties that could cause actual results to differ materially.
For a discussion of the material risks and other important factors that could affect our actual results, please refer to our Form 10-Q for the quarter ended March 31, 2025. Additional information will be made available in our upcoming Form 10-Q for the fiscal quarter ended June 30, 2025, and other filings with the SEC. This information is also available on the Investor Relations section of our website, along with a replay of this call. We will discuss non-GAAP financial measures, which are reconciled to their most directly comparable GAAP financial measures in the tables in our earnings release, which is available at investors.datadoghq.com. With that, I'd like to turn the call over to Olivier.
Speaker 2
Thanks, Yuka, and thank you all for joining us this morning to go through our results for Q2. Let me begin with this quarter's business drivers. Overall, we saw trends for usage growth from existing customers in Q2 that were higher than our expectations. We experienced strong growth in our AI-native customer cohort. The number of AI-native customers is growing meaningfully with us as they see rapid usage growth with their products. Meanwhile, we saw consistent and steady usage growth in the rest of the business. We continue to see the overall demand environment as solid, with an ongoing healthy pace of cloud migration and digital transformation. Churn has remained low, with gross revenue retention stable in the mid to high 90s, highlighting the mission-critical nature of our platform for our customers.
Regarding our Q2 financial performance and key metrics, revenue was $827 million, an increase of 28% year over year, and above the high end of our guidance range. We ended Q2 with about 31,400 customers, up from about 28,700 a year ago. This includes about 150 new customers from our Eppo and Metaplane acquisitions. We ended Q2 with about 3,850 customers with an ARR of $100,000 or more, up from about 3,390 a year ago, and these customers generated about 89% of our ARR. We generated free cash flow of $165 million, with a free cash flow margin of 20%. Turning to platform adoption, our platform strategy continues to resonate in the market. At the end of Q2, 83% of customers were using two or more products, the same as last year. 52% of customers were using four or more products, up from 49% a year ago.
29% of our customers were using six or more products, up from 25% a year ago. 14% of our customers were using eight or more products, up from 11% a year ago. Our customers continue to adopt more products, including our security offerings. As a reminder, our security customers can identify and manage vulnerabilities with Code Security, Cloud Security, and Sensitive Data Scanner, and they can detect and protect from attacks with App and API Protection, Workload Protection, and Cloud SIEM. We are pleased that our security suite of products now generates over $100 million in ARR and is growing in the mid-40s percent year over year. While we are pleased to achieve this milestone, we're still just getting started in solving customer problems in this area with new innovations such as our Bits AI Security Agent.
Moving on to R&D, we held our Dash User Conference in June, where we announced over 125 exciting new products and features for our users. Let's go through some of the announcements. First, we launched fully autonomous AI agents, including Bits AI SRE Agent to investigate alerts and coordinate incident response, Bits AI Dev Agent, an AI-powered coding assistant to proactively fix production issues, and Bits AI Security Agent to triage and investigate Cloud SIEM signals. To further accelerate our users' incident response, we announced AI Voice Agent for incident response, so users can quickly get up to speed and start taking action on their phones. We also announced handoff notifications that make it easy to jump straight into the relevant context and quickly communicate with other responders, and status pages to enable automatic updates for customers undergoing an incident.
Second, we delivered a series of products to help customers ship better software with confidence. With the Datadog internal developer portal, developers can ship better and faster by gaining a real-time view into their software systems and APIs with the software catalog, by provisioning infrastructure, scaffolding new services, and managing code changes and deployments with self-service actions, and by following engineering and readiness standards with scorecards. We launched a Datadog MCP server to enable AI agents to access telemetry from Datadog and to act as a bridge between Datadog and MCP-compatible AI agents like OpenAI Codex, Cursor, and Claude Code from Anthropic. We worked together with OpenAI to integrate our MCP server within the OpenAI Codex CLI, and the Datadog Cursor extension now gives developers access to Datadog tools and observability data directly within the Cursor IDE. Third, we are reimagining observability to meet our customers' increasingly complex needs.
Our APM Latency Investigator formulates and explores hypotheses in the background, helping teams to quickly isolate root causes and understand impact without combing through large amounts of data. Proactive app recommendations help users stay ahead of growing system complexity by analyzing APM data to detect issues and propose fixes before they become problems. We announced a Flex Logs frozen tier so customers can keep logs in fully managed storage for up to seven years and be able to search without data movement or rehydration. Archive search now enables teams to query archive logs directly in cloud storage, like Amazon S3 buckets or in the Flex frozen tier. Datadog now supports advanced data analysis features within notebooks. Our security products cover new AI attack vectors across the application, model, and data layers.
At the AI data layer, Sensitive Data Scanner can now prevent the leakage of sensitive data in training data, as well as LLM prompts and responses. At the model layer, we help secure against supply chain attacks in open-source models and prevent model hijacking attacks. At the application layer, we help prevent prompt injection attacks and data poisoning in runtime. We showcased our new end-to-end AI and data observability capabilities. Engineers and machine learning teams can use GPU monitoring to gain visibility into GPU fleets across cloud, on-prem, and GPU-as-a-service platforms, such as CoreWeave and Lambda Labs. With AI Agents Console, enterprises can monitor the behavior and interactions of any AI agent used by their teams. We now offer LLM observability experiments to help understand how changes to prompts, models, or AI providers influence application outcomes.
We added a new agentic flows visualization to LLM observability to capture and understand the decision path of AI agents. Last but not least, and accelerated by our recent acquisition of Metaplane, Datadog now offers a complete approach to data observability across the entire data lifecycle, from ingestion to transformation to downstream usage. We continue to relentlessly innovate to solve more problems for our customers. In doing so, we are being rightfully recognized by independent research, and we are pleased that for the fifth year in a row, Datadog has been named as a leader in the 2025 Gartner Magic Quadrant for Observability Platforms. We believe that this validates our approach to deliver a unified platform which breaks down silos across teams. Now, let's move on to sales and marketing. We had a number of great new logo wins and customer expansions this quarter.
Let's go through a few of those. First, we signed a seven-figure annualized expansion in a three-year contract worth more than $60 million, with one of the world's largest banks. This company believes getting to the cloud is essential, so they can use AI on their extremely rich dataset to improve how they manage risk and serve their customers. They are using Datadog as their strategic cloud observability platform, and they continue to migrate more applications to the cloud. This customer is expanding to 21 Datadog products, with thousands of users who log into the Datadog platform every month. Next, we signed a seven-figure expansion to an eight-figure annualized contract with a leading U.S. insurance company. Datadog is supporting this customer's efforts to consolidate observability tools and expand their cloud-based products.
By adopting Datadog, they are experiencing fewer and less severe incidents, with estimated savings of over $9 million per year in incident response costs and improving more than 100,000 customer transactions that would otherwise be impacted every year. With this expansion, this customer will adopt 19 Datadog products and will consolidate a couple dozen tools across multiple business units. Next, we signed a nearly seven-figure annualized expansion with a leading American media conglomerate. This customer has about 100 observability tools across more than 300 business units, and this tool fragmentation has resulted in inefficiencies, extra costs, and lost engineering time. They are expanding to 21 Datadog products, including all of our security products, and replacing their paging solution with Datadog on-call and incident management. Next, we landed a seven-figure annualized deal with a leading Brazilian e-commerce company.
This customer's previous observability vendor was unable to support them as they moved to newer software platforms and modern cloud infrastructure. By replacing this tool with Datadog, the company was able to gain full visibility into its cloud tech stack and saw significant improvements in application stability and incident resolution times. This customer will start with seven Datadog products, including Flex Logs. Next, we landed a seven-figure annualized deal with the delivery app of a major American retailer. This customer found our RUM and error tracking products to be immediately valuable, finding an issue on the first day of their Datadog trial that they hadn't identified after months of searching with their old tool. By adopting Datadog with seven products to start, this customer will consolidate half a dozen tools while meeting their PCI compliance requirements. Finally, we welcome back a leading U.S.
mortgage company in a nearly seven-figure annualized deal. This customer had moved to using a dozen open-source disconnected tools, which led to fragmented visibility, alert fatigue, and poor customer experience. In returning to Datadog, they plan to adopt six products, including replacing their paging system with Datadog on-call. That's it for another productive quarter from our go-to-market teams, who are now very hard at work on a busy Q3. Before I turn it over to David for a financial review, I want to say a few words on our longer-term outlook. There's no change to our overall view that digital transformation and cloud migration are long-term secular growth drivers of our business. As we think about AI, we are incredibly excited about our opportunities. First, AI is a tailwind for Datadog, as increased cloud consumption drives more usage of our platform.
Today, we see this primarily in our AI-native customer cohort who are monitoring their cloud-native applications with us. There are hundreds of customers in this group. They include more than a dozen that are spending over $1 million a year with us, and more than 80 who are spending more than $100,000. They include eight of the top 10 leading AI companies. While we know there's a lot of attention on this cohort, we primarily see it as an indication of what's to come, as companies of every size and every single industry incorporate AI into their cloud applications. We continue to see rising customer interest for next-gen AI observability and analysis. Today, over 4,500 customers use one or more Datadog AI integrations. Second, next-gen AI introduces new complexity and new observability challenges.
Our AI observability products help our customers gain visibility and deploy with confidence across their entire AI stack, including GPU monitoring, LLM observability, AI agent observability, and data observability. We will, of course, keep innovating as the AI landscape develops further. Third, we are incorporating AI into the Datadog platform to deliver more value to our customers. As I discussed earlier, we launched Bits AI SRE Agent, Dev Agent, and Security Agent. We are seeing very good results with those, with more improvements and new capabilities to come. Finally, as a SaaS platform focused on our customers' critical workflows, we have a large volume of rich, clean, and detailed data, which allows us to conduct groundbreaking research. A great example of that is our Toto foundation model for time series forecasting, which shows state-of-the-art performance on all benchmarks, even going well beyond specialized observability use cases.
You should expect to see more from us on that front in the future, as well as taking novel research approaches and models straight into our products to improve customer outcome. We are extremely excited about our progress so far against what we expect to be a generational growth opportunity. In other words, we're just getting started. With that, I will turn it over to our CFO. David?
Speaker 1
Thanks, Olivier. Q2 revenue was $827 million, up 28% year over year, and up 9% quarter over quarter. Now, to dive into some of the drivers of this Q2 revenue growth. First, overall, we saw trends for usage growth from existing customers in Q2 that were higher than our expectations. This included strong growth in our AI-native cohort, as well as usage growth from the rest of the business that was consistent with recent quarters amidst a healthy and steady cloud migration environment. We saw a continued rise in contribution from AI-native customers in the quarter, who represented about 11% of Q2 revenues, up from 8% of revenues in the last quarter and about 4% of revenues in the year-ago quarter. The AI-native customers contributed about 10 points of year-over-year revenue growth in Q2 versus about six points last quarter and about two points in the year-ago quarter.
As previously discussed, we do see revenue concentration in this cohort in recent quarters. If we look at our revenue without the largest customer in the AI-native cohort, our year-over-year revenue growth in Q2 was stable relative to Q1. We remain mindful that we may see volatility in our revenue growth on the backdrop of long-term volume growth in this cohort, as customers renew with us on different terms and as they may choose to optimize cloud and observability usage over time. As you heard from Olivier, we continue to believe that adoption of AI will benefit Datadog in the long term. We believe that the growth of this AI-native customer group is an indication of the opportunity to come, as AI is adopted more broadly and customers outside the AI-native group begin to operate AI workloads in production.
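[A rough back-of-the-envelope reconciliation of the contribution figures cited above; the implied year-ago revenue of roughly $646 million is inferred from the 28% growth rate rather than separately disclosed, so the figures below are approximations rather than reported numbers.]

```latex
% Approximate check of the AI-native growth contribution, using only percentages cited on the call.
\begin{align*}
\text{Revenue}_{\text{Q2'24}} &\approx \frac{\$827\text{M}}{1.28} \approx \$646\text{M} \\
\text{AI-native}_{\text{Q2'25}} &\approx 0.11 \times \$827\text{M} \approx \$91\text{M} \\
\text{AI-native}_{\text{Q2'24}} &\approx 0.04 \times \$646\text{M} \approx \$26\text{M} \\
\text{Contribution to YoY growth} &\approx \frac{\$91\text{M} - \$26\text{M}}{\$646\text{M}} \approx 10\ \text{points}
\end{align*}
```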
Now, regarding usage growth by customer segment, in Q2, our year-over-year usage growth was fairly similar across segments relative to previous quarters, as SMB and mid-market usage growth improved in Q2, while enterprise customer usage growth remained roughly stable. Note that we are excluding the AI-native cohort for the purposes of this commentary. As a reminder, we define enterprise as customers with 5,000 or more employees, mid-market as customers with 1,000 to 5,000 employees, and SMB as customers with less than 1,000 employees. Regarding our retention metrics, our 12-month trailing net retention percentage was about 120%, up from the high 110s last quarter. Our trailing 12-month gross revenue retention percentage remains in the mid to high 90s. Now, moving on to our financial results. First, billings were $852 million, up 20% year over year. Remaining performance obligations, or RPO, was $2.43 billion, up 35% year over year.
Our current RPO growth was in the low 30s percent year over year, and our RPO duration was up slightly year over year. As previously mentioned, we continue to believe that revenue is a better indication of our business trends than billings and RPO, as those can fluctuate relative to revenue based on the timing of invoicing and the duration of customer contracts. Now, let's review some of the key income statement results. Unless otherwise noted, all metrics are non-GAAP. We have provided a reconciliation of GAAP to non-GAAP financials in our earnings release. First, gross profit in the quarter was $669 million for a gross margin of 80.9%. This compares to a gross margin of 80.3% last quarter and 82.1% in the year-ago quarter.
As we've discussed in the last call, we saw an increasing impact of our engineers' cost savings efforts throughout this quarter as they delivered on cloud efficiency projects. We are continuing our focus on cloud efficiency and believe that we have further opportunity for gross margin improvement in the second half of the year. Our Q2 OpEx grew 30% year over year, up from 29% last quarter. As we've communicated over the past year, we plan to grow our investments to pursue our long-term growth opportunities, and this OpEx growth is an indication of our execution on our hiring plans. Q2 operating income was $164 million for a 20% operating margin compared to 22% last quarter and 24% in the year-ago quarter. Within that, as we've noted, we held our Dash User Conference in June. As expected, the event cost $13 million.
We also experienced a rising impact from the weaker dollar and absorbed $6 million of negative FX impact during Q2. Excluding those expenses, our operating margin would have been 22% in Q2, or about 200 basis points higher. Now, turning to the balance sheet and cash flow statements, we ended the quarter with $3.9 billion in cash, cash equivalents, and marketable securities. Our cash flow from operations was $200 million in the quarter. After taking into consideration capital expenditures and capitalized software, free cash flow was $165 million for a free cash flow margin of 20%. Now for our outlook for the third quarter and the remainder of fiscal 2025. First, our guidance philosophy overall remains unchanged. As a reminder, we base our guidance on recent trends observed and apply conservatism on these growth trends.
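[A minimal sketch of the adjusted-margin arithmetic from the figures cited above: $164 million of operating income, the $13 million Dash cost, and the $6 million FX impact, all against $827 million of revenue. It assumes the two add-backs flow straight through to operating income, which is a simplification.]

```latex
% Approximate check of the reported vs. adjusted operating margin, per figures cited on the call.
\begin{align*}
\text{Reported operating margin} &\approx \frac{\$164\text{M}}{\$827\text{M}} \approx 20\% \\
\text{Excluding Dash and FX} &\approx \frac{\$164\text{M} + \$13\text{M} + \$6\text{M}}{\$827\text{M}} \approx 22\%
\end{align*}
```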
For the third quarter, we expect revenues to be in the range of $847 to $851 million, which represents a 23% year-over-year growth. Non-GAAP operating income is expected to be in the range of $176 to $180 million, which implies an operating margin of 21%. Non-GAAP net income per share is expected to be $0.44 to $0.46 per share, based on approximately 364 million weighted average diluted shares outstanding. For fiscal 2025, we expect revenue to be in the range of $3.312 to $3.322 billion, which represents a 23% to 24% year-over-year growth. Non-GAAP operating income is expected to be in the range of $684 to $694 million, which implies an operating margin of 21%. Non-GAAP net income per share is expected to be in the range of $1.80 to $1.83 per share, based on approximately 364 million average diluted shares.
Some additional notes on our guidance: we expect net interest and other income for fiscal 2025 to be approximately $150 million. Due to the impact of the recent federal tax legislation, we now expect cash taxes for 2025 to be about $10 to $20 million. We continue to apply a 21% non-GAAP tax rate for 2025 and going forward. Finally, we expect capital expenditures and capitalized software together to be 4% to 5% of revenues in fiscal year 2025. To summarize, we are pleased with our execution in Q2, including the many products and features we launched at Dash. We are well positioned to help our existing and prospective customers with their cloud migration and digital transformation journeys, including their adoption of AI. I want to thank all Datadogs worldwide for their efforts. With that, we'll open the call for questions. Operator, let's begin our Q&A.
Speaker 3
Thank you. As a reminder, to ask a question, please press *11 on your telephone and wait for your name to be announced. To withdraw your question, please press *11 again. Please stand by while we compile the Q&A roster. Our first question comes from Raimo Lenschow of Barclays. Your line is open.
Perfect. Thank you. Two quick questions from me. Olivier, like you talked about the AI contribution and slowly broadening out, how should we think about it in terms of when this goes much broader into inference, et cetera, so that everyone like Barclays, JPMorgan Chase, et cetera, they all kind of need to do more around observability because they're going to do more inference, et cetera. In a way, like OpenAI, et cetera, is just setting the scene for the future. What do you think about the market opportunity there? Then, David, in the second half of last year, you hired a lot of extra sales guys. Can you talk a little bit about that ramp and where they are on their productivity curve? Thank you.
Speaker 2
Yeah, on the AI opportunity, there's really multiple layers to it. The first layer is largely what we see today, which is companies that are running their inference stack and their application around it in cloud environments. That's the case of the model makers. You think of the companies that are doing coding agents, things like that. That is what we see today, and it looks a lot like normal compute. You have normal machine CPUs, some GPUs, quite a few other components, databases, web servers, things like that. That's the bulk of what we see today, and there's going to be more of it as the AI applications come into production. There are more specialized inference workloads and even training workloads in some situations that rely on instrumenting GPUs.
For that, we have a new product out there that does GPU monitoring that we announced at the Dash User Conference. All that, I would call the infrastructure layer of AI. On top of that, there's new problems in terms of understanding what the applications themselves are doing. The applications are largely non-deterministic now. They either are run by a model that is non-deterministic by nature, or they're running code that was not as carefully written as it used to be. It's not completely written by humans, it's largely written by AI agents. As a result, you also need to spend a lot more time understanding how that code is working, and that largely happens in production. That's a brand new area of observability, which is how you deal with applications that have not been completely defined in development and that have to be evaluated in production.
We think the whole market is going there, not just the AI-native customer cohort. The AI-native customer cohort is definitely doing that today. Their applications are running on models and on code that has been largely written by agents, but the rest of the market is going there. The best proof point you see of that is the very, very broad adoption today, both of AI models accessed via API and of the coding agents, which you see in every single large enterprise today.
Thank you.
Speaker 1
Yeah, as to sales capacity, we have been successful in increasing both our number of salespeople and our ramped sales capacity. We started that, as you said, in the last part of 2024. We are seeing evidence of that through our new logo production and our pipeline. We need to, as we talked about previously, go through the ramping of that. In looking at the size, productivity, and performance, we see some good signs that that quota capacity is becoming productive.
That's encouraging.
Speaker 3
Thank you. Our next question comes from Sanjit Singh of Morgan Stanley. Your line is open.
Thank you for taking the questions. Congrats on the really stellar results this quarter. David, when I look at the guide, I mean, this is probably one of the more impressive guides coming out of a Q2 that I've seen in a couple of years. If I square that against the commentary that you guys made on the AI-native cohort that, look, there could be volatility from this cohort. When I kind of put those two together, the guidance is really strong. When I think about that potential risk, is it fair to assume that it's not something that you're seeing right now and may come into play later on down the road? The guidance seems really strong. At least on the face of it, it doesn't seem to anticipate that volatility risk from the AI-native cohort.
Speaker 1
Yeah, I think we gave metrics indicating that based on what we saw in the quarter and we're seeing now that the AI-native customer cohort continues to grow quite rapidly and we're winning a good market share in that. How we incorporate that into the guidance is, as we discussed previously, we know that there might be volatility in usage or, as we negotiate contracts, in unit rates. Therefore, we adopt conservative assumptions as to that performance in the remainder of the year. It's not something, as you can tell from the growth metrics, that we see yet in our results. As we learned in the previous cycle with cloud natives, there can be volatility and we want to make sure we incorporate that in our guidance.
Perfect. Olivier, with the new security disclosures, congrats on crossing the $100 million threshold. Is there any sort of change in the buying behavior? There's been consolidation in the industry. You guys have been advancing your portfolio quite significantly. You guys have fully autonomous security agents. What's the prospect for this part of the business to drive growth for the balance of the year and going into 2026?
Speaker 2
Yeah, so we have a very good product set. We mentioned a slew of products in there. There are a couple of those products that are really, I would say, reaching an inflection point in terms of what they're doing on the customer side. When I think of where we're successful today in security, we're very successful at getting broad adoption, like a large number of customers, a few customers that are spending $1 million plus on security with us. We're very happy with the proof points we have there. What we haven't done very well yet is getting standardized adoption wall to wall in large enterprises. That's the next focus for us on the security side. Some of that is product work, but a lot of it is also changes to the go-to-market there.
We need to get better at selling enterprise-wide security top down, which is not something we have done a lot of in the past. That's sort of where we are as a product. Happy with where we are. A lot of groundwork has been done on the product side, but there's quite a bit more work to be done and a ton more opportunity in front of us. That's why we're focusing on it.
Appreciate the thoughts, Olivier. Thank you.
Speaker 3
Thank you. Our next question comes from Kash Rangan of Goldman Sachs. Your line is open.
Hey, thanks for taking my question. This is Matt Martino on for Kash Rangan. David, you called out enterprise consumption volatility in the last quarter. It sounds like that may have been consistent this time around while SMB continues to improve. Could you perhaps characterize any discernible trends between these two customer demographics? What went right relative to your expectations heading into Q2 and really how that informs your second-half guide? Thanks a lot. That's it for me.
Speaker 1
Yeah, I think broadly we're calling out that the usage trends across the segments were roughly consistent with the previous quarters. We said last quarter we did see some customers, more concentrated in enterprise (this is not a comment about AI, this is a comment about enterprise), take their consumption down relative to a spike, but we saw that stabilize. We've seen small but gradual improvement in SMB usage of our products.
Speaker 3
Thank you. Our next question comes from Mark Murphy of JPMorgan. Your line is open.
Thank you. Congrats. Olivier, I actually wanted to ask you about Toto and BOOM. Those announcements, it looks like you're bringing very serious AI research to a space where it is applicable and opening it up very broadly. The size of the data set is vast. I'm curious, what type of response do you expect to see here? Just help us understand maybe how that can sustain growth in future years, and I have a quick follow-up for David.
Speaker 2
I mean, look, we think there's so much opportunity in automation with autonomous AI agents. We really broke it up in three different categories so far. One is the SRE side: responding to alerts, investigating alerts, and maybe auto-remediating those issues. The second one is coding: fixing issues that we find in the code that happen in production and verifying their safety ourselves. The last one is security: investigating security signals on our own so that customers don't have to do that themselves. There's so much at least that can happen there. A lot of it is going to depend on great research, which is why we built a research team and which is why we have developed and released open-weights research models already. The next step after releasing these research models is to incorporate them into the product.
That's also one of the things we're working on right now. There's just so much opportunity in front of us there that, at this point, we're happy. We got a great start. We got a fantastic result in our first release as a research output. It's really a state-of-the-art model that beats every single other model in a category that has seen quite a bit of action over the years. Time series forecasting has very wide applicability in a lot of different domains. I think it shows that we can perform at the highest level there. I think it's a great sign of things to come in terms of AI automation and AI agents.
Okay, thank you. David, we keep pointing out that Datadog is one of the only software companies that's investing seriously in headcount growth. It feels like that is paying top-line dividends pretty tremendously today. We noticed the R&D spending is up noticeably in Q2. Just wondering, what are the mechanics that are driving that on the R&D line? The flip side is, what's allowing you to guide operating income so much higher in Q3 than you had guided that for Q2?
Speaker 1
Yeah, in R&D, as we talked about, we had an aggressive investment plan and we've been able to execute. I think our recruitment, credit to our recruitment team, we've been able to get people in the door, the right people earlier in the year. There are some things within that around FX that weigh a little bit on it because, as you know, we do have a significant R&D center in Paris. I think the overall trend is the execution in recruiting. We talked about some of the factors in Q2 that caused the operating income to increase at a rate of 36%. Some of those are things like the timing of Dash. We talked about $13 million, the FX.
I think that we have a good line of sight on the drivers in R&D, in terms of what we talked about, and some of the operating expenses have some seasonality in them.
Speaker 2
The one thing I would add is that we are also spending more on AI training and inference in R&D. If you compare it to past years, you know, the output of that is things such as Toto, or the next versions of it that we're training right now, and experiments we're running to train agents, run simulations to train agents, and things like that. You shouldn't expect the overall picture of our R&D investment to change in the future; I think we expect the same envelope to be what we use moving forward.
Speaker 1
Yeah, I'll add that and really call out to our R&D team and our FinOps that we said last quarter that we were going to focus on how we use cloud. That applies to both the gross margin and, as you know, our own usage: we dogfood, using a lot of our applications internally. We were quite successful in Q2, and that run rate we expect to continue forward in optimizing our cloud usage, which will have an effect on the margins and the OpEx growth rates as we proceed through the year.
Speaker 2
Thank you very much.
Speaker 3
Thank you. Our next question comes from Koji Ikeda of Bank of America. Your line is open.
Yeah, hey guys, thanks so much for taking the questions. We all see that the second quarter was really, really strong. Guidance for 2025 looks really, really great. I wanted to ask you about contract visibility. How are you feeling about contract visibility specifically with your large AI-native customers? I have to imagine you're very close to these customers and having lots of conversations with them. I know there is some concern about that. David, you mentioned potential volatility. I really want to ask about how you're feeling about contract visibility. Thanks.
Speaker 2
I mean, look, we can't really speak about any specific customers. As a reminder, you know, any individual customer can do whatever they want. They're the heroes of their own stories, and you know, we can't really speak for them. I would say we have strong product engagement from our top customers in general. We're working on making Datadog the very best platform for every company at any scale, including scale that has never been seen before in companies with high growth. I would say that's about it. When you look at the way we forecast the business, remember that we have, overall, an extremely high-retention product. For most customers, it's not rational to do it themselves, build their own solutions. We have many customers who did turn to building it themselves and who came back afterwards, and we named one on the call today.
We feel confident about the way we forecast the business and the mid to long term there. Of course, as we renegotiate with customers, as they increase volume, what typically happens is we see short-term drops and long-term growth in the revenue that's associated with them. That's the way we've always implemented.
Thank you so much. I did have a follow-up on security. It sounds great to hear about the milestones, $100 million growing 40%. Thinking about the product set, how are you thinking about expanding the capabilities from here? Are you focused on more organic, inorganic, and maybe an update to your M&A philosophy? I guess the question here is, are you willing to go much bigger to supplement your security strategy? Thank you.
I mean, look, we're looking at a number of different things with security. There's a lot of companies out there. There's a lot of product areas we cover already, and a lot more product areas we can cover. It's also a space where you need to cover a lot of the, we call them, boring must-have table stakes features on one end, but also there's quite a bit of investment in the future with the way the whole field is being disrupted with AI. There's quite a bit of work to be done there. You should expect us to do more M&A around that, as we do in the rest of the business, as there are a lot of assets out there and a lot of opportunities to grow.
Thank you so much.
Speaker 3
Thank you. Our next question comes from Karl Keirstead of UBS. Your line is open.
Okay, great. Thanks. Maybe I'll direct this to David and link the AI-native exposure to margins. So David, now that the AI natives are 11% of Datadog's revenue mix, I think it's fair to ask whether the revenues from that cohort are coming at similar margins as the rest of the business, or do you think that this could be even short-term a modest source of margin pressure? Thank you.
Speaker 1
I would say, like we talked about last quarter, this isn't about the AI cohort versus non-AI cohorts when it comes to margin. We price based on volume and on term. To the extent you would have an AI customer who's doing much the same things as our other customers in the use of the product, has similar volumes and similar terms to the non-AI, it would be similar margins. To the extent that we have a larger customer in there, given our price grids, that customer would get a better discount. That's the way we've always priced. It really is related to customer size rather than AI-native or non-AI-native.
Speaker 2
Yeah, I will double down on that piece with a bit of an infomercial. We did see last quarter, as you mentioned, gross margins going down a little bit further than we would like them to. What happened is we tasked our engineering teams with optimizing the cloud usage, which goes across all of our customer base. We turned to our own product. We turned to our Cloud Cost Management product and our profiling product largely. In a matter of months, we really turned up substantial improvements, savings on our bills, and improvements in performance and efficiency of our systems while we're still shipping new features. That's something that we're working right now to bring to all of our customers so they can get the same effect and they can see their margins go up as well.
Got it. Maybe the natural follow-up there is, David, you mentioned that you're optimistic about gross margins in the second half. Is that because of what Olivier just mentioned, or are there some other drivers you have in mind?
Speaker 1
No, it's because of what Olivier mentioned. We said we were engaging in these efforts, and as we were successful in the quarter, we will be carrying that run rate forward, which wasn't fully in Q2, as well as using what Olivier mentioned, Cloud Cost Management and our projects, to find further opportunities going forward. It's really about the progress and pace of our cloud efficiency efforts, which have been successful, carrying forward.
Got it. Thank you both.
Speaker 3
Thank you. Our next question comes from Mike Cikos of Needham. Your line is open.
Hey guys, thanks for taking the question here. I just wanted to double back on the enterprise segment. This is for Olivier, but if I'm thinking about it, I know that we have the enterprise demonstrating the stable growth. Is it fair to assume, like is the analogy for enterprises who are more traditionally using CPU versus the AI-native customer cohort or growing investment in GPUs, is it analogous to like 15 years ago where we saw, hey, on-prem continues to see investment, but maybe more dollars are going towards cloud? Is that a fair analogy when we think about what sort of behavior is exhibited by these different customers and where Datadog is headed?
Speaker 2
I don't know if you can say it exactly this way because at the time, with on-prem versus cloud, they tended to be the same customers. Whereas today, the AI natives and the enterprises are different companies altogether. I think the main difference is the AI natives have businesses that are growing very, very fast and infrastructures that are growing very, very fast themselves. The enterprises are still going through a controlled migration from on-prem into the cloud, and the rate there is more limited by their bandwidth to undergo that migration as opposed to being driven by an explosion of traffic on the demand side for them.
If I look at our enterprise segment in general, we see great trends in terms of the bookings, in terms of new products attached, new customers, things that these customers are buying from us that are net new. We see that the usage growth is a bit more moderate than that at this point. I think that speaks to the bandwidth on their end just to move the workload and to go fast there. That relates in part to the fact that a lot of their attention is spent on figuring out what AI technologies they're going to adopt and how they're going to ship these AI applications into production. Overall, we see that rate as stable. We think this is healthy. We think we will see more growth from these enterprise customers as they actually get into production with the AI applications in the future.
Understood. Thank you for that. Congrats on the security milestone; I didn't want to leave that hanging. I don't know if we got commentary on it, but could we please get an update on Flex Logs? I know it was a shining star if I go back a quarter ago, just wanted to see how progress is tracking on the Flex Logs side of the house.
All of the big deals with enterprise customers now involve Flex Logs in some form. That's a story that resonates very well, especially when we have customers that want to migrate from legacy solutions for logs. There are a number of things that we're working on with them, in particular, making sure the migration is painless for them. There are a number of things that we're investing in on that side. Flex Logs is a big draw for them as it really changes the picture economically and the predictability of the observability cost for them, which is a major concern for data-intensive parts of observability such as logs.
Great. Thank you, guys.
Speaker 3
Thank you. Our next question comes from Jake Roberge of William Blair. Your line is open.
Yeah, thanks for taking the questions. There's obviously been a lot of talk about AI natives around the business. I know you've talked about the potential for optimization for several quarters, but we continue to see really strong growth in that segment. If you were to see optimization, when would you expect that to happen? As you get a wider swath of customers in that AI-native cohort, do you think you're at the place where you could actually digest an optimization by one or two of those customers?
Speaker 2
If I knew when it was going to happen, I would tell you. The nature of our customers is they grow, they have their own businesses to run, they have their own constraints. We're here to help them deliver their services, and that's what we work on every single day. Every now and then, there's a renegotiation, a renewal, on occasions for customers to figure out what they need to optimize and what they need to do for the future. We never know whether it's going to happen this quarter, next quarter, in three quarters, next year, never. That's really hard to tell.
Okay, that's helpful. Could you also talk about the uptake and feedback that you're getting for your own AI solutions like Bits AI, the new observability agents, and when you think those could really start layering into the model?
Yeah, the initial response to the AI agents is really, really positive. The Bits AI SRE Agent actually works surprisingly well. If you think of how far the technology has grown in a number of years, right now we're busy basically shipping it to as many customers as we can and enabling the customers with it. That's a big area of focus in the business as well. It was developed by a fairly small team, the actual product that we shipped, and now we're busy scaling that up as fast as we can so we can serve all those customers. That's the core focus of the business today. The initial response is very positive. We've had customers purchase an extra SKU for it pretty quickly in their trial, and we feel very good about it.
Very helpful. Thanks for taking the questions and congrats on the great results.
Speaker 3
Thank you. Our next question comes from Brent Thill of Jefferies. Your line is open.
Good morning, David. Just on the quota-carrying rep capacity, and I know you've been investing aggressively ahead of the curve, when you think about 2025, are you accelerating that count based on the great results you've seen? Are you digesting that count given those reps who are on board? Give us a sense and flavor of what that quota-carrying rep count looks like through the rest of the year, and if you can, shape how the year looks versus 2024.
Speaker 1
Yeah, what we're doing is we're executing the plan we entered the year with. We knew, I think we said that we had underinvested in go-to-market and looked at that with the white space, etc. I would say we're successfully executing that. The plan was a little more front-weighted given our appetite for taking advantage of that opportunity, but we're executing that. We will look towards the end of the year as we plan for next year on the metrics around that and try to calibrate how we look at that growth next year.
Okay. Olivier, I'm just curious, many CEOs are either holding headcount flat or down. We've seen Meta headcount down from two years ago, Microsoft headcount flat. Other CEOs are commenting, saying they're going to shrink headcount and 10x revenue. Do you believe you can become more efficient with fewer people, or do you think that the model you're seeing at other software companies doesn't apply to you?
Speaker 2
I mean, look, the spend is definitely shifting a little bit on the engineering side. As I said, we see more AI training, AI inference. That's definitely changing a bit of the balance between what you have humans do and what you offload to GPUs. That being said, we're still completely constrained by the amount of product we can put up there. There's a ton of opportunity in every single direction we look, whether that's on the AI automation, whether it's on the security side, whether that's in the new areas such as data observability or experimentation that we're going after. For us, there's a very strong ROI in the adds that we're making at the moment.
Great, thanks.
Speaker 3
Thank you. Our next question comes from Andrew Degasperi of BNP Paribas. Your line is open.
Thanks for taking my question. First, on the ramp-up in terms of sales capacity, would you say that's been broad-based in terms of the productivity across both international and domestic?
Speaker 1
As we talked about previously, we have a less developed international footprint, and our growth rate internationally is running higher. We have markets we've talked about before, like Brazil and India and parts of APJ and Middle East that we have opportunities to grow our footprint. We are executing in that way. We're doing it bottoms up, as always. We're looking at the accounts, we're looking at the TAM, and we're looking at how much we're covering it. That produces a result of a little more investment intensity internationally versus in North America. There are lots of opportunities in North America as well.
Thanks, that's helpful. On the enterprise side, given that some of these reps are obviously on the ground, should we expect a number of the attach rates in terms of the three or four more products per customer to sort of accelerate at this level? I know they've been ticking up about a point every quarter. Just wondering if that's something we should be seeing.
I think broadly we expect the trends that we've seen of landing with some of the core products in the pillars and then expanding to continue. As the platform has expanded, we've tended to land with more products. Those trends that we evidenced in the script, we expect to continue across the geographies.
Speaker 2
Keep in mind, a lot of the work we're doing in territory management and in comp planning for the sales teams is really to make sure that there's enough of an incentive to go and look for a new customer. We keep driving the number of new customers up as well. There's always this balance between directing the sales force at upselling existing customers or landing new customers.
Thank you.
Speaker 3
Thank you. Our next question comes from Patrick Colville of Scotiabank. Your line is open.
All right, thank you for squeezing me in. I just wanted to say before I ask my question, congrats on the S&P 500 index inclusion. I think that's a really nice milestone for you guys. The question we get consistently from investors is on competition. You referred to your views on competition kind of tangentially in other answers, but maybe more specifically, what are you seeing competitively in observability? The one we get asked about a lot is versus Grafana and Chronosphere.
Speaker 2
Yeah, I mean, look, there's always been competition in the field. As I like to say, when I first fundraised for Datadog, the phrase that was coming back to me every single time, with every single no I was getting from early VCs, was "crowded space." Throughout the life of the company, there have been not only the incumbents that are mostly still in the market now, but also a steady stream of new entrants that, year after year, have come into the market. There's always new companies, always new folks that are building new things in observability. I think it's very attractive for engineers to build that. I would know something about it. Generally speaking, the competitive landscape hasn't changed much in the past 10 to 15 years. It's about the same.
The way we win, and we will keep winning, is by offering an integrated platform that solves as many problems as possible for our customers end to end. We don't just focus on one part of the cloud. We don't just focus on one data store or one specific piece that our customers might want to use. We solve the whole problem for them end to end. In the long run, we win by being more innovative, by having an economic model that lets us invest more in R&D, develop more products, build the existing products into the future faster than anybody else can do, and cover more adjacencies faster than anybody else can do, so we can have the broadest platform. That's the reason we win. If you look at all of the companies you mentioned, none of them are in a position to do the same.
That's where we're going to end up in the end. I think that's the end of the call. That would be the last question. Just to close out, I want to thank our customers for working with us to bring all of those great new products to market. We had a lot on our plate this year. You've seen that at Dash. It was amazing, by the way, to see all these customers and meet with them at Dash and see the reception we got for all these new products. I want to thank them. I know we're working with many of them on how these products are going to be adopted and what's going to happen in Q3 and Q4. Again, thank you, and I will see you next quarter.
Speaker 3
This concludes today's conference call. Thank you for participating, and you may now disconnect.