Baidu - Earnings Call - Q1 2025
May 21, 2025
Transcript
Operator (participant)
Hello, and thank you for standing by for Baidu's first quarter 2025 earnings conference call. At this time, all participants are in a listen-only mode. After management's prepared remarks, there will be a question-and-answer session. Today's conference is being recorded. If you have any objections, you may disconnect at this time. I would now like to turn the meeting over to your host for today's conference, Juan Lin, Baidu's Director of Investor Relations.
Juan Lin (Director of Investor Relations)
Hello, everyone, and welcome to Baidu's first quarter 2025 earnings conference call. Baidu's earnings release was distributed earlier today, and you can find a copy on our website as well as on newswire services. On the call today, we have Robin Li, our co-founder and CEO, Julius Rong Luo, our EVP in charge of Baidu Mobile Ecosystem Group, MEG, Dou Shen, our EVP in charge of Baidu AI Cloud Group, ACG, and Jackson Junjie He, our Interim CFO. After our prepared remarks, we will hold a Q&A session. Please note that the discussion today will contain forward-looking statements made under the safe harbor provisions of the U.S. Private Securities Litigation Reform Act of 1995. Forward-looking statements are subject to risks and uncertainties that may cause actual results to differ materially from our current expectations.
For detailed discussions of these risks and uncertainties, please refer to our latest annual report and other filings with the SEC and Hong Kong Stock Exchange. Baidu does not undertake any obligation to update any forward-looking statements except as required under applicable law. Our earnings press release and this call include discussions of certain unaudited non-GAAP financial measures. Our press release contains a reconciliation of the unaudited non-GAAP measures to the unaudited most directly comparable GAAP measures and is available on our IR website at ir.baidu.com. As a reminder, this conference is being recorded. In addition, a webcast of this conference call will be available on Baidu's IR website. I will now turn the call over to our CEO, Robin.
Robin Li (Co-Founder and CEO)
Hello, everyone. We kicked off 2025 with a solid start. In the first quarter, Baidu Core's total revenue reached CNY 25.5 billion, representing a 7% year-over-year increase. The growth was primarily attributable to the robust performance of our AI Cloud business. In Q1, AI Cloud revenue reached CNY 6.7 billion, up 42% year-over-year, representing a significant acceleration for our Cloud business. Such performance reinforces the widespread market recognition of our distinctive AI capabilities, underpinned by our unique four-layer AI architecture, while affirming the ongoing demand for our full-stack, end-to-end AI products and solutions. Notably, AI Cloud accounted for 26% of Baidu Core revenue, up from 20% a year ago, reflecting the growing significance of our AI Cloud business within our business portfolio. Throughout the first quarter, amid rapid evolution across the AI landscape, advancing our AI capabilities remained our core priority.
We have accelerated the iteration of our foundation models, allowing us to maintain our leading position as one of the top players in this dynamic field. In March, we released ERNIE 4.5 and ERNIE X1. ERNIE 4.5 is our first flagship model with multimodal capabilities, and it excels at understanding, analyzing, and processing multimodal content precisely. ERNIE X1, our first reasoning model, brings advanced reasoning capabilities with best-in-class function calling, tackling complex problems with extended chains of thought. Notably, both ERNIE 4.5 and ERNIE X1 come with highly competitive pricing. Furthermore, in April at Baidu Create 2025, we unveiled their upgraded versions, ERNIE 4.5 Turbo and ERNIE X1 Turbo, which feature enhanced performance and dramatically lower pricing, making them among the most cost-effective options on the market. Our rapid and continuous cost reductions stem from our unique four-layer AI architecture and full-stack capabilities.
This distinctive architecture enables end-to-end optimization at every layer, spanning infrastructure, framework, models, and applications, allowing us to holistically enhance both performance and efficiency. As a result, we deliver superior performance and stability at highly competitive pricing, positioning us to offer industry-leading foundation models and AI solutions with exceptional price-performance ratios. With stronger capabilities and lower pricing, foundation models are becoming increasingly accessible, enabling diverse applications at scale and unlocking significant value across industries. Beyond the model iterations, we are also taking steps to make AI more open and collaborative. As previously announced, we plan to open source our most advanced ERNIE 4.5 series of models on June 30th, a move that reflects both our technological confidence and our efforts to make ERNIE more accessible.
In parallel, we are proactively embracing open standards such as the Model Context Protocol, or MCP, which provides easier access to AI-powered tools and further lowers barriers to AI development. As development becomes simpler, we expect to see a growing number of AI applications emerging on our platform. Together, these efforts echo our consistent application-driven approach to innovation and our determination to make AI more accessible, applicable, and impactful. In our AI Cloud business, we are strengthening Qianfan, our industry-leading MaaS platform, to better support developers and enterprise clients in building models and facilitating AI applications. Qianfan boasts a comprehensive model library of foundation models, covering nearly all mainstream options on the market. It offers not only our own ERNIE family of models but also a wide range of open-source and third-party models, including the latest reasoning and multimodal models.
This breadth allows individual developers and enterprise clients to choose suitable models with greater flexibility. Importantly, Qianfan provides these models with industry-leading cost effectiveness. When running models like DeepSeek, Qianfan achieves what we believe to be some of the lowest inference costs in the industry today, with lightning speed and massive concurrency. Qianfan also delivers an expanding tool chain, continuously enriched to provide the most comprehensive and user-friendly toolkits for AI development. This quarter, complementing our existing app builder, model builder, and agent builder, we introduced a data builder to support AI data processing and preparation, while rolling out system-wide upgrades across the entire tool chain to further improve efficiency and ease of use. First, we enhanced the model builder to support the customized development of reasoning models by incorporating advanced training techniques, including reinforcement learning methods like RFT and GRPO.
Second, we extended our fine-tuning capabilities to multimodal models, offering multimodal reinforcement learning techniques and enabling full-process support from model building and training to evaluation and development. Third, as foundation models grow in size, model distillation has become essential for enterprise adoption. Hence, we introduced a one-click distillation feature that streamlines the previous multi-step process. With our expanded model library now covering reasoning models, enterprise clients can effortlessly build smaller models that maintain reasoning capabilities with reduced costs, making it easier to adopt advanced AI technology. Together, these enhancements significantly strengthen Qianfan's tool chain, lowering the barriers for AI adoption and enabling faster, more efficient innovation across diverse use cases. On our legacy consumer-facing product, Baidu Search, we accelerated its AI transformation with an unrelenting focus on enhancing user experience.
Our journey exemplifies how complex AI capabilities can be applied to create meaningful improvements that directly benefit our hundreds of millions of users. After exploring and validating for several quarters and with consistent positive user feedback, we established a relatively mature and scalable product framework for our GenAI-enabled search early this year. Building on this, we are determined to further accelerate the AI transformation of search. In April, about 35% of mobile search result pages contained AI-generated content, up from 22% in January. We are further enhancing the search experience by prioritizing multimodal content, including images, videos, agents, digital humans, and live streaming. We believe this is a more effective way to present search results, as it aligns with evolving user preferences and better addresses the growing complexity of search queries. The distribution of multimodal content has been rapidly increasing.
This trend reflects our continued progress in delivering a more intuitive and effective search experience. Also, the volume of content accessible within Baidu has continued to expand, particularly with the empowerment of foundation models. One example is AI-generated digital human videos, which have surged over 30-fold from the beginning of 2025 through April, in just a few months. The growing volume of content enriches what users can discover and provides access to a more expansive information landscape. Our efforts have led to consistent improvements in user experience. Users exposed to AI-generated search results find their search intent fulfilled more easily and quickly, indicating they get the desired information more efficiently. These users are also increasingly inclined to search for more varied questions or topics and have demonstrated higher retention over time. We are delighted to see that more users can enjoy these improvements.
In March, the MAU of Baidu App increased by 7% year-over-year, reaching 724 million. We firmly believe that agents and intelligent digital humans represent promising real-world applications of AI technology that will open up vast market opportunities ahead. Last quarter, I introduced the convergence of agents and intelligent digital humans, a powerful combination that brings together foundation models' capabilities and digital human technology. Today, they are already widely deployed throughout our mobile ecosystem, effectively supporting different scenarios across industries. At our recent Baidu Create 2025, I further introduced an upgraded version of intelligent digital human with hyper-realistic interactions, delivering natural conversation with vivid facial expressions and fluid human-like gestures. In the future, we believe they can match or even outperform humans in certain scenarios. We're preparing to launch and scale our next-generation hyper-realistic digital humans into production soon.
Now, turning to intelligent driving, which represents another compelling frontier of our AI applications in the physical world. As highlighted last quarter, Apollo Go, our autonomous ride-hailing service, has successfully validated its business model in the key operational region with highly complex transport conditions and cost-sensitive local passengers. It has achieved 100% fully driverless operations in mainland China. This gives us strong confidence to expand into international markets with higher pricing for ride-hailing service, where we aim to replicate and further optimize our proven approach. In Q1, we reached critical milestones in international expansion, with Apollo Go entering both Dubai and Abu Dhabi, aiming to provide safe, comfortable, and affordable autonomous ride-hailing services in these booming markets. In May, we began open-road validation testing in Dubai, and we expect to start testing in Abu Dhabi soon.
Meanwhile, we have also expanded our testing area in Hong Kong and obtained permission to conduct open-road testing with designated passengers in April. With over 1,000 fully driverless vehicles now deployed globally, we continue to solidify our position as the world's leading autonomous ride-hailing service provider. We are scaling up our services globally. Looking ahead, we will deepen our presence in existing markets while strategically entering new ones, capturing broader growth opportunities worldwide. Now, let me review the key highlights for each business for the first quarter. AI Cloud revenue reached CNY 6.7 billion in Q1, delivering a strong year-over-year increase of 42%, with non-GAAP operating profit remaining positive. GenAI and foundation model-related revenue recorded triple-digit year-over-year growth, as accelerating AI adoption across multiple sectors drove a notable increase in customer demand for our highly cost-effective AI Cloud services.
As mentioned earlier, we also upgraded our MaaS platform, Qianfan, with an expanded model library and more comprehensive toolkits, extending support for the training and fine-tuning of multimodal and reasoning models to further facilitate AI-native application development. On applications, you may recall that at Baidu World last October, we previewed MiaoDa, which delivers no-code capabilities. In this quarter, we officially launched MiaoDa, making it available to everyone, programmer or not. MiaoDa reflects our mission to democratize AI and empower more people outside the developer community to create innovative applications with natural language inputs. The growing market recognition of our AI expertise continues to drive strong customer growth. In Q1, we deepened our collaboration with existing clients while also expanding our customer base with new partnerships.
We worked with a wide range of leading enterprises, such as China Merchants Group and a top e-commerce company in China, further validating our position as the AI partner of choice. Our client pipeline remains healthy. We saw strong growth in the automotive sector and began expanding into emerging verticals, such as embodied artificial intelligence, where we recently entered into a strategic partnership with Beijing Humanoid Robot Innovation Center, the developer of the Tiangong Ultra Humanoid Robot. For our mobile ecosystem, we accelerated the AI transformation of search in Q1 while continuing to improve the efficiency of our monetization approaches. Agents continue to demonstrate enhanced efficiency as a monetization channel for our advertising business. In March, over 29,000 advertisers had daily ad spending through agents, with many demonstrating increased willingness to allocate more of their ad budget to our agents.
Adoption spans sectors like healthcare, education, lifestyle services, B2B, real estate, and business services, including legal services. In Q1, revenue generated by our agents for advertisers increased 30-fold year-over-year, accounting for 9% of Baidu Core's online marketing revenue. On the other hand, our industry-leading intelligent digital humans have proven their transformative value across business scenarios. For example, our digital humans serve as live streaming hosts for merchants on our platform. Over the past few quarters, tens of thousands of such digital humans have been live streaming on our platform every month, serving not just the merchants but also expanding into fields like legal services, healthcare, education, and more. Turning to intelligent driving, as just highlighted, Apollo Go made solid progress with its international expansion. Following our entry into Dubai and Abu Dhabi, our global footprint now spans 15 cities.
Backed by our validated business model and proven operational expertise, we aim to further broaden our presence across more cities globally. In terms of ride volume, we are seeing clear acceleration. Apollo Go provided approximately 1.4 million rides to the public in Q1, representing a robust year-over-year growth of 75%. As of May 2025, the cumulative rides provided to the public have exceeded 11 million. Meanwhile, we continue to scale up our service capabilities in cities where we have long been operating. Also, we are exploring asset-light business models as a key strategic direction for our future growth, and we have started to see early adoption in certain areas recently. In May, Apollo Go entered into a long-term strategic partnership with CAR Inc., China's leading auto rental service provider, to introduce fully autonomous vehicle rental services and explore new models for smart mobility together.
As our technology and operations mature at scale, we see significant opportunities and commercial sustainability across more use cases and regions. Combined with our regionally validated business model and global expansion efforts, we believe we are well-positioned to create substantial value and reshape the future of mobility in the coming years. Looking back at the quarter's developments, we are seeing encouraging progress in AI applications across the board, from enterprise services to consumer-facing products and intelligent mobility. AI technologies are beginning to generate tangible, meaningful value through applications, which embodies the ultimate goal of our application-driven innovation and aligns perfectly with our long-standing strategic focus on AI. With that, let me turn the call over to Jackson to go through the financial results.
Jackson Junjie He (Interim CFO)
Thank you, Robin. Now let me walk through the details of our first quarter financial results. Total revenues were CNY 32.5 billion, increasing 3% year-over-year.
Revenue from Baidu Core was CNY 25.5 billion, increasing 7% year-over-year. Baidu Core's online marketing revenue was CNY 16.0 billion, decreasing 6% year-over-year. Baidu Core's non-online marketing revenue was CNY 9.4 billion, up 40% year-over-year, mainly driven by the AI Cloud business. Within Baidu Core's non-online marketing revenue, AI Cloud revenue was CNY 6.7 billion, up 42% year-over-year, accounting for 26% of Baidu Core's revenue. Revenue from iQIYI was CNY 7.2 billion, decreasing 9% year-over-year. Cost of revenues was CNY 17.5 billion, increasing 14% year-over-year, primarily due to an increase in costs related to the AI Cloud business and traffic acquisition costs. Operating expenses were CNY 10.5 billion, decreasing 3% year-over-year, primarily due to a decrease in personnel-related expenses, partially offset by an increase in channel spending and promotional marketing expenses. Baidu Core's operating expenses were CNY 9.1 billion, decreasing 4% year-over-year.
Baidu Core's SG&A expenses were CNY 4.9 billion, increasing 10% year-over-year. SG&A accounted for 19% of Baidu Core's revenue in the quarter, basically flat from last year. Baidu Core's R&D expenses were CNY 4.2 billion, decreasing 16% year-over-year. R&D accounted for 16% of Baidu Core's revenue in the quarter, compared to 21% in the same period last year. Operating income was CNY 4.5 billion. Baidu Core's operating income was CNY 4.2 billion, and Baidu Core's operating margin was 16%. Non-GAAP operating income was CNY 5.3 billion. Non-GAAP Baidu Core operating income was CNY 4.9 billion, and non-GAAP Baidu Core operating margin was 19%. Total other income, net, was CNY 4.5 billion, increasing 260% year-over-year, mainly due to an increase in fair value gains and pick-up of earnings from long-term investments, partially offset by a decrease in net foreign exchange gain arising from exchange rate fluctuations between the Renminbi and the US dollar.
Income tax expense was CNY 1.2 billion, compared to CNY 883 million in the same period last year. Net income attributable to Baidu was CNY 7.7 billion, and diluted earnings per ADS was CNY 21.59. Net income attributable to Baidu Core was CNY 7.6 billion, and net margin for Baidu Core was 30%. Non-GAAP net income attributable to Baidu was CNY 6.5 billion. Non-GAAP diluted earnings per ADS was CNY 18.54. Non-GAAP net income attributable to Baidu Core was CNY 6.3 billion, and non-GAAP net margin for Baidu Core was 25%. As of March 31st, 2025, cash, cash equivalents, restricted cash, and short-term investments were CNY 142.0 billion, and cash, cash equivalents, restricted cash, and short-term investments excluding iQIYI were CNY 136.7 billion.
Free cash flow was negative CNY 8.9 billion, and free cash flow excluding iQIYI was negative CNY 9.2 billion, mainly due to an increase in investments in our AI business. We define net cash position as total cash, cash equivalents, restricted cash, short-term investments, net, long-term time deposits and held-to-maturity investments, and others, less total loans, convertible senior notes, and notes payable. As of March 31st, 2025, the net cash position for Baidu was CNY 159.0 billion. Baidu Core had approximately 31,000 employees as of March 31st, 2025. Finally, since last year, we have accelerated our share repurchase program. From the beginning of Q1 2025, we repurchased a total of $445 million of our shares, reflecting our long-standing commitment to delivering long-term value to shareholders. With that, operator, let's now open the call to questions.
Operator (participant)
Ladies and gentlemen, we will now begin the question and answer session.
If you wish to ask a question, please press star one on your telephone and wait for your name to be announced. If you wish to cancel your request, please press star two. Your first question comes from Alicia Yap with Citigroup. Please go ahead.
Alicia Yap (Managing Director and Senior Equity Analyst)
Thank you. Good evening, Robin, Jackson, Juan, and management. Thanks for taking my questions. I wanted to ask about the AI model. Given the rapid pace of the model iterations and also your upcoming open-source strategy, can management share the latest update on your AI overall strategy? What is the technology roadmap for ERNIE in 2025? Will Baidu continue iterating on the foundation model, such as the ERNIE 5.0, for example? Can you further reduce inference costs going forward? Thank you.
Robin Li (Co-Founder and CEO)
Hi, Alicia. This is Robin. Over the past few months, we've seen an accelerated iteration of foundation models.
No matter how fast the models advance, I think one thing is always clear. The true value of foundation models ultimately lies in applications built on the models. That is why we stick to an application-driven approach for innovation. The foundation model space is very broad, so we do not necessarily have to take an early lead in every possible direction. Instead, we strategically focus our model iteration efforts on areas with real application value where we can build the most competitive capabilities. For example, we make sure our model development matches what our products actually need. For over one year, we have been using foundation models to drive AI transformation across our mobile ecosystem, including search. These hands-on experiences have shown us which model capabilities bring real value and are worth prioritizing, like multimodality. We have also spotted some promising application areas. Take our digital humans, for instance.
By combining different model capabilities, we've created hyper-realistic digital humans that can even perform better than real humans in certain situations. We'll be rolling them out at scale soon, opening up their value in many new scenarios. On ERNIE's technology roadmap, we're set to continue the evolution of ERNIE. We're already working on the next generation of models, and we expect to further accelerate our pace of model iteration. Back to your question on inference costs, yes, we absolutely believe we can keep driving down costs. In fact, each new model we've launched recently has come with significant price cuts. As I mentioned before, we released ERNIE 4.5 and X1 in March. X1 matched DeepSeek R1's performance at only half the price. About a month later, we rolled out Turbo versions with better performance and even more aggressive pricing.
ERNIE 4.5 Turbo is priced 80% lower than ERNIE 4.5, and ERNIE X1 Turbo at half the price of ERNIE X1. These price cuts are supported by our full-stack AI capabilities to continuously lower inference costs, making our models among the most cost-effective options on the market today. We're also opening up our best capabilities to the broader community. We are on track to open-source the ERNIE 4.5 series on June 30th. We are excited to see how the market responds and look forward to more people exploring what ERNIE can do. Ultimately, we hope this helps more users experience the true value of our models and explore new real-world applications. Thank you.
Operator (participant)
Your next question comes from Lincoln Kong with Goldman Sachs. Please go ahead.
Lincoln Kong (Executive Director)
Thank you, management, for taking my questions. My question is about the cloud business.
We have seen very strong growth in the first quarter for cloud revenue. What are the key drivers for this strong growth? How should we think about sustainability here? Also, could management provide us some breakdown by category, for example, infrastructure, industry solutions, project-based services, as well as personal cloud? How should we think about the outlook for growth as well as profitability within 2025 for the cloud business? Lastly, with the recent tightening of U.S. export restrictions on advanced AI chips, what are the potential impacts on Baidu's cloud operations and growth plans? Thank you.
Dou Shen (EVP of AI Cloud Group)
This is Dou. Thank you for your question.
In Q1, our AI Cloud revenue growth further accelerated from 26% year-over-year in Q4 last year to 42% year-over-year, mainly driven by surging demand for GenAI and foundation models across industries for both training and inference. As foundation models have undergone faster iteration recently, we have seen a fast increase in model training needs, not just for large language models, but also for other types of models. Customers are increasingly choosing Baidu AI Cloud for our recognized leadership in AI infrastructure and our enhanced MaaS platform, Qianfan, which consistently lowers inference costs and improves toolchain efficiency. In terms of the revenue breakdown, Baidu AI Cloud primarily consists of two parts: personal cloud and enterprise cloud. Enterprise cloud contributes the vast majority of AI Cloud revenue and has consistently outgrown overall AI Cloud. Within the enterprise cloud, we have subscription-based and project-based revenue.
For subscription-based revenue, it currently accounts for the majority of enterprise cloud revenue, providing a sustainable revenue stream. Among subscription-based revenue, GenAI-related revenue has maintained triple-digit year-over-year growth for several consecutive quarters. Project-based revenue may fluctuate from time to time, but in the long run, we believe the proportion of subscription-based revenue will continue to rise, supporting more sustainable and healthier long-term growth for our cloud business. On the profit side, AI Cloud's non-GAAP operating margin continued to expand year-over-year in Q1, maintaining its upward trend. This was driven by an improved revenue mix towards higher-value offerings as we focus on opportunities that align with our strategic priorities. As a result, our AI Cloud's non-GAAP operating margin is now at the level of 10%.
Regarding AI chip export restrictions, as Robin just mentioned, we follow an application-driven approach because we believe the greatest value of AI eventually lands at the application layer. Even without access to the most advanced chips, our unique full-stack AI capabilities enable us to build strong applications and deliver meaningful value. Also, our AI infrastructure is both scalable and highly efficient, enabling strong GPU utilization to support both training and inference with high cost-performance. In parallel, we have the flexibility to select from a range of chip solutions based on different business scenarios, especially for inference. Looking forward, we believe that, over time, domestically developed self-sufficient chips, along with an increasingly efficient homegrown software stack, will jointly form a strong foundation for long-term innovation in China's AI ecosystem. Thank you.
Operator (participant)
Your next question comes from Alex Yao with JPMorgan. Please go ahead.
Alex Yao (Managing Director and Senior Equity Analyst)
Thank you, management, for taking my question. You accelerated the AI search transition this quarter. What's management's rationale behind the move? What are your expectations for AI answers penetration? Any updates on upcoming testing for AI monetization in Q2? How should we think about the ramp-up into the second half of this year in terms of monetization and consumer behavior? Thank you.
Robin Li (Co-Founder and CEO)
Thank you, Alex, for your question. That is an important question. In Q1, we significantly accelerated the AI transformation of search, aiming to further enhance the user experience through innovative technology. Our top priority remains the same: the user experience, as we believe that a high-quality user experience and continuous improvements in user metrics are critical for sustainable long-term growth.
In April, around 35% of mobile search result pages contained AI-generated content, up from 22% in January, marking our largest expansion so far. We expect this percentage to keep rising rapidly in Q2. There are a couple of reasons behind that. First, today's AI landscape is evolving very quickly, and users' information-seeking behaviors continue to diversify, so it is more necessary than ever to make rapid innovations in these capabilities. Second, ongoing progress in model capabilities has also helped us keep improving the quality, quantity, and presentation format of search results while generating multimodal content at massive scale. Meanwhile, the average cost per query will keep falling as inference costs drop, allowing us to roll this out across more queries more quickly.
Third, we have established a solid product framework that works well with our capabilities to incorporate multimodal content, which plays a very important role in this process and helps us provide much easier-to-digest answers that align better with user behavior. We have seen very clear signs of improvement in the user experience. Users exposed to AI-generated search results are finding information much more efficiently, exploring more types of queries, and showing stronger retention. Therefore, we are actively investing to accelerate the AI transformation of search. On monetization, however, it is still very early; we have just started to prepare for testing. Since our AI search differs significantly from traditional search, the corresponding monetization approach needs to be rebuilt and refined, and this takes some time. That said, we can see huge potential ahead.
Currently, only a small percentage of traditional search queries can be monetized, while the vast majority cannot. Over time, we anticipate AI search will greatly enhance our ability to monetize long-tail queries and previously untapped areas. We expect more queries to become monetizable compared to traditional search. Plus, AI search can likely create formats that are more flexible and native and fit naturally into the user experience. They can be less intrusive and potentially even enhance the overall experience. Looking ahead, we believe the long-term potential of these expanding monetization capabilities is quite promising, opening up possibilities that go beyond what traditional search can achieve. While our current move towards AI search will inevitably put notable near-term pressure on revenue and margin, we still believe this is the right path to follow for long-term growth. Thank you, Alex.
Operator (participant)
Your next question comes from Gary Yu with Morgan Stanley. Please go ahead.
Gary Yu (Equity Analyst)
Hi. Thank you, management. I have a question regarding robotaxi. Given the recent development in the robotaxi space, including some of the players announcing their new robotaxi vehicles and partnership with Uber, how do you view the evolving competitive landscape? What differentiates Baidu RT6 from other robotaxi vehicles? Also, are you exploring similar types of partnerships like your peers? Should we expect faster expansion of Apollo Go this year? Lastly, what scale and unit economics are you targeting and how to think about the long-term profitability potential? Thank you.
Robin Li (Co-Founder and CEO)
Gary, we've been investing in autonomous driving for over 12 years. Today, Apollo Go is among the first and the very few players globally to operate autonomous ride-hailing services at scale, making us both China's largest and a global leader in this space.
RT6 is the world's first and, as of today, the only purpose-built, mass-produced level four autonomous vehicle. It is designed from the ground up for fully driverless operations with self-developed hardware design, algorithms, and software, and featuring top safety redundancy. RT6 is now running at meaningful scale across multiple cities, and its unit cost is below $30,000, far better than anyone else on the planet. With these unique strengths and proven business model, Apollo Go has been making steady progress in global expansion, most recently entering Dubai and Abu Dhabi. Altogether, our global footprint covers 15 cities, as I mentioned during the prepared remarks. On expansion strategy, we are highly open and adaptive. We're ready to enter into any city worldwide. As long as regulations and conditions allow, we can enter quickly and scale efficiently. Our confidence comes from our long operating history, low-cost structure, and excellent safety record.
With over 1,000 fully driverless vehicles deployed, we are expecting to see faster growth in our global fleet size, geographic reach, and ride volumes this year and beyond. Meanwhile, we are proactively exploring new business models through partnerships, especially those that can scale fast and land quickly. We're in active discussions with various players, including ride-hailing platforms, fleet operators, and more. While serious partnerships usually take time to work through all the details, some of them are already taking shape, and we are happy to share progress that is meaningful and concrete. Our strategic partnership with CAR Inc. (Shenzhou Zuche), China's top auto rental service provider, is one example. I'm sure there's more to come. Looking into the longer term, we see a clear path to profitability as hardware and labor costs keep coming down and our growing operational scale brings more efficiencies.
Given our leading position in both technology and operations, we are confident Apollo Go will continue to lead the field. We expect Apollo Go to be a key driver of Baidu's long-term growth. Thank you.
Operator (participant)
Your next question comes from Miranda Zhuang with Bank of America Securities. Please go ahead.
Miranda Zhuang (Equity Analyst)
Thank you for taking my questions, and congrats on the good results. My question is about competition. We see other AI applications ramping up users with enhanced models and offering more advanced functionalities like deep search or agents, and some are leveraging existing high-traffic super-app platforms. Given this, how will Baidu compete with other AI applications and platforms? Thank you.
Julius Rong Luo (EVP)
Thank you for your question. This is Julius. As I mentioned earlier, with the AI evolving very quickly, people now have different ways and choices to find information and enjoy the contents.
We are still early in that. If we did not take bold steps to renovate search, we would be challenged sooner or later. That is why we were among the first major tech companies in China, or globally, to use AI to transform a legacy core business. In recent months, you have seen us accelerate this transformation. As for AI chatbots specifically, they are certainly one innovative form of AI application. We also have our own chatbot product, ERNIE Bot, and we have also integrated conversational AI capabilities into the Baidu App. I would say AI chatbots represent one important exploration of AI applications, but they are not the ultimate form. Our goal is to find the most powerful AI applications that can create lasting value in the AI era. When we talk about competition, it is not just about competing with chatbots.
I think it's about rethinking the value of search in the AI era. From there, we have identified two directions we are currently focused on. First, we believe AI search should efficiently meet user needs while offering highly engaging experiences. Our AI search now looks very different from traditional search. We prioritize rich, impressive, multimodal content that is easier to understand and more natural to interact with. Second, we are re-imagining AI search to go beyond just finding information. Our ambition is to expand search to help users make decisions, provide solutions, and ultimately deliver results. To support this, a key strategic initiative for us is to make Baidu more capable through innovative approaches like agents. Our agents today can already help users handle complicated problems, support decision-making, and connect them with the right services behind the scenes.
Looking ahead, we aim to further enhance our service capabilities through agents. This will not only improve the user experience by meeting more needs, but also help our customers and partners connect much better with users than before. Besides, we are also making Baidu more open. This quarter, we proactively embraced MCP and third-party agents. On one hand, by connecting third-party tools and capabilities, we are enabling users, customers, and partners to do even more within our platform. On the other hand, we are also opening up our own capabilities through MCP, for example, our e-commerce MCP. By doing so, we are helping all parties in the ecosystem benefit from greater flexibility and unlock broader opportunities, both within and beyond our platform.
In the face of today's increasingly dynamic market, we believe our innovative approaches can expand both the depth and the breadth of what search can do and ultimately create values for everyone in the ecosystem. Thank you.
Operator (participant)
Your next question comes from Wei Xiong with UBS. Please go ahead.
Wei Xiong (Equity Research Analyst)
Hi. Good evening, management. Thank you for taking my question. I want to follow up on the cloud side. With accelerating enterprise AI adoption, how should we think about the enlarged TAM for China's cloud market? How is our cloud business differentiated from competitors in terms of strategy, technology, and customer base? Also, regarding the Qianfan platform, could you please maybe share more updates here? Which industries are seeing the fastest AI adoption, and where do we see the largest long-term potential? Thank you.
Dou Shen (EVP of AI Cloud Group)
Thank you for the question. I'll take it.
As AI adoption accelerates, the TAM for China's cloud market is expanding meaningfully, driven by changes across the stack. With foundation models driving up the need for massive computing power, the ability to build and manage large-scale GPU clusters and to utilize GPUs effectively has become a key competitive advantage. Meanwhile, as models evolve with different strengths and AI applications bring increasingly diverse needs, cloud platforms with broad model portfolios and full-stack capabilities are best positioned to offer the flexibility and scalability customers require. These industry shifts align perfectly with Baidu's positioning, making our unique AI capabilities increasingly valuable. Baidu is one of the very few cloud providers globally with end-to-end, full-stack AI capabilities. We offer China's most efficient AI cloud infrastructure, backed by strong GPU cluster management. This allows us to deliver high-performance, stable, and cost-effective AI services while continually improving training and inference efficiency.
Qianfan, our MaaS platform, is among the most advanced on the market. It offers a comprehensive model library covering both our ERNIE series and nearly all mainstream third-party and open-source models. This quarter, we further enhanced Qianfan's toolchain with extended support for training, fine-tuning, and distillation, especially for the newly added multimodal and reasoning models. Strategically, we remain firmly committed to an application-driven approach. As the AI landscape evolves, this approach positions us well to deliver differentiated value. We offer full-stack, end-to-end AI products and solutions tailored to the needs of specific industries and scenarios, bringing AI into every aspect of their operations. We help enterprises improve efficiency, reduce cost, and eventually transform their entire workflows. With these strengths, we have captured the opportunity presented by the industry shift and repositioned ourselves as China's top-tier cloud provider in the AI era.
The increasing market recognition of our AI expertise continues to drive strong momentum in our client pipeline as we build and deepen partnerships with leading enterprises. We are also increasingly becoming the preferred choice for mid-tier businesses. As for enterprise adoption, we are seeing rising interest across sectors. Early movers like Internet, tech, and online education companies are adopting AI quickly, while others such as automotive, financial services, utilities, and the public sector are also actively exploring with openness. As technology advances and costs decline, we're ready to lead the next wave of enterprise AI adoption and translate innovation into real-world value. Thank you.
Operator (participant)
Your next question comes from Thomas Chong with Jefferies. Please go ahead.
Thomas Chong (Managing Director)
Hi. Good evening. Thanks, management, for taking my question. Can management elaborate about the overall capital allocation plan and the key priorities for 2025? Thank you.
Jackson Junjie He (Interim CFO)
Hi, Thomas. This is Jackson.
I will take your question. We are firmly committed to our investments in AI, as we believe this will generate meaningful returns over the long term. When quantifying such investments, please note our AI investments are reflected not only in CapEx but also in operating cash outflows. Taking these two together, our total AI investments in 2024 significantly increased from 2023, reflecting our conviction in the huge future growth potential. In 2025, we plan to continue to increase our AI investments to further solidify our AI foundation and prepare for future growth. Let me walk through the key areas that we are now focusing on. First, in the AI Cloud business, we are investing in AI infra to meet rising market demand driven by accelerating AI adoption.
This is prompting us to quickly scale up our AI infra to capture emerging opportunities. As you've seen, our AI Cloud revenue grew 42% year-over-year this quarter with non-GAAP operating margin in the teens. Secondly, we continue to invest in advancing our ERNIE models. Ongoing technical progress has enabled us to train better models more efficiently and at a lower cost. As a result, we brought a series of high-quality ERNIE models to market in quick succession. Also, we are investing in autonomous driving technology. Last quarter, Apollo Go validated its business model in its large operational area in China. This milestone has motivated us to move faster in expansion, including entering the Middle East market while ride volume continues to accelerate across our existing operations. We will continue this momentum going forward. Lastly, we continue to invest in AI search transformation with a focus on user experience.
As Julius said, as we step up this investment, and while monetization for AI search results remains at an early stage, we do expect some margin pressure in the near term. We see this as a necessary step to unlock long-term growth. Another key focus of our capital allocation is shareholder returns. As I said earlier, since last year, we have significantly accelerated our share repurchase program. We further repurchased $445 million of our shares from the beginning of Q1 this year. These are our strongest repurchase efforts in the past three years. We expect to keep a similar pace this year, reflecting our long-standing commitment to shareholders and our confidence in our long-term growth. Thank you.
Operator (participant)
Ladies and gentlemen, that does conclude our conference for today. Thank you for participating. You may now all disconnect.