Question · Q3 2025
Zhao Yili asked what factors beyond technology are driving the faster expansion of Pony AI's operational areas, and whether large language models (LLMs) are being used to advance L4 autonomy.
Answer
Tiancheng Lou, CTO, explained that Pony AI's L4-native technology architecture is built for generalization, enabling rapid expansion into new areas such as Shanghai Pudong and Shenzhen Nanshan within weeks and without additional model training, because corner cases are largely consistent across regions. He noted that the pace of expansion is limited primarily by the number of robotaxi vehicles needed to maintain fleet density. On LLMs, Lou said they are not suitable for the onboard L4 driving model, which has non-negotiable requirements for uncompromised safety (LLMs suffer from 'model health' issues) and low latency; he added that LLMs are trained on human data, which can introduce errors. However, Pony AI uses LLMs extensively in R&D for AI-enhanced human-machine interaction, engineering productivity tools, and rider feedback analysis.