Question · Q4 2025
Brian Schwartz asked about the expected mix of AI inferencing and training workloads on the ServiceNow platform in 2026, specifically the breakdown between ServiceNow's own LLMs and third-party foundation models, given that such partnerships are complementary.
Answer
President, Chief Product Officer, and COO Amit Zavery said ServiceNow aims to give customers choice, with some use cases served by frontier models. He noted that inferencing is a small share of overall cost, with most value coming from context, data management, and workflow integration. Longer term, he expects frontier models to handle more inferencing, while ServiceNow's own LLMs remain in use for sovereign requirements and on-prem deployments.