    Pierre Ferragu

    Research Analyst at New Street Research

    Pierre Ferragu is Managing Partner and Head of Global Technology Infrastructure Research at New Street Research, where his coverage spans technology companies such as Intel, Broadcom, AMD, and Palo Alto Networks. Over his career, he has consistently earned top analyst rankings, including multiple #1 designations in Institutional Investor, Extel, Thomson Reuters, and Greenwich Associates surveys, reflecting a strong track record of influential investment calls. He began his career as a principal at Boston Consulting Group, then spent more than a decade at Bernstein as a leading analyst covering Telecom Equipment, Data Networking, Cybersecurity, and Semiconductors before joining New Street Research. Ferragu holds degrees in Telecom and Computer Sciences from CentraleSupélec and in Sociology from Sciences Po Paris; he is widely respected in the investor community, though U.S. professional credentials such as FINRA registrations are not publicly listed.

    Pierre Ferragu's questions to Tesla (TSLA) leadership

    Pierre Ferragu's questions to Tesla (TSLA) leadership • Q1 2025

    Question

    Pierre Ferragu asked why the Model 3 and Model Y have not captured a larger share of their addressable market, similar to how the iPhone dominated smartphones, given their product superiority.

    Answer

    CEO Elon Musk responded by comparing the current automotive landscape to the pre-smartphone era of flip phones. He argued that the future is not about competing for market share in traditional cars, but about the transition to autonomy, predicting that non-autonomous gasoline cars will become a rarity, much like riding a horse or using a flip phone.

    Pierre Ferragu's questions to Tesla (TSLA) leadership • Q4 2024

    Question

    Pierre Ferragu of New Street Research asked whether the upcoming robotaxi service in Austin would be open to the public, and whether Tesla is developing an 'eyes-off' FSD mode that would let drivers attend to personal tasks such as checking email.

    Answer

    CEO Elon Musk clarified that the initial robotaxi launch will be for Tesla's internal fleet to ensure reliability, with public participation likely next year. He acknowledged the demand for an 'eyes-off' mode and the current 'perverse situation' where drivers disengage FSD to text. He stated that while the capability is not far off, likely a matter of 'low single-digit months,' the company must first prove unequivocal safety to itself and regulators.

    Pierre Ferragu's questions to Tesla (TSLA) leadership • Q3 2024

    Question

    Pierre Ferragu asked about the application of Tesla's expanding AI compute power and the operational plan for the ride-hailing network launch in Texas and California.

    Answer

    CEO Elon Musk explained that real-world AI for vehicles differs from LLMs: it requires heavy training to compensate for the limited inference compute available in the car. Head of Autopilot Software Ashok Elluswamy added that a key bottleneck is the time needed to validate model improvements through real-world mileage. On the ride-hailing launch, an executive noted the company will follow each state's regulations on safety drivers, while Musk affirmed the goal of offering driverless paid rides sometime next year.

    Pierre Ferragu's questions to NVIDIA (NVDA) leadership

    Pierre Ferragu's questions to NVIDIA (NVDA) leadership • Q3 2025

    Question

    Pierre Ferragu from New Street Research asked for a high-level breakdown of compute allocation between pretraining, reinforcement learning, and inference for large AI models, and which area is seeing the most growth.

    Answer

    CEO Jensen Huang explained that today the compute workload is 'vastly in pretraining' because advanced post-training and inference-time scaling techniques are still new. He noted that the ultimate goal is to minimize inference cost by maximizing pretraining and post-training. He nevertheless expects all three areas to scale for the foreseeable future, driven by trends such as multimodality, which will demand more compute and require NVIDIA to keep improving performance while reducing costs.
