Earnings summaries and quarterly performance for ADVANCED MICRO DEVICES.
Executive leadership at ADVANCED MICRO DEVICES.
Lisa Su
Chair, President and Chief Executive Officer
Ava Hahn
Senior Vice President, General Counsel and Corporate Secretary
Darren Grasby
Executive Vice President, Chief Sales Officer
Forrest Norrod
Executive Vice President and General Manager, Data Center Solutions Business Unit
Jack Huynh
Senior Vice President and General Manager, Computing and Graphics Business Group
Jean Hu
Executive Vice President, Chief Financial Officer and Treasurer
Mark Papermaster
Executive Vice President and Chief Technology Officer
Philip Guido
Executive Vice President, Chief Commercial Officer
Board of directors at ADVANCED MICRO DEVICES.
Research analysts who have asked questions during ADVANCED MICRO DEVICES earnings calls.
Aaron Rakers
Wells Fargo
7 questions for AMD
Joshua Buchalter
TD Cowen
7 questions for AMD
Timothy Arcuri
UBS
7 questions for AMD
Vivek Arya
Bank of America Corporation
7 questions for AMD
Ross Seymore
Deutsche Bank
6 questions for AMD
Thomas O’Malley
Barclays Capital
6 questions for AMD
Stacy Rasgon
Bernstein Research
5 questions for AMD
Joseph Moore
Morgan Stanley
4 questions for AMD
CJ Muse
Cantor Fitzgerald
3 questions for AMD
Harlan Sur
JPMorgan Chase & Co.
3 questions for AMD
Antoine Chkaiban
New Street Research
2 questions for AMD
Ben Reitzes
Melius Research LLC
2 questions for AMD
Joe Moore
Morgan Stanley
2 questions for AMD
Toshiya Hari
Goldman Sachs Group, Inc.
2 questions for AMD
Benjamin Reitzes
Melius Research
1 question for AMD
Blayne Curtis
Jefferies Financial Group
1 question for AMD
Christopher Muse
Cantor Fitzgerald
1 question for AMD
C J Muse
Cantor Fitzgerald
1 question for AMD
Harsh Kumar
Piper Sandler & Co.
1 question for AMD
Recent press releases and 8-K filings for AMD.
- AMD and Hunan KunlunMeta unveiled the GPT-Station AI Super Mobile Terminal at CES 2026, marking their first global mobile device collaboration.
- The terminal embeds an AMD Ryzen™ AI Max+ 395 chip alongside KunlunMeta’s multi-agent collaborative OS, enabling sustained professional-grade AI performance in a portable form factor.
- All data processing occurs locally—eliminating cloud dependency—and offers users a “portable team” of AI agents for tasks like market analysis, coding, and content creation.
- KunlunMeta’s underlying innovations, including the TransformerX algorithm and ScaleFusionMoS model, drive efficiency gains of up to 80% in memory usage and 78% in latency reduction, highlighting China’s shift toward full-stack AI rule-shaping.
- At CES, Lisa Su stated AI has not reduced hiring but shifted AMD’s recruiting toward AI-fluent candidates.
- The company uses AI to accelerate chip design, manufacturing, and testing workflows.
- AMD’s global workforce was about 28,000 employees as of December 2024.
- Wall Street holds a Strong Buy consensus on AMD with an average price target of $282.81, implying roughly 32.7% upside (the arithmetic is sketched after this list).
- Nvidia holds more than 90% share of the AI chip market, underscoring the competitive landscape AMD faces.
- AMD unveiled its new Ryzen AI Embedded P100 and X100 Series processors combining “Zen 5” CPU cores, an RDNA 3.5 GPU and an XDNA 2 NPU to enable low-power, low-latency AI at the edge for automotive, industrial and autonomous systems.
- The P100 Series (4–6 cores) delivers up to 50 TOPS of AI inference performance and an estimated 35% GPU rendering improvement, powering up to four 4K (or two 8K) displays at 120 fps for next-generation digital cockpits and HMIs.
- A unified, open-source software stack built on the Xen hypervisor supports CPU, GPU and NPU workloads—isolating multiple OS domains (Yocto, Ubuntu, FreeRTOS, Android/Windows) with ASIL-B capability for secure, parallel applications.
- P100 Series (4–6 cores) is sampling now with production shipments expected in Q2 2026; sampling for higher-core P100 variants and X100 Series (up to 16 cores) begins in H1 2026.
- AMD introduced Helios, a next-generation rack-scale AI platform integrating four MI455 GPUs, EPYC “Venice” CPUs, and Pensando networking per tray, delivering up to 2.9 exaflops of performance, 18,000 GPU compute units, 31 TB HBM4 memory, and ultra-low latency scale-out bandwidth per rack.
- The company launched the Ryzen AI 400 Series notebook processors featuring up to 12 Zen 5 CPU cores, 16 RDNA 3.5 GPU cores, and an XDNA 2 NPU delivering up to 60 TOPS for on-device AI workloads.
- AMD highlighted strong customer partnerships, noting 60% of Luma AI’s inference workloads run on AMD cards and previewing Liquid AI’s upcoming LFM 3 multimodal models optimized for AMD Ryzen AI platforms.
- Demonstrations included edge and physical AI use cases with Generative Bionics’ human-like robots powered by Ryzen AI Embedded and Versal AI Edge, and Blue Origin’s spaceflight computers based on Versal 2 for lunar lander navigation.
- AMD announced the Helios AI rack, a double-wide OCP design housing 72 MI455 GPUs (320 B transistors, 432 GB HBM4 each) and Venice CPUs, delivering up to 2.9 exaflops per rack; the per-GPU and per-rack HBM4 figures are reconciled in a sketch after this list.
- The MI455 GPU offers up to 10× inference performance over the prior generation and will ship later this year as part of Helios.
- AMD introduced the Ryzen AI 400 Series mobile processors (up to 12 Zen 5 CPU cores, 16 RDNA 3.5 GPU cores, and an XDNA 2 NPU delivering 60 TOPS) alongside the Ryzen AI Max platform for local AI development.
- The company outlined its MI500 series roadmap—built on CDNA 6 with 2 nm process and HBM4E—to achieve a 1,000× increase in AI performance by 2027.
- Showcased the Helios AI Rack delivering 2.9 Exaflops, 31 TB HBM4 memory and 43 TB/s scale-out bandwidth on a 2 nm/3 nm “Zen 6” CPU + MI455X GPU platform.
- Introduced the next-gen Instinct MI455X GPU with a 10× performance gain over MI355X and outlined a roadmap to >1000× AI performance improvement by 2027 via the MI500 series.
- Launched the AMD Ryzen™ AI 400 series CPUs featuring 60 AI TOPS, the AMD XDNA™ 2 NPU, up to 1.7× faster content creation, available starting Q1 2026.
- Emphasized an open AI ecosystem with partnerships spanning AWS, Google, Microsoft, Oracle, IBM, Dell, Lenovo and developer support via the ROCm™ platform.
- AMD rolled out high-performance AI data center solutions including the Helios rack powered by MI455 GPUs and Venice CPUs delivering up to 2.9 exaflops per rack.
- Announced AI PC platforms: Ryzen AI Max with 16 Zen 5 CPU cores, 40 RDNA 3.5 GPU CUs, XDNA 2 NPU (50 TOPS, up to 128 GB UMA), and the Ryzen AI Halo developer system supporting 200 billion-parameter models, shipping in Q2 2026.
- Highlighted strategic partnerships, including a 10× expansion with OpenAI for Luma multimodal workloads (60% inference on AMD cards) and collaboration with Illumina to accelerate genomics compute.
- Emphasized leadership in HPC and AI convergence by powering the world’s two fastest supercomputers and unveiling DOE Genesis program systems Lux (AI factory) and Discovery (2028 flagship) to advance scientific research.
- At CES 2026 AMD launched the Ryzen AI 400 Series (codename “Gorgon Point”), pairing Zen 5 CPU cores, RDNA 3.5 graphics and XDNA 2 NPUs for up to 50–60 TOPS on high-end SKUs.
- The lineup spans Ryzen AI 400/PRO 400 for Copilot+ PCs (including AMD’s first Copilot+ desktop CPU), Ryzen AI Max/Max+ mobile and small-form-factor SKUs, embedded P100/X100 for automotive/industrial use, and a Halo/Strix Halo refresh for developers and gamers.
- AMD previewed the Ryzen 7 9850X3D, an 8-core, 96 MB 3D V-Cache enthusiast part with peak boost increased from 5.2 GHz to 5.6 GHz at a 120 W TDP, due in Q1 2026.
- The company cites 1.3–1.8× multitasking and AI throughput gains versus select rivals, notes OEM adoption across over 250 distinct AI PC designs, and highlighted its MI455 data-center AI processors.
- AMD views AI as a multi-decade investment cycle, noting hyperscale customers are funding ever-higher data center CapEx through free cash flow and are now constrained by compute capacity on both GPUs and CPUs.
- AMD estimates its silicon-addressable data center TAM at >$1 trillion, with 75–80% captured by programmable GPUs and 20–25% by ASIC or custom silicon workloads; the implied dollar ranges are worked out in a sketch after this list.
- AMD entered a definitive agreement to supply OpenAI with six gigawatts of MI450 and next-generation accelerators, deploying the first gigawatt in 2H 2026 and ramping into 2027 under a performance-based warrant structure.
- Via its Helios rack reference design, AMD focuses solely on selling silicon (GPUs, CPUs, scale-up NICs) while OEM/ODM partners handle system assembly, aiming for a 55–58% long-term gross margin driven by premium CPU, GPU, and FPGA portfolios.
- AMD is navigating China export-control uncertainties for MI325 and MI308 products, excluding potential China revenue from Q4 guidance while applying for required licenses and assessing regional demand.
- AMD views AI as a multi-decade investment cycle, with hyperscalers funding increased data-center CapEx through free cash flow and now constrained by compute/infrastructure capacity.
- AMD targets a silicon addressable market of over $1 trillion for data-center GPUs, CPUs, and networking—excluding rack-level and infrastructure—with programmable GPUs expected to capture 75–80% and ASICs 20–25% of the accelerator TAM.
- AMD and OpenAI signed a definitive multi-year deal for 6 GW of MI450/MI455 deployments, starting with 1 GW in H2 2026 and ramping into 2027, including performance-based warrants tied to AMD revenue.
- AMD remains fabless, focusing on high-value silicon sales and licensing reference designs to OEM/ODM partners, while targeting long-term corporate gross margins of 55–58%.
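A quick check on the price-target bullet above: the roughly 32.7% upside follows from comparing the $282.81 average target with the share price at the time the summary was compiled. The minimal Python sketch below back-solves that implied price from the two quoted figures; the implied price itself is derived here for illustration, not taken from the source.

```python
# Back out the share price implied by the analyst price-target bullet.
# Only the $282.81 average target and the ~32.7% upside come from the
# summary above; the implied price is derived for illustration.

average_target = 282.81   # average analyst price target (USD)
stated_upside = 0.327     # ~32.7% upside quoted in the summary

implied_price = average_target / (1 + stated_upside)
recomputed_upside = average_target / implied_price - 1

print(f"Implied share price: ${implied_price:.2f}")
print(f"Recomputed upside:   {recomputed_upside:.1%}")
```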
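The Helios bullets quote both a per-GPU HBM4 capacity (432 GB on the MI455) and a per-rack total (31 TB across 72 GPUs). The sketch below simply multiplies the two source figures to confirm they line up.

```python
# Cross-check the Helios rack bullets: 72 MI455 GPUs at 432 GB of HBM4 each
# should land near the quoted ~31 TB of HBM4 per rack.

gpus_per_rack = 72        # double-wide OCP Helios rack (from the summary)
hbm4_per_gpu_gb = 432     # GB of HBM4 per MI455 (from the summary)

total_gb = gpus_per_rack * hbm4_per_gpu_gb
print(f"HBM4 per rack: {total_gb:,} GB ≈ {total_gb / 1000:.1f} TB "
      "(matches the quoted ~31 TB)")
```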
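The TAM bullets split a >$1 trillion silicon-addressable data-center market 75–80% to programmable GPUs and 20–25% to ASIC or custom silicon. The sketch below turns those stated shares into dollar ranges at the $1 trillion floor; the dollar figures are derived for illustration, since the source gives only the total and the percentage splits.

```python
# Translate the stated TAM split into dollar ranges at the >$1T floor.
# The total and the percentage shares come from the bullets above; the
# dollar ranges are derived from them for illustration only.

total_tam_billion = 1_000   # ">$1 trillion" expressed in billions of USD

shares = {
    "programmable GPUs": (0.75, 0.80),
    "ASIC / custom silicon": (0.20, 0.25),
}

for segment, (low, high) in shares.items():
    print(f"{segment}: ${total_tam_billion * low:,.0f}B – "
          f"${total_tam_billion * high:,.0f}B of the >$1T TAM")
```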
Quarterly earnings call transcripts for ADVANCED MICRO DEVICES.