AMD CEO Lisa Su just threw down the gauntlet against Nvidia with an audacious promise: 35% annual revenue growth over the next three to five years, driven by what she calls "insatiable" AI demand. Speaking at the company's first analyst day since 2022, Su outlined plans to capture double-digit market share in data center AI chips - a direct assault on Nvidia's 90% stranglehold on the $500 billion market.
AMD shares swung wildly Tuesday as CEO Lisa Su delivered what amounts to a declaration of war against Nvidia's AI chip empire. At the company's first analyst day in three years, Su laid out an aggressive roadmap to capture what she sees as an unprecedented opportunity in artificial intelligence infrastructure.
The numbers are staggering. Su projects AMD's overall revenue will grow 35% annually through 2029, with the AI data center business growing roughly 80% a year. That pace would take AI chip sales from about $5 billion in fiscal 2024 to tens of billions of dollars by 2027 - putting AMD squarely in contention with Nvidia.
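For a rough sense of the arithmetic behind that claim, the sketch below compounds the roughly $5 billion fiscal 2024 base at the projected 80% annual rate. The starting figure and growth rate come from Su's projections; the year-by-year outputs are illustrative back-of-the-envelope math, not AMD guidance.

```python
# Illustrative compounding of AMD's stated AI data center trajectory.
# Assumptions: ~$5B fiscal 2024 base and 80% annual growth, per Su's
# analyst-day projections; the per-year figures are a rough sketch.

base_revenue_b = 5.0   # fiscal 2024 AI chip revenue, $ billions
annual_growth = 0.80   # projected yearly growth rate

revenue = base_revenue_b
for year in range(2025, 2028):
    revenue *= 1 + annual_growth
    print(f"{year}: ~${revenue:.1f}B")

# Prints roughly $9.0B (2025), $16.2B (2026), $29.2B (2027) -
# consistent with "tens of billions by 2027" at that growth rate.
```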
"This is what we see as our potential given the customer traction, both with the announced customers, as well as customers that are currently working very closely with us," Su told analysts during the livestreamed event.
The timing isn't coincidental. Companies are pouring hundreds of billions into GPU infrastructure for AI applications like OpenAI's ChatGPT, but they're desperately seeking alternatives to Nvidia because of its near-monopoly pricing and persistent supply constraints. AMD is the only other major GPU developer with chips capable of handling enterprise-scale AI workloads.
That positioning paid off spectacularly in October, when AMD announced a multibillion-dollar partnership with OpenAI that begins with enough Instinct AI chips in 2026 to deliver 1 gigawatt of computing capacity. The deal could also give OpenAI a roughly 10% stake in AMD, cementing the relationship as both companies challenge Nvidia's dominance.
Su also highlighted new long-term contracts with Oracle and Meta, signaling growing enterprise confidence in AMD's AI roadmap. The partnerships represent a fundamental shift as hyperscalers diversify their chip suppliers to avoid single-vendor dependency.
The technical stakes are equally high. AMD's upcoming Instinct MI400X chips will support "rack-scale" systems in which 72 GPUs operate as a single unit - a capability essential for training the largest AI models. If successful, AMD would close the gap on Nvidia's three-generation head start in rack-scale deployments.
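To illustrate why a larger unified domain matters, here is a back-of-the-envelope sketch of how a model's weights spread across one scale-up domain. The 8-GPU server baseline, the 192 GB per-GPU memory figure, the 2 bytes per parameter, and the model sizes are hypothetical assumptions chosen for illustration - they are not MI400X specifications.

```python
# Back-of-the-envelope: weight memory per GPU when a model is sharded
# evenly across one scale-up domain. The 192 GB capacity, 2 bytes per
# parameter, and model sizes are hypothetical assumptions, not specs.

BYTES_PER_PARAM = 2    # e.g. 16-bit weights
GPU_MEMORY_GB = 192    # assumed HBM capacity per accelerator

def weights_per_gpu_gb(params_billions: float, gpus_in_domain: int) -> float:
    """GB of weight memory each GPU holds if weights shard evenly."""
    total_gb = params_billions * BYTES_PER_PARAM  # 1e9 params * bytes / 1e9
    return total_gb / gpus_in_domain

for params in (400, 1800):      # hypothetical model sizes, billions of params
    for domain in (8, 72):      # conventional server vs. rack-scale domain
        per_gpu = weights_per_gpu_gb(params, domain)
        fits = "fits" if per_gpu < GPU_MEMORY_GB else "does not fit"
        print(f"{params}B params over {domain} GPUs: "
              f"~{per_gpu:.0f} GB/GPU ({fits} in {GPU_MEMORY_GB} GB)")
```

The sketch ignores optimizer state, activations, and KV cache, but it captures the basic point: a 72-GPU coherent domain lets far larger models be sharded within a single unit than a conventional 8-GPU server allows.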