Nvidia isn't just winning the AI chip race anymore - it's lapping the field. At the company's GTC 2026 developer conference, CEO Jensen Huang revealed that Nvidia has secured $1 trillion in orders for its Blackwell and upcoming Vera Rubin chip architectures, with deliveries stretching through the end of 2027. According to CNBC's coverage of the keynote, Huang described demand for the company's latest technology as "booming" - an understatement given the staggering order book. The figure dwarfs anything the semiconductor industry has seen and signals that hyperscalers and enterprises aren't just betting on AI - they're going all in.
The trillion-dollar figure represents a watershed moment for the semiconductor industry. To put it in perspective, Nvidia's entire fiscal 2025 revenue totaled around $129 billion, meaning this pipeline alone equals roughly eight years of recent annual sales compressed into a two-year window. The orders span both the currently shipping Blackwell architecture and Vera Rubin, Nvidia's next-generation platform that's expected to push AI training and inference performance to new heights.
Hyperscalers are writing the biggest checks. Amazon Web Services, Microsoft Azure, and Google Cloud have been locked in an arms race to build out AI infrastructure, each announcing multi-billion-dollar data center expansions over the past year. Other major AI players have also been aggressive buyers, building massive GPU clusters to power AI research and scaling compute for autonomous driving development. The trillion-dollar order book suggests these companies aren't just preparing for current AI workloads - they're betting on exponential growth in model training, inference, and entirely new AI applications we haven't seen yet.