Nvidia shares climbed 2% Wednesday after CEO Jensen Huang declared AI computing demand has jumped "substantially" over the past six months, driven by advanced reasoning models that require exponentially more computing power. The comments signal the chip giant is riding an unprecedented wave of enterprise AI adoption that's reshaping the entire tech landscape.
Nvidia just delivered the kind of update that sends shockwaves through Silicon Valley. CEO Jensen Huang's appearance on CNBC's "Squawk Box" Wednesday morning wasn't just another earnings preview - it was a real-time glimpse into an AI boom that's accelerating faster than anyone predicted. "This year, particularly the last six months, demand of computing has gone up substantially," Huang told CNBC, and investors immediately took notice, pushing shares up 2% and lifting the broader Nasdaq.
The timing couldn't be more critical. While competitors scramble to match Nvidia's AI chip dominance, Huang is describing a market that's entered what he calls a "double exponential" phase. AI reasoning models aren't just getting smarter - they're demanding exponentially more computing power while simultaneously seeing exponential growth in actual usage. "The AIs are smart enough that everybody wants to use it," Huang explained, painting a picture of enterprise adoption that's outpacing even the most optimistic forecasts.
Huang's confidence shows most clearly when discussing Blackwell, Nvidia's flagship GPU architecture. "Demand for Blackwell is really, really high," he said, describing what sounds like a supply crunch that would make any CEO nervous - except when you're sitting on the most coveted chips in tech. The CEO framed this moment as "the beginning of a new buildout, beginning of a new industrial revolution," language that suggests we're still in the early innings of AI infrastructure spending.
The numbers backing up Huang's optimism are staggering. Last month, Nvidia announced a $100 billion investment in OpenAI's massive data center expansion. OpenAI plans to build 10 gigawatts worth of data centers - equivalent to powering 8 million US households, or matching New York City's peak summer electricity demand. These aren't just big numbers; they represent a fundamental shift in how we think about computing infrastructure.
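The household comparison checks out with some back-of-the-envelope arithmetic. A minimal sketch (the EIA's ~10,500 kWh/year figure for average US residential consumption is an assumption not stated in the article):

```python
# Sanity check: does 10 GW roughly match the average continuous
# draw of 8 million US households?
total_power_w = 10e9       # 10 gigawatts, as stated in the article
households = 8_000_000     # 8 million households, as stated

per_household_w = total_power_w / households
print(per_household_w)     # 1250.0 W allocated per household

# Assumption: EIA puts average US residential consumption near
# 10,500 kWh/year, i.e. roughly 1.2 kW of continuous draw.
avg_household_w = 10_500 / (365 * 24) * 1000
print(round(avg_household_w))  # ~1199 W
```

At roughly 1.25 kW per household versus an average draw of about 1.2 kW, the "8 million households" framing is a reasonable approximation rather than marketing hyperbole.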
But Huang's most revealing comments came when discussing the AI race's biggest constraint: power. When asked about global competition, the CEO delivered a sobering assessment - the US is "not far ahead" of China, and Beijing is moving much faster on the energy infrastructure that AI demands. "China is way ahead on energy," Huang admitted, highlighting a competitive gap that goes beyond chip design to basic industrial capacity.