The AI industry just hit a wall nobody saw coming - but it's not about chips or code. OpenAI CEO Sam Altman and Microsoft CEO Satya Nadella openly admitted they can't predict how much electricity AI will actually need, creating a dangerous guessing game that could burn investors who bet wrong on energy infrastructure.
The confession came during a candid BG2 podcast appearance where both CEOs laid bare an uncomfortable truth - the AI boom has outpaced the energy grid's ability to keep up. Nadella's admission that Microsoft has ordered more GPUs than it can actually power reveals how badly the industry miscalculated.
"The biggest issue we are now having is not a compute glut, but it's a power and it's sort of the ability to get the [data center] builds done fast enough close to power," Nadella explained. The company literally has chips "sitting in inventory that I can't plug in" because there aren't enough "warm shells" - industry speak for data centers ready to go online.
This power crunch represents a fundamental shift in how tech companies operate. For over a decade, U.S. electricity demand remained flat, lulling utilities into thinking growth was dead. Then data centers exploded, with demand ramping up faster than anyone anticipated. The result? Companies are scrambling to secure behind-the-meter power deals that bypass the traditional grid entirely.
Altman sees even bigger problems ahead. "If a very cheap form of energy comes online soon at mass scale, then a lot of people are going to be extremely burned with existing contracts they've signed," he warned. The OpenAI chief has skin in the game too - he's invested in nuclear startups Oklo and Helion, plus solar storage company Exowatt.
The math behind Altman's concern is staggering. He estimates AI efficiency improves roughly 40x per year - "a very scary exponent from an infrastructure buildout standpoint." And if compute costs were to plummet 100x overnight, usage would surge far beyond that, producing swings in demand no one can predict.
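A back-of-the-envelope sketch makes the shape of that concern concrete. The multipliers below are illustrative assumptions, not figures from OpenAI or the podcast, beyond the "40x per year" and "100x" numbers Altman cited:

```python
# Minimal sketch of the arithmetic behind Altman's warning. The usage
# multiplier is a hypothetical assumption chosen for illustration.

# 1) Efficiency compounding: at ~40x improvement per year, the cost of a
#    fixed unit of AI work collapses exponentially.
YEARLY_GAIN = 40.0
for years in range(4):
    print(f"year {years}: relative cost = 1/{YEARLY_GAIN ** years:,.0f}")

# 2) Rebound effect: if a 100x cost drop triggers usage growth larger than
#    100x, total power demand still rises despite cheaper compute.
COST_DROP = 100.0         # compute becomes 100x cheaper
USAGE_MULTIPLIER = 300.0  # hypothetical demand response (>100x)
grid_load = USAGE_MULTIPLIER / COST_DROP  # relative to today's draw
print(f"grid load multiplier: {grid_load:.1f}x")  # 3.0x in this scenario
```

The second half is the classic Jevons-paradox dynamic: per-query costs fall, but total consumption grows, which is exactly why fixed-price energy contracts signed today could look badly mispriced later.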