Microsoft just committed another $4 billion to build a second Wisconsin data center, doubling down on AI infrastructure as OpenAI's ChatGPT hits 700 million users. The announcement comes as cloud giants race to secure compute capacity for the AI boom, with the first facility, housing hundreds of thousands of Nvidia Blackwell chips, set to go live in early 2026.
Microsoft is going all-in on AI infrastructure with a massive $4 billion commitment for a second Wisconsin data center, CEO Satya Nadella announced Thursday from Racine. The move signals just how desperate cloud providers have become to secure compute capacity as AI demand explodes across enterprise customers.

The first Wisconsin facility, already under construction with a $3.3 billion price tag, will come online in early 2026 packed with hundreds of thousands of Nvidia Blackwell GB200 chips, the latest generation designed specifically for large language model training and inference. Brad Smith, Microsoft's president and vice chair, told reporters the facility will deliver "10x the performance of the world's fastest supercomputer today," enabling AI workloads "at a level never before seen."

The timing isn't coincidental. OpenAI's ChatGPT now serves over 700 million users, all running on Microsoft's Azure cloud infrastructure through their exclusive partnership. That's created a compute bottleneck that's reverberating across the entire tech industry. Enterprise software giants from Adobe to Salesforce are racing to embed AI features into their products, creating unprecedented demand for GPU-powered data centers.

The Wisconsin expansion sits on land originally earmarked for a Foxconn manufacturing plant that never materialized, a fitting metaphor for how quickly AI has reshaped tech infrastructure priorities. Microsoft's chosen location will consume up to 2.8 million gallons of water annually, a fraction of the 7 million gallons per day Foxconn was permitted to use.

Energy requirements tell the real story of AI's appetite. The two Wisconsin centers combined will need more than 900 megawatts of power, with Microsoft building a 250-megawatt solar farm 150 miles away to offset the load.
"We're pre-paying for the energy and electrical infrastructure we'll use - ensuring prices remain stable and protecting consumers from future cost increases," Smith wrote in a company blog post.

The Wisconsin announcement is just the latest in Microsoft's global infrastructure blitz. Earlier this week, Smith revealed the company has allocated $15.5 billion for additional UK data center capacity through 2028. Separately, an Amsterdam-based cloud provider announced Microsoft agreed to spend up to $19.4 billion over five years to rent AI data center capacity, suggesting the company is hedging its bets on both owned and leased infrastructure.

The numbers reveal how AI has fundamentally changed the economics of cloud computing. Traditional data centers focused on storage and basic compute. These new facilities are essentially GPU supercomputers designed for neural network training and inference at unprecedented scale.

The second Wisconsin center will match the first facility's scale and enter operation in 2027 or later, Smith confirmed. "We did pause to think through exactly what we would build for phase two, how we would build it," he said, hinting at potential architectural improvements as chip technology evolves.

For Nvidia, Microsoft's commitment represents massive locked-in demand for its Blackwell architecture. The chip giant has been struggling to meet AI demand, with hyperscale customers like Microsoft competing for limited GPU supplies. Industry analysts estimate the Wisconsin facilities could generate over $500 million in annual revenue for Nvidia once fully operational.

The broader implications extend beyond Microsoft's immediate AI needs. Enterprise customers are watching closely as OpenAI prepares to launch more powerful models requiring even more compute resources. Microsoft's infrastructure investments signal confidence that AI demand will continue growing exponentially, even as some competitors question the sustainability of current AI spending levels.