
Special Edition: Why AI's Future Depends More on Electricity Than GPUs
The limiting factor on compute is no longer chips; it is electricity. Data center REITs are booming, and AI infrastructure debt is stoking fears of an infrastructure bubble.
By Sriram Muthu for The Tech Buzz
For the past few years, the AI story has centered on Graphics Processing Units (GPUs). Companies like Nvidia and AMD have dominated the spotlight, topping the charts with quarterly earnings blowouts, impressive margins, and a supply frenzy around Nvidia's H100s and B200s. AMD's MI-series has emerged as a competitor, promising strong training and inference performance at lower cost. As both stocks climbed hundreds of percent and Nvidia recently crossed the $5 trillion market cap milestone, every earnings call has promised more GPUs, bigger clusters, and larger models. Capacity was the currency, and compute was the moat.
That narrative held for about three years, but under the surface another one has been taking shape. Chip scarcity dominated the early headlines, because model training and inference consume enormous resources. But what do those chips need? Energy. Model training is not just compute intensive; all that compute makes it energy intensive, and hyperscale clusters are now anchored to power availability.
Nvidia’s exponential revenue surge became a proxy for demand, but the reality of that demand sits with the data center operators, utilities, and financiers stitching together enough compute for AI projects on tight timelines and strained grids. They are the ones actually driving the race for AGI and powering the AI behind businesses and society. And AMD’s cheaper MI-series cannot substitute more efficient silicon for the electricity and other resources those projects require.
This shift reframes AI’s growth from a chip race to an infrastructure campaign. The capex is not just GPUs; it is commitments for energy, real estate, and debt. And as Big Tech locks in debt to keep building massive AI factories, bubble chatter follows close behind.
The limiting factor is no longer chips. It is electricity. Over the past year, hyperscale builds have slipped because sites could not secure power fast enough. CoreWeave, an increasingly prominent AI data center company, flagged a data center delay that hit its guidance even with strong customer demand: the chips were ready, but the power wasn’t. Big Tech, meanwhile, is pre-booking megawatts years into the future, striking deals with hydro and nuclear providers to lock in supply. Performance headlines can hide the real issue: if operators can’t energize on time, models launch later, costs rise, and returns get pushed out. The winners now are the operators who can deliver megawatts on schedule and at predictable prices.
Data center REITs (Real Estate Investment Trusts) can turn AI demand into leases and dividends, but only once power and cooling are live. When utilities slip, revenue follows. Investors have bet on these names on the promise of AI capacity, but the real test is energized megawatts and on-time tenant ramps. Recent quarters across the sector show much higher build costs (land, power gear, liquid cooling) and longer timelines, which pressures margins. If AI demand softens or projects miss schedules, the most leveraged players will feel it first.
The infrastructure push is being bankrolled with bonds, loans, and long‑dated power purchase agreements. Analysts now estimate over a trillion dollars aimed at AI infrastructure, much of it through debt. That’s fine until timelines slip: interest costs arrive monthly, while AI revenue often arrives later. We’ve already seen high‑profile investors trim or rotate away from the hottest AI equities, citing froth and timing worries. Their checklist is:
Are projects servicing debt before full utilization?
Are power contracts economical if prices spike?
Are customers locked in with take‑or‑pay terms?
Answering these questions helps determine whether the buildout is a bubble or not.
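The timing mismatch in the first checklist item can be made concrete with a back-of-the-envelope model. The sketch below is illustrative only: the project size, interest rate, revenue run rate, and ramp length are all hypothetical assumptions, not figures from any actual deal. It asks how many months a linearly ramping facility needs before cumulative revenue covers cumulative interest.

```python
# Hypothetical sketch: when does a ramping AI data center cover its debt service?
# All inputs below are illustrative assumptions, not actual project figures.

def months_to_cover_interest(principal, annual_rate, full_revenue, ramp_months):
    """Return the first month where cumulative revenue >= cumulative interest.

    Revenue ramps linearly from zero to full_revenue over ramp_months,
    while interest on the full principal accrues from month 1.
    """
    monthly_interest = principal * annual_rate / 12
    cum_revenue = cum_interest = 0.0
    for month in range(1, 241):  # cap the search at 20 years
        cum_interest += monthly_interest
        utilization = min(month / ramp_months, 1.0)
        cum_revenue += full_revenue * utilization
        if cum_revenue >= cum_interest:
            return month
    return None  # never covers interest within the horizon

# A hypothetical $2B project at 6% interest, $25M/month at full
# utilization, ramping over 18 months:
print(months_to_cover_interest(2_000_000_000, 0.06, 25_000_000, 18))  # -> 14
```

Stretching the same project's ramp from 18 to 36 months pushes break-even from month 14 to month 28, which is the basic reason schedule slips hit the most leveraged operators hardest.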
AI’s energy needs are colliding with climate goals and local permitting. Cities are asking tougher questions about grid stress and water use; utilities are juggling interconnect queues; operators are rolling out liquid‑cooled racks and on‑site generation to cut waste. Expect more deals near firm, low‑carbon power (hydro, nuclear, or renewables paired with storage) and more scrutiny of water and transmission impacts. For readers, this matters because policy can speed or stall builds, and sustainability choices influence operating costs and public perception. The more resilient operators will secure clean power and permits early, building for efficiency from day one.
Three years in, chips made the headlines, but electricity, land, and financing now set the pace. Nvidia and AMD’s surge proved demand, yet delays like CoreWeave’s show the choke point is energizing campuses on time. Data center REITs translate that demand into cash flows when power is live; otherwise, margins slip and leverage bites. The financing machine (bonds, loans, long‑dated power deals) works until timelines miss and interest costs outrun the revenue ramp. Meanwhile, sustainability and policy have become the silent governors of scale.
The takeaway is simple: the winners will not just buy more chips and hope for the best. They will seek out affordable power, de-risk interconnects, and bring capacity online on schedule. If AI demand arrives in line with power, this infrastructure era compounds; if it doesn’t, expect a painful repricing, hitting the most leveraged players first and then everyone else tied to them.

Instagram: https://www.instagram.com/thetechbuzz.ig/
Get the daily newsletter that helps you understand the tech ecosystem sent to your inbox.