With tech giants preparing to spend $1 trillion on AI infrastructure over five years, a critical accounting question is reshaping investment strategies: how long before those expensive Nvidia GPUs become obsolete? The answer could make or break the economics of the biggest tech buildout in history, as companies grapple with depreciation schedules ranging from two to six years.
The biggest question in AI today isn't about artificial general intelligence or robot overlords. It's about spreadsheets and depreciation schedules - specifically, how long those hundreds of thousands of Nvidia GPUs will keep printing money before they become expensive paperweights.
CoreWeave just gave investors a glimpse into this high-stakes guessing game. CEO Michael Intrator told CNBC this week that his company's older A100 chips from 2020 are still fully booked, while H100s from 2022 are commanding 95% of their original rental prices. "All of the data points I'm getting are telling me that the infrastructure retains value," Intrator said, defending CoreWeave's six-year depreciation timeline.
But the market isn't buying it. CoreWeave shares plunged 16% after earnings, now down 57% from their June peak. The selloff reflects broader concerns about AI overspending that have also hammered Oracle, which has dropped 34% from its September highs despite aggressive data center expansion plans.
The depreciation debate cuts to the heart of AI economics. Microsoft hedges its bets with two-to-six-year equipment lifespans in SEC filings, while Google, Oracle, and other hyperscalers bank on six-year useful lives. But here's the wrinkle: there's virtually no historical data to support any of these assumptions.
"Is it three years, is it five, or is it seven?" asks Haim Zaltzman, who handles GPU financing deals at Latham & Watkins. "It's a huge difference in terms of how successful it is for financing purposes." Nvidia's first AI-focused data center chips only appeared around 2018, and the current boom started with ChatGPT's 2022 launch. Since then, Nvidia's data center revenue has exploded from $15 billion to $115 billion annually.
Short seller Michael Burry, famous for predicting the 2008 financial crisis, has emerged as the most vocal skeptic. After disclosing bets against Nvidia and Palantir, Burry suggested this week that Meta, Microsoft, Google, and Amazon are "overstating the useful life of their AI chips and understating depreciation." He pegs actual server lifespans at two to three years, claiming companies are inflating earnings as a result.
Nvidia CEO Jensen Huang hasn't helped depreciation bulls with his trademark brutal honesty. When announcing the new Blackwell chip earlier this year, he joked about its predecessor: "When Blackwell starts shipping in volume, you couldn't give Hoppers away." He quickly added that there are circumstances where Hopper chips remain useful - "Not many."
The acceleration is real. Nvidia now releases new AI chips annually versus the previous two-year cycle, with competitor AMD following suit. This faster pace is already forcing adjustments - Amazon reduced server lifespans from six to five years after a study found "an increased pace of technology development, particularly in artificial intelligence and machine learning."
Microsoft CEO Satya Nadella revealed his company's strategic response during a recent interview: deliberately spacing out AI chip purchases to avoid getting "stuck with four or five years of depreciation on one generation." It's a telling admission that even the biggest players are hedging their GPU bets.
The stakes couldn't be higher for the investors and lenders financing these massive buildouts. Longer equipment lifespans mean companies can stretch depreciation over more years, protecting profit margins. Shorter lifespans accelerate those costs, potentially making entire AI strategies uneconomical.
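The arithmetic behind that tension is plain straight-line depreciation: the same equipment cost spread over more years means a smaller hit to earnings each year. A minimal sketch of the effect (the $10 billion fleet cost is illustrative, not from any company's filings):

```python
def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line depreciation: equal expense recognized each year."""
    return cost / useful_life_years

# Hypothetical $10B GPU fleet under the useful-life assumptions in the debate
fleet_cost = 10_000_000_000

for years in (2, 3, 6):
    expense = annual_depreciation(fleet_cost, years)
    print(f"{years}-year life: ${expense / 1e9:.2f}B depreciation per year")
```

Under these illustrative numbers, a six-year schedule books roughly $1.67 billion of expense per year, while Burry's two-to-three-year view implies $3.33 billion to $5 billion - the gap between those figures flows straight into reported profit.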
Dustin Madsen from the Society of Depreciation Professionals notes that these estimates must survive intense auditor scrutiny, requiring engineering data and historical analysis to support useful life claims. But with AI chips, that historical data simply doesn't exist yet.
What's emerging is a fundamental tension between technological progress and financial engineering. The faster Nvidia and its competitors innovate, the more pressure they put on existing equipment values. Each new chip generation doesn't just offer better performance - it threatens to render billions of dollars in installed infrastructure obsolete.
The $1 trillion GPU depreciation question isn't just an accounting exercise - it's the linchpin of AI's economic viability. While CoreWeave bets on six-year lifespans and Michael Burry warns of two-year obsolescence, the truth likely lies somewhere in between. What's certain is that Nvidia's accelerating innovation cycle is forcing every AI investor to confront an uncomfortable reality: in a world where next-generation chips arrive annually, yesterday's cutting-edge hardware becomes tomorrow's expensive legacy equipment faster than anyone anticipated.