With tech giants preparing to spend $1 trillion on AI infrastructure over five years, a critical accounting question is reshaping investment strategies: how long before those expensive Nvidia GPUs become obsolete? The answer could make or break the economics of the biggest tech buildout in history, as companies grapple with depreciation schedules ranging from two to six years.
The biggest question in AI today isn't about artificial general intelligence or robot overlords. It's about spreadsheets and depreciation schedules: specifically, how long those hundreds of thousands of Nvidia GPUs will keep printing money before they become expensive paperweights.
CoreWeave just gave investors a glimpse into this high-stakes guessing game. CEO Michael Intrator told CNBC this week that his company's older A100 chips from 2020 are still fully booked, while H100s from 2022 are commanding 95% of their original rental prices. "All of the data points I'm getting are telling me that the infrastructure retains value," Intrator said, defending CoreWeave's six-year depreciation timeline.
But the market isn't buying it. CoreWeave shares plunged 16% after earnings, now down 57% from their June peak. The selloff reflects broader concerns about AI overspending that have also hammered Oracle, which has dropped 34% from its September highs despite aggressive data center expansion plans.
The depreciation debate cuts to the heart of AI economics. Microsoft hedges its bets with two-to-six-year equipment lifespans in SEC filings, while Google, Oracle, and other hyperscalers bank on six-year useful lives. But here's the wrinkle: there's virtually no historical data to support any of these assumptions.
"Is it three years, is it five, or is it seven?" asks Haim Zaltzman, who handles GPU financing deals at Latham & Watkins. "It's a huge difference in terms of how successful it is for financing purposes." Nvidia's first AI-focused data center chips only appeared around 2018, and the current boom started with ChatGPT's 2022 launch. Since then, Nvidia's data center revenue has exploded from $15 billion to $115 billion annually.
Short seller Michael Burry, famous for predicting the 2008 financial crisis, has emerged as the most vocal skeptic. After disclosing bets against Nvidia and Palantir, Burry suggested this week that Meta and other big AI spenders are "overstating the useful life of their AI chips and understating depreciation." He pegs actual server lifespans at two to three years, claiming companies are inflating earnings as a result.
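The mechanics behind Burry's claim are simple straight-line accounting. A rough sketch, using entirely hypothetical figures (the $10B capex and the specific lifespans are illustrative assumptions, not any company's reported numbers), shows how stretching the assumed useful life shrinks the annual expense hitting the income statement:

```python
# Illustrative only: straight-line depreciation of a hypothetical
# $10B GPU purchase under different assumed useful lives.
# All dollar amounts are made up for the sake of the example.

def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line method: equal expense in each year of useful life."""
    return cost / useful_life_years

cost = 10_000_000_000  # hypothetical $10B of GPU capex

for life_years in (2, 3, 6):
    expense = annual_depreciation(cost, life_years)
    print(f"{life_years}-year life: ${expense / 1e9:.2f}B expense per year")

# Doubling the assumed life from 3 to 6 years halves the yearly expense,
# lifting reported earnings even though the cash spent is identical.
```

Under these assumptions, a two-year life books $5B of expense a year while a six-year life books under $1.7B, which is why the gap between Burry's two-to-three-year estimate and the hyperscalers' six-year schedules translates directly into billions of dollars of reported profit.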