Inside the Accounting Question Roiling Nvidia and Other AI Stocks—and Why It May Be ‘Overblown’

Nov 17, 2025 13:53:00 -0500 by Tae Kim | #Chips

An Nvidia A100 graphics processing unit. (Courtesy Nvidia)

Key Points

A debate has broken out among investors about the accounting assumptions that large technology companies are using for the Nvidia artificial-intelligence chips that power their data centers.

AI trade skeptics, including Michael Burry, whose prescient bet against subprime mortgages before the 2008-2009 financial crisis was depicted in the movie The Big Short, have publicly questioned whether technology companies are accurately reflecting the hardware's long-term economic value by using six-year depreciation schedules for graphics processing units.

Depreciation lets companies spread the cost of an asset across its useful life, rather than recording its entire cost as an expense in the year it was purchased. A longer period of depreciation means the share of the cost recorded in any one year is smaller, boosting earnings.
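To make the mechanics concrete, here is a minimal Python sketch of straight-line depreciation; the GPU price is a hypothetical figure chosen for illustration, not drawn from any company's filings. Stretching the same cost over six years instead of four cuts the annual expense by a third, which is the effect skeptics argue flatters reported earnings.

```python
# Illustrative straight-line depreciation. The $30,000 GPU price is a
# hypothetical figure for this example, not taken from any filing.

def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Spread an asset's cost evenly across its assumed useful life."""
    return cost / useful_life_years

gpu_cost = 30_000.0

for years in (4, 5, 6):
    expense = annual_depreciation(gpu_cost, years)
    print(f"{years}-year schedule: ${expense:,.0f} expensed per year")

# Output:
# 4-year schedule: $7,500 expensed per year
# 5-year schedule: $6,000 expensed per year
# 6-year schedule: $5,000 expensed per year
```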

Burry believes the useful lives of the chips are shorter than six years.

Bernstein analyst Stacy Rasgon disagrees. “The depreciation accounting of most major hyperscalers is reasonable,” he wrote in a report to clients Monday, noting GPUs can be profitable to owners for six years.

The analyst said even five-year-old Nvidia A100 GPUs can generate "comfortable" profit margins. Citing conversations with industry sources, he said GPUs can still function for six to seven years, or more.

“In a compute constrained world, there is still ample demand for running A100s,” he wrote, adding that according to industry analysts, the A100 capacity at GPU cloud vendors is nearly sold out.

Earlier this month, CoreWeave management said demand for older GPUs remains strong, noting that the company was able to re-book an expiring contract for the H100, a three-year-old chip, within 5% of its prior contract price.

Microsoft CEO Satya Nadella also shed light on why GPUs can have longer useful lives than skeptics assume. "You'll use [GPUs] for training and then you use it for data gen, you'll use it for inference in all sorts of ways," he said on an episode of the Dwarkesh Podcast published last week. Inference is the process of generating answers from already developed AI models. "It's not like it's going to be used only for one workload forever."

Write to Tae Kim at tae.kim@barrons.com