Nvidia Is Now Misunderstood. Here’s What Mattered Most From Earnings.
Nov 20, 2025 15:16:00 -0500 by Tae Kim | #AI #Tech Trader

“There’s been a lot of talk about an AI bubble,” Nvidia CEO Jensen Huang said on Nvidia’s earnings call Wednesday. “From our vantage point, we see something very different.” (I-HWA CHENG/AFP/Getty Images)
Nvidia delivered the goods once again. The volatile stock reaction is almost beside the point.
Amid rising fears about the sustainability of the artificial-intelligence buildout, Nvidia reiterated that the growth fundamentals are real, while generating the numbers to prove it. More important, the results offer clues that the AI spending cycle has a way to go.
On the question on everyone’s mind, Nvidia CEO Jensen Huang was defiant. “There’s been a lot of talk about an AI bubble,” he said on the call. “From our vantage point, we see something very different.”
Nvidia’s revenue for the October quarter was up 62% year over year to $57 billion, handily ahead of expectations of $54.9 billion. Nvidia’s data-center business, primarily driven by AI chip demand, grew even faster, up 66% from last year to $51.2 billion.
Guidance was also robust. For the current quarter ending in January, Nvidia gave a revenue forecast range with a midpoint of $65 billion, well ahead of analysts’ consensus of $62.2 billion. During the company’s earnings call, Nvidia Chief Financial Officer Colette Kress said the guidance doesn’t assume any revenue from China, making the outlook even more impressive.
Nvidia, to its credit, is well aware of the doubts surrounding the AI trade. It addressed the concerns point by point on its earnings call last night, highlighted by Huang’s bubble pushback. Kress took on the argument that some of Nvidia’s customers were inflating profits by spreading out depreciation costs over a long period—a claim that Big Short hero Michael Burry amplified earlier this month.
Burry, who predicted the 2008-09 financial crisis, said that Big Tech firms are understating depreciation by extending the useful lives of AI hardware to six years, when they had previously depreciated networking and computing hardware over three to four years.
The longer schedule could inflate profits in the near term, he argued. But Kress offered evidence that AI hardware is different and deserves its six-year useful life.
“A100 GPUs we shipped six years ago are still running at full utilization today,” she said on the earnings call.
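To see what’s at stake in the depreciation argument, here is a minimal sketch in Python using hypothetical numbers; the $12 billion outlay and the two schedules are illustrative assumptions, not figures reported by Nvidia or its customers. Straight-line depreciation spreads the same cost over the assumed useful life, so a longer life means a smaller annual expense and a smaller near-term hit to reported profit.

    # Illustrative only: straight-line depreciation of a hypothetical $12 billion
    # AI-hardware purchase under two assumed useful lives.
    cost = 12_000_000_000  # hypothetical spending on GPUs and networking gear

    for useful_life_years in (3, 6):
        annual_expense = cost / useful_life_years  # straight-line: equal expense each year
        print(f"{useful_life_years}-year life: ${annual_expense / 1e9:.1f} billion of depreciation per year")

    # Prints roughly:
    #   3-year life: $4.0 billion of depreciation per year
    #   6-year life: $2.0 billion of depreciation per year

Over the hardware’s full life, the total expense is the same either way; the dispute is over how much of it lands in the early years, and whether the gear really stays useful that long.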
As to whether demand is slowing, Huang said: “Blackwell sales are off the charts, and cloud GPUs are sold out.”
Nvidia shares rallied immediately following the results, but gave up the gains amid a general market selloff during Thursday’s trading session. The stock was down 2.4% Thursday afternoon; it’s still up 35% for the year.
Day-to-day and week-to-week volatility is difficult to explain. During Nvidia’s 10X rally over the past three years, the stock has often sold off after earnings. This latest move could be profit-taking. Macro worries tied to Federal Reserve policy and the falling likelihood of a rate cut aren’t helping matters for Nvidia stock.
Frankly, I don’t know why the stock sold off on such an impressive quarterly report. I do know that after these results, Nvidia looks more attractively valued relative to its growth prospects. The stock trades at just 27 times forward earnings estimates. Advanced Micro Devices and Broadcom, Nvidia’s main competitors in AI chips, trade at 37 times and 42 times, respectively.
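For reference, that multiple is just the share price divided by the estimated earnings per share for the next 12 months. The sketch below uses made-up inputs chosen only to land near 27 times, not actual prices or consensus estimates.

    # Illustrative only: forward price/earnings is the share price divided by
    # the estimated earnings per share for the next 12 months.
    def forward_pe(share_price: float, estimated_eps: float) -> float:
        return share_price / estimated_eps

    # Hypothetical inputs, not real quotes or consensus figures:
    print(f"{forward_pe(share_price=180.0, estimated_eps=6.67):.0f}x")  # prints "27x"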
There are two positive signals about Nvidia and the AI trade more broadly that might be getting lost amid the stock reaction.
For the first time in nearly two years, Nvidia’s revenue growth accelerated, suggesting the current AI server product cycle is robust and has at least several quarters to go.
The chip maker is still poised to benefit from the largest product cycle in its history: the 72-GPU rack-scale servers known as NVL72. Those systems, which have only just started shipping in volume, likely explain the sales acceleration Nvidia is seeing.
Each system costs several million dollars and links 72 GPUs together inside one server rack, up from eight GPUs in the previous model. Customers are eager to purchase NVL72 servers because they are far more powerful for AI workloads.
Second, Nvidia’s networking segment surged 162% year over year. That’s an important sign for future demand: an AI GPU cloud executive recently told Bernstein that Nvidia’s networking products are often purchased four to six months ahead of data-center buildouts.
Huang remains AI’s biggest champion. As the tech industry’s longest-serving CEO, he’s seen every technology transformation since the PC. He says Nvidia is now benefiting from three simultaneous computing shifts, changes that go beyond any one quarter of results.
The first is the move from CPUs to GPUs for tasks like data processing. The second is AI systems replacing traditional machine-learning workloads for content recommendation and ad targeting. Finally, there is the newest wave of agentic AI applications, including the coding assistant from the start-up Cursor and the legal assistant from Harvey. Huang calls them both the “next frontier” of computing.
“As you consider infrastructure investments, consider these three fundamental dynamics. Each will contribute to infrastructure growth in the coming years,” he said.
Of course, at some point the cyclical chip industry will add too much capacity and the fundamentals will deteriorate for Nvidia. That may be years away, but I will be vigilant in looking for the signs. With Nvidia’s revenue accelerating amid rising AI demand, that negative scenario hasn’t arrived, at least not yet.
Write to Tae Kim at tae.kim@barrons.com