The AI Trade’s Biggest Risk Still Needs to Be Addressed
Jul 23, 2025 15:26:00 -0400 by Tae Kim | #AI #Barron's Tech
Constellation Energy’s LaSalle Clean Energy Center nuclear power plant near Marseilles, Ill. (Scott Olson/Getty Images)
This article is from the free weekly Barron’s Tech email newsletter. Sign up here to get it delivered directly to your inbox.
Power Gap. Hi everyone. For artificial intelligence, everything seems to be falling into place. Chip production is ramping up, financing for data centers is becoming more available, and the makers of the leading AI models are innovating relentlessly.
But there is one significant area in which progress is lacking, and it could derail the entire AI infrastructure trade: electricity supply.
In a few years, there may not be enough power for the data centers required to win the AI arms race against China unless the U.S. government and industry quickly address this issue.
AI model companies are doing their part on the innovation front. Usage of the latest reasoning models from OpenAI, Alphabet’s Google, xAI, and Anthropic is exploding. AI agent capabilities are coming online, which will only increase the demand for AI computing resources.
The pace of AI infrastructure announcements in the past week alone is noteworthy. Elon Musk said his xAI start-up already has a 230,000-GPU cluster operational, with the buildout of an additional 550,000-GPU cluster using Nvidia GB200/GB300 AI servers starting in a few weeks. Meta Platforms CEO Mark Zuckerberg outlined his plan to build several multi-gigawatt data center clusters. Earlier this week, OpenAI announced it has over five gigawatts of data center capacity under development, which will house more than two million chips.
These numbers are incredible. Remember, just one year ago, a leading AI model, Llama 3, was trained on 16,000 GPUs, and a traditional cloud computing data center used about 50 megawatts of power. Now, training clusters with hundreds of thousands of GPUs are becoming commonplace, with one-million-GPU clusters around the corner.
But these superclusters of the future will require vast amounts of power. Nvidia’s Rubin Ultra AI server rack, due in 2027, will draw 600 kilowatts, versus about 120 kilowatts for today’s AI server racks.
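The rack-power figures above imply why million-GPU clusters translate into gigawatt-scale demand. A minimal back-of-envelope sketch, assuming a GB200 NVL72-style density of 72 GPUs per rack (an assumption, not a figure from the article; Rubin-era racks will be denser):

```python
# Rough estimate of AI cluster power draw from the article's rack figures.
# GPUS_PER_RACK is an assumed density (GB200 NVL72-style); the kW numbers
# are the per-rack figures cited in the text. Cooling overhead is excluded.

GPUS_PER_RACK = 72          # assumed rack density
KW_PER_RACK_TODAY = 120     # today's AI server rack (from the article)
KW_PER_RACK_RUBIN = 600     # Nvidia Rubin Ultra rack, 2027 (from the article)

def cluster_power_mw(num_gpus: int, kw_per_rack: float,
                     gpus_per_rack: int = GPUS_PER_RACK) -> float:
    """Total IT power in megawatts for a GPU cluster of the given size."""
    racks = num_gpus / gpus_per_rack
    return racks * kw_per_rack / 1000  # kW -> MW

# A one-million-GPU cluster at today's rack power:
print(round(cluster_power_mw(1_000_000, KW_PER_RACK_TODAY)))  # ≈ 1667 MW, i.e. ~1.7 GW
```

Under these assumptions, even xAI’s planned 550,000-GPU cluster lands near a gigawatt at today’s rack power, which squares with the multi-gigawatt data center plans described above.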
The power supply is going to be a challenge. Morgan Stanley’s recent survey of data center operators found that energy supply will be the primary bottleneck in the coming years.
Leading AI model start-up Anthropic quantified the power gap in a report this week, saying the U.S. AI industry will require 50 gigawatts of electricity capacity by 2028 to maintain its leadership. The U.S. is far behind China in building capacity, the report said, with China adding 400 gigawatts last year.
“Today, America is not on track to meet the energy needs of AI training or inference by 2028,” Anthropic said, adding that getting regulatory approvals to build U.S. electrical infrastructure can take years.
On Wednesday, the White House published an “AI Action Plan” report that addressed the need for new energy sources, making federal land available for power generation, and upgrading the electric grid. But the report lacks details on timelines and specific actions, and it offers no numeric goals for capacity expansion.
There is still time to build the U.S. electric grid of the future and the power supply infrastructure to meet surging AI demand, but more power plants and better transmission are needed. For the AI trade to keep going, the work must start now.
This Week in Barron’s Tech
- Texas Instruments Stock Slides on Disappointing Outlook
- SAP Reports Mixed Results. The AI Stock Is Falling.
- Microsoft Says China-Linked Hackers Are Behind SharePoint Attacks
- Taiwan Semi Joins the $1 Trillion Club. What’s Ahead for the Chip Maker?
- Figma’s IPO Seeks $13B Valuation in Latest Tech Debut
Write to Tae Kim at tae.kim@barrons.com or follow him on X at @firstadopter.