Dear Readers,

Can a country win the AI race if it cannot keep the lights on? Today’s issue starts with the overlooked battle that may matter more than chips, money, or model size: energy. The real question is no longer who can build the smartest systems, but who can generate, route, and afford the electricity to run them at scale.

From America’s gas and nuclear flexibility to China’s breathtaking buildout of solar, wind, and grid infrastructure, we unpack why power is becoming AI’s hard ceiling, why transformer shortages and local grid constraints are turning into strategic chokepoints, and why even the semiconductor supply chain now runs through fragile geopolitical fault lines, including a helium market suddenly under pressure. If you want to understand where AI is actually heading, not just in theory, but in steel, cables, substations, and state capacity, keep reading.

All the best,

Kim Isenberg

The Race Nobody Talks About:
Why Energy, Not Chips, Could Decide the AI Future

“In the 2030s, intelligence and energy—ideas, and the ability to make ideas happen—are going to become wildly abundant. These two have been the fundamental limiters on human progress for a long time; with abundant intelligence and energy (and good governance), we can theoretically have anything else.”

— Sam Altman, The Gentle Singularity

Everyone in the AI world is obsessed with chips. NVIDIA's latest GPUs sell out before they hit the market, export controls dominate headlines, and governments throw billions at semiconductor fabs. But here is the uncomfortable truth that most people overlook: you can have all the chips in the world, and they are worthless without the electricity to power them. A data center can be built in two to three years. The power grid infrastructure to feed it? That often takes much longer (International Energy Agency, "Energy and AI," 2025).

Energy is quietly becoming the defining constraint of the AI era, arguably even more binding than compute itself. While a chip shortage can be addressed with new fabrication capacity, energy bottlenecks involve physical grids, transformers weighing hundreds of tons, transmission lines stretching thousands of kilometers, and regulatory processes that move at a pace the tech industry finds agonizing. The question is not just who can build the most data centers, but who can actually power them, reliably, affordably, and at scale.

This brings us to the two giants that will shape the trajectory of AI for the next decade and beyond: the United States and China. Together, they are projected to account for the vast majority of global data center electricity growth through 2030 (IEA, "Energy and AI," 2025). But their energy systems could not be more different. The U.S. runs on market dynamics, fragmented regulation, and natural gas flexibility. China operates through centralized planning, massive coal infrastructure, and an unmatched renewable supply chain. So which system is better equipped to fuel the AI revolution, and could energy end up determining who wins the broader AI race?

Subscribe to Superintel+ to read the rest.
