The AI Power Build-Out Is the Largest Energy Project in American History

Five companies are spending $400B on AI infrastructure this year, half of it on the electricity to run it. Nobody is calling this the energy story it actually is.

Capital expenditure across the five hyperscaler tech companies hit $400 billion in 2025 and is projected to climb 75% in 2026, according to the International Energy Agency. Data-center electricity demand grew 17% in 2025 — far outpacing global electricity demand growth of 3%. Total data-center consumption is projected to reach 945 TWh by 2030, with the AI-specific portion tripling. Tech companies signed roughly 40% of all corporate renewable-power purchase agreements last year. The contracted pipeline for small modular reactors has grown from 25 GW to 45 GW in twelve months.
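
To put those numbers in one place, the arithmetic is easy to reproduce. Here is a back-of-envelope sketch; the ~30,000 TWh figure for global electricity consumption is my own rough assumption for the sanity check, not a number from the IEA report:

```python
# Back-of-envelope checks on the headline numbers above.
# Assumption (mine, not the IEA's): global electricity consumption ~30,000 TWh.

capex_2025_bn = 400                      # hyperscaler capex in 2025, $B
capex_2026_bn = capex_2025_bn * 1.75     # projected 75% climb for 2026

dc_demand_2030_twh = 945                 # IEA data-center projection for 2030
global_demand_twh = 30_000               # assumed rough global consumption
dc_share = dc_demand_2030_twh / global_demand_twh

print(f"Projected 2026 capex: ~${capex_2026_bn:,.0f}B")             # ~$700B
print(f"Data-center share of global electricity: ~{dc_share:.0%}")  # ~3%
```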

I want to put this build-out in perspective, because the people writing about it are mostly tech reporters, and tech reporters are not in the habit of comparing things to the Hoover Dam.

Four hundred billion dollars is more than the Apollo program cost in inflation-adjusted dollars. It is roughly the entire federal infrastructure spend under the Bipartisan Infrastructure Law. It is more than the combined annual capital expenditure of every utility in the United States. And it is being committed in a single year, by a handful of companies, mostly to acquire access to electricity.

The thing that's actually being built isn't model weights. It's transmission lines, substations, gas turbines, hyperscale cooling systems, and reactor offtake agreements. The AI industry has quietly become the largest private buyer of generation capacity in the United States. American Electric Power recently disclosed a 63 gigawatt contract pipeline, about 90% of which is data centers. That's the load equivalent of roughly sixty large nuclear plants.
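
The plant equivalence is simple unit conversion. A minimal sketch, assuming roughly 1.1 GW of capacity per large nuclear reactor (a typical figure, and my assumption rather than part of AEP's disclosure):

```python
# Convert AEP's disclosed contract pipeline into nuclear-plant equivalents.
# Assumption (mine, not AEP's): ~1.1 GW of capacity per large reactor.

pipeline_gw = 63                 # AEP's disclosed contract pipeline
data_center_share = 0.90         # ~90% of the pipeline is data centers
gw_per_large_reactor = 1.1       # typical large-reactor capacity

dc_load_gw = pipeline_gw * data_center_share
plant_equivalents = pipeline_gw / gw_per_large_reactor

print(f"Data-center portion: ~{dc_load_gw:.0f} GW")       # ~57 GW
print(f"Pipeline in reactors: ~{plant_equivalents:.0f}")  # ~57, "roughly sixty" at 1 GW flat
```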

I have seen industries claim they would reshape the grid before. Crypto mining made the claim and ended up a rounding error. Cloud computing made it in the 2010s, and that build-out was real, but it was spread across a decade. This one is bigger and faster than either, and the constraint isn't capital (there is plenty of capital) but permitting, transmission, and the sheer lead time of generation that takes five to ten years to bring online.

What follows from that has very little to do with AI and a lot to do with electricity prices, water rights, county-level political fights, and the question of whether the power being built for inference will eventually serve households at all. The early evidence is mixed: ratepayers in Virginia, Texas, and Ohio are starting to see data-center-driven cost allocations show up in their bills.

I do not have a tidy take on whether this build-out is a triumph of capitalism or an exercise in mass concentration; both descriptions are probably correct. What I am willing to say is that anyone analyzing the AI industry purely as a software story is missing the actual business, which has, in 2026, become an electricity arbitrage business with model weights as a side effect.

The technocratic counter is that the build-out produces abundance — AI demands compute that demands power, but the power infrastructure produced as a byproduct (especially the SMR pipeline) ends up serving the broader economy at lower prices a decade out. That is the case the industry's lobbyists are making, and there is a real version of it that may turn out to be right.

  • IEA projects total data-center electricity at 945 TWh by 2030 — about 3% of global consumption
  • A single AI inference can consume up to 1,000× the electricity of a traditional web search
  • The SMR offtake pipeline grew from 25 GW to 45 GW in twelve months: that's a real industry, not a hypothetical
  • See CNN's framing on why the obvious fixes aren't happening

If you cover technology professionally, you should also be covering electricity. The two stories are now one story, and the people who understand both are going to make better calls than the people who specialize in either.

— Hank