JIL's Take on AI: Big AI Companies Have Become Power Companies

The binding constraint on AI scaling is no longer capital, compute, or talent. It is megawatts. Microsoft, Google, Amazon, and Meta are power companies that happen to run AI models. The investors who understand this will own the right assets for the next decade.

There is a question circulating in infrastructure investment circles that would have seemed absurd three years ago: does Microsoft need to build its own nuclear power plant? The answer, increasingly, is yes. The hyperscalers have discovered that their data centre expansion programmes are constrained not by capital — they have ample capital — nor by GPU availability — they have reserved the supply — but by grid connection timelines measured in years, not months.

The Numbers

US data centres consumed 4% of national electricity in 2024; the Department of Energy projects this will rise to 12% by 2028. The IEA estimates that global data centre power demand will grow from 240 TWh in 2022 to over 1,000 TWh by 2026, more than a fourfold increase in four years. A single large-scale AI training cluster requires between 100MW and 500MW of continuous power. For context, 100MW of continuous supply matches the average electricity consumption of roughly 80,000 US homes. The grid infrastructure required to deliver this power does not exist in most geographies, and building it takes five to seven years under current permitting regimes.
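The 80,000-homes comparison can be sanity-checked with back-of-envelope arithmetic. A sketch, assuming an average US household consumption of roughly 10,700 kWh per year (an EIA-order-of-magnitude figure, not taken from this article):

```python
# Back-of-envelope check: how many average US homes does 100MW
# of continuous power correspond to? The household figure is an
# assumed ballpark, not a precise statistic.

POWER_MW = 100
HOURS_PER_YEAR = 8_760
HOUSEHOLD_KWH_PER_YEAR = 10_700  # assumed US average consumption

# 100MW running continuously for a year, in MWh
annual_mwh = POWER_MW * HOURS_PER_YEAR  # 876,000 MWh

# Convert to kWh and divide by per-household annual use
homes = annual_mwh * 1_000 / HOUSEHOLD_KWH_PER_YEAR
print(round(homes))  # roughly 82,000 homes

# The IEA growth figure: 240 TWh (2022) to 1,000 TWh (2026)
growth_ratio = 1_000 / 240
print(round(growth_ratio, 2))  # about 4.17x in four years
```

The result lands within a few per cent of the article's 80,000 figure, so the comparison holds under reasonable assumptions about household demand.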

The Hyperscaler Response

Microsoft signed a 20-year power purchase agreement with Constellation Energy to restart the Three Mile Island nuclear plant — specifically to power its AI data centres. Google has committed $2 billion to small modular reactor development. Amazon has purchased nuclear-adjacent data centre campuses with existing grid connections at significant premiums. For its Menlo Park expansion, Meta negotiated directly with PG&E for a dedicated substation. These are not incremental decisions — they are evidence that the hyperscalers have internalised the constraint and are building around it.

The Trump administration has added a policy dimension. The White House has urged PJM Interconnection — the largest grid operator in North America — to prioritise data centre power access in its capacity auctions. This represents an unprecedented politicisation of grid allocation that will accelerate some projects and delay others based on political rather than economic criteria.

The Investable Thesis

For investors, the power constraint creates three distinct opportunity sets. First, nuclear: companies with operating nuclear assets or advanced SMR programmes (Constellation Energy, NuScale, X-Energy) are positioned to sell power to the hyperscalers at premium rates under long-term contracts — the most creditworthy counterparties imaginable. Second, grid infrastructure: transformer manufacturers, high-voltage cable producers, and grid automation companies face a decade of inelastic demand growth. Third, energy-efficient AI: chips and systems that deliver more compute per watt command a structural price premium, because they expand the addressable data centre footprint on constrained grid connections.
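The third point is worth making concrete: under a fixed grid connection, efficiency gains translate directly into deployable compute. A minimal sketch, where the per-accelerator power draws and the datacentre overhead factor (PUE) are illustrative assumptions, not vendor specifications:

```python
# Why compute-per-watt commands a premium: a fixed grid connection
# caps total power, so lower watts per accelerator means more
# accelerators deployed. All figures below are assumed for
# illustration only.

GRID_MW = 100
PUE = 1.3                # assumed overhead factor (cooling, networking)
CHIP_W_CURRENT = 1_000   # assumed draw per accelerator, current gen
CHIP_W_EFFICIENT = 700   # assumed draw at equal throughput, next gen

def accelerators(grid_mw: float, chip_w: float, pue: float) -> int:
    """Number of accelerators a fixed grid connection can support."""
    usable_watts = grid_mw * 1e6 / pue  # watts left after overhead
    return int(usable_watts // chip_w)

current = accelerators(GRID_MW, CHIP_W_CURRENT, PUE)
efficient = accelerators(GRID_MW, CHIP_W_EFFICIENT, PUE)
print(current, efficient)  # the efficient chip fits ~43% more compute
```

Under these assumed numbers, the same 100MW connection supports roughly 77,000 accelerators at 1kW each versus roughly 110,000 at 700W — the efficiency gain is equivalent to securing tens of additional megawatts of grid capacity, which is exactly why it prices at a premium.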

The risk to the thesis is regulatory: permitting reform that dramatically accelerates grid buildout could partially relieve the constraint. The probability of that occurring at the pace required to close the gap is low — permitting reform in energy infrastructure has historically taken a decade to achieve meaningful impact. For the foreseeable investment horizon, power is the binding constraint on AI, and the investors who own the power own the AI.