Datacenters are becoming one of the biggest drivers of electricity demand in the US.
The facilities that power our digital economy have quietly undergone a dramatic transformation in their energy footprint. After accounting for less than 1% of US power demand in 2004, datacenters now consume approximately 7% of the nation’s electricity, according to research from Goldman Sachs Global Investment Research, the Department of Energy, and Aterio, shared by Callum Williams of The Economist.

The growth trajectory tells a story of exponential acceleration. For the first decade tracked, from 2004 to 2014, datacenter power consumption remained relatively flat, barely moving above 1% of total US demand. But starting around 2015, the curve began to steepen noticeably. By 2019, datacenters had crossed the 2% threshold. The pace then intensified sharply, reaching 3% by 2021, 4% by 2023, and approximately 7% in early 2026.
This surge coincides directly with the explosion of artificial intelligence workloads. Training large language models and running inference at scale requires massive computational resources, and those resources demand enormous amounts of power. The AI boom that began in earnest with ChatGPT’s launch in late 2022 has sent hyperscalers and cloud providers scrambling to build out infrastructure, leading to unprecedented datacenter construction and expansion.
The implications extend far beyond the tech sector. A 7% share of national electricity demand means datacenters now rival major industrial sectors in their power consumption. This has sparked concerns among utilities about grid capacity, accelerated discussions about energy infrastructure investment, and raised questions about the sustainability of AI’s growth trajectory without corresponding advances in power generation and efficiency.
Energy companies are already responding. Utilities are projecting continued double-digit growth in datacenter power demand through the end of the decade. This has triggered a wave of new power plant proposals, grid modernization projects, and aggressive pursuit of both renewable energy sources and nuclear power options to meet the anticipated load.
For datacenter operators and cloud providers, power availability has become as critical a constraint as chip supply. Site selection increasingly hinges on access to reliable, abundant electricity rather than just fiber connectivity or tax incentives. Some hyperscalers are even exploring direct power purchase agreements and co-location with power generation facilities to secure their energy needs.
The broader economic question looms large: at what point does datacenter power consumption begin to meaningfully constrain AI development or force a reckoning with energy efficiency? If the current trajectory continues, datacenters could account for 10% or more of US electricity demand within just a few years, and former Google CEO Eric Schmidt has suggested the figure could eventually reach as high as 99%. That would represent a fundamental shift in the nation’s energy consumption patterns, one that policymakers, utilities, and the tech industry itself are only beginning to grapple with.
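A quick sanity check on that "few years" figure, using only the shares quoted above (2% in 2019, roughly 7% in early 2026) and assuming, purely for illustration, that the share keeps compounding at the same constant rate:

```python
# Back-of-envelope extrapolation of the datacenter share of US electricity demand.
# Assumes smooth exponential growth between the article's data points; the real
# curve is lumpier and depends on grid buildout, so treat this as a sketch only.
import math

share_2019 = 0.02  # share of US electricity demand in 2019 (from the article)
share_2026 = 0.07  # approximate share in early 2026 (from the article)

years = 2026 - 2019
# Implied compound annual growth rate of the share over that period (~20%/yr)
annual_growth = (share_2026 / share_2019) ** (1 / years) - 1

# Years beyond 2026 until the share would hit 10%, if that rate held (~2 years)
years_to_10pct = math.log(0.10 / share_2026) / math.log(1 + annual_growth)

print(f"Implied annual growth in share: {annual_growth:.1%}")
print(f"Years from 2026 until a 10% share: {years_to_10pct:.1f}")
```

Under that constant-growth assumption, the 7% share compounds past 10% in roughly two years, which is consistent with the article's "within just a few years" framing.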
For now, the curve continues its upward march, with no signs of plateauing.