While everyone argues about whether AI will steal jobs or solve climate change, the technology is quietly devouring electricity at a pace that would make even the most power-hungry cryptocurrency miner blush.
The numbers are staggering. US data centers consumed 183 terawatt-hours of electricity in 2024, representing over 4% of total US electricity consumption. That's just the beginning. By 2030, data centers could account for 20% of global electricity use. Twenty percent. To put this in perspective, global data centers already rank as the 11th largest electricity consumer worldwide, nestled between Saudi Arabia and France.
North American data center power requirements nearly doubled in just one year, jumping from 2,688 megawatts at the end of 2022 to 5,341 megawatts by late 2023. The culprit? Generative AI demands that refuse to slow down.
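The "nearly doubled" claim checks out arithmetically. A quick sketch using the two figures from the text (the figures themselves are the article's, not independently verified here):

```python
# Back-of-the-envelope check on the year-over-year growth in North
# American data center power requirements, using the article's figures.

mw_end_2022 = 2_688  # megawatts, end of 2022 (from the text)
mw_late_2023 = 5_341  # megawatts, late 2023 (from the text)

growth = (mw_late_2023 - mw_end_2022) / mw_end_2022
print(f"Year-over-year growth: {growth:.1%}")  # ~98.7%, i.e. nearly doubled
```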
A typical AI-focused hyperscale data center consumes as much electricity in a year as 100,000 households. The larger facilities under construction? They'll use 20 times that amount. Because apparently, teaching computers to write poetry requires the same energy as powering small cities.
Regional impacts tell an even more dramatic story. Virginia dedicates 26% of its total electricity supply to data centers. North Dakota burns through 15% for the same purpose, while Nebraska, Iowa, and Oregon each allocate around 11-12%. These aren't distributed evenly across states either. Data centers cluster together, creating localized power grid nightmares.
The projections get worse. Global power demand from data centers is forecast to increase 50% by 2027, with some estimates suggesting a 165% spike by decade's end. By 2026, data centers will become the fifth-largest electricity consumer globally, surpassing entire countries. US power consumption is projected to reach record levels in 2025 and 2026, hitting 4,179 billion kWh and 4,239 billion kWh respectively.
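Those two forecasts imply steep compound annual growth. A rough sketch of the implied rates, assuming (my assumption, not the article's) that both use 2024 as the baseline year:

```python
# Compound annual growth rates implied by the forecasts above.
# Assumption: both forecasts are measured against a 2024 baseline.

def cagr(total_growth: float, years: int) -> float:
    """Annual growth rate implied by total growth over `years` years."""
    return (1 + total_growth) ** (1 / years) - 1

print(f"+50% by 2027 implies  ~{cagr(0.50, 3):.1%}/yr")  # ~14.5%/yr
print(f"+165% by 2030 implies ~{cagr(1.65, 6):.1%}/yr")  # ~17.6%/yr
```

Either rate would far outpace historical electricity demand growth, which is the point of the paragraph above.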
AI workloads currently account for 5-15% of data center power use, but this could jump to 35-50% by 2030. Generative AI training clusters consume seven to eight times more energy than typical computing workloads. The power density requirements are brutal. Even a simple ChatGPT query uses five times more electricity than a standard web search.
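To get a feel for what "five times a web search" means at scale, here is an illustrative calculation. The 0.3 Wh per-search figure and the one-billion-queries-per-day volume are my assumptions for the sketch, not figures from the article:

```python
# Illustrative aggregate energy for AI queries at the article's "5x a
# web search" ratio. Per-search energy and query volume are assumed.

WEB_SEARCH_WH = 0.3            # assumed Wh per standard web search
AI_MULTIPLIER = 5              # per the article: 5x a web search
QUERIES_PER_DAY = 1_000_000_000  # assumed daily AI query volume

ai_query_wh = WEB_SEARCH_WH * AI_MULTIPLIER  # 1.5 Wh per AI query
annual_twh = ai_query_wh * QUERIES_PER_DAY * 365 / 1e12
print(f"~{annual_twh:.2f} TWh/year at this volume")  # ~0.55 TWh/year
```

Small per query, but it compounds quickly as usage grows.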
The infrastructure scramble is real. Pipeline expansions, new power plants, grid upgrades. All to feed humanity's growing appetite for artificial intelligence that can generate memes and write emails. Even with AI projected to contribute $19.9 trillion to the global economy by 2030, the energy bill keeps climbing. Progress has never been so power-hungry.

