AI’s Power Problem: Data Centers Strain the Grid
AI boom meets a wall of watts
The rapid rise of artificial intelligence is colliding with a slower-moving reality: electricity. As tech companies race to build and deploy larger models, utilities and regulators warn that power systems are struggling to keep up. New data centers are breaking ground across the United States and Europe, but many face delays linked to limited grid capacity, long interconnection queues, and local opposition.
The International Energy Agency (IEA) says the world is entering a pivotal period. In its Electricity 2024 report, the agency wrote that "electricity consumption from data centres, cryptocurrencies and AI could double by 2026." The IEA estimates that data centers alone could use between 620 and 1,050 terawatt-hours of electricity in 2026, up from about 460 terawatt-hours in 2022. That higher figure would be roughly the same as Japan’s total electricity consumption.
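The width of that range matters. A rough back-of-envelope sketch, assuming simple compound growth from the 2022 base (a simplification of the IEA's scenario modeling), shows how different the low and high ends really are:

```python
def implied_cagr(start_twh, end_twh, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end_twh / start_twh) ** (1 / years) - 1

base_2022 = 460          # TWh, IEA estimate for data centers in 2022
low_2026, high_2026 = 620, 1050  # TWh, IEA range for 2026
years = 4

low_growth = implied_cagr(base_2022, low_2026, years)    # roughly 8% per year
high_growth = implied_cagr(base_2022, high_2026, years)  # roughly 23% per year
```

The low end implies growth a well-planned grid can absorb; the high end implies demand compounding faster than most utilities have seen in decades.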
Where demand is surging
Growth is concentrated in a few hubs. Northern Virginia, long a global epicenter for data centers, has seen multi-year waits for new grid connections. Georgia, Ohio, and Oregon are also emerging as hotspots. In Europe, Ireland illustrates the challenge. Data centers accounted for around 18% of the country’s electricity use in 2022, according to official statistics. The grid operator has placed constraints on new connections around Dublin to protect reliability.
- United States: Utilities report fast-rising requests for large, steady loads. Many are revising demand forecasts upward after years of flat growth.
- Ireland: Capacity constraints in greater Dublin have prompted tighter connection rules and, in some cases, deferrals.
- Nordics: Abundant hydropower and cooler climates continue to attract facilities, though transmission bottlenecks still apply.
Power demand is only part of the picture. Water is, too. A 2023 academic study on the "water footprint" of AI estimated that training a single large model can consume hundreds of thousands of liters of freshwater when accounting for direct and indirect cooling needs. The authors called for improved reporting and better siting to reduce stress on local resources.
Why AI uses so much energy
AI’s energy use comes from two phases: training and inference. Training the largest models involves billions of parameters and weeks of nonstop computation across thousands of specialized chips. Inference—the process of generating answers for users—runs every time someone queries a model. Training is power intensive; inference is persistent and scales with popularity.
- Hardware: Graphics processing units and other accelerators drive high electricity needs and dense heat output.
- Cooling: Traditional air cooling is giving way to liquid systems to handle higher chip densities, increasing water and infrastructure demands.
- Scale: When models serve millions of users, small per-query costs become large system totals.
Energy per AI query varies widely based on model size, hardware, and software optimizations. Academic estimates suggest it is higher than a typical web search, particularly for large, general-purpose systems. Without efficiency gains, wider adoption could push power use much higher.
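The scale effect is easy to see with arithmetic. The numbers below are illustrative assumptions only, not measured figures (real per-query energy varies widely and is rarely disclosed):

```python
def fleet_energy_gwh_per_year(wh_per_query, queries_per_day):
    """Annual electricity for serving a model, from per-query energy and volume."""
    wh_per_year = wh_per_query * queries_per_day * 365
    return wh_per_year / 1e9  # Wh -> GWh

# Hypothetical inputs: 3 Wh per query, 100 million queries per day.
annual_gwh = fleet_energy_gwh_per_year(3, 100_000_000)  # ~109.5 GWh per year
```

Even a few watt-hours per query, multiplied across a popular service, adds up to the annual output of a mid-sized power plant running for days on end.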
Industry’s response: efficiency and new electrons
Tech firms say they can grow AI while cutting emissions intensity. Their strategies include efficiency improvements and long-term power contracts.
- Efficiency: Companies are optimizing models, using quantization, and deploying more efficient chips. Each generation of accelerators improves performance per watt, and software stacks are being tuned to reduce overhead.
- Renewables: Cloud providers are signing large wind and solar power purchase agreements. Google says its goal is to "run on 24/7 carbon-free energy by 2030," matching consumption with clean power every hour, in every region.
- New sources: Firms are exploring nuclear, geothermal, and even fusion. In 2023, Microsoft announced a deal with Helion to purchase electricity from a future fusion facility, signaling interest in next-generation supply, though the technology remains unproven at scale.
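Quantization, one of the efficiency levers listed above, trades a little numerical precision for much smaller storage and cheaper arithmetic. A minimal sketch of symmetric per-tensor int8 weight quantization (illustrative only; production stacks use more sophisticated per-channel and activation-aware schemes):

```python
import numpy as np

def quantize_int8(weights):
    """Map float weights onto the int8 range [-127, 127] with one shared scale."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 codes."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)

# int8 storage is 4x smaller than float32, and rounding error is bounded
# by half a quantization step (scale / 2).
max_err = np.max(np.abs(w - w_hat))
```

Smaller weights mean less memory traffic per inference, which is where much of the energy goes.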
Large buyers argue that their contracts help bring new clean capacity onto the grid. Yet matching is complex. Wind and solar output varies by hour and season, and many data centers require near-constant power. That mismatch pushes companies toward round-the-clock solutions like nuclear power, long-duration storage, advanced geothermal, or grid balancing portfolios that combine multiple resources.
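The annual-versus-hourly distinction can be made concrete. In the toy example below, with hypothetical numbers, a flat load is fully matched by midday solar on a volume basis, yet only half of the load is actually covered hour by hour:

```python
def hourly_match_score(clean_gen_mwh, load_mwh):
    """Fraction of load met by clean generation in the same hour."""
    matched = sum(min(g, l) for g, l in zip(clean_gen_mwh, load_mwh))
    return matched / sum(load_mwh)

# Hypothetical 4-hour "day": flat 100 MWh load; solar only in hours 2-3.
load = [100, 100, 100, 100]
solar = [0, 200, 200, 0]

volume_basis = sum(solar) / sum(load)            # 1.0 -> "100% matched" annually
hourly_basis = hourly_match_score(solar, load)   # 0.5 -> half the load met hourly
```

Closing that gap is what pushes buyers toward firm resources like nuclear, geothermal, and long-duration storage.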
Grid constraints slow the buildout
Utilities and regulators face a planning challenge. Data centers are large, often requiring hundreds of megawatts. They can cluster in small areas, creating local transmission congestion and reliability risks. Upgrades take years, from permitting transmission lines to building substations.
Some grid operators have tightened interconnection rules, prioritizing projects that are sited near available capacity or that invest in on-site generation. Others are exploring tariffs that encourage flexibility. Location decisions increasingly weigh not only taxes and land costs, but also power availability and proximity to clean sources.
Policy accelerators and safeguards
Governments are starting to respond. The United States released a national framework for AI risk management through the National Institute of Standards and Technology in 2023, emphasizing practical governance. The White House’s October 2023 executive order called for "safe, secure, and trustworthy" AI and directed agencies to study impacts, including on infrastructure and the workforce.
In Europe, lawmakers passed the EU AI Act in 2024, the first broad attempt to regulate AI by risk category. While the Act focuses on safety and transparency, it adds reporting duties that could make environmental impacts more visible. Separately, energy regulators are working on streamlined permitting for grid upgrades and clearer interconnection queues to handle large, fast-arriving loads.
What to watch next
- Efficiency vs. scale: Will hardware and software progress outpace demand growth? The answer determines whether AI’s share of electricity stabilizes or climbs.
- 24/7 clean power: Hourly matching is expanding, with more granular certificates and storage. Adoption beyond the largest tech firms could broaden its impact.
- Water stewardship: Expect tighter reporting and siting rules in stressed regions, along with a shift to non-potable water and advanced cooling.
- Location strategy: New clusters may emerge near abundant clean energy, from Nordic hydropower to U.S. nuclear and geothermal corridors.
The bottom line
AI’s promise is large, but so is its power bill. The IEA’s warning that demand could double within a few years is a reality check for planners and tech leaders alike. If companies can combine efficiency gains with round-the-clock clean energy, the sector could grow with a lighter footprint. If not, grids will face rising strain, and emissions could climb.
For now, the direction is clear. Data centers are becoming critical infrastructure, and electricity is the limiting reagent. As one industry target puts it, the challenge is to make AI both powerful and sustainable—"safe, secure, and trustworthy" not only in how systems behave, but in how they are powered.
Sources: International Energy Agency (Electricity 2024); national statistics and grid operator notices in the U.S. and Ireland; academic research on AI energy and water use; company sustainability statements by Google and Microsoft.