AI’s Power Problem: Data Centres Race for Energy

The surge behind the socket

The artificial intelligence boom is now a power story. Behind every rapid advance in large language models sits a forest of servers and cooling systems that need electricity, land, and water. Utilities are revising forecasts. Regulators are asking new questions. Communities are weighing jobs against strain on local grids.

Analysts warn that the curve is steep. “Electricity consumption by data centres, AI and cryptocurrencies could double by 2026,” the International Energy Agency said in its Electricity 2024 outlook. The agency estimates that global data centres used roughly 460 terawatt-hours of electricity in 2022 and could consume anywhere from around 620 TWh to more than 1,000 TWh by 2026. AI workloads are a growing share of that demand.

Chip makers, cloud providers, and startups are racing to build capacity. New clusters of advanced processors are coming online across North America and Europe, as well as in parts of Asia. That growth brings opportunity. It also brings a test: can power systems keep up while emissions targets remain intact?

Why AI needs so much power

AI uses energy in two distinct ways: training and inference. Cooling and location then shape how large that demand is and where it lands.

  • Training: This is the process of teaching a model to recognize patterns. It often requires weeks of nonstop computing on specialized chips. Power draw is high and sustained.
  • Inference: This is the everyday use of a trained model. Millions of prompts and responses add up. The load can be spiky, tied to user demand.
  • Cooling and water: Servers run hot. Cooling systems, including chillers and sometimes evaporative equipment, add to electricity and water needs. Operators in arid regions are shifting to designs that use less water.
  • Location: Data centres tend to cluster near fiber networks and skilled labor. That can concentrate demand in a few utility service areas.

The difference between training and inference matters for planners. Training can be scheduled. Inference is more like a 24/7 retail operation. Both are growing as companies release more features and refresh models more often.
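
A back-of-envelope calculation makes that planning distinction concrete. The Python sketch below uses invented round numbers for cluster size, power draw, per-query energy, and cooling overhead; it is illustrative arithmetic, not a measurement of any real model.

```python
# Back-of-envelope energy estimate for a hypothetical AI deployment.
# All inputs are illustrative assumptions, not figures for any real model.

GPUS = 10_000                  # accelerators in the training cluster (assumed)
WATTS_PER_GPU = 700            # sustained draw per accelerator, W (assumed)
TRAINING_DAYS = 30             # wall-clock training time (assumed)
PUE = 1.2                      # power usage effectiveness: cooling/overhead factor

WH_PER_QUERY = 1.0             # energy per inference request, Wh (assumed)
QUERIES_PER_DAY = 100_000_000  # daily inference volume (assumed)

# Training: sustained draw over a fixed window, scaled by facility overhead.
training_mwh = GPUS * WATTS_PER_GPU * 24 * TRAINING_DAYS * PUE / 1e6

# Inference: many small draws that accumulate over a year of service.
inference_mwh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365 * PUE / 1e6

print(f"Training (one run):   {training_mwh:,.0f} MWh")    # ~6,000 MWh
print(f"Inference (per year): {inference_mwh_per_year:,.0f} MWh")  # ~44,000 MWh
```

Under these assumed numbers, a year of inference outweighs a single training run several times over, one reason steady inference growth concerns grid planners more than occasional training spikes.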

Industry response: more watts, cleaner watts

Cloud providers say they can scale responsibly. They are signing long-term power contracts for new wind, solar, and battery projects. They are also investing in efficiency. Liquid cooling is moving from niche to mainstream. Model designers are adopting quantization and sparsity to cut the energy per query. New chips promise more operations per watt.
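
One way to see what cutting the energy per query looks like in practice is weight quantization. The Python sketch below applies PyTorch’s dynamic INT8 quantization to a toy two-layer model; it is a minimal illustration of the technique, not any provider’s production pipeline, and actual energy savings have to be measured on real hardware.

```python
# Minimal sketch: dynamic INT8 quantization of a toy model with PyTorch.
# Quantization stores weights in 8 bits instead of 32, cutting memory
# traffic, which is a large share of inference energy on real hardware.
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(1024, 1024),
    torch.nn.ReLU(),
    torch.nn.Linear(1024, 256),
)

quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 1024)
print(quantized(x).shape)  # same interface, smaller weights
```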

Some operators are looking beyond renewables. Companies are exploring nuclear options, from extending existing plants to considering small modular reactors. In 2023, Microsoft announced an agreement with fusion startup Helion to buy electricity as early as 2028, subject to approvals and technology milestones. The deal is speculative, but it signals how far buyers will go to secure firm, carbon-free power.

Corporate climate targets add pressure. Microsoft aims to be carbon negative by 2030 and to remove its historical emissions by 2050. Google is working toward 24/7 carbon-free energy by 2030. Amazon has pledged net-zero carbon by 2040. Meta targets net zero across its value chain by 2030. All say AI will be a major driver of future load. Hitting these goals while demand rises is a complex equation.

Grids under strain—and in transition

Power networks were not built with today’s AI clusters in mind. Interconnection queues for new generation and storage projects have reached record levels in the United States. Some European hubs have run into local bottlenecks. Ireland has curbed new data centre connections around Dublin to protect grid stability. The Netherlands temporarily paused new hyperscale projects while updating siting rules. Similar debates are starting in parts of Asia.

Regulators are sending a message on claims and conduct. “There is no AI exemption to the laws on the books,” U.S. Federal Trade Commission Chair Lina Khan has said, indicating that existing consumer protection and competition rules apply to AI marketing and to environmental claims. That includes how companies describe the footprint of their AI services.

Utilities, meanwhile, are updating forecasts. Some are adding new natural gas units to handle peak demand. Others are accelerating transmission projects and grid-scale storage. Stakeholders are pushing to prioritize long-duration storage and demand response to reduce reliance on fossil peakers. The mix will vary by region, depending on policy, resource availability, and public acceptance.

Counting the true footprint

Tracking the environmental impact of AI is not simple. Energy use is only one part. Water use, land use, and embodied emissions from building servers and facilities also matter. Reporting is improving, but gaps remain. Companies publish annual sustainability reports, yet the granularity on AI-specific workloads is limited.

Researchers are calling for clearer metrics. One proposal is to disclose standardized figures for energy per inference and total training energy for major models, validated by third parties. Another is to adopt hourly matching for clean energy, rather than annual totals, to reflect when the grid is actually green. Google’s 24/7 goal follows that approach. Independent audits could improve confidence and reduce the risk of greenwashing.
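
The gap between annual and hourly matching is easy to demonstrate with toy numbers. In the Python sketch below, the hourly load and clean-supply profiles are invented; it shows only how the two accounting methods can diverge, not any company’s actual bookkeeping.

```python
# Toy comparison of annual vs. hourly clean-energy matching.
# Profiles are invented: a flat data-centre load and a solar-heavy
# clean supply that peaks at midday and vanishes overnight.

load = [100.0] * 24                                                   # MWh per hour
clean = [max(0.0, 300.0 * (1 - abs(h - 12) / 6)) for h in range(24)]  # MWh per hour

# Annual-style matching: total clean purchases vs. total consumption.
annual_pct = min(1.0, sum(clean) / sum(load)) * 100

# Hourly matching: clean energy only counts in the hour it is generated.
hourly_matched = sum(min(l, c) for l, c in zip(load, clean))
hourly_pct = hourly_matched / sum(load) * 100

print(f"Annual matching: {annual_pct:.0f}% clean")   # 75%
print(f"Hourly matching: {hourly_pct:.0f}% clean")   # ~42%
```

With these invented profiles, the same clean-energy purchases score 75 percent on an annual basis but only about 42 percent hour by hour, which is why hourly matching is the stricter test.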

Economic stakes and local tradeoffs

Data centres support jobs, tax revenue, and secondary investment. Ports, construction firms, and equipment suppliers all benefit. Communities also face tradeoffs: higher electricity demand can raise local prices or delay other industrial projects in the queue. Water sharing can be sensitive during drought. Land use and noise from new substations can draw opposition.

Some regions are writing new rules. Local authorities are tying permits to energy efficiency thresholds, district heating plans that reuse waste heat, and commitments to procure clean power. Others are exploring tariffs that encourage data centres to shift non-urgent workloads to off-peak hours. These policies aim to balance growth with reliability and climate goals.
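
The workload-shifting idea can be sketched in a few lines. The Python example below slides a deferrable four-hour batch job to the cheapest contiguous window in an invented day-ahead price curve; real schedulers also weigh deadlines, carbon intensity, and contract terms.

```python
# Minimal sketch of off-peak workload shifting: pick the cheapest
# contiguous window in a 24-hour price forecast for a deferrable job.
# Prices are invented day-ahead values in $/MWh.

prices = [40, 35, 32, 30, 31, 38, 55, 70, 80, 75, 65, 60,
          58, 57, 62, 74, 90, 95, 85, 70, 55, 48, 44, 42]

def cheapest_window(prices, duration_hours):
    """Return (start_hour, total_cost) of the cheapest run window."""
    best_start, best_cost = None, float("inf")
    for start in range(len(prices) - duration_hours + 1):
        cost = sum(prices[start:start + duration_hours])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

start, cost = cheapest_window(prices, duration_hours=4)
print(f"Run the 4-hour job starting at hour {start}")  # hour 1, overnight
```

The same structure works with an hourly carbon-intensity forecast in place of prices, which is how tariff design can nudge operators toward greener hours as well as cheaper ones.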

What to watch next

  • Permitting reform: Faster approvals for transmission lines and clean energy projects could ease bottlenecks.
  • Nuclear decisions: Moves on life extensions, small modular reactors, and corporate power deals will shape the firm power mix.
  • Chip efficiency: Next-generation AI accelerators and software optimizations could lower energy per task.
  • Transparency: Standard disclosures for AI energy and water use may become a norm—or a requirement.
  • Local siting rules: Cities and regions will refine where and how data centres can expand.

The bottom line

AI’s rise is now inseparable from energy policy. The technology promises productivity gains across the economy. It also concentrates demand at a speed power systems rarely see. The outcome will depend on three things: how quickly grids can add clean, firm capacity; how efficiently models and chips evolve; and how transparent companies are about their impacts.

For now, the message from engineers and planners is consistent. AI is here, it is growing, and it needs power. The race is on to deliver that power without derailing climate targets. The balance struck in the next few years will shape both the trajectory of AI and the energy transition that must support it.