AI’s Power Problem: Who Keeps the Lights On?

A surge in computing meets a finite grid

Artificial intelligence is expanding fast. So is the energy it needs. Training and running large models require powerful chips, big data centers, and constant cooling. That means more electricity, more water, and more pressure on local grids. The International Energy Agency (IEA) has warned about the trend. In a 2024 analysis, it said, “Global electricity consumption by data centres, artificial intelligence and cryptocurrencies could double by 2026.” The estimate covers a wide range, but the direction is clear.

IEA figures show that data centers used about 460 terawatt-hours of electricity in 2022. By 2026, the combined total for data centers, AI, and crypto could reach 620–1,050 terawatt-hours; the upper end is roughly Japan's annual electricity consumption, more than most countries use in a year. AI is only part of the load, but it is a growing part. Each new AI service brings millions of queries. Each query runs on specialized chips. The result is a steady increase in demand.
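The IEA figures above imply a striking growth rate. A back-of-envelope sketch, using only the numbers cited in this article, shows the overall multiple and the compound annual growth it would require over the four years from 2022 to 2026:

```python
# Back-of-envelope arithmetic using the IEA figures cited above:
# ~460 TWh in 2022; 620-1,050 TWh projected for 2026
# (data centers, AI, and crypto combined).
baseline_twh = 460
projections_2026 = {"low": 620, "high": 1050}

for label, projected in projections_2026.items():
    growth = projected / baseline_twh
    # Implied compound annual growth rate over 4 years (2022-2026)
    cagr = growth ** (1 / 4) - 1
    print(f"{label}: {growth:.2f}x total growth, ~{cagr:.1%} per year")
```

Even the low scenario implies roughly 8 percent annual growth; the high scenario implies more than 20 percent per year, sustained.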

Local bottlenecks from Dublin to Dallas

Energy is a national and local story. The AI boom concentrates in a few hubs, and those hubs feel the stress first. In Ireland, the grid operator EirGrid restricted new data center connections around Dublin, citing capacity constraints. In Northern Virginia, home to one of the world's largest data center clusters, Dominion Energy paused some new connections in 2022 while it upgraded transmission lines. In Texas, the grid operator ERCOT has forecast strong load growth from large industrial users, including data centers.

These examples share a theme. Data centers need capacity in bulk, and they need it fast. Transmission projects take time, and permits can be slow. Communities want the jobs and tax revenue, but they also worry about noise, traffic, and land use. Siting decisions draw public scrutiny, leaving local officials to balance growth against reliability.

Inside the machine: chips, cooling, and water

AI runs on high-end accelerators. A single advanced chip can draw hundreds of watts. A server holds many chips. A building holds thousands of servers. The heat must go somewhere. Operators use cooling systems to keep temperatures in range. Some sites use air cooling. Others use water or advanced liquid systems. Water can improve efficiency but raises questions in dry regions. A 2023 academic study estimated that training a large model like GPT-3 could directly consume hundreds of thousands of liters of clean water. The exact number depends on location, hardware, and cooling design.
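The chip-to-building scaling described above can be made concrete with a rough estimate. Every figure in this sketch is a hypothetical assumption chosen for illustration, not a measurement of any specific chip or site:

```python
# Illustrative scaling arithmetic. All figures are hypothetical
# assumptions, not measurements of any real facility.
watts_per_chip = 700     # assumed draw of one high-end accelerator
chips_per_server = 8     # assumed accelerators per server
servers = 10_000         # assumed servers in one large facility
pue = 1.3                # assumed power usage effectiveness
                         # (cooling and other overhead)

it_load_mw = watts_per_chip * chips_per_server * servers / 1e6
total_mw = it_load_mw * pue
print(f"IT load: {it_load_mw:.0f} MW, with overhead: {total_mw:.0f} MW")
```

Under these assumptions a single large AI facility lands in the tens of megawatts, comparable to a small power plant's output, which is why individual siting decisions matter to grid operators.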

The other demand is ongoing inference. Training grabs headlines, but inference runs every hour of every day. Chatbots, search tools, and code assistants serve constant traffic. That steady load makes grid planning harder. Peaks and valleys still exist, yet the baseline keeps rising.

Tech’s response: cleaner power, smarter siting

Major AI developers have announced plans to reduce their footprint. Some sign long-term contracts for wind, solar, and nuclear power. Some target 24/7 carbon-free energy at specific sites by the end of the decade. Others invest in on-site batteries or flexible operations. Efficiency is another focus. New chips aim to do more work per watt. Software teams prune models, compress weights, and fine-tune systems. Every percent helps when scaled to millions of users.
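The claim that "every percent helps" at scale can be sketched with arithmetic. The energy-per-query and traffic figures below are hypothetical assumptions, used only to show how a small per-task saving compounds across millions of users:

```python
# Hypothetical figures, chosen only to illustrate scale effects;
# real per-query energy varies widely by model and hardware.
wh_per_query = 3.0              # assumed watt-hours per AI query
queries_per_day = 100_000_000   # assumed daily traffic

daily_mwh = wh_per_query * queries_per_day / 1e6
savings_fraction = 0.10         # a 10% efficiency improvement
saved_mwh = daily_mwh * savings_fraction
print(f"Daily load: {daily_mwh:.0f} MWh; "
      f"a 10% gain saves {saved_mwh:.0f} MWh per day")
```

At this assumed scale, a 10 percent efficiency gain frees up tens of megawatt-hours every day, which is why chip and software efficiency features so heavily in operators' plans.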

There is also a geographic shift. Companies seek locations with cooler climates, easy access to power, and speed to market. Some regions welcome the investment. Others impose limits or new fees. The policy landscape is patchy. Local choices can redirect billions in infrastructure spending.

Regulators and the broader policy debate

Most countries do not regulate AI energy use directly. They regulate electricity, water rights, and land. Those tools are now central to AI growth. Grid operators update interconnection queues. Cities revisit zoning rules. Environmental agencies assess cooling systems. The broader AI policy debate also continues. Lawmakers are writing rules on safety, transparency, and data. The United States issued an executive order on AI safety in 2023. The European Union adopted a risk-based AI law in 2024. Energy is not the core of these measures, but compliance affects model design and deployment.

Industry leaders have called for clearer rules. As OpenAI’s chief executive Sam Altman told U.S. senators in 2023, “regulatory intervention by governments will be critical.” He spoke about AI safety, but the energy question is intertwined. Required audits, red-teaming, or content safeguards can change compute demand and timelines. Planning for that demand is now part of public policy.

Who pays for the upgrades?

Expanding AI capacity often means new lines, substations, and in some cases new power plants. Utilities recover costs from ratepayers or large customers. Advocates argue that AI firms should fund most of the upgrades they drive. Others say broader customers benefit from a modern grid. The financial structure varies by region. Long-term contracts can anchor projects and lower risk. But the clock is ticking. If upgrades lag, some projects move elsewhere.

  • Grid planning cycles are slow. Permitting and construction can take years. AI buildouts happen in months.
  • Location matters. Regions with spare capacity or fast permitting win new sites.
  • Transparency helps. Public load forecasts and project timelines reduce surprise.
  • Efficiency buys time. Better chips and software lower the energy curve per task.

Risks and trade-offs

There are real risks. If AI demand spikes faster than expected, grids could face reliability stress during heat waves or cold snaps. Water use can strain local supplies in drought-prone areas. Rapid construction can outpace community services. On the other side, there are real benefits. AI data centers bring investment, tax revenue, and jobs in construction and operations. Some operators capture waste heat for district heating in colder climates. Others retire older, less efficient sites and consolidate into greener facilities.

What to watch next

The next two years will bring tests in policy and engineering. Three questions stand out:

  • Can efficiency outpace demand? If each model run gets cheaper, growth may be manageable. If not, pressure will spread.
  • Will clean power scale in time? Faster interconnections for wind, solar, nuclear, and storage will be key.
  • How will siting change? Expect more projects near abundant power and cooler climates, and more scrutiny in stressed regions.

For now, the picture is mixed. The IEA’s numbers show a sharp rise, but not a foregone crisis. Policy signals are stronger than before, but uneven. Companies are spending on cleaner power and efficiency, yet user demand keeps growing. The outcome will depend on coordination. Utilities, regulators, and AI firms will have to plan together, and earlier than they used to.

The stakes reach beyond tech. AI services are becoming part of daily life and work. Keeping them running will require more than clever code. It will require power, water, and time—managed with care, and measured in public.