AI Power Use in Data Centers Nearing 50%
Source: theguardian.com
Artificial intelligence systems may consume nearly half of all datacenter power by the close of the year, according to a new analysis. The International Energy Agency has predicted AI could require as much energy by the end of the decade as Japan uses today.
Alex de Vries-Gao, founder of the Digiconomist tech sustainability website, based his estimates on the power consumed by chips from Nvidia, Advanced Micro Devices, and Broadcom that are used to train and operate AI models. His research is slated for publication in Joule, a sustainable energy journal.
Energy Consumption
The IEA estimates that datacenters (excluding those used for cryptocurrency mining) consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao's research suggests AI could already account for 20% of that amount.
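As a rough back-of-the-envelope check (a calculation made here from the figures above, not one stated in the analysis), 20% of 415 TWh works out to roughly 83 TWh of electricity attributable to AI over the year.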
The calculations factor in variables such as a datacenter's energy efficiency and the electricity consumed by cooling systems for the servers that handle AI workloads.
Sustainability Concerns
Datacenters, which are essential to AI technology, have high energy demands, making sustainability a significant concern. De Vries-Gao estimates that by the end of 2025, AI systems could account for nearly 49% of total datacenter power consumption, excluding crypto mining. AI's power demand could reach 23 gigawatts (GW), twice the total energy consumption of the Netherlands.
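As another back-of-the-envelope check (again derived here rather than taken from the paper, and assuming that power draw were sustained around the clock), 23 GW running for a full year corresponds to about 23 GW × 8,760 hours ≈ 200 TWh, which is roughly 49% of the IEA's 415 TWh datacenter figure, consistent with the estimate above.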
Factors Influencing Demand
De Vries-Gao noted that several factors could slow hardware demand, including waning demand for AI applications and geopolitical tensions that constrain AI hardware production. He cited restrictions on Chinese access to chips, which led to the DeepSeek R1 AI model that was built using fewer chips. He said such innovations can reduce the computational and energy costs of AI, but cautioned that efficiency gains could also encourage more AI use. Multiple countries building their own AI systems could also increase hardware demand.
De Vries-Gao pointed to Crusoe Energy, a US datacenter startup that has secured 4.5 GW of gas-powered generating capacity for its infrastructure, with OpenAI among the potential customers through its Stargate joint venture. He writes that there are indications these datacenters could worsen dependence on fossil fuels. OpenAI has announced a Stargate project in the United Arab Emirates, its first outside the US.
Microsoft and Google admitted last year that their AI drives were endangering their ability to meet internal environmental targets. De Vries-Gao said information on AI’s power demands has become increasingly scarce, describing it as an opaque industry.
The EU AI Act requires AI companies to disclose the energy consumed in training a model, but not in day-to-day use. Prof Adam Sobey of the Alan Turing Institute said more transparency is needed both on how much energy AI systems consume and on how much they could save by improving the efficiency of carbon-emitting industries. He added that he suspects it would not take many very good use cases of AI to offset the energy being used on the front end.