AI Power Demand Surges in Data Centers

Source: wired.com

Published on May 23, 2025

AI's Growing Energy Consumption

Research published in the journal Joule indicates that AI may already account for as much as 20 percent of the power demand in global data centers. According to the research, that demand could roughly double by the end of the year, at which point AI would make up almost half of all data center electricity consumption worldwide, not including electricity used for bitcoin mining.

Alex de Vries-Gao, founder of Digiconomist, a research firm that analyzes the environmental effects of technology, published the findings in a commentary. De Vries-Gao originally created Digiconomist to investigate bitcoin mining, another highly energy-intensive activity, but says that in recent years the widespread adoption of ChatGPT and other power-hungry large language models has made AI an increasingly important subject of study. His research indicates that AI's energy demand is expected to surpass that of bitcoin mining by the end of this year.

He states that the investments made by bitcoin miners pale in comparison to those being made by major tech firms in AI. He adds that the situation is rapidly intensifying and poses a greater threat.

Impact on Big Tech's Climate Goals

The advancement of AI is already influencing the climate objectives of Big Tech. Tech companies have noted in their recent sustainability reports that AI is largely responsible for the increase in their energy consumption. For example, Google's greenhouse gas emissions have gone up by 48 percent since 2019, making it more difficult for the company to achieve its net-zero target by 2030.

Google's 2024 sustainability report states that as they integrate AI further into their products, it may be difficult to lower emissions due to the increasing energy needs of AI computing.

The International Energy Agency released a report last month finding that data centers accounted for 1.5 percent of global electricity consumption in 2024, or about 415 terawatt-hours, slightly less than the annual electricity demand of Saudi Arabia. That figure is expected to rise: data centers' electricity consumption has grown four times faster than overall consumption in recent years, and investment in data centers has almost doubled since 2022, driven largely by major expansions to accommodate new AI capacity. The IEA forecasts that data center electricity consumption will exceed 900 TWh by the end of the decade.

However, the specific proportion of electricity used by AI in data centers is still largely unknown. Data centers support a range of services, such as cloud hosting and online infrastructure, which are not always related to the energy-intensive operations of AI. Meanwhile, tech companies usually keep their software and hardware energy usage a secret. Some attempts to measure AI's energy use have started by calculating the electricity needed for a single ChatGPT search.

De Vries-Gao chose to examine the supply chain instead, beginning with the production side to gain a broader perspective. According to De Vries-Gao, the significant computing demands of AI create a natural bottleneck in the current global supply chain for AI hardware, especially around the Taiwan Semiconductor Manufacturing Company (TSMC), which is the top manufacturer of essential hardware. Companies like Nvidia outsource their chip production to TSMC, which also makes chips for companies like Google and AMD. (Both TSMC and Nvidia chose not to provide a statement for this article.)

De Vries-Gao used analyst estimates, earnings call transcripts, and device details to make an approximate estimate of TSMC's production capacity. He then looked at publicly available electricity consumption profiles of AI hardware and estimates of hardware utilization rates to get a rough idea of how much of the global data center demand is being used by AI.

De Vries-Gao estimates that, without increased production, AI will use up to 82 terawatt-hours of electricity this year, which is about the same as Switzerland's annual electricity consumption. If production capacity for AI hardware doubles this year, as analysts predict, demand could rise at a similar rate, accounting for nearly half of all data center demand by the end of the year.
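The estimate described above amounts to a back-of-envelope multiplication: installed hardware, times per-device power draw, times utilization, times hours in a year. A minimal sketch of that arithmetic follows; every input figure here (device count, power draw, utilization) is a hypothetical placeholder for illustration, not a value from de Vries-Gao's paper.

```python
# Sketch of the supply-chain estimation method described in the article.
# All numeric inputs are illustrative assumptions, NOT figures from the paper.

HOURS_PER_YEAR = 8_760

def annual_twh(devices: int, kw_per_device: float, utilization: float) -> float:
    """Rough annual electricity use in terawatt-hours for a fleet of AI accelerators."""
    kwh = devices * kw_per_device * utilization * HOURS_PER_YEAR
    return kwh / 1e9  # kWh -> TWh

# Hypothetical inputs: 4 million accelerators, 1.4 kW average draw including
# cooling overhead, 65 percent average utilization.
estimate = annual_twh(4_000_000, 1.4, 0.65)
print(f"{estimate:.1f} TWh per year")
```

The same structure shows why the margins of error are so wide: each factor (how many devices ship, how hard they run, how much overhead the facility adds) is itself only an estimate, and errors multiply through the chain.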

The Unknowns and the Need for Transparency

Despite the amount of publicly available data used in the paper, much of what De Vries-Gao is doing involves looking into a black box. Factors that affect AI's energy use, such as the utilization rates of AI hardware and the machine learning tasks it is used for, are simply unknown, as is how the industry may develop. Sasha Luccioni, an AI and energy researcher who was not involved in this research, advised against relying too heavily on some of the paper's conclusions, given the numerous unknowns.

Luccioni says that accurately calculating AI's energy consumption depends on tech companies sharing their data. Because researchers lack that information, she says, they are forced to rely on estimates, which is why the margins of error are so large. Tech companies do, in fact, withhold this data. Google released a paper in 2022 on machine learning and electricity consumption, stating that machine learning accounted for 10 to 15 percent of Google's total energy use from 2019 to 2021, and predicting that with best practices, total carbon emissions from training would decrease by 2030. Google has released no further details about how much electricity ML consumes since that paper, which came out before Google Gemini's launch in 2023. (Google declined to comment for this story.)

De Vries-Gao says that one needs to delve deeply into the semiconductor supply chain to make any meaningful statements about AI's energy demand. He added that if big tech companies shared the same kind of data Google released three years ago, researchers would have a reasonable way to assess AI's energy consumption.