AI's Carbon Footprint: Reducing Impact
Source: news.mit.edu
Mitigating AI's Climate Impact
Experts are exploring methods to lessen the carbon footprint of generative AI, as the technology's energy needs are projected to rise significantly. According to an International Energy Agency report, data centers' global electricity demand could more than double by 2030, reaching about 945 terawatt-hours. Goldman Sachs Research also predicts that fossil fuels will meet approximately 60% of data centers' growing electricity needs, resulting in an increase of roughly 220 million tons in global carbon emissions.
Scientists and engineers are investigating ways to mitigate the increasing carbon footprint of AI, including improving algorithms and redesigning data centers.
Embodied vs. Operational Carbon
Vijay Gadepally, a senior scientist at MIT Lincoln Laboratory, notes that discussions on reducing generative AI's carbon footprint typically focus on “operational carbon.” However, “embodied carbon,” the emissions created by constructing the data center itself, is often overlooked. Gadepally explains that building and retrofitting a data center involves a significant amount of carbon due to the use of materials like steel and concrete, as well as the installation of hardware and cooling systems. Companies such as Meta and Google are considering more sustainable building materials partly for this reason. Data centers also have a much higher energy density than typical office buildings. Gadepally suggests that efforts to reduce operational emissions may also help lower embodied carbon, but more work is needed.
Strategies for Reducing Operational Carbon
Much as dimming household lights saves electricity, capping the power drawn by GPUs in a data center can significantly reduce energy consumption with minimal impact on AI model performance. Using less energy-intensive computing hardware is another strategy: training demanding AI models often requires many GPUs, but engineers can sometimes achieve comparable results with less powerful processors that are specifically tuned for an AI task. Improving the efficiency of training deep-learning models also helps; Gadepally's group found that roughly half of the electricity used during AI model training goes toward achieving the last few percentage points of accuracy, so stopping the training process sooner can conserve considerable energy.
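The diminishing-returns idea behind early stopping can be sketched in a few lines. The accuracy curve and the stopping threshold below are illustrative assumptions, not figures from the article; a real training run would compute validation accuracy each epoch rather than read it from a list:

```python
def train_with_early_stop(accuracy_curve, min_gain=1.0):
    """Stop training once the per-epoch accuracy gain falls below
    `min_gain` percentage points (a hypothetical threshold), trading
    a small amount of accuracy for a large energy saving."""
    completed = 0
    prev = 0.0
    for acc in accuracy_curve:
        if acc - prev < min_gain:
            break  # further epochs mostly burn energy, not add accuracy
        prev = acc
        completed += 1
    return completed, prev

# Simulated accuracy (percent) per epoch, flattening out near the end.
curve = [60.0, 75.0, 83.0, 87.0, 89.0, 90.0, 90.5, 90.8, 90.9]

epochs_run, final_acc = train_with_early_stop(curve, min_gain=1.0)
print(f"Stopped after {epochs_run}/{len(curve)} epochs at {final_acc}% accuracy")
print(f"Epochs (and roughly energy) saved: {1 - epochs_run/len(curve):.0%}")
```

Here training halts after 6 of 9 epochs at 90% accuracy, skipping the final epochs that would have added less than one percentage point each.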
Researchers can also leverage methods that improve efficiency. For example, a postdoc in the Supercomputing Center created a tool to avoid wasted computing cycles, significantly reducing the energy demands of training without affecting model accuracy, Gadepally notes.
Leveraging Efficiency Improvements
Neil Thompson, director of the FutureTech Research Project at MIT’s Computer Science and Artificial Intelligence Laboratory, says that constant innovation in computing hardware continues to improve the energy efficiency of AI models. He adds that the amount of computation GPUs can perform per joule of energy has been improving significantly each year, and that efficiency gains from new model architectures are doubling regularly as well. Thompson describes the concept of a “negaflop,” a computing operation that is made unnecessary by algorithmic improvements. He suggests that making AI models more efficient is the most important thing we can do to lower their environmental costs.
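One way to picture a negaflop is weight pruning: operations on weights that have been zeroed out simply never happen. This toy matrix-vector sketch (hypothetical values, not from the article) counts the multiply-accumulate operations a pruned model skips:

```python
def matvec_ops(weights, x, skip_zeros=False):
    """Multiply a weight matrix by a vector, counting the
    multiply-accumulate operations actually performed. With
    skip_zeros=True, pruned (zero) weights are skipped; each
    skipped operation is, in Thompson's terms, a 'negaflop'."""
    ops = 0
    out = []
    for row in weights:
        acc = 0.0
        for w, v in zip(row, x):
            if skip_zeros and w == 0.0:
                continue  # operation avoided entirely
            acc += w * v
            ops += 1
        out.append(acc)
    return out, ops

# A toy 3x4 weight matrix pruned to 50% sparsity (illustrative).
W = [[0.5, 0.0, 0.0, 1.0],
     [0.0, 2.0, 0.0, 0.5],
     [1.5, 0.0, 1.0, 0.0]]
x = [1.0, 2.0, 3.0, 4.0]

out_dense, dense_ops = matvec_ops(W, x)
out_sparse, sparse_ops = matvec_ops(W, x, skip_zeros=True)
print(f"Dense: {dense_ops} ops; pruned: {sparse_ops} ops; "
      f"negaflops: {dense_ops - sparse_ops}")
```

The pruned pass produces the same output while performing half the operations; at the scale of a large model, operations never executed translate directly into energy never consumed.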
Maximizing Energy Savings
Gadepally notes that the carbon emissions associated with 1 kilowatt-hour of electricity can vary significantly depending on when and where it is generated. Deepjyoti Deka, a research scientist in the MIT Energy Initiative, says that engineers can leverage the flexibility of AI workloads and data center operations to maximize emissions reductions. Shifting some computing operations to times when more of the grid’s electricity comes from renewable sources can help reduce a data center’s carbon footprint. Deka and his team are studying “smarter” data centers in which AI workloads are adjusted to improve energy efficiency. They are also exploring the use of long-duration energy storage units at data centers, which could shift the system’s emissions mix toward renewable energy, Deka says.
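A minimal carbon-aware scheduling sketch illustrates the workload-shifting idea. The hourly carbon-intensity values here are hypothetical (real deployments would pull forecasts from a grid-data provider), and the job size is an illustrative assumption:

```python
def cleanest_hour(intensity, window):
    """Return the index of the lowest-carbon-intensity hour
    within the next `window` hours."""
    candidates = intensity[:window]
    return min(range(len(candidates)), key=candidates.__getitem__)

def emissions_g(energy_kwh, intensity_g_per_kwh):
    """Grams of CO2 emitted by drawing `energy_kwh` at the
    given grid carbon intensity (gCO2 per kWh)."""
    return energy_kwh * intensity_g_per_kwh

# Hypothetical hourly grid carbon intensity (gCO2/kWh),
# dipping midday as solar output peaks.
intensity = [450, 430, 400, 320, 210, 180, 200, 350]

job_kwh = 120  # energy a deferrable training job will draw
run_now = emissions_g(job_kwh, intensity[0])
best = cleanest_hour(intensity, window=8)
deferred = emissions_g(job_kwh, intensity[best])
print(f"Run now: {run_now/1000:.1f} kg CO2; "
      f"deferred to hour {best}: {deferred/1000:.1f} kg CO2")
```

Under these made-up numbers, deferring the job from hour 0 to the cleanest hour cuts its emissions from 54 kg to 21.6 kg of CO2 for the same computation.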
In addition, researchers at MIT and Princeton University are developing GenX, a software tool that can help companies determine the ideal location for a data center to minimize environmental impacts and costs. Some governments are even exploring building data centers on the moon. Jennifer Turliuk says that the expansion of renewable energy generation isn’t keeping pace with AI growth, which is a major roadblock to reducing its carbon footprint. Researchers are exploring the use of AI to speed up connecting new renewable energy systems to the power grid. Turliuk and her collaborators developed the Net Climate Impact Score, a framework to help determine the net climate impact of AI projects.
Turliuk concludes that the most effective solutions will likely result from collaborations among companies, regulators, and researchers. She emphasizes the importance of innovating to make AI systems less carbon-intense.