Green AI Movement Aims to Reduce Carbon Footprint of AI
Meeting the demand for AI while reducing its environmental impact is a major challenge for 2024, touching on all AI development practices; hyperscalers seek renewable energy sources
By John P. Desmond, Editor, AI in Business

The demand for AI and the need to reduce greenhouse gas emissions are two realities in conflict, posing a sustainability challenge that is shaping up as a major trend for AI and data centers in 2024.
Among companies focused on this issue is Vertiv, a Westerville, Ohio-based provider of data center infrastructure services and enabling software.
“AI and its downstream impact on data center densities and power demands have become the dominant storylines in our industry,” stated Vertiv CEO Giordano (Gio) Albertazzi, in a recent account in IT Wire. He added, “Finding ways to help customers both support the demand for AI and reduce energy consumption and greenhouse gas emissions is a significant challenge requiring new collaborations between data centers, chip and server manufacturers, and infrastructure providers.”
Vertiv sees the primary trends as:
AI sets the terms for new builds and retrofits: Organizations seeking to deploy AI applications in-house need liquid cooling infrastructure to support the high-density computing AI requires. This may mean new construction or large-scale retrofits that alter the power and cooling infrastructure, presenting opportunities to adopt more eco-friendly technology and practices;
Search for energy storage alternatives expanding: New energy storage technologies and approaches that integrate with the electric grid include battery energy storage systems, which can shift loads as needed and pair with alternative energy sources such as solar;
Priority on flexibility for enterprises: Organizations with enterprise data centers are likely to diversify AI deployment strategies beyond the cloud or colocation services.
“Businesses may start to look to on-premises capacity to support proprietary AI,” the Vertiv authors stated, noting that migrations to the cloud pose security challenges. Still, research from Gartner projects global spending on public cloud services to increase by 20 percent in 2024. “Mass migration to the cloud shows no signs of abating,” the Vertiv authors stated.
Growing Use of AI Increasing Electricity Demand
The data center industry, made up of physical facilities designed to store and manage information and communications technology systems, is responsible for two to three percent of global greenhouse gas emissions, according to a recent account in Harvard Business Review.
“The data center servers that store this ever-expanding sea of information require huge amounts of energy and water (directly for cooling, and indirectly for generating non-renewable electricity) to operate computer servers, equipment, and cooling systems,” stated the authors of the account, Ajay Kumar and Tom Davenport. Kumar is an associate professor of information systems at Emlyon Business School in France; Davenport is a professor of IT and management at Babson College. For a sense of scale, data centers account for some seven percent of electricity use in Denmark and three percent in the US.
Generative AI models are supported by hyperscale (very large) cloud providers with thousands of servers running on GPU chips, which demand 10 to 15 times the energy a traditional CPU needs and produce major carbon footprints. The three main hyperscale cloud providers today are Amazon AWS, Google Cloud and Microsoft Azure, the authors stated.
To calculate the carbon footprint of a machine learning model, the authors note that three stages need to be considered: training the model, running the inference engine once the model has been deployed, and the footprint of all the computing hardware and cloud data center capacity required.
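For a rough sense of how such an estimate is assembled, the back-of-the-envelope sketch below multiplies hardware power draw by runtime, the data center's power usage effectiveness (PUE), and the grid's carbon intensity. The function and its default figures are illustrative assumptions, not the HBR authors' methodology:

```python
# Illustrative back-of-the-envelope estimator; the default PUE and grid
# carbon-intensity figures are placeholder assumptions, not measured values.
def training_footprint_kg(avg_power_kw: float, hours: float,
                          pue: float = 1.5,
                          grid_kg_co2_per_kwh: float = 0.4) -> float:
    """Estimated kg of CO2 emitted by one training run."""
    return avg_power_kw * hours * pue * grid_kg_co2_per_kwh

# Example: 100 GPUs drawing ~0.3 kW each for two weeks (336 hours).
print(training_footprint_kg(avg_power_kw=30.0, hours=336))  # ~6,048 kg CO2
```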
The bigger the model, the more energy it uses. GPT-3, the “parent” of ChatGPT, has 175 billion model parameters and was trained on over 500 billion words of text, the authors state. Model training is the most energy-intensive stage: researchers estimate that training a model such as GPT-4 or Google PaLM emits some 300 tons of CO2. By comparison, the average person creates about five tons of CO2 per year, and North Americans about five times that, the authors suggest; a single such training run thus equals the annual emissions of roughly 60 average people.
The inference stage consumes less energy per session, but sessions accumulate over a model’s lifetime. Nvidia has estimated that 80 to 90 percent of the energy cost of neural networks lies in ongoing inference processing after a model has been trained.
Fine-tuning prompts and answers to customize a model to the specific content of an organization consumes less energy and computing power than training from scratch, the authors state.
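A minimal PyTorch sketch of why fine-tuning is cheaper (an illustration, not the HBR authors’ recipe): the pretrained backbone is frozen, so gradient computation and weight updates touch only a small new task head rather than all of the model’s parameters.

```python
# Illustrative fine-tuning sketch: reuse a pretrained model, train only a
# small new head. Assumes torch and torchvision are installed.
import torch
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False          # freeze the pretrained backbone

model.fc = torch.nn.Linear(model.fc.in_features, 10)  # new 10-class head
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

# One illustrative step on a random batch; a real loop would use task data.
x = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 10, (8,))
loss = torch.nn.functional.cross_entropy(model(x), labels)
loss.backward()
optimizer.step()
```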
To make AI greener, the authors’ suggestions included:
Use existing large generative AI models instead of building your own;
Perform fine-tuning on existing models;
Use less expensive computation methods, such as TinyML, for processing (see the quantization sketch after this list);
Use a large model only when it provides significant value; and
Evaluate the energy sources of your cloud provider or data center.
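On the TinyML point, the sketch below (assuming TensorFlow is installed; the tiny untrained model is a placeholder for a real trained network) uses post-training quantization to shrink a Keras model so it can run with cheaper arithmetic on small devices:

```python
import tensorflow as tf

# Placeholder model; in practice this would be a trained network.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])
model.build(input_shape=(None, 28, 28))

# Post-training quantization: smaller weights, cheaper inference arithmetic.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```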
To take advantage of hydroelectric power, Google has recently started to build a $735 million clean energy data center in Quebec. Google aims to shift to carbon-free energy by 2030.
Developers now have access to carbon monitoring tools that can be built into applications, including CodeCarbon, Green Algorithms, and ML CO2 Impact.
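CodeCarbon, for example, is an open-source Python package (pip install codecarbon) that wraps a workload and estimates its emissions. In the minimal sketch below, the arithmetic loop stands in for a real training job:

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="green-ai-demo")
tracker.start()
_ = sum(i * i for i in range(10_000_000))  # stand-in for model training
emissions_kg = tracker.stop()              # estimated kg of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```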
“We should encourage the developer community to consider these performance metrics to establish benchmarks and to evaluate ML models,” the HBR authors state.
Green AutoML a Field of Study for These AI Scientists
Interest in green AI has spawned a proposal by a group of AI scientists for “Green AutoML, a paradigm to make the whole AutoML process more environmentally friendly,” recently published in The Journal of AI Research.
The research paper, from scientists with three institutions in Germany and one in Colombia, suggests four categories of actions the AI community could take to reduce the environmental footprint of AI research: design, benchmarking, transparency about the footprint, and research incentives to direct AutoML research in a more sustainable direction. “Additionally, we elaborate on the tradeoff between focusing on environmental impact and the freedom of research,” stated the authors, who are led by Tanja Tornede, a researcher at the Institute of AI at Leibniz University, Hannover, Germany. She is in a PhD program at Paderborn University, concentrating on green AutoML.
Energy-efficient AutoML methods suggested by the authors include “warmstarting,” in which optimization processes integrate knowledge gained in prior executions, and “zero-shot AutoML,” a warmstarting mechanism that recommends a single candidate pipeline, which the system adopts without any evaluation.
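As an illustrative sketch of the warmstarting idea (not the paper’s implementation), the toy search below spends part of its budget re-evaluating the best configurations from earlier runs before sampling fresh random candidates. Zero-shot AutoML is the extreme case: the single recommended configuration is adopted with no search at all.

```python
import random

def evaluate(config):
    # Stand-in objective; in practice this would train and score a model.
    return -(config["lr"] - 0.01) ** 2 - (config["depth"] - 5) ** 2

def random_config(rng):
    return {"lr": rng.uniform(1e-4, 1e-1), "depth": rng.randint(1, 10)}

def search(budget, warmstart=(), seed=0):
    rng = random.Random(seed)
    candidates = list(warmstart)  # prior winners are tried first
    candidates += [random_config(rng) for _ in range(budget - len(candidates))]
    return max(candidates, key=evaluate)

prior_best = [{"lr": 0.012, "depth": 5}]   # knowledge from an earlier run
print(search(budget=20, warmstart=prior_best))
```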
The authors suggest researchers apply a sustainability checklist to each AI research paper, covering: key aspects of the development approach designed with efficiency in mind; steps taken to reduce the carbon footprint of the development process; consideration of a metric related to the environmental footprint; any use of existing benchmarks; and whether the generated data was made publicly available.
“We think that the environmental impact of a paper should be considered as a criterion in the review process,” the authors stated.
Microsoft is moving to the use of nuclear energy to power its AI operations, according to a recent account in the Wall Street Journal. Microsoft is also employing AI in the licensing effort itself, which the company hopes will shorten the time it takes to meet federal requirements, a process that usually takes years. Microsoft is reported to see more potential in nuclear power than in renewable sources of energy.
Read the source articles and information in IT Wire, Harvard Business Review, The Journal of AI Research and the Wall Street Journal.