Generative AI and Climate Change Are on a Collision Course

In 2025, AI and climate change, two of the biggest societal disruptors we’re facing, will collide.

The summer of 2024 brought Earth’s hottest day since record-keeping began, sparking widespread media coverage and public debate. 2024 was also the year that both Microsoft and Google, two of the leading big tech companies investing heavily in AI research and development, missed their climate targets. While that too made headlines and spurred indignation, AI’s environmental impacts are still far from common knowledge.

In reality, AI’s current “bigger is better” paradigm, epitomized by tech companies’ pursuit of ever larger, more powerful large language models presented as the solution to every problem, comes with very significant environmental costs. These range from the colossal amounts of energy needed to power the data centers that run tools such as ChatGPT and Midjourney, to the millions of gallons of freshwater pumped through those data centers to keep them from overheating, to the tons of rare earth metals needed to build the hardware they contain.

Data centers already use 2 percent of electricity globally. In countries like Ireland, that figure goes up to one-fifth of the electricity generated, which prompted the Irish government to declare an effective moratorium on new data centers until 2028. While a lot of the energy used for powering data centers is officially “carbon-neutral,” this relies on mechanisms such as renewable energy credits, which do technically offset the emissions incurred by generating this electricity, but don’t change the way in which it’s generated.

Places like Data Center Alley in Virginia are mostly powered by nonrenewable energy sources such as natural gas, and energy providers are delaying the retirement of coal power plants to keep up with the increased demand from technologies like AI. Data centers are slurping up huge amounts of freshwater from scarce aquifers, pitting local communities against data center operators in places ranging from Arizona to Spain. In Taiwan, the government chose to allocate precious water resources to chip manufacturing facilities to stay ahead of rising demand, rather than letting local farmers use that water for their crops, amid the worst drought the country has seen in more than a century.

My latest research shows that switching from older task-specific AI models, trained to do a single task such as question answering, to the new generative models can use up to 30 times more energy to answer the exact same set of questions. The tech companies that are adding generative AI to everything from search engines to text-processing software are also not disclosing the carbon cost of these changes; we still don’t know how much energy is used during a conversation with ChatGPT or when generating an image with Google’s Gemini.
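To give a concrete sense of how such a comparison can be run, here is a minimal sketch that measures the footprint of a task-specific question-answering model against a general-purpose generative model answering the same questions. The tooling and model names (the open source codecarbon tracker, Hugging Face pipelines, distilbert as the extractive model, gpt2 as a stand-in generative model) are illustrative assumptions, not the exact methodology behind the figures above.

```python
# Illustrative sketch only: compare the measured footprint of a task-specific
# QA model and a generative model answering the same questions.
# Requires: pip install transformers torch codecarbon
from codecarbon import EmissionsTracker
from transformers import pipeline

# The same questions (with context) are sent to both models.
QUESTIONS = [
    ("What do data centers use to stay cool?",
     "Data centers pump millions of gallons of freshwater through their "
     "facilities to keep the hardware from overheating."),
    ("What share of global electricity do data centers use?",
     "Data centers already account for roughly 2 percent of the electricity "
     "used globally."),
]

def tracked_run(label, run_fn):
    """Run `run_fn` under codecarbon and return its estimated emissions (kg CO2eq)."""
    tracker = EmissionsTracker(project_name=label, log_level="error")
    tracker.start()
    try:
        run_fn()
    finally:
        emissions_kg = tracker.stop()
    return emissions_kg

# 1) Older task-specific model: extractive question answering.
extractive_qa = pipeline("question-answering",
                         model="distilbert-base-cased-distilled-squad")

def run_extractive():
    for question, context in QUESTIONS:
        extractive_qa(question=question, context=context)

# 2) General-purpose generative model answering the same questions as free text.
generative_lm = pipeline("text-generation", model="gpt2")

def run_generative():
    for question, context in QUESTIONS:
        prompt = f"{context}\nQuestion: {question}\nAnswer:"
        generative_lm(prompt, max_new_tokens=50, do_sample=False)

extractive_kg = tracked_run("extractive-qa", run_extractive)
generative_kg = tracked_run("generative-qa", run_generative)

print(f"Extractive QA emissions: {extractive_kg:.8f} kg CO2eq")
print(f"Generative QA emissions: {generative_kg:.8f} kg CO2eq")
if extractive_kg:
    print(f"Generative / extractive ratio: {generative_kg / extractive_kg:.1f}x")

# Note: a handful of queries gives a very noisy estimate; a meaningful
# comparison needs thousands of queries, fixed hardware, and repeated runs.
```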

Much of the discourse from Big Tech around AI’s environmental impacts has followed two trajectories: either it’s not really an issue (according to Bill Gates), or an energy breakthrough will come along and magically fix things (according to Sam Altman). What we really need is more transparency around AI’s environmental impacts, by way of voluntary initiatives like the AI Energy Star project that I’m leading, which would help users compare the energy efficiency of AI models to make informed decisions. I predict that in 2025, measures like these, currently voluntary, will start to be enforced via legislation, from national governments to intergovernmental organizations like the United Nations. With more research, public awareness, and regulation, we will finally start to grasp AI’s environmental footprint and take the necessary actions to reduce it.
