The environmental impact of Generative AI
Understanding AI’s ecological implications and driving responsible use
Since the public release of ChatGPT in 2022, generative AI’s transformative power has taken the world by storm: it took the platform only five days to reach one million users, compared to Facebook’s ten months or Netflix’s three and a half years to achieve the same milestone. Today, generative AI models are being included in more and more products and services, driving exponential growth in the demand for these technologies.
But this growth comes at a high environmental cost, driven by three main factors:

- The energy required by the algorithms themselves, both for training and everyday use
- The water required to cool data centers
- The ecological impact of manufacturing the chips and devices needed to run AI models
The climate impact of AI’s energy consumption
First, let’s break down Generative AI’s energy usage.
Generative AI requires large amounts of energy at every stage of production and use – from the manufacturing of physical hardware, to the powering and cooling of data centers, to the training of the models, and finally to their everyday use. Each step demands a tremendous amount of power, contributing to a sizable carbon footprint.
For example, training the GPT-3 model consumed, in just 15 days, as much energy as 270 French households use in a year. In the process, it emitted an estimated 552 tonnes of CO2-equivalent, comparable to roughly 200 round trips between Paris and New York.
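As a rough sanity check on those figures, a back-of-the-envelope calculation recovers the same orders of magnitude. The two inputs below are external, commonly cited estimates, not figures from this article: ~1,287 MWh for GPT-3's training energy and ~4.7 MWh of electricity per French household per year.

```python
# Back-of-the-envelope check of the GPT-3 training comparison.
# Assumed external estimates (not from this article, and approximate):
#   - GPT-3 training energy: ~1,287 MWh
#   - Average French household electricity use: ~4.7 MWh/year
training_energy_mwh = 1287
household_mwh_per_year = 4.7

households_per_year = training_energy_mwh / household_mwh_per_year
print(f"Equivalent to ~{households_per_year:.0f} French households for a year")

# CO2 side: 552 t of CO2-equivalent over ~200 Paris-New York round trips
# implies ~2.8 t per round trip, a plausible per-passenger flight figure.
co2_tonnes = 552
round_trips = 200
print(f"~{co2_tonnes / round_trips:.1f} t CO2e per round trip")
```

Both results land close to the article's "270 households" and "200 return trips" comparisons, which suggests the underlying estimates are mutually consistent.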
And while the training phase has historically been the most energy-intensive part of a Large Language Model’s life cycle, the tremendous growth in usage means the deployment phase (everyday use) now leads in energy demand. As a reference point, each ChatGPT request is estimated to consume roughly 10 times as much energy as a Google search.
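To make that 10x comparison concrete, here is a minimal sketch using commonly cited per-query estimates: ~0.3 Wh per Google search and ~3 Wh per ChatGPT request. Both numbers are external assumptions that vary widely across studies, models, and hardware.

```python
# Per-query energy comparison (assumed, commonly cited estimates;
# actual values vary widely by model, hardware, and study).
google_search_wh = 0.3    # ~0.3 Wh per Google search
chatgpt_request_wh = 3.0  # ~3 Wh per ChatGPT request

ratio = chatgpt_request_wh / google_search_wh
print(f"A ChatGPT request uses ~{ratio:.0f}x the energy of a search")

# Scaled up: daily energy for 10 million requests, in kWh.
daily_requests = 10_000_000
daily_kwh = daily_requests * chatgpt_request_wh / 1000
print(f"10M requests/day is about {daily_kwh:,.0f} kWh/day")
```

The scaling step is the point: per-query differences that look tiny in watt-hours become substantial once multiplied by millions of daily requests.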
So what is that energy used for?
Mostly, powering and cooling the enormous data centers that host AI models. The demand is so great that in the US, retired nuclear power plants, such as the infamous Three Mile Island facility, are being brought back online, and new ones are being built. And while nuclear power is relatively “clean” in terms of CO2 emissions (though it raises other issues, such as nuclear waste), many AI data centers, a large share of which are located in the US, are still powered by very CO2-intensive sources such as coal plants.
AI’s impact on limited resources: water and rare earths
Unfortunately, carbon footprint is far from the only ecological concern associated with AI. The chips used to run large AI models (GPUs, or Graphics Processing Units) generate a lot of heat, so AI data centers require intensive cooling to prevent damage to the hardware. This is mostly done with massive amounts of fresh water drawn from the data centers’ surroundings, with dramatic consequences for natural ecosystems, especially in a global context of worsening droughts and water scarcity due to climate change. These data centers are also physically huge, requiring large tracts of land and placing further strain on local ecosystems.
Finally, the third key environmental impact of AI models relates to the production of the physical chips and hardware they run on. These chips, despite their small size, require large amounts of rare earth minerals, which can only be extracted at a high environmental cost to the surrounding ecosystem. Large amounts of water (again) are needed to prepare, clean, and convert these minerals into a usable form, and to manufacture the digital components required to run AI models.
What’s more, beyond the hardware directly needed to run these models, AI creates new potential uses and products for consumers and drives demand for ever more powerful computing systems, which in turn fuels the over-consumption and over-production of new devices, and electronic waste.
Towards a responsible use of AI
In today’s rapidly evolving competitive landscape, AI will continue to develop. However, taking a considered and environmentally literate approach to using and developing AI models, especially generative ones, is a responsible step within everyone’s reach.
One way to do this, especially as a data scientist responsible for developing and implementing AI solutions, is to question the precision and performance each solution truly needs to function effectively. For example, do you need a GenAI model (with its larger carbon footprint), or could a “traditional AI” model perform the task? Will a 1-2% difference in accuracy really matter for your use case, relative to its incremental cost? By asking questions like these, you can stay deliberate about how and when you use each AI tool, and be both efficient and climate-aware.
Another tactic for decreasing the impact of your AI use is prompt engineering: write a clear, complete prompt from the start to reduce the number of queries needed (rather than sending five follow-up questions in a row).
Additionally, as more studies of AI models’ impacts are published, you can compare impacts across models and prioritise the most eco-friendly ones.
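That comparison can be as simple as a lookup over published per-request footprints. The sketch below uses purely hypothetical model names and numbers for illustration; in practice you would fill the table from published studies or measurement tools such as the EcoLogits library mentioned in the resources.

```python
# Hypothetical per-request footprints in grams of CO2-equivalent.
# These names and values are illustrative only, not measured data.
model_gco2_per_request = {
    "large-generative-model": 4.3,
    "small-generative-model": 1.1,
    "classic-ml-classifier": 0.05,
}

# Pick the lowest-impact model among those that can still do the job.
greenest = min(model_gco2_per_request, key=model_gco2_per_request.get)
print(f"Lowest-footprint option: {greenest}")
```

The point of the exercise is the ordering, not the exact numbers: a classic, task-specific model is often orders of magnitude cheaper per request than a general-purpose generative one.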
Finally, as a user, before deciding to use a generative AI tool for a task, follow the NATU method and ask yourself these questions:

- Do I Need to be using this tool?
- Are there available Alternatives with a lower environmental impact?
- Will using this tool really help me gain Time?
- Will using this tool really be Useful for the task I’m trying to achieve?
To go further: additional resources
To dive deeper into the topic of the environmental impact of AI, here are a few helpful resources you can explore:
- EcoLogits calculator: a Python library that tracks the energy consumption and environmental footprint of using generative AI models through APIs.
- Research articles from AI environmental impact researcher Sasha Luccioni: Power Hungry Processing: Watts Driving the Cost of AI Deployment? and Quantifying the Carbon Emissions of Machine Learning.

