How much power and water does AI use? Google, Mistral weigh in

How badly does AI harm the environment? We now have some answers to that question, as both Google and Mistral have published self-assessments of the environmental impact of an AI query.
In July, Mistral, which develops its own AI models, released a self-evaluation of the environmental impact of training and querying its models, measured in carbon dioxide (CO2) emitted, water consumed, and material resources consumed. Google took a slightly different approach, publishing the amount of power and water a Gemini query consumes, as well as how much CO2 it produces.
Of course, there are caveats: each report was self-generated rather than audited by an outside party. Also, training a model consumes vastly more resources than inference, the day-to-day work a chatbot performs each time a user queries it. Still, the reports provide some context for how heavily AI taxes the environment, even though they exclude the effects of training and inference by OpenAI and other competitors.
On Thursday, Google said a “median” Gemini text query consumes 0.24 Wh of energy and 0.26 milliliters (about five drops) of water, and generates the equivalent of 0.03 grams of carbon dioxide, roughly the same as watching nine seconds of TV. Mistral’s figures differ slightly: a “Le Chat” response generating a page of text (400 tokens) consumes 50 milliliters of water, produces the equivalent of 1.14 grams of carbon dioxide, and uses the equivalent of 0.2 milligrams of non-renewable resources.
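As a rough sanity check on the TV comparison: assuming a set that draws about 100 watts (an assumption on our part; Google doesn’t publish the figure it used), 0.24 Wh does work out to roughly nine seconds of viewing.

```python
# Back-of-envelope check on Google's "nine seconds of TV" comparison.
# ASSUMPTION: the TV draws about 100 W; Google does not state the figure it used.
QUERY_ENERGY_WH = 0.24      # median Gemini text query, per Google
TV_POWER_W = 100            # assumed television power draw

query_energy_joules = QUERY_ENERGY_WH * 3600    # 1 Wh = 3,600 J
tv_seconds = query_energy_joules / TV_POWER_W   # seconds of viewing at 100 W
print(f"{tv_seconds:.1f} seconds of TV")        # prints ~8.6, close to Google's figure
```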
Google said that comparable estimates typically use a narrower methodology, looking only at the impact of active TPU and GPU consumption. Measured that way, the median Gemini text prompt uses 0.10 Wh of energy, consumes 0.12 ml of water, and emits the equivalent of 0.02 grams of carbon dioxide.
Google did not release any assessment of the impact of training its Gemini models. Mistral did: as of January 2025, training its Large 2 model had produced the equivalent of 20.4 kilotons of carbon dioxide, consumed 281,000 cubic meters of water, and used 650 kilograms of material resources. That’s about 112 Olympic-sized swimming pools’ worth of water. Using the EPA’s estimate that an average car produces 4.6 metric tons of carbon dioxide annually, the CO2 figure works out to the annual output of roughly 4,435 cars.
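Those equivalences follow from simple arithmetic, sketched below using the EPA’s 4.6-metric-ton figure and an assumed 2,500-cubic-meter volume for an Olympic pool (the nominal 50 m x 25 m x 2 m basin).

```python
# Converting Mistral's reported training footprint into the article's everyday equivalents.
# ASSUMPTION: an Olympic pool holds ~2,500 cubic meters (nominal 50 m x 25 m x 2 m volume).
TRAINING_CO2_TONNES = 20_400        # 20.4 kilotons of CO2 equivalent
TRAINING_WATER_M3 = 281_000         # cubic meters of water consumed

CAR_CO2_TONNES_PER_YEAR = 4.6       # EPA estimate for an average passenger car
OLYMPIC_POOL_M3 = 2_500             # assumed pool volume

print(f"Car-years of CO2: {TRAINING_CO2_TONNES / CAR_CO2_TONNES_PER_YEAR:,.0f}")  # ~4,435
print(f"Olympic pools:    {TRAINING_WATER_M3 / OLYMPIC_POOL_M3:,.0f}")            # ~112
```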
The environmental impact assessments assume that the electricity involved is generated by carbon-emitting sources such as coal; “clean” energy, like solar, lowers that figure.
Likewise, the amount of water “consumed” typically assumes the use of evaporative cooling, where heat is transferred from the chip or server (which may itself be water-cooled) to what’s known as an evaporative cooler. The evaporative cooler sheds heat efficiently, much as your body cools itself after a workout: as you sweat, the moisture evaporates, an endothermic process that pulls heat from your body. An evaporative cooler does the same thing, wicking heat away from a server farm but evaporating that water into the atmosphere in the process.
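To see why the water and heat figures are linked, here is a minimal back-of-envelope sketch assuming the textbook latent heat of vaporization of water (about 2,260 joules per gram) and ignoring cooling-system losses; it only illustrates the principle, not any real data center’s design.

```python
# Rough illustration of why evaporative cooling "consumes" water: evaporating water absorbs heat.
# ASSUMPTION: latent heat of vaporization ~2,260 J per gram; real cooling systems are less ideal.
LATENT_HEAT_J_PER_G = 2260

def heat_absorbed_wh(water_ml: float) -> float:
    """Heat (in Wh) absorbed by fully evaporating `water_ml` milliliters of water."""
    grams = water_ml                            # 1 ml of water weighs roughly 1 g
    return grams * LATENT_HEAT_J_PER_G / 3600   # joules -> watt-hours

# Google's 0.26 ml per median query could absorb roughly 0.16 Wh of heat if fully evaporated.
print(f"{heat_absorbed_wh(0.26):.2f} Wh")
```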
Mistral’s environmental impact assessment includes a footnote noting the differences in the electricity that France and the United States consume.
Google said it takes a holistic approach to managing energy, including more efficient models, optimized inference through models like Flash-Lite, custom-built TPUs, efficient data centers, and efficient idling of CPUs that aren’t being used. Clean energy generation, such as a planned nuclear reactor, can help lower the impact numbers, too.
“Today, as AI becomes increasingly integrated into every layer of our economy, it is crucial for developers, policymakers, enterprises, governments, and citizens to better understand the environmental footprint of this transformative technology,” Mistral’s own report adds. “At Mistral AI, we believe that we share a collective responsibility with each actor of the value chain to address and mitigate the environmental impacts of our innovations.”
How much water and electricity does ChatGPT consume?
Other AI companies have yet to publish comparable reports. EpochAI estimates that the average GPT-4o query on ChatGPT consumes about 0.3 Wh of energy, based on its assumptions about the types of servers OpenAI uses.
However, the amount of resources AI consumes can vary considerably, and even AI energy scores are rudimentary at best.
“In reality, the type and size of the model, the type of output you’re generating, and countless variables beyond your control—like which energy grid is connected to the data center your request is sent to and what time of day it’s processed—can make one query thousands of times more energy-intensive and emissions-producing than another,” an MIT Technology Review analysis found. The publication estimated that a daily habit of 15 queries, 10 generated images, and three 5-second videos would consume 2.9 kWh of electricity.
Still, Mistral’s study authors note that its estimates point the way toward a “scoring system” that buyers and users could consult to choose the AI models with the least environmental impact, and the company called on other AI model makers to follow its lead.
Whether AI is “bad” for the environment is still up for debate, but the reports from Google and Mistral provide a foundation for a more reasoned discussion.
© 2025 PC World