● Efforts to evaluate the carbon footprint of artificial intelligence, a complex task long hampered by a lack of transparency, have made some progress with the publication of figures from Google and from the French company Mistral AI, which commissioned the first-ever study of the life cycle of a large language model.
● Researchers such as Sasha Luccioni, along with the wider artificial intelligence ecosystem through the Coalition for Sustainable AI, are mobilizing to propose standardized methods, greater transparency, and sustainability-focused research so that AI contributes to the fight against global warming.
July 2025 was marked by the long-awaited publication of the first life-cycle analysis (LCA) of a large language model, Mistral Large 2. Commissioned by Mistral AI with support from the French government agency ADEME and conducted by Carbone 4, the study provides a comprehensive report on the model’s water and materials consumption as well as its GHG emissions. With regard to materials, it confirms the predominant role of chip and datacenter production in AI’s overall footprint. “Greater transparency on the part of the actors in the GPU value chain, such as NVIDIA, will be necessary to refine analyses and improve the reliability of their results,” notes Carbone 4. The LCA also details the training phase of Large 2, which produced nearly 20.4 kilotonnes of CO₂ equivalent, consumed 281,000 m³ of water and used 660 kg of Sb eq (a standard unit for resource depletion) of non-renewable resources, as well as the impact of Mistral AI’s virtual assistant: a 400-token response from the Le Chat assistant generates the equivalent of 1.14 g of CO₂, while consuming 45 mL of water and 0.16 mg of Sb eq.
Google has also lifted the veil on Gemini: in May 2025, the median Gemini Apps text prompt consumed 0.24 Wh of energy and 0.26 mL of water while generating 0.03 g of CO₂e, the equivalent of watching less than nine seconds of television. More information here: https://arxiv.org/abs/2508.15734
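To put these per-response figures side by side, here is a minimal back-of-the-envelope sketch in Python. The per-response values are the ones quoted above from the Mistral LCA and Google’s report; the daily request volume is a purely hypothetical assumption, and the two sets of figures are not strictly comparable, since the studies differ in scope and methodology.

```python
# Per-response footprints as reported: Mistral's LCA for a 400-token Le Chat
# response and Google's figure for the median Gemini Apps text prompt.
LE_CHAT_CO2_G = 1.14      # g CO2 eq per response (Mistral LCA)
LE_CHAT_WATER_ML = 45.0   # mL of water per response

GEMINI_ENERGY_WH = 0.24   # Wh per median text prompt (Google)
GEMINI_CO2_G = 0.03       # g CO2e per median text prompt
GEMINI_WATER_ML = 0.26    # mL of water per median text prompt

# Carbon intensity implied by Google's figures: 0.03 g / 0.24 Wh = 125 g/kWh.
implied_intensity = GEMINI_CO2_G / GEMINI_ENERGY_WH * 1000  # g CO2e per kWh
print(f"Implied carbon intensity behind Gemini's figure: {implied_intensity:.0f} g CO2e/kWh")

# Hypothetical daily volume (an assumption, not from either report), to show
# how small per-response figures add up at scale.
requests_per_day = 10_000_000
print(f"Le Chat at that volume: {LE_CHAT_CO2_G * requests_per_day / 1e6:.1f} t CO2 eq/day")
print(f"Gemini at that volume:  {GEMINI_CO2_G * requests_per_day / 1e6:.2f} t CO2e/day")
```

The grid intensity implied by Google’s numbers, roughly 125 g CO₂e per kWh, also hints at how strongly the local electricity mix drives these figures, a point the sections below return to.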
A worrying environmental impact
In our era of climate change, the environmental impact of the digital sector is an increasing cause of global concern, not least because fossil fuels (coal, oil and gas) are burned to power networks, batteries and devices and, even more critically, the ever-growing number of data centres that host innumerable applications as well as storing and processing our data. Since 2023, the impact of AI on increasingly limited water resources has become a hot political topic, a fact highlighted by a recent study from the University of California, Riverside, which found that GPT-3 consumed roughly 500 mL of water to produce 10 to 30 responses.
In France, the issue is the subject of a national Digital and Environment Programme launched by the French Institute for Research in Computer Science and Automation in 2022. A year later, Soumya Sudhakar, Vivienne Sze and Sertac Karaman of the Massachusetts Institute of Technology (United States) presented the alarming results of a model simulating the potential emissions of onboard data processing in electric autonomous vehicles, which notably concluded that the computing required by a global fleet of one billion autonomous vehicles would have a carbon footprint at least as large as the one currently generated by all of the world’s data centres.
An issue made all the more urgent by the rise of AI
A study published in mid-February 2023 modelled the emissions generated by machine learning between 2012 (a breakthrough year in the field) and 2021. The two authors, a specialist researcher working for the company Hugging Face and a postdoctoral researcher at the Quebec Artificial Intelligence Institute, selected 95 machine-learning models mentioned in 77 scientific articles drawn from five data-processing fields.
The idea was not to evaluate the exact quantity of carbon dioxide linked to each of these, but rather to outline the main trends. “It’s really hard to gather all the necessary information to perform detailed carbon footprint estimates,” points out Sasha Luccioni of Hugging Face. “AI papers tend not to disclose the amount of computing power used, nor where training was carried out.”
Lower levels of performance do not necessarily imply lower emissions
The research focused on the training phase of the learning models, which requires a great deal of computing power. The first finding was that 73 out of the 95 models were trained using electricity generated mainly from coal, natural gas and oil. By way of illustration, models powered by energy sourced from coal generated an average of 512 g of CO₂ equivalent per kilowatt-hour, as opposed to 100.6 g for those mainly powered by hydroelectricity (several greenhouse gases are emitted, but they are converted to CO₂ equivalent to provide a single figure). Secondly, higher electricity consumption does not necessarily imply a larger carbon footprint, given the low emissions of models running on hydroelectricity. Another finding was that, when comparing two models powered by fossil fuels, lower performance did not necessarily mean a lower carbon footprint.
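As a minimal sketch of the arithmetic behind that comparison, the snippet below simply multiplies energy consumed by the carbon intensity of the grid, using the two average intensities quoted above; the training energy figures are hypothetical and are there only to show why a larger model on a low-carbon grid can emit less than a smaller one on a coal-heavy grid.

```python
# Minimal sketch: emissions = energy consumed x grid carbon intensity.
# Intensities are the averages quoted in the study (512 g CO2 eq/kWh for
# coal-dominated grids, 100.6 g/kWh for hydro-dominated ones); the training
# energy figures are hypothetical, for illustration only.

def training_emissions_kg(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """CO2-equivalent emissions (kg) for a given energy use and grid mix."""
    return energy_kwh * intensity_g_per_kwh / 1000.0

COAL_G_PER_KWH = 512.0
HYDRO_G_PER_KWH = 100.6

# A smaller model trained on a coal-heavy grid vs a model four times as
# energy-hungry trained on hydroelectricity (hypothetical energy figures).
small_on_coal = training_emissions_kg(100_000, COAL_G_PER_KWH)    # 51,200 kg
large_on_hydro = training_emissions_kg(400_000, HYDRO_G_PER_KWH)  # 40,240 kg

print(f"Small model, coal grid:  {small_on_coal:,.0f} kg CO2 eq")
print(f"Large model, hydro grid: {large_on_hydro:,.0f} kg CO2 eq")
```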
The carbon footprint of machine translation algorithms has been declining since 2019
However, the researchers did not observe “a systematic increase of carbon emissions for individual tasks.” Footprints generated by image-classification models and chatbots are continuing to grow, but those of machine translation algorithms have been declining since 2019. Nevertheless, the increasing impact of the digital sector as a whole is undeniable. Learning models generated an average of 487 tonnes of CO₂ equivalent in 2015-2016. In 2020-2022, this figure, which covers training alone, reached 2,020 tonnes. Deployment also has a major impact. A single ChatGPT request can certainly be fulfilled at minimal cost in terms of energy, but the millions of requests directed every day to a constantly growing number of chatbots are much more problematic. “That is what I am working on now,” points out Sasha Luccioni. “However, it remains a complex task, given that the manner in which models are deployed, the hardware used, and scaling, etc. all have a big influence on the energy required and carbon emitted.”
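To illustrate why deployment worries researchers, the sketch below scales the per-response figure from the Mistral LCA quoted earlier up to a hypothetical daily request volume and compares the result with that model’s one-off training footprint; the traffic number is an assumption for illustration, not a published figure.

```python
# A rough sense of scale for deployment, using the per-response figure from
# the Mistral LCA quoted earlier (1.14 g CO2 eq per 400-token Le Chat
# response) and the published training footprint of Large 2 (20.4 kt CO2 eq).
# The daily request volume is a purely hypothetical assumption.

PER_RESPONSE_G = 1.14           # g CO2 eq per response (Mistral LCA)
TRAINING_FOOTPRINT_T = 20_400   # t CO2 eq, one-off training of Large 2

requests_per_day = 100_000_000  # hypothetical traffic, not a published figure

daily_t = PER_RESPONSE_G * requests_per_day / 1e6   # grams -> tonnes
yearly_t = daily_t * 365

print(f"Inference: {daily_t:.0f} t CO2 eq/day, about {yearly_t:,.0f} t/year")
print(f"Training (one-off): {TRAINING_FOOTPRINT_T:,} t CO2 eq")
print(f"Days of inference needed to match training: {TRAINING_FOOTPRINT_T / daily_t:.0f}")
```

Under these assumed volumes, cumulative inference emissions would overtake the one-off training footprint within months, which is why the deployment phase is now a research focus in its own right.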
Read more:
Mistral: Our contribution to the creation of a global environmental standard for AI at Mistral AI
Google: Measuring the environmental impact of AI inference | Google Cloud Blog
OpenAI open-source models analysed by S. Luccioni: The GPT-OSS models are here… and they’re energy-efficient!