GenAI’s Environmental Impact: Current State and Strategies for Mitigation
Vicky Saumell is a teacher, teacher educator, materials writer and presenter. She has worked for major publishers, especially in the areas of project-based learning and the meaningful use of technology for language learning. She has been the IATEFL Learning Technologies SIG Coordinator. She has had an extensive career teaching at primary and secondary schools in Buenos Aires, Argentina, and continues to work as a consultant for publishers and educational institutions.
Background
A year ago, I was honoured to open the 2024 IATEFL conference in Brighton with a plenary about Artificial Intelligence. Later that year, I was invited by Christopher Graham of GreenActionELT to deliver a webinar about AI's connection to climate change, where I could go deeper into this aspect of AI, which is not talked about enough. This article explores the issues raised in both events and expands on them with more current information.
The underlying questions that guide this article are:
- How bad is the environmental damage that AI is causing?
- What work is being done to counteract it?
- What can we do about it as individuals?
The inner workings of AI
Before we delve into the environmental impact of AI, I want to give a very brief introduction to what AI is and how it works. Artificial intelligence (AI) is an umbrella term that encompasses machine learning (ML), deep learning (DL) within machine learning, and generative AI (GenAI) within deep learning. Artificial intelligence has been around since the 1960s; machine learning and deep learning, in their modern conception, started in the late 1990s and 2000s respectively. But generative AI is quite new, reaching the mainstream around 2022.
Machine learning is a data-driven approach that involves training algorithms on large labelled data sets, so that they learn and adapt. Some examples of machine learning are product recommendations on social media or streaming services (what you should watch or read next), spam email filtering and algorithmic trading.
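To make "learning from labelled data" concrete, here is a toy sketch: a minimal "spam filter" that learns word frequencies from a handful of labelled messages and classifies new ones by those counts. All the data is invented for illustration; real ML systems use far more sophisticated models.

```python
from collections import Counter

# Invented labelled training data: (message, label) pairs.
training_data = [
    ("win a free prize now", "spam"),
    ("free money click now", "spam"),
    ("meeting agenda for monday", "ham"),
    ("lunch with the team tomorrow", "ham"),
]

# "Training": count how often each word appears under each label.
counts = {"spam": Counter(), "ham": Counter()}
for text, label in training_data:
    counts[label].update(text.split())

def classify(text):
    # Score a new message by how often its words were seen in each class.
    scores = {label: sum(c[w] for w in text.split())
              for label, c in counts.items()}
    return max(scores, key=scores.get)

print(classify("claim your free prize"))  # → spam
```

The point is only that the behaviour is learned from the labelled examples rather than hand-coded: change the training data and the classifier adapts.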
Deep learning can process unstructured data, so we went from labelled data sets to unstructured ones: predefined formats or labels are no longer needed. In health care, for example, deep learning is used to develop predictive models for disease diagnosis, prognosis and treatment; other examples are facial recognition, speech recognition, self-driving cars, virtual assistants and automatic machine translation.
Then we get to GenAI, defined as a set of deep learning techniques used to generate new, previously unseen artifacts such as images, videos or texts. How does GenAI do that? It analyses the data, uncovers the underlying grammar within the data sets by recognizing patterns, and works out the frequencies within those patterns. It then mimics the language to create something new, though derived from the data sets. When we talk about generative AI chatbots, like OpenAI's ChatGPT, Microsoft's Copilot or Google's Gemini, we are talking about tools based on large language models (LLMs), which are capable of working with and generating language. These LLMs have been fed huge amounts of data, analysed it, and come up with statistical correlations that drive their interactions. Basically, GenAI does not really know or understand things: it is doing advanced mathematical calculations, and when it does not know, it hallucinates; it invents; it fills in the gaps.
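The "pattern frequencies" idea can be sketched with a toy bigram model: count which word follows which in a tiny invented corpus, then generate new text by sampling from those frequencies. Real LLMs do this at a vastly larger scale with neural networks, but the statistical intuition is similar.

```python
import random
from collections import defaultdict, Counter

# A tiny invented corpus, purely for illustration.
corpus = "the cat sat on the mat the cat ate the fish".split()

# "Training": count word-to-next-word frequencies.
follows = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    follows[w1][w2] += 1

def generate(start, length=5, seed=0):
    random.seed(seed)  # fixed seed so the sketch is reproducible
    words = [start]
    for _ in range(length - 1):
        nxt = follows.get(words[-1])
        if not nxt:
            break  # no observed continuation for this word
        # Sample the next word proportionally to its observed frequency.
        choices, weights = zip(*nxt.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the"))
```

Everything the model "says" is recombined from the corpus frequencies, which is also why it can only echo and remix what it was trained on.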
It's important to know that a GenAI query is not a web search: the model generates answers from patterns in its training data. That's why chatbots may return outdated information. LLMs are very expensive to retrain, so a newer development is Retrieval Augmented Generation (RAG): the process of optimizing the output of an LLM so that it references an authoritative knowledge base outside its training data sources before generating a response. RAG extends the existing capabilities of LLMs to specific domains or an organization's internal knowledge base, all without retraining the model. It is a cost-effective approach to keeping LLM output relevant, accurate and useful in various contexts.
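A stripped-down sketch of the RAG idea, assuming a simple word-overlap retriever and an invented two-document knowledge base (production systems use vector embeddings and a real LLM call, neither of which is shown here):

```python
# Invented "authoritative knowledge base" outside the model's training data.
knowledge_base = [
    "The library opens at 9 am on weekdays.",
    "Membership renewals are processed online.",
]

def retrieve(query):
    # Pick the document sharing the most words with the query.
    q = set(query.lower().split())
    return max(knowledge_base,
               key=lambda doc: len(q & set(doc.lower().split())))

def build_prompt(query):
    # Prepend the retrieved context so the model answers from it,
    # rather than from its (possibly outdated) training data.
    context = retrieve(query)
    return f"Answer using this context: {context}\nQuestion: {query}"

print(build_prompt("When does the library open?"))
```

The key design point is that only the knowledge base needs updating when facts change; the underlying model is never retrained.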
The environmental impact
The environmental impact of AI, including energy consumption, water consumption, carbon dioxide emissions and e-waste, is an extensive issue.
Let's look at energy consumption first. How much power does GenAI consume, and how does this compare to other technologies, like web searching? GenAI's energy requirements are more significant than those of most other technologies. The main focus of GenAI providers is to increase the performance and accuracy of LLMs. To do that, they need to increase the number of parameters, that is, the size of the model itself, which in turn requires more data, more time and more energy. GenAI models consume up to 10 times more energy than traditional search queries. Central Processing Units (CPUs), used in regular computing, consume 150 to 200 watts per chip; the Graphics Processing Units (GPUs) used for AI consume around 1,200 watts. Generating images and videos is even more energy-intensive than generating text.
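A quick back-of-the-envelope calculation with the wattage figures above (the eight-GPU server is an assumed, illustrative configuration, not a cited fact):

```python
# Chip-level figures cited in the text.
cpu_watts = 200    # upper figure for a conventional CPU chip
gpu_watts = 1200   # figure cited for an AI-grade GPU

print(gpu_watts / cpu_watts)   # → 6.0: one GPU draws as much as six CPUs

# Assumed illustrative AI server holding eight such GPUs.
gpus_per_server = 8
print(gpus_per_server * gpu_watts / 1000)  # → 9.6 kW for a single AI server
```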
What about CO2 emissions? We've been hearing about the carbon footprint of the aviation industry and how we need to fly less to reduce it. This is even worse: digital technologies account for 2.5% to 3.7% of global greenhouse gas emissions, more than the aviation industry at 2.4%. Each GenAI query emits 4.32 g of CO2, while a Google search emits 0.2 g per query. It is estimated that training a large AI model can result in 300 tons of carbon dioxide. Training GPT-3, with its 175 billion parameters, consumed 1,287 megawatt-hours of electricity and produced 502 tons of carbon dioxide, equivalent to running 112 petrol-powered cars for a year. Google estimates that running ChatGPT emits 8.4 tons of carbon dioxide per year. And if Google's billions of daily searches were GenAI chatbot queries, we would need as much power as Ireland uses in a day.
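Scaling the per-query figures above shows how quickly the gap adds up (the one-billion-queries-a-day volume is a hypothetical round number for illustration, not a measured figure):

```python
# Per-query figures cited above.
genai_g_per_query = 4.32   # grams of CO2 per GenAI chatbot query
search_g_per_query = 0.2   # grams of CO2 per Google search

print(round(genai_g_per_query / search_g_per_query, 1))  # → 21.6x per query

# Hypothetical: extra CO2 if one billion daily searches became chatbot queries.
queries_per_day = 1_000_000_000
extra_tonnes = (genai_g_per_query - search_g_per_query) * queries_per_day / 1e6
print(round(extra_tonnes))  # → 4120 extra tonnes of CO2 per day
```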
Regarding water consumption: why does water matter? Because data centres' server rooms, which house all the machines needed to run AI, must be kept cool, between 10 and 27 degrees Celsius, to prevent equipment from malfunctioning. And data centres must use clean fresh water: salty water causes corrosion and bacterial growth in the equipment. So fresh water is needed to cool the servers and also to control the humidity in those rooms. The global water consumption needed to cool all these GenAI data centres is clearly at odds with widely agreed sustainability goals. Microsoft's global water consumption spiked 34% in 2022, to more than 2,500 Olympic-sized swimming pools' worth. Training GPT-3 used as much water as producing 370 BMWs or 320 Teslas, and GPT-4, which is larger, is likely to consume much more water and electricity. AI could use between 4.2 and 6.6 billion cubic meters of water annually by 2027, equivalent to the water consumption of a country like Denmark. A very visual indicator: a 500 ml bottle of water for a set of 5 to 15 prompts. So whenever I hear, as I have over and over again, "Oh, don't worry if you don't get a precise result, you can refine your prompt and try again until you get what you need", I cringe and think: how much water are we wasting by doing that?
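The bottle figure translates into a rough per-prompt cost, and makes the price of re-prompting visible (the ten-attempt session below is hypothetical):

```python
# Figure cited above: one 500 ml bottle of water per 5-15 prompts.
bottle_ml = 500
per_prompt_low = bottle_ml / 15   # best case, ml per prompt
per_prompt_high = bottle_ml / 5   # worst case, ml per prompt
print(round(per_prompt_low, 1), per_prompt_high)  # → 33.3 100.0

# Hypothetical session: refining the same prompt 10 times instead of
# sending one well-prepared prompt, taking the worst-case figure.
extra_ml = 9 * per_prompt_high
print(extra_ml / bottle_ml)  # → 1.8 extra bottles of water
```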
The issue of e-waste is something we need to look at too, because GenAI requires more hardware than other types of computing and cycles through it faster, so its replacement cycle is more intensive. Therefore, there is more e-waste as a byproduct of GenAI. GenAI, particularly LLMs, could generate between 1.2 and 5 million metric tons of e-waste by 2030, and I think this is something we haven't been looking at closely enough.
A greener AI
Can AI be greener? I have done some research, and these are the initial answers I found. Yes:
- if we optimize algorithms
- if we use sustainable energy sources
- if we recycle and reuse its waste
- if we improve the cooling of data centres
- if we apply AI to conservation
All of this is great, but is it in my hands? Is it something I can do as an individual? Unfortunately, the answer is no: these measures are in the hands of GenAI providers, which I found disappointing.
Strategies for individuals
But what can we do as individuals? I have put together some strategies that we, as individuals, can adopt to counteract the environmental impact of GenAI. They can be reduced to one concept: AI literacy. Developing AI literacy programmes for educators and students alike is a must. The purpose is to foster active and critical thinking by developing strategies and tasks that engage students in active discussion and reflective learning about AI's impact on society, and that explore AI's decision-making processes.
We need to learn about AI in order to use it in a better way and counteract its environmental impact. If we don't know that AI uses a lot of energy and water and produces more e-waste, we won't worry about using it, so raising awareness of these issues should be the first aim of AI literacy. There are some ready-made resources for AI literacy, for example, AI Literacy Lessons for Grades 6–12.
The second strategy is web searching instead of GenAI prompting. I mentioned the carbon footprint of a Google search compared with a GenAI chatbot query, so why go to ChatGPT for every single thing? I see people prompting AI for trivial questions like "What are the 10 best things to do in Paris?" Why would you need AI for that when you can just Google it? Lately, search engines have been automatically adding an AI overview and, although these are enabled by default, which is really annoying, there are some tricks to avoid them. One is adding "-ai" to your web query.
We also need to choose the right AI tool for our purpose. Some tools consume more water and energy than others, so we need information in order to choose the ones that do the least damage. In general, the bigger the model and the more tasks it can carry out, the more damage it does. Smaller models that perform specific tasks are kinder to the environment.
Finally, we need to learn about prompting. Learning how to prompt is important, and even more so learning how to refine prompts before prompting. We need to know what a good prompt is and what information and structure to include so that we don't need to re-prompt in the tool. We should refine the prompt offline and go to the tool only when we think we have the best possible prompt. We should also build effective prompt banks. I have seen some useful resources:
- some PDFs of professionals suggesting prompts
I haven't found what I would love to see: a prompt bank specific to ELT. So I have taken it upon myself to create one, starting with a Facebook group, ELT AI Prompt Bank, where we can share effective prompts. I will then transfer these to a more professional web page, so that people don't prompt and re-prompt over and over again in the tool. It's currently a work in progress, but I think it could be really useful.
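As a sketch of what "refining before prompting" can look like in practice, here is a hypothetical prompt template: all the fields are filled in offline, so one well-formed query replaces several retries. The field names and example values are my own invention, not from any existing prompt bank.

```python
# Hypothetical template for assembling a complete prompt before sending it.
TEMPLATE = (
    "Role: {role}\n"
    "Task: {task}\n"
    "Audience: {audience}\n"
    "Format: {format}\n"
    "Constraints: {constraints}"
)

prompt = TEMPLATE.format(
    role="experienced ELT materials writer",
    task="draft five B1-level comprehension questions on a short text about recycling",
    audience="teenage learners",
    format="numbered list",
    constraints="use only vocabulary from the CEFR B1 list",
)
print(prompt)
```

Filling in every field first forces us to decide what we actually want, which is exactly the habit that cuts down on wasteful re-prompting.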
Conclusion
In conclusion, while AI holds remarkable potential for numerous applications, its environmental consequences cannot be overlooked. By fostering awareness of these challenges and promoting responsible use, starting with individuals, educators can take meaningful steps toward mitigating the environmental impact of AI.
References
AI for Education (2023) AI’s Impact on the Environment (website) https://www.aiforeducation.io/ai-resources/ais-impact-on-the-environment
Brundtland, G.H. (1987). Our Common Future: Report of the World Commission on Environment and Development. Geneva, UN-Document A/42/427. https://sustainabledevelopment.un.org/content/documents/5987our-common-future.pdf
Calvert, B. (2024, March 28). AI already uses as much energy as a small country. It’s only the beginning. Vox. https://www.vox.com/climate/2024/3/28/24111721/climate-ai-tech-energy-demand-rising
Dhanani, R. (2024). Environmental impact of generative AI – 20 stats & facts. Akepa. https://thesustainableagency.com/blog/environmental-impact-of-generative-ai/
Edmett, A., Ichaporia, N., Crompton, H., & Crichton, R. (2023). Artificial intelligence and English language teaching: Preparing for the future. British Council. https://doi.org/10.57884/78EA-3C69
Kemene, E., Valkhof, B., & Tladi, T. (2024, July 22). AI and energy: Will AI help reduce emissions or increase demand? Here's what to know. World Economic Forum. https://www.weforum.org/stories/2024/07/generative-ai-energy-emissions/
Kerr, D. (2024, July 12). AI brings soaring emissions for Google and Microsoft, a major contributor to climate change. NPR. https://www.npr.org/2024/07/12/g-s1-9545/ai-brings-soaring-emissions-for-google-and-microsoft-a-major-contributor-to-climate-change
Lavi, H. (2023, May 24). Measuring greenhouse gas emissions in data centres: the environmental impact of cloud computing. Climatiq. https://www.climatiq.io/blog/measure-greenhouse-gas-emissions-carbon-data-centres-cloud-computing
Ligozat, A., & De Vries, A. (2024, November 13). Generative AI: energy consumption soars. Polytechnique Insights. https://www.polytechnique-insights.com/en/columns/energy/generative-ai-energy-consumption-soars/
McLean, S. (2023, April 28). The Environmental Impact of ChatGPT: A Call for Sustainable Practices In AI Development. Earth.org. https://earth.org/environmental-impact-chatgpt/
O'Brien & Fingerhut (2023). Artificial intelligence technology behind ChatGPT was built in Iowa – with a lot of water. AP News. https://apnews.com/article/chatgpt-gpt4-iowa-ai-water-consumption-microsoft-f551fde98083d17a7e8d904f8be822c4
Plan BE ECO (2024) AI’s carbon footprint – how does the popularity of artificial intelligence affect the climate? https://planbe.eco/en/blog/ais-carbon-footprint-how-does-the-popularity-of-artificial-intelligence-affect-the-climate/
Pratt, M. K. (2023) Generative AI's sustainability problems explained. Tech Target. https://www.techtarget.com/sustainability/feature/Generative-AIs-sustainability-problems-explained
Ramachandran, K., Stewart, D., Hardin, K., & Crossan, G. (2024). As generative AI asks for more power, data centers seek more reliable, cleaner energy solutions. Deloitte Center for Technology Media & Telecommunications. https://www2.deloitte.com/us/en/insights/industry/technology/technology-media-and-telecom-predictions/2025/genai-power-consumption-creates-need-for-more-sustainable-data-centers.html
Ren, S. (2023, November 30). How much water does AI consume? The public deserves to know. OECD.AI Policy Observatory. https://oecd.ai/en/wonk/how-much-water-does-ai-consume
Ren, S. (2024, October 28). Generative AI expansion could create up to five million tonnes of e-waste. Science Media Centre Spain. https://sciencemediacentre.es/en/generative-ai-expansion-could-create-five-million-tonnes-e-waste
Saenko, K. (2023) Is generative AI bad for the environment? A computer scientist explains the carbon footprint of ChatGPT and its cousins. The Conversation website. https://theconversation.com/is-generative-ai-bad-for-the-environment-a-computer-scientist-explains-the-carbon-footprint-of-chatgpt-and-its-cousins-204096
Saenko, K., & The Conversation USA. (2023, May 25). Computer Scientist Breaks Down Generative AI's Hefty Carbon Footprint. Scientific American. https://www.scientificamerican.com/article/a-computer-scientist-breaks-down-generative-ais-hefty-carbon-footprint/
Saumell, V. (2024) Climate Action and AI webinar. Green Action ELT. https://youtu.be/XJNDfvHmG2g?feature=shared
Saumell, V. (2024) The AI factor. Have we figured it out yet? IATEFL 2024 Opening Plenary. https://www.youtube.com/watch?v=2IkCub2jFXs&t=187s
Stodd, J., Schatz, S., & Stead, G. (2023). Engines of Engagement: A Curious Book About Generative AI. Bournemouth, UK: Sea Salt Publishing. https://seasaltlearning.com/engines-of-engagement-generative-ai-book/