Is AI good for the environment?
Day 45 / 366
As we move towards wider adoption of AI and LLMs in our lives, we also need to be mindful of their impact on the environment. We are constantly fed information about climate change and how we should change our habits to preserve the planet for future generations; we are asked to use paper straws and avoid plastic bags. So it only makes sense to put AI under the same scrutiny.
When blockchain and cryptocurrency were the next big thing, one of the biggest criticisms was how much energy they used: mining required graphics cards, which consume a lot of electricity. Coincidentally, LLMs need graphics cards as well, both for the initial training of the model and later for inference.
The training of LLMs is done in data centers, and data centers globally account for roughly 1% of total electricity consumption. While we don’t know exactly how much energy went into training the ChatGPT models, we can look at figures from other AI companies to get a rough idea.
In his 2023 paper on the growing energy footprint of AI, Alex de Vries notes that in 2021, Google’s total electricity consumption was 18.3 TWh (terawatt-hours, a unit of energy), with AI accounting for 10–15% of the total: “The worst-case scenario suggests Google’s AI alone could consume as much electricity as a country such as Ireland (29.3 TWh per year).”
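To put those numbers in perspective, here is a quick back-of-envelope calculation in Python (my own sketch, not from the paper). Note that 10–15% of 18.3 TWh works out to only about 1.8–2.7 TWh; the Ireland comparison is de Vries’ worst-case projection for where AI could go, not what Google’s AI actually used in 2021.

# Back-of-envelope check of the figures above (illustrative sketch, not from the paper)
google_total_twh_2021 = 18.3              # Google's total electricity use in 2021, in TWh
ai_share_low, ai_share_high = 0.10, 0.15  # AI's estimated share of that total
ireland_twh_per_year = 29.3               # de Vries' worst-case comparison point

ai_low = google_total_twh_2021 * ai_share_low    # about 1.8 TWh
ai_high = google_total_twh_2021 * ai_share_high  # about 2.7 TWh
print(f"Google's AI in 2021: roughly {ai_low:.1f}-{ai_high:.1f} TWh")
print(f"Worst-case future scenario: {ireland_twh_per_year} TWh (comparable to Ireland)")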
This energy usage translates into CO2 emissions, which, as we all know, is bad news for the climate.
But that’s not all: running these AI models requires a lot of water as well. All that computing generates a lot of heat, which is bad for the hardware, so water is used to cool it down.
Some researchers estimate that asking ChatGPT just 5–10 questions in a session can consume around 0.5 liters of water!
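If that estimate is roughly right, the per-question water cost is easy to work out (a simple illustration using the numbers above; the real figure varies by data center and season):

# Rough per-question water cost, assuming 0.5 liters per 5-10 questions (illustrative only)
water_per_session_l = 0.5
questions_low, questions_high = 5, 10

per_question_max_ml = water_per_session_l / questions_low * 1000   # 100 ml per question
per_question_min_ml = water_per_session_l / questions_high * 1000  # 50 ml per question
print(f"Roughly {per_question_min_ml:.0f}-{per_question_max_ml:.0f} ml of water per question")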
This is a serious issue, and Sam Altman, the CEO of OpenAI, has acknowledged it. Further progress in AI will depend on the innovations we can make in clean energy generation.