As generative artificial intelligence (AI) models have gained acceptance and popularity, one aspect of the trend has been largely overlooked: its environmental cost.
Much has been studied and written about using AI to combat climate change. In fact, a survey by the Boston Consulting Group last year found that 87 percent of the 1,055 global leaders it polled, all with decision-making power in AI and climate, believe that AI is an essential tool in the fight against climate change.
However, a growing number of studies are now focusing on the environmental impact of AI models themselves.
Generative AI models are trained on graphics processing units (GPUs), which allow many pieces of data to be processed in parallel. The GPUs used to train GPT-3, the predecessor of ChatGPT, produced 502 tonnes of carbon dioxide equivalent (CO2e) emissions, according to a report by Stanford University. That is around 91 times the average person's annual emissions, and about 500 times the emissions of a round-trip flight (nearly 6,000 km) between New York City on the US East Coast and San Francisco on the West Coast.
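A quick back-of-the-envelope check makes those ratios concrete. The per-person and per-flight baselines below are inferred from the article's own ratios, not taken from the Stanford report, so treat them as illustrative:

```python
# Sanity-check the reported ratios against the 502-tonne figure.
# The derived baselines are implied by the ratios, not sourced values.
GPT3_TRAINING_EMISSIONS_T = 502  # tonnes CO2e, per the Stanford report

person_annual_t = GPT3_TRAINING_EMISSIONS_T / 91       # implied ~5.5 t CO2e per person per year
flight_round_trip_t = GPT3_TRAINING_EMISSIONS_T / 500  # implied ~1.0 t CO2e per NYC-SF round trip

print(f"Implied per-person annual emissions: {person_annual_t:.1f} t CO2e")
print(f"Implied NYC-SF round-trip emissions: {flight_round_trip_t:.2f} t CO2e")
```

Both implied baselines are plausible orders of magnitude, which suggests the comparisons are internally consistent.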
Water cost
Since large language models (LLMs) need significant processing power, the data centres that house them generate a huge amount of heat, necessitating water-intensive cooling systems.
According to a study published this year, training GPT-3 would have consumed as much as 3.5 million litres of water, assuming it was trained in US data centres. Had the model been trained in Microsoft’s Asian data centres, the water usage would have been 4.9 million litres, enough to produce 600 BMW cars or 2,200 Tesla electric vehicles.
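The vehicle comparison implies a per-car water footprint, which is easy to derive. The figures below are implied by the article's own comparison, not quoted from the study:

```python
# Per-vehicle water footprints implied by the 4.9-million-litre comparison.
water_asia_litres = 4_900_000  # training GPT-3 in Microsoft's Asian data centres

litres_per_bmw = water_asia_litres / 600      # ~8,167 litres per BMW car
litres_per_tesla = water_asia_litres / 2_200  # ~2,227 litres per Tesla EV

print(f"Implied water to produce one BMW:   {litres_per_bmw:,.0f} litres")
print(f"Implied water to produce one Tesla: {litres_per_tesla:,.0f} litres")
```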
GPT-3 has 175 billion parameters. GPT-4, which powers ChatGPT Plus, is believed to be significantly larger, with some media reports putting the figure at 1.76 trillion parameters. As new iterations grow in parameter count, AI models could require more energy and more water-intensive cooling, increasing their environmental cost.
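To see why parameter counts translate into energy, a rough sketch using the widely cited ~6·N·D rule of thumb for transformer training compute helps (N is parameters, D is training tokens). The GPT-4 figure is a media estimate and the token counts are illustrative assumptions, since OpenAI has not disclosed them:

```python
# Rough training-compute comparison with the common ~6*N*D FLOPs heuristic.
# GPT-4's size is a rumoured figure; its token count here is an assumption.
def training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute in floating-point operations."""
    return 6 * params * tokens

gpt3 = training_flops(175e9, 300e9)    # GPT-3: 175B params, ~300B tokens (reported)
gpt4 = training_flops(1.76e12, 300e9)  # GPT-4: rumoured 1.76T params, same tokens (assumed)

print(f"GPT-3 training compute:           ~{gpt3:.1e} FLOPs")
print(f"GPT-4 (assumed) training compute: ~{gpt4:.1e} FLOPs "
      f"({gpt4 / gpt3:.0f}x GPT-3 at equal token count)")
```

Even holding the training data fixed, a tenfold jump in parameters implies roughly a tenfold jump in compute, and with it, energy and cooling demand.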
Making AI models more energy efficient
AI models themselves are now being enlisted to make AI systems more energy efficient. DeepMind, a subsidiary of Google, is experimenting with BCOOLER, an AI agent designed to optimise energy use in data centres.
A live experiment with the model in two real-world facilities produced energy savings of 9 to 13 percent.
However, a study published in July found that most of the methods currently being discussed to reduce the environmental cost of AI models end up worsening their overall performance.
If things don’t improve soon, AI models could consume more energy than the entire human workforce by 2025, according to estimates by Gartner, a US-based technological research and consulting firm. This could significantly set back global carbon-reduction goals.
Some developers are also looking at building smaller AI models rather than increasing model size with each iteration. Meta’s LLaMA, a 65-billion-parameter LLM, is considered more potent than ChatGPT, a much larger model.
“Training smaller foundation models like LLaMA is desirable in the large language model space because it requires far less computing power and resources,” the social media giant said in its statement announcing the release of the model.