Generative AI Is Exhausting the Power Grid

Generative AI has been adopted very quickly across various sectors. However, this adoption has driven a surge in global electricity consumption that is predicted to grow further as the technology expands, putting many tech companies at risk of defaulting on their net-zero commitments.

OpenAI’s launch of ChatGPT in late 2022 introduced the world to generative artificial intelligence – commonly referred to as “genAI” – a technology that generates text and answers complex questions in an almost human-like manner and at incredible speed. It took the world by storm, reaching 100 million active users in its first two months and sparking a race among companies to embed it across their operations and products.

Beyond ChatGPT, genAI has already begun disrupting large industries, from biopharma, where the technology can generate millions of candidate molecules for certain diseases, to marketing, where it can personalise content and customer experiences. However, there is a dark side to all this.

Besides requiring huge quantities of fresh water to keep data centres cool, artificial intelligence, when powered by non-renewable energy sources, also releases significant amounts of carbon emissions. Each individual use of genAI to answer a question or produce an image comes at a cost to the planet, and with the technology spreading at an unprecedented pace around the world, its environmental footprint is only destined to increase. To put things into perspective, a single ChatGPT query requires 2.9 watt-hours of electricity, compared with 0.3 watt-hours for a Google search, according to the International Energy Agency’s (IEA) Electricity 2024 report, a forecast of global energy use over the next two years released earlier this year. For the first time, the report included projections for energy consumption by data centres, cryptocurrency, and AI, citing the rapid incorporation of AI across a variety of sectors among the market trends driving electricity demand.
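A back-of-the-envelope sketch shows what those per-query figures mean at scale. The two energy figures come from the IEA report quoted above; the daily query volume is a hypothetical number chosen purely for illustration.

```python
# Per-query energy figures from the IEA Electricity 2024 report (watt-hours).
CHATGPT_WH_PER_QUERY = 2.9
GOOGLE_WH_PER_QUERY = 0.3

# A ChatGPT query uses roughly 10x the energy of a Google search.
ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
print(f"ChatGPT query ≈ {ratio:.1f}x a Google search")  # → 9.7x

# Hypothetical scenario (illustrative only): 100 million queries per day.
queries_per_day = 100_000_000
kwh_per_day = queries_per_day * CHATGPT_WH_PER_QUERY / 1000
print(f"≈ {kwh_per_day:,.0f} kWh per day at that volume")  # → 290,000 kWh
```

Even at modest per-query costs, the totals add up quickly once usage reaches hundreds of millions of queries a day.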

Large Language Models (LLMs), which sit at the heart of many genAI systems, are trained on vast stores of information, allowing them to generate a response to virtually any query from scratch. A December 2023 study, which is yet to be peer-reviewed, found that using large generative models to create outputs is far more energy-intensive than using smaller AI models tailored to specific tasks. The reason is that generative AI models tend to do many things at once, such as generating, classifying, and summarising text; this results in the whole model getting activated in response to a query, which is “wildly inefficient from a computational perspective.”

GenAI runs an immense number of calculations to perform tasks very quickly, usually on specialised Graphics Processing Units (GPUs). Compared to other chips, GPUs are more energy-efficient for AI, and most efficient when running in large cloud data centres – specialised buildings containing computers equipped with those chips. The genAI revolution has led to the rapid expansion of these centres around the world, resulting in a significant rise in power consumption. The IEA’s report projects that data centres’ electricity consumption in 2026 will double 2022 levels, reaching 1,000 terawatt-hours – roughly Japan’s total annual consumption.
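The projection above implies a steep underlying growth rate. A minimal sketch, assuming consumption doubles between 2022 and 2026 as the IEA report projects (so roughly 500 TWh in 2022):

```python
# IEA Electricity 2024 projection: data-centre consumption in 2026
# reaches ~1,000 TWh, double its 2022 level.
twh_2026 = 1000.0
twh_2022 = twh_2026 / 2  # "double 2022 levels" implies ~500 TWh in 2022
years = 4

# Compound annual growth rate implied by a doubling over four years.
cagr = (twh_2026 / twh_2022) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # → 18.9%
```

Sustaining nearly 19% annual growth in electricity demand is what makes the data-centre buildout so difficult to reconcile with grid capacity and climate commitments.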

Consequently, organisations have reported a rise in their emissions that runs against their commitments to reduce their environmental impact. According to a study by Google and UC Berkeley, training OpenAI’s GPT-3 generated 552 metric tonnes of carbon dioxide – the equivalent of driving 112 petrol cars for a year. Last year, Google’s total data centre electricity consumption grew by 17%. While the tech giant did not reveal how much of this was directly linked to genAI, it admitted that it expects the trend to continue. Similarly, Microsoft announced in May that its emissions were up almost 30% from 2020 as a result of building new data centres.
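The cars-per-year comparison implies an average per-vehicle emissions figure, which this sketch simply recovers from the two numbers quoted above (both from the cited study):

```python
# Figures quoted from the Google/UC Berkeley study on GPT-3 training.
total_tonnes_co2 = 552   # metric tonnes of CO2 from training
equivalent_cars = 112    # petrol cars driven for one year

# Implied average annual emissions per petrol car.
tonnes_per_car_year = total_tonnes_co2 / equivalent_cars
print(f"≈ {tonnes_per_car_year:.1f} tonnes CO2 per car per year")  # → 4.9
```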

More on the topic: Google Emissions Grow 48% in Five Years Owing to Large-Scale AI Deployment, Jeopardizing Company’s Net Zero Plans

As mentioned earlier, the technology’s water usage also deserves attention. To cool delicate electronics, data centres need water free of impurities, putting them in competition for the same freshwater people use to drink, cook, and wash. In 2022, Google’s data centres consumed around 5 billion gallons of freshwater for cooling, 20% more than in 2021. Over the same period, Microsoft’s water consumption rose by 34%.

It is difficult to obtain accurate estimates of genAI’s impact, partly because machine learning models are highly variable and can be configured in ways that dramatically affect their power consumption, and partly because organisations like Meta, Microsoft, and OpenAI do not openly share the relevant information. Data on AI’s energy use and environmental impact is not systematically collected, and greater transparency and tracking are needed – especially as models grow and genAI becomes more embedded in society.

As genAI becomes more mainstream, its environmental costs will grow. With the world heating up, companies racing to meet the rising demand for generative AI must commit to greater transparency about their operations and begin shifting to clean energy. As the IEA report emphasises, governments must introduce regulations to restrain the energy consumed by data centres – including mandatory reporting obligations and energy efficiency standards – while companies must improve efficiency and reduce the amount of energy their data centres require.