Science & Technology

Global electricity consumption by AI could increase by 85-134 TWh annually by 2027

This amount is comparable to the annual electricity consumption of countries such as the Netherlands, Sweden and Argentina

By DTE Staff
Published: Friday 13 October 2023

With ChatGPT gaining popularity, global electricity consumption by artificial intelligence (AI) could increase by 85-134 TWh annually by 2027, according to a report published in the journal Joule.

This amount is comparable to the annual electricity consumption of countries such as the Netherlands, Sweden and Argentina. According to Alex de Vries, a doctoral candidate at Vrije Universiteit Amsterdam, “Looking at the growing demand for AI services, it is very likely that energy consumption related to AI will significantly increase in the coming years”.

Although data centres’ electricity consumption between 2010 and 2018 may have increased by only 6 per cent, the accelerated development of AI raises concerns about the electricity consumption and potential environmental impact of AI and data centres. In the recent past, generative AI tools that create new content such as text, images or videos, including ChatGPT and DALL-E, have grown popular. If generative AI were used in every Google search, daily electricity consumption would amount to 80 GWh.

In 2021, Google’s annual electricity use was 18.3 TWh, with 10-15 per cent coming from AI. In the worst-case scenario, according to de Vries, Google AI’s electricity usage could be comparable to Ireland's 29.3 TWh per year. This scenario, however, assumed full-scale AI adoption. To date, studies have mainly focused on the training phase of AI models, which has a large carbon footprint.
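A quick back-of-the-envelope check, using only the figures reported above, shows how the 80 GWh-per-day estimate for AI-powered search lines up with the Ireland comparison (a rough sketch, not part of the study's methodology):

```python
# Scale the reported daily figure to a year and compare with Ireland's
# annual electricity consumption, both taken from the article.

daily_gwh = 80                          # est. daily use if every Google search ran generative AI
annual_twh = daily_gwh * 365 / 1000     # GWh/day -> TWh/year

ireland_twh = 29.3                      # Ireland's annual electricity consumption (TWh)

print(f"{annual_twh:.1f} TWh/year vs Ireland's {ireland_twh} TWh")
```

The annualised figure comes to roughly 29.2 TWh, which is essentially the Irish total cited in the worst-case scenario.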

For training, large language models (LLMs) including GPT-3, Gopher and Open Pre-trained Transformer (OPT) reportedly consumed 1,287, 1,066 and 324 MWh of electricity, respectively. Once trained, the models are deployed to generate output in response to new data, kicking off the inference phase.

The author expressed concern that the inference phase might contribute significantly to an AI model’s life-cycle costs: ChatGPT’s energy demand was estimated at 564 MWh per day, compared to the estimated 1,287 MWh used in the training phase. De Vries suggested that innovations in model architectures and algorithms could help mitigate, or even reduce, AI-related electricity consumption in the long term.
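The two figures above make the author's concern concrete: at the reported daily rate, inference overtakes the one-off training cost almost immediately. A minimal sketch using the article's numbers:

```python
# Compare ChatGPT's daily inference demand against GPT-3's one-off training
# energy, both figures taken from the article.

training_mwh = 1287            # estimated electricity used to train GPT-3 (MWh)
inference_mwh_per_day = 564    # estimated daily electricity demand of ChatGPT (MWh)

days_to_match = training_mwh / inference_mwh_per_day
print(f"Inference matches total training energy after {days_to_match:.1f} days")
```

By this arithmetic, ChatGPT's cumulative inference demand would exceed GPT-3's entire training energy in under three days of operation.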
