AI processing could consume ‘as much electricity as Ireland’

The recent spike of interest in AI, driven by large language models (LLMs) and generative AI, is pushing adoption of the tech across a wide variety of applications, leading to worries that the processing required will cause a surge in datacenter electricity consumption.

These concerns are raised in a paper by Alex de Vries, a researcher at the Vrije Universiteit Amsterdam.

In the paper, De Vries notes that research into the sustainability of AI has focused on the training phase of AI models, because this is generally considered to be the most resource-intensive, and therefore the most energy-consuming, part of the process.

However, relatively little attention has been paid to the inference phase, he argues, even though there are indications that inference – operating the trained model – may contribute significantly to an AI model’s life-cycle costs.

To back this up, the paper claims that to support ChatGPT, OpenAI required 3,617 servers based on the Nvidia HGX A100 platform fitted with a total of 28,936 GPUs, implying an energy demand of 564 MWh per day. This compares with the estimated 1,287 MWh used for the GPT-3 model’s training phase.
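
That daily figure can be reproduced with simple arithmetic. The sketch below is a minimal back-of-envelope check, assuming a draw of roughly 6.5 kW per HGX A100 server running around the clock; the per-server power figure is an assumption for illustration, not a number quoted in this article.

    # Back-of-envelope check of the ChatGPT inference estimate.
    # Assumption (not stated in this article): each HGX A100 server
    # draws roughly 6.5 kW and runs 24 hours a day.
    servers = 3_617
    kw_per_server = 6.5        # assumed power draw per server, in kW
    hours_per_day = 24

    mwh_per_day = servers * kw_per_server * hours_per_day / 1_000
    print(f"Estimated demand: {mwh_per_day:,.0f} MWh per day")  # ~564 MWh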

Internet giant Google is introducing AI-powered search capabilities into its search engine, following Microsoft’s move to add chatbot-powered AI search features to Bing earlier this year. The paper cites a remark from Alphabet’s chairman that this would “likely cost 10 times more than a standard keyword search”, suggesting an electricity consumption of approximately 3 Wh per search.

If every Google search became an LLM interaction, the electricity needed to power this could match that of a country such as Ireland, at 29.3 TWh per year, the paper claims. For context, Google’s total electricity consumption for 2021 was 18.3 TWh, of which AI was said to account for 10 to 15 percent at the time.

But the paper concedes that this is a worst-case scenario, as it assumes full-scale AI adoption with current hardware and software, which is “unlikely to happen rapidly,” not least because Nvidia does not have the production capacity to deliver the estimated 512,821 A100 HGX servers this would require, and because buying them would cost Google around $100 billion.
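
For anyone who wants to check the worst-case sum, the sketch below reruns it under the same assumed 6.5 kW per server; that power figure and the roughly $195,000 unit price used to sanity-check the hardware bill are illustrative assumptions rather than values quoted in this article.

    # Worst-case scenario: every Google search served by an LLM.
    # Per-server power and unit price are illustrative assumptions.
    servers = 512_821
    kw_per_server = 6.5            # assumed power draw per server, in kW
    hours_per_year = 24 * 365

    twh_per_year = servers * kw_per_server * hours_per_year / 1e9
    print(f"Annual demand: {twh_per_year:.1f} TWh")  # ~29 TWh, in line with the 29.3 TWh quoted

    price_per_server = 195_000     # assumed cost per HGX A100 server, in USD
    print(f"Hardware bill: ${servers * price_per_server / 1e9:.0f} billion")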

For a more realistic projection, the paper looks at the number of Nvidia-based AI servers likely to be acquired, as the company currently has an estimated 95 percent share of the AI server market.

Quoting analyst estimates that Nvidia will ship 100,000 of its AI server platforms in 2023, the paper calculates that these servers would have a combined power demand of 650 to 1,020 MW, consuming 5.7 to 8.9 TWh of electricity annually. Compared with an estimated historical annual datacenter electricity consumption of 205 TWh, “this is almost negligible,” De Vries states.

Don’t forget the Jevons paradox

But before anyone heaves a sigh of relief, Nvidia could be shipping 1.5 million units of its AI server platforms annually by 2027, consuming 85.4 to 134 TWh of electricity per year. At that point, these servers could represent a significant contribution to global datacenter electricity consumption, the paper states. This assumes, however, that the Nvidia products in question will have the same power consumption as today’s kit.
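
Both the 2023 and 2027 projections follow from the same per-unit arithmetic. As a rough sketch, the ranges above are consistent with each server platform drawing somewhere between 6.5 kW and 10.2 kW and running continuously; those per-unit figures are assumptions chosen to match the reported ranges, not numbers given in this article.

    # Projected electricity use of Nvidia AI server shipments.
    # Assumption: each platform draws 6.5-10.2 kW and runs continuously.
    hours_per_year = 24 * 365
    kw_low, kw_high = 6.5, 10.2    # assumed per-unit power draw, in kW

    for year, units in [(2023, 100_000), (2027, 1_500_000)]:
        mw_range = (units * kw_low / 1_000, units * kw_high / 1_000)
        twh_range = (units * kw_low * hours_per_year / 1e9,
                     units * kw_high * hours_per_year / 1e9)
        print(f"{year}: {mw_range[0]:,.0f}-{mw_range[1]:,.0f} MW, "
              f"{twh_range[0]:.1f}-{twh_range[1]:.1f} TWh per year")
    # 2023: ~650-1,020 MW and 5.7-8.9 TWh; 2027: 85.4-134.0 TWh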

The paper also considers the effect of the Jevons paradox, should innovations in model architectures and algorithms reduce the amount of compute power required to run complex AI models. The Jevons paradox occurs when increases in efficiency stimulate greater demand, meaning in this case that improvements in model efficiency could allow a single consumer-level GPU to train AI models.

This would see growth in AI-related electricity consumption come not just from new high-performance GPUs such as Nvidia’s A100, but also from more generic GPUs, the paper argues, offsetting any efficiency gains in the models themselves.

As the paper concludes, the future electricity consumption of AI processing is difficult to predict. While infusing AI into applications such as Google Search can significantly boost their power consumption, various resource factors are likely to limit the growth of global AI-related electricity consumption in the near term.

However, De Vries’ research also warns that it is too optimistic to expect efficiency improvements to fully offset long-term growth in AI-related electricity consumption, and says that the wisdom of using AI in everything should be questioned, as it is “unlikely that all applications will benefit from AI or that the benefits will always outweigh the costs.”

Given the current unseemly rush to add AI to everything, expecting that question to be asked seems futile. ®
