AI adoption threatens surge in datacenter electricity consumption


The explosion of interest in artificial intelligence (AI), fueled by large language models (LLMs) and generative AI, is pushing adoption of the technology across a wide variety of applications. This has raised concerns that the processing needed to support this growth will cause a surge in datacenter electricity consumption.

In a new paper, Alex de Vries, a researcher at the Vrije Universiteit Amsterdam, argues that the inference phase of AI models, in which trained models are run to produce predictions, has received relatively little attention in sustainability research. Yet there are indications that inference may contribute significantly to an AI model’s life-cycle costs.

De Vries cites the example of ChatGPT, the LLM developed by OpenAI. Supporting ChatGPT was estimated to require 3,617 servers based on the Nvidia HGX A100 platform, fitted with a total of 28,936 GPUs, implying an energy demand of 564 MWh per day. By comparison, the training phase of the GPT-3 model is estimated to have consumed 1,287 MWh in total.
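A quick back-of-the-envelope check of these figures (a sketch in Python; the per-GPU average and the crossover point are derived here rather than stated in the paper):

```python
# Figures cited for the ChatGPT example.
gpus = 28936                   # total GPUs across 3,617 HGX A100 servers
inference_mwh_per_day = 564    # estimated daily inference energy demand
training_mwh = 1287            # estimated total energy for GPT-3 training

# Derived: average draw per GPU, including server overhead
# (a back-of-the-envelope figure, not stated in the paper).
avg_watts_per_gpu = inference_mwh_per_day * 1e6 / 24 / gpus
print(f"Average draw per GPU: {avg_watts_per_gpu:.0f} W")   # ~812 W

# Derived: days of inference at this rate to match GPT-3's training energy.
days = training_mwh / inference_mwh_per_day
print(f"Training energy matched after {days:.1f} days")     # ~2.3 days
```

On these numbers, ChatGPT’s inference workload overtakes GPT-3’s entire training energy in under three days, which underlines the paper’s point that inference is underweighted in sustainability analyses.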

De Vries also notes that Google is introducing AI-powered search capabilities into its search engine, following Microsoft’s move earlier this year to add chatbot-powered AI search features to Bing. He refers to a remark by Alphabet’s chairman that this would “likely cost 10 times more than a standard keyword search”; combined with the roughly 0.3 Wh estimated for a standard search, this suggests an electricity consumption of approximately 3 Wh per AI-assisted search.
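The arithmetic behind that figure is simple, and it scales quickly with query volume. A minimal sketch, where the daily query volume is a hypothetical round number chosen only for illustration and not a figure from the paper:

```python
# Per-query figures as cited: ~0.3 Wh for a standard Google search,
# and a tenfold cost for an AI-assisted one.
standard_search_wh = 0.3
ai_search_wh = standard_search_wh * 10    # ~3 Wh per AI-assisted search

# Hypothetical daily query volume, chosen only to show the scaling;
# the paper's own scenarios use different assumptions.
queries_per_day = 9e9

annual_twh = ai_search_wh * queries_per_day * 365 / 1e12
print(f"Per AI-assisted search: {ai_search_wh:.1f} Wh")
print(f"At {queries_per_day:,.0f} queries/day: {annual_twh:.1f} TWh/year")
```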

Quoting analyst estimates that Nvidia will ship 100,000 of its AI server platforms in 2023, De Vries calculates that these servers would have a combined power demand of 650 to 1,020 MW and consume 5.7 to 8.9 TWh of electricity annually. Compared with a historical estimate of 205 TWh for annual datacenter electricity consumption, “this is almost negligible”, De Vries states.

He also points out that Nvidia might be shipping 1.5 million units of its AI server platforms a year by 2027, which would consume 85.4 to 134 TWh of electricity annually. At that scale, the paper states, these servers could represent a significant contribution to global datacenter electricity consumption.
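Both ranges follow from a per-server power draw and year-round operation. A minimal sketch, assuming the roughly 6.5 kW (DGX A100-class) and 10.2 kW (DGX H100-class) per-server figures that the published numbers imply; the article itself does not state them:

```python
HOURS_PER_YEAR = 24 * 365   # 8,760 h, assuming servers run continuously

# Assumed per-server draw, matching Nvidia DGX A100 (~6.5 kW) and
# DGX H100 (~10.2 kW) specifications; not stated in the article.
kw_low, kw_high = 6.5, 10.2

for units in (100_000, 1_500_000):   # 2023 estimate vs. 2027 projection
    mw_low = units * kw_low / 1e3
    mw_high = units * kw_high / 1e3
    twh_low = mw_low * HOURS_PER_YEAR / 1e6
    twh_high = mw_high * HOURS_PER_YEAR / 1e6
    print(f"{units:>9,} servers: {mw_low:,.0f}-{mw_high:,.0f} MW, "
          f"{twh_low:.1f}-{twh_high:.1f} TWh/year")
```

Running this reproduces the article’s ranges (650 to 1,020 MW and 5.7 to 8.9 TWh for 2023; 85.4 to 134 TWh for 2027), which suggests these are the per-server assumptions behind the headline figures.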

The sources for this piece include an article in The Register.
