AI adoption threatens a surge in datacenter electricity consumption


The explosion of interest in artificial intelligence (AI), driven by large language models (LLMs) and generative AI, is pushing the technology into a wide variety of applications. This has led to concerns that the processing needed to support this growth will cause a surge in datacenter electricity consumption.

In a new paper, Alex de Vries, a researcher at the Vrije Universiteit Amsterdam, argues that the inference phase of AI models, which involves operating trained models to make predictions, is receiving relatively little attention when it comes to sustainability research. Yet, there are indications that inferencing may contribute significantly to an AI model’s life-cycle costs.

De Vries cites the example of ChatGPT, an LLM developed by OpenAI. Supporting ChatGPT was estimated to require 3,617 servers based on the Nvidia HGX A100 platform, fitted with a total of 28,936 GPUs, implying an energy demand of 564 MWh per day. This compares with an estimated 1,287 MWh consumed for the entire training phase of the GPT-3 model.
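The daily figure follows from simple arithmetic. A minimal sketch, assuming a draw of roughly 6.5 kW per HGX A100 server (a figure consistent with Nvidia's published specifications; the article itself does not state the per-server wattage):

```python
# Rough reproduction of the paper's ChatGPT energy estimate.
# ASSUMPTION: ~6.5 kW per Nvidia HGX A100 8-GPU server.
SERVERS = 3_617
KW_PER_SERVER = 6.5

daily_mwh = SERVERS * KW_PER_SERVER * 24 / 1_000  # kWh -> MWh
print(f"{daily_mwh:.0f} MWh/day")  # matches the ~564 MWh cited
```

On these assumptions the fleet draws about 23.5 MW continuously, so ChatGPT's inference demand would overtake GPT-3's entire training energy in a little over two days of operation.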

De Vries also notes that Google is introducing AI-powered search capabilities into its search engine, following Microsoft's move to add chatbot-powered AI search features to the Bing search engine earlier this year. He refers to a quote from Alphabet's chairman that AI search would "likely cost 10 times more than a standard keyword search", suggesting an electricity consumption of approximately 3 Wh per query.

Quoting analyst estimates that Nvidia will ship 100,000 of its AI server platforms in 2023, de Vries calculates that these servers would have a combined power demand of 650 to 1,020 MW, consuming 5.7 to 8.9 TWh of electricity annually. Compared with a historical estimate of 205 TWh for annual datacenter electricity consumption, "this is almost negligible," de Vries states.

He also points out that Nvidia might be shipping 1.5 million units of its AI server platforms annually by 2027, which would consume 85.4 to 134 TWh of electricity per year. At that scale, these servers could represent a significant contribution to global datacenter electricity consumption, the paper states.
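Both shipment projections can be checked with the same scaling arithmetic. A sketch, assuming a per-unit power band of 6.5 to 10.2 kW (the low/high band implied by the 650 to 1,020 MW figure for 100,000 units):

```python
# Scaling a per-server power assumption to projected Nvidia shipments.
# ASSUMPTION: 6.5-10.2 kW per AI server platform, running continuously.
HOURS_PER_YEAR = 8_760

def annual_twh(units: int, kw_per_unit: float) -> float:
    """Fleet electricity use in TWh/year at full utilization."""
    return units * kw_per_unit * HOURS_PER_YEAR / 1e9  # kWh -> TWh

for units in (100_000, 1_500_000):
    low, high = annual_twh(units, 6.5), annual_twh(units, 10.2)
    print(f"{units:>9,} units: {low:.1f} to {high:.1f} TWh/year")
```

This reproduces the article's 5.7 to 8.9 TWh range for 2023 shipments and the 85.4 to 134 TWh range for the 2027 scenario; note the calculation assumes the hardware runs at full power year-round, which is why de Vries presents it as an upper bound.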

The sources for this piece include an article in The Register.
