AI adoption threatens to surge datacenter electricity consumption


The explosion of interest in artificial intelligence (AI), driven by large language models (LLMs) and generative AI, is driving adoption of the technology across a wide variety of applications. This has led to concerns that the processing needed to support this growth will cause a surge in datacenter electricity consumption.

In a new paper, Alex de Vries, a researcher at the Vrije Universiteit Amsterdam, argues that the inference phase of AI models, in which trained models are operated to make predictions, has received relatively little attention in sustainability research. Yet there are indications that inference may contribute significantly to an AI model’s life-cycle energy costs.

De Vries cites the example of ChatGPT, an LLM developed by OpenAI. To support ChatGPT, OpenAI required 3,617 servers based on the Nvidia HGX A100 platform fitted with a total of 28,936 GPUs, implying an energy demand of 564 MWh per day. This compares with the estimated 1,287 MWh used for the GPT-3 model’s training phase.
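The arithmetic behind that daily figure can be sketched in a few lines. The article only gives the totals; the roughly 6.5 kW per-server power draw used below is an assumption introduced here to reconcile the numbers, not a figure stated in the piece:

```python
# Back-of-envelope check of the ChatGPT energy figure cited above.
# ASSUMPTION: each HGX A100 server draws about 6.5 kW; the article
# itself only reports the server count and the daily total.
servers = 3_617
power_per_server_kw = 6.5  # assumed per-server draw, not from the article

daily_mwh = servers * power_per_server_kw * 24 / 1_000  # kWh -> MWh
print(f"{daily_mwh:.0f} MWh per day")  # prints "564 MWh per day"
```

Under that assumption, the implied daily inference demand reaches the cited 1,287 MWh training estimate for GPT-3 in a little over two days of operation.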

De Vries also notes that Google is introducing AI-powered search capabilities into its search engine, following Microsoft’s move to add chatbot-powered AI search features to the Bing search engine earlier this year. He refers to a quote from Alphabet’s chairman that such a search would “likely cost 10 times more than a standard keyword search”, suggesting an electricity consumption of approximately 3 Wh per search.

Citing analyst estimates that Nvidia will ship 100,000 of its AI server platforms in 2023, de Vries calculates that these servers would have a combined power demand of 650 to 1,020 MW, consuming 5.7 to 8.9 TWh of electricity annually. Compared to a historical estimate of 205 TWh for annual datacenter electricity consumption, “this is almost negligible”, de Vries states.
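The conversion from the quoted power-demand range to annual consumption is a straightforward hours-per-year calculation, which can be verified as follows:

```python
# Convert the quoted combined power demand (MW) into annual
# electricity consumption (TWh), assuming continuous operation.
HOURS_PER_YEAR = 8_760  # 365 days * 24 hours

for power_mw in (650, 1_020):
    # MW * hours = MWh; divide by 1,000,000 to get TWh
    twh = power_mw * HOURS_PER_YEAR / 1_000_000
    print(f"{power_mw} MW -> {twh:.1f} TWh/year")
# prints:
# 650 MW -> 5.7 TWh/year
# 1020 MW -> 8.9 TWh/year
```

Both endpoints reproduce the 5.7 to 8.9 TWh range quoted in the paper.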

He also points out that Nvidia might be shipping 1.5 million units of its AI server platforms by 2027, consuming 85.4 to 134 TWh of electricity. At this stage, these servers could represent a significant contribution to global datacenter electricity consumption, the paper states.
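The 2027 range follows from scaling the 2023 estimate by shipment volume: 1.5 million units against 100,000 is a 15-fold increase, which can be checked directly:

```python
# The 2027 projection is the 2023 power-demand range scaled by
# shipment volume: 1.5 million units / 100,000 units = 15x.
HOURS_PER_YEAR = 8_760
scale = 1_500_000 / 100_000  # 15x

for power_mw in (650, 1_020):
    twh_2027 = power_mw * HOURS_PER_YEAR / 1_000_000 * scale
    print(f"{twh_2027:.1f} TWh")
# prints:
# 85.4 TWh
# 134.0 TWh
```

This reproduces the 85.4 to 134 TWh range cited from the paper.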

The sources for this piece include an article in The Register.
