Could AI really consume 25 percent of US power by 2030?


The rapid development of generative AI is driving technological innovation, but it also brings an escalating energy demand that raises concerns about sustainability and power consumption. Ami Badani, Chief Marketing Officer of Arm Holdings, addressed these concerns at Fortune’s Brainstorm AI conference in London, highlighting the need for energy-efficient solutions to support the advancement of AI technologies.

Large language models like OpenAI’s GPT-4 currently require vast amounts of electricity, with Badani noting that these AI systems use 15 times more energy than a traditional web search. The proliferation of generative AI in business and the race to create even more sophisticated tools are set to drive compute demand higher, potentially stressing power grids beyond their limits.
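For a rough sense of that gap, the back-of-the-envelope sketch below multiplies the 15x figure Badani cited by an assumed per-search energy cost; the 0.3 Wh baseline and the daily query volume are illustrative assumptions introduced here, not figures from the article or from Arm.

```python
# Back-of-the-envelope sketch of the "15x a web search" claim.
# The 0.3 Wh/search baseline and the daily query volume are
# illustrative assumptions, not figures from the article.

WH_PER_WEB_SEARCH = 0.3          # assumed energy of one traditional search (Wh)
LLM_MULTIPLIER = 15              # Badani's figure: an LLM query uses ~15x more energy
QUERIES_PER_DAY = 1_000_000_000  # assumed daily query volume (illustrative only)

wh_per_llm_query = WH_PER_WEB_SEARCH * LLM_MULTIPLIER
daily_llm_energy_mwh = wh_per_llm_query * QUERIES_PER_DAY / 1e6  # Wh -> MWh

print(f"One LLM query: ~{wh_per_llm_query:.1f} Wh vs ~{WH_PER_WEB_SEARCH} Wh for a web search")
print(f"At {QUERIES_PER_DAY:,} queries/day: ~{daily_llm_energy_mwh:,.0f} MWh/day")
```

Even with these rough inputs, a per-query difference of a few watt-hours compounds into thousands of megawatt-hours per day at web scale.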

Sora, OpenAI’s latest breakthrough capable of generating video clips from text prompts, epitomizes the tension between AI innovation and energy use. Training such an AI model necessitates a fleet of AI chips operating at full capacity, indicative of the hefty power requirements inherent in cutting-edge AI research.

With data centers—AI’s training grounds—already accounting for 2% of global electricity usage, there’s a projected surge in power consumption as AI technologies become increasingly commonplace. Badani’s forecast suggests that generative AI could consume a staggering 25% of the United States’ power by 2030, unless substantial changes are made.
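To make the 25% projection concrete, here is a minimal arithmetic sketch that assumes annual US electricity consumption of roughly 4,000 TWh; that baseline is an approximation introduced for illustration, and only the 25% share comes from Badani's forecast.

```python
# Rough translation of the "25% of US power by 2030" projection into TWh.
# The ~4,000 TWh US consumption baseline is an assumed approximation,
# not a figure from the article; only the 25% share is Badani's forecast.

US_ANNUAL_ELECTRICITY_TWH = 4_000   # assumed annual US electricity consumption (TWh)
PROJECTED_AI_SHARE_2030 = 0.25      # Badani's projected generative AI share by 2030

projected_ai_twh = US_ANNUAL_ELECTRICITY_TWH * PROJECTED_AI_SHARE_2030
print(f"25% of ~{US_ANNUAL_ELECTRICITY_TWH:,} TWh/year is ~{projected_ai_twh:,.0f} TWh/year for generative AI")
```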

Badani made the case that Arm’s energy-efficient RISC processor designs, which power the majority of smartphones, could apply the same low-power principles to AI chip development. The goal is to create semiconductors that run AI workloads at minimal energy cost, balancing performance with sustainability.

The AI industry stands at a critical juncture where the marvel of generative AI must be reconciled with ecological responsibility. Whether it’s Arm or some other architecture, it’s clear that we need a new generation of AI that can scale responsibly without overwhelming our electrical infrastructure, ensuring that AI’s future is not only smart but also sustainable.
