The impact of Artificial Intelligence on the environment


Artificial intelligence (AI) is a growing industry that involves the development of machines capable of performing tasks that normally require human intelligence, such as speech recognition, decision-making, and language translation. The field is booming, with many businesses investing in AI to streamline their processes and gain a competitive advantage.

This thriving industry, however, has a significant environmental impact due to its high energy consumption. AI is based on machine learning algorithms that must process large amounts of data quickly. That processing requires substantial computing power, which in turn drives high electricity consumption. The energy used to power AI data centers is primarily derived from fossil fuels, whose carbon emissions contribute to climate change.

Cloud computing, which is used by Microsoft, Alphabet, and ChatGPT maker OpenAI, relies on thousands of chips inside servers in massive data centers around the world to train AI algorithms, called models, by analyzing data that helps them “learn” to perform tasks. AI consumes more energy than other forms of computing, and training a single model can consume more energy than 100 US households use in a year.

The amount of energy consumed by AI systems is expected to rise as the technology spreads across industries. Although no one knows exactly how much total electricity use and carbon emission can be attributed to AI, it is already a significant factor.

According to a research paper published in 2021, training GPT-3 took 1.287 gigawatt hours, or about as much electricity as 120 US homes consume in a year. According to the same paper, that training produced 502 tons of carbon emissions, roughly the amount that 110 US cars emit in a year. OpenAI's GPT-3 employs 175 billion parameters, or variables, that the AI system has learned through training and retraining. Its forerunner, GPT-2, used only 1.5 billion.
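The household and car comparisons above follow from simple division. A back-of-the-envelope check, assuming approximate US averages of about 10,700 kWh of electricity per home per year and about 4.6 metric tons of CO2 per passenger car per year (the cited paper may have used slightly different baselines):

```python
# Sanity-checking the cited GPT-3 training figures with rough US averages.
TRAINING_ENERGY_KWH = 1_287_000       # 1.287 gigawatt hours, as reported
TRAINING_EMISSIONS_TONS = 502         # metric tons of CO2, as reported

AVG_HOME_KWH_PER_YEAR = 10_700        # assumption: approximate US average household use
AVG_CAR_TONS_CO2_PER_YEAR = 4.6       # assumption: approximate US average per car

homes = TRAINING_ENERGY_KWH / AVG_HOME_KWH_PER_YEAR
cars = TRAINING_EMISSIONS_TONS / AVG_CAR_TONS_CO2_PER_YEAR

print(f"~{homes:.0f} US homes' annual electricity use")  # roughly 120
print(f"~{cars:.0f} US cars' annual emissions")          # roughly 109
```

The results land close to the article's "120 homes" and "110 cars" figures, which suggests baselines in this ballpark.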

Meanwhile, OpenAI is already working on GPT-4, and models must be retrained regularly to stay aware of current events; how much energy that will consume is unknown.

The sources for this piece include an article in DataCenterKnowledge.
