The impact of Artificial Intelligence on the environment

Artificial intelligence (AI) is a growing industry built around machines capable of performing tasks that normally require human intelligence, such as speech recognition, decision-making, and language translation. The field is booming, with many businesses investing in AI to improve their processes and gain a competitive advantage.

This thriving industry, however, has a significant environmental impact due to its high energy consumption. AI is based on machine learning algorithms that must process large amounts of data quickly. That processing requires substantial computing power, which translates into high energy use. The electricity powering AI data centers is still largely generated from fossil fuels, producing carbon emissions that contribute to climate change.

Cloud computing, which is used by Microsoft, Alphabet, and ChatGPT maker OpenAI, relies on thousands of chips inside servers in massive data centers around the world to train AI algorithms, called models, which analyze data to “learn” to perform tasks. AI consumes more energy than other forms of computing, and training a single model can use more electricity than 100 US households consume in a year.

The amount of energy consumed by AI systems is expected to rise as the technology spreads across industries. Although no one knows exactly how much of total electricity use and carbon emissions can be attributed to AI, it is a significant and growing contributor.

According to a research paper published in 2021, training GPT-3 took 1.287 gigawatt hours (1,287 megawatt hours) of electricity, about as much as 120 US homes consume in a year. According to the same paper, that training produced 502 tons of carbon emissions, roughly what 110 US cars emit in a year. OpenAI’s GPT-3 uses 175 billion parameters, or variables, that the system has learned through training and retraining; its predecessor, GPT-2, used only 1.5 billion.
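As a rough back-of-the-envelope check on those comparisons, the short sketch below divides the reported training totals by approximate US averages. The per-household and per-car reference values are assumptions based on widely cited EIA and EPA estimates, not figures from the paper itself:

# Sanity-checking the GPT-3 training comparisons cited above (Python)

TRAINING_ENERGY_KWH = 1_287_000      # 1,287 MWh (1.287 GWh) reported for GPT-3 training
TRAINING_EMISSIONS_TONS = 502        # metric tons of CO2 reported for GPT-3 training

# Assumed reference values (approximate US averages, not from the cited paper):
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_600  # typical annual US household electricity use (EIA estimate)
AVG_CAR_TONS_CO2_PER_YEAR = 4.6      # typical annual passenger-car CO2 emissions (EPA estimate)

homes_equivalent = TRAINING_ENERGY_KWH / AVG_HOUSEHOLD_KWH_PER_YEAR
cars_equivalent = TRAINING_EMISSIONS_TONS / AVG_CAR_TONS_CO2_PER_YEAR

print(f"Energy: roughly {homes_equivalent:.0f} US homes' annual electricity use")  # ~121
print(f"Emissions: roughly {cars_equivalent:.0f} US cars' annual CO2 output")      # ~109

With these assumed averages, the results come out close to the 120-home and 110-car figures quoted in the paper.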

Despite this, OpenAI is already working on GPT-4, and models must be retrained regularly to stay aware of current events; how much energy each retraining cycle consumes is unknown.

The sources for this piece include an article in DataCenterKnowledge.
