AI technology is increasing global energy consumption (Disclaimer: AI Generated Image)

New Delhi:

According to a 2009 Google report, a traditional Google search uses an average of 0.0003 kilowatt-hours (kWh) of energy, enough to power a 9-watt household light bulb for about two minutes. As of 2023, Google handles an average of 8.5 billion searches per day, which translates to 2,550,000 kWh of electricity per day, nearly 2,000 times the electricity an average Indian uses in an entire year (1,255 kWh).

This May, Google announced that it would integrate AI into its search engine, powered by its most capable AI model, Gemini. According to Alex de Vries, a Dutch data scientist who spoke to The New Yorker on the subject, a Google search that incorporates AI will use about 10 times more energy (roughly 0.003 kWh, or 3 watt-hours) than a traditional search. At 8.5 billion searches a day, that works out to about 25.5 million kWh per day, more than 20,000 times what an average Indian consumes in a year.
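These comparisons follow from simple arithmetic on the figures cited above. The snippet below is a rough, illustrative back-of-envelope calculation, not an official methodology: the per-search figures are the 2009 Google estimate and de Vries's estimate for AI-assisted search, and the search volume and per-capita consumption are as reported in this article.

```python
# Back-of-envelope calculation using the estimates cited in this article.
# All inputs are reported figures, not measurements.

TRADITIONAL_SEARCH_KWH = 0.0003   # 2009 Google estimate per search
AI_SEARCH_KWH = 0.003             # de Vries's estimate: ~10x a traditional search
SEARCHES_PER_DAY = 8.5e9          # Google's reported daily search volume (2023)
INDIAN_ANNUAL_KWH = 1255          # average annual per-capita consumption in India

daily_traditional = TRADITIONAL_SEARCH_KWH * SEARCHES_PER_DAY   # ~2.55 million kWh
daily_ai = AI_SEARCH_KWH * SEARCHES_PER_DAY                     # ~25.5 million kWh

print(f"Traditional search, daily: {daily_traditional:,.0f} kWh "
      f"(~{daily_traditional / INDIAN_ANNUAL_KWH:,.0f}x one Indian's yearly use)")
print(f"AI-assisted search, daily: {daily_ai:,.0f} kWh "
      f"(~{daily_ai / INDIAN_ANNUAL_KWH:,.0f}x one Indian's yearly use)")
```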

De Vries, who is also the founder of Digiconomist, the organization behind the Bitcoin Energy Consumption Index, has estimated that if Google were to run AI on every search, the company's electricity consumption could reach about 29 terawatt-hours (TWh), or 29 billion kWh, per year. This figure is roughly equal to the annual electricity consumption of Ireland, and more than that of Kenya.

Why are AI systems so power hungry?

AI systems require a lot of computational power to run complex algorithms that process large and growing corpora of data. When you enter a prompt in ChatGPT, the chatbot processes it on servers located in data centers. According to the International Energy Agency, these centers alone account for 1-1.5 percent of total global electricity consumption.

“I think we still don’t understand the energy requirements of this (AI) technology,” OpenAI CEO Sam Altman said at a public event in Davos this January. Altman spoke of an urgent need for “breakthrough” technologies such as nuclear fusion to power AI operations, given how fast the frontier technology is growing. It is a clear indication that the industry leaders behind large language model chatbots are looking for ways to meet current power demand and to secure supply for the future.

AI’s Carbon Emissions: A Blow to the UN’s 2050 Net-Zero Goal?

According to The Shift Project, a French non-profit working to reduce dependence on fossil fuels, data centers, which power cloud computing as well as AI systems, generate 2.5 to 3.5 percent of global greenhouse gas emissions. That is equivalent to the greenhouse gas emissions of the entire aviation industry.

The energy consumption and resulting carbon footprints of different AI models vary significantly. For example, the BigScience project’s BLOOM, an AI model with 176 billion parameters (internal variables that a model learns during training), consumed 433 megawatt-hours (MWh) of electricity during training.

In contrast, OpenAI’s GPT-3, released in 2020 with 175 billion parameters, consumed about three times as much electricity, 1,287 MWh, according to data from the Artificial Intelligence Index Report 2024, published by the Stanford Institute for Human-Centered Artificial Intelligence (HAI).

CO2-equivalent emissions, i.e. total greenhouse gas emissions expressed in terms of carbon dioxide, were 25 tonnes for BLOOM; for GPT-3, they were 20 times higher, at 502 tonnes.
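The ratios quoted above follow directly from these reported training figures. The short sketch below simply restates that arithmetic; the numbers are the ones attributed to the AI Index Report 2024 in this article, not independently verified.

```python
# Ratios implied by the training figures quoted from the AI Index Report 2024.

models = {
    # name: (training electricity in MWh, CO2-equivalent emissions in tonnes)
    "BLOOM": (433, 25),
    "GPT-3": (1287, 502),
}

bloom_mwh, bloom_co2 = models["BLOOM"]
gpt3_mwh, gpt3_co2 = models["GPT-3"]

print(f"GPT-3 used {gpt3_mwh / bloom_mwh:.1f}x the electricity of BLOOM")  # ~3.0x
print(f"GPT-3 emitted {gpt3_co2 / bloom_co2:.1f}x the CO2e of BLOOM")      # ~20.1x
```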

The report also notes that there is a serious lack of transparency from AI developers on the environmental impact of their models, with most failing to make their carbon footprints public.

Google’s recently released Sustainability Report 2024 also shows the energy appetite of the new AI technology. The company has seen a nearly 50 percent increase in carbon emissions over the past five years, driven largely by the energy needed to power its new AI systems. Microsoft’s Sustainability Report 2024 shows a similar trend, with a 29 percent increase in CO2 emissions in 2023 compared to its 2020 baseline.

According to SemiAnalysis, an independent US-based AI research and analysis firm, AI will drive data centers to consume 4.5 percent of global energy generation by 2030. Another estimate, from the International Energy Agency, suggests that the total electricity consumption of data centers could double from 2022 levels to 1,000 TWh by 2026, roughly the equivalent of Japan’s current electricity consumption. India has around 138 data centers, with 45 more reportedly set to become operational by the end of 2025.

Lawmakers are beginning to take stock of the situation. The European Union adopted a new regulation in March this year under which all data center operators are required to report their energy consumption and their water consumption (water is used in cooling systems). They are also required to provide information on the efficiency measures they plan to implement to reduce both.

In February, US Democrats introduced the Artificial Intelligence Environmental Impact Act of 2024. The Act proposes to establish an AI Environmental Impact Consortium comprising experts, researchers and industry stakeholders to address the environmental impact of AI.


