
Edgar Cervantes / Android Authority

Everything comes at a price, and AI is no different. While ChatGPT and Gemini may be free to use, they require a surprising amount of computational power to operate. And if that wasn't enough, Big Tech is currently engaged in an arms race to build bigger and better models like GPT-5. Critics say this growing demand for powerful, energy-hungry hardware will have a devastating impact on climate change. So how much energy does an AI like ChatGPT use, and what does that power draw mean from an environmental perspective? Let's break it down.

ChatGPT Energy Consumption: How Much Power Does AI Need?


Calvin Wankhede / Android Authority

OpenAI's older GPT-3 large language model required just under 1,300 megawatt-hours (MWh) of electricity for training, which is equivalent to the annual electricity consumption of about 120 American households. For some perspective, an average American household uses north of 10,000 kilowatt-hours per year. That's not all: AI models also require computing power to process each query, a step called inference. That takes many powerful servers spread across thousands of data centers globally. At the heart of these servers are typically NVIDIA's H100 chips, which consume up to 700 watts each and are deployed by the hundreds.
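The household comparison above is easy to sanity-check with a quick back-of-envelope calculation. The 10,500 kWh/year figure below is an assumption, chosen to sit just "north of 10,000" as the text puts it:

```python
# Back-of-envelope check: GPT-3 training energy vs. US household consumption.
# These are the approximate figures cited in the article, not official numbers.
gpt3_training_mwh = 1_300          # ~1,300 MWh to train GPT-3
household_kwh_per_year = 10_500    # assumed average US household draw

training_kwh = gpt3_training_mwh * 1_000               # MWh -> kWh
equivalent_homes = training_kwh / household_kwh_per_year

print(f"{equivalent_homes:.0f} households for a year")  # prints 124 households for a year
```

That lands right around the "about 120 households" quoted above, so the two figures in the paragraph are consistent with each other.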

Estimates vary, but most researchers agree that ChatGPT alone requires a few hundred megawatt-hours each day. That's enough to power thousands, and perhaps tens of thousands, of American homes for a year. Given that ChatGPT is no longer the only generative AI player in town, it stands to reason that usage will only grow from here.
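To see where the "tens of thousands of homes" figure comes from, pick a hypothetical daily draw inside the "few hundred MWh" range above and annualize it:

```python
# Hypothetical daily draw within the "few hundred MWh" range cited in the article.
chatgpt_mwh_per_day = 500          # assumed midpoint, not a measured figure
household_kwh_per_year = 10_500    # assumed average US household draw

annual_kwh = chatgpt_mwh_per_day * 1_000 * 365   # MWh/day -> kWh/year
homes_powered_for_a_year = annual_kwh / household_kwh_per_year
print(f"~{homes_powered_for_a_year:,.0f} homes")
```

At 500 MWh per day the result is in the high tens of thousands of homes, which matches the article's phrasing; a lower daily figure would put it in the thousands.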

AI could account for 0.5% of the world's electricity consumption by 2027.

A paper published in 2023 attempted to calculate how much electricity the generative AI industry will consume over the next few years. Its author, Alex de Vries, estimates that market leader NVIDIA will ship 1.5 million AI server units by 2027. If that holds, AI servers would consume 85.4 to 134 terawatt-hours (TWh) of electricity per year, more than the annual electricity consumption of smaller countries such as the Netherlands, Bangladesh, and Sweden.

While these are certainly alarmingly high figures, it's worth noting that total worldwide electricity production was around 29,000 TWh a few years ago. In other words, AI servers would account for nearly half a percent of the world's electricity consumption by 2027. Is that still a lot? Yes, but it needs to be put into perspective.
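The "half a percent" framing follows directly from de Vries' range and the worldwide production figure quoted above:

```python
# De Vries' 2027 estimate range vs. total worldwide electricity production,
# using the approximate figures from the article.
ai_twh_low, ai_twh_high = 85.4, 134.0
world_twh = 29_000   # approximate worldwide electricity production

low_pct = ai_twh_low / world_twh * 100
high_pct = ai_twh_high / world_twh * 100
print(f"{low_pct:.2f}%-{high_pct:.2f}% of worldwide production")  # 0.29%-0.46%
```

Even the worst case comes out just under half a percent, which is where the 0.5% headline figure comes from.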

The power consumption of AI matters


AI may consume as much electricity as smaller nations produce, but it's not the only industry to do so. In fact, the data centers that power the rest of the internet consume far more than those dedicated to AI, and demand on that front was growing regardless of new releases like ChatGPT. According to the International Energy Agency, all of the world's data centers use 460 TWh today. That trend line has been climbing steeply since the end of the Great Recession in 2009, and AI had no part in it until the end of 2022.

Even if we consider the researcher's worst-case scenario from above and assume AI servers will account for 134 TWh of electricity, that pales in comparison to the world's total data center consumption. Netflix alone used enough electricity to power 40,000 American homes in 2019, a number that has certainly grown since then, yet you don't see anyone calling to kill internet streaming entirely. Air conditioners, meanwhile, account for 10% of global electricity consumption, or roughly 20 times AI's worst-case 2027 consumption estimate.
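The air conditioning comparison can be reproduced from the figures already quoted in this article; the numbers are rough, so the ratio comes out near, rather than exactly at, 20:

```python
# Air conditioning vs. AI's worst-case 2027 estimate, using the article's figures.
ac_share = 0.10             # AC's share of global electricity consumption
world_twh = 29_000          # approximate worldwide electricity production
ai_worst_case_twh = 134     # de Vries' upper 2027 estimate

ac_twh = ac_share * world_twh
ratio = ac_twh / ai_worst_case_twh
print(f"AC ~= {ac_twh:,.0f} TWh, about {ratio:.0f}x AI's worst case")
```

With these round inputs the ratio works out to roughly 22x, consistent with the "20 times" in the text.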

AI’s overall power consumption is lower than that of global data centers.

AI's power consumption can also be compared to Bitcoin's energy controversy. Like AI, Bitcoin has been heavily criticized for its high power consumption, with many calling it a serious environmental threat. Yet the financial incentives of mining have pushed its adoption toward regions with cheap and renewable energy sources, where electricity is abundant and might otherwise be underutilized or wasted. All of this means we should really be asking about AI's carbon footprint, not just focusing on raw power consumption figures.

The good news is that, much like cryptocurrency mining operations, data centers are often strategically located in regions where electricity is either abundant or cheap to produce. This is why renting a server in Singapore can be much cheaper than in Chicago.

Google aims to run all of its data centers on carbon-free energy 24/7 by 2030. And according to the company's 2024 Environmental Report, 64% of its data centers' electricity consumption already comes from carbon-free energy sources. Microsoft has set a similar goal, and its Azure data centers power ChatGPT.

Increased efficiency: Could AI’s power demand plateau?


Robert Triggs / Android Authority

As generative AI technology continues to mature, companies are also developing smaller and more efficient models. Since ChatGPT's release in late 2022, we've seen many models that prioritize efficiency without sacrificing performance. Some of these newer AI models can deliver results comparable to larger predecessors from just a few months earlier.

For example, OpenAI's recent GPT-4o mini is significantly cheaper than the GPT-3.5 Turbo it replaces. The company hasn't disclosed efficiency figures, but the sharp reduction in API costs points to a large reduction in compute (and thus power consumption).

We've also seen a shift toward on-device processing for tasks like summarization and translation, which smaller models can handle. While you could argue that new software suites such as Galaxy AI increase power consumption on the device itself, that trade-off can be offset by the productivity gains they enable. I, for one, would happily trade slightly worse battery life for the ability to get real-time translations anywhere in the world. That sheer convenience may make the slight increase in energy consumption worthwhile for many others too.

Still, not everyone sees AI as a necessary or beneficial development. For some, any additional use of energy is seen as unnecessary or wasteful, and no amount of efficiency gains can change that. Only time will tell if AI is a necessary evil, like so many other technologies in our lives, or if it’s just a waste of electricity.


