Friday, August 1, 2025

Consumers Are Footing the Bill for AI’s Insatiable Appetite for Energy

Yves here. Perhaps most readers were aware, but I have to admit that even with all of the considerable alarm about AI’s staggering energy demands, I had missed a key point. I had thought the ginormous energy appetite was driven by training: it is generally understood that widely deployed LLMs like ChatGPT still have marked gaps, so feeding more data into their maw is an ongoing activity, with accompanying large power needs. It turns out that using these AI models also carries an extremely high energy price tag.

In all seriousness, all of you need to stop using AI. Immediately. We’ll provide some examples below of the simply grotesque energy costs of using AI even for seemingly small tasks. By using AI at all, you are directly and greatly contributing to the acceleration of climate change and other resource degradation, such as increased water use. If you drive an EV out of environmental concern but still use AI, you are either ignorant or a hypocrite.

This article fails to point out that, as far as the US is concerned, we’ve chosen to intensify the damage. Silicon Valley and its bought-and-paid-for political allies have lashed themselves to the mast of these so-called generative AI models, when DeepSeek has already proven to be more efficient in terms of power needs. From Energy Digital, summarizing a study by Greenly which compared ChatGPT-4 with DeepSeek:

Building and running large language models (LLMs) such as ChatGPT-4 and DeepSeek requires substantial computing power.

This involves not only high electricity use but also dependence on water for cooling and energy-intensive chip manufacturing.

AI hardware production involves mining rare earth minerals, a process that can result in soil erosion, water contamination and wider pollution.

ChatGPT-4, for example, operates with 1.8 trillion parameters – 20 times more than earlier versions…

In a scenario where an organisation relies on ChatGPT-4 to answer one million emails per month, Greenly calculates the yearly emissions at 7,138 tCO₂e – equivalent to 4,300 round-trip flights from Paris to New York.

Even small tasks carry energy costs.

According to research from Carnegie Mellon University and Hugging Face, a single text-based prompt consumes as much energy as charging a smartphone to 16%.

Under routine conditions, the same email use would still produce 514 tCO₂e per year.

Greenly found that text-to-image models like DALL-E produce up to 60 times more CO₂e than standard text generation.

Amid concerns about AI’s energy demands, DeepSeek offers a potential way forward.

The Chinese-developed model employs a Mixture-of-Experts (MoE) architecture, meaning it only activates relevant sub-models for each task rather than the entire model.

This drastically reduces the power required per operation.

Whereas ChatGPT-4 was trained using 25,000 NVIDIA GPUs and Meta’s Llama 3.1 used 16,000, DeepSeek used just 2,000 NVIDIA H800 chips.

These chips also draw less power than previous models.

As a result, DeepSeek consumed a tenth of the GPU hours compared to Meta’s model.

This not only brings down its carbon footprint but also lessens the load on servers and reduces water usage needed for cooling.

See this Q&A from The Grainger College of Engineering for more technical detail on the merits of DeepSeek.
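The Mixture-of-Experts routing described in the excerpt can be sketched in a few lines of Python. This is a toy illustration of the general technique, not DeepSeek’s actual architecture: the dimensions, the random weights, and the top-2 gating are all illustrative assumptions. The point is that only a small fraction of the expert weights are multiplied for any given token, which is where the power saving comes from.

```python
import math
import random

random.seed(0)

# Hypothetical toy sizes -- real MoE models use far larger dimensions
# and many more experts.
D_MODEL, N_EXPERTS, TOP_K = 8, 8, 2

def rand_matrix(rows, cols):
    return [[random.gauss(0, 1) for _ in range(cols)] for _ in range(rows)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

# Each "expert" is its own small feed-forward weight matrix.
experts = [rand_matrix(D_MODEL, D_MODEL) for _ in range(N_EXPERTS)]
router = rand_matrix(N_EXPERTS, D_MODEL)  # gating weights: one score per expert

def moe_forward(x):
    """Route a token vector through only its top-k experts."""
    scores = matvec(router, x)
    top = sorted(range(N_EXPERTS), key=lambda i: scores[i])[-TOP_K:]
    z = [math.exp(scores[i]) for i in top]
    gates = [g / sum(z) for g in z]  # softmax over the chosen experts only
    out = [0.0] * D_MODEL
    for g, i in zip(gates, top):  # only TOP_K of N_EXPERTS matrices are used
        for j, val in enumerate(matvec(experts[i], x)):
            out[j] += g * val
    return out

x = [random.gauss(0, 1) for _ in range(D_MODEL)]
y = moe_forward(x)
print(f"active experts per token: {TOP_K}/{N_EXPERTS}")  # 2/8 = 25% of expert FLOPs
```

With 2 of 8 experts active, each token touches a quarter of the expert weights; a dense model of the same total size would multiply through all of them on every token.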

By Haley Zaremba, a writer and journalist based in Mexico City. Originally published at OilPrice

  • The rapid growth of data centers, particularly due to AI, is significantly increasing energy demand and jeopardizing clean energy initiatives by extending the life of fossil fuel plants and promoting new ones.
  • The issue is compounded by “phantom data centers,” which inflate projected energy demand and give utilities leverage to expand fossil fuel infrastructure.
  • This surge in energy demand and the resulting infrastructure projects are projected to lead to higher energy bills for consumers, especially in the Southeast United States.

As data centers place more and more demand on global power grids, policy and economic priorities are shifting from creating more clean energy to creating more energy, period. Projected clean energy additions are simply not enough to meet the runaway demand of the global tech sector, meaning that climate goals could be at risk.

The proliferation of artificial intelligence is causing massive increases in energy demand from data centers, and the areas that host them are struggling to keep up. A 2024 study from scientists at Cornell University found that generative AI systems like ChatGPT use up to 33 times more energy than computers running task-specific software. As a result, it is estimated that each AI-powered internet query consumes about ten times more energy than a traditional internet search. But these numbers are just our best guess – we don’t really know how much energy AI is sucking up, because the companies piloting AI platforms aren’t sharing those numbers.

But we know that the overall picture is pretty grim. Last year, Google stated that the company’s carbon emissions had skyrocketed by a whopping 48 percent over the last five years. “AI-powered services involve considerably more computer power – and so electricity – than standard online activity, prompting a series of warnings about the technology’s environmental impact,” the BBC reported last summer. While Google hasn’t publicly revised its goal of becoming carbon neutral by 2030, the tech firm has admitted that “as we further integrate AI into our products, reducing emissions may be challenging.”

Already, the uptick in energy demand from data centers is causing new plans for gas- and coal-powered plants as well as extending the life of existing fossil fuel operations across the United States. Utility Drive reports that “at least 17 fossil fuel generators originally scheduled for closure [are] now delaying retirement” due to data center demand, and that “utilities in Virginia, Georgia, North Carolina and South Carolina have proposed building 20,000 MW of new gas power plants by 2040” for the same reasons.

The issue is particularly acute in the Southeast. Major utilities in Virginia, North Carolina, South Carolina and Georgia project that they will collectively add 32,600 MW of electrical load over the next 15 years. The Institute for Energy Economics and Financial Analysis reports that in Virginia, South Carolina and Georgia, “data centers are responsible for 65% to more than 85% of projected load growth.”

However, it could be the case that this projected demand growth is overblown, and that states will add extra gas power capacity – and therefore extra greenhouse gas emissions – unnecessarily. Because the competition for energy sources is so fierce between data centers, the project managers of new centers are likely to reach out to many different power providers at once with speculative connection requests, creating redundancies and a compounding issue of “phantom data centers.” This inflates demand and makes accurate projecting extremely difficult.

A study published last year by Lawrence Berkeley National Lab calculated just how big the phantom data center issue might be, and found that projected energy demand could be as much as 255 terawatt-hours higher than real energy demand. That’s enough energy to power more than 24 million households.
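A quick back-of-envelope check shows how the 255 TWh figure translates into households. The per-household consumption below is an assumption (roughly the commonly cited US average of about 10,600 kWh per year), not a number from the study itself:

```python
# Sanity-check: how many households does 255 TWh of phantom demand represent?
PHANTOM_TWH = 255                  # possible over-projection, in terawatt-hours
AVG_HOUSEHOLD_KWH = 10_600         # assumed annual use of one average US household

phantom_kwh = PHANTOM_TWH * 1e9    # 1 TWh = 1 billion kWh
households = phantom_kwh / AVG_HOUSEHOLD_KWH
print(f"{households / 1e6:.1f} million households")  # ≈ 24.1 million
```

Under that assumption the arithmetic lands right on the “more than 24 million households” comparison in the study.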

However, it’s not in utilities’ interest to simplify interconnection processes and ferret out phantom data centers. In fact, the panic over rising energy needs from data centers is giving them great leverage to expand their businesses and push through huge fossil-fuel powered energy projects. Plus, while building new plants and extending the lives of old plants is costly, those costs will be borne by the ratepayers.

Consumers across the U.S. – and especially in the data-center-laden Southeast – can expect their energy bills to rise in response. “We are witnessing a massive transfer of wealth from residential utility customers to large corporations—data centers and large utilities and their corporate parents, which profit from building additional energy infrastructure,” Maryland People’s Counsel David Lapp recently told Business Insider. “Utility regulation is failing to protect residential customers, contributing to an energy affordability crisis.”


