Using AI.

AI requires a tremendous amount of energy.

According to ChatGPT, generating a single low-resolution image can require 100-300 watt-hours. A high-resolution image can require 1-5 kilowatt-hours (1,000 watt-hours = one kWh) of energy.

An electric vehicle can go ~3 miles on one kilowatt-hour, so one high-resolution image can require enough energy to drive 15 miles; converted to gasoline, that's about half a gallon (~2 liters).
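For anyone who wants to check the arithmetic, here's a minimal Python sketch of those conversions. The 3 mi/kWh figure is the one above; the ~30 mpg fuel economy is an assumed value added only to reproduce the half-gallon comparison.

```python
# Rough check of the unit conversions above; it says nothing about whether the
# per-image kWh claims themselves are accurate.
ENERGY_KWH = 5.0          # upper end of the claimed 1-5 kWh per high-res image
EV_MILES_PER_KWH = 3.0    # EV efficiency used above
ASSUMED_MPG = 30.0        # assumed fuel economy for the gasoline comparison
LITERS_PER_GALLON = 3.785

ev_miles = ENERGY_KWH * EV_MILES_PER_KWH       # -> 15 miles
gallons = ev_miles / ASSUMED_MPG               # -> ~0.5 gallons
print(f"{ENERGY_KWH:.0f} kWh ≈ {ev_miles:.0f} EV miles "
      f"≈ {gallons:.2f} gal ({gallons * LITERS_PER_GALLON:.1f} L) of gasoline")
```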



1 kWh can run a typical LED lightbulb for ~ 77 hours or a typical modern refrigerator for half a day.


Crafting an 800-word response can require ~10-100 watt-hours. So a thread with 20 responses can easily exceed one kWh.
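A quick sketch of that thread-total arithmetic, using the per-response range above:

```python
# Thread energy total from the assumed 10-100 Wh per ~800-word response.
WH_PER_RESPONSE_LOW, WH_PER_RESPONSE_HIGH = 10, 100
RESPONSES = 20

low_kwh = RESPONSES * WH_PER_RESPONSE_LOW / 1000    # -> 0.2 kWh
high_kwh = RESPONSES * WH_PER_RESPONSE_HIGH / 1000  # -> 2.0 kWh
print(f"A {RESPONSES}-response thread: {low_kwh:.1f} to {high_kwh:.1f} kWh")
```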

Solar power can cost ~$0.15-$0.20 per kWh over the life of a solar system. This is sobering when a Google search now automatically prompts an AI response.

AI is a lot of things but it sure as hell isn’t green.
 
Ever try using AI to write a story?
Yes, and it is *incredibly* good. The key is 1) prompting and 2) understanding that you are getting a first draft. Fortunately, there are sites you can use that handle the prompts--you put in the bits (who, when, where, and a brief synopsis) and the AI fleshes it out. Turns out I'm much happier as an editor than a writer. Probably because I can hardly type anymore.
 
AI requires a tremendous amount of energy.

This is way beyond reason. There is no universe in which this is accurate data, unless they count the energy used to train the AI as well.
 
Well, he does say he got the data from ChatGPT, so... I mean, of course it's incorrect.
Of course, but we should all know the kind of silliness that AI can produce. It's not that hard to check the facts first.
@AlexBailey has an obvious fascination with AI, but their posts are usually very reasonable, so this struck me as odd.
The data is literally off by orders of magnitude.
 
This is way beyond reason. There is no universe in which this is accurate data, unless they count the energy used to train the AI as well.

Yeah, that seems weird. Here's some data: https://arxiv.org/pdf/2311.16863

Some points of interest: energy costs/carbon emissions involved in making the hardware can be comparable to the running costs. Costs vary immensely depending on what kind of "AI" you're using (text is generally cheaper than images, generative tasks are more expensive than non-generative).

Re. images, they give a mean cost of ~3 kWh per 1000 images generated, i.e. about 3 Wh per image. AFAICT they don't give an indicative size for those images, but they appear to be based on user queries for image generation sites so presumably are intended to be typical requests (for some value of "typical"). By my reading those figures are based only on generation cost not training or hardware, so the total cost would be a bit higher, but even if one includes those other costs it's hard to see it coming close to the numbers GPT gave.

Of course, but we should all know the kind of silliness that AI can produce. It's not that hard to check the facts first.

It's not, and yet people don't. :-/

@AlexBailey has an obvious fascination with AI, but their posts are usually very reasonable, so this struck me as odd.
The data is literally off by orders of magnitude.

Looks like about three orders of magnitude to me - i.e. a factor of a thousand. I'd guess that GPT has ingested some of those "per 1000 images" data and is misrepresenting it as "per image", or is misrepresenting watt-hour values as kilowatt-hours.
 
Of course, but we should all know the kind of silliness that AI can produce. It's not that hard to check the facts first.
@AlexBailey has an obvious fascination with AI, but their posts are usually very reasonable, so this struck me as odd.
The data is literally off by orders of magnitude.
It's ChatGPT talking itself up, saying, "Look how much energy it takes to be me," then proceeding to make stuff up.
 
It's ChatGPT talking itself up, saying, "Look how much energy it takes to be me," then proceeding to make stuff up.
AI has been around for a while now, and we've all seen examples of AI hallucinations. We just need to approach things critically.
In this particular case, everything can be easily tested and calculated. There are apps that allow AI image generation locally, using our own PCs instead of cloud resources. I've seen plenty of users claiming that their high-end PCs can generate a relatively low- to mid-res image (1400x700 pixels or so) in 15-60 seconds. Results depend on optimization.

The workhorse in image generation is the GPU (graphics card), and most users claim to have an Nvidia RTX 4090, which is the choice of top gamers. That GPU peaks at 450 watts under maximum load. Say we add about 50 W for the CPU as well, even though the CPU does far less work in this case. So 500 W × (15 to 60 seconds) equals between 2 and 8 watt-hours (Wh) per image. This is, of course, just a rough calculation, but it's a far cry from the 100-300 Wh or more that ChatGPT claimed.
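If anyone wants to play with those numbers, here's the same back-of-the-envelope calculation as a small Python sketch; the 450 W + 50 W draw and the 15-60 second timings are the assumptions stated above, not measurements.

```python
# Back-of-the-envelope energy for one locally generated image.
# Assumptions from above: RTX 4090 peaking at ~450 W plus ~50 W for the CPU,
# and 15-60 seconds of generation time depending on optimization.
POWER_W = 450 + 50

def wh_per_image(seconds: float, power_w: float = POWER_W) -> float:
    """Watt-hours consumed by one image that takes `seconds` to generate."""
    return power_w * seconds / 3600

print(f"{wh_per_image(15):.1f} Wh (15 s) to {wh_per_image(60):.1f} Wh (60 s) per image")
# -> roughly 2.1 to 8.3 Wh, versus the 100-300+ Wh figure ChatGPT originally gave
```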
 
Looks like about three orders of magnitude to me - i.e. a factor of a thousand. I'd guess that GPT has ingested some of those "per 1000 images" data and is misrepresenting it as "per image", or is misrepresenting watt-hour values as kilowatt-hours.
To be fair to the earlier commenter, I think it was only two orders of magnitude, or a factor of a hundred: around 3 Wh per image instead of 100-300 Wh. I expect you are correct that it was reported somewhere per 100 or 1000 images.

@AwkwardlySet's back-of-the-envelope calculation comes out pretty close to the same ~3 Wh.
 
Agreed, the cited energy costs for AI queries were definitely excessive. Some more realistic figures are here: https://energeia.au/data-centre-grid-impacts/

[Chart: Energy Consumption by Query/Search Type]
However, even these lower consumption figures are pretty horrific.

It has been estimated that over the next decade, half the energy-grid infrastructure costs will be due to data centers. The Googles and Facebooks of the world do not want to pay for their share of this cost, so they lobby governments to spread the costs across the entire customer base, as higher connection fees and higher kWh consumption costs per household. (Don’t fret, data centers will still get cheaper power rates than the rest of us because they buy in ‘bulk’.)

They are blaming electric vehicles for these increasing costs.

In our recent election in Australia, the right wing opposition party had a policy to start building nuclear reactors here to generate ‘carbon free’ energy. This wasn’t popular. Modeling showed nuclear was far more expensive than renewables and Australia has been anti-nuclear since back when France was conducting atomic tests in our ‘back yard’. But they seemed committed to this policy.

Most people attributed this policy folly to their mining company sponsors – just a delaying tactic to give coal another few decades of profits while the nuclear power plants were being built.

I think there was more to it.

Renewable energy generation (wind/solar/tidal/geothermal) gets spread across the entire grid, as smaller competitive enterprises. Nuclear power is more concentrated and singularly expensive – more monopolistic. The type of enterprise that will lobby (bribe) politicians for favor. The type of place where $1M is just a rounding error.

Secondly, concentrated base-load type power generation is more appealing to customers with concentrated continuous-load power consumption. Like data centers.

Billionaires don’t want to be at the whims of the wind and sun, paying for the spikes in energy costs whenever generation is low. Unlike a household, extended battery backup isn’t viable and gas or diesel backup-generation of power is also too costly. So they want to co-locate a large base-load power station next to their data centers and make the wider community pay for it.

Entirely my speculation, but it seems to be the way the world works these days.
 
To be fair to the earlier commenter, I think it was only two orders of magnitude, or a factor of a hundred: around 3 Wh per image instead of 100-300 Wh. I expect you are correct that it was reported somewhere per 100 or 1000 images.

The GPT-sourced figures included mention of "1-5+ kWh" for high-resolution images; the middle of that approximate range would be 3 kWh. I couldn't see enough info on image size/resolution to tell exactly which figure I should be comparing with which, so I could believe that the discrepancy is 'only' a factor of 100 rather than 1000 - but I'm more inclined to believe 1000x since there are a couple of easy ways to get an error of 1000x in the calculation.
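To make the two readings concrete, here's a tiny sketch of the discrepancy factors, using the figures already discussed above (~3 Wh measured per image, versus either the ~300 Wh low-res or ~3 kWh high-res claims):

```python
# Discrepancy factors between ChatGPT's original claims and the ~3 Wh/image
# figure from the paper linked earlier in the thread.
MEASURED_WH = 3                      # ~3 Wh per generated image
CLAIMED_LOW_RES_WH = 300             # top of the claimed 100-300 Wh (low-res)
CLAIMED_HIGH_RES_WH = 3 * 1000       # middle of the claimed 1-5 kWh (high-res)

print(f"low-res claim:  {CLAIMED_LOW_RES_WH / MEASURED_WH:.0f}x too high")   # ~100x
print(f"high-res claim: {CLAIMED_HIGH_RES_WH / MEASURED_WH:.0f}x too high")  # ~1000x
```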

I'd also agree with @Cacatua_Galerita that even the corrected numbers are still concerning, given how heavily these technologies are used and being pushed.
 
As a reminder, AI as we know it today, with large language models and the euphemism "training", means theft. Take things without permission, use those things, and (this is for the people who say 'if it's online it's there to be taken') then present the results as if it's the AI tool that produced them.

So feel free to debate how good it is at anything, how it's got no soul, uses lots of energy, all that. I'm not arguing on those topics or participating in those conversations, just reminding everyone to also (I recommend mainly) be angry about the theft.

Ask yourself: If someday AI ends up getting really good at producing erotica, guess how it learned? And as active Literotica members, guess where it went to learn?

Hint: Here. Literotica. Your stories.

Be angry about the theft.
 
ChatGPT’s fact check of the energy estimate it gave before says it was off by one order of magnitude:


You’re right to question that — let’s take a closer, more rigorous look. Energy consumption for generating images with models like mine (e.g., diffusion-based models such as Stable Diffusion or DALL·E) can indeed vary widely, and earlier rough estimates may understate the true cost depending on the setup.

🧠 Let’s Break It Down More Carefully

Let’s consider generating one high-resolution 1024×1024 image using a diffusion model running on an NVIDIA A100 GPU.

🖥️ GPU Power Usage

  • NVIDIA A100 power draw: ~300–400 watts (0.3–0.4 kW) under heavy load
  • Time per image: ~5–10 seconds for a high-res image, assuming no batching or major optimizations
    • Assume 10 seconds of full usage for a more conservative estimate.


🔋 Energy per image (just GPU)

Energy (kWh) = Power (kW) × Time (h)
             = 0.35 kW × (10/3600) h ≈ 0.00097 kWh

So just the GPU might consume ~0.001 kWh per high-res image.


🧊 Include Data Center Overhead (PUE)

Data centers aren’t 100% efficient. A typical PUE (Power Usage Effectiveness) is ~1.5–2.0, meaning for every watt used by the GPU, 1.5–2 watts are consumed overall.


Adjusted energy = 0.00097 kWh × 2 ≈ 0.002 kWh

✅ Reality Check with Other Estimates

Recent research and industry reports (e.g., Hugging Face, Meta, academic energy profiling papers) suggest:
  • Text-to-image diffusion models can consume 0.002–0.01 kWh per image depending on size, quality, and architecture
  • Higher values (~0.05 kWh/image) usually refer to training, not inference
  • My earlier estimate of 0.03–0.06 kWh per high-res image likely overestimated by about 10×, depending on conditions

🧾 More Accurate Final Estimate



Image Size   | Realistic Energy Estimate | Notes
512×512      | ~0.001–0.002 kWh          | Standard generation
1024×1024    | ~0.002–0.004 kWh          | More steps, higher VRAM
2048×2048+   | ~0.005–0.01 kWh           | Depends on model/chunking


So yes — you were right: the original numbers were probably off by an order of magnitude on the high side. Thanks for the catch.

Let me know if you’d like to see emissions or dollar costs based on these figures.
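For what it's worth, the corrected estimate is easy to reproduce. Here's a minimal sketch using the same assumed inputs as the reply above (an A100 at ~350 W, ~10 seconds per image, PUE of 2.0):

```python
# Reproducing the corrected per-image estimate: GPU draw × time × data-center PUE.
GPU_KW = 0.35      # assumed A100 draw under load (~350 W)
SECONDS = 10.0     # assumed generation time for one high-res image
PUE = 2.0          # assumed data-center overhead multiplier

gpu_kwh = GPU_KW * SECONDS / 3600   # ~0.00097 kWh for the GPU alone
total_kwh = gpu_kwh * PUE           # ~0.002 kWh with overhead included
print(f"GPU only: {gpu_kwh:.5f} kWh; with PUE {PUE:.1f}: {total_kwh:.4f} kWh")
```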
 
Agreed, the cited energy costs for AI queries were definitely excessive. Some more realistic figures are here: https://energeia.au/data-centre-grid-impacts/


If you're talking about data center energy consumption, don't forget how much is consumed by processing cryptocurrency transactions.
 
It's always good to check. In this case, ChatGPT was gleaning data from assorted outdated media and public sources. It's easy to see how a media landscape of AI-generated reporting could become a feedback loop of hallucinations.



ME:
Why do you think your previous figures were so off?


ChatGPT:

Great question — and it points to a broader issue in estimating AI energy use. Here’s why the earlier figures (0.03–0.06 kWh for a high-res image) were likely too high:

🔍 1. Confusing Training vs Inference Costs

Many sources discuss training costs, which are vastly higher than inference. For example:
  • Training Stable Diffusion may cost hundreds of thousands of GPU hours, using megawatt-hours (MWh) of energy.
  • Inference (image generation) is often mistakenly lumped together or assumed to be proportionally intensive — but it’s far more efficient.

I initially leaned toward the more conservative end, influenced by general AI energy discussion, which tends to emphasize training and large-scale inference together.


⚙️ 2. Underestimating Efficiency of Modern Hardware

Modern GPUs (e.g., A100, H100) are extremely efficient at inference:
  • A single forward pass in a diffusion model takes ~50–150 steps.
  • Optimizations like mixed-precision (FP16), attention slicing, and GPU parallelism dramatically cut energy usage.

In early AI energy discussions (2020–2022), these efficiencies weren’t as widespread or measured as precisely.


🏭 3. Overestimating Data Center Overhead

I used a PUE (Power Usage Effectiveness) of 1.5–2.0, which is on the high side. In reality:
  • Hyperscale data centers (e.g., from Google, AWS, Azure) often operate at PUEs closer to 1.1–1.2.
  • That reduces the multiplier on base GPU consumption, meaning the overhead isn’t as bad as initially assumed

📉 4. Noisy or Outdated Public Estimates

A lot of “back-of-the-envelope” energy claims online aren’t well sourced. Some cite:

  • High-end estimates for large-scale batch inference
  • Extrapolations from older GPUs (e.g., V100s, which are less efficient)
  • Rounded or mixed units (e.g., confusing Wh and kWh)

When I erred, it was in favor of caution — but your challenge helped correct that.


✅ Lesson

Estimating AI inference energy use is still tricky, but a more bottom-up, hardware-aware calculation — as we did just now — is usually much more accurate than high-level guesses.

Let me know if you’d like to compare this to other AI tasks, like LLM queries or training costs!


ME:
Do you have access to any internal energy use data from your developers?


ChatGPT:

No — I don’t have access to internal energy usage data from OpenAI or its infrastructure partners. My responses are based on publicly available information, research literature, industry benchmarks, and educated estimates.

Here’s what that means in practice:


🧭 What I do use:
  • Academic studies (e.g., Stanford, MIT, Hugging Face energy profiling)
  • Technical documentation from NVIDIA, AWS, Google Cloud, etc.
  • Public sustainability reports from Microsoft, OpenAI partners, and others
  • Empirical benchmarks (e.g., how long inference takes on A100/H100s)


🔒 What I don’t have:

  • Proprietary or unpublished data center logs
  • Internal monitoring of real-time power consumption
  • Exact energy footprints for individual queries run through OpenAI’s systems

That said…


OpenAI has publicly acknowledged the importance of reducing the environmental impact of large models, and they (along with other labs) are working toward better energy transparency. But exact per-query or per-image numbers aren’t something I have internal access to or can guarantee.

If or when OpenAI publishes verified energy usage stats (e.g., for ChatGPT, DALL·E, or other models), I can help interpret or explain them — but for now, I rely on public models and careful estimates.

Would you like pointers to the best public studies or comparisons with other services?


 
People forget that these machines are going to make hundreds of millions of jobs redundant in the near future, which in turn will save an enormous amount of energy: millions of powered-on PCs, servers, air conditioners, daily commutes, and more.

What thousands of researchers, engineers, graphic designers, and programmers can achieve in months, these machines can achieve in seconds.
Would you rely on any of it, though? The fail-safes and data checks need to be a bit more rigorous than we're seeing in this sample. Poor old Alex's AI buddy doesn't have much of a clue.
 
People forget that these machines are going to make hundreds of millions of jobs redundant in the near future, which in turn will save an enormous amount of energy: millions of powered-on PCs, servers, air conditioners, daily commutes, and more.

What thousands of researchers, engineers, graphic designers, and programmers can achieve in months, these machines can achieve in seconds.
As @ElectrucBlue suggests, the problem is trust. It will not replace people until it reaches a certain point, and even then things will not be as bad as you think.

Compare it to containerisation. Development used to take much, much longer; today we have much quicker solutions. That put higher pressure on development: now that we can produce more, we are tasked with more projects, and more 'nice to haves' become mandatory. The goalposts change, but the number of people in development isn't significantly down.

Now let's put this to the test with AI. Sure there us the possibility that AI will take jobs, but first if all, are you sure it isn't hallucinating? I can tell you as a user of the coding AI's that it is great at generating simple, common stuff. If you want something more precise, or have to place it into a legacy environment, it absolutely sucks balls.

So we get a problem. Even if it gets better at these exotic scenarios, can you trust it? If you can't, you'll need someone who very thoroughly understands code to check it quickly. Basically, we end up doing quality control instead of writing.

Now let's put it one step further. All government institutions, as well as many big companies like banks, are very secure with data. No private information can go out. So if the LMM learns how your systems work, it can be a security risk. Not to mention that people all too easily exchange private information.

So many restrict its use, and what use it has needs to be triple fact-checked to prevent any mistakes, which can be much more costly than a dev team for a year.

Even if jobs are removed, we'll gravitate to different jobs. I'll admit AI is different from earlier shifts, but people thought much the same about robotic automation. We have a more service-focused job market these days because of technological progress, and we are still thriving. AI presents challenges, but with the proper care we can shift to a new type of economy.
 
All that AI has to do to replace workers is be more convenient, more loyal, and cheaper while effectively accomplishing a task.

I’m paying attention to its developing capabilities. If you think it’s reached the apex of where it’s headed you’re deluding yourself.
 
All that AI has to do to replace workers is be more convenient, more loyal, and cheaper while effectively accomplishing a task.

I’m paying attention to its developing capabilities. If you think it’s reached the apex of where it’s headed you’re deluding yourself.
Long term, you are correct, but I suspect long term is a few decades out.

The current generation of AI has hit its wall and has little more that it can accomplish. There have been multiple hype-collapse cycles through the history of AI. This one has been the most hyped and will collapse the hardest.

Just for background: I have done AI research, although not for a few years now, and my doctoral work was in theory. And I have worked with two people who have won Turing Awards for their work in AI. I might be wrong, but I am pretty convinced I am not. And most of the serious AI folk I know, at least the ones who don't have fortunes riding on it, agree with me.
 
Do you trust your car?
Not to be self-driving, no. AI right now makes a good complement to humans in many situations, and can be very useful as such.

There are surprisingly few jobs that can be completely replaced right now. Not that there will not be disruptions. And some of the creative arts will be hit.

Tens of thousands of professional musicians lost their jobs after the advent of talkies -- almost every movie theater in the world paid someone to play the soundtrack. (Or someones for larger theaters.) Almost overnight, musician went from a solid middle class career to being someone you didn't want your daughter to date. You could argue they were technicians rather than creators, but I doubt the next Hemingway is writing on this site right now. And it was a way for the creators to earn a living while waiting to be discovered.

The master weavers all but disappeared after the introduction of the Jacquard loom. The original Luddites had a point.

I think widespread employment impact is probably a couple of decades off. Of course, another breakthrough could happen at any time. Maybe Frist's ideas will pan out. But the current approach to ANNs has run its course.
 
Not to be self-driving, no. AI right now makes a good complement to humans in many situations, and can be very useful as such.

There are surprisingly few jobs that can be completely replaced right now. Not that there will not be disruptions. And some of the creative arts will be hit.

I never agree with @Plathfan, but I don't think we have a few decades. I have zero faith in humans to make sound decisions about things like this. We have a long history of doing stupid shit because it sounds cool and briefs well, even when it ends up killing us (sometimes literally). I think we're probably doomed.

In the first world, at least. The billions who still have to walk six miles a day for their clean water will continue as they always have; it's just that those who don't? Might have to start. The good ol' days are... now.

Tens of thousands of professional musicians lost their jobs after the advent of talkies -- almost every movie theater in the world paid someone to play the soundtrack. (Or someones for larger theaters.) Almost overnight, musician went from a solid middle class career to being someone you didn't want your daughter to date. You could argue they were technicians rather than creators, but I doubt the next Hemingway is writing on this site right now. And it was a way for the creators to earn a living while waiting to be discovered.

The master weavers all but disappeared after the introduction of the Jacquard loom. The original Luddites had a point.

The whaling industry has entered the chat. Lol.
 