
I've never been so conflicted about a technology

roywiggins

> Models require massive amounts of electricity and water (to cool servers).

Do they?

https://prospect.org/environment/2024-09-27-water-not-the-pr...

> training GPT-3 used as much water as just under twice what is required for the average American’s beef consumption. In other words, just two people swearing off beef would more than compensate.

https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...

> It would be a sad meaningless distraction for people who care about the climate to freak out about how often they use Google search. Imagine what your reaction would be to someone telling you they did ten Google searches. You should have the same reaction to someone telling you they prompted ChatGPT.

Aurornis

I agree with your general point, but water usage numbers for farming animals are notorious for being exaggerated. There are studies that take all of the rainfall multiplied by all of the grazing land the cows are allowed on and include that as “water used”.

Water usage is also a nuanced concept because water isn’t destroyed when we use it. You have to consider the source and incidentals of processing it, as well as what alternatives it’s taking away from. None of this fits into convenient quotes though.

I think energy usage is a much better metric because we can quantify the impacts of energy usage by considering the blended sum of the sources going into the grid.

Borg3

Of course water is not destroyed. It gets contaminated, and to clean it up you need energy. Basically, we can simplify the whole equation to a single variable: energy. Whatever you do, it needs energy. And people take that problem, and waste management, far too lightly. If we really wanted to recycle everything properly, I suspect global energy usage would go up at least tenfold. For now, it's just better (I mean more economical) to "store" that waste somewhere.

bryanlarsen

Not in a data center. In a data center the water is either re-used (closed loop) or it's evaporated (open loop). In either case the water does not need decontamination.

rdiddly

I must be missing something because water seems like the least of our worries when compared to the energy being used and the carbon dioxide being produced. I know I know, still talking about CO2 at parties instead of the hot new water thing? BOOOO-RINGGGG!!!

The whole issue of "using" water is meaningless to me in the context of the water cycle. Does a data center "use" water? Whatever water evaporates from their cooling systems falls again as rain and becomes someone else's water. Same with farming - it all either evaporates (sometimes frustratingly right from the field it was applied to, or otherwise from the surface of the river it eventually runs into, or from the food you bite into, or from your sweat, or from your excretions/the sewage system/rivers again), or ends up supplementing a (typically badly depleted) aquifer, or gets temporarily used by animals including humans to e.g. hydrolyze fats (but full completion of metabolism of said fats & fatty acids actually returns MORE water on a net basis than it took to metabolize them) and so on.

In short, water is never passing into anything or anyone. It's passing through it. You don't own it, you're just borrowing it.

Even water recirculated as a coolant in a data center (the closest thing to actually "using" water) is a finite quantity, needed only one time, with maybe small top-ups due to losses, all of which end up, you guessed it, evaporating into the commons.

roywiggins

Water use can have big, localized problems, but that's mostly solvable by just putting the datacenters somewhere else.

Some places rely on deep aquifers that don't refill within human lifetimes, essentially fossil water. But that's mostly a local problem, and we should just stop building water-hungry industry and agriculture in those places because it's stupid.

perrygeo

I don't understand the "whataboutism" angle. Running LLMs requires massive amounts of electricity and water. Period.

That beef consumption and other activities also require massive amounts of resources is independent. Both can be true. And both need to be addressed. We no longer have the luxury to pick and choose which climate interventions we find most convenient; we need to succeed on all fronts. If the patient is dehydrated and bleeding, do doctors sit around debating if they should give them water or gauze? No, they do both immediately.

btilly

According to https://www.theverge.com/24066646/ai-electricity-energy-watt... a reasonable estimate is half a percent of global energy usage by 2027. On the one hand, this is a lot of energy. On the other, that's in the same general neighborhood as what we could save by maintaining proper tire pressure on our cars. (See https://www.factcheck.org/2008/08/the-truth-about-tire-press... for more about that.)

perrygeo

I get that. I just don't get why you frame it as an either/or question as though they were mutually exclusive. We can reduce data center emissions and improve tire maintenance. ¿Por qué no los dos? (Why not both?)

TheNewsIsHere

I used to live in a place where there are a lot of data centers now. There were only a few just a few years ago. Now there are tens and more under construction.

I returned to that place to visit family and friends last year. It was an eye-opening experience. The people who live there have taken a keen interest in curtailing further development of any new data centers. One of the chief complaints is the constant power issues that didn't exist before 2021-2022.

The locals argue that this is the result of the dramatic infusion of AI into every technology product. They're likely not wrong. The communities in the area became quite politically active over the issue and have retained all manner of analysts and scientists, journalists and investigators, and so on, to aid them in making a political case against future data center development.

The thing that got me was the complaints about the noise. Over the past few years the locals and those in their employ have been monitoring noise levels near the data centers and they’ve tracked sustained increases in noise pollution with the timeline over which the use of AI has exploded. There it has become sort of a trope to measure the working day by how much noise pollution is produced by any data center nearby. Mostly belonging to a hyperscaler known by an acronym.

The data centers have reportedly not been the boon that was promised, which locals see as adding insult to injury. The area already has a vibrant tech scene independent of data center operations. So the locals don't really see value in allowing more data centers to be built, and they're starting to organize politically around the idea of preventing future data center construction and implementing heavy usage-based taxation on utilities used by the existing ones.

teruakohatu

> massive amounts of electricity and water ... we need to succeed on all fronts

Datacenters used for training or batch jobs can be placed where water and power are plentiful, ambient temperatures are relatively low, and government/society are stable.

A datacenter is going to be built at the bottom of New Zealand where all these things are true. There must be plenty of places in the world where this holds true.

At least for training, I think it is possible to have our cake and eat it. For real-time inference, probably not.

distances

> A datacenter is going to be built at the bottom of New Zealand where all these things are true. There must be plenty of places in the world where this holds true.

There are many datacenters being built in Finland. Water is not running out, and Finland has some of the cheapest electricity in Europe thanks to plentiful wind power.

int_19h

Given typical inference speeds for SOTA models, I rather suspect that you could also do it for inference without latency being particularly notable.

We might need to lay a few more cables to New Zealand though.

metalcrow

They would if the patient were alive and fighting them! In our case, the patient is society, and it is definitely not unified behind the idea that we cannot pick and choose. If the doctors want to save their patient, they either need to tie them down and do what is needed, or work within the confines of what the patient allows while trying to convince them to allow more.

dgb23

The patient is kicking and screaming, because they are an addict.

Regardless, I’m not convinced by the parent comment. It basically suggests we solve this without analysis of costs and effects. Sounds insane to me.

Especially since the most promising solutions come from rapidly developing technologies and research. Turning off the lights just sends us backwards.

roywiggins

Okay, but there is some reason why you personally are still spending electrons posting on the internet and not, presumably, entertaining yourself by rolling coal, and it's probably because the amount of energy you're using to browse and post on HN seems trivial to you. Does the energy use differential between Python and C++ seem important enough to always pick C++ for everything?

What I'm saying is that casual LLM use is also essentially trivial, and scolding people for querying ChatGPT when you wouldn't scold them for, say, making a dozen Google searches or watching a Netflix show or taking a long shower or driving to the mall is quite silly and a waste of time.

Similarly, there's a reason doctors will tell you to worry more about your cigarette habit than almost any other bad habit you have: it's so much more likely to kill you that it practically demands to be prioritized.

lolinder

Whataboutism is bad, but there's a distinction between whataboutism and making comparisons to get a sense of the true scale of the problem. If you're not allowed to ever compare relative scales of things without being accused of whataboutism then we're paving the way to making every molehill a mountain.

Two average Americans' worth of beef consumption for training GPT-3 would place the water cost of training GPT-3 at 1/150,000,000 of the US beef industry's average water consumption. That means it's a rounding error relative to all the other uses we put water to, and pointing that out is not whataboutism, it's putting the problem in its proper scale.
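The fraction is simple to sanity-check from the thread's own figures (illustrative assumptions only: training water roughly equals two Americans' beef water, and roughly 300 million Americans' worth of beef consumption):

```python
# Sanity check of the 1/150,000,000 figure using the thread's own
# assumptions (illustrative numbers, not a rigorous estimate).
people_equivalent = 2            # GPT-3 training water ~ two Americans' beef water
us_beef_consumers = 300_000_000  # rough US population

fraction = people_equivalent / us_beef_consumers
assert abs(fraction - 1 / 150_000_000) < 1e-15
print(f"{fraction:.2e}")  # ~6.67e-09 of US beef water use
```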

roywiggins

if a few thousand westerners who want to atone for LLM water use skipped a few burgers, the planet would come out ahead on net I think

clbrmbr

Naive question: why doesn’t the market regulate electricity consumption?

I’ve heard the answers pre-AI, but I wonder how this new general purpose use of electricity changes the calculus?

joshjob42

We have a constraint, say keeping warming below 1.5 or 2C, and we want to achieve it with as little pain as possible.

You use a laptop, you probably play some video games, you see movies, you probably eat meat, you probably drive a car around town instead of a velomobile.

So the question is, how much energy is used by AI and what does it get us? Google's latest TPUs consume ~250W per fp8-petaflop at the data-center level (ie including cooling etc, according to their recent presentations). Even assuming 50% utilization which is a bit poor, that's say 0.5kW per fp8-petaflop, or ~1kW per 10^15 dense-parameter-equivalent tokens per second. So using a huge model, say 10T dense-equivalent parameters, and doing inference at the rate of o3-mini or Gemini 2.5 Pro (around 150 tokens a second) consumes 1.5kW during generation time. But maybe a better way to think about it is just that you'd be using 10J of energy per token, or ~15J of energy per word.

In that context then, generating a book the length of Fellowship of the Ring (~180k words) would consume ~0.75kWh of energy. That's about like playing a video game on a PS5 for ~3 hours. In the US, that's on average ~270g of CO2, like driving ~1.5 miles or so in a Prius. It'd be about like riding ~30 miles on an e-bike.
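The chain of arithmetic above can be reproduced in a few lines (a sketch using the comment's own assumed figures; the TPU power, utilization, model size, and token rate are all estimates, not measurements):

```python
# Reproducing the back-of-the-envelope numbers above (all inputs are
# the comment's assumptions, not measured values).
kw_per_pflop = 0.5          # ~250 W/petaflop, doubled for ~50% utilization
dense_params = 10e12        # assumed 10T dense-equivalent parameters
flops_per_token = 2 * dense_params  # ~2 FLOPs per parameter per token
tokens_per_sec = 150        # roughly o3-mini / Gemini 2.5 Pro speed

power_kw = kw_per_pflop * (flops_per_token * tokens_per_sec) / 1e15
joules_per_token = power_kw * 1000 / tokens_per_sec   # W / (tok/s) = J/token
joules_per_word = joules_per_token * 1.5              # ~1.5 tokens per word

book_kwh = joules_per_word * 180_000 / 3.6e6          # Fellowship-length text
print(power_kw, joules_per_token, book_kwh)  # 1.5 10.0 0.75
```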

Another way to think of it might be that if you are texting with a friend, wherein let's say you both aggressively type at ~120wpm, you'd be using ~15W of power to chat with an AI at a similar pace, about the power draw of the MacBook Air your friend would be typing on. That's around 0.5-0.7 miles on an e-bike per hour of chatting.

So even compared to your typical leisure activities, chatting with an AI is comparable in resource use to whatever else you'd most likely be doing in that time. And of course, a human in the US generates on average ~1.5kg of CO2 per hour they're alive. In Bolivia it's ~150g/hr, and in Sudan ~50g/hr or so. So if you're a company and you want to be climate conscious, merely hiring a subsistence farmer for a job such that their standard of living rises enough to roughly match the emissions of the average Bolivian would mean they emit more carbon in a typical day than running an AI to produce as many words as are in the entire Game of Thrones book series to date. And if you were going to help someone immigrate from Bolivia to the US, you could have the AI write ~10 copies of A Song of Ice and Fire a day for the same net CO2 output.

Not to say that we shouldn't do those things because of climate or whatever, but I'm just saying that the objective energy and climate impact of using these models for things is small compared to basically any alternative for completing a task and most entertainment activities people do (movie theaters for instance use ~7kW or so while running, so in the time you spend watching the Fellowship of the Ring in a theater, an AI could write a text as long as FOTR 28 times).

burningion

I know everyone likes to abstract away the costs associated with AI data centers.

But let's look at what has happened with Grok, for example:

From May 6, 2025

https://www.yahoo.com/news/elon-musk-xai-memphis-35-14321739...

>The company has no Clean Air Act permits.

> In just 11 months since the company arrived in Memphis, xAI has become one of Shelby County's largest emitters of smog-producing nitrogen oxides, according to calculations by environmental groups whose data has been reviewed by POLITICO's E&E News. The plant is in an area whose air is already considered unhealthy due to smog.

> The turbines spew nitrogen oxides, also known as NOx, at an estimated rate of 1,200 to 2,000 tons a year — far more than the gas-fired power plant across the street or the oil refinery down the road.

The details are in the specifics here. People are _already_ feeling the effects of the AI race, the consequences just aren't evenly distributed.

And if we look at the "clean" nuclear deals to power these data centers:

https://www.reuters.com/business/energy/us-regulators-reject...

> The Talen agreement, however, would divert large amounts of power currently supplying the regional grid, which FERC said raised concerns about how that loss of supply would affect power bills and reliability. It was also unclear how transmission and distribution upgrades would be paid for.

The scale of environmental / social impacts comes down to how aggressive the AI race gets.

chasd00

yeah, running GPUs in a datacenter is such a strange thing to fixate on with respect to the overall health of the entire planet. I don't recall it being discussed when the whole "move to cloud" thing started and AWS (+ others) came online, surely those have a larger impact.

roywiggins

you could probably save energy by scrapping Python in favor of C++ everywhere but people don't seem too worried about that either

BeetleB

Is training the bigger producer of CO2 or inference for the whole world using it? I suspect that in the long run, it is the latter.

jebarker

> “If you’re going to use generative tools powered by large language models, don’t pretend you don’t know how your sausage is made.”

This isn't a very hopeful quote given how many people continue to eat sausages even though we all know how sausages are made.

staunton

> we all know how sausages are made.

I'm doubtful. There's a few documentaries showing the process and I've had people tell me multiple times that they were genuinely shocked by it. I'm assuming those people "knew" and it's just that knowing "it's all the waste at the butcher getting stuffed into guts" is not quite the same as seeing it first hand.

jfengel

I'm not sure what documentaries you're watching.

Natural sausage casings are specialty items. If you're buying it at the grocery store, it's probably collagen (closely related to gelatin).

And it's not "all the waste". It includes fatty cuts that people wouldn't want to eat whole, but it doesn't include organ meats outside of specialty items.

Perhaps people find meat-grinding distressing, though it's really not all that different from ground beef. The emulsified filling of hot dogs and bologna looks odd, but the ingredients are inoffensive.

I'm less disturbed by sausage-making than by the slaughter and prime butchering of animals. They're no less dead and dismembered if you're eating a steak or pot roast. I'd rather we at least make use of all of the other parts.

staunton

Indeed, the images I remember most vividly myself (though I wouldn't say they shocked me at any point) are the guts being emptied. So it was some more traditional process for specialty sausages, not a huge factory.

Then again, some people find seeing a huge industrial room full of raw meat distressing, perhaps somewhat analogously to how they might not be afraid of one spider but suddenly panic upon seeing hundreds in one spot.

smitty1e

The assumption that disdain toward sausage is straightforward and natural deserves a second look.

Hunger is an unforgiving teacher, and the reality that our forebears grew expert at squeezing every last calorie out of everything in sight might afford a lesson.

jebarker

I wasn't even really thinking of the actual sausage stuffing process so much as how the meat gets to the butcher (which is really a meat grinding factory in most cases) in the first place.

mvdtnz

People get upset because we use every part of an animal for food? I think it's absolutely great. I am a meat-eater who doesn't feel ethically great about it, but I'd feel much worse if I knew a large percentage of a carcass went to waste.

raincole

And what happened after they told you they were shocked? Did they all become vegan once and for all?

staunton

Most just expressed their shock (which didn't seem staged) but changed nothing. One person became vegetarian for about a year, then started to eat meat again. I'm not sure if they still eat sausages.

bdangubic

and why focus on LLMs... there is a shitton of other things that use power/resources/... if we are going to start worrying here, we should consistently apply this across everything...

sundaeofshock

Because LLMs are not yet deeply embedded in all aspects of our society. It is very difficult to reduce existing sources of carbon that society depends on (e.g. cars); stopping the widespread use of LLMs before they become entrenched is far easier.

bdangubic

facts! but did we have these conversations about other things in these terms before they got deeply embedded in all aspects of society?

teruakohatu

I feel the points made in this article have been debated many times on HN.

The power usage attributable to developers, as opposed to consumers, is probably insignificant compared to the inefficient use of compute resources by deployed software today.

Arguably, LLMs are enabling some web developers to create native applications rather than Electron monstrosities, saving many P-core CPU cycles multiplied by the number of their users.

Optimising server applications with an LLM could eliminate unnecessary cloud servers.

Of course all the above could be done without LLMs, but LLMs can empower people to do this kind of work when they were not able before.

breuleux

> Arguably LLMs are probably enabling some web developers to create native applications rather than Electron monstrosities saving many P-core CPU cycles multiplied by the number of their users.

Does any such project exist, or are developers using LLMs to help them develop new Electron monstrosities? I think that if a developer has the sensibility to want to develop a native app, they will do so regardless, it's not that much harder.

dgb23

The main point being: LLMs cost, but do something useful. Building slow applications just adds cost, but the utility is much more vague.

BeetleB

> The argument about power usage for developers, as opposed to consumers,

Except with MCP, essentially most consumers will become "programmers". See my other comment for the rationale (https://news.ycombinator.com/item?id=43997227).

(TLDR: MCP lets non-programmers convert their prompts into programs, for all practical purposes. Currently there is a barrier to entry to automate simple tasks: The need to learn programming. That barrier will go away)

teruakohatu

> Except with MCP, essentially most consumers will become "programmers"

You are more optimistic than I am. Most people I have seen use LLMs at best as an alternative to Grammarly or for document/web summaries, and at worst to make decisions based on outdated LLM advice or to treat them as an inaccurate fact engine.

The average person could code using Excel, but most don't even if they know how to use IF() and VLOOKUP().

BeetleB

That's because most people don't have access to MCPs. It will take 1-2 years to hit critical mass - once it's easy to plug in to ChatGPT and once major companies (e.g. Google for Gmail) provide easy to configure MCP servers.

> The average person could code using Excel, but most don't even if they know how to use IF() and VLOOKUP().

Using Excel (even without IF) is way more complicated than what I am saying. MCPs will enable people to program with natural language. It's not like vibe coding where the natural language will produce code we'll run. The prompt will be the program. You need to put in a lot more effort to learn the basics of Excel.

elpocko

I would like to see a comparison of the amount of electricity used to run AI with the total electricity used to run computer games. I'm quite sure we're wasting much more resources on computer games, just for entertainment.

acomjean

I think AI is interesting, but I wonder what it will do to progress when used very extensively.

Trained on all the things/ideas of the past, creating a larger barrier to entry for new ideas.

Thinking about new computer languages, who would use one that AI couldn’t help you code in?

blizdiddy

When chatGPT came out, i thought we would stagnate with new languages, but not anymore. Now we have small models that can be fine tuned for less than $20 and we have huge models that can learn a new programming language in-context if you provide the docs.

furbolapp

I think it's the other way around: it won't create barriers; instead it will focus on detecting/recognizing/acknowledging originality and new ideas.

computerex

> AI and Crypto are net negatives in this regard.

Not sure how anyone can with certainty say that AI will be a net negative in the long run for climate change. Logic I think says the opposite.

candiddevmike

How can AI be a net positive for climate change? We know what needs to happen to fix it, how would AI telling us the same thing "hit differently"?

philipkglass

The cost of energy generated from burning fossil fuels is dominated by the cost of the fuel itself; the power plant costs much less. Non-combustion energy is the opposite: fuel costs are tiny or zero, so the construction cost for the initial plant is much more important.

If you generate a terawatt hour of electricity with natural gas, most of the cost will be from the fuel. A nuclear plant will have a tiny fraction of the cost come from fuel for the same amount of energy. A solar farm will have none of the cost come from fuel.

If AI lowers construction costs, it will improve the relative economics of non-fossil energy compared to fossil energy. A natural gas plant constructed at half the cost will have its final energy cost decrease just a little whereas a half-as-expensive solar farm will have its final energy cost decrease nearly by half. Making clean energy cheaper than fossils means that it will out-compete dirty energy even in locations where there are no explicit policies to reduce CO2 emissions.
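A toy levelized-cost calculation makes the asymmetry concrete (the dollar figures below are invented for illustration; real LCOE models also discount cash flows, include O&M, and so on):

```python
def lcoe(capex_per_mwh: float, fuel_per_mwh: float) -> float:
    """Toy levelized cost of energy: amortized construction plus fuel, per MWh."""
    return capex_per_mwh + fuel_per_mwh

# Illustrative splits: gas cost is fuel-dominated, solar is capex-dominated.
gas = lcoe(capex_per_mwh=15, fuel_per_mwh=45)       # 60 $/MWh
solar = lcoe(capex_per_mwh=40, fuel_per_mwh=0)      # 40 $/MWh

# Halve construction costs for both technologies:
gas_cheap = lcoe(capex_per_mwh=7.5, fuel_per_mwh=45)   # 52.5 $/MWh, ~12% drop
solar_cheap = lcoe(capex_per_mwh=20, fuel_per_mwh=0)   # 20 $/MWh,   50% drop

print(1 - gas_cheap / gas, 1 - solar_cheap / solar)  # 0.125 0.5
```

The same capex cut barely moves the fuel-dominated technology but nearly halves the capex-dominated one.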

You can see the effects on pricing advantage with this interactive simulation of electricity supply in the United States. If you cut the overnight construction cost in half for all generating technologies, solar and wind dominate the country:

https://calculators.energy.utexas.edu/lcoe_map/#/county/tech

Some example modeling of gas/solar electricity economics in the United Kingdom here:

https://electrotechrevolution.substack.com/cp/160279905

Companies have already started using robotics and AI to construct solar farms faster and at lower cost:

https://www.aes.com/press-release/AES-Launches-First-AI-Enab...

https://cleantechnica.com/2025/02/27/leaptings-ai-powered-ro...

https://www.renewableenergyworld.com/solar/cool-solar-tech-w...

senordevnyc

Maybe AI will help us actually create or implement something that fixes it, instead of idealistic approaches that are never going to happen?

raincole

I think it's just most programmers finally got the reality check: we're menial laborers, not superstars.

MaxGripe

Plz don’t be conflicted. The Earth has an unlimited source of energy, the Sun, and water cannot be “saved” because our planet is a closed cycle.

ChrisMarshallNY

> Sources used to train models are kept secret.

I've been using Perplexity, and it annotates its recommendations, as to sources. Maybe it isn't very complete, though.

BeetleB

The environmental cost can't be overstated.

When I'm coding with AI, it's more convenient for me to type a prompt to change the name of a variable. This involves sending a request to the LLM provider, doing very expensive computations, and then doing a buggy job of it, even though my IDE can do it for what, 0.1% of the energy usage? Or even less? Try running an LLM locally on a CPU and you'll get a glimpse of how much energy this simple task uses.

But coding with AI isn't that huge. What will become huge this year or next is MCP. It will bring "programming" to the masses, all of whom will do stupid queries like the above.

Consider this: I wrote an MCP server to fetch the weather forecast, and have separate tools to get a broad forecast, an hourly forecast, etc. I often want to check things like "I'm thinking of going to X tomorrow. It will be a bummer if it's cloudy. Which hours tomorrow have less than 50% cloud cover?" I could go to a weather web site, but that's more effort (lots of clicks to get to this detail). Way easier if I have a prompt ready.

OK - that doesn't sound too bad. Now let's say I want to do this check daily. What you have to realize is that with MCP the prompt above is as good as a program! It's trivial for an average non-programmer to write that prompt and put it as part of a cron job, and have the LLM email/text you when the weather hits a predefined criteria.
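A sketch of what "the prompt is the program" might look like in practice (`run_prompt` is a hypothetical stand-in for whatever MCP-enabled client actually handles the call; the cron line and the sentinel convention are likewise just illustrative):

```python
#!/usr/bin/env python3
# Run daily from cron, e.g.:  0 7 * * *  python3 cloud_check.py
# cron mails anything printed to stdout to the user by default.

PROMPT = (
    "I'm thinking of going out tomorrow. Using the hourly forecast tool, "
    "list which hours tomorrow have less than 50% cloud cover. "
    "If none do, reply exactly: NO GOOD HOURS"
)

def run_prompt(prompt: str) -> str:
    """Hypothetical: send the prompt to an MCP-enabled LLM and return its reply."""
    raise NotImplementedError("wire in your LLM/MCP client here")

def should_notify(reply: str) -> bool:
    """Email the user unless the model returned the agreed-on sentinel."""
    return "NO GOOD HOURS" not in reply

def main() -> None:
    reply = run_prompt(PROMPT)
    if should_notify(reply):
        print(reply)

# In real use, cron invokes this script and main() runs at the bottom.
```

The only "code" a user writes here is the prompt; everything else is boilerplate a tool could generate.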

Consider emails. I sign up for deals from a retailer.[1] Now deals from them are a dime a dozen so I've been programmed to ignore those emails. But now with MCP, I can set a simple rule: Any email from that retailer goes to the LLM, and I've written a "program" that loosely describes what I think is a great deal, and let the LLM decide if it should notify me.

Everyone will do this - no programming required! That prompt + cron is the program.

Compared to traditional programming, this produces 100-1000x more CO2 emissions. And because there is no barrier to entry, easily 1000x more people will be doing programming than are doing now. So it's almost a millionfold in CO2 emissions for tasks like these.

[1] OK, I don't do it, but most people do.

lordnacho

I think it's harsh to start the accounting so soon after the breakthrough.

Yes, of course AI uses a lot of energy. But we have to give it a bit of time to see if there are benefits that come with this cost. I think there will be. Whether the tradeoff was worthwhile, I think we are not even close to being able to conclude.

Something like social media, which has a good long while behind it, I could accept if you started to close the book on the plus-minus.

jebarker

You can't have it both ways, i.e. deploy the technology as rapidly and widely as possible but then say you have to wait to do any accounting of the pros/cons.

breuleux

You have to start the accounting immediately, because once a technology becomes entrenched, it's very difficult to backtrack. If it spins out of control and half of all energy is sunk into AI because half of services depend on it and workforces have been laid off, you won't be able to do anything about it without disrupting half of services. The window of opportunity to keep a technology in check, unfortunately, often occurs before the problems it causes become obvious.

candiddevmike

Same arguments folks had with cryptocurrencies.

olalonde

I guess it makes sense if you're genuinely worried about an imminent climate crisis. I am just not...