Hacker News

bee_rider · today at 6:58 PM · 11 replies

I don’t really get the water concerns in datacenter cooling. Even if a lot of water was used for cooling with every prompt (which he argues against here, but, even if)… water “used up” by cooling just comes out a little hotter, right? Maybe evaporated. Then it’ll come back in the form of rain. This isn’t an industrial chemistry process that leaves some toxic waste in the water. Or an agricultural one that puts water in plants and then ships it off to some other region. It just becomes another path through the water cycle.

I actually don’t get how this can be a real thing that people are worried about. Is there some astroturfing behind this? Maybe an attempt to make environmentalists and AI skeptics look stupid?


Replies

loeg · today at 7:00 PM

The absolute strongest complaint is that DCs consume treated, potable water, which is less abundant / easily re-created than any old non-potable source. (Of course the easy solution here is DCs just ingest / treat their own non-potable source. Or utilities charge rates sufficient to price in the externality of drawing down more potable water. The economics still work for DCs if they need to treat their own water -- the fundamental problem is that utilities are underpricing their potable water, so DCs prefer it all else being equal.)

bronson · today at 7:03 PM

Because they're taking water from already parched regions, often pumping it out of the ground. Even if the water did come back locally as rain (it doesn't), it still makes it impossible for people to live off the same aquifers and water sources sustainably.

pier25 · today at 7:02 PM

Just 30 minutes from where I live, data centers are having an impact on water used for farming.

https://www.theguardian.com/global-development/2024/sep/25/m...

https://www.bbc.com/news/articles/cx2ngz7ep1eo

traderj0e · today at 7:09 PM

It doesn't come out a little hotter, it gets evaporated in cooling towers. Same result as any other water usage. Cooling towers can't use seawater either. Most datacenters are in places where fresh water is abundant anyway, but some are not.

Anyway, agricultural water usage is way worse in California.

echoangle · today at 7:01 PM

The water isn’t gone but if it comes back as rain, it at least has to be cleaned again, since data centers probably don’t use raw rainwater for cooling.

It’s probably still not too bad, but there’s at least some work done that’s "used up" by letting tap water (or, more likely, demineralized water used for cooling) evaporate.
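As a rough sketch of how much water that evaporation actually consumes: dividing the heat rejected by water's latent heat of vaporization gives liters evaporated per kWh. This assumes all heat leaves the cooling tower via evaporation (real towers also reject some heat sensibly, so it's a slight overestimate); the constants are standard physical values, not figures from this thread.

```python
# Back-of-envelope: water evaporated per kWh of heat rejected
# by an evaporative cooling tower.
# Assumption: all heat is removed via latent heat of evaporation.

LATENT_HEAT_MJ_PER_KG = 2.45   # water near 20 C
MJ_PER_KWH = 3.6               # 1 kWh = 3.6 MJ

liters_per_kwh = MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG  # 1 kg ~ 1 L
print(f"{liters_per_kwh:.2f} L evaporated per kWh of heat")  # ~1.47 L
```

So every kilowatt-hour a data center rejects through an evaporative tower consumes on the order of one and a half liters of water, which is why the water bill scales with the power bill.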

skywhopper · today at 8:00 PM

The rain doesn’t happen directly above where it evaporates. And “slightly warmer” waste water can have major ecological impacts, destroying native life in the lakes and rivers where the wastewater is ejected. Plus, if the water is taken away from underground aquifers that may not be refilling fast enough, or if it’s taking water from downstream users, that’s something to be concerned with.

sublinear · today at 7:10 PM

I have also wondered this and came to a similar conclusion about the politics.

This whole time I've been wondering how it's possible that people don't realize how common evaporative cooling is for much larger buildings that are far more numerous than these data centers, and especially in dry climates where drought is common.

cute_boi · today at 7:08 PM

> Or an agricultural one that puts water in plants and then ships it off to some other region

Just like agriculture, a data center puts water into cooling chips and ships tokens off to some other region?

catlikesshrimp · today at 7:08 PM

I honestly don't know if you are an AI astroturfing bot. No, I am not being sarcastic. Given that this is the top comment and there is no reply, here you go.

For a pre-chewed eli5 overview, check this: https://www.eesi.org/articles/view/data-centers-and-water-co...

A responsible human must always verify information. I use DW as a secondary information source. For instance https://www.dw.com/en/why-does-ai-need-so-much-energy/video-...

tldr: chip immersion uses less water but is more expensive; water evaporation is the opposite. Datacenters will use the cheapest option they can get away with. Water is scarce; evaporated water is as unavailable as contaminated water. Read the information sources.

bigmadshoe · today at 7:03 PM

By that argument water use is never a bad thing since all water comes back as rain. The problem is that data centers need to use clean water, which has to be treated. On a local scale, a large data center could starve a community of potable water, even if the state-wide water use is very small.