>people mistakenly repeating the conclusion that AI consumes huge amounts of water comparable to that of entire cities
Does it not?
"We estimate that 1 MWh of energy consumption by a data center requires 7.1 m3 of water." If Microsoft, Amazon and Google are assumed to have ~8000 MW of data centers in the US, that is 1.4M m3 per day. The city of Philadelphia supplies 850K m3 per day.
https://iopscience.iop.org/article/10.1088/1748-9326/abfba1/...
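A quick sanity check on that arithmetic (a sketch: the 7.1 m³/MWh intensity is the paper's figure, while the ~8000 MW fleet and Philadelphia's supply are the assumptions above):

    # Back-of-envelope check. Water intensity is the paper's estimate;
    # the capacity and Philadelphia figures are this comment's assumptions.
    WATER_INTENSITY_M3_PER_MWH = 7.1    # from the linked paper
    ASSUMED_CAPACITY_MW = 8_000         # assumed MS + Amazon + Google US fleet
    PHILLY_SUPPLY_M3_PER_DAY = 850_000  # ~225M gallons/day, per the comment

    energy_mwh_per_day = ASSUMED_CAPACITY_MW * 24  # 192,000 MWh/day
    water_m3_per_day = energy_mwh_per_day * WATER_INTENSITY_M3_PER_MWH

    print(f"{water_m3_per_day:,.0f} m3/day")  # ~1,363,200 m3/day
    print(f"{water_m3_per_day / PHILLY_SUPPLY_M3_PER_DAY:.1f}x Philadelphia")  # ~1.6x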
Why do we need to assume so many things when we can peg it to reality?
Worldwide, Google's data centers averaged 3.7 GW of load in 2024. Globally, they used 8.135e9 gallons of water that year, which is 30.8e6 m³ per year, or 84e3 m³ per day. Double that to match the assumed 8 GW of data center capacity: 168e3 m³/day. QED: the estimate of 1.4e6 m³/day is high by nearly an order of magnitude. Or, in other words, the entire information industry consumes about as much water as one small city.
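The same check on the Google-based numbers (again a sketch: the 3.7 GW load and 8.135e9 gal/year are the reported figures quoted above; the 2x scaling up to ~8 GW is an assumption):

    # Reality check from Google's reported consumption.
    GALLONS_TO_M3 = 0.00378541  # US gallons to cubic meters

    google_m3_per_year = 8.135e9 * GALLONS_TO_M3  # ~30.8e6 m3/year
    google_m3_per_day = google_m3_per_year / 365  # ~84,400 m3/day

    scaled_m3_per_day = 2 * google_m3_per_day  # scale 3.7 GW up to ~8 GW (assumption)
    paper_estimate_m3_per_day = 1.4e6          # figure from the comment above

    print(f"{scaled_m3_per_day:,.0f} m3/day, scaled from Google's reported use")
    print(f"estimate is {paper_estimate_m3_per_day / scaled_m3_per_day:.1f}x higher")  # ~8.3x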
I believe this is why Google states its water consumption as equivalent to 51 golf courses: it gives a useful benchmark for comparison. But any way you look at it, the water consumption of the information sector is basically nothing.
Resource consumption of AI is unclear on two axes:
1) As other commenters have noted: raw numbers. People are taking the resource consumption of new datacenters and attributing 100% of it to "because AI," when in reality, while AI is increasing spend on new infrastructure, data companies are always building new infrastructure for everything they do.
2) Comparative cost. In general, image synthesis takes between 80 and 300 times fewer resources (mostly electricity) per image than human creation does. It turns out that a modern digital artist letting their CPU idle and screen stay on while they muse burns through more energy than an AI needs to just synthesize the image. Granted, this is not an apples-to-apples comparison either, because the average AI workflow generates dozens of draft images to find the one that gets used, but the net effect might still be less energy spent in total per finished image (skewed toward "more spent by computers" and "less by people").
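To make that comparison concrete, here is a rough sketch. Every input below is an assumption picked only to illustrate the arithmetic, not a measurement; plugging in different numbers moves the ratio a lot, which is part of why published estimates span such a wide range:

    # Illustrative per-image energy comparison. All inputs are assumptions.
    GPU_POWER_W = 700           # assumed: one high-end GPU at full load
    SECONDS_PER_IMAGE = 5       # assumed: inference time per generated image
    DRAFTS_PER_KEPT_IMAGE = 30  # assumed: drafts generated per image kept

    WORKSTATION_POWER_W = 200   # assumed: artist's PC + monitor, mostly idle
    HOURS_PER_HUMAN_IMAGE = 3   # assumed: working time per finished image

    ai_wh = GPU_POWER_W * SECONDS_PER_IMAGE / 3600 * DRAFTS_PER_KEPT_IMAGE
    human_wh = WORKSTATION_POWER_W * HOURS_PER_HUMAN_IMAGE

    print(f"AI: {ai_wh:.0f} Wh per kept image")   # ~29 Wh
    print(f"Human: {human_wh:.0f} Wh per image")  # 600 Wh
    print(f"Ratio: {human_wh / ai_wh:.0f}x")      # ~21x with these inputs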
Yeah, but that is for everything: YouTube, Amazon itself, AWS, Azure, GCP, ... not just AI stuff. I mean, it is still a lot of water, but the numbers are not that easy to calculate, IMHO.