> You should consider dropping that instinct.
This is the reason we have people mistakenly repeating the conclusion that AI consumes huge amounts of water comparable to that of entire cities.
If you make any assumption other than "I don't know what's happening here and need to learn more," you'll constantly be making these kinds of errors. You don't have to have an opinion on every topic.
Edit: By the way, I also don't think we should trust big companies indiscriminately. Like, we could have a system for pesticide approval that errs on the side of caution: We only permit pesticides for which there is undisputed evidence that the chemicals do not cause problems for humans/animals/other plants etc.
>people mistakenly repeating the conclusion that AI consumes huge amounts of water comparable to that of entire cities
Does it not?
"We estimate that 1 MWh of energy consumption by a data center requires 7.1 m3 of water." If Microsoft, Amazon and Google are assumed to have ~8000 MW of data centers in the US, that is 1.4M m3 per day. The city of Philadelphia supplies 850K m3 per day.
https://iopscience.iop.org/article/10.1088/1748-9326/abfba1/...
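For what it's worth, here's that arithmetic spelled out as a quick sanity check. The ~8,000 MW combined fleet size and the assumption of 24/7 full utilization are from the comment above, not figures from the paper:

```python
# Back-of-envelope check of the water numbers above.
WATER_PER_MWH_M3 = 7.1        # m^3 of water per MWh (estimate quoted above)
FLEET_CAPACITY_MW = 8_000     # assumed combined MS/Amazon/Google US capacity
HOURS_PER_DAY = 24            # assumes the fleet runs flat out all day

energy_per_day_mwh = FLEET_CAPACITY_MW * HOURS_PER_DAY       # 192,000 MWh/day
water_per_day_m3 = energy_per_day_mwh * WATER_PER_MWH_M3     # ~1.36M m^3/day

PHILADELPHIA_SUPPLY_M3 = 850_000  # m^3/day, figure quoted above

print(f"Estimated data center water use: {water_per_day_m3:,.0f} m^3/day")
print(f"Ratio vs. Philadelphia supply: {water_per_day_m3 / PHILADELPHIA_SUPPLY_M3:.1f}x")
```

That comes out to roughly 1.4 million m^3/day, about 1.6x Philadelphia's supply, under those assumptions.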
"undisputed evidence that the chemicals do not cause problems"
Impossible standard. You cannot prove a negative.
But, I think it's fair to assume that any chemical that is toxic to plant or insect life is probably something you want to be careful with.
AI water usage is pretty bad on a local scale, where a large water consumer (a data center) starts sucking up more water than the local water table can bear, at the expense of the people living there.
Even if the general take on water use is wrong, it's correct in that these companies don't have the average person's best interests in mind. It's correct that these companies will push limits and avoid accountability. It's correct that they're generally a liability, inflating a massive speculative bubble around an immature technology designed to automate away as many careers as possible, with no proposed solution for the newly unemployed besides "deliver fast food" or "die."
Despite the law treating corporations as people, there's no consistently enforced mechanism to punish them like people. Monsanto can't be sent to jail for murder. Their C-levels will never see a cell the way the average person can have the book thrown at them for comparably minor crimes.
Because companies cannot be held accountable legally and effectively, it's important to assume the worst, to generate the public blowback to hold them accountable via lost business.
Your edit was a good one.
It's a rational default position to say, "I'll default to distrusting large corporate scientific literature that tells me neurotoxins on my food aren't a problem."
As with any rule of thumb, that one will sometimes land you on the wrong side of history, but my guess is that it will more often than not guide you well if you don't have the time to dive deeper into a subject.
I'm not saying all corporations are evil. I'm not saying all corporate science is bad or bunk. But, corporations have a poor track record with this sort of thing, and it's the kind of thing that could obviously have large, negative societal consequences if we get it wrong. This is the category of problem for which the science needs to be clear and overwhelming in favor of a thing before we should allow it.
Not at all. NOT AT ALL.
There are shades of gray here. But you are absolutely not required to extend benefit of the doubt to entities that have not earned it. That's a recipe for disaster.
Personally, I find myself incredibly biased against corporations compared to people. I've met a lot of people in my life, and they seem mostly nice, if a bit stupid. Well intentioned. Selfish.
Are corporations mostly well intentioned? Well, consider that when some people tried to put "good intentions" into corporate bylaws, the effort was viciously resisted.
Corporations will happily take everything you have if you accidentally give it to them. Actual human beings aren't like that.
> …undisputed evidence… do not cause problems…
This is unworkable in practice; nothing will ever be completely safe. Instead, we need a public regulatory body that makes reasonable risk/reward tradeoffs when approving necessary chemicals. However, this system breaks down completely when you allow for lobbying and a revolving door between the public and private sectors.
AI does consume huge amounts of water comparable to entire cities. A single AI facility consumes more water than most cities.
That AI consumes somewhat less water than cities of millions is not a defense.
AI does use a crap-ton of water. Most data centers use closed-loop liquid cooling with heat exchangers out to water cooling. (At least all the big ones like Google and Amazon do.)
I'm curious what evidence you think you've seen to the contrary. From my side, I used to build data centers and my friends are still in the industry. As of a month ago I've had discussions with Google engineers who build data centers regarding their careful navigation of water rights, testing of wastewater, etc.
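To put a rough number on it: even with a closed internal loop, the heat still has to leave the building, and evaporative cooling towers reject it by evaporating water. A quick physics sketch of that lower bound; the latent heat figure is an approximation of mine, not a measurement from any particular facility:

```python
# Rough lower bound on evaporative cooling water use per kWh of IT load.
# Assumes all server heat is rejected by evaporating water.
LATENT_HEAT_MJ_PER_KG = 2.4   # approx. latent heat of vaporization at ambient temps
MJ_PER_KWH = 3.6              # energy in one kilowatt-hour

water_kg_per_kwh = MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG
print(f"~{water_kg_per_kwh:.1f} L of water evaporated per kWh of heat rejected")
# ~1.5 L/kWh just for on-site evaporation; the 7.1 m^3/MWh (7.1 L/kWh) figure
# cited upthread is larger, plausibly because it also counts indirect water use
# (e.g. at power plants) and cooling-tower blowdown.
```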
"If you make any other assumption than "I don't know what's happening here and need to learn more" you'll constantly be making these kind of errors. You don't have to have an opinion on every topic."
I can do this and still start off by assuming the corporation is in the wrong. The tendency to optimize for profit at the expense of everything else and to ignore all negative externalities is inherent to all American corporations.