The human propensity to anthropomorphize computer programs scares me.
The propensity extends beyond computer programs. I understand the concern in this case, because some corners of the AI industry are taking advantage of it to sell their product as capital-I "Intelligent", but we've been doing it for thousands of years and it's not gonna stop now.
The ELIZA program, released in 1966 and one of the first chatbots, gave rise to the "ELIZA effect", where ordinary people would project human qualities onto simple programs. It prompted Joseph Weizenbaum, its author, to write "Computer Power and Human Reason" to try to dispel such errors. I bought a copy for my personal library as a kind of reassuring sanity check.
Yeah, we shouldn't anthropomorphize computers, they hate that.
We objectify humans and anthropomorphize objects because that's what comparisons are. There's nothing that deep about it.
It's pretty wild. People are punching into a calculator and hand-wringing about the morals of the output.
Obviously it's amoral. Why are we even entertaining the idea that it could be ethical?
We anthropomorphize everything. Deer spirit. Mother nature. Storm god. It's how we evolved to build mental models of the world around us without needing to fully understand the underlying mechanisms at work.
These aren't computer programs. A computer program runs them, like electricity runs a circuit and physics runs your brain.
Anthropomorphizing provides a serviceable analogy for discussing model behavior. It certainly provides more value than beating the dead horse of "everyone is a slave to anthropomorphism".
The human propensity to call out as "anthropomorphizing" the attribution of human-like behavior to programs built on a simplified model of biological neural networks, trained on a corpus of nearly everything humans have expressed in writing, and able to pass the Turing test with flying colors, scares me.
That's exactly the kind of thing it makes sense to anthropomorphize. We're not talking about Excel here.