I know anthropomorphizing LLMs has been normalized, but holy shit. I hope the language in this article is intentionally chosen for dramatic effect.
Fascinating. This is invisible to me; what anthropomorphising did you notice that stood out?
Agreed. We should not be anthropomorphising LLMs or having them mimic humans.
The thing is, what else can you do? All the advice on how to get results out of LLMs talks in the same way, as if it's a negotiation or a set of instructions given to a person.
You can do a mental or physical search-and-replace, referring to the LLM only as "it" if you like, but that doesn't change the interaction.