Bag of words is actually the perfect metaphor. The data structure is a bag. The output is a word. The selection strategy is opaquely undefined.
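For anyone who hasn't met the term: here's roughly what a literal bag-of-words "generator" looks like, which is exactly why the metaphor can't account for coherent output. (Toy corpus and the `magic_bag` name are mine, just for illustration.)

```python
import random
from collections import Counter

# The data structure is a bag (multiset). The output is a word.
# The selection strategy: a weighted shrug. Word order is discarded entirely.
corpus = "the cat sat on the mat the dog ate the mat".split()  # made-up toy corpus
bag = Counter(corpus)

def magic_bag() -> str:
    """Draw one word from the bag, weighted by how many copies are in it."""
    words, counts = zip(*bag.items())
    return random.choices(words, weights=counts, k=1)[0]

# Emits word salad, not sentences -- no context, no structure, just frequencies.
print(" ".join(magic_bag() for _ in range(10)))
```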
> Gen AI tricks laypeople into treating its token inferences as "thinking" because it is trained to replicate the semiotic appearance of doing so. A "bag of words" doesn't sufficiently explain this behavior.
Something about there being significant overlap between the smartest bears and the dumbest humans. Sorry you[0] were fooled by the magic bag.
[0] in the "not you, the layperson in question" sense
lol magic bag.
Yeah. I have a half-cynical/half-serious pet theory that a decent fraction of humanity has a broken theory of mind and thinks everyone has the same thought patterns they do. If it talks like me, it thinks like me.
Whenever the comment section takes a long hit and goes "but what is thinking, really?", I get slightly more cynical about it lol