Every day I see people treat gen AI like a thinking human; Dijkstra's attitude about anthropomorphizing computers is vindicated even further.
That said, I think the author's use of "bag of words" here is a mistake. Not only does it have a real meaning in a similar area as LLMs, but I don't think the metaphor explains anything. Gen AI tricks laypeople into treating its token inferences as "thinking" because it is trained to replicate the semiotic appearance of doing so. A "bag of words" doesn't sufficiently explain this behavior.
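To make the "real meaning" concrete: in NLP, a bag-of-words model reduces a text to unordered word counts, throwing away all structure. A minimal sketch (plain Python, no libraries assumed):

```python
from collections import Counter

def bag_of_words(text: str) -> Counter:
    """Reduce a text to unordered word counts -- the actual NLP meaning."""
    return Counter(text.lower().split())

# Two sentences with opposite meanings collapse to the same bag,
# because order is discarded entirely:
a = bag_of_words("the dog bit the man")
b = bag_of_words("the man bit the dog")
print(a == b)  # True
```

That collapse is exactly why the metaphor fails for LLMs: an LLM's output is acutely order-sensitive, which is the opposite of what a bag of words captures.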
I'll make the following observation:
The converse of "no LLMs are thinking like humans" is "no humans are thinking like LLMs" (for a universal negative, the converse is equivalent to the original).
And I do not believe we actually understand human thinking well enough to make that assertion.
Indeed, it is my deep suspicion that we will eventually achieve AGI not by totally abandoning today's LLMs for some other paradigm, but rather embedding them in a loop with the right persistence mechanisms.
Yeah, bag of words isn’t helpful at all. I really do think that “superpowered sentence completion” is the best description. Not only is it reasonably accurate, it is understandable (everyone has seen autocomplete function) and it’s useful. I don’t know how to “use” a bag of words. I do know how to use sentence completion. It also helps explain why context matters.
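The sentence-completion picture can be sketched in a few lines. This is a toy bigram completer (the corpus and function names here are illustrative, not from any real system), but it shows both the "autocomplete" mechanic and why context matters:

```python
from collections import Counter, defaultdict

# Toy corpus; a real model is trained on vastly more text.
corpus = "the cat sat on the mat . the cat ate the fish .".split()

# Count which word follows which (a bigram table).
follows: dict[str, Counter] = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def complete(word: str) -> str:
    """Greedy autocomplete: pick the most frequent next word."""
    return follows[word].most_common(1)[0][0]

print(complete("the"))  # "cat" -- the most common follower of "the"
print(complete("sat"))  # "on" -- a different context yields a different word
```

Unlike a bag, the table is indexed by context: change the preceding word and the completion changes, which is the intuition that scales (with far richer context) to LLMs.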
Well, they are trained to be almost in-distribution with a thinking human. So...
Bag of words is actually the perfect metaphor. The data structure is a bag. The output is a word. The selection strategy is opaquely undefined.
> Gen AI tricks laypeople into treating its token inferences as "thinking" because it is trained to replicate the semiotic appearance of doing so. A "bag of words" doesn't sufficiently explain this behavior.
Something about there being significant overlap between the smartest bears and the dumbest humans. Sorry you[0] were fooled by the magic bag.
[0] in the "not you, the layperson in question" sense
Spoken Query Language? Just like SQL, but for unstructured blobs of text as a database and unstructured language as a query? Also known as Slop Query Language or just Slop Machine for its unpredictable results.
One metaphor is to call the model a person, another metaphor is to call it a pile of words. These are quite opposite. I think that's the whole point.
Person-metaphor does nothing to explain its behavior, either.
"Bag of words" has a deep origin in English, the Anglo-Saxon kenning "word-hord", as when Beowulf addresses the Danish sea-scout (line 258)
"He unlocked his word-hoard and delivered this answer."
So, bag of words, word-treasury, was already a metaphor for what makes a person a clever speaker.