The bag of words reminds me of the Chinese room.
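The analogy has some bite: a bag-of-words model just manipulates tokens with no access to meaning or even word order. A minimal sketch, in case it's unfamiliar (the function name is mine):

```python
from collections import Counter

def bag_of_words(text):
    """Collapse a text into unordered word counts, discarding all syntax."""
    return Counter(text.lower().split())

# Two sentences with opposite meanings map to the same bag:
a = bag_of_words("the man bit the dog")
b = bag_of_words("the dog bit the man")
assert a == b  # word order, and with it the meaning, is gone
```

Like the room's occupant, the model shuffles symbols by rule without anything we'd call understanding.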
"The machine accepts Chinese characters as input, carries out each instruction of the program step by step, and then produces Chinese characters as output. The machine does this so perfectly that no one can tell that they are communicating with a machine and not a hidden Chinese speaker.
The questions at issue are these: does the machine actually understand the conversation, or is it just simulating the ability to understand the conversation? Does the machine have a mind in exactly the same sense that people do, or is it just acting as if it had a mind?"
The Chinese room has been discussed to death, of course.
Here's one fun approach (out of hundreds):
What if we answer the Chinese room with the Systems Reply [1]?
Searle countered the systems reply by saying he could internalize the Chinese room: memorize the rules and do all the processing in his head.
But at that point it's pretty much the Cartesian theater [2]: complete with room, homunculus, and implements.
And the Cartesian theater has been discredited: we've cut open brains, and there's no room in there to fit a popcorn concession.
[1] https://plato.stanford.edu/entries/chinese-room/
[2] https://en.wikipedia.org/wiki/Cartesian_theater