Hacker News

sethev · today at 12:08 AM

It's kind of an aside in the post, but connecting LLMs and Searle's Chinese Room argument is a brilliant observation. Although there are people who believe LLMs are really thinking, it mostly confirms that the Turing test wasn't the right way to test this.


Replies

forgetfreeman · today at 12:25 AM

Is it? The observation seems patently obvious if one has even the most superficial understanding of how LLMs work? Why is this not common knowledge?