Hacker News

qsera | yesterday at 1:06 PM | 2 replies

Just tell one funny thing an LLM said...


Replies

letmevoteplease | today at 1:54 AM

Lots of examples here:

https://news.ycombinator.com/item?id=46205632

anthonyrstevens | yesterday at 5:24 PM

Yesterday it was "LLMs can't count the R's in 'strawberry'." Today it's "LLMs can't tell jokes." Tomorrow it might be "LLMs can't do X," all while LLMs get better and better at every objection/challenge posed.

The problem, as I see it, is that you have a fundamental objection to categorizing the way LLMs do their work as related in any way to "real gosh-darn human thinking." I think that's wrong. At root, we are just information-processing meat that happens to have had millions of years to optimize for speed, pattern recognition, feedback, etc.