Hacker News

direwolf20 · today at 1:03 AM

Who cares how it thinks? It's a Chinese room. If the input–output mapping works, then it's correct.


Replies

emsign · today at 1:22 AM

But it's not correct! Precisely because it can't possibly have enough training data to fill the void left by never experiencing the human condition. Text is not enough. The error rate of LLMs is horrendously bad, and the errors compound exponentially the more steps you chain together.
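(A rough way to see the compounding claim: if each step succeeds independently with probability p, an n-step chain is error-free with probability p^n, which shrinks exponentially. A minimal sketch, where the 95% per-step figure is just an assumed illustration, not a measured number:

    # Illustration of error compounding across chained steps:
    # assume each step succeeds independently with probability p,
    # so an n-step chain is entirely error-free with probability p**n.
    def chain_success_probability(per_step_accuracy: float, steps: int) -> float:
        return per_step_accuracy ** steps

    for steps in (1, 5, 10, 20, 50):
        p_ok = chain_success_probability(0.95, steps)
        print(f"{steps:>3} steps at 95% per-step accuracy: {p_ok:.1%} chance of no errors")

At an assumed 95% per step, 10 chained steps are error-free only about 60% of the time, and 50 steps under 8%.)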

All the great work you see on the internet that AI has supposedly done was only achieved by a human doing lots of trial and error and curating everything the agentic LLM did. And it's all cherry-picked successes.
