Hacker News

chongli · today at 12:57 AM

Experimentation leads to experience

Of course it does, but only after the fact. You don't have any experience of the result of the experiment before you perform it.

Sure, they can't have apples fall on their heads like Newton did, but they can totally observe an apple falling on someone's head in a video

I have strong doubts that LLMs have any understanding whatsoever of what's happening in images (let alone videos). The claim I've sometimes heard, that they possess a world model and can interpret an image according to that model, is an extremely strong one, and it's contradicted by two observations: a) they continue to hallucinate in pretty glaring ways, and b) they continue to mis-identify doctored (adversarial) images that no human would mis-identify, because the alterations don't drastically change the subject.
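To make the adversarial-image point concrete, here is a toy FGSM-style sketch on a hand-made linear classifier: a perturbation bounded by a small epsilon per coordinate flips the prediction even though the input barely changes. The weights and input values are invented for illustration; real attacks target deep networks, but the mechanism (step each input in the sign of the gradient) is the same.

```python
import math

# Hypothetical linear classifier: score = w.x + b, predict 1 if score > 0.
# All numbers are made up for this sketch.
w = [2.0, -3.0, 1.0]   # classifier weights
b = 0.1
x = [0.5, 0.2, -0.4]   # "image" sitting just on the positive side

def score(v):
    return sum(wi * vi for wi, vi in zip(w, v)) + b

def predict(v):
    return 1 if score(v) > 0 else 0

# For a linear model the gradient of the score w.r.t. the input is just w.
# Step each coordinate by eps against the sign of the gradient to push
# the score down -- the FGSM recipe, specialized to this toy model.
eps = 0.2
x_adv = [xi - eps * math.copysign(1.0, wi) for xi, wi in zip(x, w)]

print(predict(x))      # original prediction: 1
print(predict(x_adv))  # adversarial prediction: 0
```

No coordinate moves by more than 0.2, yet the label flips; to a human looking at the "image" the subject would be unchanged.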


Replies

sally_glance · today at 7:00 AM

In software, they can and do perform experiments (make a change, then observe the log output). I don't think they possess a "world model", or that it's worth spending too much thought on. My reasoning is more along the lines that our brains are also just [very advanced] inference machines. We also hallucinate and mis-identify images (there are image/video classification tasks where humans score lower).
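The "make a change, observe the output" loop can be sketched in a few lines. This is a toy illustration, not any agent's actual implementation: the candidate snippets and the way outcomes are observed are invented for the sketch.

```python
# Toy sketch of an experiment loop: try candidate changes, observe what
# happens, keep the first one that works. The candidates are invented.
candidates = [
    "result = 1 / 0",     # buggy variant: raises ZeroDivisionError
    "result = 10 // 2",   # fixed variant
]

def experiment(snippet):
    """Run one candidate change and observe the outcome."""
    scope = {}
    try:
        exec(snippet, scope)
        return True, scope["result"]
    except Exception as exc:
        return False, repr(exc)   # the "log output" of a failed run

# There is no experience of an outcome until the run actually happens.
outcomes = [experiment(s) for s in candidates]
working = next(value for ok, value in outcomes if ok)
print(outcomes)
print(working)
```

The point of the sketch is the ordering: the outcome of each change only exists after execution, which is exactly the experiment-before-experience structure discussed above.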

For me the most glaring difference from how humans work is the lack of online learning. Whether that prevents them from being able to innovate, I'm not so sure.