
ben_w · today at 8:11 AM

The general argument you make is correct, but your conclusion "And this one doesn't." is as yet uncertain.

I will absolutely say that all known ML methods are literally too stupid to live, in the sense that no living thing could get away with making so many mistakes before it had learned anything; but that's about the rate of change of performance with respect to examples, rather than what the model knows by the time training is finished.

What is "abstract thought"? Is that even the same between any two humans who use that word to describe their own inner processes? Because "imagination"/"visualise" certainly isn't.


Replies

rob74 · today at 9:25 AM

> no living thing can get away with making so many mistakes before it's learned anything

If you consider that LLMs have already "learned" more than any one human in this world is able to learn, and still make those mistakes, that suggests there may be something wrong with this approach...

littlestymaar · today at 10:40 AM

> but that's the rate of change of performance with respect to examples rather than what it learns by the time training is finished.

It's not just that. The problem with “deep learning” is that we use the word “learning” for something that has little in common with actual learning: it's not only that it converges far too slowly, it's also that it merely minimizes the predicted loss over every sample during training, and that's not how humans learn. If you feed it enough flat-earther content alongside physics textbooks, an LLM will happily tell you that the earth is flat, and also explain to you, with lots of physics, why it cannot be flat. It simply learned both “facts” during training and spits them out at inference.
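This point can be sketched with a toy model. Below is a minimal bigram next-token model (a stand-in for loss minimization in a real LLM, with hypothetical made-up data): the maximum-likelihood estimate, which is exactly what minimizing cross-entropy loss yields, has no mechanism for picking a side when the corpus contradicts itself, so it happily assigns probability to both continuations.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus mixing contradictory "facts" in equal measure.
corpus = (
    ["the", "earth", "is", "flat"] * 5
    + ["the", "earth", "is", "round"] * 5
)

# Bigram counts: the maximum-likelihood (minimum cross-entropy)
# estimate of P(next | previous) is just the normalized count.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def p_next(prev, nxt):
    """Probability of `nxt` following `prev` under the fitted model."""
    total = sum(counts[prev].values())
    return counts[prev][nxt] / total

# The loss-optimal model "believes" both: P(flat|is) = P(round|is) = 0.5.
print(p_next("is", "flat"), p_next("is", "round"))
```

Nothing in the training objective rewards resolving the contradiction; averaging over the data is the optimum, which is the opposite of a human committing to one belief.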

A human will learn one or the other first, and once that initial learning has happened, will disregard all evidence to the contrary, until maybe at some point they don't and switch sides entirely.

LLMs don't have an inner representation of the world and as such they don't have an opinion about the world.

Humans can't see reality in itself, but at least they know it exists, and they constantly struggle to understand it. The LLM, by nature, is indifferent to the world.
