Hacker News

madaxe_again · today at 7:08 AM · 1 reply

The instantiation of models in humans is not unsupervised, and language, for instance, absolutely requires labelled data and structured input. The predefined objective is “expand”.

See also: feral children.


Replies

gloosx · today at 11:24 AM

Children are not shown pairs like

"dog": [object of class Canine]

They infer meaning from noisy, ambiguous sensory streams. The labels are not explicit; they are discovered through correlation, context, and feedback.
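The contrast can be sketched in a few lines of toy Python. This is purely illustrative (the corpus and "model" are invented for the example, and a bigram counter is obviously not a model of the brain); the point is only that in the self-supervised setup the targets come from the raw stream itself, with no external labels:

```python
from collections import Counter, defaultdict

corpus = "the dog barks the dog runs the cat sleeps".split()

# Supervised setup: explicit (input, label) pairs, like "dog": Canine.
supervised_pairs = [("dog", "Canine"), ("cat", "Feline")]

# Self-supervised setup: predict the next word from the current one.
# The "labels" are just other parts of the unlabeled stream.
next_word_counts = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    next_word_counts[cur][nxt] += 1

def predict_next(word):
    """Most frequent follower seen so far; structure inferred, not given."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # a follower learned from co-occurrence alone
```

Nothing in the second setup ever sees a class label, yet it still extracts regularities — a crude stand-in for "discovered through correlation and context."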

So although caregivers sometimes point and name things, that is a tiny fraction of linguistic input, and it is inconsistent. Children generalize far beyond that.

Real linguistic input to a child is incomplete, fragmented, error-filled, and context-dependent. It is full of interruptions, mispronunciations, and slang. The brain extracts structure from that chaos. Calling that "structured input" confuses the output (the inherent structure of language) with the raw input (noisy speech and gestures).

The brain has drives: social bonding, curiosity, pattern-seeking. But it does not have a single optimisation target like "expand." Objectives are not hardcoded loss functions; they are emergent and shifting.

You're right that lack of linguistic input prevents full language development, but that is not evidence of supervised learning. It just shows that exposure to any language stream is needed to trigger the innate capacity.

The complexity and efficiency of human learning are on another level entirely; transformers are child's play by comparison. They are not going to gain consciousness, and no AGI will arrive in the foreseeable future. It is marketing hype, and that is becoming more and more obvious as the dust settles.