Hacker News

bhadass · 01/15/2026 · 6 replies

A better mental model: it's a lossy compression of human knowledge that can decompress and recombine it in novel (sometimes useful, sometimes sloppy) ways.

Classical search simply retrieves; LLMs can synthesize as well.


Replies

TeMPOraL · 01/15/2026

Corporate wants you to find the difference...

Point being: in a broad enough scope, search, compression, and learning are the same thing. Learning can be phrased as efficient compression of input knowledge. Compression can be phrased as search through the space of possible representation structures. And searching the space of possible x for an x such that F(x) is minimized is a way to represent any optimization problem.
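
To make the last equivalence concrete, a toy sketch (the objective F and the random candidate generator are arbitrary examples, nothing canonical):

    import random

    # Optimization phrased as search: walk a space of candidate x values
    # and keep whichever one minimizes F(x).
    def search_minimize(F, candidates):
        best, best_score = None, float("inf")
        for x in candidates:
            score = F(x)
            if score < best_score:
                best, best_score = x, score
        return best, best_score

    # Toy objective: F(x) = (x - 3)^2, searched by random sampling.
    F = lambda x: (x - 3) ** 2
    candidates = (random.uniform(-10, 10) for _ in range(10_000))
    print(search_minimize(F, candidates))  # approximately (3.0, 0.0)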

RhythmFox · 01/15/2026

This isn't strictly better to me. It captures some intuitions about how a neural network ends up encoding its inputs over time in a 'lossy' way (it doesn't store previous input states in explicit form). Maybe saying 'probabilistic compression/decompression' would make it a bit more accurate? I don't see how compression/decompression connects to your 'synthesize' claim at the very end, though; I'm curious whether you had a specific reason to use the term.
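
To illustrate what 'probabilistic' buys here, a toy contrast between a deterministic decoder and an LLM-style sampling decoder (the next-token distribution is invented for the example):

    import random

    # A made-up next-token distribution standing in for model weights.
    next_token = {"the": {"cat": 0.5, "dog": 0.3, "llama": 0.2}}

    def decode_deterministic(prefix):
        # A lossless-style decoder reproduces one fixed output.
        dist = next_token[prefix]
        return max(dist, key=dist.get)

    def decode_probabilistic(prefix):
        # An LLM-style decoder samples, so repeated "decompressions"
        # of the same prefix can differ.
        tokens, weights = zip(*next_token[prefix].items())
        return random.choices(tokens, weights=weights)[0]

    print(decode_deterministic("the"))                      # always "cat"
    print([decode_probabilistic("the") for _ in range(5)])  # varies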

nextaccountic · 01/16/2026

Maybe the base model is just a compression of the training data?

There is also an RLHF training step on top of that.
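
As a loose, runnable cartoon of those two stages (the counts-as-compression picture and the preference weights are both made up for illustration; real RLHF trains a reward model and updates a policy):

    from collections import Counter

    # Stage 1 ("base model"): compress the corpus into statistics.
    corpus = "the cat sat the cat ran the dog sat".split()
    base = Counter(corpus)

    # Stage 2 ("RLHF-ish"): reweight by hypothetical preference scores.
    preferences = {"cat": 2.0, "dog": 0.5}
    tuned = {w: c * preferences.get(w, 1.0) for w, c in base.items()}

    print(base.most_common(3))
    print(max(tuned, key=tuned.get))  # preference shifts what gets said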

andy99 · 01/15/2026

No, this describes the common understanding of LLMs and adds little to just calling it AI. Search is the more accurate model when considering their actual capabilities and their weaknesses of understanding. “Lossy compression of human knowledge” is marketing.

DebtDeflation · 01/15/2026

Information Retrieval followed by Summarization is how I view it.
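
For concreteness, a toy version of that two-step view (word-overlap scoring and string joining are crude stand-ins for a real retriever and a generative summarizer):

    # Stage 1: retrieval — score documents against the query.
    corpus = [
        "LLMs compress their training data into model weights.",
        "Classical search engines retrieve documents verbatim.",
        "Lossy compression discards detail to save space.",
    ]

    def retrieve(query, docs, k=2):
        # Crude word-overlap scoring; a real system would use embeddings.
        q = set(query.lower().split())
        return sorted(docs, key=lambda d: -len(q & set(d.lower().split())))[:k]

    # Stage 2: summarization — a stand-in for the generative step.
    def summarize(docs):
        return " ".join(d.rstrip(".") for d in docs) + "."

    hits = retrieve("how does search compare to compression", corpus)
    print(summarize(hits))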

andrei_says_ · 01/15/2026

“Novel” to the person who has not consumed the training data. Otherwise, just training data combined in highly probable ways.

Not quite autocomplete but not intelligence either.
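
A bigram toy makes this concrete: it only ever chains transitions it saw in training, so its 'novel' output is recombination along high-probability paths:

    import random
    from collections import defaultdict

    # "Train" a bigram model: record which word follows which.
    training_text = "the cat sat on the mat the dog sat on the rug".split()
    bigrams = defaultdict(list)
    for a, b in zip(training_text, training_text[1:]):
        bigrams[a].append(b)

    def generate(start, n=8):
        # Each step picks a continuation seen in training; duplicates in
        # the list make frequent transitions proportionally more likely.
        out = [start]
        for _ in range(n):
            options = bigrams.get(out[-1])
            if not options:
                break
            out.append(random.choice(options))
        return " ".join(out)

    print(generate("the"))  # e.g. "the dog sat on the mat": recombined,
                            # never literally present in the training text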
