Hacker News

bhadass · yesterday at 7:16 PM · 4 replies

Better mental model: it's a lossy compression of human knowledge that can decompress and recombine in novel (sometimes useful, sometimes sloppy) ways.

Classical search simply retrieves; LLMs can synthesize as well.
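To make the contrast concrete, here is a toy sketch (all names and the corpus are hypothetical, and a bigram table is a stand-in for real model training): retrieval can only return documents it stored, while even a crude statistical "compression" of the corpus can emit recombinations that never appeared in it.

```python
import random
from collections import defaultdict

# Toy corpus standing in for "human knowledge" (hypothetical example).
corpus = [
    "cats chase mice",
    "dogs chase cats",
    "mice eat cheese",
]

# Classical search: exact retrieval, returns only stored documents.
def search(query):
    return [doc for doc in corpus if query in doc]

# "Lossy compression": a bigram table that forgets the documents
# themselves and keeps only local word-to-word statistics.
bigrams = defaultdict(list)
for doc in corpus:
    words = doc.split()
    for a, b in zip(words, words[1:]):
        bigrams[a].append(b)

# "Decompression/recombination": sampling from the statistics can
# produce word sequences that exist in no stored document.
def generate(start, length, rng):
    out = [start]
    for _ in range(length - 1):
        nxt = bigrams.get(out[-1])
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

print(search("chase"))                          # exact stored docs only
print(generate("dogs", 5, random.Random(0)))    # possibly a novel recombination
```

The point of the sketch is only the asymmetry: `search` can never return a string outside `corpus`, while `generate` can (e.g. a chain like "dogs chase mice eat cheese"), which is the "sometimes useful, sometimes sloppy" part.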


Replies

RhythmFox · yesterday at 7:23 PM

This isn't strictly better to me. It captures some intuitions about how a neural network ends up encoding its inputs over time in a 'lossy' way (it doesn't store previous input states in explicit form). Maybe saying 'probabilistic compression/decompression' makes it a bit more accurate? I don't see how calling it compression/decompression connects to your 'synthesize' claim at the very end, but I'm curious whether you had a specific reason to use the term.

andy99 · yesterday at 8:31 PM

No, this describes the common understanding of LLMs and adds little beyond just calling it AI. Search is the more accurate model for reasoning about their actual capabilities and weaknesses. “Lossy compression of human knowledge” is marketing.

andrei_says_ · yesterday at 7:19 PM

“Novel” to the person who hasn't consumed the training data. Otherwise, it's just training data combined in highly probable ways.

Not quite autocomplete but not intelligence either.

DebtDeflation · yesterday at 8:30 PM

Information Retrieval followed by Summarization is how I view it.