Hacker News

ACCount37 · today at 12:51 AM

Humans have an upper limit on working memory, which I see as the closest thing to the "O(N^2) attention curse" of LLMs.

That doesn't stop an LLM from manipulating its context window to take full advantage of however much context capacity it has. Today's tools, like file search and context compression, are crude versions of that.
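The "O(N^2) attention curse" the comment refers to is the fact that self-attention compares every token with every other token, so the score matrix grows quadratically with context length. A minimal sketch (the function name is illustrative, not from any library):

```python
# Why attention cost grows quadratically with context length N:
# each of the N tokens attends to all N tokens, so the score
# matrix alone has N * N entries.

def attention_score_entries(n_tokens: int) -> int:
    """Number of pairwise attention scores for a context of n_tokens."""
    return n_tokens * n_tokens

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} tokens -> {attention_score_entries(n):,} score entries")
```

Doubling the context quadruples this cost, which is why long-context tricks like compression and retrieval are attractive.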


Replies

rishabhaiover · today at 12:32 PM

The human brain's prediction loop is Bayesian in nature.
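A Bayesian prediction loop repeatedly revises a belief as new evidence arrives: posterior ∝ prior × likelihood. A toy illustration (a coin-bias example, not a brain model; all names are illustrative):

```python
# Minimal Bayesian update loop: maintain a belief over three
# hypotheses about a coin's bias, updating after each observed flip.

def bayes_update(prior, likelihoods):
    """Posterior is proportional to prior * likelihood, normalized to sum to 1."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Hypotheses: the coin's heads-probability is 0.2, 0.5, or 0.8.
biases = [0.2, 0.5, 0.8]
belief = [1 / 3] * 3  # uniform prior

for flip in [1, 1, 0, 1]:  # 1 = heads, 0 = tails
    likelihoods = [b if flip == 1 else 1 - b for b in biases]
    belief = bayes_update(belief, likelihoods)

print(belief)  # mass has shifted toward the 0.8-bias hypothesis
```

Each pass through the loop plays the role of one prediction-then-correction cycle: the current belief predicts the next observation, and the observed outcome reweights the hypotheses.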
