Hacker News

scotty79 · last Wednesday at 2:40 PM

Context is a short term memory. Yours is even more limited and yet somehow you get by.


Replies

troupo · last Wednesday at 5:08 PM

I get by because I also have long-term memory, and experience, and I can learn. LLMs have none of that, and every new session is rebuilding the world anew.

And even my short-term memory is significantly larger than the at most 50% of Claude's 200k-token context window that is actually usable. Claude runs out of context while my short-term memory, for the same task, is probably not even 1% full (and I'm capable of more context switching in the meantime).

And so even the "Opus 4.5 really is at a new tier" claim runs into the very same limitations all models have been running into since the beginning.
