Hacker News

root_axis · yesterday at 7:21 PM

Funny enough, Anthropic just went GA with 1M context Claude, which has supposedly solved the lost-in-the-middle problem.


Replies

SyneRyder · yesterday at 7:44 PM

Just for anyone else who hadn't seen the announcement yet, this Anthropic 1M context is now the same price as the previous 256K context - not the beta where Anthropic charged extra for the 1M window:

https://x.com/claudeai/status/2032509548297343196

As for retrieval, the post shows Opus 4.6 at 78.3% needle retrieval success in 1M window (compared with 91.9% in 256K), and Sonnet 4.6 at 65.1% needle retrieval in 1M (compared with 90.6% in 256K).
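For anyone unfamiliar with how those retrieval numbers are typically measured: a "needle-in-a-haystack" eval plants one target fact at a chosen depth inside long filler text and checks whether the model can surface it. This is a minimal illustrative sketch (the function names and filler are my own, not Anthropic's harness):

```python
import random

def build_haystack(filler_sentences, needle, depth_fraction, total_sentences):
    """Plant a single 'needle' fact at a given fractional depth in filler text.

    depth_fraction=0.5 puts the needle roughly in the middle -- the region
    where 'lost in the middle' failures historically show up.
    """
    idx = int(depth_fraction * total_sentences)
    body = [random.choice(filler_sentences) for _ in range(total_sentences)]
    body.insert(idx, needle)
    return " ".join(body)

def score_retrieval(answers, expected):
    """Fraction of trials where the model's answer contains the needle."""
    hits = sum(1 for answer in answers if expected in answer)
    return hits / len(answers)

# Example: build a prompt and score two hypothetical model answers.
haystack = build_haystack(
    ["The sky is blue.", "Coffee was consumed."],
    "The vault passcode is 7421.",
    depth_fraction=0.5,
    total_sentences=10,
)
print(score_retrieval(["The passcode is 7421.", "I am not sure."], "7421"))
```

A real harness would sweep depth_fraction and context length (e.g. 256K vs 1M tokens) and average over many trials to get percentages like the 78.3% and 65.1% quoted above.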

BloondAndDoom · yesterday at 8:00 PM

In addition to context rot, cost matters. I think lots of people use token compression tools for cost reasons, not because of context rot.
