Hacker News

AureliusMA | today at 10:33 AM

I remember when an 8k context was considered large! Nowadays that seems extremely small, because we have new use cases that demand much larger contexts. Maybe in the future we will invent ways to run inference over contexts so large we cannot even imagine them today.