Hacker News

kimixa · yesterday at 7:36 PM · 3 replies

Man I hope so - the context limit is hit really quickly in many of my use cases - and a compaction event inevitably means another round of corrections and fixes to the current task.

Though I'm wary of that being a magic-bullet fix - it can already be pretty "selective" about which documentation it actually seems to take into account as the existing 200k context fills.


Replies

humanfromearth9 · yesterday at 8:45 PM

Hello,

I check the context-use percentage, and above ~70% I ask it to generate a prompt for continuing in a new chat session, to avoid compaction.

It works fine, and saves me from using precious tokens for context compaction.

Maybe you should try it.
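A minimal sketch of that workflow - the function names, the ~70% cutoff, and the prompt wording are all illustrative, not any real tool's API:

```python
# Hand-off instead of compaction: once context usage crosses a threshold,
# ask the model for a continuation prompt to paste into a fresh session.
# All names and numbers here are illustrative.

HANDOFF_THRESHOLD = 0.70  # ~70% of the context window, per the comment above

def should_hand_off(tokens_used: int, context_limit: int) -> bool:
    """Return True once context usage crosses the hand-off threshold."""
    return tokens_used / context_limit >= HANDOFF_THRESHOLD

def continuation_request() -> str:
    """The kind of prompt you'd send to get a summary for a new session."""
    return (
        "We're near the context limit. Write a prompt I can paste into a "
        "new session that captures the current task state, the decisions "
        "made so far, and the remaining steps, so we can continue there "
        "without compaction."
    )

# e.g. 150k tokens used of a 200k window -> time to hand off
if should_hand_off(150_000, 200_000):
    print(continuation_request())
```

The point of the threshold is to leave enough headroom for the model to actually write a good summary before the window fills.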

nickstinemates · yesterday at 7:42 PM

Is this a case of doing it wrong, or do you think accuracy is good enough given the amount of context you often need to stuff it with?

IhateAI_2 · yesterday at 9:10 PM

lmao what are you building that actually justifies needing 1M tokens on a task? People are spending all this money to do magic tricks on themselves.
