Can you link to the research on millions of different terms and stable long contexts? I haven't come across that yet.
You can look at AnyTool (2024), which works with 16,000 tools, and trace newer research from there.
https://arxiv.org/abs/2402.04253
For long contexts, start with activation beacons and RoPE scaling.
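Not from either of those papers specifically, just a rough numpy sketch of what linear RoPE scaling (position interpolation) boils down to, so you can see why it's a cheap first step for longer contexts; the function names and the 4x factor here are made up for illustration:

    import numpy as np

    def rope_angles(positions, dim, base=10000.0, scale=1.0):
        # Standard RoPE frequencies, one per pair of feature dimensions.
        inv_freq = base ** (-np.arange(0, dim, 2) / dim)        # shape (dim/2,)
        # Position interpolation: dividing positions by `scale` squeezes a
        # longer sequence into the angle range seen during pre-training.
        pos = np.asarray(positions, dtype=np.float64) / scale   # shape (seq,)
        return np.outer(pos, inv_freq)                          # shape (seq, dim/2)

    def apply_rope(x, angles):
        # Rotate each consecutive (even, odd) feature pair by its angle.
        x1, x2 = x[..., 0::2], x[..., 1::2]
        cos, sin = np.cos(angles), np.sin(angles)
        out = np.empty_like(x)
        out[..., 0::2] = x1 * cos - x2 * sin
        out[..., 1::2] = x1 * sin + x2 * cos
        return out

    # Toy example: pretend the model was trained at 4k context and we run
    # it at 16k with a 4x interpolation factor.
    seq_len, dim = 16_384, 64
    q = np.random.randn(seq_len, dim)
    q_rot = apply_rope(q, rope_angles(np.arange(seq_len), dim, scale=4.0))
    print(q_rot.shape)   # (16384, 64)

Activation beacons are a different (learned) mechanism for compressing long histories, so there's no two-line version of that one; the paper is the place to go.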