But does doing so leave less of the LLM's horsepower available for actually solving the task at hand?
Maybe a little, but Claude's context window is 200,000 tokens these days and GPT-5.2's is 400,000 - there's a lot of space.
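To put "a lot of space" into rough numbers, here's a minimal sketch that estimates what fraction of a context window a block of extra prompt material would consume. It uses tiktoken's `cl100k_base` encoding as a stand-in tokenizer; the budget figures simply echo the ones quoted above, and the `context_share` helper is an illustrative assumption, not anything either model's API exposes.

```python
# Rough estimate of how much of a context window some extra prompt text uses.
# Assumptions: cl100k_base is only a stand-in tokenizer (neither Claude nor
# GPT-5.2 necessarily uses it), and the budgets are the figures quoted above.
import tiktoken

CONTEXT_BUDGETS = {"claude": 200_000, "gpt": 400_000}


def context_share(text: str, model_family: str = "claude") -> float:
    """Return the fraction of the stated context window this text occupies."""
    enc = tiktoken.get_encoding("cl100k_base")
    return len(enc.encode(text)) / CONTEXT_BUDGETS[model_family]


if __name__ == "__main__":
    # A ~10,000-word style guide comes out to roughly 10-15k tokens,
    # i.e. well under a tenth of a 200k-token window.
    sample = "word " * 10_000
    print(f"{context_share(sample, 'claude'):.1%} of a 200k-token window")
    print(f"{context_share(sample, 'gpt'):.1%} of a 400k-token window")
```

The point of the arithmetic is just that even a generously sized pile of instructions or reference material is a small slice of a six-figure token budget.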