Absolutely:
> In order to predict where thinking and reasoning capabilities are going, it's important to understand the trail of thought that went into today's thinking LLMs.
No. You don't understand at all. They don't think. They don't reason. They are statistical word generators. They are very impressive at tasks like writing code, but they don't work the way that's being implied here.