Apparent reasoning can emerge from probabilistic systems that simply reproduce statistical order, not from genuine understanding.
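To see how fluent-looking output can fall out of pure statistics, consider a minimal sketch: a toy bigram Markov chain over an invented corpus. Everything here is illustrative, not any real system; the point is only that sampling from observed word transitions yields text with surface coherence and no understanding behind it.

```python
import random
from collections import defaultdict

# A toy bigram model: it "learns" nothing but which word tends to
# follow which, yet its output can look superficially coherent.
# The corpus is invented for illustration.
corpus = (
    "the model predicts the weather the model predicts the pattern "
    "the pattern looks like reasoning the model looks intelligent"
).split()

# Count word-to-next-word transitions observed in the corpus.
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

# Generate text by sampling from those transitions alone.
# No grounding, no goals, no comprehension: just statistical order.
word = "the"
output = [word]
for _ in range(12):
    followers = transitions.get(word)
    if not followers:
        break
    word = random.choice(followers)
    output.append(word)

print(" ".join(output))
```

Run it a few times and the output reads like fragments of sensible prose, even though the generator has no notion of weather, models, or reasoning at all.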
Weather models sometimes “predict” a real pattern by chance, yet we don’t conclude that the model understands the atmosphere.
If LLMs were truly thinking, we could enroll one at MIT and expect it to graduate rather than autocomplete its way through the syllabus, or we could teach one how to drive.