I mean, you've collapsed a complex, mixed system into a single negative narrative.
Examples of how I learn with LLMs:
- Paste sections from reading and ask questions / clarify my understanding / ask it to quiz me
- Produce Anki cards by pasting in chapter text and then culling out the good ones (a sketch of this step follows the list)
- Request resources / links for further learning
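
For the Anki step, the whole pipeline fits in a few lines. A minimal sketch, assuming the OpenAI Python SDK; the model name, prompt wording, and file paths are just illustrative placeholders:

```python
# Minimal sketch of the Anki-card workflow (OpenAI Python SDK).
# Model, prompt, and paths below are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

chapter_text = open("chapter.txt").read()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat model works; this one is an assumption
    messages=[
        {
            "role": "user",
            "content": (
                "Write flashcards for the key ideas in the text below. "
                "Output one card per line as: question<TAB>answer.\n\n"
                + chapter_text
            ),
        }
    ],
)

# Anki's File > Import accepts tab-separated text, one card per line.
# Review the output and cull the weak cards by hand before importing.
with open("cards.txt", "w") as f:
    f.write(response.choices[0].message.content)
```

The hand-culling pass is the point: the model drafts cheaply, and you keep editorial control over what actually enters your deck.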
Basically, LLMs serve as a thinking partner. Yes, they're fallible tools, not oracles. But dismissing the idea that you can learn (and learn faster / more efficiently) with LLMs is reductionist.