When you’re using a map you’re still navigating, even if you’re just following directions. The act of navigating teaches you spatial awareness regardless of how you got there.
AI doesn’t provide directions; it navigates for you. You’re actively getting stupider every time you take an LLM’s answer for granted, and this paper demonstrates that people are likely to take answers for granted.
If I use Google Maps I ain't navigating. I follow the instructions until I arrive.
> AI doesn’t provide directions, it navigates for you.
LLMs (try to) give you what you're asking for. If you ask for directions, you'll get something that resembles directions; if you ask it to navigate 100% for you, that's what you get.
> and this paper demonstrates that people are likely to take answers for granted.
Could you point out where exactly this is demonstrated in the paper? As far as I can tell from the study, people who used ChatGPT for studying did better than the ones who didn't, with no difference in knowledge retention.