LLMs are not remotely good enough to use as a learning tool. They still make shit up a ton of the time, and you can only catch it if you already know the material (so, not useful for learning). They probably never will be useful for learning, since even after all this time hallucinations are still just as bad as they ever were.
Have you tried providing them with a grounding resource, e.g. attaching a file in ChatGPT or NotebookLM? Yes, you still need a human expert to create (or curate) that grounding resource in the first place, but LLMs handle the rest well: presenting the information in different ways and at different paces, interacting with the learner like a tutor, etc.
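For what it's worth, you can do the same thing outside those apps with a few lines of code: a minimal sketch, assuming the OpenAI Python SDK and an API key in the environment. The file name, model name, and prompts here are placeholders I picked for illustration, not what ChatGPT or NotebookLM actually do under the hood.

```python
# Minimal sketch: ground a chat model in a human-curated resource by
# putting the file's text in the system prompt and instructing the model
# to answer only from it. Assumes the `openai` package and OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

# The human-curated grounding resource (e.g. lecture notes, a textbook chapter).
# "notes.md" is a placeholder path.
with open("notes.md", encoding="utf-8") as f:
    source_material = f.read()

system_prompt = (
    "You are a patient tutor. Answer using ONLY the source material below. "
    "If the answer is not in it, say so instead of guessing.\n\n"
    f"SOURCE MATERIAL:\n{source_material}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any chat-capable model works
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Explain the main idea in simpler terms, then quiz me on it."},
    ],
)
print(response.choices[0].message.content)
```

It doesn't eliminate hallucinations, but constraining the model to a vetted source and telling it to admit when the answer isn't there cuts way down on the made-up stuff the parent comment is complaining about.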