LLMs are so good at telling me about things I know little to nothing about, but when I ask about things I have expert knowledge on they consistently fail, hallucinate, and confidently lie...
I’ve found that they vary a huge amount based on the subject matter. In my case, I have noticed the opposite of what you observed. They know a lot about the web space (which I’ve been in for around 25 years), but are pretty bad (though not useless) at esoteric languages such as Hare.
I think you end up asking it basic questions about stuff you know little about, but much more complex/difficult questions about stuff you're already an expert in.
Feels like https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect