I've learned repeatedly that LLMs will happily, helpfully give you the wrong answer when you're asking the wrong question, or asking it in the wrong way.