> Like you go to some question and the accepted answer with the most votes is for a ten-year-old version of the technology.
This is still a problem with LLMs as a result. The bigger problem is that the LLM doesn’t show you it was a ten-year-old solution: you have to try it, watch it fail, figure out it’s outdated, ask for a more up-to-date example, and then watch it flounder around. I’ve experienced this more times than I can count.
Usually that's resolved by saying "I want you to use v2" or whatever it is, which you can't really do as easily with a Stack Overflow answer.
Have you tried using context7 or a similar MCP to have the agent automatically fetch up-to-date documentation?
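For reference, hooking it up is usually just one entry in your client's MCP config. Something along these lines (the package name is from memory, so double-check the context7 README for the exact invocation):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```

Once that's in place, the agent can pull current docs for the library/version you name instead of guessing from training data.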
Then you're doing it wrong?
I'd need to see a few examples, but this is easily solved by giving the LLM more context, really any context at all. Give it the version number, give it a URL to the docs. Better yet, git clone the repo and tell it to reference the source.
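Concretely, something like this sketch (the repo URL, tag, and prompt wording are all placeholders, not a specific recommendation):

```python
import subprocess
from pathlib import Path

# Placeholder repo and tag -- substitute the actual dependency and the exact
# version you're on.
REPO = "https://github.com/example/somelib.git"
TAG = "v2.3.0"
DEST = Path("vendor/somelib")

# Shallow-clone the pinned version so the agent has the real source to read.
if not DEST.exists():
    subprocess.run(
        ["git", "clone", "--depth", "1", "--branch", TAG, REPO, str(DEST)],
        check=True,
    )

# Spell out the version and the location of the source in the prompt itself,
# so the model can't silently fall back on a ten-year-old API.
prompt = (
    f"I'm using somelib {TAG}. The full source is checked out at {DEST}/. "
    "Reference that code, not whatever version you remember from training, "
    "and flag anything that only exists in older releases."
)
print(prompt)
```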
Apologies for using you as an example, but this is a common theme among people who slam LLMs. They ask a specific/complex question with little context and then complain when the answer is wrong.