> Without explicit instruction, LLMs are really bad at this
They used to be. They have become quite good at it, even without instruction. Impressively so.
But it does require that the humans who laid the foundation also followed consistent patterns and conventions. If there is any deviation to be found, the LLM will see it and be forced to choose which direction to go, and that's when things quickly fall off the rails. LLMs are not (yet) good at resolving that ambiguity, and maybe never will be, given that not even the humans managed to get it right.
Garbage in, garbage out, as they say.