> that code is probably only ever supposed to be LLM maintained, not by people.
But LLMs are trained to mimic people. So if confusion is the human response, what's to stop the LLM from acting confused?
A mechanical ability to look at the code without passing judgement on it.