Hacker News

drekipus · yesterday at 9:14 PM

> that code is probably only ever supposed to be LLM maintained, not by people.

But LLMs are trying to mimic people. So if confusion is the human response, what's to stop the LLM from acting confused?


Replies

throw1234567891 · yesterday at 11:36 PM

A mechanical ability to look at the code without passing judgement.