Hacker News

cratermoon · today at 3:49 PM

"Another critical lesson is that humans are distinctly bad at monitoring automated processes".

Humans are also distinctly bad at noticing certain kinds of bugs in software: off-by-one errors, deadlocks, or any bug you've stared at for days without spotting the one missing or extra semicolon. But LLMs can generate a tsunami of subtly wrong code in the time it takes a reviewer to notice one typo and miss all the rest.
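(A toy illustration, not from the thread: the kind of off-by-one that survives a quick review. The buggy loop below looks plausible at a glance but silently drops the last element.)

```python
def buggy_sum(xs):
    """Sum a list -- but the loop bound is off by one."""
    total = 0
    for i in range(len(xs) - 1):  # bug: skips xs[-1]
        total += xs[i]
    return total

def correct_sum(xs):
    """Same loop with the right bound."""
    total = 0
    for i in range(len(xs)):
        total += xs[i]
    return total

print(buggy_sum([1, 2, 3]))    # 3, not 6
print(correct_sum([1, 2, 3]))  # 6
```

A reviewer skimming dozens of generated functions like `buggy_sum` has to catch every such bound, every time.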


Replies

aphyr · today at 3:52 PM

Yes. For more on this, see section 2: https://aphyr.com/posts/412-the-future-of-everything-is-lies...

intended · today at 4:27 PM

> "Another critical lesson is that humans are distinctly bad at monitoring automated processes".

I believe the technical term is "vigilance decrement"?