Hacker News

ivan_gammel • today at 6:00 PM • 3 replies

If you sign off the code and put your expertise and reputation behind it, AI becomes just an advanced autocomplete tool and, as such, should not count in “no AI” rules. It’s ok to use it, if that enables you to work.


Replies

notatoad • today at 6:02 PM

This sounds reasonable, but in practice people will simply sign off on anything without having thoroughly reviewed it.

I agree with you that there's a huge distinction between code that a person understands as thoroughly as if they had written it, and vibecoded stuff that no person actually understands. But actually doing something practical with that distinction is a difficult problem to solve.

heavyset_go • today at 7:21 PM

> If you sign off the code and put your expertise and reputation behind it, AI becomes just an advanced autocomplete tool and, as such, should not count in “no AI” rules.

No, it's not that simple. AI-generated code isn't owned by anyone; it can't be copyrighted, so it cannot be licensed.

This matters for open source projects that care about licensing. It should also matter for proprietary code bases, as anyone can copy and distribute "their" AI generated code for any purpose, including to compete with the "owner".

Groxx • today at 7:07 PM

This is equivalent to claiming that automation has no negative side effects at all.

We do often choose automation when possible (especially in computing), but there are endless examples in programming and other fields of not-so-surprising-in-retrospect failures caused by how automation affects human behavior.

So it's clearly not true. What we're debating is the amount of harm, not whether there is any.