Hacker News

roger_ last Wednesday at 9:20 PM

If they pepper it with warnings and add safeguards, then I'm fine.

I think they can design it to minimize misinformation or at least blind trust.


Replies

piva00 last Wednesday at 9:26 PM

People are very good at ignoring warnings, I see it all the time.

There's no way to design it to minimise misinformation; the "ground truth" problem of LLM alignment is still unsolved.

The only system we currently have for verifying that people know what they are doing is licensing: you go through training, you are tested on that training, and then you are allowed to do the dangerous thing. Are you ok with requiring that before the untrained can access a potentially dangerous tool?
