Hacker News

dleeftink · last Friday at 6:35 AM

I would say a text-based model carries a different risk profile compared to video-based ones. At some point (now?) we'd probably need to have the difficult conversation about what level of media impersonation we're comfortable with.


Replies

akersten · last Friday at 6:50 AM

It's messy because media impersonation has been a problem since the advent of communication. In the extreme, we're sort of asking "should we make lying illegal?"

The model (pardon) in my mind is like this:

* The forger of the banknote is punished, not the maker of the quill

* The author of the libelous pamphlet is punished, not the maker of the press

* The creep pasting heads onto scandalous bodies is punished, not the author of Photoshop

In this worldview, how do we handle users of the magic bag of math? We've scarcely thought before that a tool should police its own use. Maybe we can say that because it's too easy to do bad things with, it has crossed some nebulous line. But that's hard to argue for on principle, as it doesn't sit consistently with the more tangible and well-trodden examples.

With respect to the above, all of those harms are clearly articulated in law as specific crimes (forgery, libel, defamation). The circle I can't square with proposals like the one under discussion is that they open the door for authors of tools to be held responsible for whatever arbitrary, as-yet-undiscovered harms await from some unknown future use of their work. That seems like a regressive way of crafting law.
