Hacker News

thisisit · today at 2:10 PM

Laws will be passed to make it "safer", just as is happening with ID verification systems. Every image or video generator will be required to apply a watermark: something visible that cannot be removed easily, or something hidden that can be detected and blocked. Access to models that do not comply will be made harder through ID verification checks or something similar.

There will be some regulatory capture in between.

The world will kick into gear only when something really bad happens. Maybe an influential person, rich or a politician, will be fooled into doing something catastrophic by a deepfake video or image. Until then, ordinary people being affected isn't going to move the needle.


Replies

Miraste · today at 2:36 PM

Verification needs to work the other way around: some kind of verifiable chain of trust for photos and videos from real cameras. Watermarking all generated media is impossible.

red-iron-pine · today at 3:04 PM

> Laws will be passed to make it "safer", just as is happening with ID verification systems. Every image or video generator will be required to apply a watermark: something visible that cannot be removed easily, or something hidden that can be detected and blocked. Access to models that do not comply will be made harder through ID verification checks or something similar.

I've thought off and on about how to implement this. Not easily, was my general takeaway.

Or rather, it's easy to implement, but you're in an adversarial relationship with bad actors, and easy implementations may be easily broken.

E.g., your certs gotta come from somewhere and stay protected, and how do you update and revoke them? Key management for every single camera on every phone, etc.