Hacker News

mmh0000 · last Tuesday at 9:10 PM

I know OpenAI watermarks their stuff, but I wish they wouldn't. It creates a false sense of trust.

Now whoever has access to uncensored, non-watermarking models can pass off their faked images as real and claim, "Look, there's no watermark, so of course it's not fake!"

Whereas if none of the image models did watermarking, people would (or at least should) know that nothing can be trusted by default.


Replies

pbmonster · yesterday at 10:27 AM

Yeah, I'd go the other way. Camera manufacturers should have the camera cryptographically sign the data coming off the sensor directly in hardware, and then provide an API to check whether a signed image was taken on one of their cameras.
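A minimal sketch of that idea in Python, using Ed25519 from the `cryptography` package. The names (CAMERA_KEY, manufacturer_verify, etc.) are made up for illustration, not any real camera or vendor API:

```python
# Toy sketch: the camera signs raw sensor bytes "in hardware" (here just a
# key object), and the manufacturer exposes a check for that signature.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In reality this key would live in a secure element inside the camera.
CAMERA_KEY = Ed25519PrivateKey.generate()
CAMERA_PUBKEY = CAMERA_KEY.public_key()


def sign_capture(sensor_bytes: bytes) -> bytes:
    """Signature the camera would attach to the image metadata."""
    return CAMERA_KEY.sign(sensor_bytes)


def manufacturer_verify(sensor_bytes: bytes, signature: bytes) -> bool:
    """The 'was this taken on one of our cameras?' API endpoint."""
    try:
        CAMERA_PUBKEY.verify(signature, sensor_bytes)
        return True
    except InvalidSignature:
        return False


raw = b"\x00\x01\x02"        # stand-in for raw sensor data
sig = sign_capture(raw)
print(manufacturer_verify(raw, sig))         # True
print(manufacturer_verify(raw + b"!", sig))  # False: pixels were altered
```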

Add an anonymizing scheme on top (blind signatures or group signatures), and you're done.
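For the anonymizing part, here's a toy Chaum-style RSA blind signature in Python, just to show the shape of the idea: the camera blinds a hash of the image, the manufacturer signs it blindly, and the camera unblinds the result, so the manufacturer can vouch for the image without learning which image it signed. This is textbook RSA with no proper padding (a real scheme would use something like RFC 9474), and all the names are illustrative:

```python
import hashlib
import secrets
from cryptography.hazmat.primitives.asymmetric import rsa

# Manufacturer's RSA key pair (n, e public; d private).
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
n = key.public_key().public_numbers().n
e = key.public_key().public_numbers().e
d = key.private_numbers().d


def h(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big")


image = b"raw sensor bytes"
m = h(image)

# Camera side: blind the hash with a random factor r.
r = secrets.randbelow(n - 2) + 2
blinded = (m * pow(r, e, n)) % n

# Manufacturer side: sign the blinded value without learning m.
blind_sig = pow(blinded, d, n)

# Camera side: unblind to recover an ordinary RSA signature on m.
sig = (blind_sig * pow(r, -1, n)) % n

# Anyone can verify against the manufacturer's public key.
assert pow(sig, e, n) == m
print("valid signature; the manufacturer never saw the image hash")
```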