Hacker News

enaaem · yesterday at 8:33 PM · 4 replies

The problem is you can undress real people, and that is extremely harmful and dangerous. One kid took his life after an AI sextortion scam [1]. Imagine the damage cyberbullies, scammers and stalkers can do?

[1] https://www.cbsnews.com/news/sextortion-generative-ai-scam-e...


Replies

snackerblues · today at 1:59 AM

Imagine how freeing it will be when people stop caring about this stuff because anyone can see anyone else naked in about 5 seconds. We're basically already at realistic hardcore porn videos of anyone with anyone else in a few minutes. No point in worrying about it, and it even serves as a shield against real leaked revenge porn - just claim it's AI.

wolvoleo · yesterday at 11:20 PM

Yeah like I said. With consent of the people involved.

There must be a way to do that, especially with all the facial recognition chops these days. Also, the service could simply refuse requests that use existing images. I don't see why they wouldn't refuse those, because that's a pretty narrow use case with very few benign purposes.

> Imagine the damage cyberbullies, scammers and stalkers can do?

They already can. There's open-source models out there.

raw_anon_1111 · yesterday at 9:45 PM

This was fixed months ago. From reading Reddit, Grok is now really conservative about what it will let you do with uploaded images. But you can still get it to draw X-rated porn images and videos that start from AI images it creates itself.

thaumasiotes · yesterday at 9:34 PM

> The problem is you can undress real people and that is extremely harmful and dangerous.

But... that's not something you can do. It's impossible.

You can imagine what real people look like naked. That's not a new thing.

https://www.youtube.com/watch?v=p7FCgw_GlWc
