Hacker News

chrisjj yesterday at 10:44 AM

I am not a fan of Grok, but there has been zero evidence of it creating CSAM. For why, see https://www.iwf.org.uk/about-us/


Replies

mortarion yesterday at 11:08 AM

CSAM does not have a universal definition. In Sweden, for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response. If you take a picture of a 14-year-old girl (the age of consent is 15) and use Grok to put her in a bikini, or make her topless, then you are most definitely producing and possessing CSAM.

No abuse of a real minor is needed.

moolcool yesterday at 1:53 PM

Are you implying that it's not abuse to "undress" a child using AI?

You should realize that children have committed suicide because AI deepfakes of them were spread around their schools. Just because these images are "fake" doesn't mean they're not abuse, or that there aren't real victims.

secretsatan yesterday at 11:06 AM

It doesn't mention Grok?
