Hacker News

chrisjj · yesterday at 3:07 PM

> Are you implying that it's not abuse to "undress" a child using AI?

Not at all. I am just saying it is not CSAM.

> You should realize that children have committed suicide before because AI deepfakes of themselves have been spread around schools.

It's terrible. And when "AI"s are found spreading deepfakes around schools, do let us know.


Replies

mrtksn · yesterday at 7:04 PM

CSAM: Child Sexual Abuse Material.

When you undress a child with AI, whether publicly on Twitter or privately through DM, that child is abused using the material the AI generated. Therefore, it is CSAM.
