
chrisjj · yesterday at 4:10 PM

> The point of banning real CSAM is to stop the production of it, because the production is inherently harmful. The production of AI or human generated CSAM-like images does not inherently require the harm of children, so it's fundamentally a different consideration.

Quite.

> That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.

Really? By what US definition of CSAM?

https://rainn.org/get-the-facts-about-csam-child-sexual-abus...

"Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. "

