
logicchains · yesterday at 12:51 PM

The point of banning real CSAM is to stop its production, because the production is inherently harmful. The production of AI- or human-generated CSAM-like images does not inherently require harming children, so it's fundamentally a different consideration. That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.


Replies

cwillu · yesterday at 1:45 PM

If libeling real people is a harm to those people, then altering photos of real children is certainly also a harm to those children.

chrisjj · yesterday at 4:10 PM

> The point of banning real CSAM is to stop its production, because the production is inherently harmful. The production of AI- or human-generated CSAM-like images does not inherently require harming children, so it's fundamentally a different consideration.

Quite.

> That's why some countries, notably Japan, allow the production of hand-drawn material that in the US would be considered CSAM.

Really? By what US definition of CSAM?

https://rainn.org/get-the-facts-about-csam-child-sexual-abus...

"Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. "

tokai · yesterday at 12:53 PM

That's not what we're discussing here, especially since a lot of the material in question consists of edits of real pictures.
