Hacker News

boothby · last Thursday at 5:36 PM · 4 replies

I know what porn looks like. I know what children look like. I do not need to be shown child porn in order to recognize it if I saw it. I don't think there's an ethical dilemma here; there is no need if LLMs have the capabilities we're told to expect.


Replies

jjk166 · last Thursday at 5:48 PM

AI doesn't know what either porn or children are. It finds correlations between aspects of its inputs and the labels "porn" and "children." Even if you did build an AI advanced enough to form a good idea of what porn and children are, how would you ever verify that it can indeed recognize child porn without plugging in samples for it to flag?

giantg2 · last Friday at 1:31 PM

"I know what porn looks like. I know what children look like."

Do you though?

Some children look like adults (17 vs. 18, etc.). Some adults look younger than they actually are. And how do we tell the difference between porn and art, such as nude scenes in movies or even ancient sculptures? It doesn't seem like an agent could make these determinations without a significant amount of training, and likely added context about any images it processes.

Nevermark · last Thursday at 6:29 PM

That is a good point. Is the image highly sexual? Are there children in the image?

Not a perfect CP detection system (it might flag kids playing in a room with an R-rated movie playing on a TV in the background), but it would be a good first-pass filter.
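
A minimal sketch of what that two-question first pass might look like, assuming you already have two independent classifiers feeding it scores (the names, scores, and thresholds below are hypothetical placeholders, not a real detection system):

    from dataclasses import dataclass

    @dataclass
    class ImageFlags:
        # Hypothetical scores from two separate, off-the-shelf classifiers.
        sexual_score: float  # 0.0-1.0: "is the image highly sexual?"
        minor_score: float   # 0.0-1.0: "are there children in the image?"

    def first_pass_filter(flags: ImageFlags,
                          sexual_threshold: float = 0.8,
                          minor_threshold: float = 0.5) -> bool:
        # Flag for human review only when BOTH signals fire. This is a coarse
        # first-pass filter, not a detector: it will still produce false
        # positives (e.g. kids playing in front of a TV showing an R-rated
        # movie) and false negatives, so anything it flags needs review.
        return (flags.sexual_score >= sexual_threshold
                and flags.minor_score >= minor_threshold)

    # Only one signal firing does not flag the image.
    print(first_pass_filter(ImageFlags(sexual_score=0.9, minor_score=0.1)))  # False
    print(first_pass_filter(ImageFlags(sexual_score=0.9, minor_score=0.7)))  # True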

Of course, if you upload a lot of files to Google Drive and only then run a sanity check like this on them, it is too late to save you from Google.

Avoiding putting anything with any risk potential on Google Drive seems like an important precaution, given the growing tyranny of automated and irreversible judges and juries.

cs02rm0 · last Thursday at 5:39 PM

They don't have your capabilities.
