Hacker News

ipython · last Thursday at 5:22 PM

If the picture truly was of a child, the company is _required_ to report CSAM to NCMEC. It's taken very seriously. If they're not being responsive, escalate and report it yourself so you don't have legal problems.

See https://report.cybertip.org/.


Replies

amarcheschi · last Thursday at 5:33 PM

Even if it's an AI image? I'll follow up by contacting them directly rather than through the platform messaging system, then I'll see what to do if they don't answer.

Edit: I read the information given in the briefing before the task, and it says that offensive content might be displayed. They say to tell them if it happens, but I did and got no answer, so I'm not so inclined to believe they care about it.

moi2388 · last Friday at 7:14 AM

A nude picture of a child is not automatically CSAM.

The child needs to be sexually abused or exploited for it to be CSAM.

kotaKat · last Thursday at 6:25 PM

> It's taken very seriously

Can confirm. The number of people I see in my local news getting arrested for possession that "... came from a cybertip escalated to NCMEC from <BIGCOMPANY>" is... staggering. (And locally it's almost always Google Drive or Gmail, but sometimes there's a curveball.)
