Hacker News

jfindper · 12/11/2025 · 1 reply

>They got banned for uploading child porn to Google Drive

They uploaded the full "widely-used" training dataset, which happened to include CSAM (child sexual abuse material).

While the title of the article is not great, your wording here implies that they purposefully uploaded some independent CSAM pictures, which is not accurate.


Replies

AdamJacobMuller · 12/11/2025

No, but "They got banned for uploading child porn to Google Drive" is a correct framing, while "Google banned a developer for finding child porn" is not.

There is important additional context around it, of course, which mitigates (and should remove) any criminal legal implications, and it should also result in Google unsuspending his account in a reasonable timeframe. But what happened is also reasonable: Google does automated scans of all data uploaded to Drive, caught CP images being uploaded (presumably via hashes from something like NCMEC?), and banned the user. Totally reasonable thing. Google should have an appeal process where a reasonable human can look at it and say "oh shit, the guy just uploaded 100M AI training images and 7 of them were CP, he's not a pedo; unban him, ask him not to do it again, and report this to someone."
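For context on how that kind of scan presumably works (this is an illustrative sketch, not Google's actual pipeline): known-CSAM detection is typically done by hashing each uploaded file and checking it against a curated hash list such as NCMEC's; production systems use perceptual hashes like PhotoDNA so near-duplicates also match, but the exact-hash version conveys the idea. The names below (scan_upload, KNOWN_BAD_HASHES) and the placeholder hash are hypothetical.

    import hashlib

    # Placeholder hash list; real systems use NCMEC-provided hashes and
    # perceptual hashing (e.g. PhotoDNA), not plain SHA-256 of the file bytes.
    KNOWN_BAD_HASHES = {
        "0000000000000000000000000000000000000000000000000000000000000000",
    }

    def scan_upload(file_bytes: bytes) -> bool:
        """Return True if the uploaded file matches a known bad hash."""
        digest = hashlib.sha256(file_bytes).hexdigest()
        return digest in KNOWN_BAD_HASHES

    # An automated pipeline would then flag the account for review or suspend it.
    if scan_upload(b"...uploaded file contents..."):
        print("flag account / suspend")

The point is that the match-and-suspend step is fully automated; the missing piece in this story is the human appeal afterward.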

The headline frames it as if the story were "A developer found CP in AI training data, and Google banned him in retaliation for reporting it." Totally disingenuous framing of the situation.
