More like "A developer accidentally uploaded child porn to his Google Drive account and Google banned him for it".
On one hand, I'd like to say this could happen to anyone. On the other hand, what the F? Why are people passing around a dataset that contains child sexual abuse material? And on yet another hand, the whole thing reeks of techy bravado, and I don't exactly blame him. If one of the inputs to your product (OpenAI, Google, Microsoft, Meta, X) is a dataset you can't even say for sure is free of child pornography, that's pretty alarming.
The penalties for unknowingly possessing or transmitting child porn are far too harsh, both in this case and in general (far beyond just Google's corporate policies).
Again, to avoid misunderstandings, I said unknowingly - I'm not defending people who knowingly possess or traffic in child porn, beyond the few legitimate purposes like reporting it to the proper authorities when it's discovered.