If the picture truly was of a child, the company is _required_ to report CSAM to NCMEC. It's taken very seriously. If they're not being responsive, escalate and report it yourself so you don't have legal problems.
A nude picture of a child is not automatically CSAM.
The image needs to depict sexual abuse or exploitation of the child for it to be CSAM.
> It's taken very seriously
Can confirm. The number of people I see in my local news getting arrested for possession that "... came from a cybertip escalated to NCMEC from <BIGCOMPANY>" is... staggering. (And locally it's almost always Google Drive or Gmail, but sometimes there's a curveball.)
Even if it's an AI image? I'll follow through by contacting them directly rather than through the platform's messaging system, then I'll see what to do if they don't answer.
Edit: I read the information given in the briefing before the task, and it says there might be offensive content displayed. It says to tell them if it happens, but well, I did and got no answer, so I'm not so inclined to believe they care about it.