The penalties for unknowingly possessing or transmitting child porn are far too harsh, both in this case and in general (far beyond just Google's corporate policies).
Again, to avoid misunderstandings: I said unknowingly. I'm not defending people who knowingly possess or traffic in child porn, except for the few appropriate purposes such as reporting it to the proper authorities when it is discovered.
That's the root problem with all mandated, invasive CSAM scanning: non-signature-based scanning creates an unreasonable panopticon that leads to lifelong banishment based on imprecise, evidence-free guessing. It also hyper-criminalizes every parent who accidentally takes a picture of their kid who isn't fully dressed. And what about victims of denial-of-service-style harassment, who are anonymously sent CSAM precisely to get them banned for "possession"? Pedophilia is gross and evil, no doubt, but extreme "think of the children" measures that sacrifice liberty and privacy create a different evil of their own. Handing over total responsibility and ultimate decision-making for critical matters to a flawed algorithm is lazy, negligent, and immoral. There is no easy solution here, but requiring human review (human in the loop, HITL) should be the moral and ethical minimum standard before any drastic measures are taken.
The issue is that when you make ignorance a valid defense, the optimal strategy is to deliberately turn a blind eye, since that reduces your risk exposure. It also gives refuge to those who can convincingly feign ignorance.
We should make user-friendly tools readily available so that it is easier for people to detect CSAM they have unintentionally come into contact with. This both shields the innocent from false accusations and makes it easier to stop bad actors, since their activities are detected earlier.
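To make that concrete, here is a minimal sketch of what such a local, signature-based check could look like. Everything specific in it is my own assumption, not something from the comment: it matches exact SHA-256 digests against a plain-text list of known-bad hashes (one hex digest per line), whereas real systems like PhotoDNA rely on perceptual hashes and access-controlled databases that are not publicly available.

```python
#!/usr/bin/env python3
"""Sketch of a local, signature-based scan tool (illustrative only).

Assumption: known-bad content is identified by exact SHA-256 hashes
supplied in a plain-text file, one hex digest per line. Real systems
use perceptual hashes and restricted databases, which are not modeled.
"""
import hashlib
import sys
from pathlib import Path


def load_known_hashes(hash_list_path: str) -> set[str]:
    """Read one lowercase hex digest per line; skip blanks and comments."""
    hashes = set()
    for line in Path(hash_list_path).read_text().splitlines():
        line = line.strip().lower()
        if line and not line.startswith("#"):
            hashes.add(line)
    return hashes


def sha256_of_file(path: Path) -> str:
    """Hash the file in 1 MiB chunks so large files never load into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_directory(root: str, known: set[str]) -> list[Path]:
    """Return every file under root whose digest is in the known-bad list."""
    matches = []
    for path in Path(root).rglob("*"):
        if path.is_file() and sha256_of_file(path) in known:
            matches.append(path)
    return matches


if __name__ == "__main__":
    if len(sys.argv) != 3:
        print("usage: scan.py <directory> <hash-list.txt>")
        sys.exit(1)
    hits = scan_directory(sys.argv[1], load_known_hashes(sys.argv[2]))
    for hit in hits:
        print(f"MATCH: {hit}")
    print(f"{len(hits)} match(es) found; review manually before acting on anything.")
```

The point of exact-hash (signature) matching is that a hit is hard evidence of a known file rather than a classifier's guess, and even then any match should go to a human for review before anything drastic happens.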