> Maybe AI detection is more ethically fraught since you'd need to keep hold of the CSAM until the next training run,
Why? The damage is already done.
Some victims feel this way. Some do not.
Why would you think that? Every distribution, every view adds damage, even if the original victim doesn't know about it (or would rather not).
I would think there's a greater risk of a giant stockpile leaking or being abused. Undoubtedly, those training sets would be commercialized in some way, which some might see as adding insult to injury.