This is the right way to regulate this: criminalize people who abuse AI tools to cause harm, and don't try to impose censorship or mass surveillance on the tools themselves. I oppose all pornography, but censoring nudity out of a model both degrades the model's quality (example: SD3) and blocks legitimate artistic uses.
That said, the framing around Grok is highly duplicitous. This use is against the ToS, and it comes down to a few abusers among millions of legitimate users. Meanwhile, there are actual "nudification" services that advertise themselves specifically to enable this kind of abuse.
> This is a good way to regulate this. Criminalize people who abuse AI tools to cause harm.
In the same way as is done so successfully with guns, speeding automobiles, etc.?
These things are capable of generating photorealistic audio/video deepfakes; with a well-drafted law, they're more than capable of inferring whether what they're being asked to do is illegal.
It makes zero sense to wait for the poop to hit the fan and then waste taxes investigating the illegality, punishing criminals, and dealing with the impact on victims, when it can be stopped at the source.
Who decides what is harmful? That's going to be like letting a Bible-thumping Moral Majority zealot decide what's art and what's pornography.