If a user uses a tool to break the law, it's on the person who broke the law, not the people who made the tool. Knife manufacturers aren't to blame if someone gets stabbed, right?
If the knife manufacturer willingly broke the law in order to sell it, then yes.
If the manufacturer advertised that the knife is not just for cooking but also stabbing people, then yes.
If the knife was designed to evade detection, then yes.
Text on the internet and all of that, but you should have added the "/s" to the end so people didn't think you were promoting this line of logic seriously.
If a knife manufacturer constructs an apparatus wherein someone can simply write "stab this child" on a whim to watch a knife stab a child, that manufacturer would in fact discover they are in legal peril to some extent.
I mean, no one's ever made a tool whose scope is "making literally anything you want," including, apparently, CSAM. So we're in somewhat uncharted waters, really. Mostly, no, I would agree it's a bad idea to hold the makers of a tool responsible for how it's used. But this is an especially egregious offense on the part of said tool-maker.
Like how I see this is:
* If you can't restrict people from making kiddie porn with Grok, then it stands to reason that, at the very least, access to Grok needs to be strictly controlled.
* If you can restrict that, why wasn't that done? It can't be completely omitted from this conversation that Grok is, pretty famously, the "unrestrained" AI. In most respects that means it swears more, quotes and uses highly dubious sources of information that are friendly to Musk's personal politics, and occasionally spouts white nationalist rhetoric. So as part of their quest to "unwoke" Grok, did they also make it able to generate this shit too?
This seems different. With a knife, the stabbing is done by the human. That would be akin to a paintbrush or camera being used to create CSAM.
Here you have a model that is actually creating the CSAM.
It seems more similar to a robot that is told to go kill someone and does so. Sure, someone told the robot to do it, but the creators of the robot really should be required to put safeguards in place to prevent that.