Funny how all the negative uses to which something like this might be put are regulated or criminalized already - if you try to scam someone, commit libel or defamation, attempt widespread fraud, or any of a million nefarious uses, you'll get fined, sued, or go to jail.
Would you want Microsoft to claim they're responsible for the "safety" of what you write with Word? For the legality of the numbers you're punching into an Excel spreadsheet? Would you want Verizon keeping tabs on every word you say, to make sure it's in line with their corporate ethos?
This idea that AI is somehow special, that they absolutely must monitor and censor and curtail usage, that they claim total responsibility for the behavior of their users - Anthropic and OpenAI don't seem to realize that they're the bad guys.
If you build tools of totalitarian dystopian tyranny, dystopian tyrants will take those tools from you and use them. Or worse yet, force your compliance and you'll become nothing more than the big stick used to keep people cowed.
We have laws and norms and culture about what's ok and what's not ok to write, produce, and publish. We don't need corporate morality police, thanks.
Censorship of tools is ethically wrong. If someone wants to publish things that are horrific or illegal, let that person be responsible for their own actions. There is absolutely no reason for AI companies to be involved.
Well, humans do understand such a thing as scale.
C4 and a nuke are both just explosives, and there are laws in place that prohibit detonating them in the middle of a city. But the laws regulating storage of and access to nukes versus C4 are different, and there is a very strong reason for that.
Censorship is bad, everyone agrees on that. But regulating access to technology that has already proven it can trick people into sending millions to fraudsters is a must, IMO. And it had better be regulated before it overthrows some governments, not after.
Is this a "guns don't kill" argument?
Microsoft Word and Excel aren't generative tools. If Excel added a new headline feature to scan your financial sheets and auto-adjust the numbers to match what's expected when audited, you bet there would be backlash.
And regarding scrutiny, morphine is an immensely useful tool, and its use is surely extremely closely monitored.
On the general point, our society values intent. Tools can just be tools when their primary purpose is in line with our values and they behave only according to the user's intent. AI will have to prove a lot to match both criteria.
I disagree. Analogous would be how we have very limited regulations on guns, but you can’t just have a tank, fighter jet, or ICBM.
Some tools are a lot more powerful than others and we have to take special care with them.
I can write erotic fiction about your husband or wife or son or daughter in Microsoft Word, but it's a little different if I scrape their profiles, turn it into hardcore porn, and distribute it to their classmates or coworkers, isn't it?
You are posting this under a pseudonym. If you did publish something horrific or illegal, it would be the responsibility of this website to either censor your content or identify you when asked by the authorities. Which would you prefer?
Maybe you should talk with image editor developers, copier/scanner manufacturers and governments about the safeguards they shall implement to prevent counterfeiting money.
Because, at the end of the day, counterfeiting money is already illegal.
...and we should not censor tools, and judge people, not the tools they use.
that works for locally hosted models, but if it's offered as a service, OpenAI is publishing those verboten works to you, the person who requested them.
even if it is a local model, if you trained a model to spew nazi propaganda, you're still publishing nazi propaganda to the people who then go use it to make propaganda. it's just very summarized propaganda
Censorship of tools...
Then let's let parents choose when teenagers can start driving.
Also let's legalize ALL drugs.
Weapons should all be available to public.
Etc. Etc.
----
It's very naive to think that we shouldn't regulate "tools"; or that we shouldn't regulate software.
I do agree that in many cases the bad actors who misuse tools should be the ones punished, but we should always weigh the risk of putting something out there that can be used for evil.
> Would you want Microsoft to claim they're responsible for the "safety" of what you write with Word? For the legality of the numbers you're punching into an Excel spreadsheet? Would you want Verizon keeping tabs on every word you say, to make sure it's in line with their corporate ethos?
Would you want DuPont to check the toxicity of the Teflon effluents they're releasing in your neighbourhood? That's insane. It's people's responsibility to make sure the water they drink is harmless. New tech is always amazing.