> And by the way, training your monster on data produced in part by my own hands, without attribution or compensation.
> To the others: I apologize to the world at large for my inadvertent, naive if minor role in enabling this assault.
this is my position too, I regret every single piece of open source software I ever produced
and I will produce no more
Unfortunately, as I see it, even if you want to contribute to open source out of pure passion or enjoyment, the licenses on the code being consumed aren't respected. And the "training" companies are not being held liable.
Are there any proposals to nail down an open source license which would explicitly exclude use with AI systems and companies?
If you're unhappy that bad people might use your software in unexpected ways, open source licenses were never appropriate for you in the first place.
Anyone can use your software! Some of them are very likely bad people who will misuse it to do bad things, but you don't have any control over it. Giving up control is how it works. It's how it's always worked, but often people don't understand the consequences.
It's kind of ironic, since AI can only grow by feeding on data, and open source, with its good intentions of sharing knowledge, is absolutely perfect for this.
But AI is also the ultimate meat grinder, there's no yours or theirs in the final dish, it's just meat.
And open source licenses are practically unenforceable for an AI system, unless you can maybe get it to cough up verbatim code from its training data.
At the same time, we all know they're not going anywhere, they're here to stay.
I'm personally not against them, they're very useful obviously, but I do have mixed or mostly negative feelings on how they got their training data.
I learned what I learned thanks to all the openness in software engineering, not because everyone put it behind a paywall.
Might be because most of us got/get paid well enough that this philosophy works, or because our industry is so young, or because people writing code share good values.
It never worried me that a corp would make money off some code I wrote, and it still doesn't. After all, I'm able to write code because I get paid well to write code, which I do well because of open source. Companies have always benefited from open source code, attributed or not.
Now I use it to write more code.
I would argue, though, for laws forcing models to be opened up after x years (I'd be fine with that), but I'd prefer the open source community coming together and creating better open models overall.
I've been feeling a lot the same way, but removing your source code from the world does not feel like a constructive solution either.
Some Shareware used to be individually licensed with the name of the licensee prominently visible, so if you got hold of an illegal copy you could see whose licensed copy had been duplicated.
I wonder if that idea of personal responsibility for your copy could be adapted to source code. If you wanted to contribute to a piece of software, you could ask a contributor and then get a personally licensed copy of the source code with your name in every source file... but I don't know where to take it from there. Has there ever been a similar system one could take inspiration from?
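The per-copy stamping described above could be sketched as a small script that prepends the licensee's name to every source file in a tree. This is a hypothetical illustration of the idea only, not an existing tool; the header format, function name `stamp_tree`, and the choice of `.py` files are all assumptions.

```python
# Hypothetical sketch: stamp a licensee's name into every source file,
# in the spirit of per-copy Shareware licensing described above.
from pathlib import Path

# Assumed header format; a real scheme would likely use something
# harder to strip, e.g. a signed or steganographic watermark.
HEADER = "# Licensed copy issued to: {licensee}\n"

def stamp_tree(root: str, licensee: str, suffix: str = ".py") -> int:
    """Prepend a personalized license line to each matching file.

    Returns the number of files visited.
    """
    count = 0
    for path in Path(root).rglob(f"*{suffix}"):
        line = HEADER.format(licensee=licensee)
        original = path.read_text(encoding="utf-8")
        if not original.startswith(line):  # avoid double-stamping
            path.write_text(line + original, encoding="utf-8")
        count += 1
    return count
```

The obvious weakness, as the comment hints, is that a plain-text header is trivial to remove, which is where the idea currently stalls.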
That's a weird position to take. Open source software is actually what is mitigating this stupidity in my opinion. Having monopolistic players like Microsoft and Google is what brought us here in the first place.
And then having vibe coders constantly lecture us about how the future is just prompt engineering, and that we should totally be happy to desert the skills we spent decades building (the skills that were stolen to train AI).
"The only thing that matters is the end result, it's no different than a compiler!", they say as someone with no experience dumps giant PRs of horrific vibe code for those of us that still know what we're doing to review.
What a miserable attitude. When you put something out in the world, it's out there for anyone to use, and it always has been, long before AI.
> and I will produce no more
Nah, don't do that. Produce shitloads of it using the very same LLM tools that ripped you off, but license it under the GPL.
If they're going to thieve GPL software, the least we can do is thieve it back.
Why? The core vision of free software and many open source licenses was to empower users and developers to make things they need without being financially extorted, to avoid having users locked in to proprietary systems, to enable interoperability, and to share knowledge. GenAI permits all of this to a level beyond just providing source code.
Most objections like yours are couched in language about principles, but ultimately seem to be about ego. That's not always bad, but I'm not sure why it should be compelling compared to the public good that these systems might ultimately enable.
Was it ever open source if there was an implied refusal to create something you don't approve of? Was it only for certain kinds of software, certain kinds of creators? If there was some kind of implicit approval process or consent requirement, did you publish it? Where can that be reviewed?
> and I will produce no more
Thanks for your contributions so far but this won't change anything.
If you want to have a positive impact on this matter, it's better to pressure the government(s) to prevent GenAI companies from using content they don't have a license for, so they behave like any other business that came before them.
What people like Rob Pike don't understand is that the technology wouldn't be possible at all if creators needed to be compensated. Would you really choose a future where creators were compensated fairly, but ChatGPT didn't exist?
That’s throwing the baby out with the bathwater.
The Open Source movement has been a gigantic boon for the whole of computing, and it would be a terrible shame to lose that as a knee-jerk reaction to genAI.