I wonder how this will work with AI stuff generating code without any source or attribution. It's not like the LLMs make this stuff up out of thin air; it comes from source material.
I don't think anyone really disputes what should be done when an LLM violates copyright in a way that would be a violation if a human did it.
Questions about LLMs are primarily about whether it's legal for them to do something that would be legal for a human to do, and secondarily about the technical feasibility of policing them at all.
> I wonder how this will work with AI stuff generating code without any source or attribution.
It's already settled. Anything you make with AI cannot be protected in any way (the UK gives some leeway for certain types of creations).
So if it mimics code from ffmpeg, for example, then ffmpeg wins.
Everything is a derivative work.
LLMs do not verbatim disgorge chunks of the code they were trained on.
Everything humans make up also comes from source material.
The real (legal) question in either case is how much is actually copied, and how obvious it is.
The best-case scenario is that it nukes the whole concept of software patents and the ugly industry of copyright hoarding. The idea that perpetual rent-seeking is a natural extension and intended outcome of the legal concepts of copyright and patents is bizarre.