And is "model collapse" a thing when LLMs are trained on 100% LLM-generated code? Fun times ahead.
What examples in history can be learned from here?