The model has multiple layers of mechanisms to prevent carbon-copy output of its training data.
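For concreteness, a minimal sketch of what one such layer could look like: a decode-time duplication filter that refuses a candidate token when it would complete a verbatim n-gram match against a reference corpus. This is an assumption about the general technique (similar filters are publicly documented for some tools, e.g. GitHub Copilot's "matching public code" filter), not a description of any specific model's internals; the names build_ngram_index, is_blocked, and the window size are made up for illustration.

    # Hypothetical decode-time duplication filter (illustration only).
    # Blocks a candidate token if appending it would reproduce a
    # verbatim n-gram from the reference corpus.

    NGRAM_WINDOW = 8  # made-up threshold: block 8-token verbatim runs

    def build_ngram_index(corpus_tokens, n=NGRAM_WINDOW):
        """Index every n-gram in the corpus for O(1) membership tests."""
        index = set()
        for i in range(len(corpus_tokens) - n + 1):
            index.add(tuple(corpus_tokens[i:i + n]))
        return index

    def is_blocked(generated_tokens, candidate, index, n=NGRAM_WINDOW):
        """True if `candidate` would complete an n-gram seen in the corpus."""
        if len(generated_tokens) < n - 1:
            return False
        tail = tuple(generated_tokens[-(n - 1):]) + (candidate,)
        return tail in index

    # Toy usage with whitespace "tokens":
    corpus = "the quick brown fox jumps over the lazy dog".split()
    index = build_ngram_index(corpus, n=4)
    prefix = "the quick brown".split()
    print(is_blocked(prefix, "fox", index, n=4))  # True: would copy the corpus
    print(is_blocked(prefix, "cat", index, n=4))  # False: diverges from it

A real deployment would work on model token IDs against an indexed training corpus rather than word lists, but the shape of the check is the same.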
Do you have a source for this?
A carbon copy would imply overfitting.
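"Carbon copy" can also be made measurable rather than argued about: memorization studies typically report the longest verbatim token span an output shares with the training text. A toy version of that check, with whitespace tokenization standing in for a real tokenizer and the function name longest_verbatim_overlap invented for this sketch:

    def longest_verbatim_overlap(output, reference):
        """Length of the longest token run that `output` copies verbatim
        from `reference` (classic longest-common-substring DP, on tokens)."""
        a, b = output.split(), reference.split()
        best = 0
        prev = [0] * (len(b) + 1)  # run lengths ending at the previous row
        for i in range(1, len(a) + 1):
            cur = [0] * (len(b) + 1)
            for j in range(1, len(b) + 1):
                if a[i - 1] == b[j - 1]:
                    cur[j] = prev[j - 1] + 1
                    best = max(best, cur[j])
            prev = cur
        return best

    ref = "the pump must be rated for potable hot water"
    out = "your pump must be rated for potable water"
    print(longest_verbatim_overlap(out, ref))  # 6 ("pump must be rated for potable")

A long overlap on held-out prompts is the operational signature of memorization; short overlaps are what you'd expect from paraphrase.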
Forgive the skepticism, but this translates directly to "we asked the model pretty please not to do it in the system prompt".
Unfortunately.
Does it?
This is a verbatim quote from Gemini 3 Pro, from a chat a couple of days ago:
"Because I have done this exact project on a hot water tank, I can tell you exactly [...]"
I somehow doubt an LLM did that exact project, what with it not having any ability to do plumbing in real life...