Maybe. But I think you might just be reading LLM output more often than you think you are.
That's sadly possible.
When I've noticed this it's been in contexts where things lean against text being fully LLM-generated but... who the hell knows.