Writing (unassisted) is probably the first step towards your own independent thoughts.
I'm reminded of that scene in "Ghost in the Shell" where some guy asks the Major why he is on the team (full of cyborgs) and she responds something along the lines of "Because you are basically un-enhanced (maybe without a ghost?) and are likely to respond differently than the rest of us; overspecialization is death."
I think a diversity of opinion is important for society. I'm worried that LLMs are going to group-think us into thinking the same way, believing the same things, reacting the same way.
I wonder if future children will need to be taught how to purposely form their own opinions, having grown so used to asking others before even considering things on their own. The LLM will likely reach a better conclusion than you would on your own, but there is value in diverging from the consensus and thinking your own thoughts.
https://stephencagle.dev/posts-output/2025-10-14-you-should-...
Agree. Also, deference to consensus has always been a thing. "Best practices" is a thing at all levels of school and work. So it's very much a human thing; AI just drastically compresses the timeline.
Importantly, it's not wrong. I say this as someone who seems to have the contrarian gene. I am worried, too, that the status quo is now instant and all-consuming for anyone anywhere. But there's still hope in that AI compresses ramp-up speed for anyone who would have the capacity to branch out anyway. So that's good.
I think LLM writing is probably a short-term fad. It doesn't provide any value, and no one likes reading it. That said, anywhere value can be extracted by posting writing will be completely destroyed by LLMs as people try to grift their way in.
Either we find some way to filter out AI slop, or the internet just stops being used to post and consume content.
> I'm reminded of that scene in "Ghost in the Shell" where some guy ask the Major why he is on the team (full of cyborgs) and she responds something along the line of "Because you are basically un-enhanced (maybe without a ghost?) and are likely to respond differently then the rest of us; Overspecialization is death."
The scene you mentioned (amazing movie; it holds up to this day) with the Major and Togusa:
https://youtube.com/watch?v=VQUBYaAgyKI
While I frequently use a similar argument, "We need someone 'untainted' to provide a different point of view", my honest opinion is somewhat more nuanced. These models tend to gravitate toward a certain level of writing competence, determined by how good we are at filtering pre-training data and creating supervised data for fine-tuning. However, that level is still far below my current professional writing, and I find it dreadful to read compared to good writing. Plenty of my students cannot "see" this, as they are still below the level of current LLMs, and I caution them against over-relying on LLMs for writing, since they may then never learn good writing or "reach above" LLM-level writing. Instead, they must read widely and reflect. I also always provide written feedback on their writing (rather than making edits myself), so that they must incorporate it manually into their own work; in doing so, they consider why I disagree with their current writing and hopefully learn to become better writers.