> Conversely, we ignored brilliant people simply because they couldn't articulate their complex ideas effectively.
I don't see how AI helps here. If you can't articulate your idea, then
1) how clear is that idea in your head anyway
2) how are you going to articulate it to the LLM?
You're projecting your own thinking style onto others; you're assuming that because your thinking maps neatly onto language, everyone else's must too. Not so.
For example, I'm bilingual and I tend to think in visual and abstract concepts and then translate to the target language as a separate step. It doesn't necessarily come out exactly right the first time. I often re-read what I wrote and see ambiguities which could cause someone to misinterpret what I'm trying to say.
Also, I tend to over-elaborate and struggle to understand other people's mental models. You need to understand your audience really well to convey points effectively; otherwise you might bore them, or your ideas might seem to go off on a tangent when you're actually trying to lay the foundation for the point you're building toward.
For example, as an experiment, I posted my previous comment twice: once handwritten, and once transformed by Gemini (the one you responded to). The transformed version did better and got more engagement. It said the same thing, but punchier and shorter. It didn't waste words laying the groundwork, because it had a better sense of what you (as the audience) already knew, given the conversation context.
This comment here is handwritten. I suspect it's probably not as punchy or to-the-point from your perspective.
So to summarize: I think LLMs can help some people more than others, which fits the point I was trying to make, that they will empower people to write who previously would not have written.