> These kinds of messages read to me like they come from people with superiority complexes. We get that you don't need AI to help you write a letter. For the rest of us, it allows us to improve our writing, can be a creative partner, and can help us express our own ideas.
The writing is the ideas. You can't be so full of yourself as to think you can write a two-second prompt and get back "your idea" in a more fleshed-out form. Your idea was to have someone/something else do it for you.
There are contexts where that's fine, and you list some of them, but they are not as broad as you imply.
This feels like the essential divide to me. I see this often with junior developers.
You can use AI to write a lot of your code, and as a side effect you might start losing your ability to code. You can also use it to learn new languages, concepts, programming patterns, etc., and become a much better developer faster than ever before.
Personally, I'm extremely jealous of how easy it is to learn today with LLMs. So much of the effort I spent learning those things could be done much faster now.
If I'm honest, many of those hours spent reading textbooks, blog posts, and technical papers, and iterating a million times on broken code with trivial errors, were really wasted time, time I wouldn't need to lose if I were starting over today.
This is pretty far off from the original thread, though. I appreciate your less abrasive response.
As the saying goes, "If I'd had more time, I would have written a shorter letter." Of course AI can be used to lazily stretch a short prompt into a long output, but I don't see any implication of that in the parent comment.
If someone isn't a good writer, or isn't a native speaker, using AI to compress a poorly written wall of text may well produce a better result while remaining substantially the prompter's own ideas. For those with certain disabilities or conditions, having AI distill a verbal stream of consciousness into a textual output could even be the only practical way for them to "write" at all.
We should all be more understanding, and not assume that only people with certain cognitive and/or physical capabilities can have something valuable to say. If AI can help someone articulate a fresh perspective or disseminate knowledge that would otherwise have been lost and forgotten, I'm all for it.