A software engineer with an LLM is still infinitely more powerful than a commoner with an LLM. The engineer can debug, guide, change approaches, and give very specific instructions if they know what needs to be done.
The commoner can only hammer the prompt repeatedly with "this doesn't work, can you fix it?"
So yes, our jobs are changing rapidly, but that doesn't mean we're becoming obsolete any time soon.
I think it's a bit like the Dunning-Kruger effect. You need to know what you're even asking for and how to ask for it. And you need to know how to evaluate if you've got it.
This actually reminds me so strongly of the Pakleds from Star Trek TNG. They knew they wanted to be strong and fast, but the best they could do was say, "make us strong." They had no ability to evaluate that their AI (sorry, Geordi) was giving them something that looked strong but simply wasn't.
Agree totally.
I listened to a segment on the radio where a college teacher told their class that it was okay to use AI to assist them during a test, provided they:
1. Declare in advance that AI is being used.
2. Provide, verbatim, the questions asked and the answers given.
3. Explain why the answer given by the AI is a good answer.
Part of the grade includes how well they do 1, 2, and 3.
Fair enough.