LLMs are very easy to pick up; the whole point of them, from their makers' perspective, is to commoditize skill and knowledge. You can't be left behind in learning to use them, and AI providers have no economic incentive to make them anything other than appliances.
The people more at risk of being left behind are the ones who don't learn when not to trust their output.
They'll get left behind in the same sense that 1980s professionals who refused to touch computers got left behind.
> The people more at risk of being left behind are the ones who don't learn when not to trust their output.
Or the ones who fall out of practice writing software themselves because they've been relying on AI to do all the work for them.
(Or the same, but with "long-form English text" instead of "software".)