I think this is the wrong mental model. The correct one is:
'AI makes everything easier, but it's a skill in itself, and learning that skill is just as hard as learning any other skill.'
For a more complete understanding, you also have to add: 'we're in the ENIAC era of AI. The equivalents of high-level languages and operating systems haven't yet been invented.'
I have no doubt the next few years will birth a "context engineering" academic field, and everything we're doing currently will seem hopelessly primitive.
My mind changed on this after attempting complex projects: with the right structure, the capabilities appear unbounded in practice.
But, of course, there is baked-in mean reversion. Doing the most popular and uncomplicated things is obviously easier. That's just the nature of these models.
Funny how people only look at the easy part, but not the cost part.
"I did it with AI" = "I did it with an army of CPUs burning considerable resources and owned by a foreign company."
Give me an AI agent that I own and operate 100%, and the comparison will be fair. Otherwise it's not progress, but rather theft at planetary scale.