I go back and forth on this. I relate it to software: I don't think AI can meaningfully write software autonomously. There are people who oversee it and prompt it, and even then it might write things badly, so there needs to be a person in the loop. But that person should probably have very deep knowledge of the software, especially for, say, low-level coding. And that person probably developed that knowledge by coding things by hand for a long time; coding things by hand is part of building the knowledge. But people, especially students, now rely heavily on AI to write code, so I assume their knowledge growth is stunted. I don't know whether mathematical proofs will help here. The specs have to come from somewhere.
I can see AI making things more productive, but it requires humans to be very expert and to do more work. That might mean fewer developers, but ones who are all more skilled. It will take a while for people to level up, so to speak. It's hard to predict, but I think there could be a rough transition period, because people haven't caught on that they can't rely on AI; so either they'll have to find a new career or, ironically, study harder.
AI’s ability to meaningfully write software autonomously has changed hugely even in the last six months. It might still require a human in the loop, but for how long?