Having taught at a university, I'll say the general reason is that there's already too much to teach, so you do your best. It's extra hard since there are a million people saying "why don't they teach X?" and you're expected to accommodate all of them.
There are problems like: do you teach Python or C? It sounds silly, but the difference isn't really about languages; it's about how much you teach about systems. Teaching Python gets people going and producing faster, which helps students stay encouraged. But teaching C forces students to learn about the underlying computer system, and that equips them to dive deeper and teach themselves the many subtopics no 4-year program can cover.
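To make that concrete, here's a toy contrast of my own (a sketch, not anything from an actual curriculum): reading one line of input. In Python it's `line = input()` and the student moves on. In C, the same exercise immediately raises systems questions:

    /* same task as Python's `line = input()`, but now the student has to
       think about buffer ownership, sizes, and error handling */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char buf[256];                    /* who picked 256? what if the line is longer? */
        if (fgets(buf, sizeof buf, stdin) == NULL) {
            return 1;                     /* EOF or read error: Python hides this behind an exception */
        }
        buf[strcspn(buf, "\n")] = '\0';   /* fgets keeps the trailing newline; strip it */
        printf("you typed: %s\n", buf);
        return 0;
    }

That's three or four systems-level questions hiding inside a one-line Python exercise, which is exactly the tradeoff.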
What I think is generally missing, and would be worth implementing, is code review and teaching how to understand a large existing codebase (grep, find, profilers, tracing, tags, and all that jazz). This often gets taught in parallel (e.g. having students review each other's code), but it's hit or miss, a lot of extra work, and not everyone does it.
Here's the shitty part: I was often told by peers and people higher up, "don't look at students' code, just look at the output and run tests." I always ignored that, because that advice is exactly why we're failing so many students. But I also understand it, because professors are overburdened. There's too much work to do, and teaching isn't even half the job. And with every new administrator or "office assistant" they hire, you somehow get more work (seriously, it can take days to book a flight because you have to use some code, but it takes 2 days for someone to tell you the code and 5 more to tell you it was the wrong code, and it's clearly your fault because you clicked "book flight" instead of trips > booking > flights > schedules > trips > access code > flights > search available flights). Honestly, I think all this LLM agent stuff would sound silly if people actually knew how to design things...