I agree, this is clearly an indictment of LLMs. If LLMs and agents were capable, they'd 100% write it natively, but they recognize the current limitations.
I have tested this and they are very capable: replacing the modern Windows Calculator (38 MB of memory) with an identical app (UX, looks, features, accessibility, localization) ended up with an app using 2 MB of memory.
You just need to point it in the right direction (C/Rust/Go, etc.) and be strict with the requirements, especially memory usage.
If I were Microsoft, I would use AI for this rather than clumsily embedding AI everywhere; a lot of power users would be thrilled by a Win 11 update where OS apps and features dropped memory usage by 90%+.