What about just AI-assisted accessibility? That is, stop requiring apps to do anything at all: the AI visually parses the app's UI for the user, explains it, and interacts on their behalf.
Accessibility is a nice-to-have at best for the vast majority of software. This would open up a lot more software to blind users than is currently available to them.
That's expensive, slow (listen to a screen reader user sometime to see how quickly they operate, far faster than an AI narrating the screen), and likely only works online.
I'm also not going to shirk my responsibilities as a developer based on a hope that the assistive tech will improve.