Looking at where things stand today, I think we need to be considering running local LLMs in the browser. Just a few days ago I submitted an article about exactly that [1].
[1] https://news.ycombinator.com/item?id=45200414
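For anyone who wants to get a feel for it, here's a rough sketch of the kind of thing that already works client-side, assuming transformers.js (the @xenova/transformers package) and one of the ONNX-converted models on the Hub; the model name is just a placeholder, pick whatever fits your memory budget:

    // Minimal sketch: text generation entirely in the browser with transformers.js.
    // Assumes an ES module context (top-level await) and an ONNX-converted model.
    import { pipeline } from '@xenova/transformers';

    // Downloads and caches the model weights in the browser on first run.
    const generator = await pipeline('text-generation', 'Xenova/TinyLlama-1.1B-Chat-v1.0');

    // Run inference locally; no request ever leaves the machine.
    const out = await generator('Explain WebGPU in one sentence:', { max_new_tokens: 64 });
    console.log(out[0].generated_text);

The first load is slow because the weights have to be fetched and cached, but after that everything runs on the user's own hardware.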