> We may reach a point where the only ones able to afford compute are AI companies
Nah. I think "good enough AI for 95% of people" will be able to run locally within 3-5 years on consumer-accessible devices. There will be concentration of the best compute in AI companies for training, but inference will always become cheaper over time. Decommissioned training chips will also become inference chips, adding even more compute capacity to inference.
It's the story of computing all over again. In 1990 only the upper class could afford computers; in 2000 only the upper class owned mobile phones; now more or less everyone and their kid has both.
1990? We were solidly lower-middle class, and I got a computer for Christmas in 1983. I bought my own in 1987, with money saved from working.