Yeah, it seems pretty obvious that we’re in the mainframe era of transformer models and we’ll soon transition to the personal computer era where these all run on your device, which Apple stands to benefit from the most. Their FoundationModels framework is actually pretty good at certain tasks.
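For what it’s worth, here’s a minimal sketch of calling that on-device model, assuming the documented LanguageModelSession / respond(to:) API (macOS 26 / iOS 26, Apple Intelligence enabled); the function name and prompt are just illustrative:

```swift
import Foundation
import FoundationModels

// Minimal one-shot generation with Apple's on-device foundation model.
func summarize(_ text: String) async throws -> String {
    // Bail out early if the on-device model isn't usable
    // (unsupported hardware, Apple Intelligence turned off, etc.).
    guard case .available = SystemLanguageModel.default.availability else {
        return "On-device model unavailable on this device."
    }

    // A session holds conversation state; instructions steer behavior.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

No network call, no API key: the whole thing runs locally, which is exactly the “personal computer era” pitch.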
I don't think that's obvious. For the vast majority of applications, the marginal return on additional units of compute falls off quickly, so the benefits of decentralization soon outweigh the cost of running on weaker local hardware; that's the dynamic that drove the mainframe-to-PC transition. It isn't clear the same is true of intelligence: if extra compute keeps buying meaningfully more capability, the analogy breaks down.