I don't think that's obvious. For the vast majority of applications, the marginal return on additional compute falls off quickly, so the benefits of decentralization tend to outweigh the cost of having less compute. It isn't clear the same is true of intelligence.