> I would argue that corporate actors (a state, an army or a corporation) are not true superorganisms but are semi-autonomous, field-embedded systems that can exhibit super-organism properties, with their autonomy being conditional, relational and bounded by the institutional logics and resource structures of their respective organisational fields.
Lotsa big words there.
Really, though, we're probably going to have AI-like things that run substantial parts of for-profit corporations. As soon as AI-like things are better at this than humans, capitalism will force them to be in charge. Companies that don't do this lose.
There's a school of thought, going back to Milton Friedman, that a corporation's only social responsibility is to increase its profits.[1] Its goal is to optimize for shareholder value. We can expect to see AI-like things that align with that value system.
And that's how AI will take over. Shareholder value!
[1] https://www.nytimes.com/1970/09/13/archives/a-friedman-doctr...
That assumes consumers will just accept it. I wouldn't do business with an AI-run company, just as I don't listen to AI music, view AI images or video, or read AI writing. At least not knowingly.
Costs will go down. But so will revenue, because fewer customers will have an income once other companies cut costs the same way.
Record profits. Right up until the train goes off a cliff.