This is an intellectually and morally deficient position to take. There is no moral principle in any system anywhere in the history of the universe that requires me to bind myself to a contract that nobody else is bound to.
We can all agree, as a society, "hey, no individual person will graze more than ten cows on the commons," and that's fine. And if we all agree and someone breaks their vow, then that is immoral. "Society just sucks when everyone thinks this way" indeed.
But if nobody ever agreed to it, and you're out there grazing all your cattle, and Ezekiel is out there grazing all his cattle, and Josiah is out there grazing all his cattle, there is no reasonable ethical principle you could propose that would prevent me from grazing all my cattle too.
I think this argument would justify slavery: no one (among white people) has decided that holding others as slaves is bad, therefore I can hold slaves.
But let me entertain it for a moment: prior to knowing, e.g., that plastics or CO2 are bad for the environment, how should one know that they are bad for the environment? Fred, the first person to realize this, would run around saying "hey guys, this is bad".
And here is where I think it gets interesting: the folks making all the $ producing the CO2 and plastics are highly motivated to say "sorry Fred, your science is wrong". So when it finally turns out that Fred was right, were the plastics/CO2 companies morally wrong in hindsight?
You are arguing that morality is entirely socially determined. This may be partially true, but IMO, only economically. If I must choose between hurting someone else and dying, I do not think there is a categorically moral choice there. (Though Mengzi/Mencius would say that you should prefer death -- see the fish and the bear's paw in 告子上 (Gaozi I).) So, to the extent that your life or life-preserving business (i.e. your source of food/housing) demands hurting others (producing plastics, CO2), then perhaps it is moral to do so. But to the extent that your desire for fancy cars and first class plane tickets demands producing CO2...well (ibid.).
The issue is that the people who benefit economically are highly incentivized to object to any new moral reckoning (e.g. tracking people is bad; privacy is good; selling drugs is bad; building casinos is bad). To the extent that we care about morality (and we seem to), those folks benefiting from these actions can effectively lobby against moral change with propaganda. And this is, in fact, exactly what happens politically. Politics is, after all, an attempt to produce a kind of morality. It may depend on whom you follow, but my view would be that politics should be an approach to utilitarian management of resources, in service of the people. But others might say we need to be concerned for the well-being of animals. And still others would say that we must be concerned with the well-being of capital, or even AIs! In any case, large corporations effectively lobby against any moral reckoning over their activities and thus avoid regulation.
The problem with your "socially determined morality" (though admittedly, I increasingly struggle to see a practical way around it) is that, though it is in some ways true (since society is economics and therefore impacts one's capacity to live), it leads to a world in which everyone can exploit everyone else maximally. There is no inherent truth in what the crowd believes (though again, crowd beliefs do affect short-term and even intermediate-term economics, especially in a hyper-connected world). The fact that most white people in the 1700s believed that it was not wrong to enslave black people does not make that right. The fact that many people believed tulips were worth millions of dollars does not make it true in the long run.
Are we running up against truth vs practicality? I think so. It may be impractical to enforce morality, but that doesn't make Google moral.
Overall, your arguments are compatible with a kind of nihilism: there is no universal morality; I can adopt whatever morality is most suitable to my ends.
I make one final point: how should slavery and plastics be handled? It takes a truly unfeeling sort of human to enslave another human being. It is hard to imagine that none of these people felt that something was wrong. Though Google is not enslaving people, nor are its actions tantamount to Nazism, there is plenty of recent writing about the rise of technofascism. The EAs would certainly sacrifice the "few" of today's people for the nebulous "many" of the future over which they will rule. But they have constructed a narrative in which the future's many need protection. There are moral philosophies (e.g. utilitarianism) that would justify this. And this is possible partially because we have insufficient knowledge of the future, and also because the technologies of today make the possible futures of tomorrow highly variable.
I propose instead that---especially in this era of extreme individual power (i.e. the capacity to be "loud" -- see below)---a different kind of morality is useful: the wielding of power is bad. As your power grows, so too does the responsibility to consider its impact on others and to more aggressively judge every action one takes under the Veil of Ignorance. Any time we affect the lives of others around us, we are at greater risk of violating this morality. See, e.g., Tools for Conviviality or Silence is a Commons (https://news.ycombinator.com/item?id=44609969). Google and the tech companies are being extremely loud, and you'd have to be an idiot not to see that it's harmful. If your mental contortions allow you to say "harm is moral because the majority don't object," well, that looks like nihilism and certainly doesn't get us anywhere "good". But my "good" cannot be measured, and your good is GDP, so I suppose I will lose.
> There is no moral principle in any system anywhere in the history of the universe that requires me to bind myself to a contract that nobody else is bound to.
Is there not? This doesn't make sense to me, as the conclusion seems to be "if everyone (or perhaps a large number of people) does it, then it's not immoral". My immediate thought goes to moral systems that universalise an action, such that if everyone did it and it made the world worse, then it's something you should not do. That would be an example of a system that runs counter to what you say. Since morals are personal, you can still reach that conclusion even if other people do not subscribe to the same set of moral beliefs that you have. Something can be immoral to you, and you will refuse to do it even if everyone else does.
> But if nobody ever agreed to it [...] there is no reasonable ethical principle you could propose that would prevent me from grazing all my cattle too.
Why not? I don't quite understand your conclusion. Why could the conclusion not be "I feel what everyone else is doing is wrong, and I will not do it myself"? Is it because it puts you at a disadvantage, and you believe that is unfair? Perhaps this is the "reasonable" aspect?