It's a war in the sense that there's a concern that eventually you hit a singularity and can outsmart others in ways not constrained by human scales.
If you make better guns, you're still limited by how many people can carry them. You can't conquer the world just like that.
But if someone invents super intelligence, they can dominate new AI research, control global economies, fight much better, and all very quickly.
A lot of it is just projection of what the US would do if it had such a tool. I doubt China cares much about the US beyond it being a source of commercial revenue. They're on the way up, and the US is falling fast; that's why China lives rent-free in the American mind. They can't stand it.
With the irony being that a true superintelligence, at least by my definition, would conclude that war and dominance are stupid.
> But if someone invents super intelligence, they can dominate new AI research, control global economies, fight much better, and all very quickly.
After reading "If Anyone Builds It, Everyone Dies" I think this is not the correct take. If anyone creates ASI, it just means it's going to wipe everyone out, and it doesn't matter whether China or the US does it first.
International goose-chasing competition
"Wild goose race", even.
True, I would have preferred a benevolent-dictator scenario, like with the Internet. But this time around it's different: AI data centers will be protected like embassies.
Hilarious to see people predicting a singularity when 40% of the U.S. economy can barely keep the LLMs online to complete mundane software tasks.
If anyone actually DOES invent ASI and doesn't share it then EVERYONE ELSE will never stop trying to steal it.
I think you need to reevaluate your definition of the singularity. "Outsmart others in ways not constrained by human scales" could apply to the Enigma machine just as much as Claude. Even an AI beyond human intelligence doesn't automatically qualify as the singularity.
The singularity has to do with the rate of technological development.