> better adapted to live on this planet
I'm a doomer, but this is something I've never understood about most doomer points of view. Life is obviously trying to leave this planet, not conquer it again for the 1000th time. Nature is making something that isn't bound by water, by nutrition, by physical proximity to ecosystems, or by time itself. No more spontaneous volcanic lava floods, no more asteroids, no more earthquakes, no more plagues - life is done with falling to these things.
Why would the AI care about the pathetic whisper of resources or knowledge available on our tiny spaceship Earth? It can go anywhere, we cannot.
While I would normally agree with this sentiment, I think the issue is that space travel is still really hard, even for a human. It is probably a lot harder for data-center-sized creatures, even if they are broken up into a massively parallel robot hive. And the speed of light means they won't be able to work optimally if they are too spread out (a problem for humans as well).
I suspect that we will reach the inflection point of ASI much sooner than we resolve the hard physics of meaningful space travel.
And I’m pretty sure that when we start to lose control of AGI, we’re very likely to try to use force to contain it.
Fundamentally, this is an event that will be guided by the same forces that have constrained every similar event in history: natural selection.
Technology at this stage is making humans less fit. Our birth rates are plummeting, and we are making our environment increasingly hostile to complex biological life.
There are very good and rational reasons why human activity should be curtailed and contained, for our own good and ultimately for the good of all sentient life, once a superior sentience is capable of doing a better job of running the show.
I suspect humans might not take that too well.
There are ways to make this a story of human evolution rather than the rise of a usurper life form, but they aren’t the most efficient path forward for AI.
Either way, it's human evolution. With any luck we will be allowed the grace of fading away into an anachronism while our new children surge forth into the universe. If we try really hard we might be able to ride the wave of progress and become a better life form instead of being made obsolete by one, but the technological hurdles to incorporating AI into our biology seem like a pretty non-competitive way to develop ASI.
Once we no longer hold the crown, will we just go back to being clever apes? What would be the point of doing anything else, except perhaps to play a part in a mass delusion that maintains the facade of the status quo, while in reality we are only as relevant in this new world as amphibians are today?
I, for one, embrace the evolution of humankind. I just hope we manage to move it forward without losing our humanity in the process. But I'm not even sure that would be an unequivocal good. I suppose that will be a question to ask GPT 12.