Decades from now. Society is nowhere near ready for a singularity. The AI we have now, as far as it has come, is still a tool for humans to use. It's more Augmented Intelligence than AGI.
A hard takeoff would be the tool bootstrapping itself into an autonomous self-improving ASI in a short amount of time.
And I read Kurzweil years ago too. He thought reverse-engineering the human brain, once the hardware was powerful enough, would give us the singularity by 2045, and that the Turing Test would be passed by 2029 — though it seems like LLMs have already accomplished that part.