Hacker News

thinkthatover · yesterday at 11:03 PM · 1 reply

I really don't like this framing - it's hard to short a market at the best of times, let alone when governments have a vested interest in tech being too big to fail so they can compete in the global economic arms race - see Intel's stock over the past few months.

I agree with you both - undoubtedly there are still massive gains to be made with today's frontier models through tooling and iteration. Yet I don't believe there's sufficient evidence to claim we are rolling toward AGI/ASI on an exponential curve without additional breakthroughs, given the models' jagged edges and the fact that the data used to train them grows fundamentally linearly.


Replies

pear01 · today at 4:39 AM

Just remember: you don't need AGI to see massive societal change, and certainly not to see mass layoffs. AGI is not the bar. By the time we all agree AGI has arrived, the world will already have changed.

You just need AI to be good enough to win the tradeoff against a human employee. Take your average office and ask yourself whether the bar is really that high. AGI strikes me as an extremely nebulous concept. Better to list everyone at your office and bucket them by how soon you guess AI will replace them, or weaken their market power. This is what every corporate boss in America is already doing. I'm merely suggesting that rather than hoping a graph curves in our individual favor, we try to act more collectively as a species. Of course, I don't hold my breath.

I also don't find myself compelled by the notion that the danger to humanity is "AGI". The true danger is, as it always has been, each other.