> They took 1970s dead tech and deployed it on machines 1 million times more powerful. I’m not sure I’d qualify this as progress
If this isn’t meant to be sarcasm or irony, you’ve got some really exciting research and learning ahead of you! At the moment it reads very “computers are just addition and multiplication and we’ve had that for thousands of years!”
> you’ve got some really exciting research and learning ahead of you
I've done the research, which is why I made the point I did. You're being dismissive and rude instead of putting forth any sort of argument. It's the paper hat of fake intellect. Yawn.
> At the moment it reads very “computers are just addition and multiplication and we’ve had that for thousands of years!”
Let's be specific, then. The problem with these models is that they require exponential growth in training cost to yield only linear gains in output quality. That cost curve is currently a factor or two steeper than the curve of improving hardware performance, which boxes the technology into a coffin corner absent fundamental algorithmic improvements, and those do /not/ seem forthcoming despite billions in speculative funding. In short: AI winter 2.0.
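To put rough numbers on that shape, here's a back-of-the-envelope sketch. It assumes the usual power-law scaling picture, loss = a * C^(-b), with made-up illustrative constants (a=10, b=0.05) rather than anything fitted to a real model; under that assumption, each equal step of quality improvement costs a growing multiple of compute.

```python
# Sketch of the cost-vs-quality argument above.
# Assumption: loss follows a power law in training compute, L(C) = a * C**(-b),
# with illustrative constants (a=10, b=0.05) -- not fitted to any real model.

def loss(compute, a=10.0, b=0.05):
    """Hypothetical power-law loss curve: L(C) = a * C^(-b)."""
    return a * compute ** (-b)

def compute_for_loss(target, a=10.0, b=0.05):
    """Invert the power law: C = (a / L)^(1/b)."""
    return (a / target) ** (1.0 / b)

if __name__ == "__main__":
    # Ask for the loss to drop in equal (linear) steps...
    targets = [3.0, 2.8, 2.6, 2.4, 2.2]
    prev = None
    for t in targets:
        c = compute_for_loss(t)
        ratio = "" if prev is None else f"  ({c / prev:.1f}x the previous step)"
        print(f"loss {t:.1f} -> compute {c:.3e}{ratio}")
        prev = c
    # ...and the required compute multiplies at every step: exponential cost
    # for linear gains, unless hardware or algorithms improve faster.
```

Run it and the loss column falls by a fixed 0.2 per step while the compute column multiplies by roughly 4x and rising; that's the exponential-cost-for-linear-gain curve I'm talking about, given those assumed constants.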
Got any plans for that? Any specific research that deals with that? Any thoughts of your own on this matter?