
jerf · 10/02/2024

I think it's a lot of little things. There are a lot of people very motivated to keep presenting not just AI in general, but the AI we have in hand right now, as the next big thing. We've got literally trillions of dollars of wealth tied up in that framing being maintained right now. It makes for great news articles to get eyeballs in an attention economy. The prospect of the monetary savings has the asset-owning class salivating.

But I think a more subtle, harder-to-see aspect, one that may well be bigger than all those forces, is a general underestimation of how often the problem is knowing what to do rather than how to do it. "How" factors in, certainly, in various complicated ways. But "what" is the complicated part.

And I suspect that's what will actually gas out this current AI binge. It isn't just that the tools don't know "what"... it's that in many cases they can make it harder to learn "what", because the user is so busy with "how". That classic movie quote, "Your scientists were so preoccupied with whether they could, they didn't stop to think if they should", may take on a new dimension of meaning in an AI era: you were so concerned with how to do the task, and with letting the computer do all the thinking, that you never considered whether it was what you should be doing at all.

Also, I'm sure a lot of people will read this as me claiming AI can't learn what to do. Actually, no, I don't claim that. I'm talking about the humans here. Even if AI can get better at "what", if humans get too used to not thinking about it and don't even use the AI tools properly, AI is a long way from being able to fill that deficit on its own.