Wow, I knew many people had anti-AI sentiments, but this post has really hit another level.
It will be interesting to look back in 10 years at whether we consider LLMs to be the invention of the “tractor” of knowledge work, or if we will view them as an unnecessary misstep like crypto.
Thank you for at least acknowledging that we may eventually feel differently about AI.
I'm so tired of being called a luddite just for voicing reservations. My company is all in on AI. My CEO has informed us that if we're not "100% all in on AI", then we should seek employment elsewhere. I use it all day at work, and it doesn't seem to be nearly enough for them.
I wonder if we would still call it "knowledge work" if no human knowledge or experience were required or in the loop anymore, and whether we would stop looking up to it generally.
Because AI stands at odds with the concept of meritocracy, I also wonder whether we will stop democratically electing other humans and outsource those decisions as well.
Overall I'm not seeing it. Progress is already slow, and so far I personally think what AI can do is a nice party trick; it remains unimpressive if judged rigorously.
It doesn't matter if it can one-shot a game in a few minutes. The reason a game made by a human is probably still better is that the human spends hours and days of deep focus researching and creating it. It is not at all clear that, given as much time, AI could deliver the same results.
It'll be the latter. Unfortunately a lot of damage (including psychological damage) has to be done before people realize it.