Best quote from the article:
> That’s also why I see no point in using AI to, say, write an essay, just like I see no point in bringing a forklift to the gym. Sure, it can lift the weights, but I’m not trying to suspend a barbell above the floor for the hell of it. I lift it because I want to become the kind of person who can lift it. Similarly, I write because I want to become the kind of person who can think.
If you're writing an essay to prove you can, or to speak in your own words, then you should do it yourself. But sometimes you just need an essay that summarizes a complex topic as a deliverable.
Though most people either don't get it, or are laypeople who do not want to become the kind of people who can think. I go with the second one.
Below is the worst quote... It is plainly wrong to see an LLM as a bag of words. LLMs pre-trained on large datasets of text are world models. LLMs post-trained with RL are agents that use those modeling capabilities.
> We are in dire need of a better metaphor. Here’s my suggestion: instead of seeing AI as a sort of silicon homunculus, we should see it as a bag of words.
I don't really like the assumption that anyone who uses AI to, say, write an essay is not the "kind of person who can think."
And using AI to replace things you find recreational misses the point. If you got paid $100 each time you lifted a weight, would you see a point in bringing a forklift to the gym, if it were allowed? Or would that make you a person so dumb they cannot think, as the author implies?