> The real danger lies in their seductive nature - in how tempting it becomes to immediately reach for the nearest LLM to provide an answer rather than taking a few moments to quietly ponder the problem on your own.
I get the point you're trying to make. However, quietly pondering the problem is only fruitful if you have the right information. If you don't, in the best case you risk wasting time reinventing the wheel for no good reason. In this application, an LLM is the same type of tool as Google: a way to query and retrieve information for you to ingest. Like Google, the info you get from queries is not the end but the means.
As the saying goes, a month in the lab saves you a week in the library. I would say it can also save you 10 minutes with Claude/ChatGPT/Copilot.
Is hiring a private tutor also laziness?
>wasting time reinventing the wheel for no good reason
Nearly all of learning relies on reinventing the wheel. Most personal projects involve reinventing wheels, but you improve yourself by doing so.
I'll stop short of asserting you don't, but I'm having a hard time convincing myself that your reply reflects that you get GP's point.
If I were to reframe GP's point, it would be: having to figure out how to answer a question changes you a little. Over time, it changes you a lot.
Yes, of course, there is a perspective from which a month spent in the lab answering a question that's well settled in the literature is ~wasted. But GP is arguing for a utility function that optimizes for improving the questioner.
Quietly pondering the problem with the wrong information can be fruitful in this context.
(To be pragmatic, we need both of these. We'd get nowhere if we had to solve every problem and learn every lesson from first principles. But we'd also get nowhere if no one were well-prepared and motivated to solve novel problems without prior art.)