Hacker News

endymion-light · yesterday at 10:45 AM

I feel like this is partially a skill issue: you can get direct, cited information from LLMs. There's a level of personal responsibility involved in over-using the tools and letting them feed you bad or false information, but if you research specific abstractions or newer documentation, most LLMs now correctly call the research tools available to them and cite their sources directly.

I think you can build a very simple workflow that reinforces rather than replaces learning. I've used a citation flow to find, link, and put into practice a ton of more advanced programming techniques that I found incredibly difficult to locate and research before AI.
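As a minimal sketch of what such a citation flow might look like (the function name and the example answer are hypothetical, and this assumes the model is prompted to embed source URLs inline), one simple step is to split an answer into claims that carry a citation and claims that don't, so the uncited ones get flagged for manual verification:

```python
import re

URL_RE = re.compile(r"https?://[^\s)\]>]+")

def split_cited_uncited(answer: str) -> tuple[list[str], list[str]]:
    """Split an LLM answer into paragraphs that include a source URL
    and paragraphs that don't (the latter need manual checking)."""
    cited, uncited = [], []
    for para in filter(None, (p.strip() for p in answer.split("\n\n"))):
        (cited if URL_RE.search(para) else uncited).append(para)
    return cited, uncited

# Hypothetical model answer: one cited claim, one uncited claim.
answer = (
    "Use functools.cache for memoization.\n"
    "See https://docs.python.org/3/library/functools.html\n\n"
    "It is always faster than a hand-rolled dict."
)
cited, uncited = split_cited_uncited(answer)
```

The point is not that a regex validates anything; it just makes uncited claims visible so you verify them yourself instead of absorbing them.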

I'd say the comparison is faulty; it's more akin to swimming to an island (no AI) versus taking a boat. You control the speed and direction of the boat, which also means you have the responsibility of steering it to the correct destination.


Replies

utopiah · yesterday at 10:51 AM

The analogy was about the unknown thinness of the ice, not just the fastest way to get across. It's specifically about the lack of reliability of the process.
