Hacker News

JuniperMesos · yesterday at 11:01 PM · 6 replies

Why shouldn't someone consult some kind of external resource for help, after struggling with a specific coding problem for 20 minutes? Why is 6 hours the right amount of time to timebox this to?


Replies

demorro · yesterday at 11:11 PM

20 minutes is not enough time to drive you into a state of desperation, where you may be forced to try something novel which will expand your mind and future capabilities in unknown and unexpected ways. You might be driven to contact another human being, for example.

Jtarii · yesterday at 11:27 PM

It entirely depends on what your goals are.

If you want to solve the problem quickly, then just use the resources you have. If you want to become someone who can solve problems quickly, then you need to spend hundreds of hours banging your head against a wall.

bhelkey · yesterday at 11:17 PM

There wasn't always an external resource to go to for help. Especially on legacy pieces of software, it was easy to become the person with the most context on the team.

thrance · yesterday at 11:07 PM

The struggle is the point; that's how you learn. If you offload your task to someone or something else after barely 20 minutes of head scratching, you've missed the point entirely.

bsder · today at 2:05 AM

1) 20 minutes is barely enough time to get into flow.

2) There are different levels of debugging. Are your eyes going to glaze over searching volumes of logs for the needle in a haystack with awk/grep/find? Fire up the LLM immediately; don't wait at all. Do the fixes seem to just be bouncing the bugs around your codebase? There is probably a conceptual fault and you should be thinking and talking to other people rather than an AI.

3) Debugging requires you to load a mental model of what you are trying to fix into your brain, then gradually correct that model with experiments until you isolate the bug. That takes time, discipline, and practice. If you never practice, you won't be able to fix the problem when the LLM can't.

4) The LLM will often give you a very, very suboptimal solution when a really good one is right around the corner. However, you have to have the technical knowledge to identify that what the LLM handed you was suboptimal AND know the right magic technical words to push it down the right path. "Bad AI. No biscuit." on every response is NOT enough to make an LLM correct itself properly; it will always try to "correct" itself even if it makes things worse.

th0ma5 · yesterday at 11:05 PM

[dead]