Hacker News

scroot today at 2:25 PM

These posts claiming that "we will review the output," and that software engineers will still need to apply their expertise and wisdom to generated outputs, never seem to think this all the way through. Those who write such articles might indeed have enough experience and deep knowledge to evaluate AI outputs. But what of subsequent generations of engineers? What about the forthcoming wave of people who may never attain the required deep knowledge, because they've been dependent on these generation tools throughout their own education?

The structures of our culture, combined with what generative AI necessarily is, mean that expertise will fade generationally. I don't see a way around that, and I see almost no discussion of ameliorating the issue.


Replies

mpalmer today at 4:51 PM

The solution is to find a way of using these tools that saves us huge amounts of time but still forces us to think and document our decisions. Then, teach those methods in school.

Self-directed, individual use of LLMs for generating code is not the way forward for industrial software production.

8organicbits today at 3:16 PM

Another thing I keep thinking about is that review is harder than writing code. A casual LGTM is suitable for peer review, but applying deep context and checking for logic issues requires more thought. When I write code, I usually learn something about software or the context. "Writing is thinking" in a way that reading isn't.

candiddevmike today at 3:01 PM

This is why you aren't seeing GenAI used more in law firms. Lawyers can be disbarred over erroneous hallucinations, so they're all extremely cautious about using these tools. Imagine if there were that kind of accountability in our profession.

id today at 2:33 PM

>software engineers will still need to apply their expertise and wisdom to generated outputs

And in my experience they don't really do that. They trust that it'll be good enough.

dfxm12 today at 3:52 PM

I don't understand how this is a new or unique problem. Regardless of when or where (or if!) my coworkers got their degrees, before or after access to AI tools, some of them are intellectually curious. Some do their job well. Some are in over their head & are improving. Some are probably better suited for other lines of work. It's always been an organizational function to identify & retain folks who are willing and able to grow into the experience and knowledge required for the role they currently have and future roles where they may be needed.

Academically, this is a non-factor as well. You still learned your multiplication tables even though calculators existed, right?

echelon today at 2:28 PM

The invention of calculators did not cause society to collapse.

Smart and industrious people will focus energy on economically important problems. That has always been the case.

Everything will work out just fine.