Hacker News

ej88 · today at 2:33 PM · 3 replies

I am personally of the opinion that ML will end up being 'normal technology', albeit incredibly transformative.

I think you can combine 'Incanters' and 'Process Engineers' into one - 'Users'. Workers in roles that require accountability will spend their time directing, providing context to, and verifying the output of agents, much like millions of workers today know basic computer skills and Microsoft Office.

In my opinion, how at-risk a job is in the LLM era comes down to:

1: How easy is it to construct RL loops to hillclimb on performance?

2: How easy is it to construct a LLM harness to perform the tasks?

3: How much of the job is a structured set of tasks vs. taking accountability? What's the consequence of a mistake? How much of it comes down to human relationships?

Hence I've been quite bullish on software engineering (but not coding). You can easily set up 1) and 2) on contrived or sandboxed coding tasks, but then 3) expands and dominates the rest of the role.
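To caricature points 1) and 2): a "harness" is just something that runs the task and returns a verifiable score, and the RL loop hill-climbs against it. A toy sketch, with a trivial numeric target standing in for real sandboxed unit tests (all names here are illustrative, not any particular framework):

```python
import random

def harness_score(candidate: int) -> float:
    # Point 2 stand-in: a sandboxed harness that executes the task
    # and returns a verifiable score. In practice this would run
    # generated code against unit tests; here the "task" is just
    # getting close to a target value.
    target = 42
    return -abs(candidate - target)

def hillclimb(steps: int = 1000, seed: int = 0) -> int:
    # Point 1 stand-in: an RL-style loop that proposes variations
    # and keeps whichever candidate scores better on the harness.
    rng = random.Random(seed)
    best = rng.randint(-100, 100)
    for _ in range(steps):
        proposal = best + rng.choice([-3, -1, 1, 3])
        if harness_score(proposal) >= harness_score(best):
            best = proposal
    return best

print(hillclomb() if False else hillclimb())
```

The sketch makes the original point concrete: this loop works precisely because the score is cheap and automatic to compute. The "3)" parts of a job — accountability, relationships, consequences of mistakes — have no such score function to climb.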

On Model Trainers -- I'm not so convinced that RLHF puts the professional experts out of work, for a few reasons. Firstly, nearly all human data companies produce data that is somewhat contrived, by definition of having people grade outputs on a contracting platform; plus there seems to be no limit on how much data can be harvested in the world. Secondly, as I mentioned before, the bottleneck is both accountability and the model's ability to find fresh context without error.


Replies

aphyr · today at 4:09 PM

> I think you can combine 'Incanters' and 'Process Engineers' into one - 'Users'

I wanted to talk about this more but couldn't quite figure out how to phrase it, so I cut a fair bit: with "incanters" I'm trying to point at a sort of ... intuitive, more informal practitioner knowledge / metis, and contrast it with a more statistically rigorous approach in "statistical/process engineers". I expect a lot of people will fuse the two, but I'm trying to stake out some tentpoles here. Users integrate a continuum of approaches, including individual intuition, folklore, formal and informal texts, scientific papers, and rigorously designed harnesses & in-house experiments. Like farming--there's deep, intuitive knowledge of local climate and landraces, but also big industrial practice, and also research plots, and those different approaches inform (and override) each other in complex ways.

netcan · today at 2:53 PM

In some sense, technology is "not normal" regardless.

If we think of the digitization tech revolution... the changes it made to the economy are hard to describe well, even now.

In the early days, it was going to turn banks from billion-dollar businesses into million-dollar ones. Universities would be able to eliminate most of their admin. Accounting and finance would be trivialized. Etc.

Earlier tech revolutions were unpredictable too... but at least retrospectively they made sense.

It's not that clear what the core activities of our economy even are. It's clear at the micro level, but as you zoom out it gets blurry.

Why is accountability needed? It's clearly needed in its context... but it's hard to understand how it aggregates.

xienze · today at 3:38 PM

> Hence I've been quite bullish on software engineering (but not coding). You can easily set up 1) and 2) on contrived or sandboxed coding tasks, but then 3) expands and dominates the rest of the role.

Why can't LLMs and agents progress further and do this software engineering job better than an actual software engineer? I've never seen anyone give a satisfactory answer to this, especially the part about making mistakes. A lot of the defense of LLM shortcomings (e.g., generating crappy code) comes down to "well, humans write bad code too." OK? Well, humans make mistakes too. Theoretically, an LLM software engineer will make far fewer mistakes than a human. So why should I prefer keeping you in the loop?

It's why I just can't understand the mindset of software engineers who are giddy about the direction things are going. There really is nothing special about your expertise that an LLM can't achieve, theoretically.

We're always so enamored by new and exciting technology that we fail to realize the people in charge are more than happy to completely bury us with it.
