Hacker News

Waterluvian · today at 3:23 PM · 7 replies

I think if your job is to assemble a segment of a car based on a spec using provided tools and pre-trained processes, it makes sense if you worry that giant robot arms might be installed to replace you.

But if your job is to assemble a car in order to explore what modifications to make to the design, experiment with a single prototype, and determine how to program those robot arms, you’re probably not thinking about the risk of being automated.

I know a lot of counter arguments are a form of, “but AI is automating that second class of job!” But I just really haven’t seen that at all. What I have seen is a misclassification of the former as the latter.


Replies

mips_avatar · today at 7:49 PM

Well, a lot of managers view their employees as doing the former, when they're really doing the latter.

enlyth · today at 3:34 PM

A software engineer with an LLM is still infinitely more powerful than a commoner with an LLM. The engineer can debug, guide, change approaches, and give very specific instructions if they know what needs to be done.

The commoner can only hammer the prompt repeatedly with "this doesn't work can you fix it".

So yes, our jobs are changing rapidly, but they don't strike me as becoming obsolete any time soon.

crazylogger · today at 4:19 PM

You are describing traditional (deterministic?) automation, before AI. With AI systems as general as today's SOTA LLMs, they'll happily take on the job regardless of whether the task falls into class I or class II.

Ask a robot arm "how should we improve our car design this year?" and it'll certainly get stuck. Ask an AI and it'll give you a real opinion that's at least on par with a human's. If a company builds enough tooling to close the "AI comes up with an idea -> AI designs a prototype -> AI robot physically builds the car -> AI robot test drives the car -> AI evaluates all prototypes and confirms next year's design" feedback loop, then this can, in theory, work.

This is why AI is seen as such a big deal - it's fundamentally different from all previous technologies. To an AI, there is no line that would distinguish class I from II.
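For what it's worth, here is a minimal, purely illustrative sketch of that loop in Python. Every function name is a made-up placeholder (there is no real LLM or robotics API behind any of them); it only shows the shape of the propose -> build -> test -> select cycle the comment describes.

  """Hypothetical sketch of the design feedback loop described above.
  All functions are placeholders standing in for AI or robotics systems."""

  import random

  def ai_propose_improvement(design: dict) -> str:
      # Placeholder: a real system would query an LLM for a design change.
      return random.choice(["lighter chassis", "better aero", "new battery pack"])

  def ai_design_prototype(design: dict, idea: str) -> dict:
      # Placeholder: apply the proposed change to produce a prototype spec.
      return {**design, "change": idea}

  def robot_build_and_test(prototype: dict) -> float:
      # Placeholder: physical build plus test drive, reduced to a single score.
      return random.uniform(0.0, 1.0)

  def run_design_cycle(design: dict, n_prototypes: int = 3) -> dict:
      """One iteration: propose ideas, build prototypes, test them, pick the best."""
      scored = []
      for _ in range(n_prototypes):
          idea = ai_propose_improvement(design)
          prototype = ai_design_prototype(design, idea)
          scored.append((robot_build_and_test(prototype), prototype))
      # "AI evaluates all prototypes and confirms next year's design"
      return max(scored, key=lambda s: s[0])[1]

  next_years_design = run_design_cycle({"model": "2025 sedan"})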

figassis · today at 6:02 PM

I don’t think this is the issue yet. It’s that no matter which class you’re in, your CEO does not care. Mediocre AI work is enough to give them immense returns and an exit; they’re not looking out for the unfortunate bag holders. The world has always had a tolerance for highly distributed crap. See Windows.

Buttons840 · today at 3:51 PM

My job is to make people who have money think I'm indispensable to achieving their goals. There's a good chance AI can fake this well enough to replace me. Faking it would be good enough in an economy with low levels of competition; everyone can judge for themselves whether that describes our economy.

HorizonXP · today at 3:29 PM

This is actually a really good description of the situation. But I will say, as someone who prided myself on being the second type you described, I am becoming very concerned about how much of my work was misclassified. It does feel like a lot of the work I thought was in the second class is being automated, and maybe classifying it that way just overinflated my ego.

raincole · today at 3:39 PM

> I know a lot of counter arguments are a form of, “but AI is automating that second class of job!”

Uh, that's not the issue. The issue is that there isn't that much demand for the second class of job. At least not yet. The first class of job is what feeds billions of families.

Yeah, I'm aware of the lump of labour fallacy.
