Hacker News

lapcat · yesterday at 3:06 PM

> The cost is SPECIFICALLY that it is harder to jump in to a mature field with its own jargon and concerns.

Hasn't that been the case for decades? What specifically is different now, such that for some reason it's harder to jump in now than it was before?

If anything, LLMs are supposed to make things easier, aren't they?

> it's the implied "imma wait until everything falls apart, then we'll go back to what I know how to do." that people don't like saying.

You can read whatever assumption you want into the blog post, but it's not there in the words. You're dunking on a straw man.


Replies

adampunk · yesterday at 4:31 PM

>What specifically is different now, such that for some reason it's harder to jump in now than it was before?

Well, the obvious evidence that something is different now is all around us. The case has been made with painful seriousness by people who thought they were making a different point: namely, that LLMs represent an unreliable, poorly understood, and hazardous abstraction layer between coders and the machine, and specifically that this abstraction layer is DIFFERENT from the ones that came before. There are dozens and dozens of blog posts on HN making this point (some written by machines). It would be hard not to have come across it, or to have missed the chorus of engineers agreeing with it. It's supposedly a cardinal reason why assembly -> C was a "good abstraction" while natural language -> slop is a "bad abstraction." If we take that argument seriously, it's strong evidence that something new is happening, independent of anything I might say.

Why is it different? C'mon. COME ON. Why can I find a post on the front page of HN whenever Claude is down for more than 10 minutes? Why do I find out, again on the front page, within minutes whenever one of the big five frontier labs releases a new model? Why is it different? Did we build trillions of dollars of datacenters for NetBeans or SecondLife or whatever other cartoonish old fad I'm supposed to treat as analogous today? Are we supposed to imagine that Microsoft, NVIDIA, Facebook, Google, and Alibaba are all staffed with idiots, or all caught up in irrational exuberance? Are we supposed to watch generation costs march down and outcomes improve and still think "yeah, this is just like Pets.com"? Are we supposed to yield to vague and suggestive gestures toward, e.g., the dot-com boom, as though working with agents were the same thing as investing in a specific internet company ca. 1998? Are we supposed to take from that analogy that an engineer who said "no thanks, I'll wait and see how this internet thing shakes out" in the 1990s was a real smarty to be emulated? Come on.

It's both categorically different and clearly has meaningful material force behind it.

This is a whole different interface to the computer, and even if the eventual outcome is that real engineering work happens inside tightly constrained, specialized harnesses around agents, understanding the actual interface is critical. Ironically, the meta-claim here is that good engineers will just be able to vibe out correct practice by engineering harder, instead of understanding that core interface. What will actually be needed is attention and orientation to the concerns that people in the space care about.

I don't want to dunk on a straw man. I'd much rather not see a whole community of engineers loudly pat each other on the back for not learning about something.
