Hacker News

syndacks · today at 2:56 AM · 12 replies

I can’t get over the range of sentiment on LLMs. HN leans snake oil, X leans “we’re all cooked” — can it possibly be both? How do other folks make sense of this? I’m not asking anyone to pick a side; rather, I’m trying to understand the range. Does the range itself lead you to believe X over Y?


Replies

johnfn · today at 3:41 AM

I believe the spikiness in response is because AI itself is spiky: it’s incredibly good at some classes of tasks and remarkably poor at others. People who use it on the spikes are genuinely amazed at how good it is. That amazement does nothing but grate on the people who use it in the troughs, who become increasingly annoyed that everyone seems to be losing their mind over something that can’t even do (whatever).

sph · today at 2:07 PM

The truth lies in the middle. Yes, LLMs are an incredible piece of technology, and yes, we are cooked, because once again technologists and VCs have neither an understanding of nor an interest in the long-term societal ramifications of their technology.

We are only now starting to agree that social media has had disastrous effects that have not fully manifested yet, and in the same breath we accept a piece of technology that promises to replace large parts of society with machines controlled by a few megacorps — and we collectively shrug: “eh, we’re gonna be alright.” Until recently, the stated goal was literally to create advanced super-intelligence, with the same nonchalance with which one releases a new JavaScript framework unto the world.

I find it utterly maddening how divorced STEM people have become from the philosophical and ethical implications of their work. I blame academia and the education system for creating this massive blind spot, and it is most apparent in echo chambers like HN, which is mostly composed of Western-educated programmers with degrees in computer science. At least on X you get, among the lunatics, people who have read more than just books on algorithms and startups.

coffeefirst · today at 5:08 AM

Well, this is the internet. Arguing about everything is its favorite pastime.

But generally, yes. I think back to Mongo/Node/metaverse/blockchain/IDEs/tablets: pretty much everything has had its boosters and skeptics. This is just more... intense.

Anyway, I've decided to believe my own eyes. The crowds say a lot of things, but you can try most of it yourself and see what it can and can't do. I make a point of comparing notes with competent people who have also spent the time trying things. What's interesting is that most of their findings are compatible with mine, including those from folks who don't work in tech.

Oh, and one thing is for sure: shoving this technology into every single application imaginable is a good way to lose friends and alienate users.

nstart · today at 5:19 AM

The problem with X is that so many people with no verifiable expertise are super loud in shouting "$INDUSTRY is cooked!!" every time a new model releases. It's exhausting and untrue. The kind of video generation we see might nail realism, but if you want to use it to create something meaningful — which involves solving a ton of problems and making difficult choices in order to express an idea — you run into the limits of the easy work pretty quickly. It's insulting, then, for professionals to watch manga PFPs on X put some slop together and declare "movie industry is cooked!" It betrays a lack of understanding of what it takes to make something good, and it gives off a vibe of "the loud ones are just trying to force this objectively meh-by-default thing to happen."

The other day there was that dude loudly arguing about some code they wrote/converted even after a woman with significant expertise in the topic pointed out their errors.

Gen AI has its promise. But when you look at the lack of ethics in the industry, the cacophony of non-experts screaming "this time it's really doom," and the weariness/wariness that set in during the crypto cycle, it's natural that people are going to call it snake oil.

That said, I think the more accurate representation is that HN as a whole is calling the hype snake oil. There's very little question anymore that the tools are capable of advanced things. But there is annoyance at proclamations that they are beyond what they really are at the moment: an expertise-plus-motivation multiplier for deterministic areas of work. On its current trend (which could change wildly in 2026), it's not replacing that facet any time soon — not until it starts training itself, I think. Could be famous last words.

llmslave2 · today at 4:43 AM

Because there is a wide range of what people consider good. If you look at what the people on X consider to be good, it's not very surprising.

PeterHolzwarth · today at 6:57 AM

I think it may be all summed up by Roy Amara's observation that "We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run."

thisoneisreal · today at 3:34 AM

My take (no more informed than anyone else's) is that the range indicates this is a complex phenomenon that people are still making sense of. My suspicion is that something like the following is going on:

1. LLMs can do some truly impressive things, like taking natural language instructions and producing compiling, functional code as output. This experience is what turns some people into cheerleaders.

2. Other engineers see that in real production systems, LLMs lack sufficient background and domain knowledge to iterate effectively. They still produce output, but it's verbose and essentially misses the point of the desired change.

3. LLMs can also be used by people who are not knowledgeable to "fake it" and produce huge amounts of output that is basically beside-the-point bullshit. This makes those same senior folks very, very resentful, because it wastes a huge amount of their time. This isn't really the fault of the tool, but it's a common way the tool gets used, and so the tool gets tarnished by association.

4. There is a ridiculous amount of complexity in some of the tools and workflows people are trying to invent, some of which is of questionable value. So, aside from the tools themselves, people are skeptical of those trying to become thought leaders in this space and of the sort of wild hacks they're coming up with.

5. There are real macro questions about whether these tools can be made economical to justify whatever value they do produce, and broader questions about their net impact on society.

6. Last but not least, these tools poke at the edges of "intelligence," the crown jewel of our species and also a big source of status for many people in the engineering community. It's natural that we're a little sensitive about the prospect of anything that might devalue or democratize the concept.

That's my take for what it's worth. It's a complex phenomenon that touches all of these threads, so not only do you see a bunch of different opinions, but the same person might feel bullish about one aspect and bearish about another.

zahlman · today at 2:53 AM

I'm not really convinced that either place leans heavily toward anything; it depends on which thread you're in, etc.

It's polarizing because it represents a more radical shift in expected workflows. Seeing that range of opinions doesn't really give me a reason to update, no. I'm evaluating based on what makes sense when I hear it.

xboxnolife · today at 7:18 AM

From my perspective, both reactions show HN's and Twitter's usual biases. I view HN as generally leaning toward "new things suck; nothing ever changes," and Twitter as generally leaning toward "things suck, and everything is getting worse." Those align with "snake oil" and "we're all cooked," respectively.

sanderjd · today at 8:05 AM

As usual, somewhere in between!

Madmallard · today at 6:52 AM

I use them daily and I actively lose progress on complex problems and save time on simple problems.