Hacker News

Leomuck today at 3:51 PM

So basically more ways of trying to make people buy things, do things, think things than before? I feel like our whole world more and more revolves around manipulation and the absence of truth and discourse.

Then again, I do think LLMs are an incredible technological achievement. The issue is not so much what they do or that they exist, but how they are utilized. Right now, they are utilized to further the class divide between rich and poor.

Who are we to trust in the future? Not big companies, not the state, not LLMs. Time to organize around groups and collectives that we know we can trust and that we know have our wellbeing in mind.


Replies

groundzeros2015 today at 4:16 PM

> The issue is not so much what they do or that they exist, but how they are utilized

This is exactly how we got here though. Technology is not passive. It changes incentives, procedures, and ideas, and shapes the world. If we don't structurally limit what it is used for and how, then we are not in control, no matter what our personal choices are.

drzaiusx11 today at 6:36 PM

The majority of human history has been written by the ruling class of the day. Transparency only seems to follow in the wake of their inevitable fall, usually at great cost in retrospective research via the oft thankless unraveling of threads of truth from their more copious fictions. Much like the machines we construct in our likeness, we too seem to get stuck in endless regressive cycles.

Folks in the "now" have always had a tendency to cling to their fictions as if they were truth, for whatever reason: nationalist exceptionalism, racial superiority, religions rooted in "othering", etc. Humans seem to have an innate desire to fool themselves and trust in things they should not. Perhaps it's simply an existential coping mechanism for living in a cold, unforgiving reality. We seek the comfort of lies.

Organizing around groups of trust tends to lead to factionalism and conflict. Knowing and trusting are sadly very different things in our species.

SoftTalker today at 6:08 PM

> I feel like our whole world more and more revolves around manipulation

Hate to break it to you, but it's always been this way, and it was even easier in the past, when information was so much more expensive to distribute.

mentalgear today at 6:32 PM

> Time to organize around groups and collectives that we know we can trust

I’ve had the same thoughts, but if you look deeper, it all circles back to what we already had: (open, transparent) public institutions, society, and government by the people. The foundation wasn't the problem; the environment was.

Along the way, social media noise, engagement optimisation, and Kardashian-style "entertainment news" infecting real news created an attention economy where, no matter how scandalous you are, attention can be minted into dollars. That is what polluted our infosphere and led to the lack of trust.

Now, nobody trusts those public entities any more - sometimes due to state-actor or ad-tech disinformation, and sometimes for good reason, as when a poisoned public allowed 80s-telemarketer-style political weirdos and their cronies to take over public administration.

nalekberov today at 5:26 PM

> Right now, they are utilized to further the class divide between rich and poor.

Ironically, this was the main reason LLMs were introduced in the first place: not to benefit the poor, but to widen the gap between the rich and the poor.

intended today at 6:57 PM

Our society, pre-internet, built systems to manage trust. The conditions that allowed those systems to exist (the speed of data transmission, the ratio of content generation to verification, the ability to shape consensus) have changed.

You are ringing the clarion call for community and cooperation, and it will not work. Not because people don’t want community or the better things, but because incentives make the world go round.

The choice between making some money at the cost of polluting the information commons is no choice at all. That degradation of the commons means no one can escape. No community you form, no group you build, dodges the fallout when someone decides to set fire to shared infrastructure.

We are moving into the dark forest era of the information economy. As models improve, inference costs drop, and capacity increases, the primary organism creating content online will be the bot.

Instead of building communities of people, build collectives based on rules of engagement. Participants - be they bots or humans - must follow prescribed rules of conflict and debate.

That way it doesn’t matter if you are talking to a machine or a person. All that matters is that the rules were followed.

LogicFailsMe today at 5:33 PM

Local models, powerful consumer hardware, and an informed populace that doesn't hate STEM - but that's not good for shareholder value, so you get expensive everything everywhere all at once instead. And if you dare question the mindset of hating on STEM while being addicted to its fruits, that just means you're another one of those maximally SV-aligned sociopaths, so why bother? Evolve and let the chips fall where they may, because I don't see any other options playing out in an idiocracy craving strong, confidently wrong leadership.

sassymuffinz today at 4:10 PM

Self-inflating, nipple-shaped balloons that generate their own lift without any helium would be an incredible achievement, but that doesn't mean they'd be useful beyond being novel. Chatbots are ultimately just predictive text on steroids, and only complete fools would base their business, or an entire economy, around them.
