Hacker News

eggy · today at 2:03 PM · 25 replies

I'm skeptical about banning design patterns just because people might overuse them. Growing up, I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling. Now we binge entire Netflix series and that's fine, but short-form video needs government intervention? The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices? If we're worried about addictive patterns, those exist everywhere—streaming platforms, social feeds, gaming, even email notifications. My concern isn't whether TikTok's format is uniquely dangerous. It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app. I'd rather see us focus on media literacy and transparency than constantly asking governments to protect us from ourselves.

You can't legislate intelligence...


Replies

wackget · today at 2:52 PM

You are not acknowledging the fact that the companies producing these addictive apps are very much doing it intentionally. They are specifically making them as engaging as possible because that's how they make money. And they have billions of dollars to sink into making their products as irresistible as possible.

The average person has zero chance against all-pervasive, ultra-manipulative, highly-engineered systems like that.

It is, quite simply, not a fair fight.

seydor · today at 6:11 PM

The best way for TikTok to respond to this is to add some "cooling down" delay between videos. The EU Commission will boast about this achievement, but effectively TikTok users will spend MORE time on the app.

mtoner23 · today at 2:16 PM

Short-form video has been a total break from previous media and social media consumption patterns. Personally, I would support a ban on algorithmic endless short-form video. It's purely toxic and bad for humanity.

Supermancho · today at 3:01 PM

> The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices?

Spoiler: There is no line. Societies (or more accurately, communities) attempt to self-regulate behaviors that have perceived net-negative effects. These perceptions change over time. There is no optimal set of standards. Historically, this has no consideration for intelligence or biology or physics (close-enough-rituals tended to replace impractical mandates).

Etheryte · today at 5:41 PM

You could make the same argument about sugary beverages, that you can't legislate intelligence, yet every country that has imposed a considerable sugar tax has seen benefits across the board. This of course omits a lot of nuance, but the main takeaway remains the same. We all have that monkey brain inside us and sometimes we need guardrails to defend against it. It's the same reason we don't allow advertising alcohol and casinos to kids, and many other similar examples. (Or at least we don't allow it where I'm from, maybe the laws are different where you're from.)

gtowey · today at 5:14 PM

It's not about banning design patterns. It's about removing the harmful results they produce.

Can you imagine if gambling were allowed to be marketed to children? Especially things like slot machines. We absolutely limit the reach of those "design patterns".

GorbachevyChase · today at 5:00 PM

I don’t think the addictive argument is being made in good faith. Any platform with an infinite scroll feed and titillating content is intentionally made to be like a slot machine. Just keep swiping and maybe you’ll get that little dopamine hit. The idea that TikTok is dangerous, but Twitter, Instagram, porn, alcohol, and Doritos are fine doesn’t come across as an internally consistent argument. I think that the reality is that those who have an actual say in legislation perceive these platforms as a mechanism of social control and a weapon. Right now the weapon isn’t in the “right” hands.

hollerith · today at 6:16 PM

>I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling.

The argument against tiktok (and smartphones in general) is not that experiences above a certain threshold of compellingness are bad for you: it is that filling your waking hours with compelling experiences is bad for you.

Back when a person had to travel to a theater to have them, he was unable to have them every free minute of his day.

andrei_says_ · today at 5:22 PM

I'm skeptical about banning sales of tobacco and alcohol products to children because children may (over)use them.

Also, do we trust adults prescribed oxycodone to manage their use?

We are speaking of weaponized addiction at planetary scale.

Refreeze5224 · today at 5:33 PM

You can regulate power imbalances, though, which is what every individual faces versus a multinational with vast resources.

derektank · today at 3:22 PM

My preferred solution would be to subsidize tools that allow people to better identify and resist compulsive behaviors. Apps like Opal and Freedom that let you monitor your free time and block apps or websites you have a troubled relationship with would probably see more use if everybody were given a voucher to buy a subscription. Funding more basic research into behavioral addictions like gambling, etc. (ideally research that couldn’t be used by casinos and sports gambling apps on the other side). Helping fund the clinical trials for the next Zepbound and Ozempic.

enaaem · today at 4:19 PM

Gambling mechanics are also banned for certain ages, and in some countries for everyone. We don’t say that it’s just a game and people should just control themselves. Without going into the specifics of this case, design pattern interventions have existed for a long time and have in most cases been desirable.

turtlesdown11 · today at 6:03 PM

I'm also skeptical about banning products like opium or methamphetamine just because people might overuse them.

thisislife2 · today at 2:18 PM

The only reason the US and Europe are targeting TikTok is because they don't own the platform. Facebook and WhatsApp (owned by Meta) are responsible for so much hate politics and social unrest around the world (Facebook and Genocide: How Facebook contributed to genocide in Myanmar and why it will not be held accountable - https://systemicjustice.org/article/facebook-and-genocide-ho... ). Amazon, Google and Microsoft helped the Israelis conduct the genocide in Gaza with their AI tools (UN Calls Out Google and Amazon for Abetting Gaza Genocide - https://progressive.international/wire/2025-08-26-un-calls-o... ). But all that's OK.

kranke155 · today at 2:19 PM

You should be able to pick your own algorithm. It’s a matter of freedom of choice.

nunez · today at 5:02 PM

More and more businesses are shifting their operations and outreach to IG and TikTok, so living in society is increasingly a choice between "live under a rock" and "enter the casino and hope not to get swallowed up by the slop".

wackget · today at 2:46 PM

> It's whether we trust adults to manage their own media consumption

HA!

zbentley · today at 2:22 PM

> It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app

I think there's a wide regulatory spectrum between those extremes--one that all sorts of governments already use to regulate everything from weapons to software to antibiotics.

It's easy to cherry-pick examples where regulation failed or produced unexpected bad results. However, doing that misses the huge majority of cases where regulation succeeds at preventing harms without imposing problematic burdens on people. Those successes are hard to see because they're evidenced by bad outcomes failing to happen: things working much as they did before, or getting worse at a slower rate than they otherwise might.

It's harder to point to "nothing changed" as a win than it is to find the pissed-off minority who got denied building permits for reasons they disagree with, or the whataboutists who take bad actions by governments as evidence that regulation in unrelated areas is doomed to failure.

cvoss · today at 5:07 PM

> people might overuse them ... cliffhangers and sequels

I once heard someone try to understand pornography addiction by asking if it was comparable to a desire to eat a lot of lemon cookies. To quote Margaret Thatcher, "No. No. No."

> Where do we draw the line

Just because it's hard to find a principled place to draw the line doesn't mean we give up and draw no line. If you are OK with the government setting speed limits, then you're OK with lines drawn in ways that are intended to be sensible but are ultimately arbitrary, and which infringe on your freedom for the sake of your own good and the public good.

> trust adults

Please do not forget the children.

> You can't legislate intelligence

Your implication is that people who are addicted to TikTok or anything else are unintelligent and need to be educated. This is, frankly, an offensive way to engage in the conversation, and, worse, naive.

xp84 · today at 4:39 PM

I am just as uncomfortable with this banning of ideas, or, to look at it another way, banning a design simply because it's effective. I assume this exact same design would not be made illegal if it were terrible at increasing engagement. However, I also have to acknowledge that I already can't stand what TikTok and its ilk have done to attention spans, and how addictive they are across several generations. People just end up sitting there thumb-twitching while the algorithm pipes handpicked slop into their brains for hours a day. I really don't want a world where everything is just like this, but even more refined and effective. So it's tough to argue that we should just let these sociopaths do this to everyone.

Arguably, the best reason for the government to care is that whoever controls this algorithm, especially in a future when it’s twice as entrenched as it is today, has an unbelievably unfair advantage in influencing public opinion.

Juliate · today at 2:22 PM

> The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices?

We do it for alcohol and cigarettes already: taxes, ads & marketing restrictions, mandated health warnings.

croes · today at 2:16 PM

> You can't legislate intelligence...

That’s why we ban harmful things.

grayhatter · today at 3:47 PM

> I'm skeptical about banning design patterns just because people might overuse them.

I used to be opposed; now I'm not. I strongly believe specialization is the important niche humans have adapted to, and that it should be encouraged. Equally significant parts of human nature are trust and gullibility, and people will abuse those to give themselves an unfair advantage. If you believe lying is bad and that laws should exist to punish those who lie to gain an advantage, or if you believe that selling an endless, addictive substance should be restricted, you already agree.

There are two bars in your town, and shady forms of alcohol abound. One bar is run by someone who will always cut a patron off after they've had too many, and who goes to extreme lengths to ensure that the only alcohol they sell is EtOH. The other is run by someone who doesn't appear to give a fuck and is constantly suggesting that you should have another; some people have even gone blind.

I think a just society would allow people to specialize in their domain without also needing a PhD in the effects of alcohol poisoning, which alcohols are safe to consume, and how much.

> Growing up, I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling. Now we binge entire Netflix series and that's fine, but short-form video needs government intervention?

Yes, the dopamine feedback loop of short-form endless scrolling has a significantly different effect on the brain's reward system. And in line with not everyone needing a PhD, you also need people to be able to trust the conclusions of experts.

> The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices?

It's not as linear a distinction. We don't have to draw today the line where we stop forever; it's perfectly fine to iterate and reevaluate. Endless-scroll, large-data-source algorithms are, without a doubt, addictive. Where's the line on cigarettes, or now vapes? Surely they should be endlessly available to children, because where do you draw the line?

(It's mental health; cigarettes and alcohol are bad for physical health, but no one (rhetorically speaking) gives a shit about mental health.)

> If we're worried about addictive patterns, those exist everywhere—streaming platforms, social feeds, gaming,

I'd love to ban microtransactions and loot boxes (gambling games) for children.

> even email notifications.

Reductio ad absurdum, or perhaps you meant to make a whataboutism argument?

> My concern isn't whether TikTok's format is uniquely dangerous.

Camels and Lucky Strike are both illegal for children to buy.

> It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app.

We clearly do. Companies are taking advantage of the brain's natural dopamine system for their own gain, at the expense of the people using their applications. Mental health deserves the same prioritization and protection as physical health. I actually agree with you: banning some activity that doesn't harm others, and is only a risk to yourself, among reasonably educated adults is insanely stupid. But that's not what's happening.

> I'd rather see us focus on media literacy and transparency than constantly asking governments to protect us from ourselves.

I'd rather see companies that hold an unfair disparity of power, control, knowledge, and data be punished when they use it to gain an advantage over their consumers. I think dark patterns should be illegal and come with apocalyptic fines. I think tuning your algorithm's recommendations so you can sell more ads, or recommending divisive content because it drives engagement (again, because ads), should be heavily taxed or fined, so that the government has the funding to provide an equally effective source of information or transparency.

> You can't legislate intelligence...

You equally can't demand that everyone know exactly why every flavor of snake oil is dangerous, so you should punish those who try to pretend it's safe.

Especially when there's an executive in some part of the building trying to figure out how to get more children using it.

The distinction requiring intervention isn't that these companies exist. Intervention is required because the company has hired someone whose job is to convince children to use something they know is addictive.

wasmainiac · today at 3:17 PM

> didn't make cliffhangers and sequels any less compelling

Apples to oranges.

I can’t make meth in my basement as a precursor to some other drug and then complain that my target product had a shitty design.

Real-life experience shows that TikTok is harmfully addictive, and therefore it must be controlled to prevent negative social outcomes. It’s not rocket science; we have to be pragmatic based on real-life experience, not theory.

DaanDL · today at 3:01 PM

What an unworldly remark. So we should also not ban hard drugs, then?
