Hacker News

cortesoft · yesterday at 5:59 PM · 8 replies

While I agree with the premise, I do wonder how you can write a law that would stop the behavior we want to stop without hurting beneficial features or allowing the law to be too easily bypassed.

How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?


Replies

ryandrake · yesterday at 7:37 PM

> How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?

I don't know how you'd write it into law either, but if you're in a meeting at your tech company and the product owner or tech lead uses language like "We need to get users to do...", "We need to incentivize...", or "It should be easy to do X and hard to do Y...", then do whatever is in your power to steer away from it or stop it. You're not really building a product users want; you're pushing a behavior-modification scheme onto users.

akersten · yesterday at 6:46 PM

> How do you describe in a legal way the difference between a useful feature people want and an addictive feature they don’t want?

For laws like this it always boils down to "I'll know it when I see it," which is such a shockingly poor way to write legislation that I'm flabbergasted it doesn't immediately fail any amount of rudimentary scrutiny, not to mention the latitude it grants for selective enforcement. It's basically Washington asking (through the Economist) for a leash on the platforms that host its critics, one it can yank any time the population gets too rowdy, with the convenient justification that the algorithm is too good and our attention spans are in danger or whatever.

conductr · yesterday at 6:37 PM

Agree. My first thought is that in the early days most people didn't even want to start using PCs for work; businesses generally had to mandate it. I imagine many people are facing the same thing today with AI.

traderj0e · yesterday at 7:19 PM

One way is intent. If a company's internal communications show that it is intentionally making a feature addictive, or worse, that it knows the feature causes harm, you have the smoking gun. This of course doesn't catch all the abuse, but it at least makes the abuse much harder to coordinate down an entire reporting chain. They have to get really good at winking.

One famous case was Apple suing Samsung over patents. Hard to prove until internal comms surfaced showing intent to copy the iPhone.

Seattle3503 · yesterday at 10:47 PM

You create an agency and give it a mandate that requires it to balance concerns.

general1465 · yesterday at 6:45 PM

Very simple: force companies into data interoperability, which would let users move to a competitor without any data loss. For example, nobody actually cares that GitHub is constantly down, because you can move your repos to a different git provider or to your own server.

y0eswddl · yesterday at 7:41 PM

Dark patterns are pretty well documented and understood at this point; I don't think identifying them is all that hard.

Infinite scroll is one obvious one, as is forcing algorithmic feeds of accounts we don't follow.

thaumasiotes · yesterday at 6:16 PM

Well, you could look to the gambling market for inspiration and let people voluntarily sign up for a blacklist on that feature.

That would be a lot of extra work for the platforms, but I think the results would be interesting. It amounts to legislating that certain features have to be optional and configurable.
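As a minimal sketch of what "optional and configurable" could mean in code, here is a gambling-style self-exclusion registry gating a feature per user. Every name here (`SelfExclusionRegistry`, `opt_out`, `is_enabled`, the feature strings) is hypothetical, not any platform's real API; the point is only that the check is a one-line gate at serve time.

```python
# Hypothetical sketch of a voluntary self-exclusion list, modeled on
# gambling self-exclusion registries: users blacklist themselves from a
# feature, and the platform must check the registry before serving it.
from dataclasses import dataclass, field

@dataclass
class SelfExclusionRegistry:
    # feature name -> set of user ids who have opted out of that feature
    _exclusions: dict[str, set[str]] = field(default_factory=dict)

    def opt_out(self, user_id: str, feature: str) -> None:
        """User voluntarily blacklists themselves from a feature."""
        self._exclusions.setdefault(feature, set()).add(user_id)

    def is_enabled(self, user_id: str, feature: str) -> bool:
        """The gate a platform would check before serving the feature."""
        return user_id not in self._exclusions.get(feature, set())

# Example: a user self-excludes from infinite scroll but keeps search.
registry = SelfExclusionRegistry()
registry.opt_out("alice", "infinite_scroll")
print(registry.is_enabled("alice", "infinite_scroll"))  # False
print(registry.is_enabled("alice", "search"))           # True
```

The real regulatory questions (can you opt back in, how long is the cooling-off period, who audits that platforms honor the gate) live outside this data structure, which is where most of the "extra work for the platforms" would be.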