This is pretty easy to solve. If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present. If the user decides what they see, you aren't, à la social media 1.0.
So the user opens the app - what is the first video you show them? How does 'the user decide' from the millions upon millions of videos there are?
If the user can search, like on YouTube, then how do you rank the results? That's also an algorithm.
It isn't pretty easy to solve at all.
The conversation has iterated a couple of times, and one point that people (on this site at least) are stuck on is "well, however you rank things—latest, most popular—you’ll need to use some kind of algorithm, maybe quicksort." This isn’t what the general public or politicians mean when they say "an algorithm," but it does make something of a point: what exactly the general public and politicians mean when they say that is a bit ambiguous.
I think the EU has fully digested this point, and is focusing on the “addictive design” phrase instead, for good reason. It makes it obvious that the problem is a bit fuzzy and related to the behaviors induced, not some cut-and-dry algorithmic thing.
This is one of those things that don’t translate to legal reality very well, as then you have to define “what is an algorithm”.
Is adding advertisements an algorithm?
Is including likes an algorithm?
Is automatically starting the next video after a previous one has finished an algorithm?
Is infinite scroll an algorithm?
Etc
Even easier solution: mandate that all harmful content be distributed with the evil bit set in all IP packets. Security: solved.
This would be extremely hard to put into legislation that wouldn't affect just about every single site on the internet currently.
And when does the user decide? Must a platform do nothing to stymie spam, or even illegal content, to qualify as impartial?
I suppose the answer could be that only platforms that do indeed allow spam or worse are impartial, but that is a tricky position to be in.
Define "algorithm" then. I would argue that "sort by date, excluding stuff the user already watched" is already an algorithm.
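To make that concrete, here is a minimal sketch of exactly that "neutral" feed (the `Video` type and its fields are hypothetical, just for illustration). Even this trivial version makes ranking and filtering decisions:

```python
# A sketch showing that "sort by date, excluding stuff the user
# already watched" is itself an algorithm: it filters and ranks,
# which are editorial decisions. Video and its fields are hypothetical.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Video:
    id: str
    uploaded: datetime

def chronological_feed(videos, watched_ids):
    """Return unwatched videos, newest first."""
    unwatched = [v for v in videos if v.id not in watched_ids]
    return sorted(unwatched, key=lambda v: v.uploaded, reverse=True)
```

Any legal definition of "algorithm" would have to explain why this code is exempt but a recommender isn't.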
This exactly. I find it perplexing that social media companies get to make decisions about what people see but then also get to pretend that they are just a neutral communication medium. They're clearly not.
This is a bit of a systems difference. Under a French civil-law system, you would write laws to regulate the harms away. Under English common law, liability court cases about the harm would lead to precedents and then to common law derived from them. Though I'm not an expert on this.
You'll need to solve the dark pattern where a new account opens on a blank page with a box saying "Would you like us to suggest what you watch here?"
It’s so elegant that there’s zero chance the EU will do it since this is all performative for them
People have argued that by censoring what users can say, these platforms made themselves editors. If it's not flat-out illegal, I don't see why anyone should waste any time trying to police the internet; it's a fool's errand. I've had Facebook's AI ding me for posting literal memes that, out of context, sound ridiculous.
How does this specific horrible take rank so highly on HN whenever something adjacent to big tech gets posted? "Impartial common carrier" is not even an extant legal concept.
It's been argued to death already, I just have to express shock that I'm still seeing this non-starter constantly here.
Back in my day, they used to be called social networks
Alternative suggestion: Force them to open up the service and allow third party clients. Take Art. 20 GDPR "Right to data portability" and extend it to public content.
Do you really think a couple of algorithm changes are all that's needed to turn social media into something that won't have a significant negative impact on the average child exposed to it?
"This" - you mean engagement optimization? I think it would be different content. I don't know how much liability matters; people spend all day watching Netflix too, and it is "liable."
Ironically, I'm only reading this kind of lowbrow take because people upvote it, not because it makes any sense.
So...repeal Section 230? I like it!
> If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present
Hacker News is a site that presents data by algorithm. Under your definition, Hacker News goes away, too.
A more accurate framing would be that they’re going after personalized recommendation algorithms. It’s not obvious that offering a recommendation algorithm would mean that the site is no longer an impartial common carrier.