Hacker News

tolerance · yesterday at 7:53 PM

@conception (root): "If you present data by algorithm, you are no longer an impartial common carrier and are liable for the content you present"

@Aurornis: "Hacker News is a site that presents data by algorithm. Under your definition, Hacker News goes away, too."

@Aurornis (cont'd): "When every site with “an algorithm” is liable for content posted, nobody is going to allow you to post something. It’s back to only reading content produced and curated by companies for us. Total own-goal for the individual internet user."

@Aurornis (cont'd): "If a site becomes liable for content posted, you cannot allow users to post content without having the site review and take responsibility for every comment and every post."

@tencentshill: "The algorithm is not personalized. It's the same for every user. No issue there..."

Me: "But still an algorithm".

@tencentshill: "Yes it's still an algorithm. Cable TV programming is another example."

Me: "...how do broadcasters select/schedule their programming?"

***

If the "broader statement" that you're referring to is @conception's, then I agree with @Aurornis that it would have negative effects on how websites like Hacker News operate. Failing to distinguish personalized recommendation systems from depersonalized ones, and proposing regulation that treats them the same, is an indiscriminate approach.
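HN itself is a concrete example of a depersonalized algorithm: every user sees the same front page. A minimal sketch, using the widely cited (unofficial) "gravity" approximation of HN-style ranking — the function name and the exact constants here are illustrative assumptions, not HN's actual implementation:

```python
def rank(points: int, age_hours: float, gravity: float = 1.8) -> float:
    """Depersonalized score: depends only on the item's votes and age,
    never on who is looking. Older items sink; upvotes push items up."""
    return (points - 1) / (age_hours + 2) ** gravity

# The same inputs produce the same ordering for every user:
front_page = sorted(
    [("story_a", 100, 1.0), ("story_b", 100, 10.0), ("story_c", 10, 1.0)],
    key=lambda s: rank(s[1], s[2]),
    reverse=True,
)
```

The point of the sketch is that no per-user signal enters the function, which is precisely what separates this class of "algorithm" from personalized recommenders.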

The speculated consequence is that platforms (e.g., Hacker News) will no longer want to assume liability for the content that users share. [0] If this were to happen, only a few platforms would remain, at least on the clear/open web. The general online experience would become something like a pastiche of 1960s broadcast television, with three or four providers authorized to distribute media.

Given the direction democracy is trending across the world, that would mean state-run or state-approved media. Alternatively, all online communities would have to organize and operate like more traditional institutions, as this cycling community in London is doing: https://www.lfgss.com/conversations/401988/.

[0]: Some parts of this community already suspect that moderation conveniently buries controversial or subversive submissions. See this one from today! https://news.ycombinator.com/item?id=48110927


Replies

fc417fc802 · yesterday at 10:10 PM

Legislation needs to be clear and unambiguous, sure. Nonetheless, no one had chronological sort or raw vote counts or whatever else in mind when they used the term "algorithm" here, so pretending they did is obtuse and pedantic. Misinterpreting the other party's position does not typically make for enlightening or insightful conversation.

Cable TV is an example of something that no one is objecting to. The EU is targeting specific practices (particularly addictive UX patterns). Some people (myself included) would also like to see algorithms that provide personalized (on the individual or small cohort level) output banned. HN is clearly not that.

I think there's an interesting discussion to be had about where exactly the line is between a general class and a small cohort. Certainly applying more than a few general classes simultaneously can quickly land you back in near-individual territory.
