It's because of hysteria. The same thing happened with TV. A girl in England killed herself after she herself had been seeking out suicide content; the algo obviously suggests it back to her, and because of that we basically got the Online Safety Act. Who's going to tell grieving parents the kid did it to herself?
> the algo obviously suggests it back to her
Pro-suicide content has been a thing since the days of USENET. It's been a (lesser) problem on social media since the start. Is it really just that this one case crossed the hysteria threshold?
But yes, if the algorithm is suggesting pro-suicide content, then the developers are morally, if not legally, liable for that and should expect some consequences. I note that one of the few taboos maintained by the otherwise grossly irresponsible UK media is not reporting on suicide (because coverage is known to act as a trigger). You might see "famous musician died suddenly at a young age" and have to connect the dots yourself.