For me, the distinction is control. If I'm filtering out things I don't like, I'm in control. If the system is filtering out items or promoting items, I think it's fair for it to take on more responsibility.
A system doesn't keep your feed full because it wants your eyes; it does it for money. When it chooses what goes into the feed, it should gain increased liability for what comes out. That's the risk it takes on in exchange for more money. If the money isn't worth it, don't recommend.
I enjoyed the internet in the beforetimes. Recommendations were limited to "this is objectively related," "this is new," "this is upvoted," "this is by someone you follow or someone they follow," or "this is randomly chosen." I still feel there is some liability there, but it's less than when it changes to "this is something we have determined we should show you based on your personal past behavior." Liking a category yourself feels different from having the meta-categories picked for you, especially when those meta-categories include things you would never opt in to, like doomscroll material.
I like some of the stuff I get algorithmically. I never would have searched for a soul cover of Slim Shady, but I'm glad I found it. And I'm glad I found knot tying videos. I think there is space for fancy feeds, but I think they should come closer to being publishers. This _will_ depress content throughput if everything has to be monitored, which will change the economics, and maybe that means some businesses can't exist as they do today. I'd likely pay a subscription to a LearnTok that had curated, quality material.