Hacker News

Aardwolf · yesterday at 7:50 PM

If true, this would mean fewer websites with genuine content from the "old" internet will show up (since many personal websites don't have a robots.txt), while the SEO-optimized content farms that of course do put one up will...


Replies

shevy-java · yesterday at 8:02 PM

It also fits Google's plan to create a surrogate web.

- AI was the first step (or actually, among the first five steps or so). CHECK.
- Google search has already been ruined. CHECK.
- Now robots.txt is used to weed out "old" websites. CHECK.

They do too much evil. But it is also our fault, because we became far too dependent on these mega-corporations.