
rubiquity · yesterday at 3:16 AM

The scrapers should use some discretion; there are some rather obvious optimizations. Content that hasn't changed recently is less likely to change in the future, so it can be revisited less often.
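The optimization hinted at here is adaptive revisit scheduling: lengthen the interval between fetches each time a page comes back unchanged, and shorten it when it changes. A minimal sketch (the function name and interval bounds are made up for illustration):

```python
from datetime import timedelta

def next_revisit(interval: timedelta, changed: bool,
                 min_interval: timedelta = timedelta(hours=1),
                 max_interval: timedelta = timedelta(days=30)) -> timedelta:
    """Adaptive revisit interval: back off on unchanged content,
    tighten when the content changes, clamped to [min, max]."""
    if changed:
        new = interval / 2   # page changed: check more often
    else:
        new = interval * 2   # page is stale: check less often
    return max(min_interval, min(new, max_interval))
```

In practice a polite crawler would pair this with conditional requests (`If-None-Match` / `If-Modified-Since`), so that rechecking an unchanged page costs a `304 Not Modified` rather than a full download.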


Replies

JohnTHaller · yesterday at 3:59 AM

They don't care. That's why they ignore robots.txt and rotate their user agents when you specifically block them.
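For context, the two blocking mechanisms being evaded look roughly like this. robots.txt is purely advisory, and a server-side user-agent rule (nginx syntax shown as an example; the bot name is illustrative) only works until the crawler changes its string:

```
# robots.txt — advisory; a crawler can simply ignore it
User-agent: GPTBot
Disallow: /
```

```
# nginx — hard block on the User-Agent header; defeated by rotating the string
if ($http_user_agent ~* "GPTBot") {
    return 403;
}
```

This is why user-agent rotation renders both layers ineffective, leaving operators with IP-range blocks or behavioral detection.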