Hacker News

dazc · yesterday at 8:23 PM

I've witnessed a few catastrophes resulting from mistakes in robots.txt, especially when 'Disallow' is used in an attempt to prevent pages from being indexed. Disallow only blocks crawling, not indexing: a disallowed URL can still be indexed from external links, and since crawlers never fetch the page, they never see its noindex directive either.
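A minimal sketch of how a crawler interprets a Disallow rule, using Python's stdlib `urllib.robotparser` (the rules and URLs here are hypothetical examples): the rule blocks fetching, which is exactly why any on-page noindex signal on a disallowed URL goes unseen.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules illustrating the pitfall: Disallow stops crawling,
# it does not stop a URL from being indexed via external links.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A compliant crawler will not fetch anything under /private/ ...
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
# ... but everything else is fair game.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```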

I don't know whether the claims made here are true, but there really isn't any reason not to serve a valid robots.txt. One could argue that if you want Google to respect robots.txt, then not having one should result in Googlebot not crawling any further.