robots.txt isn't just an on/off switch. You can also set crawl-rate limits there, which crawlers may choose to respect, and the big companies generally do respect them, because it's in their interest to keep their crawling costs down and not send more requests than they need to.
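For reference, a minimal sketch of what that looks like (note that Crawl-delay is a de facto extension rather than part of the original robots.txt standard; crawlers like Bing and Yandex have honored it, while Google ignores it and manages crawl rate separately, so treat the value as a request, not a guarantee):

    User-agent: *
    # Ask compliant crawlers to wait roughly 10 seconds between requests
    Crawl-delay: 10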
However, these smaller companies are doing ridiculous things like scraping the same site many thousands of times a day, far more often than the site's content actually changes.