robots.txt is the most basic access restriction there is, and this tool doesn't even read it, while also presenting itself as a human browser[0]. That makes it squarely about bypassing access restrictions.
[0]: https://github.com/lightfeed/extractor/blob/d11060269e65459e...
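For contrast, honoring robots.txt takes only a few lines with Python's standard library. A minimal sketch, assuming a hypothetical bot name and rules (not taken from the linked repo):

```python
from urllib import robotparser

# Hypothetical robots.txt rules for illustration; a real scraper would
# fetch https://example.com/robots.txt with rp.set_url(...) and rp.read().
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A well-behaved scraper checks can_fetch() before requesting a URL,
# and identifies itself honestly instead of spoofing a browser UA.
print(rp.can_fetch("MyScraperBot/1.0", "https://example.com/private/page"))  # False
print(rp.can_fetch("MyScraperBot/1.0", "https://example.com/public/page"))   # True
```

The point is that checking is nearly free; skipping it while spoofing a human user agent is a deliberate choice.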