>Honors robots.txt
Is it possible to ignore robots.txt when the crawl is triggered by a human?
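For context, a crawler that made robots.txt enforcement conditional might look something like the sketch below. The `human_triggered` flag, the `allowed_to_fetch` helper, and the policy itself are hypothetical illustrations, not any particular crawler's API:

```python
import urllib.robotparser
from urllib.parse import urlparse


def allowed_to_fetch(url: str, user_agent: str, human_triggered: bool) -> bool:
    """Return True if the crawler may fetch `url`.

    Hypothetical policy: fetches explicitly triggered by a human
    bypass the robots.txt check; automated crawls still honor it.
    """
    if human_triggered:
        return True
    parts = urlparse(url)
    rp = urllib.robotparser.RobotFileParser()
    # robots.txt is always served from the site root
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    return rp.can_fetch(user_agent, url)
```

Whether doing this is acceptable is a separate policy question; many operators treat a single human-initiated fetch differently from bulk automated crawling.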