Facebook just decided that instead of loading the robots.txt for every host they intend to crawl, they'll ignore all the other robots.txt files and then access this one a million times to restore the average.
Ah yes, robots_georg.txt.