tirreno (1) guy here.
Our open-source system can block IP addresses based on rules triggered by specific behavior.
Can you elaborate on what exact type of crawlers you would like to block? For example, a leaky bucket allowing a certain number of requests per minute?
The article is about AI web crawlers. How can your tool help and how would one set it up for this specific context?
I believe there is a slight misunderstanding regarding the role of 'AI crawlers'.
Bad crawlers have been around since the very beginning. Some look for known vulnerabilities, others scrape content for third-party services. Most of them spoof their UAs to pass as legitimate bots.
Such traffic accounts for roughly 30–50% of requests on a typical website.