I feel like this title is misleading: all this tool does is make an HTTP request to the site and have an LLM summarize its content. It doesn't just "cut the crap", it cuts everything and boils it down to AI slop. Point it at a high-quality scientific article and you'll see that it doesn't just cut "crap" but any information that might be valuable.
I get that this can be useful for some sites; I've used Kagi Summarizer (https://kagi.com/summarizer) in the past, which does basically the same thing. But the solution to AI slop isn't to turn it into shorter AI slop. The better "solution" would be to avoid AI slop altogether and to block SEO-optimized slop websites from showing up wherever possible.
It's highly unethical, in my opinion, not to disclose that the tool is summarizing via an LLM. In fact, under the wrong circumstances it may not only fail to do what the title promises but do the opposite: add hallucinations or other AI-generated garbage.