A take that I'm not seeing in all the "LLM scrapers are heading to our site, run for your lives!" threads is this:
Why can't people harden their software with guards? Proper DDoS protection? Better caching? Rewrite the hot paths in C, Rust, Zig, Go, Haskell etc.?
It strikes me as very odd, the atmosphere of these threads. So much doom and gloom. If my site were hit by an LLM scraper I'd be like "oh, it's on!", with a big smile, and I'd get to work right away. And I'd have that work approved, because I'd use the occasion to convince the executives of the need. And I'd have tons of fun.
Can somebody offer a take on why we, the forefront of the tech sector, are just surrendering almost without firing a single shot?
Because our sites are built on layers of abstraction and terrible design, which leads to individual requests eating serious server resources. If we hosted everything "well", you could serve on the order of 10-20k req/s per CPU core, but we don't.
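As a rough sketch of what serving the hot path "well" can look like (purely illustrative, assuming the page can be pre-rendered and held in memory; the req/s figures above are the commenter's estimate, not a benchmark of this snippet), a bare Go handler skips the usual stack of layers entirely:

    package main

    import (
        "log"
        "net/http"
    )

    // Pre-rendered page kept in memory: no template rendering, no ORM,
    // no database round trip anywhere on the hot path.
    var cachedPage = []byte("<html><body><h1>Hello</h1></body></html>")

    func main() {
        http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
            // Let a CDN or reverse proxy in front cache the response too,
            // so repeat scraper hits never reach the origin at all.
            w.Header().Set("Cache-Control", "public, max-age=300")
            w.Header().Set("Content-Type", "text/html; charset=utf-8")
            w.Write(cachedPage)
        })
        log.Fatal(http.ListenAndServe(":8080", nil))
    }

The point of the Cache-Control header is that most of the scraper traffic can then be absorbed by whatever sits in front of the origin, which is usually cheaper than making the origin itself faster.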