Because AI scraping is everywhere, flooding sites with useless traffic. It's not ideal, but it's the best people can do at the moment.
What kind of blog gets flooded by what, 10–100 req/s at most? Somewhere along the line we seem to have forgotten how to deploy and run infrastructure on the internet, if a few basic scrapers manage to take down your website.
"It's not ideal" is an understatement. I have to solve stupid CAPTCHAs for about half my Google searches.