I'm not a web developer, but I've picked up some bits of knowledge here and there, mostly from troubleshooting issues I encounter while using websites.
I know there are a number of headers used to control cross-site access to websites, and the linked blog post shows archive.today's denial-of-service script sending random queries to the site's search function. Shouldn't there be a way to prevent those from running when they're requested from within a third-party site?
> I know there are a number of headers used to control cross-site access to websites
Mostly these headers are designed to prevent *reading* content across origins. Merely *sending* a request generally doesn't require anything.
(As a kind of random tidbit, this is why CSRF tokens are a thing: you can't prevent a cross-site request from being sent, so websites instead check whether you were able to *read* the token in a previous response.)
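To make the tidbit concrete, here's a minimal sketch of the read-then-echo check behind CSRF tokens. The `session` dict and function names are hypothetical, not any particular framework's API: the server embeds a random token in a page it renders, and a later state-changing request must echo that token back. A cross-site attacker can *send* the request, but can't *read* the rendered page to learn the token.

```python
# Hypothetical CSRF token sketch (not a specific framework's API).
import hmac
import secrets

def issue_token(session):
    """Store a fresh random token in the (hypothetical) server-side
    session and return it for embedding in the rendered form."""
    token = secrets.token_urlsafe(32)
    session["csrf_token"] = token
    return token

def is_valid(session, submitted_token):
    """Constant-time comparison of the submitted token against the
    one stored server-side."""
    expected = session.get("csrf_token", "")
    return hmac.compare_digest(expected, submitted_token)

session = {}
form_token = issue_token(session)
print(is_valid(session, form_token))        # token read from the page: True
print(is_valid(session, "attacker-guess"))  # forged cross-site request: False
```

The `compare_digest` call is just belt-and-braces against timing side channels; the core idea is that only code that could read the earlier response knows the token.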
This is partly historical. The rough rule is: if the request could have been made without JavaScript (a form submission, an image load), it doesn't trigger any special handling (a preflight).
You can't completely prevent the browser from sending the request; after all, it needs the response (and its CORS headers) to decide whether to block the website from reading it.
However, for non-simple requests, browsers first send a preflight request before the actual one. If the DDoS was effective because the search operation was expensive, then the blog could put search behind a non-simple request, or require a valid CSRF token before performing the search.
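A minimal sketch of the first idea, with hypothetical function and header names: requiring a custom header (here `X-Requested-With`) makes a cross-origin request non-simple, so the browser sends an `OPTIONS` preflight first. If the server never answers the preflight with permissive CORS headers, the browser never sends the real search request, while the site's own page scripts can still set the header and search normally.

```python
# Hypothetical server-side gate for an expensive search endpoint.
# A custom request header makes a cross-origin request "non-simple",
# so browsers send an OPTIONS preflight before the real request.
def should_run_search(method, headers):
    if method == "OPTIONS":
        # Preflight probe: by not approving it with CORS headers,
        # the server stops cross-origin callers before the actual
        # (expensive) request is ever sent.
        return False
    # Only run the search when the custom header is present.
    # Same-origin page scripts can set it; a plain cross-site
    # form submission or <img>/<script> load cannot.
    return headers.get("X-Requested-With") == "fetch"

print(should_run_search("GET", {"X-Requested-With": "fetch"}))  # True
print(should_run_search("GET", {}))       # simple cross-site request: False
print(should_run_search("OPTIONS", {}))   # preflight: False
```

Note this only blocks *browser*-originated abuse; a script hitting the endpoint directly can set any header it likes, so it's a DoS speed bump rather than real access control.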