So perhaps it's time to standardize that.
I'm not entirely sure why people think more standards are the way forward. The scrapers apparently don't listen to the already-established standards. What makes one think they would suddenly start if we add another one or two?
I'm in favor of /.well-known/[ai|llm].txt, or even a JSON or (gasp!) XML variant.
Or even /.well-known/ai/$PLATFORM.ext which would have the instructions.
Could even be "bootstrapped" from /robots.txt
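To make the idea concrete, here's a rough sketch of what that bootstrap could look like. All of the paths, field names, and the `AI-Policy` directive here are hypothetical, not part of any existing standard:

```
# /robots.txt — hypothetical pointer to the AI policy files
User-agent: *
AI-Policy: /.well-known/ai/policy.json

# /.well-known/ai/policy.json — hypothetical per-platform instructions
{
  "default": "disallow",
  "platforms": {
    "example-crawler": {
      "allow": ["/blog/"],
      "training": false
    }
  }
}
```

The per-platform file variant (/.well-known/ai/$PLATFORM.ext) would just split that "platforms" object into one file per crawler. Of course, as noted above, none of this helps if the scrapers ignore it.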