> You say it like it's a bad thing. But ideally this also brings clarity & purpose to your own API design too! Ideally there is conjunct purpose! And perhaps shared mechanism!
I update my website multiple times a day. I want as much decoupling as possible. Every time I update my internal API, I don't want to also have to update this WebMCP config.
Basically, I have to put in the work of setting up WebMCP so that Google can have a better agent that disintermediates my site.
> Trying to keep users from seeing what data they want is, generally, not something I favor.
This is literally the whole cat-and-mouse game of scraping and web automation: sites clearly want to protect their moats and differentiators. LinkedIn/X/Google literally sue people for scraping; I don't think they're going to package all this data as a WebMCP endpoint for easy scraping.
Regardless of your preferences or ideals, the ecosystem is not going to change overnight because of hype about agents.
> Your site running its own agent is going to take a lot of resources
A lot of sites already expose chatbots; it's trivial to rate-limit and serve a CAPTCHA on abuse detection.
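The rate-limiting half of that is a well-worn pattern. A minimal sketch of a per-client token bucket, where the class name, capacity, and refill rate are all illustrative assumptions rather than any site's real configuration:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter (illustrative sketch)."""

    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity              # max burst size
        self.refill_per_sec = refill_per_sec  # sustained rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(
            self.capacity,
            self.tokens + (now - self.last) * self.refill_per_sec,
        )
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # over the limit: respond with a 429 or a CAPTCHA


bucket = TokenBucket(capacity=5, refill_per_sec=1.0)
results = [bucket.allow() for _ in range(10)]
# The first burst of requests passes; rapid follow-ups get throttled.
```

In practice you'd key one bucket per client (IP, session, API key) and hand throttled requests a CAPTCHA challenge, which is exactly the kind of friction that makes "just expose it all as an endpoint" less of a free lunch for scrapers.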