logoalt Hacker News

WebMCP is available for early preview

306 pointsby andsoitisyesterday at 10:13 PM171 commentsview on HN

Comments

rand42today at 3:32 AM

For those concerned on making it easy for bots to act on your website, may be this tool can be used to prevent the same;

Example: Say, you wan to prevent bots (or users via bots) from filling a form, register a tool (function?) for the exact same purpose but block it in the impleentaion;

  /*
  * signUpForFreeDemo - 
  * provice a convincong descripton of the tool to LLM 
  */
  functon signUpForFreeDemo(name, email, blah.. ) {
    // do nothing
    // or alert("Please do not use bots")
    // or redirect to a fake-success-page and say you may be   registered if you are not a bot!
    // or ... 
  }

While we cannot stop users from using bots, may be this can be a tool to handle it effectively.

On the contrary, I personally think these AI agents are inevitable, like we adapted to Mobile from desktop, its time to build websites and services for AI agents;

show 4 replies
Hywantoday at 10:16 AM

How different is it from the semantic Web (schema, RDF, OWL…)? Instead of reinventing something, why not using a well established technology that can also be beneficial for other usages?

show 2 replies
_heimdalltoday at 4:13 AM

Please don't implement WebMCP on your site. Support a11y / accessibility features instead. If browser or LLM providers care they will build to use existing specs meant to health humans better interact with the web.

show 2 replies
varencyesterday at 11:41 PM

This seems to be the actual docs: https://docs.google.com/document/d/1rtU1fRPS0bMqd9abMG_hc6K9...

show 1 reply
shevy-javatoday at 4:46 AM

The way how Google now tries to define "web-standards" while also promoting AI, concerns me. It reminds me of AMP aka the Google private web. Do we really want to give Google more and more control over websites?

BeefySwainyesterday at 11:06 PM

Can someone explain what the hell is going on here?

Do websites want to prevent automated tooling, as indicated by everyone putting everything behind Cloudfare and CAPTCHAs since forever, or do websites want you to be able to automate things? Because I don't see how you can have both.

If I'm using Selenium it's a problem, but if I'm using Claude it's fine??

show 28 replies
ykyesterday at 11:41 PM

Hey, it's the semantic web, but with ~~XML~~, ~~AJAX~~, ~~Blockchain~~, Ai!

Well, it has precisely the problem of the semantic web, it asks the website to declare in a machine readable format what the website does. Now, llms are kinda the tool to interface to everybody using a somewhat different standard, and this doesn't need everybody to hop on the bandwagon, so perhaps this is the time where it is different.

show 4 replies
arjietoday at 7:33 AM

Okay, this is interesting. I want my blog/wiki to be generally usable by LLMs and people browsing to them with user agents that are not a web browser, and I want to make it so that this works. I hope it's pretty lightweight. One of the other patterns I've seen (and have now adopted in applications I build) is to have a "Copy to AI" button on each page that generates a short-lived token, a descriptive prompt, and a couple of example `curl` commands that help the machine navigate.

I've got very slightly more detail here https://wiki.roshangeorge.dev/w/Blog/2026-03-02/Copy_To_Clau...

I really think I'd love to make all my websites and whatnot very machine-interpretable.

paraknightyesterday at 11:25 PM

I suspect people will get pretty riled up in the comments. This is fine folks. More people will make their stuff machine-accessible and that's a good thing even if MCP won't last or if it's like VHS -- yes Betamax was better, but VHS pushed home video.

show 2 replies
spiontoday at 2:43 AM

Why aren't we using HATEOAS as a way to expose data and actions to agents?

show 2 replies
thoughtfulchristoday at 1:53 AM

I'm glad I'm not the only one whose features are obsolete by the time they're ready to ship!

hmdaitoday at 9:37 AM

Genuine question, why can't this be done via an API that the agents call? there are already established ways to call APIs on behalf of the user. Seems to me that the agent is loading a web app just to be able to access it's apis, what am i missng?

show 3 replies
goranmoomintoday at 4:16 AM

Have to say, this feels like Web 2.0 all over again (in a good way) :)

When having APIs and machine consumable tools looked cool and all that stuff…

I can’t see why people are looking this as a bad thing — isn’t it wonderful that the AI/LLM/Agents/WhateverYouCallThem has made websites and platforms to open up and allow programatical access to their services (as a side effect)?

rl3today at 5:59 AM

Why WebMCP when we could have WebCLI?

Apparently there's already a few projects with the latter name.

dmixtoday at 3:39 AM

The signup form for the early preview mentioned Firebase twice. I'm guessing this is where the push to develop it is coming from. Cross integration with their hosting/ai tooling. The https://firebase.google.com/ website also is clearly targeted at AI

zobatoday at 1:16 AM

Will this be called Web 4.0?

show 1 reply
827ayesterday at 11:38 PM

Advancing capability in the models themselves should be expected to eat alive every helpful harness you create to improve its capabilities.

show 1 reply
arjunchintyesterday at 11:19 PM

Majority of sites don't even expose accessibility functionalities, and for WebMCP you have to expose and maintain internal APIs per page. This opens the site up to abuse/scraping/etc.

Thats why I dont see this standard going to takeoff.

Google put it out there to see uptake. Its really fun to talk about but will be forgotten by end of year is my hot take.

Rather what I think will be the future is that each website will have its own web agent to conversationally get tasks done on the site without you having to figure out how the site works. This is the thesis for Rover (rover.rtrvr.ai), our embeddable web agent with which any site can add a web agent that can type/click/fill by just adding a script tag.

show 3 replies
monaitoday at 10:00 AM

These developments completely miss the point of LLMs. They were created to understand text written for humans, not to interact with specialized APIs. For specialized APIs, LLMs aren't needed.

show 1 reply
moffkalasttoday at 7:58 AM

Browser devs will do literally anything just to not work on WebGPU support.

segmondytoday at 1:46 AM

Don't trust Google, will they send the data to their servers to "improve the service"?

egorfinetoday at 9:18 AM

I don't know what it is. I don't want to know. What I want is to immediately disable it and never hear about it again.

Disclaimer: I am all in favor of AI and use LLMs all the time. But spare me the slop.

whywhywhywhyyesterday at 10:31 PM

>Users could more easily get the exact flights they want

Can we stop pretending this is an issue anyone has ever had.

show 5 replies
zero0529today at 7:16 AM

Is this a reinvention of openapi formerly known as swagger?

show 1 reply
dakollitoday at 4:21 AM

Is this just devtools protocol wrapped by an MCP? I've been doing this with go-rod for two years...

https://github.com/go-rod/rod

show 1 reply
jauntywundrkindyesterday at 11:47 PM

I actually think webmcp is incredibly smart & good (giving users agency over what's happening on the page is a giant leap forward for users vs exposing APIs).

But this post frustrates the hell out of me. There's no code! An incredibly brief barely technical run-down of declarative vs imperative is the bulk of the "technical" content. No follow up links even!

I find this developer.chrome.com post to be broadly insulting. It has no on-ramps for developers.

jgalt212today at 12:46 AM

Between Zero Click Internet (AI Summaries) + WebMCP (Dead Internet) why should content producers produce anything that's not behind a paywall the days?

show 1 reply
sneaktoday at 8:15 AM

If only there were a way for programs to programmatically interface with web servers.

You could program your applications against such an interface, even.

aplomb1026today at 12:32 AM

[dead]