Hey, it's the semantic web, but with ~~XML~~, ~~AJAX~~, ~~Blockchain~~, AI!
Well, it has precisely the problem of the semantic web: it asks the website to declare, in a machine-readable format, what it does. But LLMs are kind of a tool for interfacing with everybody who uses a somewhat different standard, and that doesn't require everybody to hop on the bandwagon, so perhaps this is the time it's different.
I think there has to be a gradual on-ramp for things to pick up steam. You can't clear the "activation energy" the Semantic Web demanded upfront back then: setting up semantic markup, ontologies, RDF, APIs, all before anyone benefited. AI agents, by contrast, can use every website to some extent before you make any agent accommodations. From there you can take small steps to make things slightly better, see that users want it (or that it drives your sales, or whatever your site does), then take another small step, and by the end of it you have an API. Not to mention that AI agents can code up said API faster, too.
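To make the "small step" concrete, here's a minimal sketch of the kind of first move I mean: emitting a schema.org JSON-LD block for a page, which agents can parse without you building a full API. The product data here is a made-up placeholder, not from any real site.

```python
import json

# Hypothetical product data for illustration only.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "offers": {"@type": "Offer", "price": "19.99", "priceCurrency": "USD"},
}

# Wrap it in the standard script tag you'd drop into your page's <head>.
snippet = (
    '<script type="application/ld+json">'
    + json.dumps(product)
    + "</script>"
)
print(snippet)
```

That's a one-afternoon change, and it's already more structure than the old Semantic Web on-ramp ever let you ship incrementally.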
Is AI smart enough to generate semantics automatically now? Vibe semantics? Or would they be slop semantics?
This is similar to building a React SPA and complaining that Google can't index it.
LLMs will use your website anyway. You're just choosing whether to pay the cost in structured endpoints upfront or hand that cost to browser emulation and lose control of how you're represented.
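To put the trade-off in code: the same fact can be served as presentation markup an agent has to scrape, or as a structured payload that states it outright. Everything below (the item, field names, markup) is a hypothetical example, not any particular site's schema.

```python
import json

# Hypothetical inventory record for illustration.
ITEM = {"name": "Example Widget", "price_usd": 19.99, "in_stock": True}

def html_page() -> str:
    # The scraping path: an agent must guess which <b> is the name
    # and which number is the price.
    return f"<div class='p'><b>{ITEM['name']}</b> ${ITEM['price_usd']}</div>"

def structured_endpoint() -> str:
    # The upfront path: the same facts, explicitly labeled.
    return json.dumps(ITEM)

print(structured_endpoint())
```

Either way the data gets extracted; the structured version just lets you decide what the agent sees instead of hoping browser emulation guesses right.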