It's a little hilarious.
First, as an organization, you do all this cybersecurity theatre, and then you create an MCP/LLM wormhole that bypasses it all.
All because non-technical folks wave their hands about AI, not understanding the most fundamental reality: LLM software is so fundamentally different from all the software before it that it becomes an unavoidable black hole.
I'm also a little pleased I used two space analogies, something I can't expect LLMs to do because they have to go large with their language or go home.
Assuming a security-101 program that clears the quality bar, there are a number of reasons why this can still happen at companies.
Summarized as: security is about risk acceptance, not removal. There’s massive business pressure to risk-accept AI. Risk acceptance usually means some sort of supplemental control that’s not ideal but manages the risk. With AI tools, though, there are very few of these: the vendors are small; the tools aren’t really service accounts, though IMO treating them as such is probably the best way to monitor them; integrations are easy; and eng companies hate taking admin rights of any kind away from devs, but if devs keep them, random AI on endpoints becomes very likely.
I’m ignoring a lot of nuance, but a solid sec program blown open by LLM vendors is going to be common, let alone bad sec programs. Many sec teams, I think, are just waiting for the other shoe to drop for some evidentiary support, while managing heavy pressure to go full bore on AI integration until then.
Nitpick, but wormholes and black holes aren't limited to space! (unless you go with the Rick & Morty definition where "there's literally everything in space")
Maybe this is the key takeaway of GenAI: that some access to data, even partially hallucinated data, is better than the hoops the security theatre puts in place that prevent the average Joe from doing their job.
This might just be a golden age for getting access to the data you need to get the job done.
Next, security will catch up and there'll be a good balance between access and control.
Then, as always, security will go too far and nobody will be able to get anything done.
It's a tale as old as computer security.
My first reaction to the announcement of MCP was that I must be missing something. Surely giving an LLM unlimited access to protected data is going to introduce security holes?
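To make that worry concrete, here's a minimal sketch of the pattern being objected to. This is hypothetical and uses sqlite3 as a stand-in rather than any real MCP SDK; the `query_tool` function and the schema are invented for illustration. The risk is the same regardless of SDK: the query text comes from the model, but it runs with the connector's credentials, with no per-user authorization check in between.

```python
import sqlite3

# Hypothetical, simplified stand-in for an MCP-style tool handler.
# A real server would register this via an MCP SDK; the dangerous
# shape is the same: model-generated input, connector-level privileges.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE salaries (name TEXT, amount INT)")
conn.execute("INSERT INTO salaries VALUES ('alice', 200000), ('bob', 90000)")

def query_tool(sql: str) -> list:
    """Tool exposed to the LLM: executes the SQL verbatim.

    Note what is missing: no check of WHO is asking, no allowlist of
    tables, no read-only restriction. Whatever the model emits -- or
    whatever a prompt-injected document convinces it to emit -- runs
    with the connector's full access to the database.
    """
    return conn.execute(sql).fetchall()

# Any caller (i.e. the model) can pull the whole protected table:
rows = query_tool("SELECT * FROM salaries")
```

The existing perimeter controls (SSO, VPN, DLP) never see this path, which is the "wormhole" upthread: the LLM vendor's connector becomes a single credential that tunnels past all of them.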