I find these posts hilarious. LLMs are ultimately story generators, and "oops, I DROP'ed our production database" is a common and compelling story. No wonder LLM agents occasionally do this.
Like we say in adventure motorcycling: "It's never the stuff that goes right that makes the best stories." :)
It's also possible it's only a compelling story, and not based on any real events.
Yeah, people don’t understand that if you put an LLM in a position where it’s plausible a human might drop the DB, it very well might do that, since it’s a likely next step. Ahahaha
This is exactly what I have in mind when something like this happens. Sometimes it generates the story you want, sometimes not.
Also funny how people (including LLM vendors, like Cursor) think that rules in a system prompt (or custom rules) are real safety measures.
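To make that last point concrete: a real safety measure has to live outside the model, where no amount of prompt text can override it. Here's a minimal, hypothetical sketch (names like `execute_sql` and the denylist regex are illustrative, not any vendor's API) of a guard enforced at the tool layer rather than in the system prompt:

```python
import re

# A system-prompt rule like "never run destructive SQL" is just more
# story input for the model. This check runs in the tool layer, so the
# model cannot talk its way past it.
DESTRUCTIVE = re.compile(r"^\s*(DROP|TRUNCATE|DELETE|ALTER)\b", re.IGNORECASE)

def execute_sql(statement: str, run):
    """Pass `statement` to `run` (whatever actually talks to the DB)
    only if it doesn't start with a destructive keyword."""
    if DESTRUCTIVE.match(statement):
        raise PermissionError(f"blocked destructive statement: {statement!r}")
    return run(statement)
```

Even this is only a sketch; the stronger version is giving the agent credentials that lack DROP/TRUNCATE privileges in the first place, so the database itself enforces the rule.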