They are easy to avoid if you actually give a damn. Unfortunately, the people who create these things don't, assuming they even know what half of these attacks are in the first place. They just want to pump out something now now now, and the mindset is "we'll figure out all the problems later, I want my cake now now now now!" Maximum velocity! Full throttle!
It's just as bad as a lot of the vibe-coders I've seen. I literally saw a vibe-coder who created an app without even knowing what they wanted to create (as in, what it would do), and the AI they were using to vibe-code hand-wrote a PE parser to load DLLs instead of using LoadLibrary or delay loading. Which, really, is the natural consequence of giving someone access to software engineering tools when they don't know the first thing about it. Is that gatekeeping of a sort? Maybe, but I'd rather have that than "anyone can write software, and oh by the way this app reimplements wcslen in Rust because the vibe-coder had no idea what they were even doing".
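For context, the sane approach the AI skipped is to just ask the OS loader to do the work: LoadLibrary/GetProcAddress on Windows, dlopen/dlsym on POSIX. A minimal sketch of the idea using Python's ctypes (which wraps those loader calls); the use of the C math library here is just an illustrative assumption, not anything from the app in question:

```python
import ctypes
import ctypes.util

# Let the OS loader find, map, and relocate the shared library --
# this is exactly what LoadLibrary/dlopen do, so there is no reason
# to hand-parse the PE/ELF format yourself.
libm_path = ctypes.util.find_library("m")
libm = ctypes.CDLL(libm_path)

# Declare the signature of cos(double) -> double before calling it;
# symbol resolution (the GetProcAddress/dlsym step) happens here.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # the loader resolved the symbol for us
```

The whole point of the dynamic loader is that it handles relocations, imports, and security mitigations (ASLR, code signing checks) that a hand-rolled parser silently skips.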
> They just want to pump out something now now now
Some people actually fell for "move fast and break things".
I think with the advent of the AI gold rush, this is exactly the mentality that has proliferated throughout new AI startups.
Just ship anything and everything as fast as possible because all that matters is growth at all costs. Security is hard and it takes time, diligence, and effort and investors aren't going to be looking at the metric of "days without security incident" when flinging cash into your dumpster fire.
> "we'll figure out all the problems later, I want my cake now now now now!" Maximum velocity! Full throttle!
That is indeed the point. Moltbot reminds me a lot of the demon core experiment(s): Laughably reckless in hindsight, but ultimately also an artifact of a time of massive scientific progress.
> Is that gatekeeping of a sort? Maybe, but I'd rather have that
Serious question: What do you gain from people not being able to vibe code?