Monkeys and typewriters. Throw enough character input and "It's not working" into an LLM and it will eventually produce... something.
And since it tends to reach for the most web-represented solution, that means endless Redis caches all doing the same thing, Kubernetes, and/or Vercel.
Best mental model: imagine something that produces great tactical architecture with zero strategic architecture, running in a loop.