I think for a lot of companies, AI is a destabilizing force that their managerial structure is unable to compensate for.
When you change the economics to such a degree, you're basically removing a dam - resulting in far more stress on the rest of the system. If the leaders of the org don't see the potential downsides and risks of that, they're in for a world of hurt.
I think we're going to see a real surge of companies just like this crashing and burning, even though this tech was sold as a universal improvement. The ones that survive will spread their knowledge of how to tame this wild horse, and ideally we'll learn a thing or two for the future.
But the wave of naivety has surprised me, and I think there's an endless onrush of people that are overly excited about their new ability to vibe-code things into existence. I think we've got our own endless September event going on for the foreseeable future.
I’m an LLM enjoyer who also thinks that ‘er ‘jerbs are safe and, taken to their logical conclusion, most LLM-stroking online around coding reduces to an argument that we should be speaking Haskell to LLMs and also in specs and documentation (just kidding, OCaml is prettier). But also, I do a little business.
You’ve hit the real issue: IT management is D-tier and lacks self-awareness. “Agile” is effed up as a rule, while also being the simplest business process ever.
That juniors and fakers have gone whole hog on LLMs is understandable to me. Hype, fashion, and BS are always potent. The part I still cannot understand, as an Executive in spirit: when there is a production issue, and one of these vibes monkeys you are paying has to fix it, how can you watch them copy and paste logs into a service you’re paying top dollar for, over and over, with no idea what they’re doing, and not be on your way to jail for highly defensible manslaughter?
We don’t pay mechanics to Google “how to fix car”.
Honestly, the most impactful thing I've seen AI do for any workplace is serve as the ultimate excuse for whatever pet project someone's wanted to do that couldn't stand on its own merits and just needed a solid excuse.
Rewrite that old crunchy system that has had 0 incidents in the last year and is also largely "done" (not a lot of new requirements coming in, pretty settled code/architecture)? It's actually one of our most stable systems. But someone who doesn't even write code here thinks the code is yucky! But that doesn't convince the engineers who are on-call for it to replace it for almost no reason. Well guess what. We can do it now, _because AI!!!_ (cue exactly what you think happens next happening next)
Need to lay off 10% of staff because you think the workers are getting too good of a deal? AI.
Need to convince your workers to go faster, but EMs tell you you can't just crack the whip? AI mandates / token spend mandates!
Didn't like code reviews and people nitpicking your designs? Sorry, code reviews are canceled, because of AI.
Don't like meetings or working in a team? Well now everyone is a team of 1, because of AI. Better set up some "teams" full of teams of 1, call them "AI-first" teams, and wait what do you mean they're on vacation and the service is down?
Etc. And they don't even care that these things produce the exact negative outcomes that were the reason you didn't do them before you had the excuse. You're happy that YOUR thing finally got done despite all the whiners and detractors. And of course, it turns out that businesses can withstand an absurd amount of dysfunction without really feeling it. So it just happens. Maybe some people leave. You hire people who just left their last place because it did the thing you just did, and maybe they stick around here for a bit. And the game of musical chairs, petty monarchies, and degenerate capitalism continues a little longer.
Big props to the people who managed to invent and sell an excuse machine though. Turns out that's what everyone actually wanted.
> I think for a lot of companies, AI is a destabilizing force that their managerial structure is unable to compensate for.
Absolutely. Giving a traditional company AI is like giving an unlimited supply of crystal-blue methamphetamine to a deadbeat pill addict.
It enables and supercharges all their worst impulses. Making a broken system more 'productive' doesn't do shit to make the users better off.
The work output everyone produces doubles, but the ratio of productive to net-negative work plummets.
I increasingly see “AI” as a sort of virus tuned to target management, specifically. Its output is catnip to them, and it’s going to be unavoidable for those who want to look good to superiors and peers (i.e. the #1 priority for managers) even as it adds no actual value whatsoever to what they do. People under them, too, will have to start burning tokens on bullshit to satisfactorily perform competence and “doing work”. Meanwhile, none of this is actually productive. It’s goddamn peacock feathers.
It’s like some kind of management parasite. I’m not even sure at this point that it’s going to lead to an overall productivity increase whatsoever for most sectors, because of this added drag on everything.