Hacker News

augusto-moura · today at 2:29 PM

What annoys me a bit is companies forcing AI tools, collecting usage metrics, and actively hunting the engineers who don't use the tool "enough". I've never seen anything like it for a technically optional tool. Even in the past, aside from technical limitations, you were never required to use a tool some minimum amount.

It just sounds like a giant scheme to burn through tokens and give money to the AI corps, and tech directors are falling for it immediately.


Replies

HolyLampshade · today at 2:57 PM

> I've never seen anything like it for a technically optional tool

Cloud had a very similar vibe when it was really running advertising to CIO/CTOs hard. Everything had to be jammed into the cloud, even if it made absolutely no sense for it to be run there.

This seems to come pretty frequently from visionless tech execs. They need to justify their existence to their boss, and thus try to show how innovative and/or cost cutting they can be.

hibikir · today at 2:57 PM

It's a bad tool being used to aim at something reasonable-ish: developers not taking advantage of the tools in places where it's very easy to get use out of them. I have coworkers like that: one spent 3 days researching a bug that Claude found in 10 minutes, just by pointing it at the logs in the time window and the codebase. And he didn't even find the bug, when Claude nailed it in one.

But is this something that is best done top to bottom, with a big report, counting tokens? Hell no. This is something that is better found and tackled at the team level. But execs in many places like easy, visible metrics, whether they are actually helping or not. And that's how you find people playing JIRA games and such. My worst example was a VP who decided that looking at the burndown charts from each team under them, and treating their shape as a reasonable metric, was a good idea.

These are all natural signs of a total lack of trust, and of thinking you can solve all of this from the top.

jacobsenscott · today at 3:15 PM

> It just sounds like a giant scheme to burn through tokens and give money to the AI corps, and tech directors are falling for it immediately.

This is exactly what's happening. The top 5 or 6 companies in the S&P 500 are running a very sophisticated marketing/pressure campaign to convince every C-suite downstream that they need to force AI on their entire organization or die. It's working great. CEOs don't get fired for following the herd.

bondarchuk · today at 4:00 PM

>I've never seen anything like it for a technically optional tool

If you broaden the comparison (only a little bit) it looks suspiciously like employees being forced to train their own replacement (be that other employees, or factory automation), a regular occurrence.

luisgvv · today at 2:39 PM

I'm just using Copilot CLI for mindless stuff and have set it to the premium models to meet the quota. As long as they can't see the prompts, I think I should be fine.

jimmyjazz14 · today at 2:46 PM

Yeah, I found this strange as well. If the tech is so amazing, why do developers need to be forced to use it?

PedroBatista · today at 5:35 PM

Tech directors, CEOs, managers, etc. tend to be people with a certain personality and learned behaviors/thinking, just like "technical people".

Yes, they tend to be incredibly gullible to certain things, over-simplistic and over-confident, but also very "agile" when it comes to sweeping their failures under the rug and moving on to keep their own necks in one piece. At this point, even the median CEO knows AI has been way overhyped and that they overinvested to a point of absolute financial insanity.

The first line of defense against the pressure to deliver is to mandate that their minions use it as much as possible.

We spent a fortune on this over-rated Michelin star reservation, and now you kids are going to absolutely enjoy it, like it or not goddammit!

layer8 · today at 2:38 PM

> I've never seen anything like it for a technically optional tool.

It has often been the case for technologies though, like “now we’re doing everything in $language and $technology”. If you see LLM coding as a technology in that vein, it’s not a completely new phenomenon, although it does affect developers differently.

afpx · today at 2:57 PM

It's really insane what is happening. My wife manages 70 software developers. Her boss mandated that managers replace 50% of the staff with AI within a year. So she's scrambling to figure out whether any of the tools actually work, and annoying her team because she keeps pushing AI on them. Unsurprisingly, it has only slowed things down and put her in a terrible position.

khriss · today at 6:26 PM

This is largely due to the age-old fact that corporations rarely make decisions based on actual data, introspection, and good judgment. Usually the decision is made first and the justifications are invented afterwards.

In this case, every executive is terrified of being "left out" of the AI race. As we saw with the mass layoffs across companies, most CEO decision making is just adherence to herd behavior. So it is literally better for execs to have shoveled a ton of money into 'strategic' AI initiatives and have them fail than to risk the remote chance of some other exec or company succeeding with 'AI enabled transformation'.

What makes it even more fun is that nobody really has a good understanding of how to measure the ROI of AI. Hence we have people burning a lot of money due to FOMO and no easy way of measuring the outcome, which is usually how the foundations for good Ponzi schemes are laid.

This is unlikely to end well. However, as usual, it's us, the common plebs, who will suffer regardless of outcome.

genthree · today at 2:57 PM

We're doing that in my office: forced Cursor use. A good chunk of the "edited by AI" lines in my history were just auto-completions, about the same as a traditional intellisense-alike would produce. (And Cursor doesn't actually seem to supply real intellisense, which is frequently annoying and wastes my time, in particular when I need to make sure it hasn't hallucinated a method or property on an object it should be able to "see" the definition of, which it does constantly. Maybe there's a setting somewhere, but I don't have to fiddle with settings in vanilla VSCode to get that...)

It's actually kinda useful in some cases, but the UI is terrible, and it needs to integrate much better with existing tools that are superior to it for specific purposes before I'll be happy using it. I'd say the productivity gains are a wash for me so far. Plus it's entirely too memory-hungry: I'd just come to accept that a text editor takes a couple GB now (sigh), and here it comes taking way more than that.

whateveracct · today at 3:29 PM

Yes, it's very weird. Why is my CEO being so nosy about my text editor all of a sudden? Stay in your lane, buddy.

diehunde · today at 3:09 PM

As an employee of a big tech company doing this, it's all fear mongering. We are being told that if everyone doesn't use these tools, our competitors will wipe the floor with us, because they are using them and will ship features 10x faster. But many engineers are suspicious as well.

zephen · today at 2:47 PM

I don't doubt you, but I'm out of the loop.

Who does this?

mh- · today at 2:46 PM

The only thing I've mandated for engineers is that folks give it a try occasionally, as models, best practices, and tooling improve.

I'm currently tracking exactly two numeric metrics: total MAUs (to track the aforementioned) and total DAUs (to gauge adoption and rightsize seat-licensed contracts).


jmalicki · today at 2:46 PM

I've also never seen an optional tool become a step change like this.

Even moving from assembly language to compiled languages was not as much of a step change.