I have been in the workforce for almost 30 years now, and I believe everybody is getting more and more squeezed, so they don't have the time or energy to do a proper job. The expectation is to get it done as quickly as possible and not to do more unless explicitly told to.
In SW development in the 90s I had much more time for experimentation to figure things out. In recent years you often have some manager to whom you basically have to justify everything you do, and always a huge pile of work that never gets smaller. So you just hurry through your tasks.
I think Google had it right for a while with their 20% time, where people could work on whatever they wanted. As far as I know, that's over.
People need some slack if you want to see good work. They aren’t machines that can run constantly on 100% utilization.
This is my experience as well. In the late 90s/early 2000s I had the luxury of a lot of time to dig in deeply and learn Unix, Perl, Java, web development, etc., and it was all self-directed. Now, with Agile, literally every hour is accounted for, though of course we have other ways of wasting time: overestimating tasks and creating unnecessary do-nothing stories in order to inflate metrics and justify dead space in the sprint.
> People ... aren’t machines that can run constantly on 100% utilization.
You also can't run machines at 100% utilisation and expect quality results. That's when you see tail latencies blow out, hash maps lose their performance as the load factor approaches 1, physical machines wear supra-linearly... The list goes on.
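To put a number on the latency point, here's a back-of-the-envelope M/M/1 queue calculation: a toy model assuming Poisson arrivals and exponential service times, not a measurement of any real system, showing how mean latency diverges as utilisation approaches 100%:

    # Toy M/M/1 queue: mean time in system vs. utilisation.
    # Assumes Poisson arrivals and exponential service (illustrative only).
    service_rate = 1.0  # jobs per unit time, so idle service time = 1.0
    for utilisation in (0.50, 0.80, 0.90, 0.95, 0.99):
        arrival_rate = utilisation * service_rate
        # Classic M/M/1 result: mean time in system W = 1 / (mu - lambda)
        mean_latency = 1.0 / (service_rate - arrival_rate)
        print(f"{utilisation:.0%} utilised -> {mean_latency:6.1f}x the idle service time")

Latency is 2x at 50% utilisation, 10x at 90%, and 100x at 99%. The blow-up is hyperbolic, not linear, which is why squeezing out that last bit of "efficiency" costs so much.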
> I have been in the workforce for almost 30 years now, and I believe everybody is getting more and more squeezed, so they don't have the time or energy to do a proper job. The expectation is to get it done as quickly as possible and not to do more unless explicitly told to.
That's my impression as well, but I'd stress that this push is not merely implicit, or driven by metrics or Jira. This push is sold as the main trait of software projects, and as what differentiates software engineering from every other engineering field.
Software projects are considered adaptable, and all projects value minimizing time to market. On paper, that means there is no requirement to eliminate, up front, the need to redesign or reimplement whole systems or features later. Therefore, if you can live with an MVP that covers 70% of your requirements list but can be hacked together in a few weeks, most people would not opt to spend many more man-months only to get minor increments. You'd be even less inclined to pay all those extra man-months upfront if you can quickly get that 70% in a few weeks and, from that point onward, gradually build up features.
You can’t brute-force insight.
I'm often reminded of that Futurama episode “A Pharaoh to Remember” (S04E07), where Bender is whipping the architects/engineers in an attempt to make them solve problems faster.
Definitely squeezed.
They say AI, but AI isn't eliminating programming. I've written a few applications with AI assistance, and it probably would've been faster if I had written them myself. The problem is that it doesn't have context, wildly assumes what your intentions are, and cheats its way to outcomes.
It will replace juniors for that one-liner; it won't replace a senior developer who knows how to write code.
I was about to post largely the same thing. There is a saying in design: "Good, fast, cheap --- pick two." The default choice always seems to be fast and cheap nowadays. I find myself telling other people to take their time, but I too have worked jobs where the workloads were far too great to do a decent job. So this is what we get.
One time, during a 1:1 with who I consider the best manager I ever had, in the context of asking how urgent something needed to get done, I said something along the lines of how I tend to throttle to around 60% of my "maximum power" to avoid burnout, but that I could push a bit harder if the task we were discussing was essential enough to warrant it. He said that it wasn't necessary, but also stressed that any time in the future that I did push myself further, I should always return to 60% power as soon as I could (even if the "turbo boost" wasn't enough to finish whatever I was working on). To this day, I'm equally amazed at how his main concern with the idea of me only working at 60% most of the time was that I not let myself get pressured into doing more than that, and at the fact that there are probably very few managers out there who would react well to my stating the obvious truth that this is necessary.
The article's point is more that the "job" a software company provides as an extension of its services isn't really a "job" a la "SW development in the 90s".
It's the aftereffect of companies not being penalized for the exploitation-dragnet approach: using people in desperate situations to generate more profit while providing nothing in return.
People have to care about outcomes in order to get good outcomes. It's pretty difficult to get someone to work extra time, or to care about the small stuff, if there is a good chance they will be gone in 6 months.
Alternatively, if leadership is going to turn over in 6 months, then no one will remember the details.
Have we learnt nothing? 100% utilisation of practically any resource will result in problems with either quality or schedules.
What, as an industry, do we need to do to learn this lesson?
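And it really is practically any resource; the same hyperbola shows up far from queues. For open-addressing hash tables, for example, the textbook approximation under uniform hashing (not a measurement of any particular implementation) puts the expected probes for an unsuccessful lookup at roughly 1/(1-α) for load factor α:

    # Expected probes per unsuccessful lookup in an open-addressing
    # hash table under uniform hashing: ~ 1 / (1 - load_factor).
    # Textbook approximation, illustrative only.
    for load_factor in (0.50, 0.80, 0.90, 0.95, 0.99):
        probes = 1.0 / (1.0 - load_factor)
        print(f"load factor {load_factor:.0%} -> ~{probes:5.1f} probes per miss")

Same shape as queueing delay: fine at 50%, painful at 90%, pathological at 99%.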
> People need some slack
Definitely. If you tighten a bearing up to 100%, to zero "play", it will stop rotating easily... and start wearing. Which, in people terms, is called burnout.
Or, as the article below says, (too much) Efficiency is the Enemy.
I've always thought if I gave better estimates about how long things would take, my schedule would support a decent job.
But black swans seem to be more common than anticipated.
(I also wonder - over your career, do you naturally move up to jobs with higher salaries and higher expectations?)
Only 20 years for me, but this is my observation also.
I think devs should get 2 hours a day for personal projects, whether internal or otherwise, that they can flex (so if they want to spend it all on Fridays, that's fine). Just think of all the random tech debt that could be resolved if devs had 2 hours a day to code anything, including new projects that benefit everyone. Most people can only squeeze out about 6 hours of real work anyway; you're burned out by the end of the day.
It's almost as if people don't understand what the word "productivity" means. That's all it is: if you hear "x increase in productivity" and it sounds great, it really means you, the worker, work harder after we fire other people, and are thus "more productive" because you did the same output that 2 people did. Sucker. And we all eat this shit up.
I totally agree. It was a stark contrast between PhD life and pure SW-engineer life, in terms of doing things the way I wanted.
I've seen this too, and it seems to have accelerated in the last 10 years or so. I'm seeing roles being combined, deadlines getting tighter, and quality going down. Documentation has also gotten worse. This all seems pretty odd when you consider that the tools to develop, test, and even document have mostly gotten more powerful/better/faster.
How much more expensive is your time for the company now vs the 90s?
sounds like a bit of a death spiral
as tech gets commoditized, the companies get worse: more funding, but worse
Same. What's crazier now is that nobody in management seems to want to take a risk, even though the risks are so much lower. We have better information, blogs, posts on how others solved the same issues, yet managers are still like "we can't risk changing our backend from dog shit to Postgres"... when in the 90s you would literally be figuring it all out yourself, making a gut call, and you'd be supported to venture into the unknown.
Now it's all RSUs, stock prices, FAANG ego stroking, and mad dashes for the acquihire exit, pushing out as much garbage as possible while managers shine it up like AI goodness.
> In SW development in the 90s I had much more time for experimentation to figure things out. In recent years you often have some manager to whom you basically have to justify everything you do, and always a huge pile of work that never gets smaller.
Software development for a long time had the benefit that managers didn't get tech. They had no chance of verifying if what the nerds told them actually made sense.
Nowadays there's not just Agile, "business dashboards" (Power BI and the like), and other forms of making tech "accountable" to clueless managers, but an awful lot of developers got bought off to C-level and turned into class traitors, forgetting where they came from.
> In recent years you often have some manager to whom you basically have to justify everything you do, and always a huge pile of work that never gets smaller. So you just hurry through your tasks.
This has been my exact experience. Absolutely everything is tracked as a work item with estimates. Anything you think should be done needs to be justified and tracked the same way. If anything ever takes longer than the estimate, which was invariably just pulled out of someone's ass (because it's impossible to accurately estimate development unless you're already ~75% of the way through doing it, and even then it's a crapshoot), you need to justify that in a morning standup too.
The end result of all of this is every project getting bogged down, stuck on the first version of whatever architecture was thought up right at the beginning, with piles of tech debt that never get fixed, because nobody who actually understands what needs to be done has the political capital to get past the aforementioned justification filter.