A lot of the time it’s less “nobody checked the security inbox” and more “the one person who understands that part of the system is juggling twelve other fires.” Security fixes are often a one-hour patch wrapped in two weeks of internal routing, approvals, and “who even owns this code?” archaeology. Holiday schedules and spam filters don’t help, but organizational entropy is usually the real culprit.
It could also be someone "practicing good time management."
They have a specific time of day when they check their email, they give only 30 minutes to it, and they work through messages from most recent on down.
The email comes in two hours earlier, and by the time they check their inbox it's been buried under 50 spams and near-spams, each of which needs to be looked at, so the 30 minutes run out before they ever reach it. The next day, by email-check time, another 400 spams have been piled on top.
Think I'm kidding?
Many folks who have worked for large companies (or bureaucracies) have seen exactly this.
I once had a whole sector of a fintech go down because one DevOps person spent three months ignoring daily warning emails that an API key was about to expire and needed to be reset.
And of course nobody remembered the setup, and the logs were only accessible to that same person, so figuring out what had happened also took weeks.
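For what it's worth, the fix for that particular failure mode is usually less about the key and more about where the warning lands. A minimal sketch, assuming a hypothetical key name, expiry date, and team chat webhook URL (all made up here, not from the actual incident), of a daily check that nags a shared channel instead of one person's inbox:

```python
#!/usr/bin/env python3
"""Nag a shared channel (not one person's inbox) when an API key nears expiry.

Every value below is illustrative -- in practice the expiry date and webhook
would come from a secrets store or config, not hardcoded constants.
"""
import json
import urllib.request
from datetime import date, timedelta

API_KEY_NAME = "payments-gateway-key"                           # hypothetical
API_KEY_EXPIRES_AT = date(2025, 3, 31)                          # hypothetical
TEAM_WEBHOOK_URL = "https://chat.example.com/hooks/ops-alerts"  # made-up URL
WARN_WITHIN = timedelta(days=30)


def check_and_alert(today: date | None = None) -> None:
    today = today or date.today()
    remaining = API_KEY_EXPIRES_AT - today
    if remaining > WARN_WITHIN:
        return  # nothing to do yet

    message = (
        f"{API_KEY_NAME} expires in {remaining.days} day(s) "
        f"on {API_KEY_EXPIRES_AT.isoformat()} -- rotate it."
    )
    body = json.dumps({"text": message}).encode()
    req = urllib.request.Request(
        TEAM_WEBHOOK_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    # Fires into a channel many people watch, so no single inbox can bury it.
    urllib.request.urlopen(req, timeout=10)


if __name__ == "__main__":
    check_and_alert()  # run daily from cron/CI so the warning can't be filed away
```

Once the alert goes somewhere more than one person watches, the "only the person drowning in spam got the email" failure mode mostly goes away.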
It's not about fixing it, it's about acknowledging it exists
> A lot of the time it’s less “nobody checked the security inbox” and more “the one person who understands that part of the system is juggling twelve other fires.”
At my past employers it was "The VP of such-and-such said we need to ship this feature as our top priority, no exceptions"