Hacker News

jcgrillo · yesterday at 8:31 PM

I hadn't actually heard of this (I've never worked on safety-critical systems), but after some googling I found a reference to a "Designated Engineering Representative" in this article about the FAA's DO-178B: https://en.wikipedia.org/wiki/DO-178B

I wasn't able to find much information about U.S. P.E. licensure for SWEs, although at least one state offers it. I also couldn't find any indication that any compliance process requires a P.E. to sign off on software. That doesn't mean it doesn't exist, though!

One major problem is that now that software is "everywhere," it's escaping the boundaries of safety-critical standards. Nobody will be killed directly by a bank getting hacked, but it could result in mortal harm to an individual whose identity is stolen. There are all kinds of systems that aren't labeled safety-critical in the kinetic sense but are nonetheless very load-bearing. Software that runs on phones, for example. Surely people have died due to buggy phone software, yet nobody is being held meaningfully accountable, so it will keep happening.

To be clear, I'm not saying we should heap a whole lot more pressure onto security teams. Instead we need to find better ways to make security every engineer's professional ethical responsibility--either directly because they're signing off on the system or indirectly because their respected senior colleague is. I just don't see fines getting us there.


Replies

adrianN · today at 1:59 AM

I understand your argument, but I feel that good development practices are essentially a question of company culture. Putting additional burden on engineers doesn't sit well with me: in my experience, many engineers already care more about quality and security than their management is willing to staff for. If we make it easier to turn individual engineers into scapegoats for management failures, this might actually backfire.