There is also https://hacker-laws.com.
Is it not the same as https://github.com/dwmkerr/hacker-laws?
I think it would be cool to have these shown at random as my phone’s “screensaver”.
It's missing:
> Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.
Software engineering is voodoo masquerading as science. Most of these "laws" are just things some guys said and people thought "sounds sensible". When will we have "laws" that have been extensively tested experimentally in controlled conditions, or "laws" that will have you in jail for violating them? Like "you WILL be held responsible for compromised user data"?
Pure gold :) I'm missing one though: "You can never underestimate an end user."
Some of these laws are like gravity: inevitable things you can fight but that will always exist, e.g. increasing complexity. Others are laws that, if you break them, people will yell at you or at least respect you less, e.g. leave it cleaner than when you found it.
Uhh, I knew I wasn't going to like this one when I read it.
> Premature Optimization (Knuth's Optimization Principle)
> Another example is prematurely choosing a complex data structure for theoretical efficiency (say, a custom tree for log(N) lookups) when the simpler approach (like a linear search) would have been acceptable for the data sizes involved.
This is exactly the example I'd choose to show where people wrongly, and almost obstinately, apply the "premature optimization" principle.
I'm not saying that you should write a custom hash table whenever you need to search. However, I am saying that there's a 99% chance your language's standard library has a built-in data structure for doing hash table lookups.
The code to use that data structure vs using an array is nearly identical and not the least bit hard to read or understand.
And the reason you should just do the optimization: when I've had to fix performance problems, it's almost always been because people put in nested linear searches, turning what could have been O(n) into O(n^3).
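To make that concrete, here's a rough sketch (Python, with made-up data; not from the site) of how little the code changes:

```python
# Hypothetical data: which order IDs also appear on a "flagged" list.
flagged_ids = list(range(0, 50_000, 3))
order_ids = list(range(0, 50_000, 7))

# Linear search inside a loop: list membership is O(m), so this is O(n*m)
# overall, and it only gets worse if each hit triggers yet another scan.
slow_hits = [oid for oid in order_ids if oid in flagged_ids]

# Same shape of code using the built-in hash-based container: O(n + m).
flagged = set(flagged_ids)
fast_hits = [oid for oid in order_ids if oid in flagged]

assert slow_hits == fast_hits
```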
But further, when Knuth was talking about actual premature optimization, he was not talking about algorithmic complexity. In fact, that would have been exactly the sort of thing he wrapped into "good design".
When Knuth wrote about not doing premature optimizations, he was living in an era when compilers were incredibly dumb. A premature optimization would be, for example, hand-unrolling a loop to avoid a branch instruction, or hand-inlining functions to avoid method call overhead. That does make code nastier and harder to deal with. That is to say, the specific optimizations Knuth was talking about are the optimizations compilers today do by default.
I really hate that people have taken this to mean "Never consider algorithmic complexity". It's a big reason so much software is so slow and kludgy.
A law of physics is inviolable.... A law of software engineering is a hot take.
Here's another law: the law of Vibe Engineering. Whatever you feel like, as long as you vibe with it, is software engineering.
"No matter how adept and talented you are at your craft with respect to both technical and business matters, people involved in finance will think they know better."
That one's free.
I feel that Postel's law probably holds up the worst of these. While being liberal with the data you accept can seem good for the functioning of your own application, the broader social effect is negative: it elevates misconceptions about the standard into informal standards of their own, to which new apps may be forced to conform. Ultimately, being strict about the input data you allow can turn out better in the long run, not to mention more secure.
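A toy sketch of that dynamic (invented config format, Python; just to illustrate the point):

```python
import json

def parse_retries_strict(text: str) -> int:
    """Reject anything that isn't well-formed JSON with an integer 'retries'."""
    cfg = json.loads(text)  # raises on malformed input
    retries = cfg["retries"]
    if not isinstance(retries, int):
        raise ValueError("retries must be an integer")
    return retries

def parse_retries_liberal(text: str) -> int:
    """Accept sloppy input: malformed JSON, '3' as a string, missing keys."""
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError:
        cfg = {}
    return int(cfg.get("retries", 3))  # silently coerces "3", 3.0, True, ...

# The liberal parser "works", so senders never fix their output, and every
# later implementation has to reproduce these exact coercions to interoperate.
print(parse_retries_liberal('{"retries": "3"}'))  # 3
```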
Is it just me seeing the following?
> Site not available. This site was paused as it reached its usage limits. Please contact the site owner for more information.
Calling these "laws" is a really really bad idea.
The list is great but the explanations are clearly AI slop.
"Before SpaceX, launching rockets was costly because industry practice used expensive materials and discarded rockets after one use. Elon Musk applied first-principles thinking: What is a rocket made of? Mainly aluminum, titanium, copper, and carbon fiber. Raw material costs were a fraction of finished rocket prices. From that insight, SpaceX decided to build rockets from scratch and make them reusable."
Everything, including humans, is made of cheap materials, but that doesn't convey the value. The AI got close to the answer with its first sentence (reusability) but it clearly missed the mark.
I have a lot of issues with this one:
https://lawsofsoftwareengineering.com/laws/premature-optimiz...
It leaves out this part from Knuth:
>The improvement in speed from Example 2 to Example 2a is only about 12%, and many people would pronounce that insignificant. The conventional wisdom shared by many of today’s software engineers calls for ignoring efficiency in the small; but I believe this is simply an overreaction to the abuses they see being practiced by penny-wise-and-pound-foolish programmers, who can’t debug or maintain their “optimized” programs. In established engineering disciplines a 12% improvement, easily obtained, is never considered marginal; and I believe the same viewpoint should prevail in software engineering. Of course I wouldn’t bother making such optimizations on a one-shot job, but when it’s a question of preparing quality programs, I don’t want to restrict myself to tools that deny me such efficiencies.
Knuth thought an easy 12% was worth it, but most people who quote him would scoff at such efforts.
Moreover:
>Knuth’s Optimization Principle captures a fundamental trade-off in software engineering: performance improvements often increase complexity. Applying that trade-off before understanding where performance actually matters leads to unreadable systems.
I suppose there is a fundamental tradeoff somewhere, but that doesn't mean you're actually at the Pareto frontier, or anywhere close to it. In many cases, simpler code is faster, and fast code makes for simpler systems.
For example, you might write a slow program, so you buy a bunch more machines and scale horizontally. Now you have distributed systems problems, cache problems, lots more orchestration complexity. If you'd written it to be fast to begin with, you could have done it all on one box and had a much simpler architecture.
Most times I hear people say the "premature optimization" quote, it's just a thought-terminating cliche.
reminder - there's tech out there capable of reading your mind remotely
Mad AI slop..
`Copy as markdown` please.
> This site was paused as it reached its usage limits. Please contact the site owner for more information.
Law 0: Fix infra.
I believe there should be one more law here, telling you to not believe this baloney and spend your money on Claude tokens.
This website should be a json file