Design patterns can be really helpful. In my previous job I worked on enterprise .NET applications. It made sense to use common patterns, because most applications were big and the patterns made it easier to understand unfamiliar code, both within an application and across different teams and applications. New projects looked familiar, because the same style and the same patterns were used.
Now I'm working on an old (10+ years) JS application. Similar patterns were implemented, but here they aren't helpful at all. The code looks very corporate and Java EE style, with a ton of getters and setters (`getName() {}`, not `get name() {}`), factories, facades, adapters, etc. It's usually completely unclear what the benefit of a pattern is, and the code is more complicated for it: for instance, creating new instances of business objects is split into `Object.build`, which calls `new Object`, with no guidelines at all on which part of the initialization belongs in `build` and which in the constructor.
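To illustrate the kind of split I mean, here's a hypothetical sketch (class and field names invented for illustration, not from the actual codebase):

```javascript
// Hypothetical example of initialization split between a static
// build method and the constructor, with no rule for what goes where.
class Invoice {
  constructor(id) {
    // Some of the initialization happens here...
    this.id = id;
    this.lines = [];
  }

  static build(id, customer) {
    // ...and some happens here, for no discernible reason.
    const invoice = new Invoice(id);
    invoice.customer = customer;
    invoice.createdAt = Date.now();
    return invoice;
  }
}

// Every caller now has to know that `new Invoice(...)` alone
// produces a half-initialized object.
const inv = Invoice.build(42, "ACME");
```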
The gist of my comment is that patterns can be useful, but they're usually overused. If you implement one without understanding why, and without the payoff of understanding the code faster because the pattern is applied consistently across multiple instances, the result is worse than just implementing what you need in a readable way (YAGNI).
A lot of patterns only make sense in languages like C# or Java, which are inflexible by design. You have two hierarchical trees (inheritance and namespaces) that you have to work around. With something simpler like C, Go, or JavaScript, you don't have those obstacles, and a solution can be much simpler to implement.
This is a common thing I see when developers from an enterprise OOP background (Java, C#, etc.) do JavaScript: they try to use all the same patterns and default to classes for everything. It just doesn't fit the language.
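A small sketch of the mismatch (names invented): the Java-style class below carries getter/setter ceremony that idiomatic JS usually doesn't need.

```javascript
// Java-ish style ported to JS: a class whose only job is holding
// data behind explicit getters and setters.
class UserBean {
  constructor(name) {
    this._name = name;
  }
  getName() {
    return this._name;
  }
  setName(name) {
    this._name = name;
  }
}

// Idiomatic JS often needs none of that: a plain object, or a
// factory function closing over state, does the same job.
const makeUser = (name) => ({ name });

const a = new UserBean("Ada");
const b = makeUser("Ada");
// Both expose the same information; the second has far less ceremony.
```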
Yeah, I've seen (okay, and been responsible for) a lot of "the road to hell is paved with good intentions" due to jumping to some pattern or other because, well, it feels like the right thing to do. It makes the code cleaner at delivery time, and is usually very intuitive when the design is fresh in your mind. But IME it doesn't take long before that freshness goes away, and the next time you look at it (let alone anyone else), you find the logic hard to follow. Usually "constructing" the pattern and "executing" the pattern happen in different places in the code, and there's no straightforward way to mentally step through one without continually cross-referencing the other. At some point you wish you'd just written one long switch block that you could step through.
And that's all before new requirements that break the pattern, or other engineers that don't have time to grok the design and hack in something that looks like a switch block for their thing, which eventually takes over most of the code anyway.
I see a lot of comments about how patterns are useless from people writing toy apps, or at least ones who never had to deal with truly enterprise-scale stuff. To me they're like people building a shed yelling at people building skyscrapers that no one ever needs to pour that much concrete for a foundation.
Parent comment is not like that.
People were shifted from Java to JavaScript and kept the Java patterns, and maybe the organization had standards requiring their use.
design patterns are a language; it's just that the programming language they're being implemented in doesn't support them natively.
e.g. the observer pattern in java is what, `[array of functions].forEach()` in js? not worth calling that by name. another example: singletons. in Python a singleton is just a module (caveats apply obviously, but if we apply them, some also apply in java).
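concretely, that collapse looks like this (a minimal sketch; the event name is made up):

```javascript
// Observer pattern without the ceremony: subscribers are just an
// array of functions, and notifying them is a forEach.
const listeners = [];
const subscribe = (fn) => listeners.push(fn);
const notify = (event) => listeners.forEach((fn) => fn(event));

const seen = [];
subscribe((e) => seen.push(`a:${e}`));
subscribe((e) => seen.push(`b:${e}`));
notify("saved");
// seen is now ["a:saved", "b:saved"]
```

no Subject interface, no Observer interface, no unsubscribe bookkeeping unless you actually need it.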
this is why designing a minimal language to make it 'simple' is misguided: you'll end up reinventing the design-pattern language anyway. there are good reasons to design a simple language, but simple for the sake of simple misses the point.
I would put it slightly differently: Patterns (including anti-patterns) happen whether you call them such or not. Any developer will sooner or later come up with patterns like Adapter or Builder or Composite or Iterator. In that sense, patterns are not invented, but discovered. The benefit of design patterns is to be able to communicate these discovered patterns, and to define agreed names for them, so that you don't have to describe the pattern each time you talk to another developer, but can refer to it by a well-understood name. (Or when not yet well-understood, can refer to the corresponding pattern description.) It extends the language we use to talk about software design.
The point of design patterns is less about the individual patterns than about having "design pattern" as a general concept: coding patterns relevant to software design that you name and describe because they keep recurring.