
raggi · yesterday at 10:19 PM

I really hate the anti-RAII sentiments and arguments. I remember the Zig community lead going off about RAII before and making claims like "linux would never do this" (https://github.com/torvalds/linux/blob/master/include/linux/...).

There are bad RAII APIs for sure, but it's not all bad. Andrew himself posted a while back about feeling bad for Go devs who never get to debug by seeing 0xaa memory segments, and sure, I get it - but you can't make over-extended claims about non-initialization when you're implicitly initializing with a magic value; that's a bit of a false equivalence. And sure, maybe you don't always want a zero scrub instead. I'm not sold on Go's mantra of making zero values always be useful; I've seen really bad code come as a result of people doing backflips to try to make that true. A constructor API is a better pattern as soon as there's a challenge - the "rule" only fits when it's easy, so don't force it.
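To make that concrete, here's a minimal Go sketch (Client and NewClient are hypothetical names, not any real API): a type with required configuration has no meaningful zero value, so a validating constructor beats contorting the type until Client{} is "useful".

```go
package client

import (
	"errors"
	"net/http"
)

// Client needs a base URL to do anything; there is no sensible
// behavior for a zero-value Client, so don't contort the type to
// invent one.
type Client struct {
	baseURL string
	httpc   *http.Client
}

// NewClient validates up front instead of letting Client{} limp along
// with a magic default.
func NewClient(baseURL string) (*Client, error) {
	if baseURL == "" {
		return nil, errors.New("baseURL is required")
	}
	return &Client{baseURL: baseURL, httpc: http.DefaultClient}, nil
}
```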

Back to RAII, though - or what people think of when they hear RAII. Scope-based or automatic cleanup is good. I hate working with Go's mutexes in complex programs after spending life in the better world. People make mistakes and people get clever, and the outcome is almost always bad in the long run - bugs that "should never get written/shipped" do come up, and it's awful. I think Zig's errdefer is a cool extension of the defer pattern, but defer patterns are strictly worse than scope-based automation for key tasks. I do buy the argument that sometimes you want to deviate from scope-based controls, and primitives offering both are reasonable, but the default case for a ton of code should be optimized for avoiding human effort and human error.
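A hedged sketch of the Go failure mode (Cache and fetch are made up for illustration): once you need to release a lock before scope exit - say, to avoid holding it across I/O - defer no longer fits, and every path has to pair Lock/Unlock by hand. A scope-based guard would make each of these unlock sites automatic.

```go
package cache

import "sync"

type Cache struct {
	mu sync.Mutex
	m  map[string]string
}

func New() *Cache {
	return &Cache{m: make(map[string]string)}
}

// fetch stands in for some slow I/O we must not hold the lock across.
func fetch(key string) (string, error) {
	return "value-for-" + key, nil
}

// GetOrFetch has three hand-paired Unlock sites; forgetting any one of
// them in a future edit is exactly the bug class that scope-based
// cleanup rules out.
func (c *Cache) GetOrFetch(key string) (string, error) {
	c.mu.Lock()
	if v, ok := c.m[key]; ok {
		c.mu.Unlock() // 1: early-return path
		return v, nil
	}
	c.mu.Unlock() // 2: release before slow I/O

	v, err := fetch(key)
	if err != nil {
		return "", err
	}

	c.mu.Lock()
	c.m[key] = v
	c.mu.Unlock() // 3: after write-back
	return v, nil
}
```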

In the end I feel similarly about allocation. I appreciate Zig trying to push for a different world, and that's an extremely valuable experiment to be doing. I've fought allocation in Go programs (and Java, etc.), and had fights with C++ that was "accidentally" churning too much (classic hashmap string spam - hi ninja, hi GN), but I don't feel like the right trade-off anywhere is "always do all the legwork" vs. "never do any of the legwork". I wish Rust were closer to the optimal path; it's decently ergonomic a lot of the time, but when I really want control I sometimes want something more like Zig. And when I spend too much time in Zig, I get a bit bored of the ceremony too.
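For a taste of what fighting allocation in Go looks like, here's a sketch of the usual legwork (Join is an illustrative helper, not any real API): preallocating so a hot path does one allocation instead of many.

```go
package text

import "strings"

// Join concatenates parts with a single buffer allocation. Without the
// Grow call, the builder reallocates as it grows - the kind of churn
// you only notice when profiling a hot path.
func Join(parts []string) string {
	total := 0
	for _, p := range parts {
		total += len(p)
	}
	var b strings.Builder
	b.Grow(total) // one allocation up front
	for _, p := range parts {
		b.WriteString(p)
	}
	return b.String()
}
```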

I feel like the next innovation we need is some sanity around the real, useful value of global and thread-local state. Far too much toxic hot air gets vented over these, and there are bad outcomes from misuse and overuse, but innovation could spend far more time on _sanely implicit context_ that reduces programmer effort without being excessively hidden, while allowing local specialization that is easy and obvious. I imagine it lands somewhere between the Rust and Zig solutions, but I don't know exactly where. It's a horrible set of layer violations that the purists won't like, because we base a lot of ABI decisions on history, but I'd still like to see more work here.
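One existing point in that design space, for comparison, is Go's context.Context: state is threaded explicitly through call chains, but callers can locally specialize values without touching globals. A minimal sketch, with hypothetical helper names:

```go
package ctxlog

import (
	"context"
	"log"
)

// loggerKey is an unexported key type so other packages can't collide.
type loggerKey struct{}

// WithLogger locally specializes the context - no global is mutated,
// and the override is scoped to whoever receives this ctx.
func WithLogger(ctx context.Context, l *log.Logger) context.Context {
	return context.WithValue(ctx, loggerKey{}, l)
}

// LoggerFrom falls back to a default, so callers that don't care pay
// almost nothing in ceremony.
func LoggerFrom(ctx context.Context) *log.Logger {
	if l, ok := ctx.Value(loggerKey{}).(*log.Logger); ok {
		return l
	}
	return log.Default()
}
```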

So RAII isn't the big evil monster, and we need to stop talking about RAII, globals, etc. in these ways. We need to evaluate what's good and what's bad, and try out new arrangements that maximize the good and minimize the bad.


Replies

sestep · today at 12:08 AM

Heh, sounds like you'd love the work-in-progress I'm about to present at MWPLS 2025 :)

iainmerrick · yesterday at 11:32 PM

Have you tried Swift? It has the sort of pragmatic-but-safe-by-default approach you’re talking about.

bsder · yesterday at 11:44 PM

> So RAII isn't the big evil monster, and we need to stop talking about RAII, globals, etc, in these ways.

I disagree; I place RAII as the dividing line in programming language complexity, and it is THE "Big Evil Monster(tm)".

Once your compiled language gains RAII, a cascading and interlocking set of language features needs to accrete around it to make it ... not excruciatingly painful. This practically defines the difference between a "large" language (Rust or C++) and a "small" language (C, Zig, C3, etc.).

For me, the next programming language innovation is getting the garbage-collected/managed-memory languages to finally quit ceding so much of the performance space to the compiled languages. A managed runtime doesn't have to be so stupidly slow. It doesn't have to be so stupidly non-deterministic. It doesn't have to have a pathetic, super-complex FFI. I see "strong typing everywhere" as the first step along this path. Fil-C might become an interesting existence proof in this space.

I view having to pull out any of C, Zig, C++, Rust, etc. as a failure of the higher-level programming languages. There will always be a need for something like them at the bottom, but I really want their scope to be super small. I don't want to operate at their level if I can avoid it. And I say all this as someone who has slung more than 100KLoC of Zig code lately.

For a concrete example, look at Ghostty, which was written in Zig. There is no strong performance reason to be in Zig (except that implementations in every language other than Rust seem to be so much slower). There is no strong memory reason to be in Zig (except that implementations in every language other than Rust chewed up vast amounts of it). And yet a relatively new, unstable, low-level programming language was chosen to greenfield Ghostty. And all the other useful terminal emulators seem to be using Rust.

Every adherent of managed memory languages should take it as a personal insult that people are choosing to write modern terminal emulators in Rust and Zig.
