Hacker News

Acmeon yesterday at 10:56 PM

Looks quite cool! I have also considered similar ideas, but decided not to develop anything. However, there are a few important areas where our points of view differ. My main use cases are computationally demanding applications, so runtime performance is quite important to me.

First, the claim that one gets "reactivity for free" is not entirely true (although the cost may be negligible for many apps). The stores are wrapped in proxies, which slows down, for example, property access. This is why I rejected proxies and instead considered generating getters and setters to handle reactivity. In principle, this enables zero-overhead reads (assuming all the reactivity work is handled in the setter). However, that approach fails to account for, for example, array mutations. So I see the point of proxies, but their cost is lowered performance (which may not matter for all applications).
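The trade-off can be sketched roughly like this (illustrative helper names, not any library's actual API): generated accessors keep reads cheap but miss array mutations, while a Proxy traps them.

```javascript
// Generated accessors: reads go through a simple getter, writes through a
// reactive setter -- but `store.items.push(x)` mutates the array directly
// and never calls the setter, so the change goes unnoticed.
function accessorStore(target, onChange) {
  const store = {};
  for (const key of Object.keys(target)) {
    let value = target[key];
    Object.defineProperty(store, key, {
      get: () => value,                                 // cheap read path
      set: (next) => { value = next; onChange(key); },  // reactive write
    });
  }
  return store;
}

// A Proxy, by contrast, sees the index and length writes that push/splice
// perform internally -- at the cost of trapping every property access.
function observeArray(arr, onChange) {
  return new Proxy(arr, {
    set(target, key, value) {
      target[key] = value;
      onChange(key);
      return true;
    },
  });
}
```

With the accessor store, `store.items.push(x)` succeeds silently; with the array proxy, the same push fires the trap for the new index and for `length`.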

Second, not memoizing computed values (i.e., getters) can also hurt runtime performance significantly, because expensive computations may be executed many times. Caching could be offloaded to the developer, but that could be laborious, at least if the developer has to keep track of when to invalidate the computation. In Excel, for example, computed values can be accessed quickly because they are stored in memory (although the added memory usage can become a problem).
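Developer-managed memoization might look like the following sketch (class and field names are purely illustrative). The hazard is exactly the one described above: every mutation path has to remember to invalidate the cache.

```javascript
// Manual memoization of a computed value with explicit invalidation.
class CartStore {
  constructor() {
    this.items = [];
    this._total = null;      // cached computed value; null means "stale"
  }
  add(item) {
    this.items.push(item);
    this._total = null;      // the developer must remember this line
  }
  get total() {
    if (this._total === null) {
      // the (potentially expensive) computation runs only when stale
      this._total = this.items.reduce((sum, i) => sum + i.price, 0);
    }
    return this._total;
  }
}
```

Note that mutating `store.items` directly, without going through `add`, leaves `total` silently stale, which is the laborious bookkeeping the comment refers to.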

Third, you have not addressed async stores or async computed values (as far as I can tell). Admittedly, handling async can get quite complicated. However, for better or worse, async is a key part of JavaScript (e.g., sync alternatives do not exist for all APIs). Thus, the claims "JavaScript is enough", "Zero Concepts", and "You already know the entire API" are slightly imprecise, because async is not supported (although this does not necessarily matter for many applications).
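One concrete reason async computed values get complicated is the stale-response race: without request tracking, a slow earlier request can overwrite a newer result. A minimal sketch (all names hypothetical):

```javascript
// Guard against out-of-order async results with a request counter.
function makeSearcher(fetchResults, store) {
  let latest = 0;
  return async function search(query) {
    const id = ++latest;                 // tag this request
    const results = await fetchResults(query);
    if (id === latest) {                 // drop responses from stale requests
      store.results = results;
    }
  };
}
```

A reactive system that supports async computed values has to solve this (and cancellation, loading states, and errors) somewhere; a purely synchronous one pushes it onto the developer.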

These three points are the main reasons I chose not to pursue my similar ideas (especially in the context of computationally demanding applications). Still, I find the idea that JavaScript should be enough for reactivity compelling and worthy of further development.


Replies

dashersw yesterday at 11:27 PM

Thanks for your insights. I was originally hesitant about the performance of proxies, too, but they turned out to be great. The benchmarks (https://geajs.com/benchmark-report.html) also show good results in terms of both memory and CPU cycles: even though proxies obviously add an overhead, the difference isn't night and day (https://jsben.ch/proxy-vs-object-performance-benchmark-dtxo6 is a good test for this). With a proxy, you can set a property 25 million times per second (on my M4 Max machine in Safari) with only a 4% perf loss vs defineProperty; Chrome is about half that speed with a 20% loss vs defineProperty. So 12.5 million sets per second is still pretty good. Of course, if your use case demands more performance, nothing beats hand-optimized vanilla JS.
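The comparison being discussed is easy to reproduce with a rough micro-benchmark like the sketch below (absolute numbers vary wildly by engine and machine; treat the output as illustrative only, not as the linked benchmark's methodology).

```javascript
// Time repeated property writes on a plain object vs. a proxied one.
function bench(label, obj, iterations = 1e6) {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) obj.x = i;
  const ms = performance.now() - start;
  console.log(`${label}: ${(iterations / ms / 1000).toFixed(1)}M sets/sec`);
}

const plain = { x: 0 };
const proxied = new Proxy({ x: 0 }, {
  set(target, key, value) {
    target[key] = value;   // minimal trap: forward the write
    return true;
  },
});

bench('plain object  ', plain);
bench('proxied object', proxied);
```

Micro-benchmarks like this are sensitive to JIT warm-up and dead-code elimination, which is one reason engines (and the Safari vs. Chrome numbers above) disagree so much.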

Since Gea doesn't rerender the template _at all (well, for the most part, at least)_, in theory we wouldn't gain much from getter memoization, mainly because we create proxy observers for computed values that update the DOM in place only when the underlying value changes.
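The "update in place only when the value changes" idea can be sketched like this (a hypothetical helper, not Gea's actual API): re-evaluate the computed value on store changes, but only touch the DOM when the result differs from the last one.

```javascript
// Observe a computed getter; apply side effects only on real changes.
function observeComputed(store, compute, apply) {
  let last;
  return function onStoreChange() {
    const next = compute(store);
    if (next !== last) {       // skip DOM work when the value is unchanged
      last = next;
      apply(next);
    }
  };
}
```

Usage might look like `observeComputed(store, s => s.items.length, n => el.textContent = String(n))`, with the returned function invoked by the proxy trap on every store mutation.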

And since stores are plain old JS classes, there's no need for an "async store" concept. Just update a value in the store whenever you want, track its loading state however you want, synchronously or asynchronously, and the observers will take care of the rest. If you're referring to another pattern I'm not aware of, please let me know.
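The pattern being suggested might look like the following sketch (an illustrative plain class, not Gea's actual store API): loading state is just ordinary fields, and each assignment is what observers would react to.

```javascript
// A plain-class store where async is handled with ordinary assignments.
class UserStore {
  constructor(fetchUser) {
    this.fetchUser = fetchUser;  // injected async data source
    this.user = null;
    this.loading = false;
    this.error = null;
  }
  async load(id) {
    this.loading = true;                     // observers see this flip
    this.error = null;
    try {
      this.user = await this.fetchUser(id);  // ...and this assignment
    } catch (e) {
      this.error = e;
    } finally {
      this.loading = false;
    }
  }
}
```

Nothing here is reactive by itself; the point is that if plain assignments are observed, async flows need no dedicated concept.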
