Hacker News

Interactive λ-Reduction

103 points by jy14898 last Monday at 8:48 PM | 21 comments

Comments

danaugrs today at 2:28 PM

Author here. Thanks for posting.

I spent several years' worth of weekends working on this, and I'm glad to see it here on Hacker News.

I started working on this problem when I learned about Lamping's algorithm for optimal lambda reduction [1]. He invented a beautiful algorithm (often referred to as his "abstract algorithm"), which uses fan-in and fan-out nodes to reduce lambda terms optimally (with optimal sharing). Unfortunately, in order to make it fully work, some fans needed to duplicate one another while others needed to cancel one another. To determine this correctly Lamping had to extend his abstract algorithm with several delimiter node types and many additional graph reduction rules. These delimiter nodes perform "bookkeeping", making sure the right fan nodes match. I was dissatisfied with the need for these additional nodes and rules. There had to be a better way.

My goal was to try to implement Lamping's abstract algorithm without adding any delimiter nodes, and to do it under the interaction net paradigm to ensure perfect confluence. I tried countless solutions, and finally Delta-Nets was born. Feel free to ask any questions.
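As a side illustration (mine, not the author's algorithm), here is a minimal sketch of naive normal-order beta reduction in Python. Note how substitution copies the argument term once per occurrence of the bound variable, duplicating any redexes inside it; this duplicated work is exactly what optimal (shared) reduction, and hence Lamping's fan nodes, is designed to avoid:

```python
# Minimal untyped lambda calculus with naive substitution (no capture
# avoidance -- fine for the closed example below). Terms are tuples:
# ('var', name), ('lam', name, body), ('app', fun, arg).
# This is an illustrative sketch, NOT Lamping's algorithm or Delta-Nets.

def subst(term, name, value):
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        _, v, body = term
        if v == name:  # variable is shadowed; stop here
            return term
        return ('lam', v, subst(body, name, value))
    _, f, a = term
    return ('app', subst(f, name, value), subst(a, name, value))

def step(term):
    """One normal-order reduction step, or None if in normal form."""
    if term[0] == 'app':
        f, a = term[1], term[2]
        if f[0] == 'lam':  # beta redex: (λx.body) a
            return subst(f[2], f[1], a)
        s = step(f)
        if s is not None:
            return ('app', s, a)
        s = step(a)
        if s is not None:
            return ('app', f, s)
    elif term[0] == 'lam':
        s = step(term[2])
        if s is not None:
            return ('lam', term[1], s)
    return None

# (λx.x x) ((λy.y) z): the argument contains a redex that naive
# substitution copies into both occurrences of x.
redex = ('app', ('lam', 'y', ('var', 'y')), ('var', 'z'))
term = ('app', ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))), redex)
once = step(term)
print(once)  # the inner redex (λy.y) z now appears twice
```

With sharing, the inner redex would be reduced once and its result reused; the naive reducer above instead reduces each copy separately.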

I recently started building a programming language on top of Delta-Nets, called Pur (https://pur.dev/).

Feel free to follow along this journey:

https://x.com/danaugrs

https://x.com/purlanguage

[1] https://dl.acm.org/doi/pdf/10.1145/96709.96711

tromp today at 9:44 AM

I linked to the paper in a recent comment [1]. The author has been active in the Higher Order Community Discord channel for quite a while. The Higher Order Company [2] is developing HVM (Higher Order Virtual Machine), a high performance implementation of interaction nets for both CPUs and GPUs, and started the channel to discuss development and related topics, such as the Bend language for programming the HVM, discussed on HN at [3].

The paper manages to present previous work on Levy's optimal beta reduction in a more streamlined fashion, generalizing duplicators to so-called replicators that avoid the need for separate bookkeeping gadgets (like croissants and brackets). Its author also proves himself to be a skilled programmer in bringing his theory to life with this web-based evaluator.

The app contrasts traditional graph-based λ-calculus reduction (which replaces nodes with entire subgraphs) with interaction (which makes only local modifications and hence needs more steps), while showing that semantics is preserved.

[1] https://news.ycombinator.com/item?id=46034355

[2] https://higherorderco.com/

[3] https://news.ycombinator.com/item?id=40390287

xjm today at 4:04 PM

I find it much easier to see what is going on when selecting λ-calculus instead of Δ-Nets. E.g., take the mandatory Y combinator,

λf.(λx.f (x x)) (λx.f (x x))

for which the difference from

λf.(λx.f (x x)) (λx.f (x x) f)

is very clear, whereas with Δ-Nets the difference is more subtle. I guess that is because the Δ-Nets visualization carries more information than the λ-calculus one.
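For readers unfamiliar with why the Y combinator matters here: one beta step sends Y f = (λx.f (x x)) (λx.f (x x)) to f ((λx.f (x x)) (λx.f (x x))), i.e. Y f reduces to f (Y f), which is how it produces fixed points. Since Y itself diverges under strict evaluation, a runnable sketch (my own aside, not from the app) uses its eta-expanded call-by-value cousin, the Z combinator:

```python
# Z = λf.(λx.f (λv.x x v)) (λx.f (λv.x x v))
# The eta-expansion (λv. ... v) delays the self-application so Python's
# strict evaluation doesn't loop forever building Y f = f (Y f) eagerly.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Tying the knot for factorial without any explicit recursion:
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
print(fact(5))  # 120
```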

asgr today at 9:41 AM

Reminds me of https://github.com/HigherOrderCO/HVM

I see Salvatore has a fork, so they are obviously aware of it. Unsure whether they're proposing the exact same thing without reference or citation…

fnord77 today at 4:22 PM

What is this about? A pointer to a tutorial or a wiki link would be nice for someone who has no idea what this is. Thank you

alan-jordan13 today at 2:22 PM

[dead]

qsort today at 9:00 AM

What the hell is this?

The linked paper (https://arxiv.org/pdf/2505.20314) claims the squiggles it introduces are a model for Levy-optimal parallel reduction of lambda terms.

But the author has no affiliation, it's very weird that they're calling this "lambda-reduction", and it heavily smells of AI slop?

I hope I'm wrong but it doesn't look right. Can anyone with expertise in this field chime in?
