Hacker News

jandrewrogers · today at 12:32 AM

The observation that concatenative programming languages have nearly ideal properties for efficient universal learning on silicon is very old. You can show that the resource footprint a universal learning algorithm requires to effectively learn the language is much lower for concatenative languages than for other common programming models. There is a natural mechanical sympathy with the theory around universal learning. It was my main motivation to learn concatenative languages in the 1990s.

This doesn't mean you should write AI in these languages, just that it is unusually cheap and easy for AI to reason about code written in these languages on silicon.
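
As a toy illustration of the property (a Python sketch, not any real concatenative language; the word set is invented): every program denotes a function from stacks to stacks, and concatenating programs composes those functions, so any contiguous slice of a valid token sequence is itself a valid program, given enough operands on the stack.

    # Toy concatenative interpreter (illustrative sketch only).
    # Words and programs are both functions from stacks to stacks,
    # and concatenating programs composes those functions.
    from typing import Callable, Dict, List

    Stack = List[int]

    WORDS: Dict[str, Callable[[Stack], Stack]] = {
        "dup":  lambda s: s + [s[-1]],
        "drop": lambda s: s[:-1],
        "swap": lambda s: s[:-2] + [s[-1], s[-2]],
        "add":  lambda s: s[:-2] + [s[-2] + s[-1]],
        "mul":  lambda s: s[:-2] + [s[-2] * s[-1]],
    }

    def run(program: List[str], stack: Stack) -> Stack:
        # Push integer literals; apply named words to the stack otherwise.
        for tok in program:
            stack = WORDS[tok](stack) if tok in WORDS else stack + [int(tok)]
        return stack

    p1 = ["dup", "mul"]   # square the top of the stack
    p2 = ["1", "add"]     # add one
    # Concatenation of programs is composition of the functions they denote.
    assert run(p1 + p2, [3]) == run(p2, run(p1, [3])) == [10]
    # Any contiguous slice of a program is itself a program (given enough
    # operands on the stack), so a learner can enumerate and splice
    # candidates freely, with no variable scopes or nested syntax to track.
    assert run((p1 + p2)[1:3], [4, 4]) == [16, 1]

That flat, composable search space, with no binding or nesting structure to keep consistent, is the kind of structure that makes concatenative code unusually cheap to search and reason over mechanically.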


Replies

wwweston · today at 2:07 AM

It sounds like you’re referring to a proof. Where can one find it, and what background prepares one for it?