
ACCount37 · yesterday at 7:51 PM

If the platonic representation hypothesis holds across substrates, then it might matter very little in the end. It already holds across architectures in ML, empirically.
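
To make "holds across architectures" concrete, here's a rough sketch (mine, not from the paper) of one standard way such alignment gets measured, linear CKA (Kornblith et al., 2019), on a shared input set. The feature matrices below are random stand-ins for real model activations:

    import numpy as np

    def linear_cka(X, Y):
        # Linear centered kernel alignment (Kornblith et al., 2019).
        # X: (n, d1), Y: (n, d2): two models' activations on the same n inputs.
        # Score is 1.0 when the representations match up to rotation/scaling.
        X = X - X.mean(axis=0)
        Y = Y - Y.mean(axis=0)
        num = np.linalg.norm(X.T @ Y, "fro") ** 2
        den = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
        return num / den

    rng = np.random.default_rng(0)
    feats_a = rng.normal(size=(1024, 64))           # "model A" embeddings (stand-in)
    Q, _ = np.linalg.qr(rng.normal(size=(64, 64)))  # a random rotation
    print(linear_cka(feats_a, feats_a @ Q))         # 1.0: same geometry, new basis
    print(linear_cka(feats_a, rng.normal(size=(1024, 64))))  # near 0: unrelated

The empirical PRH claim is that independently trained models of different architectures score high on exactly this kind of alignment measure.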

The "backpropagation and Hebbian learning + predictive coding are two facets of the very same gradient descent" crowd also has a surprisingly good track record so far.
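
A toy version of that correspondence (my own sketch, under the standard predictive-coding energy and the small-output-error limit): on a two-layer linear net, relaxing the hidden state makes the purely local, Hebbian-flavored error signals line up with the backprop deltas (Whittington & Bogacz, 2017):

    import numpy as np

    rng = np.random.default_rng(0)
    x  = rng.normal(size=(4, 1))                 # input
    W1 = 0.1 * rng.normal(size=(3, 4))           # hidden weights
    W2 = 0.1 * rng.normal(size=(2, 3))           # output weights
    # Target near the current prediction: the match to backprop is exact
    # only in this small-output-error limit.
    t = W2 @ W1 @ x + 0.01 * rng.normal(size=(2, 1))

    # Backprop gradients of L = 0.5 * ||t - W2 @ W1 @ x||^2
    h      = W1 @ x
    delta2 = t - W2 @ h                          # output error
    delta1 = W2.T @ delta2                       # error sent backwards
    bp_gW1, bp_gW2 = -delta1 @ x.T, -delta2 @ h.T

    # Predictive coding: hidden activity x1 is a free state, relaxed to
    # minimize F = 0.5*||x1 - W1@x||^2 + 0.5*||t - W2@x1||^2.
    x1 = W1 @ x                                  # init at feedforward values
    for _ in range(100):                         # inference / relaxation loop
        e1 = x1 - W1 @ x                         # local error, hidden layer
        e2 = t - W2 @ x1                         # local error, output layer
        x1 = x1 - 0.5 * (e1 - W2.T @ e2)         # gradient descent on F
    # Weight gradients use only local quantities: error times presynaptic input.
    pc_gW1, pc_gW2 = -e1 @ x.T, -e2 @ x1.T

    # Relative disagreement is a few percent, shrinking as the error does.
    print(np.linalg.norm(pc_gW1 - bp_gW1) / np.linalg.norm(bp_gW1))
    print(np.linalg.norm(pc_gW2 - bp_gW2) / np.linalg.norm(bp_gW2))

Note the locality: each weight update is presynaptic activity times a local error unit, which is the Hebbian-looking piece, yet at equilibrium it descends the same loss landscape backprop does.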