Hacker News

cubefox · yesterday at 8:38 PM · 1 reply

> Training a one bit neural network from scratch is apparently an unsolved problem though.

It was until recently, but there is a new method that trains them directly, without any floating-point math, using "Boolean variation" instead of classical (Newton/Leibniz) differentiation:

https://proceedings.neurips.cc/paper_files/paper/2024/hash/7...
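To give a flavor of the general idea (this is NOT the paper's algorithm, just a toy sketch): you can update {-1, +1} weights with pure integer/boolean logic, e.g. greedily flipping a weight only when the flip reduces error, with no floating-point math anywhere:

```python
import random

random.seed(0)

# Toy illustration (not the linked paper's method): train a single
# binary neuron with weights in {-1, +1} using only integer arithmetic.
# The "teacher" is a hidden random sign vector we try to match.

n = 16
target = [random.choice([-1, 1]) for _ in range(n)]
w = [random.choice([-1, 1]) for _ in range(n)]

def predict(weights, x):
    # Sign of an integer dot product; no floats involved.
    s = sum(wi * xi for wi, xi in zip(weights, x))
    return 1 if s >= 0 else -1

def error(weights, samples):
    # Number of inputs where our neuron disagrees with the teacher.
    return sum(1 for x in samples if predict(weights, x) != predict(target, x))

samples = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(200)]
init_err = error(w, samples)

for _ in range(5):  # a few greedy passes over the weight bits
    for i in range(n):
        base = error(w, samples)
        w[i] = -w[i]          # tentatively flip one bit
        if error(w, samples) >= base:
            w[i] = -w[i]      # revert the flip if it didn't help

print(init_err, "->", error(w, samples))
```

Since a flip is only kept when it strictly reduces the error count, the error is monotonically non-increasing; the actual paper replaces this brute-force search with a principled "Boolean variation" analogue of the derivative.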


Replies

fooker · today at 12:54 AM

Nice!