Hacker News

pona-a · today at 6:07 AM

Attention is an O(n^2) algorithm. Combined with Moore's doubling, that means each doubling of compute only buys a factor of sqrt(2) in context length, i.e. usable context grows as the square root of available compute (assuming Moore's law is still remotely close to alive).
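A back-of-the-envelope sketch of that tradeoff, using a naive self-attention cost model (the ~2·n²·d FLOP figure and the budget are illustrative assumptions, not from the comment):

```python
import math

def max_context(flop_budget, d=64):
    # Naive self-attention over n tokens costs roughly 2 * n^2 * d FLOPs
    # (the QK^T score matrix plus the weighted sum over V), so the largest
    # affordable context is n = sqrt(budget / (2 * d)).
    return math.floor(math.sqrt(flop_budget / (2 * d)))

budget = 1e12                  # hypothetical FLOP budget per forward pass
n1 = max_context(budget)
n2 = max_context(2 * budget)   # one Moore's-law doubling
print(n1, n2, n2 / n1)         # ratio is ~sqrt(2), not 2
```

Doubling the budget multiplies the affordable context by only about 1.41, which is the point: quadratic attention converts exponential hardware gains into a much slower growth in context length.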