Attention is an O(n^2) algorithm in sequence length. So even with Moore's-law doubling of compute, the feasible context length grows only as the square root of available compute — each hardware doubling buys roughly a 1.4x longer context, not 2x (and that's assuming Moore's law is still remotely close to alive).
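A minimal sketch of that scaling arithmetic (the function name and unit cost are illustrative assumptions, not from any real profiler): if attention costs ~n^2 operations, the largest affordable n for a compute budget C scales as sqrt(C).

```python
import math

def feasible_context(compute_budget, cost_per_pair=1.0):
    """Largest sequence length n with n^2 * cost_per_pair <= compute_budget.

    Hypothetical helper: models attention's O(n^2) pairwise cost only,
    ignoring the rest of the model.
    """
    return math.isqrt(int(compute_budget / cost_per_pair))

base = feasible_context(1e9)      # budget C
doubled = feasible_context(2e9)   # budget 2C after one Moore's-law doubling
# Doubling compute multiplies the affordable context by ~sqrt(2) ≈ 1.41
print(doubled / base)
```

In other words, under a quadratic algorithm an exponential hardware trend still gives exponential context growth, but with the exponent halved — and if hardware gains have slowed to something closer to polynomial, context growth is dampened to its square root.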