Hacker News

kouteiheika · yesterday at 4:35 PM

If you want to prove a new alternative to attention (i.e. show that it works and/or that it's faster in a real-world scenario) without breaking the bank, then one of the best ways to do that is probably to retrain an already existing model, just with the attention modules swapped out. Once you have such a model you can do apples-to-apples benchmarks.

This has been done successfully in the past:

https://huggingface.co/featherless-ai/QRWKV-72B

Note that this is a 72B model, which would be very expensive to train from scratch, but here they did the conversion for less than $2000.
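
In practice a conversion like that boils down to freezing the pretrained weights, swapping the attention modules for the new mechanism, and training only the replacements (typically by distilling from the original model). A minimal PyTorch sketch of just the swap step, using GPT-2 as a stand-in; `MyAlternativeAttention` and the module paths are placeholders that differ per architecture, and the distillation loop and KV-cache wiring are omitted:

```python
# Sketch: swap the attention modules of a pretrained transformer and
# mark only the replacements as trainable; everything else stays frozen.
# GPT-2 is used as a small stand-in model; real conversions also need
# the block's return signature and cache handling matched, plus a
# distillation / fine-tuning loop, none of which is shown here.
import torch
import torch.nn as nn
from transformers import AutoModelForCausalLM


class MyAlternativeAttention(nn.Module):
    """Placeholder for the attention replacement (RWKV, linear attention, ...)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.mixer = nn.Linear(hidden_size, hidden_size)  # stand-in only

    def forward(self, hidden_states, **kwargs):
        # A real replacement would implement the new mixing mechanism here.
        return (self.mixer(hidden_states),)


model = AutoModelForCausalLM.from_pretrained("gpt2")

# Freeze all original weights; only the swapped-in modules will train.
for p in model.parameters():
    p.requires_grad = False

for layer in model.transformer.h:  # GPT-2 layout; other models differ
    new_attn = MyAlternativeAttention(model.config.hidden_size)
    layer.attn = new_attn  # replace the self-attention block
    for p in new_attn.parameters():
        p.requires_grad = True

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable params after swap: {trainable:,}")
```

Because only the new modules are trained against the frozen original, the compute budget is a small fraction of a from-scratch run, which is what makes sub-$2000 conversions of large models plausible.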


Replies

Herring · yesterday at 8:23 PM

I'd say try the nanogpt speedrun. It's much easier to train, and gives you a better comparison vs optimized systems.

https://github.com/KellerJordan/modded-nanogpt
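
Before committing to a full speedrun-style training comparison, a quick (and admittedly crude) sanity check is to microbenchmark the candidate token mixer against PyTorch's fused attention across sequence lengths. The sketch below uses a throwaway cumulative-mean mixer purely as a placeholder, not any specific proposal:

```python
# Microbenchmark sketch: wall-clock a placeholder token-mixing layer
# against PyTorch's fused scaled_dot_product_attention at several
# sequence lengths. Throughput only; it says nothing about quality.
import time
import torch
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"
batch, heads, dim = 2, 8, 64


def baseline_attention(x):
    # x: (batch, heads, seq, dim); standard causal attention
    return F.scaled_dot_product_attention(x, x, x, is_causal=True)


def candidate_mixer(x):
    # Placeholder linear-time mixer: cumulative mean over the sequence axis.
    counts = torch.arange(1, x.shape[2] + 1, device=x.device).view(1, 1, -1, 1)
    return x.cumsum(dim=2) / counts


def bench(fn, x, iters=20):
    for _ in range(3):  # warm-up
        fn(x)
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        fn(x)
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters


for seq in (512, 1024, 2048):
    x = torch.randn(batch, heads, seq, dim, device=device)
    print(f"seq={seq:5d}  sdpa={bench(baseline_attention, x) * 1e3:7.2f} ms"
          f"  candidate={bench(candidate_mixer, x) * 1e3:7.2f} ms")
```

The value of the speedrun itself is that it goes further: it measures wall-clock time to a fixed validation loss against a heavily tuned baseline, not just raw layer throughput.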

tuned · today at 6:12 AM

Thanks for reading. I cannot retrain an existing model, as the self-attention mechanism has been completely redesigned. The keys and values in self-attention are stored as scalars, so a latent space with traditional weights does not make sense in the context of a topological transformer. The two latent spaces might eventually turn out to be somehow equivalent, but they would store totally different values.
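
For intuition only (this is not the topological transformer's actual design), the weight-transfer problem is easy to see in a toy comparison: standard attention learns full d_model-sized projections for K and V, while a scalar-per-token variant only has one-dimensional ones, so there is simply no pretrained weight to transplant:

```python
# Toy contrast (NOT the actual "topological transformer" design) showing
# why pretrained projections don't carry over: standard attention uses
# (d_model -> d_model) K/V projections, a scalar-K/V variant uses
# (d_model -> 1), so the weight shapes have no correspondence.
import torch
import torch.nn as nn
import torch.nn.functional as F

d_model, seq = 64, 10
x = torch.randn(1, seq, d_model)

# Standard single-head attention: K and V are d_model-dimensional vectors.
Wq = nn.Linear(d_model, d_model, bias=False)
Wk = nn.Linear(d_model, d_model, bias=False)
Wv = nn.Linear(d_model, d_model, bias=False)
std_out = F.scaled_dot_product_attention(Wq(x), Wk(x), Wv(x))

# Scalar variant (purely illustrative): one key scalar and one value
# scalar per token, so scores come from products of 1-D quantities
# instead of d-dimensional dot products.
wq = nn.Linear(d_model, 1, bias=False)
wk = nn.Linear(d_model, 1, bias=False)
wv = nn.Linear(d_model, 1, bias=False)
scores = wq(x) @ wk(x).transpose(1, 2)   # (1, seq, seq) built from scalars
weights = scores.softmax(dim=-1)
scalar_out = weights @ wv(x)             # (1, seq, 1)

print("standard K projection:", Wk.weight.shape)  # torch.Size([64, 64])
print("scalar   k projection:", wk.weight.shape)  # torch.Size([1, 64])
print("outputs:", std_out.shape, "vs", scalar_out.shape)
```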

andai · yesterday at 5:23 PM

This is interesting. Has there been more research into this architecture? I hear about it once every few years but it always seems like a niche / experimental thing. But based on the graph in their blog post you'd expect every company to be using this.

throwaway314155 · yesterday at 10:01 PM

That doesn’t tell you if the new method continues to perform better at higher parameter counts.
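
One common, if imperfect, way to probe that is to run both architectures at several small sizes and fit scaling curves, keeping in mind that trend lines can cross beyond the fitted range, which is exactly the risk being pointed out. A sketch with numpy; the loss numbers are arbitrary placeholders, not measurements:

```python
# Sketch: fit a simple power law (loss ~ a * N^-b) to pilot runs of two
# architectures and naively extrapolate. The losses below are arbitrary
# placeholder values, NOT measurements; the point is the procedure.
import numpy as np

params = np.array([10e6, 30e6, 100e6, 300e6])   # pilot-run model sizes

loss_baseline = np.array([3.9, 3.5, 3.1, 2.8])  # placeholder values
loss_variant = np.array([3.7, 3.4, 3.1, 2.9])   # placeholder values


def fit_power_law(n, loss):
    # log(loss) = log(a) - b * log(n): a linear fit in log-log space
    slope, intercept = np.polyfit(np.log(n), np.log(loss), 1)
    return np.exp(intercept), -slope


for name, loss in [("baseline", loss_baseline), ("variant", loss_variant)]:
    a, b = fit_power_law(params, loss)
    pred_3b = a * (3e9) ** (-b)  # naive extrapolation to 3B params
    print(f"{name}: exponent b={b:.3f}, extrapolated loss at 3B ~ {pred_3b:.2f}")
```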

oofbey · yesterday at 4:57 PM

Depending on how different the attention mechanism is, that might not work. If it's just a faster / different way of finding the tokens to attend to, sure. But I get the sense the author is implying this method uses different semantics somehow. Although tbh I didn't follow it entirely.