Hacker News

AlexCoventry · last Monday at 2:37 AM · 0 replies

I have a transformer attention mechanism which seems to be more data-efficient than the usual dot product, and I'm trying to write a performant backward kernel for it.
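For context on what a hand-written attention backward pass involves: the comment doesn't describe the custom mechanism, so the sketch below covers only the standard scaled dot-product baseline it is being compared against, written in NumPy with the gradients derived by hand (the softmax Jacobian-vector product is the step a custom kernel has to redo for any new similarity function). All function names here are illustrative, not from the commenter's code.

```python
import numpy as np

def attention_forward(Q, K, V):
    """Standard scaled dot-product attention: O = softmax(QK^T / sqrt(d)) V."""
    d = Q.shape[-1]
    S = Q @ K.T / np.sqrt(d)                      # similarity scores
    P = np.exp(S - S.max(axis=-1, keepdims=True)) # numerically stable softmax
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V, P                               # output and probabilities (saved for backward)

def attention_backward(Q, K, V, P, dO):
    """Hand-derived gradients w.r.t. Q, K, V given upstream gradient dO."""
    d = Q.shape[-1]
    dV = P.T @ dO                                 # O = P V  =>  dV = P^T dO
    dP = dO @ V.T                                 #          =>  dP = dO V^T
    # Softmax backward: dS = P * (dP - rowwise_sum(dP * P))
    dS = P * (dP - (dP * P).sum(axis=-1, keepdims=True))
    dQ = dS @ K / np.sqrt(d)                      # S = QK^T/sqrt(d)
    dK = dS.T @ Q / np.sqrt(d)
    return dQ, dK, dV
```

A quick finite-difference check against a scalar loss like `O.sum()` is the usual way to validate such a kernel before worrying about performance.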