Hacker News

Alifatisk · last Sunday at 10:35 PM

> Attention was developed before transformers.

I just looked this up and it’s true, this completely changes the timeline I had in my head! I thought the Transformer paper was what introduced the attention mechanism, but it existed before that and was applied to RNN encoder-decoders. Wow
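(For anyone curious what that pre-transformer attention looked like: here's a minimal NumPy sketch of Bahdanau-style additive attention over RNN encoder states. The names, shapes, and toy dimensions are my own illustration, not anyone's actual implementation.)

```python
import numpy as np

def additive_attention(decoder_state, encoder_states, W1, W2, v):
    """Additive (Bahdanau-style) attention over RNN encoder hidden states.

    decoder_state:  (d_dec,)    current decoder hidden state
    encoder_states: (T, d_enc)  encoder hidden state at each source position
    W1, W2, v:      learned parameters of the scoring MLP
    """
    # Score each encoder state against the current decoder state.
    scores = np.tanh(encoder_states @ W2.T + decoder_state @ W1.T) @ v  # (T,)
    # Softmax over source positions gives the attention weights.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted sum of encoder states, fed back into the decoder.
    context = weights @ encoder_states  # (d_enc,)
    return context, weights

# Toy example with random parameters (purely illustrative).
rng = np.random.default_rng(0)
T, d_enc, d_dec, d_att = 5, 8, 8, 16
ctx, w = additive_attention(
    rng.normal(size=d_dec),
    rng.normal(size=(T, d_enc)),
    rng.normal(size=(d_att, d_dec)),
    rng.normal(size=(d_att, d_enc)),
    rng.normal(size=d_att),
)
print(w)  # attention weights over the 5 source positions, summing to 1
```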


Replies

logicchains · last Sunday at 11:46 PM

Knowing how such things go, it was probably invented by Schmidhuber in the 90s.
