Hacker News

mdp2021 · 10/11/2024 · 1 reply

> LLMs have dramatically worse performance on basic algebra questions when you add in irrelevant information

"Attention is all you need" /

(It is part of the general problem solving process to evaluate what is relevant and what is not.)


Replies

moffkalast · 10/11/2024

Differential attention that filters out noise is all you need :)
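The quip refers to a real proposal, the DIFF Transformer, which computes two separate softmax attention maps and subtracts a scaled copy of the second from the first, so that attention mass assigned to irrelevant context by both maps tends to cancel. A minimal numpy sketch of that idea (function names and the λ value are illustrative, not the paper's implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def diff_attention(q1, k1, q2, k2, v, lam=0.5):
    """Differential attention sketch: the difference of two softmax
    attention maps, scaled by lam, is used to weight the values.
    Common-mode "noise" attention present in both maps cancels out."""
    d = q1.shape[-1]
    a1 = softmax(q1 @ k1.T / np.sqrt(d))  # first attention map
    a2 = softmax(q2 @ k2.T / np.sqrt(d))  # second (noise) attention map
    return (a1 - lam * a2) @ v

# Toy example: with lam=0 this reduces to ordinary scaled dot-product
# attention over a single query/key projection.
rng = np.random.default_rng(0)
n, d = 4, 8
q1, k1 = rng.normal(size=(n, d)), rng.normal(size=(n, d))
q2, k2 = rng.normal(size=(n, d)), rng.normal(size=(n, d))
v = rng.normal(size=(n, d))
out = diff_attention(q1, k1, q2, k2, v, lam=0.5)
```

Note that if both query/key pairs are identical and λ = 1, the two maps cancel exactly and the output is all zeros, which is the limiting case of the noise-cancellation intuition.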