Hacker News

prodigycorp · today at 5:19 PM

The comments trashing this are right to be skeptical; they remember the benchmaxxing of Llama 4. This model was out in the wild as early as a couple of months ago, but they didn't release it because it was only at Gemini 2.5 Pro levels.


Replies

zozbot234 · today at 5:24 PM

The Llama 4 series was one of the earliest large MoEs to be made publicly available. People just ignored it because they were focused on running smaller, denser models at the time; we should know better these days.
