Hacker News

eurekin today at 12:21 AM

Correct, most of r/LocalLlama has moved on to next-gen MoE models. DeepSeek introduced a few good optimizations that every new model seems to use now. Llama 4 was generally seen as a fiasco, and Meta hasn't made a release since.


Replies

fragmede today at 1:03 AM

What are some of the models people are using? (Rather than naming the ones they aren't.)
