Hacker News

boccaff, today at 12:06 PM

Llama models pushed the envelope for a while, and having them "open-weight" allowed a lot of tinkering. I would say that most fine-tuned models evolved from work done on top of Llama models.


Replies

oefrha, today at 12:40 PM

Llama wasn’t Yann LeCun’s work and he was openly critical of LLMs, so it’s not very relevant in this context.

Source: LeCun himself, https://x.com/ylecun/status/1993840625142436160 ("I never worked on any Llama."), along with many earlier reports and tweets from him.