Hacker News

NitpickLawyer · last Monday at 7:38 AM · 1 reply

> They basically cloned Qwen3 on that

Oh, come on! GPT-4 was rumoured to be an MoE well before Qwen even started releasing models. OpenAI didn't have to "clone" anything.


Replies

littlestymaar · last Monday at 11:45 AM

First, it would be great if people stopped acting as if these billion-dollar corporations were sports teams.

Second, I don't claim OpenAI had to clone anything, and I have no reason to believe that their proprietary models copy other people's. But for this particular open-weight model, they clearly have an incentive to use exactly the same architectural base as another actor's, in order to avoid leaking too much information about their own secret sauce.

And finally, though GPT-4 was an MoE, it was most likely what TFA calls "early MoE", with a few very big experts rather than many small ones.
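
The few-big vs. many-small distinction comes down to how you slice a fixed parameter budget. A back-of-the-envelope sketch (all dimensions below are made-up illustrative numbers, not any real model's config):

```python
def moe_params(d_model, d_ff, n_experts, top_k):
    """Parameter count for one MoE feed-forward layer.

    Assumes two weight matrices per expert (up- and down-projection);
    returns (total params, params active per token under top-k routing).
    """
    per_expert = 2 * d_model * d_ff
    return n_experts * per_expert, top_k * per_expert

# "Early MoE" style: 8 big experts, top-2 routing (hypothetical numbers).
early_total, early_active = moe_params(d_model=4096, d_ff=16384, n_experts=8, top_k=2)

# Fine-grained style: 128 small experts, top-8 routing (hypothetical numbers).
fine_total, fine_active = moe_params(d_model=4096, d_ff=1024, n_experts=128, top_k=8)

print(f"early: {early_total/1e9:.2f}B total, {early_active/1e9:.2f}B active")
print(f"fine:  {fine_total/1e9:.2f}B total, {fine_active/1e9:.2f}B active")
```

With these made-up numbers both layouts hold the same total parameters, but the fine-grained layout activates far fewer per token and gives the router many more combinations of experts to choose from.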