
throwaway894345 · last Wednesday at 3:55 AM

What does it mean that only 3B parameters are active at a time? Also any indication of whether this was purely CPU or if it’s using the Pi’s GPU?


Replies

kouteiheika · last Wednesday at 4:45 AM

> What does it mean that only 3B parameters are active at a time?

In a nutshell: LLMs generate tokens one at a time. "Only 3B parameters active at a time" means that for each of those tokens only 3B parameters need to be fetched from memory, instead of all 30B.
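
A minimal sketch of the idea (toy NumPy mixture-of-experts layer, top-2 of 8 experts; all sizes and names are made up, nothing here is from the actual model):

    import numpy as np

    # Toy MoE layer: a router scores every expert, we keep the top-k, and
    # only those experts' weight matrices are ever read for this token.
    n_experts, top_k, d = 8, 2, 16
    rng = np.random.default_rng(0)
    router_w = rng.standard_normal((d, n_experts))    # shared router projection
    experts = rng.standard_normal((n_experts, d, d))  # one weight matrix per expert

    def moe_forward(x):
        logits = x @ router_w                 # (n_experts,) routing scores
        chosen = np.argsort(logits)[-top_k:]  # indices of the top-k experts
        w = np.exp(logits[chosen])
        gates = w / w.sum()                   # softmax over the chosen experts
        # Only the chosen experts' matrices are touched; the other
        # n_experts - top_k stay idle in memory for this token.
        return sum(g * (x @ experts[i]) for g, i in zip(gates, chosen))

    print(moe_forward(rng.standard_normal(d)).shape)  # (16,)

Scale n_experts and the expert sizes up and that's where "3B active out of 30B" comes from: the memory traffic per token tracks the active slice, not the full parameter count.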

numpad0 · last Wednesday at 6:53 AM

I asked Gemini about it the other day (I'm dumb and shameless). Apparently it means that the model branches into a bunch of 3B sections in the middle and joins back together at both ends, totaling 30B parameters. This means the computational footprint shrinks to (bottom "router" parts + one 3B branch + top parts), effectively ~5B or whatever model-specific figure the "3B" implies, rather than the full 30B.
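
Back-of-the-envelope version of that accounting (the 2B/3B split below is a made-up illustration, not the model's real breakdown):

    shared = 2e9           # hypothetical always-on parts: router, embeddings, etc.
    active_experts = 3e9   # the "3B active" expert slices
    total = 30e9
    print(f"touched per token: {(shared + active_experts) / 1e9:.0f}B of {total / 1e9:.0f}B")
    # touched per token: 5B of 30B -- the other 25B of expert weights sit idle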

MoE models still operate on a token-by-token basis, i.e. "pot/at/o" -> "12345/7654/8472". "Experts" are selected on a per-token basis, not per interaction, so the "expert" naming might be a bit of a misnomer, or marketing.
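
Quick demo of that per-token selection (same toy router idea as above; the expert indices mean nothing, the point is just that they differ from token to token):

    import numpy as np

    n_experts, top_k, d = 8, 2, 16
    rng = np.random.default_rng(1)
    router_w = rng.standard_normal((d, n_experts))

    tokens = rng.standard_normal((3, d))  # stand-ins for pieces like "pot/at/o"
    for t, x in enumerate(tokens):
        chosen = sorted(np.argsort(x @ router_w)[-top_k:].tolist())
        print(f"token {t} -> experts {chosen}")  # each token routes independently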