Thanks!
One reason why I'm asking is that I'm in the market for a new laptop and am wondering whether it's worth spending more for the possible benefits of being able to run ~30-40GB local LLMs.
Unfortunately it doesn't look as if the answer is either "ha ha, obviously not" or "yes, obviously". (If the question were only about models available right now, I think the answer would be no, but they seem close enough to being useful that I'm reluctant to bet on them not being genuinely useful a year from now.)
Yeah, it's not an obvious answer at all. Spending ~$3,000+ on a laptop to run local models only makes economic sense if you are VERY paranoid about using APIs (there are plenty of API providers that I personally trust not to train on my data); otherwise that $3,000 will buy you many years' worth of access to the best available models via API.
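To make the "many years' worth" claim concrete, here's a back-of-the-envelope sketch. The monthly API spend figure is an assumption for illustration only (actual spend varies enormously with usage and model choice), not a real quote:

```python
# Rough comparison: laptop premium vs. years of API access it could buy.
# Both figures are illustrative assumptions, not real prices.
laptop_cost = 3000.00       # assumed extra spend on a high-RAM laptop
monthly_api_spend = 20.00   # assumed typical monthly spend on frontier-model APIs

years_of_api_access = laptop_cost / (monthly_api_spend * 12)
print(f"${laptop_cost:,.0f} buys ~{years_of_api_access:.1f} years of API access")
```

Under those assumed numbers the laptop premium covers about 12.5 years of API usage; heavier API use shrinks that figure, but the break-even point stays far out for most people.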