> two 4090s is not consumer grade
I think that is a very narrow perspective. Enormous numbers of consumers own $50,000 cars, but a pair of $2000 GPUs is "not consumer"?
I agree with your view that cheap tokens on SOTA models are a trap -- people should use local AI or no AI.
I would still question how useful a local model is, even with $10k in GPUs. I certainly haven't seen any great uses myself from these smaller models (<500B parameters), except claims from people who are so enamored with AI that basically any LLM output impresses them, like a toddler entertained by the sound their velcro shoes make.
> Enormous numbers of consumers own $50,000 cars, but a pair of $2000 GPUs is "not consumer"?
$50k is a median-priced car in the US. I'd guess >99.9% of people do not own $4000 of GPUs. I consider myself a computer person, and I don't think I even own $4000 of computer hardware in total.