Let's see which company becomes the first to sell "coding appliances": hardware with a model good enough for normal coding.
Given how permissive Mistral's licensing is, they could be the first, provided hardware becomes fast/cheap/efficient enough to build a small box you can drop into an office.
Maybe in 5 years.
my bet is a deepseek box
llm in a box connected via usb is the dream.
...so it won't ever happen, it'll require wifi and will only be accessible via the cloud, and you'll have to pay a subscription fee to access the hardware you bought. obviously.
My MacBook Pro with an M4 Pro chip can handle a number of these models with reasonable performance (I think it can dedicate about 16GB of its unified memory to the GPU); my consistent bottleneck is the token caps. I assume someone with a much more powerful Mac Studio could run far larger models, since it can give the GPU something like 96GB out of its system RAM, iirc.
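For anyone wondering whether a given model fits: a rough back-of-the-envelope is parameter count times bits-per-weight, plus some headroom for the KV cache and activations. A small sketch (the 20% overhead factor is just an assumption, not a measured figure):

```python
def model_mem_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough memory footprint of a quantized LLM in GB.

    params_billions: model size in billions of parameters
    bits_per_weight: quantization level (e.g. 4 for Q4, 16 for fp16)
    overhead: fudge factor for KV cache / activations (assumed ~20%)
    """
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes / 1e9 * overhead

# A 14B model at 4-bit quantization fits comfortably in ~16GB:
print(round(model_mem_gb(14, 4), 1))   # ~8.4 GB
# A 70B model at 4-bit needs Mac Studio-class memory:
print(round(model_mem_gb(70, 4), 1))   # ~42.0 GB
```

Which lines up with the experience above: mid-size quantized models run fine on a 16GB-class machine, while the big ones want the 96GB+ tier.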