I'm tired of using AI in cloud services. I want user-friendly, locally owned AI hardware.
Right now nothing is consumer friendly. I can't get a packaged deal: a locally running, ChatGPT-quality UI or voice-command system in one all-in-one box. Like what Macs did for PCs, I want the same for AI.
From the most unexpected place (but maybe expected if you believed they were paying attention)
Maxsun is releasing a 48GB dual Intel Arc Pro B60 GPU. It's expected to cost ~$1,000.
So for around $4k you should be able to build an 8-core, 192GB-VRAM local AI system (four of those cards), which would let you run some decent models locally.
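Back-of-envelope math for that build, assuming four of the dual-B60 cards at the rumored price (the card count and non-GPU budget here are my assumptions, not a quote):

```python
# Rough cost/VRAM estimate for a four-card dual-B60 build.
cards = 4                 # 192GB total VRAM implies four 48GB cards
vram_per_card_gb = 48     # each card pairs two 24GB B60 GPUs
price_per_card = 1000     # rumored ~$1,000 per card

total_vram_gb = cards * vram_per_card_gb   # 192GB
gpu_cost = cards * price_per_card          # $4,000 for GPUs alone

print(f"{total_vram_gb}GB VRAM, ${gpu_cost} in GPUs")
```

Note the GPUs alone eat the whole ~$4k; CPU, board, RAM, and PSU push the real total somewhat higher.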
This also assumes the community builds an Intel workflow, but given how stingy Nvidia is with VRAM, it seems poised to be a hit.
Your local computer is not powerful enough, and that's why you must welcome those brand new mainframes... I mean, "cloud services."
I made something[0] last year to be very consumer friendly: unbox -> connect -> run. The first iteration is purely to test the concept and is pretty low-power; I'm currently working on a GPU version for bigger models, launching Q4 this year.
Oracle just announced they are spending $40 billion on GPU hardware. All cloud providers have an AI offering, and there are AI-specific cloud providers. I don't think retail is invited.