Hacker News

ninetyninenine last Wednesday at 10:37 AM

I'm tired of using AI in cloud services. I want user-friendly, locally owned AI hardware.

Right now nothing is consumer friendly. I can't buy an all-in-one package with a locally running, ChatGPT-quality UI or voice command system. I want someone to do for AI what the Mac did for PCs.


Replies

Hilift last Wednesday at 10:48 AM

Oracle just announced they are spending $40 billion on GPU hardware. All cloud providers have an AI offering, and there are AI-specific cloud providers. I don't think retail is invited.

Workaccount2 last Wednesday at 3:12 PM

From the most unexpected place (though maybe expected, if you believed they were paying attention):

Maxsun is releasing a 48GB dual Intel Arc Pro B60 GPU. It's expected to cost ~$1,000.

So for around $4k you should be able to build an 8-core system with 192GB of VRAM across four of those cards, which would let you run some decent models locally.

This also assumes the community builds an Intel workflow, but given how greedy Nvidia is with VRAM, it seems poised to be a hit.
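For a rough sense of what such a workflow might look like, here is a minimal sketch using llama-cpp-python, which can load quantized GGUF models and offload layers to a GPU; for Intel Arc that would require a build with the SYCL backend, and the model file and parameters below are placeholders, not a tested Arc configuration.

# Minimal sketch of local inference with llama-cpp-python.
# Assumes the library was built with a GPU backend the cards support
# (for Intel Arc, the SYCL build); the model file is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3-70b-instruct.Q4_K_M.gguf",  # placeholder quantized model
    n_gpu_layers=-1,  # offload all layers to the GPU(s)
    n_ctx=8192,       # context window; larger contexts need more VRAM
)

result = llm.create_completion(
    "Why does local LLM inference need so much VRAM?",
    max_tokens=128,
)
print(result["choices"][0]["text"])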

show 1 reply
Disposal8433 last Wednesday at 11:09 AM

Your local computer is not powerful enough, and that's why you must welcome those brand new mainframes... I mean, "cloud services."

show 2 replies
ata_aman last Wednesday at 3:42 PM

I built something[0] last year aimed at being very consumer friendly: unbox -> connect -> run. The first iteration is purely to test the concept and is pretty low power; I'm currently working on a GPU version for bigger models, launching Q4 this year.

[0] https://persys.ai

petesergeant last Wednesday at 11:33 AM

Hoping the DGX Spark will deliver on this

show 1 reply