Hacker News

paxys · yesterday at 10:32 PM · 3 replies

The problem with all these "AI box" startups is that the product is too expensive for hobbyists, and companies that need to run workloads at scale can always build their own servers and racks and save on the markup (which is substantial). Unless someone can figure out how to get cheaper GPUs & RAM there is really no margin left to squeeze out.


Replies

qubex · today at 6:59 AM

They’re kickstarting a TINY device that is pocketable and aimed at consumers. I’ve backed it (full disclosure).

nine_k · today at 12:12 AM

Would a hedge fund that does not want to trust a public AI cloud just buy chassis, mobos, GPUs, etc., and build an equivalent themselves? I suspect they value their time differently.

kkralev · yesterday at 11:55 PM

I think the real gap isn't at the high end, though. There's a whole segment of people who just want to run a 7-8B model locally for personal use without dealing with cloud APIs or sending their data somewhere. You don't need 4 GPUs for that; a Jetson or even a mini PC with decent RAM handles it fine. The $12k+ market feels like it's chasing a different customer than the one who actually cares about offline/private AI.
