There's some irony in the fact that this website reads as extremely NOT AI-generated, very human in the way it's designed and the tone of its writing.
Still, this is a great idea, and one I hope takes off. I think there's a good argument that the future of AI is in locally-trained models for everyone, rather than relying on a big company's own model.
One thought: the ability to conveniently get this onto a 240V circuit would be nice. Having to find two different 120V circuits to plug this into will be a pain for many folks.
If I'm spending at least $12k USD on the machine, then doing some electrical work to accommodate it is not a big deal.
I don't view this as irony. This seems like good sense in understanding when AI usage will make things better and when it will not.
Good? That's what I want out of all websites. I don't want to read what an AI believes is the best thing for a website, I want to know the honest truth.
Big companies are pushing cloud really hard, and yeah, hardware prices are a problem too. People still buy Google Cloud and OneDrive when they could literally pick up an old computer from the trash and Frankenstein it into a NAS server.
I am a little surprised that they openly solicit code contributions with "Invest with your PRs" but don't have any statement on AI contributions.
Maybe the volume is low enough for them that well-intentioned but poor-quality PRs can be politely (or otherwise, culture depending) disregarded, and the method of generation is not important.
If you’re spending $65,000 on this thing, needing two circuits seems like a minor problem.
When you’re dealing with this kind of power, it’s easier just to colocate, where you’ll typically get two separate feeds of 208V.
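For reference, the 208V figure follows from standard three-phase arithmetic (not stated in the comment, but it explains why colo feeds read 208V rather than 240V): in a 120/208V three-phase system the phases are 120° apart, so the line-to-line voltage between two 120V phases is:

```latex
% Two 120 V phases separated by 120 degrees:
% line-to-line magnitude is 2 * 120 * sin(60 deg) = 120 * sqrt(3)
V_{LL} = 2 \cdot 120 \cdot \sin(60^\circ) = 120\sqrt{3} \approx 208\,\text{V}
```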
"locally-trained models for everyone"
Wouldn't there be a massive duplication of effort in that case? It'll be interesting to see how the costs play out. There are security benefits to think about as well in keeping things local-first.
A typical U.S. 240V circuit is actually just two 120V legs (hot wires 180° out of phase). Fairly trivial to rewire for that.
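The arithmetic behind this, as a quick sketch: in U.S. split-phase service the two hot legs are 180° out of phase with each other, so the voltage measured between them is double the leg-to-neutral voltage:

```latex
% U.S. split-phase service: two 120 V legs, 180 degrees apart
V_a = 120\angle 0^\circ\,\text{V}, \qquad V_b = 120\angle 180^\circ\,\text{V}
% Voltage between the two hot legs
V_{ab} = V_a - V_b = 120 - (-120) = 240\,\text{V}
```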
I find that the most respected writing about AI has very few signs of being written by AI. I'm guessing that's because people in the space are very sensitive to the signs and signal vs. noise.