I recently had a similar experience, although not this size.
Pre-story: For 3 years I wanted to build a rack gaming server so I can play with my son in our small apartment, where we don't have enough space for a gaming computer (my wife also doesn't allow it). I have a stable IPsec connection to my parents' house, where I have a powerful PV plant (90 kWp) and a rack server for my freelance job.
Fast forward to 2 months ago: I see a Supermicro SYS-7049GP-TRT for 1400€ on eBay. It looks clean, sold by some IT reuse warehouse. No description, just 3 photos and the case label. I ask the seller whether he knows what's in it and he says he didn't check. The case alone costs 3k new here in Germany. I buy it.
It arrives. 64GB ECC memory, 2x Xeon Silver, 1x 500GB SSD, 5x GBit LAN cards, dual 2200-watt power supply. I remove the air shroud, and: an Nvidia V100S 32GB emerges. I sell the card on eBay for 1600€ and buy 2x Xeon 6254 CPUs (100€ each) to replace the two Silver ones. Last week, I bought two Blackwell RTX 4000 Pro for 1100€ each. Enough for gaming with my son! (And I can have some fun with LLMs and Home Assistant/smart home...)
The case fits 4x dual-slot GPUs, so I could fit 4x RTX 6000 in it (384GB VRAM). At ~3k per card, that would come to 12k (still too much for me... but let's check back in a couple of years).
Buying used enterprise gear is fun. I had so many good experiences and this stuff is just rock solid.
Love how a €7.5k 20 kilogram server is placed on a €5 particleboard table. I have owned several LACKs but would never put anything valuable on it. IKEA rates them at 25 kilogram maximum load.
> Your mileage may vary. Literally: I had to drive two hours to pick this thing up.
Good one
Serious question: does this thing actually make games run really great? Or are they so optimized for AI/ML workloads that they either don’t work or run normal video games poorly?
Also:
> I arrived at a farmhouse in a small forest…
Were you not worried you were going to get murdered?
> Getting the actual GPU working was also painful, so I’ll leave the details here for future adventurers:
> # Data Center/HGX-Series/HGX H100/Linux aarch64/12.8 seem to work! wget https://us.download.nvidia.com/tesla/570.195.03/NVIDIA-Linux...
> ...
Nothing makes you feel more "I've been there" than typing inscrutable arcana to get a GPU working for ML work...
but .. you know .. can it run Crysis? :-D
SCNR
While this is undoubtedly still an excellent deal, the comparison to the new price of an H100 is a bit misleading, since today you can buy a new, legit RTX 6000 Pro for about $7-8k and get similar performance on at least the first two of the models tested. As a bonus, those can fit in a regular workstation or server, and you can buy multiple. This thing is not worth $80k, in the same way that any old enterprise equipment is not worth nearly as much as its price when new.
Wow! As others have said, deal of the century!! As a side note, a few years back, I used to scrape eBay for Intel QS Xeon and quite a few times managed to snag incredible deals, but this is beyond anything anyone has ever achieved!
Pretty amazing, although the power consumption and volume put it past the envelope of what I would be willing to run at home…
I would appreciate it if someone could name some shops where you can buy used enterprise grade equipment.
Most of them are in California? Anything in NY/NJ?
Great work on the rebuild! The photos are helpful, but if by any chance you happened to film the process, I'd love to see it on YouTube.
Ah, that's the best way to spend ~10K
Wow! Kudos for thinking it was possible and making it happen. I was wondering how long it would be before big local models were possible under 10k—pretty impressive. Qwen3-235B can do mundane chat, coding, and agentic tasks pretty well.
Argh, I was so, so hoping that this is a 'thing' and I can just do that too.
Let's continue to hope
This is freaking cool. Nice job!
So, what do you plan to do with it?
What inference performance are you getting on this with llama?
How long would it take to recoup the cost if you made the model available for others to run inference at the same price as the big players?
For that price? The bubble already popped for sure!
What an incredible barn-find type of story. And you are among the very few buyers who could have so lovingly done such a great job debugging the driver and motherboard issues. Please add a kitsch Serial Experiments Lain-themed computing shrine around this incredible work, and all's done.
> 4x Arctic Liquid Freezer III 420 (B-Ware) - €180
Quite an aside, but man: I fricking love Arctic. Seeing their fans in the new Corsi-Rosenthal boxes has been awesome. Such good value. I've been using a Liquid Freezer II after nearly buying my last air-cooled heatsink and seeing the LF-II on sale for <$75. Buy.
Please give us some power consumption figures! I'm so curious how it scales up and down. Do different models draw similar or different power? Asking a lot, but it'd be so neat to see a somewhat high-res view (>1 sample/s) of power consumption (watts) on these things — such a unique opportunity.
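For anyone wanting to capture that kind of trace themselves, here's a minimal sketch that polls `nvidia-smi` at ~2 Hz. The `--query-gpu=power.draw` and `--format=csv,noheader,nounits` flags are standard nvidia-smi options; the polling rate, output format, and helper names are just my own choices.

```python
#!/usr/bin/env python3
"""Sketch: sample per-GPU board power at ~2 Hz via nvidia-smi.

Assumes an Nvidia driver with nvidia-smi on PATH. nvidia-smi prints
one line per GPU with the power draw in watts (no units/header,
given the flags below).
"""
import subprocess
import time


def parse_watts(output: str) -> list[float]:
    """Parse nvidia-smi CSV output (one power value per line) into floats."""
    return [float(line.strip()) for line in output.splitlines() if line.strip()]


def sample_power(interval_s: float = 0.5, samples: int = 120) -> None:
    """Poll total and per-GPU power draw every interval_s seconds."""
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        watts = parse_watts(out)
        print(f"{time.time():.1f}  total={sum(watts):.0f} W  per-GPU={watts}")
        time.sleep(interval_s)


if __name__ == "__main__":
    sample_power()
```

Piping that into a file while loading different models would give exactly the kind of per-model trace you're asking about.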
one of the coolest things i've seen recently. kudos!
It's practically free
Deal of the century.
Inspiring! Is there an IP I can connect to so I can test the inference speed?
You lucky dog. Have fun!
Actually, the most incredible part of the story is that a computer geek of this 9th-circle-of-geekdom level has a wife.
Maybe the title could be "I bought an Nvidia server..." to avoid confusion that it's something to do with Grace Hopper the person, and her servers... or mainframes?
Can you mine Bitcoin on it?
This is the story of how I bought enterprise-grade AI hardware designed for liquid-cooled server racks that was converted to air cooling, and then back again, survived multiple near-disasters (including GPUs reporting temperatures of 16 million degrees), and ended up with a desktop that can run 235B parameter models at home. It’s a tale of questionable decisions, creative problem-solving, and what happens when you try to turn datacenter equipment into a daily driver.