> I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked […]
768GB of RAM is insane…
Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.
Look at the way age gating is going in a global coordinated push. Can control of compute be far behind?
It wasn't my primary motivator, but it hasn't made me regret my decision.
I hummed and hawed on it for a good few months myself.
> 768GB of RAM is insane.
Before this price spike, it used to be you could get a second-hand rack server with 1TB of DDR4 for about $1000-2000. People were massively underestimating the performance of reasonably priced server hardware.
You can still get that, of course, but it costs a lot more. The recycling company I know is now taking the RAM out of every server and selling it separately.
Apple hardware is incredibly overpriced.
With the way legislation is going these days, self hosting is becoming ever more important. RAM for ZFS + containers on k3s doesn't end up being that crazy if you assume you need to do everything on your own. (At home I've got one 1TB RAM machine, one 512GB, and 3x 128GB, all in a k3s cluster with various GPUs and about half a PB of storage. Before ~last Sept this wasn't _that_ expensive to do.)
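One reason ZFS boxes want so much RAM: by default OpenZFS on Linux lets its ARC read cache grow to roughly half of physical memory, which on a 1TB machine is ~512GB. If you're co-locating k3s workloads on the same host, you'd typically cap it. A minimal sketch, assuming OpenZFS on Linux; the 64GiB figure is purely illustrative, not a sizing recommendation:

```shell
# Cap the ZFS ARC so it doesn't crowd out container workloads.
# zfs_arc_max is specified in bytes; 64GiB here is an example value.
echo "options zfs zfs_arc_max=$((64 * 1024 * 1024 * 1024))" | \
  sudo tee /etc/modprobe.d/zfs.conf

# Apply immediately without rebooting (same value, in bytes):
echo $((64 * 1024 * 1024 * 1024)) | \
  sudo tee /sys/module/zfs/parameters/zfs_arc_max
```

The modprobe.d line makes the cap persistent across reboots; the sysfs write takes effect right away.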
My home server has 512GB RAM and 48 cores; my 4 desktops are 16 cores, 128GB, and a 4060 GPU each. The server is second hand and I paid around $2500 for it. Just below $3000 each for the desktops when I built them. All prices are in Canadian Pesos.
> spending $10k on a MacBook Pro with 128GB.
As someone who just bought a completely maxed out 14" MacBook Pro with an M5 Max, 128GB of RAM, and an 8TB SSD, it was not $10k, it was only a bit over $7k. Where is this extra $3k going?
Your battery life is going to suffer because of the extra RAM as well.
I don't know your workloads, but for me personally 64 GB is the ceiling for RAM: I can run an entire k8s cluster locally with that, and the M5 Pro with top cores is the same CPU as the M5 Max. I don't need the GPU either; local AI and OSS models are just a toy for my use-cases, and I'm always going to shell out for the API/frontier capabilities. I'm even thinking about the 48 GB config, because those already have 8% discounts shipped by Amazon, and I never hit that ceiling even on my workstation with 64 GB.
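For the "entire k8s cluster locally" claim, a minimal sketch of one way to do it, assuming `kind` and Docker are installed (k3d or minikube work similarly); the cluster name and node layout are arbitrary examples:

```shell
# Define a 3-node local cluster: 1 control plane + 2 workers.
cat <<'EOF' > kind-local.yaml
kind: Cluster
apiVersion: kind.x-k8s.io/v1alpha4
nodes:
  - role: control-plane
  - role: worker
  - role: worker
EOF

# Create the cluster and confirm all nodes registered.
kind create cluster --name lab --config kind-local.yaml
kubectl get nodes
```

Each node runs as a container, so memory use scales with what you schedule on it rather than with node count, which is why a setup like this fits comfortably in 64 GB.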