Hacker News

khimaros · yesterday at 6:51 PM

Allocation is irrelevant. As an owner of one of these, you can absolutely use the full 128GB (minus OS overhead) for inference workloads.
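
For context, here's a minimal sketch of how you could sanity-check how much of that 128GB is actually free for model weights. It assumes a Python setup with the psutil package installed, and the ~40 GiB figure for a 4-bit 70B model is just a hypothetical illustration, not a spec from the thread:

```python
# Minimal sketch, assuming Python 3 with psutil installed (pip install psutil).
# Reports total system memory and what's left after OS overhead, then does a
# rough fit check against a hypothetical quantized model size.
import psutil

mem = psutil.virtual_memory()
total_gib = mem.total / 2**30
avail_gib = mem.available / 2**30
print(f"total: {total_gib:.1f} GiB, available after OS overhead: {avail_gib:.1f} GiB")

# Hypothetical example: ~40 GiB of weights for a 70B model at 4-bit quantization.
model_gib = 40
print("fits in memory" if avail_gib > model_gib else "does not fit")
```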


Replies

EasyMark · yesterday at 7:48 PM

Care to go into a bit more detail on the machine specs? I'm interested in picking up a rig to do some LLM stuff and I'm not sure where to get started. I also just need a new machine; mine is 8 years old (with some gaming GPU upgrades) at this point and It's That Time Again. No biggie tho, just curious what a good modern machine might look like.
