Hacker News

Gracana · yesterday at 8:35 PM

Six months ago I'd have said EPYC Turin. You could do a heck of a build with 12-channel DDR5-6400 and a GPU or two for the dense model parts. $20k would have been a huge budget for a homelab CPU/GPU inference rig at the time; now $20k won't buy you the memory.
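The appeal of 12-channel DDR5-6400 is raw memory bandwidth, since decode speed in CPU inference is largely bandwidth-bound. A minimal back-of-envelope sketch (the ~32B-active MoE figure and 8-bit quantization are illustrative assumptions, not from the comment):

```python
# Back-of-envelope: peak memory bandwidth and decode speed for a
# CPU inference rig. All model figures below are illustrative assumptions.

def ddr5_bandwidth_gbs(channels: int, mts: int, bus_bytes: int = 8) -> float:
    """Peak bandwidth in GB/s: channels * transfers/s * 8 bytes per transfer."""
    return channels * mts * bus_bytes / 1000  # MT/s * bytes -> MB/s -> GB/s

def decode_tokens_per_s(bandwidth_gbs: float, active_gb: float) -> float:
    """Decode streams the active weights once per token, so it is
    roughly bandwidth / bytes-of-active-weights (an upper bound)."""
    return bandwidth_gbs / active_gb

bw = ddr5_bandwidth_gbs(channels=12, mts=6400)  # EPYC Turin: 12ch DDR5-6400
print(f"peak bandwidth: {bw:.1f} GB/s")         # -> 614.4 GB/s

# Hypothetical large MoE with ~32B active params at 8-bit (1 byte/param):
print(f"decode upper bound: {decode_tokens_per_s(bw, 32):.1f} tok/s")
```

Real throughput lands well below this ceiling (NUMA effects, attention compute, KV-cache traffic), which is why the comment pairs the CPU with a GPU or two for the dense parts.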


Replies

mythz · today at 1:03 AM

Not VRAM? What performance are people getting running GLM or Kimi on DDR5?