Hacker News

bobjordan · yesterday at 2:06 PM

I’ve been using my 10-year-old ThinkPad X250 and a decade-old workstation without feeling any need to upgrade. However, the possibility of running powerful local LLMs that require a lot of GPU or unified memory has finally increased my interest. My impression is that laptops won’t see the major leap required to run truly large LLMs for another 5–10 years, but I expect workstation capabilities to advance more rapidly, so I may upgrade in the next 1–3 years.

My current workstation is a decade-old 22-core/44-thread Xeon plus four decade-old Titan X GPUs with 48GB of total VRAM, which is enough to run a decent local AI model, but I’m finally wanting more capacity. I haven’t been this interested in upgrading in a decade. NVIDIA’s new DGX-class offerings might convince me, depending on pricing and supply, although I may just wait a few more years to let things stabilize. Still, it’s an exciting time for hardware, especially now that there’s a tangible reason to invest in more power for local AI.
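For scale, here is a back-of-the-envelope sketch of what fits in 48GB of VRAM. The overhead factor and model sizes are illustrative assumptions, not benchmarks, and multi-GPU sharding adds its own costs:

```python
# Rough VRAM estimate for running an LLM locally. This is a
# back-of-the-envelope sketch; the 20% overhead for KV cache and
# activations is an assumption, not a measured figure.

def vram_needed_gb(params_billion: float, bytes_per_param: float,
                   overhead: float = 1.2) -> float:
    """Approximate VRAM in GB: parameter count times bytes per
    parameter, scaled by an assumed overhead factor."""
    return params_billion * bytes_per_param * overhead

# A 70B-parameter model at 4-bit quantization (~0.5 bytes/param)
# lands just under a 48GB budget:
print(vram_needed_gb(70, 0.5))   # 42.0

# The same model at fp16 (2 bytes/param) would need far more:
print(vram_needed_gb(70, 2.0))   # 168.0
```

By this rough math, a quad-Titan-X rig can hold the weights of a heavily quantized 70B model, though splitting a model across four older cards brings its own performance penalties.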


Replies

Melatonic · yesterday at 9:56 PM

The new Nvidia developer workstations (I believe they are much cheaper than full DGX systems) are definitely interesting.

I have a desktop with a Titan XP (somewhat similar to your Titan X). If you look up LLM performance, however, these older GPUs do quite poorly even with enough VRAM. They still hold up great for gaming and many other GPU-hungry tasks, though.

Personally, I think a really cool setup would be something like a modern MacBook Pro with a ton of RAM and a high-core-count CPU that could be plugged into an external GPU enclosure when needed. Depending on your LLM needs, you could upgrade the external GPU over time and still use the power-efficient laptop on the go.