Hacker News

dainiusse · 01/21/2025 · 1 reply

Curious: can anyone with a 128 GB RAM Mac share their experience? Is it usable for coding and running a model locally? How does the latency compare to, say, Copilot?


Replies

svachalek · 01/21/2025

A rambly "thinking" model like this is way too slow for coding assistance, imo, though maybe it could take on larger assignments than you could get out of a chat or coding model.