Hacker News

midius · yesterday at 10:10 PM

Makes me think it's a sponsored post.


Replies

Cadwhisker · yesterday at 10:18 PM

LM Studio? No, it's the easiest way to run an LLM locally that I've seen, to the point where I've stopped looking at other alternatives.

It's cross-platform (Win/Mac/Linux), detects the most appropriate GPU in your system, and tells you whether the model you want to download will fit within its RAM footprint.

It lets you set up a local server that you can access through API calls as if you were remotely connected to an online service.
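For example, LM Studio's local server speaks an OpenAI-compatible HTTP API, so a client looks the same as one for a hosted service. Here's a minimal sketch using only the Python standard library; the port (1234 is LM Studio's default) and the model name are assumptions that depend on your local setup.

```python
# Minimal sketch of calling an LM Studio local server via its
# OpenAI-compatible chat completions endpoint.
# Assumptions: server running on localhost:1234 (the LM Studio
# default), and "local-model" as a placeholder model name.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"


def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat(prompt: str) -> str:
    """POST the prompt to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Usage (with a model loaded in LM Studio's server tab):
#   reply = chat("Summarize this in one sentence: ...")
```

Because the request/response shape matches the OpenAI API, existing client libraries can usually be pointed at the local server just by overriding the base URL.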
