Hacker News

Aurornis · last Saturday at 2:51 PM

> The amount of friction to get privacy today is astounding

I don't understand this.

It's easy to get a local LLM running with a couple of commands in the terminal, and there are multiple local LLM runners to choose from.

This blog post introduces some additional tools for sandboxed code execution and browser automation, but you don't need those to get started with local LLMs.

Of the available options, this one is easy to start with: https://ollama.com/
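As a sketch of what "a couple commands" looks like with Ollama (assuming its standard install script and CLI; the model tag `llama3.2` is one example and available models change over time):

```shell
# Install Ollama on macOS/Linux via the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download a model and start an interactive chat with it, entirely locally
ollama run llama3.2
```

After the model downloads once, subsequent runs work offline with no data leaving the machine.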


Replies

apitman · last Sunday at 2:28 PM

> It's easy to get a local LLM running

Easy for what percentage of people?