Hacker News

shaky · last Friday at 6:39 PM

This is something I think about quite a bit, and I'm grateful for this write-up. The amount of friction to get privacy today is astounding.


Replies

Aurornis · last Saturday at 2:51 PM

> The amount of friction to get privacy today is astounding

I don't understand this.

It's easy to get a local LLM running with a couple of commands in the terminal. There are multiple local LLM runners to choose from.

This blog post introduces some additional tools for sandboxed code execution and browser automation, but you don't need those to get started with local LLMs.

There are multiple local options. This one is easy to start with: https://ollama.com/
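
For example, once Ollama is installed and a model has been pulled, talking to it from Python is only a few lines. This is just a sketch against Ollama's default local HTTP endpoint (localhost:11434); the model name "llama3" is only an assumption, so substitute whichever model you've actually pulled:

    import requests

    # Ask a locally running Ollama server (default port 11434) for a completion.
    # Assumes `ollama pull llama3` has already been run; nothing leaves the machine.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",   # assumed model name; use any model you have pulled
            "prompt": "Summarize why local LLMs help with privacy.",
            "stream": False,     # return one JSON object instead of a token stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])   # the generated text

The only network traffic here is to localhost, which is the whole point.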

sneak · last Friday at 7:16 PM

This write-up has nothing of the sort and is not helpful toward that goal.
