Playing with local LLMs is indeed fun. I use Kasm workspaces[0] to run a desktop session with ollama running on the host. That gives me isolation and lets me experiment with all manner of crazy things (I tried to build a computer-use AI, but it wasn't very good).
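To give a sense of the setup: from inside the containerized desktop you just talk to Ollama's HTTP API on the host. A minimal sketch, assuming Ollama's default port (11434) and a default Docker bridge gateway address (both the address and the `llama3` model name are assumptions, adjust for your network and models):

```python
import json

# Assumed: host's Ollama is reachable from the container at the Docker
# bridge gateway on the default port. Adjust to your actual setup.
OLLAMA_URL = "http://172.17.0.1:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()

body = build_request("llama3", "Say hello in five words.")
print(body.decode())

# To actually send it (requires ollama serving on the host):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL, data=body,
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```

The point of the split is that the workspace container can be torn down or snapshotted freely while the models (and GPU access) stay on the host.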