Which local model works best with this? (Assuming MacOS with 32GB unified RAM)
gpt-oss 20B works well. You'll want at least 12k context length for agent mode.
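If you're serving it through Ollama, here's a minimal sketch of bumping the context window above the default. The `gpt-oss:20b` tag, the `ollama` Python client, and the exact `num_ctx` value are assumptions on my part, so adjust them to whatever your local setup actually uses:

```python
# Minimal sketch: query a locally served gpt-oss 20B with a 12k+ context window.
# Assumes Ollama is running locally, the model was pulled (e.g. `ollama pull gpt-oss:20b`),
# and the Python client is installed (`pip install ollama`).
import ollama

response = ollama.chat(
    model="gpt-oss:20b",  # assumed model tag; match whatever your install uses
    messages=[{"role": "user", "content": "Summarize the build steps in this repo."}],
    options={"num_ctx": 16384},  # context length >= 12k so agent mode has headroom
)
print(response["message"]["content"])
```

On 32GB of unified RAM the 20B model plus a ~16k KV cache should still fit comfortably, but if you see swapping, drop `num_ctx` back toward 12k.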