IIUC Gemini will run in Apple's cloud infra, not on device. The only "Gemini" local model is quite old by today's standards and not very capable for local inference (newer open-source models are better).
That's what I figured. Someday it will be possible; until then, LM Studio or Ollama are the only potential hookups.
I've got some ideas inspired by this project. It's promising.
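For anyone exploring the Ollama hookup: a local Ollama server exposes an HTTP API on port 11434 by default, so a client can talk to it with nothing but the standard library. A minimal sketch (the model name "llama3" is just a placeholder; substitute whatever model you have pulled locally):

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot (non-chat) generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt and return the model's full response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

LM Studio works similarly but serves an OpenAI-compatible API on port 1234 by default, so the same approach applies with a different URL and payload shape.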