Run a server with Ollama and use the Continue extension configured for Ollama.
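Roughly, that setup looks like the sketch below. This assumes Ollama's default port (11434) and uses `llama3` as a placeholder model name (swap in whatever you've pulled); Continue's config location and format have changed across versions, so treat the JSON snippet as an older-style example, not gospel:

```sh
# Start the Ollama server (listens on localhost:11434 by default)
ollama serve

# In another terminal, pull a model for it to serve
ollama pull llama3

# Then point Continue at Ollama, e.g. in ~/.continue/config.json
# (older JSON config format; newer Continue versions may differ):
# {
#   "models": [
#     { "title": "Llama 3 (local)", "provider": "ollama", "model": "llama3" }
#   ]
# }
```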
I'd stay away from Ollama; just use llama.cpp. It's more up to date, better performing, and more flexible.
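If you go that route, llama.cpp ships its own OpenAI-compatible HTTP server. A minimal sketch, assuming you've already built llama.cpp and have a GGUF model on disk (the model path and flag values below are placeholders; check `llama-server --help` for your build):

```sh
# Serve a local GGUF model over llama.cpp's OpenAI-compatible HTTP server.
# -c sets the context size, -ngl offloads layers to the GPU.
# (The binary is llama-server in current builds; older builds called it ./server.)
llama-server -m ./models/model.gguf --port 8080 -c 4096 -ngl 99
```

Continue can still sit on top of this, since it supports llama.cpp / OpenAI-compatible endpoints as providers, so the extension setup above carries over with the base URL pointed at your local server.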