Never used ollama, only ready-to-go models via llamafile and llama.cpp.
Maybe ollama has some defaults it applies to models? I start testing models at temperature 0 and tweak from there depending on how they behave.
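For context, that looks roughly like this with the llama.cpp CLI (the binary name varies between versions and the model path here is just a placeholder):

    ./llama-cli -m ./models/some-model.gguf --temp 0 -p "your prompt here"

At temperature 0 the sampling is effectively greedy, so runs are close to deterministic, which makes it easier to judge the model's baseline behavior before adding randomness back in.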