How do you configure LLM temperature in coding agents, e.g. opencode?
You can't without hacking it! That's my point! The only places you can easily do so are via the API directly (see the sketch below), or "coomer" frontends like SillyTavern, Oobabooga, etc.
Same problem with image generation (lack of support for different SDE solvers, the image-generation equivalent of LLM sampling), but it has its own "coomer" tools, e.g. ComfyUI or Automatic1111.
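For reference, "via the API directly" just means passing `temperature` in the request body yourself. A minimal Python sketch, assuming an OpenAI-compatible chat completions endpoint; the model name and API key are placeholders:

```python
# Minimal sketch: set temperature directly on an OpenAI-compatible API.
# Endpoint, model, and key are illustrative placeholders.
import requests

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Write a hello-world in C."}],
        "temperature": 0.2,  # lower = more deterministic sampling
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```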
https://opencode.ai/docs/agents/#temperature
Set it in your opencode.json.
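A minimal sketch of what that looks like, going off the page linked above; the `build` agent name and the 0.2 value are illustrative, so check the docs for the exact schema:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "agent": {
    "build": {
      "temperature": 0.2
    }
  }
}
```

Lower values (closer to 0) make completions more deterministic, which is usually what you want for code edits; higher values add variety.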