Under local deployment:
> - Local backend server with full API
> - Local model integration (vLLM, Ollama, LM Studio, etc.)
> - Complete isolation from cloud services
> - Zero external dependencies
Seems open source/open weight to me. They also offer a cloud-hosted version.
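For what "local model integration" usually means in practice: vLLM, Ollama, and LM Studio all expose an OpenAI-compatible `/v1` endpoint, so an app like this typically just needs a base URL pointed at one of them. A sketch of such a config (the variable names are illustrative assumptions, not this project's actual settings; the ports are each server's defaults):

```shell
# Hypothetical env config — variable names are assumptions, not the project's real settings.
# Pick whichever local server you run; all three speak the OpenAI-compatible API:
export BACKEND_BASE_URL="http://localhost:11434/v1"   # Ollama default port
# export BACKEND_BASE_URL="http://localhost:8000/v1"  # vLLM default port
# export BACKEND_BASE_URL="http://localhost:1234/v1"  # LM Studio default port

# Local servers generally accept any placeholder key:
export BACKEND_API_KEY="not-needed-locally"
```

That interchangeability is the point of "zero external dependencies": nothing in the request path leaves the machine.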