I saw this some time ago! I personally have a distaste for external DSLs, as I think they generally introduce complexity that isn't actually worthwhile, so I skipped over it. That's also why I'm very "meh" on BAML.
TensorZero works with the OpenAI SDK out of the box:
```
from openai import OpenAI

# Point the client to the TensorZero Gateway
client = OpenAI(base_url="http://localhost:3000/openai/v1", api_key="not-used")

response = client.chat.completions.create(
    # Call any model provider (or TensorZero function)
    model="tensorzero::model_name::anthropic::claude-sonnet-4-6",
    messages=[
        {
            "role": "user",
            "content": "Share a fun fact about TensorZero.",
        }
    ],
)
```
You can layer additional features only as needed (fallbacks, templates, A/B testing, etc.).
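Fallbacks, for instance, live in the gateway's `tensorzero.toml` config rather than in application code. A hypothetical sketch of what that might look like (the model and provider names are illustrative, and the exact schema may differ from the current TensorZero release):
```
# Assumption: "my-model" and its providers are placeholders, not real config.
# Try Anthropic first; fall back to OpenAI if the request fails.
[models.my-model]
routing = ["anthropic", "openai"]

[models.my-model.providers.anthropic]
type = "anthropic"
model_name = "claude-sonnet-4-6"

[models.my-model.providers.openai]
type = "openai"
model_name = "gpt-4o"
```
The application code above stays unchanged; only the `model` string would point at the configured model.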