Simple Setup
You can use the shorthand openai::model_name to use an OpenAI model with TensorZero, unless you need advanced features like fallbacks or custom credentials.
Chat Completions API
You can use OpenAI models in your TensorZero variants by setting the model field to openai::model_name.
For example:
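A minimal sketch of such a variant (the function and variant names are placeholders, and gpt-4o stands in for whichever OpenAI model you want):

```toml
[functions.my_function]
type = "chat"

[functions.my_function.variants.my_variant]
type = "chat_completion"
# The openai:: shorthand targets an OpenAI model without a separate model block
model = "openai::gpt-4o"
```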
Alternatively, you can set openai::model_name as the model_name in the inference request to use a specific OpenAI model, without having to configure a function and variant in TensorZero.
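For instance, with the gateway running locally on its default port, a request to the inference endpoint might look like the following sketch (the gateway URL, model name, and prompt are all placeholders):

```python
import requests

# Sketch: call the TensorZero gateway's inference endpoint with a model_name
# shorthand instead of a configured function (assumes the gateway is at
# http://localhost:3000)
response = requests.post(
    "http://localhost:3000/inference",
    json={
        "model_name": "openai::gpt-4o",  # placeholder OpenAI model
        "input": {
            "messages": [{"role": "user", "content": "Tell me a fun fact."}],
        },
    },
)
response.raise_for_status()
print(response.json())
```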
Responses API
For models that use the OpenAI Responses API (like gpt-5), use the openai::responses::model_name shorthand:
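A sketch of such a variant (placeholder names again):

```toml
[functions.my_function.variants.my_variant]
type = "chat_completion"
# The openai::responses:: prefix routes the request through OpenAI's Responses API
model = "openai::responses::gpt-5"
```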
You can also use openai::responses::model_name as the model_name in inference requests:
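The request is the same sketch as above, only with the Responses API shorthand as the model_name:

```python
import requests

# Same assumptions as the earlier sketch; only the model_name changes
response = requests.post(
    "http://localhost:3000/inference",
    json={
        "model_name": "openai::responses::gpt-5",
        "input": {
            "messages": [{"role": "user", "content": "Tell me a fun fact."}],
        },
    },
)
response.raise_for_status()
print(response.json())
```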
Advanced Setup
For more complex scenarios (e.g. fallbacks, custom credentials), you can configure your own model and OpenAI provider in TensorZero. For this minimal setup, you’ll need just two files in your project directory: config/tensorzero.toml and docker-compose.yml.
Configuration
Create a minimal configuration file that defines a model and a simple chat function (config/tensorzero.toml):
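A minimal sketch of that file, assuming placeholder names for the model, function, and variant:

```toml
# config/tensorzero.toml

# Define an OpenAI model with an explicit provider (enables fallbacks,
# custom credentials, etc.)
[models.gpt_4o]
routing = ["openai"]

[models.gpt_4o.providers.openai]
type = "openai"
model_name = "gpt-4o"

# Define a simple chat function with one variant that uses the model above
[functions.my_function]
type = "chat"

[functions.my_function.variants.my_variant]
type = "chat_completion"
model = "gpt_4o"
```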
See the Configuration Reference for optional fields (e.g. api_base).
Credentials
You must set the OPENAI_API_KEY environment variable before running the gateway.
You can customize the credential location by setting the api_key_location field to env::YOUR_ENVIRONMENT_VARIABLE or dynamic::ARGUMENT_NAME.
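For example, a provider block that reads its key from a custom environment variable might look like this sketch (MY_OPENAI_KEY is a placeholder):

```toml
[models.gpt_4o.providers.openai]
type = "openai"
model_name = "gpt-4o"
# Read the API key from a custom environment variable instead of OPENAI_API_KEY
api_key_location = "env::MY_OPENAI_KEY"
```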
See the Credential Management guide and Configuration Reference for more information.
Additionally, see the OpenAI-Compatible guide for more information on how to use other OpenAI-Compatible providers.
Deployment (Docker Compose)
Create a minimal Docker Compose configuration (docker-compose.yml):
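A sketch of such a file (it runs only the gateway, mounts the config directory from above, and forwards the OpenAI credential; ClickHouse and observability are omitted for brevity):

```yaml
# docker-compose.yml
services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount the configuration directory created above (read-only)
      - ./config:/app/config:ro
    command: --config-file /app/config/tensorzero.toml
    environment:
      # Fail fast if the credential is missing
      - OPENAI_API_KEY=${OPENAI_API_KEY:?Environment variable OPENAI_API_KEY must be set.}
    ports:
      - "3000:3000"
```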
You can start the gateway by running docker compose up.