- Call any LLM with the same API. TensorZero unifies every major LLM API (e.g. OpenAI) and inference server (e.g. Ollama).
- Get started with a few lines of code. Later, you can optionally add observability, automatic fallbacks, A/B testing, and much more.
- Use any programming language. You can use TensorZero with any OpenAI SDK (Python, Node, Go, etc.).
You can point the OpenAI Python SDK to a TensorZero Gateway to call any LLM with a unified API.
**Set up the credentials for your LLM provider**

For example, if you’re using OpenAI, you can set the `OPENAI_API_KEY` environment variable to your API key.

**Install the OpenAI Python SDK**

You can install the OpenAI SDK with a Python package manager like `pip`.

**Deploy the TensorZero Gateway**

Let’s deploy the TensorZero Gateway using Docker.
For simplicity, we’ll use the gateway without observability or custom configuration.
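Concretely, the setup steps above (credentials, SDK install, gateway deployment) might look like the following sketch. The Docker image name (`tensorzero/gateway`), the port (3000), and the exact flags are assumptions based on a typical setup; check the TensorZero documentation for the exact invocation.

```shell
# Set the credentials the gateway will use for your provider (OpenAI, in this example).
export OPENAI_API_KEY="sk-..."  # replace with your real API key

# Install the OpenAI Python SDK.
pip install openai

# Run the TensorZero Gateway in Docker, forwarding the provider credentials
# into the container. Image name and port are assumptions; adjust as needed.
docker run -d -p 3000:3000 -e OPENAI_API_KEY tensorzero/gateway
```

The `-e OPENAI_API_KEY` flag passes the host environment variable through to the container, so the gateway, not your application code, holds the provider credentials.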

**Initialize the OpenAI client**

Let’s initialize the OpenAI SDK and point it to the gateway we just launched.