---
title: openai
---

The `openai` provider supports the OpenAI `/chat` endpoint and sets
OpenAI-specific default configuration options.

<Tip>
  For Azure, we recommend using [`azure-openai`](azure) instead.

  For all other OpenAI-compatible API providers, such as Groq, HuggingFace,
  Ollama, OpenRouter, Together AI, and others, we recommend using
  [`openai-generic`](openai-generic) instead.
</Tip>

Example:

```baml BAML
client<llm> MyClient {
  provider "openai"
  options {
    api_key env.MY_OPENAI_KEY
    model "gpt-5-mini"
    temperature 0.1
  }
}
```

## BAML-specific request `options`
These BAML-specific parameters (aka `options`) modify the API request sent to the provider.

You can use them to modify the `headers` and `base_url`, for example.


<ParamField path="api_key" type="string" default="env.OPENAI_API_KEY">
  Will be used to build the `Authorization` header, like so: `Authorization: Bearer $api_key`

  **Default: `env.OPENAI_API_KEY`**
</ParamField>

<ParamField path="base_url" type="string">
  The base URL for the API.

  **Default: `https://api.openai.com/v1`**
</ParamField>
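For example, `base_url` can point at an OpenAI-compatible gateway or proxy in front of the API. The gateway URL below is a hypothetical placeholder:

```baml BAML
client<llm> ProxiedClient {
  provider openai
  options {
    // Hypothetical gateway URL -- replace with your own.
    base_url "https://llm-gateway.example.com/v1"
    api_key env.MY_OPENAI_KEY
    model "gpt-5-mini"
  }
}
```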

<ParamField path="headers" type="object">
  Additional headers to send with the request.

Example:

```baml BAML
client<llm> MyClient {
  provider openai
  options {
    api_key env.MY_OPENAI_KEY
    model "gpt-5-mini"
    headers {
      "X-My-Header" "my-value"
    }
  }
}
```

</ParamField>

<Markdown src="/snippets/role-selection.mdx" />

<Markdown src="/snippets/allowed-role-metadata-basic.mdx" />

<Markdown src="/snippets/supports-streaming-openai.mdx" />

<Markdown src="/snippets/finish-reason.mdx" />

<Markdown src="/snippets/client-response-type.mdx" />

<Markdown src="/snippets/media-url-handler.mdx" />

## Provider request parameters
These are other parameters that are passed through to the provider, without modification by BAML. For example if the request has a `temperature` field, you can define it in the client here so every call has that set.
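For example, a client can pin sampling parameters so every call uses them; BAML forwards these fields to the API body unchanged. The parameter values here are illustrative:

```baml BAML
client<llm> TunedClient {
  provider openai
  options {
    api_key env.MY_OPENAI_KEY
    model "gpt-5-mini"
    // Forwarded verbatim in the /chat request body.
    temperature 0.2
    frequency_penalty 0.5
  }
}
```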

<Warning>
  For reasoning models (like `o1` or `o1-mini`), you must use `max_completion_tokens` instead of `max_tokens`.
  Set `max_tokens` to `null` so that BAML omits it from the request.

  See the [OpenAI API documentation](https://platform.openai.com/docs/api-reference/chat/create#chat-create-max_completion_tokens) and [OpenAI Reasoning Docs](https://platform.openai.com/docs/guides/reasoning#controlling-costs) for more details about token handling.

  Example:

  ```baml BAML
  client<llm> OpenAIo1 {
    provider openai
    options {
      model "o1-mini"
      max_tokens null
    }
  }
  ```
</Warning>


Consult the specific provider's documentation for more information.

<ParamField
   path="messages"
   type="DO NOT USE"
>
  BAML will auto-construct this field for you from the prompt
</ParamField>
<ParamField
   path="stream"
   type="DO NOT USE"
>
  BAML will auto-construct this field for you based on how you call the client in your code
</ParamField>
<ParamField
  path="model"
  type="string"
>
  The model to use.

| Model | Use Case | Context | Key Features |
|-------|----------|---------|--------------|
| **gpt-5** | Coding, agentic tasks, expert reasoning | 400K total | Built-in reasoning, 45% fewer errors |
| **gpt-5-mini** | Well-defined tasks, cost-efficient | 400K total | Faster alternative to GPT-5 |
| **gpt-5-nano** | Lightweight tasks, maximum efficiency | 400K total | Most cost-effective GPT-5 variant |
| **gpt-4.1** | Large-scale technical work | 1M | Enhanced coding, instruction following |
| **gpt-4.1-mini** | Balanced performance and cost | 1M | Replaces GPT-4o mini |
| **gpt-4.1-nano** | Lightweight variant | 1M | Budget-friendly option |
| **gpt-4o** | General purpose, multimodal | 200K | Updated knowledge cutoff June 2024 |

Note: While GPT-5 is available through this provider, we recommend using the `openai-responses` provider for GPT-5 models to access enhanced response formatting features.

See the OpenAI docs for the full list of OpenAI models. You can pass any model name you wish; BAML will not check whether it exists.

</ParamField>

For all other options, see the [official OpenAI API documentation](https://platform.openai.com/docs/api-reference/chat/create).

## Changing Regions

To access OpenAI's API in a different region, you can set the `base_url` option
to the appropriate endpoint. For example, to access the API in the EU region,
you can set the `base_url` option to `https://eu.api.openai.com/v1`.
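Putting this together, an EU-region client might look like the following sketch (the endpoint is the one named above; the client name and model are illustrative):

```baml BAML
client<llm> OpenAIEU {
  provider openai
  options {
    // EU data-residency endpoint instead of the default api.openai.com.
    base_url "https://eu.api.openai.com/v1"
    api_key env.MY_OPENAI_KEY
    model "gpt-5-mini"
  }
}
```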