<ParamField path="provider" type="string" required>
Specifies which provider to use. The provider is responsible for making the actual API calls to the LLM service.

This setting determines the request URL and payload format the BAML runtime uses.

| Provider Name    | Docs                                                         | Notes                                                      |
| ---------------- | ------------------------------------------------------------ | ---------------------------------------------------------- |
| `anthropic`      | [Anthropic](/ref/llm-client-providers/anthropic)             | Supports [/v1/messages](https://docs.anthropic.com/en/api/messages) endpoint                       |
| `aws-bedrock`    | [AWS Bedrock](/ref/llm-client-providers/aws-bedrock)         | Supports [Converse](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference.html) and [ConverseStream](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference.html) endpoints |
| `google-ai`      | [Google AI](/ref/llm-client-providers/google-ai-gemini)      | Supports Google AI's [generateContent](https://ai.google.dev/api/generate-content) and [streamGenerateContent](https://ai.google.dev/api/generate-content#method:-models.streamgeneratecontent) endpoints |
| `vertex-ai`      | [Vertex AI](/ref/llm-client-providers/google-vertex)         | Supports Vertex's [generateContent](https://cloud.google.com/vertex-ai/docs/reference/rest/v1/projects.locations.publishers.models/generateContent) and [streamGenerateContent](https://cloud.google.com/vertex-ai/docs/reference/rest/v1/projects.locations.publishers.models/streamGenerateContent) endpoints |
| `openai`         | [OpenAI](/ref/llm-client-providers/open-ai)                  | Supports [/chat/completions](https://platform.openai.com/docs/api-reference/chat) endpoint                  |
| `openai-responses` | [OpenAI Responses API](/ref/llm-client-providers/open-ai-responses-api) | Supports OpenAI's [/responses](https://platform.openai.com/docs/api-reference/responses) endpoint |
| `azure-openai`   | [Azure OpenAI](/ref/llm-client-providers/open-ai-from-azure) | Supports Azure's [/chat/completions](https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#chat-completions) endpoint                  |
| `openai-generic` | [OpenAI (generic)](/ref/llm-client-providers/openai-generic) | Any other provider that supports OpenAI's `/chat/completions` endpoint |
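
For example, a minimal client definition selecting one of these providers (the model name and environment variable below are illustrative placeholders):

```baml
client<llm> MyClient {
  provider openai
  options {
    model "gpt-4o"
    api_key env.OPENAI_API_KEY
  }
}
```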


A non-exhaustive list of providers you can use with `openai-generic`:

| Inference Provider | Docs |
| -------------- | -------------- |
| Azure AI Foundry | [Azure AI Foundry](/ref/llm-client-providers/azure-ai-foundary) |
| Groq | [Groq](/ref/llm-client-providers/groq) |
| Hugging Face | [Hugging Face](/ref/llm-client-providers/huggingface) |
| Keywords AI | [Keywords AI](/ref/llm-client-providers/keywordsai) |
| LiteLLM | [LiteLLM](/ref/llm-client-providers/litellm) |
| LM Studio | [LM Studio](/ref/llm-client-providers/lmstudio) |
| Ollama | [Ollama](/ref/llm-client-providers/ollama) |
| OpenRouter | [OpenRouter](/ref/llm-client-providers/openrouter) |
| Vercel AI Gateway | [Vercel AI Gateway](/ref/llm-client-providers/vercel-ai-gateway) |
| TogetherAI | [TogetherAI](/ref/llm-client-providers/together) |
| Unify AI | [Unify AI](/ref/llm-client-providers/unify) |
| vLLM | [vLLM](/ref/llm-client-providers/vllm) |
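
For instance, Ollama exposes an OpenAI-compatible endpoint locally, so a client can point `openai-generic` at it (the `base_url` and model name below assume a default local Ollama install):

```baml
client<llm> LocalClient {
  provider openai-generic
  options {
    base_url "http://localhost:11434/v1"
    model "llama3"
  }
}
```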


We also have special providers that compose multiple clients together:

| Provider Name  | Docs                             | Notes                                                      |
| -------------- | -------------------------------- | ---------------------------------------------------------- |
| `fallback`     | [Fallback](/ref/llm-client-strategies/fallback)             | Used to chain models conditional on failures               |
| `round-robin`  | [Round Robin](/ref/llm-client-strategies/round-robin)       | Used to load balance                                       |
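
For example, a `fallback` client that tries one client first and falls back to another on failure (`FastClient` and `BackupClient` are hypothetical clients defined elsewhere in your BAML files):

```baml
client<llm> ResilientClient {
  provider fallback
  options {
    strategy [FastClient, BackupClient]
  }
}
```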

</ParamField>

<ParamField path="options" type="dict[str, Any]" required>
These options vary per provider; see the provider-specific documentation for
details. In general, they are pass-through options forwarded on the POST
request made to the LLM.
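
For example, with the `anthropic` provider, options such as `model`, `api_key`, and `max_tokens` are forwarded to the request (the values below are illustrative):

```baml
client<llm> ClaudeClient {
  provider anthropic
  options {
    model "claude-3-5-sonnet-20241022"
    api_key env.ANTHROPIC_API_KEY
    max_tokens 1024
  }
}
```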
</ParamField>

