---
title: litellm
---

[LiteLLM](https://www.litellm.ai/)'s proxy server exposes an OpenAI-compatible API, so you can use the
[`openai-generic`](/ref/llm-client-providers/openai-generic) provider with an
overridden `base_url` to call it from BAML.


See [OpenAI Generic](/ref/llm-client-providers/openai-generic) for more details about parameters.


## Set up

1. Set up a [LiteLLM Proxy server](https://docs.litellm.ai/docs/proxy/docker_quick_start#21-start-proxy).

2. Define a LiteLLM client in your BAML files.

3. Use it in a BAML function! (See the sketch after the client definition below.)


```baml BAML
client<llm> MyClient {
  provider "openai-generic"
  options {
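    // Address of your locally running LiteLLM proxy (LiteLLM's default port is 4000)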
    base_url "http://0.0.0.0:4000"
    api_key env.LITELLM_API_KEY
    model "gpt-5"
  }
}
```
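
A BAML function can then reference this client by name. The following is a rough sketch; the function name, prompt, and return type are illustrative only and not part of the LiteLLM setup:

```baml BAML
// Hypothetical example function; swap in your own signature and prompt.
function ClassifySentiment(text: string) -> string {
  client MyClient
  prompt #"
    Classify the sentiment of the following text as positive, negative, or neutral.

    {{ text }}

    {{ ctx.output_format }}
  "#
}
```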
