
Secret Manager

LiteLLM supports reading secrets from Azure Key Vault and Infisical.

Azure Key Vault

Quick Start

### Instantiate Azure Key Vault Client ###
import os

from azure.keyvault.secrets import SecretClient
from azure.identity import ClientSecretCredential

# Set your Azure Key Vault URI
KVUri = os.getenv("AZURE_KEY_VAULT_URI")

# Set your Azure AD application/client ID, client secret, and tenant ID - create an application with permission to call your key vault
client_id = os.getenv("AZURE_CLIENT_ID") 
client_secret = os.getenv("AZURE_CLIENT_SECRET")
tenant_id = os.getenv("AZURE_TENANT_ID") 

# Initialize the ClientSecretCredential
credential = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)

# Create the SecretClient using the credential
client = SecretClient(vault_url=KVUri, credential=credential)

### Connect to LiteLLM ###
import litellm
litellm.secret_manager = client

litellm.get_secret("your-test-key")
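
A retrieved secret can be passed straight into a completion call. A minimal sketch, assuming a secret named openai-api-key (a hypothetical name) is stored in your vault:

# Hypothetical secret name - replace with one stored in your vault
openai_key = litellm.get_secret("openai-api-key")

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
    api_key=openai_key,  # use the secret fetched from Azure Key Vault
)
print(response)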

Usage with OpenAI Proxy Server

  1. Install proxy dependencies:
pip install litellm[proxy] litellm[extra_proxy]
  2. Save your Azure details in your environment:
export AZURE_CLIENT_ID="your-azure-app-client-id"
export AZURE_CLIENT_SECRET="your-azure-app-client-secret"
export AZURE_TENANT_ID="your-azure-tenant-id"
export AZURE_KEY_VAULT_URI="your-azure-key-vault-uri"
  3. Add to your proxy config.yaml:
model_list:
  - model_name: "my-azure-models" # model alias
    litellm_params:
      model: "azure/<your-deployment-name>"
      api_key: "os.environ/AZURE-API-KEY" # reads from key vault - get_secret("AZURE_API_KEY")
      api_base: "os.environ/AZURE-API-BASE" # reads from key vault - get_secret("AZURE_API_BASE")

general_settings:
  use_azure_key_vault: True

You can now test this by starting your proxy:

litellm --config /path/to/config.yaml

Quick Test Proxy
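
Once the proxy is running, you can send it a test request with curl. A minimal sketch, assuming the proxy is listening on localhost port 4000 (the default in recent LiteLLM versions; older versions used 8000) and using the model alias from the config above:

curl http://0.0.0.0:4000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-azure-models",
    "messages": [{"role": "user", "content": "Hello from the proxy!"}]
  }'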

Infisical Secret Manager

LiteLLM integrates with Infisical's Secret Manager for secure storage and retrieval of API keys and sensitive data.

Usage

LiteLLM reads your LLM API secrets/environment variables from Infisical for you

import litellm
from infisical import InfisicalClient

litellm.secret_manager = InfisicalClient(token="your-token")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather like today?"},
]

response = litellm.completion(model="gpt-3.5-turbo", messages=messages)

print(response)

.env Files

If no secret manager client is specified, LiteLLM falls back to a .env file to manage sensitive data.
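
A minimal sketch of this fallback, assuming a .env file in your working directory and the python-dotenv package to load it - LiteLLM picks up provider keys such as OPENAI_API_KEY from os.environ:

# .env (keep out of version control)
# OPENAI_API_KEY="sk-your-openai-key"

from dotenv import load_dotenv  # pip install python-dotenv

import litellm

# Load .env into the process environment so litellm can read the keys
load_dotenv()

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response)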