---
title: Model Provider
description: Connect to different LLM providers with Tarko
---

# Model Provider

Tarko speaks the **OpenAI-compatible** protocol, so it can connect to any LLM provider that exposes a compatible endpoint, including Volcengine, OpenAI, Anthropic, Gemini, and more.
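"OpenAI-compatible" means requests follow the shape of the OpenAI Chat Completions API. As a rough sketch (the types below are illustrative, not Tarko exports), a request Tarko sends on your behalf looks like this:

```typescript
// Minimal shape of an OpenAI-style chat completions request.
// Field names follow the OpenAI Chat Completions API; the types
// here are for illustration only.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface ChatCompletionRequest {
  model: string;
  messages: ChatMessage[];
  temperature?: number;
}

function buildRequest(model: string, prompt: string): ChatCompletionRequest {
  return {
    model,
    messages: [{ role: 'user', content: prompt }],
  };
}

console.log(JSON.stringify(buildRequest('gpt-4', 'Hello')));
```

Any provider that accepts this request shape (and returns the matching response shape) can be plugged in.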

## Configuration

Configure your model provider in the Agent configuration:

```typescript
import { Agent } from '@tarko/agent';

const agent = new Agent({
  model: {
    provider: 'openai',
    apiKey: process.env.OPENAI_API_KEY,
    model: 'gpt-4'
  }
});
```
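To keep credentials out of source control, you can derive the whole `model` block from environment variables. A hypothetical helper (the variable names are illustrative, not a Tarko convention):

```typescript
// Hypothetical helper: build the model config from environment
// variables with sensible fallbacks. MODEL_PROVIDER, MODEL_API_KEY,
// and MODEL_NAME are example names, not defined by Tarko.
interface ModelConfig {
  provider: string;
  apiKey: string;
  model: string;
}

function modelConfigFromEnv(
  env: Record<string, string | undefined>,
): ModelConfig {
  return {
    provider: env.MODEL_PROVIDER ?? 'openai',
    apiKey: env.MODEL_API_KEY ?? '',
    model: env.MODEL_NAME ?? 'gpt-4',
  };
}
```

The resulting object can be passed as the `model` option when constructing the `Agent`.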

## Supported Providers

### OpenAI

```typescript
{
  provider: 'openai',
  apiKey: 'your-api-key',
  model: 'gpt-4'
}
```

### Anthropic

```typescript
{
  provider: 'anthropic',
  apiKey: 'your-api-key',
  model: 'claude-3-sonnet'
}
```

### Volcengine

```typescript
{
  provider: 'volcengine',
  apiKey: 'your-api-key',
  model: 'doubao-pro'
}
```

## Custom Provider

Any OpenAI-compatible endpoint also works: set `provider` to `'custom'` and point `baseURL` at it:

```typescript
{
  provider: 'custom',
  baseURL: 'https://your-endpoint.com/v1',
  apiKey: 'your-api-key',
  model: 'your-model'
}
```
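With a custom provider, requests go to the standard OpenAI paths under your `baseURL`. The helper below is a sketch of that composition (it is not part of Tarko's API):

```typescript
// Illustrative: a chat request against a custom baseURL targets the
// standard OpenAI path beneath it. Trailing slashes are normalized.
function chatCompletionsUrl(baseURL: string): string {
  return `${baseURL.replace(/\/+$/, '')}/chat/completions`;
}

console.log(chatCompletionsUrl('https://your-endpoint.com/v1'));
// https://your-endpoint.com/v1/chat/completions
```

This is why the `baseURL` typically ends at the API version segment (e.g. `/v1`) rather than at a full endpoint path.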
