---
title: "Chat with Agent"
api: "POST labs.chonkie.ai/api/v1/chat/completions"
description: "Have conversations with AI agents using OpenAI-compatible API"
---

Chat with your agents using an OpenAI-compatible API. Supports both streaming and non-streaming responses.

## Examples

<CodeGroup>

```python Python - Basic
import requests

url = "https://labs.chonkie.ai/api/v1/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json"
}

data = {
    "model": "documentation-assistant",
    "messages": [
        {"role": "user", "content": "How do I configure authentication?"}
    ]
}

response = requests.post(url, headers=headers, json=data)
result = response.json()

assistant_message = result["choices"][0]["message"]["content"]
print(f"Assistant: {assistant_message}")
```

```python Python - Streaming
import json

import requests

url = "https://labs.chonkie.ai/api/v1/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json"
}

data = {
    "model": "documentation-assistant",
    "messages": [
        {"role": "user", "content": "Explain API authentication"}
    ],
    "stream": True
}

response = requests.post(url, headers=headers, json=data, stream=True)

print("Assistant: ", end="")
for line in response.iter_lines():
    if line:
        line = line.decode("utf-8")
        if line.startswith("data: "):
            data_str = line[6:]  # strip the "data: " prefix
            if data_str == "[DONE]":
                break
            chunk = json.loads(data_str)
            content = chunk["choices"][0]["delta"].get("content", "")
            print(content, end="", flush=True)
print()
```

```javascript JavaScript
const response = await fetch(
  "https://labs.chonkie.ai/api/v1/chat/completions",
  {
    method: "POST",
    headers: {
      Authorization: "Bearer YOUR_API_KEY",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "documentation-assistant",
      messages: [
        { role: "user", content: "How do I configure authentication?" },
      ],
    }),
  }
);

const result = await response.json();
const assistantMessage = result.choices[0].message.content;
console.log(`Assistant: ${assistantMessage}`);
```

```bash cURL
curl -X POST https://labs.chonkie.ai/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "documentation-assistant",
    "messages": [
      {"role": "user", "content": "How do I configure authentication?"}
    ]
  }'
```

</CodeGroup>

## OpenAI SDK Compatibility

Chonkie agents are fully compatible with the OpenAI SDK. Simply configure the base URL and use your agent slug as the model name.

<CodeGroup>

```python Python - OpenAI SDK
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://labs.chonkie.ai/api/v1"
)

response = client.chat.completions.create(
    model="documentation-assistant",  # Use your agent slug
    messages=[
        {"role": "user", "content": "How do I configure authentication?"}
    ]
)

print(response.choices[0].message.content)
```

```python Python - OpenAI SDK Streaming
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://labs.chonkie.ai/api/v1"
)

stream = client.chat.completions.create(
    model="documentation-assistant",
    messages=[
        {"role": "user", "content": "Explain API authentication"}
    ],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```

```javascript JavaScript - OpenAI SDK
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_API_KEY",
  baseURL: "https://labs.chonkie.ai/api/v1",
});

const response = await client.chat.completions.create({
  model: "documentation-assistant", // Use your agent slug
  messages: [{ role: "user", content: "How do I configure authentication?" }],
});

console.log(response.choices[0].message.content);
```

```javascript JavaScript - OpenAI SDK Streaming
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_API_KEY",
  baseURL: "https://labs.chonkie.ai/api/v1",
});

const stream = await client.chat.completions.create({
  model: "documentation-assistant",
  messages: [{ role: "user", content: "Explain API authentication" }],
  stream: true,
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content;
  if (content) {
    process.stdout.write(content);
  }
}
console.log();
```

</CodeGroup>

## Request

#### Parameters

<ParamField path="model" type="string" required>
  The agent slug to use (replaces OpenAI's model parameter).
</ParamField>

<ParamField path="messages" type="array" required>
  Array of message objects in OpenAI format.
</ParamField>

<ParamField path="stream" type="boolean" default="false">
  Enable streaming responses.
</ParamField>

<ParamField path="max_tokens" type="integer">
  Maximum tokens in response.
</ParamField>
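The `messages` array follows the standard OpenAI conversation shape: to continue a multi-turn conversation, send the full history with the newest user turn last. A minimal sketch (the content strings are illustrative, and `system`-message handling is assumed to follow the OpenAI convention):

```python
# Conversation history in OpenAI message format: each entry has a
# "role" ("system", "user", or "assistant") and a "content" string.
messages = [
    {"role": "system", "content": "Answer using the product docs only."},
    {"role": "user", "content": "How do I configure authentication?"},
    {"role": "assistant", "content": "Set the Authorization header to a Bearer token."},
    {"role": "user", "content": "And how do I rotate the key?"},
]

# The newest user turn goes last; the agent sees the full history.
assert messages[-1]["role"] == "user"
```

Each request is stateless, so resend the accumulated history on every call.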

## Response

#### Returns (Non-streaming)

OpenAI-compatible chat completion response.
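An illustrative response body is shown below; field values are placeholders, and the exact `id` and `usage` fields are assumed to follow the OpenAI chat completion schema:

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1700000000,
  "model": "documentation-assistant",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "To configure authentication, set the Authorization header..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 34,
    "total_tokens": 46
  }
}
```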

#### Returns (Streaming)

Server-Sent Events (SSE), where each event carries a delta chunk; the stream terminates with `data: [DONE]`.
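An illustrative event sequence is sketched below (field values are placeholders; the chunk shape is assumed to match the OpenAI `chat.completion.chunk` schema, consistent with the streaming examples above):

```text
data: {"id":"chatcmpl-abc123","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"role":"assistant","content":"To"},"finish_reason":null}]}

data: {"id":"chatcmpl-abc123","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":" configure"},"finish_reason":null}]}

data: {"id":"chatcmpl-abc123","object":"chat.completion.chunk","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]
```

Clients concatenate the `delta.content` fragments to reconstruct the full message.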
