---
sidebar_position: 16
---

# OpenRouter

import openRouterLogo from '/img/openRouterLogo.png';

# <img src={openRouterLogo} className="adaptive-logo-filter" width="34" style={{float: 'left', marginRight: '10px', marginTop: '12px'}} /><span className="direct-service-title">OpenRouter</span>

Properties used to connect to [OpenRouter](https://openrouter.ai/).

### `openRouter` {#openRouter}

- Type: \{<br />
  &nbsp;&nbsp;&nbsp;&nbsp; `model?: string`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp; `max_tokens?: number`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp; `temperature?: number`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp; `top_p?: number`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp; `frequency_penalty?: number`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp; `presence_penalty?: number`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp; `system_prompt?: string`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp; [`tools?: OpenRouterTool[]`](#OpenRouterTool), <br />
  &nbsp;&nbsp;&nbsp;&nbsp; [`function_handler?: FunctionHandler`](#FunctionHandler) <br />
  \}
- Default: _\{model: "openai/gpt-4o"\}_

import ContainersKeyToggleChatFunction from '@site/src/components/table/containersKeyToggleChatFunction';
import ContainersKeyToggle from '@site/src/components/table/containersKeyToggle';
import ComponentContainer from '@site/src/components/table/componentContainer';
import DeepChatBrowser from '@site/src/components/table/deepChatBrowser';
import LineBreak from '@site/src/components/markdown/lineBreak';
import BrowserOnly from '@docusaurus/BrowserOnly';
import TabItem from '@theme/TabItem';
import Tabs from '@theme/Tabs';

<BrowserOnly>{() => require('@site/src/components/nav/autoNavToggle').readdAutoNavShadowToggle()}</BrowserOnly>

Connect to OpenRouter's [`chat completion`](https://openrouter.ai/docs/api-reference/chat-completion) API. <br />
`model` is the name of the model to be used by the API (e.g., "openai/gpt-3.5-turbo"). <br />
`max_tokens` caps the number of tokens in the generated response (from 1 up to the model's context length). <br />
`temperature` controls the randomness of responses (0.0-2.0). Higher values produce more creative outputs. <br />
`top_p` controls diversity through nucleus sampling (0.0-1.0). <br />
`frequency_penalty` reduces repetition by penalizing frequently used tokens (-2.0 to 2.0). <br />
`presence_penalty` encourages topic diversity by penalizing tokens that have appeared (-2.0 to 2.0). <br />
`system_prompt` provides behavioral context and instructions to the model. <br />
[`tools`](#OpenRouterTool) declares the functions that the model may call. <br />
[`function_handler`](#FunctionHandler) is the function the component invokes when the model requests a tool call. <br />
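
The properties above can be combined into a single configuration object. A minimal sketch (the values are illustrative, not recommendations):

```js
// Illustrative openRouter configuration using the properties described above
const openRouterConfig = {
  model: 'openai/gpt-4o', // any OpenRouter model id
  max_tokens: 1024, // cap on the length of the generated response
  temperature: 0.7, // higher values produce more creative outputs
  top_p: 0.9, // nucleus sampling threshold
  frequency_penalty: 0.4, // discourage frequently repeated tokens
  presence_penalty: 0.2, // encourage new topics
  system_prompt: 'You are a concise assistant.',
};

// assign it to the component, e.g.:
// chatElementRef.directConnection = {openRouter: {key: 'placeholder key', ...openRouterConfig}};
```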

#### Example

<ContainersKeyToggle>
  <ComponentContainer>
    <DeepChatBrowser
      style={{borderRadius: '8px'}}
      directConnection={{
        openRouter: {
          key: 'placeholder key',
          model: 'openai/gpt-3.5-turbo',
          temperature: 0.7,
        },
      }}
    ></DeepChatBrowser>
  </ComponentContainer>
  <ComponentContainer>
    <DeepChatBrowser
      style={{borderRadius: '8px'}}
      directConnection={{
        openRouter: {
          model: 'openai/gpt-3.5-turbo',
          temperature: 0.7,
        },
      }}
    ></DeepChatBrowser>
  </ComponentContainer>
</ContainersKeyToggle>

<Tabs>
<TabItem value="js" label="Sample code">

```html
<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-3.5-turbo",
      "temperature": 0.7
    }
  }'
></deep-chat>
```

</TabItem>
<TabItem value="py" label="Full code">

```html
<!-- This example is for Vanilla JS and should be tailored to your framework (see Examples) -->

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-3.5-turbo",
      "temperature": 0.7
    }
  }'
  style="border-radius: 8px"
></deep-chat>
```

</TabItem>
</Tabs>

<LineBreak></LineBreak>

:::info
Use [`stream`](/docs/connect#Stream) to stream the AI responses.
:::

<LineBreak></LineBreak>

#### Vision Example

Upload images alongside your text prompts for visual understanding. You must use a [model with vision capabilities](https://openrouter.ai/models?fmt=cards&input_modalities=image).

<ContainersKeyToggle>
  <ComponentContainer>
    <DeepChatBrowser
      style={{borderRadius: '8px'}}
      directConnection={{
        openRouter: {
          key: 'placeholder key',
          model: 'openai/gpt-4o',
        },
      }}
      images={true}
      camera={true}
      textInput={{styles: {container: {width: '77%'}}}}
    ></DeepChatBrowser>
  </ComponentContainer>
  <ComponentContainer>
    <DeepChatBrowser
      style={{borderRadius: '8px'}}
      directConnection={{
        openRouter: {
          model: 'openai/gpt-4o',
        },
      }}
      images={true}
      camera={true}
      textInput={{styles: {container: {width: '77%'}}}}
    ></DeepChatBrowser>
  </ComponentContainer>
</ContainersKeyToggle>

<Tabs>
<TabItem value="js" label="Sample code">

```html
<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-4o"
    }
  }'
  images="true"
  camera="true"
></deep-chat>
```

</TabItem>
<TabItem value="py" label="Full code">

```html
<!-- This example is for Vanilla JS and should be tailored to your framework (see Examples) -->

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-4o"
    }
  }'
  images="true"
  camera="true"
  style="border-radius: 8px"
  textInput='{"styles": {"container": {"width": "77%"}}}'
></deep-chat>
```

</TabItem>
</Tabs>

<LineBreak></LineBreak>

:::tip
When sending images, we advise setting [`maxMessages`](/docs/connect#requestBodyLimits) to 1 to reduce the amount of data sent and lower costs.
:::
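
Following the tip above, this limit is applied directly on the component (a minimal config fragment; `chatElementRef` is assumed to be a reference to the `deep-chat` element):

```js
// Only the most recent message (with its files) is included in the request,
// keeping the payload small when images are attached
chatElementRef.requestBodyLimits = {maxMessages: 1};
```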

<LineBreak></LineBreak>

#### Audio Example

Upload audio files alongside your text prompts for speech understanding. You must use a [model with audio capabilities](https://openrouter.ai/models?fmt=cards&input_modalities=audio).

<ContainersKeyToggle>
  <ComponentContainer>
    <DeepChatBrowser
      style={{borderRadius: '8px'}}
      directConnection={{
        openRouter: {
          key: 'placeholder key',
          model: 'openai/gpt-4o-audio-preview',
        },
      }}
      audio={true}
    ></DeepChatBrowser>
  </ComponentContainer>
  <ComponentContainer>
    <DeepChatBrowser
      style={{borderRadius: '8px'}}
      directConnection={{
        openRouter: {
          model: 'openai/gpt-4o-audio-preview',
        },
      }}
      audio={true}
    ></DeepChatBrowser>
  </ComponentContainer>
</ContainersKeyToggle>

<Tabs>
<TabItem value="js" label="Sample code">

```html
<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-4o-audio-preview"
    }
  }'
  audio="true"
></deep-chat>
```

</TabItem>
<TabItem value="py" label="Full code">

```html
<!-- This example is for Vanilla JS and should be tailored to your framework (see Examples) -->

<deep-chat
  directConnection='{
    "openRouter": {
      "key": "placeholder key",
      "model": "openai/gpt-4o-audio-preview"
    }
  }'
  audio="true"
  style="border-radius: 8px"
></deep-chat>
```

</TabItem>
</Tabs>

<LineBreak></LineBreak>

## Tool Calling

OpenRouter supports [function calling](https://openrouter.ai/docs/features/tool-calling):

### `OpenRouterTool` {#OpenRouterTool}

- Type: \{<br />
  &nbsp;&nbsp;&nbsp;&nbsp; `type: "function"`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp; `function:` \{<br />
  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; `name: string`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; `description: string`, <br />
  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; `parameters: object` <br />
  \}\}

Array describing tools that the model may call. <br />
`type` must be "function" for function tools. <br />
`name` is the name of the tool function. <br />
`description` explains what the tool does and when it should be used. <br />
`parameters` defines the parameters the tool accepts in JSON Schema format. <br />

### `FunctionHandler` {#FunctionHandler}

- Type: ([`functionsDetails: FunctionsDetails`](/docs/directConnection#FunctionsDetails)) => `{response: string}[]` | `{text: string}`

The function that the component calls when the model requests tool use. <br />
[`functionsDetails`](/docs/directConnection#FunctionsDetails) contains information about which tool functions should be called and with what arguments. <br />
This function should either return an array of JSONs, each containing a `response` property for the corresponding tool function (in the same order as in [`functionsDetails`](/docs/directConnection#FunctionsDetails)),
which is fed back into the model to finalize its response, or return a JSON containing a `text` property, which is displayed in the chat immediately.
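
As a sketch of the immediate-display path, a handler can return a `text` JSON instead of feeding tool results back to the model (the tool names and arguments here are hypothetical):

```js
// Hypothetical handler: answers directly in the chat instead of sending
// the tool results back to the model for a final response
const function_handler = (functionsDetails) => {
  // each entry describes one requested tool call and its JSON-string arguments
  const summaries = functionsDetails.map(({name, arguments: args}) => `${name}(${args})`);
  // returning {text} displays this string in the chat immediately
  return {text: `Called: ${summaries.join(', ')}`};
};
```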

#### Example

<ContainersKeyToggleChatFunction service="openRouter"></ContainersKeyToggleChatFunction>

<Tabs>
<TabItem value="js" label="Sample code">

```js
// using JavaScript for a simplified example

chatElementRef.directConnection = {
  openRouter: {
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_current_weather',
          description: 'Get the current weather in a given location',
          parameters: {
            type: 'object',
            properties: {
              location: {
                type: 'string',
                description: 'The city and state, e.g. San Francisco, CA',
              },
              unit: {type: 'string', enum: ['celsius', 'fahrenheit']},
            },
            required: ['location'],
          },
        },
      },
    ],
    function_handler: (functionsDetails) => {
      return functionsDetails.map((functionDetails) => {
        return {
          response: getCurrentWeather(functionDetails.arguments),
        };
      });
    },
    key: 'placeholder-key',
  },
};
```

</TabItem>
<TabItem value="py" label="Full code">

```js
// using JavaScript for a simplified example

chatElementRef.directConnection = {
  openRouter: {
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_current_weather',
          description: 'Get the current weather in a given location',
          parameters: {
            type: 'object',
            properties: {
              location: {
                type: 'string',
                description: 'The city and state, e.g. San Francisco, CA',
              },
              unit: {type: 'string', enum: ['celsius', 'fahrenheit']},
            },
            required: ['location'],
          },
        },
      },
    ],
    function_handler: (functionsDetails) => {
      return functionsDetails.map((functionDetails) => {
        return {
          response: getCurrentWeather(functionDetails.arguments),
        };
      });
    },
    key: 'placeholder-key',
  },
};

function getCurrentWeather(location) {
  location = location.toLowerCase();
  if (location.includes('tokyo')) {
    return JSON.stringify({location, temperature: '10', unit: 'celsius'});
  } else if (location.includes('san francisco')) {
    return JSON.stringify({location, temperature: '72', unit: 'fahrenheit'});
  } else {
    return JSON.stringify({location, temperature: '22', unit: 'celsius'});
  }
}
```

</TabItem>
</Tabs>

<LineBreak></LineBreak>
