---
title: 'Monitor Sarvam AI using OpenTelemetry'
sidebarTitle: 'Sarvam AI'
---

<Frame>
  <img src="/images/docs-sarvam-banner.gif" />
</Frame>

OpenLIT uses OpenTelemetry Auto-Instrumentation to help you monitor LLM applications built using Sarvam AI models. This includes tracking performance, token usage, costs, and how users interact with the application.

Auto-instrumentation means you don't have to set up monitoring manually for different LLMs, frameworks, or databases. By simply adding OpenLIT in your application, all the necessary monitoring configurations are automatically set up.

The integration is compatible with:
- Sarvam AI Python SDK client `>= 0.0.1`

## Get Started 

<Steps>
    <Step title="Install OpenLIT">
      Open your command line or terminal and run:
      <Tabs>
        <Tab title="Python">
           ```shell
           pip install openlit
           ```
        </Tab>
        <Tab title="Typescript">
           ```shell
           npm install openlit
           ```
        </Tab>
        </Tabs>
    </Step>
    <Step title="Initialize OpenLIT in your Application">
    You can set up OpenLIT in your application using either function arguments directly in your code or by using environment variables.
    <Tabs>
      <Tab title="Setup using function arguments">

      Add the following two lines to your application code:

      <Tabs>
        <Tab title="Python">
          ```python
          import openlit
    
          openlit.init(
            otlp_endpoint="YOUR_OTEL_ENDPOINT", 
          )
          ```

        </Tab>
        <Tab title="Typescript">
        <Warning>The Typescript SDK doesn't support metrics tracking for now!</Warning>

          ```typescript
          import openlit from "openlit"
    
          openlit.init({ otlpEndpoint: "YOUR_OTEL_ENDPOINT" })
          ```
        </Tab>
        </Tabs>

      Replace:
      1. `YOUR_OTEL_ENDPOINT` with the URL of your OpenTelemetry backend, such as `http://127.0.0.1:4318` if you are using OpenLIT and a local OTel Collector.
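
      Beyond the endpoint, `openlit.init()` accepts optional arguments such as `application_name` and `environment` to tag the telemetry it emits (see the SDK configuration docs for the full list). A minimal sketch — the values below are placeholders:

      ```python
      import openlit

      openlit.init(
          otlp_endpoint="YOUR_OTEL_ENDPOINT",
          application_name="sarvam-chat-app",  # placeholder service name
          environment="production",            # placeholder deployment environment
      )
      ```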
      </Tab>
      <Tab title="Setup using Environment Variables">
      
      Add the following two lines to your application code:

      <Tabs>
        <Tab title="Python">
          ```python
          import openlit
    
          openlit.init()
          ```

        </Tab>
        <Tab title="Typescript">
          ```typescript
          import openlit from "openlit"
    
          openlit.init()
          ```
        </Tab>
        </Tabs>

      Then, configure your OTLP endpoint using an environment variable:
      ```shell
      export OTEL_EXPORTER_OTLP_ENDPOINT="YOUR_OTEL_ENDPOINT"
      ```

      Replace:
      1. `YOUR_OTEL_ENDPOINT` with the URL of your OpenTelemetry backend, such as `http://127.0.0.1:4318` if you are using OpenLIT and a local OTel Collector. 
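
      If your observability backend requires authentication, the standard OpenTelemetry environment variables can also carry request headers. A sketch — the header value below is a placeholder:

      ```shell
      # Optional: attach auth headers for backends that require them
      export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_API_TOKEN"
      ```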
      </Tab>
    </Tabs>
    To send metrics and traces to other Observability tools, refer to the [Connections Guide](/latest/connections/intro).

    For more advanced configurations and application use cases, visit the [OpenLIT Python repository](https://github.com/openlit/openlit/tree/main/sdk/python) or [OpenLIT Typescript repository](https://github.com/openlit/openlit/tree/main/sdk/typescript).
  </Step>
</Steps>

---

<CardGroup cols={2}>
<Card title="Connections" href="/latest/connections/intro" icon='link'>
Connect to your existing Observability Stack
</Card>
<Card title="SDK configuration" href="/latest/sdk-configuration" icon='file-code'>
Documentation of the configuration options for the OpenLIT SDK.
</Card>
</CardGroup>
