---
title: Observability Integrations
description: AI SDK Integration for monitoring and tracing LLM applications
---

# Observability Integrations

Several LLM observability providers offer integrations that consume the AI SDK's telemetry data:

- [Axiom](/providers/observability/axiom)
- [Braintrust](/providers/observability/braintrust)
- [Helicone](/providers/observability/helicone)
- [HoneyHive](https://docs.honeyhive.ai/integrations/vercel)
- [Laminar](/providers/observability/laminar)
- [Langfuse](/providers/observability/langfuse)
- [LangSmith](/providers/observability/langsmith)
- [LangWatch](/providers/observability/langwatch)
- [Maxim](/providers/observability/maxim)
- [Scorecard](/providers/observability/scorecard)
- [Sentry](https://docs.sentry.io/platforms/javascript/guides/nextjs/configuration/integrations/vercelai/)
- [SigNoz](/providers/observability/signoz)
- [Traceloop](/providers/observability/traceloop)
- [Weave](/providers/observability/weave)
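
These integrations build on the AI SDK's OpenTelemetry-based telemetry, which is disabled by default and opted into per function call. A minimal sketch (the model choice and `functionId` value are illustrative; consult your provider's guide for its specific exporter setup):

```typescript
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Summarize the benefits of tracing LLM calls.',
  // Opt in to telemetry for this call; spans are emitted via OpenTelemetry
  // and picked up by whichever observability provider you have configured.
  experimental_telemetry: {
    isEnabled: true,
    functionId: 'summarize-tracing-benefits', // hypothetical identifier
  },
});
```

The emitted spans can then be forwarded to any of the providers above through their documented exporter or SDK configuration.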

Other providers offer monitoring and tracing for the AI SDK through model wrappers:

- [Literal AI](https://docs.literalai.com/integrations/vercel-ai-sdk)

<Note>
  Do you have an observability integration that supports the AI SDK and has an
  integration guide? Please open a pull request to add it to the list.
</Note>
