### 3. View your telemetry in Dynatrace

Once your AI application starts sending telemetry data, you can explore it in Dynatrace:

1. **Navigate to Observability**: Go to **Observe and explore** in your Dynatrace environment
1. **Distributed traces**: Open **Distributed traces** to see your AI application traces, including LLM calls and vector database operations
1. **Services**: Check **Services** to monitor the performance and dependencies of your AI services
1. **Metrics**: Explore custom metrics under **Metrics** for token usage, costs, and other AI-specific KPIs
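As a rough sketch of how you might query this telemetry in a Notebook or the Advanced mode of a Grail-backed environment, the DQL query below aggregates token usage per model. The attribute names (`gen_ai.request.model`, `gen_ai.usage.input_tokens`, `gen_ai.usage.output_tokens`) are an assumption based on the OpenTelemetry GenAI semantic conventions that OpenLIT follows; the exact names may differ depending on your OpenLIT version:

```
fetch spans
| filter isNotNull(gen_ai.request.model)
| summarize total_input_tokens = sum(gen_ai.usage.input_tokens),
            total_output_tokens = sum(gen_ai.usage.output_tokens),
            by: { gen_ai.request.model }
```

Adjust the attribute names to match what you see on your ingested spans before relying on the results.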

Your OpenLIT-instrumented AI applications appear in Dynatrace automatically, with visibility into LLM costs, token usage, model performance, and vector database operations.
