OpenTelemetry Integration
OpenTelemetry (OTEL) is the open-source standard for tracing and monitoring distributed applications, including LLM-based workflows. Lunary offers first-class support for ingesting OTEL traces via its /v1/otel endpoint. This means you can export traces, metrics, and events from any LLM stack or framework, no matter the language or platform, directly to Lunary's observability dashboard.
Why OpenTelemetry?
- Unified tracing across polyglot apps (Python, JS, Java, Go, etc.)
- Bring-your-own instrumentation: works with OpenLIT, Arize, OpenLLMetry, MLflow, and more.
- Rich, future-proof GenAI semantic conventions.
How It Works
- Your app or framework emits OpenTelemetry trace data.
- Data is sent to the Lunary endpoint: https://api.lunary.ai/v1/otel
- Lunary's backend standardizes, stores, and displays all your traces.
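Because OTLP exporters read standard environment variables, the flow above can often be configured without code changes. A minimal sketch follows; the Authorization header name and Bearer scheme are assumptions, so check your Lunary project settings for the exact auth format:

```shell
# Point any OTLP-capable SDK at Lunary's ingestion endpoint.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.lunary.ai/v1/otel"

# Auth header shown is an assumption — substitute your Lunary project key.
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <YOUR_LUNARY_PROJECT_KEY>"

# Identifies your app in the traces it emits.
export OTEL_SERVICE_NAME="my-llm-app"
```

These variables are defined by the OpenTelemetry specification, so the same configuration works across Python, JavaScript, Java, Go, and other OTLP SDKs.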
Supported Libraries and Frameworks
You can send OTEL traces to Lunary from any library or SDK that supports the OTLP protocol, including:
- Python: opentelemetry-sdk
- JavaScript/TypeScript: @opentelemetry/api
- Instrumentation: OpenLIT, OpenLLMetry, Arize OpenInference, MLflow
- AI stacks: LangChain, LlamaIndex, Haystack, CrewAI, Semantic Kernel, and more!
Quickstart
For property mapping and advanced tips, see OTEL attribute mapping.
Supported Providers
| Model SDK | Python | TypeScript |
|---|---|---|
| Azure OpenAI | ✅ | ✅ |
| Aleph Alpha | ✅ | ❌ |
| Anthropic | ✅ | ✅ |
| Amazon Bedrock | ✅ | ✅ |
| Amazon SageMaker | ✅ | ❌ |
| Cohere | ✅ | ✅ |
| IBM watsonx | ✅ | ⏳ |
| Google Gemini | ✅ | ✅ |
| Google VertexAI | ✅ | ✅ |
| Groq | ✅ | ⏳ |
| Mistral AI | ✅ | ⏳ |
| Ollama | ✅ | ⏳ |
| OpenAI | ✅ | ✅ |
| Replicate | ✅ | ⏳ |
| together.ai | ✅ | ⏳ |
| HuggingFace Transformers | ✅ | ⏳ |
Legend: ✅ supported · ⏳ coming soon · ❌ not supported