Lunary has powerful observability features that let you record and analyze your LLM calls.
There are three main observability features: analytics, logs, and traces.
Analytics and logs are automatically captured as soon as you integrate our SDK.
The following metrics are currently automatically captured:
- Costs incurred by your LLM models
- Number of LLM calls made and tokens used
- Average latency of LLM calls and agents
- Number of errors encountered by LLM calls and agents
- Usage over time by your top users
Lunary allows you to log and inspect your LLM requests and responses.
Logging is automatic as soon as you integrate our SDK.
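To make the idea concrete, here is a minimal stdlib-only sketch of how an SDK can capture logs automatically by wrapping a client's completion method once, so that every request and response is recorded along with latency and errors. The names (`MiniClient`, `instrument`, `LOGS`) are illustrative, not the Lunary API:

```python
# Hypothetical sketch: wrap an LLM call once, and every invocation is
# logged with its input, output, latency, and any error raised.
import time

LOGS = []  # a real SDK would send these events to a backend instead

def instrument(fn):
    """Wrap an LLM call so request/response pairs are logged transparently."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        event = {"input": kwargs.get("prompt"), "output": None, "error": None}
        try:
            result = fn(*args, **kwargs)
            event["output"] = result
            return result
        except Exception as exc:
            event["error"] = repr(exc)
            raise
        finally:
            event["latency_ms"] = (time.perf_counter() - start) * 1000
            LOGS.append(event)
    return wrapper

class MiniClient:
    """Stand-in for an LLM client; replies with canned text."""
    @instrument
    def complete(self, prompt):
        return f"echo: {prompt}"

client = MiniClient()
client.complete(prompt="Hello")
print(LOGS[0]["output"])  # → echo: Hello
```

Because the wrapper sits between your code and the client, no call sites need to change, which is why logging starts working as soon as the SDK is integrated.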
Tracing is helpful for debugging more complex AI agents and troubleshooting issues.
The easiest way to get started with traces is to use our utility wrappers to automatically track your agents and tools.
By wrapping an agent, its inputs, outputs, and errors are automatically tracked.
Any query run inside the agent will be tied to the agent.
If you prefer to use anonymous functions, make sure to pass a name as a second argument to the wrapper.
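The shape of such a wrapper can be sketched in a few lines of stdlib Python. `wrap_agent` and `EVENTS` below are hypothetical names for illustration, not the real SDK: the wrapper records inputs, outputs, and errors around the wrapped callable, and accepts an explicit name for anonymous functions that have no useful `__name__`:

```python
# Illustrative agent wrapper (hypothetical, not the actual Lunary API):
# captures input/output/error events, with an optional explicit name.
EVENTS = []

def wrap_agent(fn, name=None):
    agent_name = name or fn.__name__
    def wrapper(*args, **kwargs):
        EVENTS.append({"type": "agent_start", "name": agent_name, "input": args})
        try:
            output = fn(*args, **kwargs)
            EVENTS.append({"type": "agent_end", "name": agent_name, "output": output})
            return output
        except Exception as exc:
            EVENTS.append({"type": "agent_error", "name": agent_name, "error": repr(exc)})
            raise
    return wrapper

# A named function is tracked under its own name...
def translate(text):
    return text.upper()

translate = wrap_agent(translate)
translate("hello")

# ...while an anonymous function needs a name passed explicitly:
shout = wrap_agent(lambda s: s + "!", "Shout")
shout("hi")

print([e["name"] for e in EVENTS])  # → ['translate', 'translate', 'Shout', 'Shout']
```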
If your agents use tools, you can wrap them as well to track them.
If a wrapped tool is executed inside a wrapped agent, the tool will be automatically tied to the agent without the need to manually reconcile them.
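One common way this automatic tying can work is through context propagation: the agent wrapper sets a context variable holding the current run's id, and the tool wrapper reads it to find its parent. The sketch below uses that technique with hypothetical names (`wrap_agent`, `wrap_tool`, `EVENTS`); it is not the Lunary implementation:

```python
# Sketch: a wrapped tool finds its enclosing wrapped agent via a
# context variable, so no manual reconciliation is needed.
import contextvars
import itertools

current_agent = contextvars.ContextVar("current_agent", default=None)
_ids = itertools.count(1)
EVENTS = []

def wrap_agent(fn):
    def wrapper(*args, **kwargs):
        run_id = next(_ids)
        token = current_agent.set(run_id)  # mark this agent run as "current"
        EVENTS.append({"type": "agent", "id": run_id, "name": fn.__name__})
        try:
            return fn(*args, **kwargs)
        finally:
            current_agent.reset(token)
    return wrapper

def wrap_tool(fn):
    def wrapper(*args, **kwargs):
        # The tool picks up its parent agent from the context automatically.
        EVENTS.append({"type": "tool", "name": fn.__name__,
                       "parent": current_agent.get()})
        return fn(*args, **kwargs)
    return wrapper

@wrap_tool
def search(query):
    return f"results for {query}"

@wrap_agent
def researcher(topic):
    return search(topic)

researcher("llm observability")
print(EVENTS[1]["parent"])  # → 1, the enclosing agent's run id
```

Using a `ContextVar` rather than a global keeps the parent/child link correct even when agents run concurrently, since each execution context sees its own current agent.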