LiteLLM Integration

LiteLLM provides callbacks, making it easy for you to log your completion responses.

Using Callbacks

First, sign up on the Lunary dashboard to get your app ID (public key).

With these two lines of code, you can instantly log your responses across all providers with Lunary:

litellm.success_callback = ["lunary"]
litellm.failure_callback = ["lunary"]
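Note: the Lunary callback relies on Lunary's Python SDK, so you will likely also need the lunary package installed (pip install lunary) alongside litellm.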

Complete code

import os
import litellm
from litellm import completion

## set env variables
os.environ["LUNARY_PUBLIC_KEY"] = "your-lunary-public-key"
# Optional: os.environ["LUNARY_API_URL"] = "self-hosting-url"

# set your provider API keys
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
os.environ["COHERE_API_KEY"] = "your-cohere-api-key"

# set callbacks
litellm.success_callback = ["lunary"]
litellm.failure_callback = ["lunary"]

# OpenAI call
response = completion(
  model="gpt-4o", 
  messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}],
  user="some_user_id",
  metadata={"tags": ["tag1", "tag2"]}
)

# Cohere call
response = completion(
  model="command-nightly", 
  messages=[{"role": "user", "content": "Hi 👋 - i'm cohere"}],
  user="some_user_id"
)
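
Because litellm.failure_callback is also set, errored calls are reported to Lunary as well. Here is a minimal sketch; the model name below is a deliberately invalid placeholder used only to trigger a failure:

# failed call - also logged to Lunary via the failure callback
try:
    completion(
        model="gpt-4o-invalid",  # intentionally invalid model name (placeholder)
        messages=[{"role": "user", "content": "This call will fail"}],
        user="some_user_id"
    )
except Exception as e:
    # the exception is still raised locally; the failure event is sent to Lunary
    print(f"Call failed: {e}")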

Questions? We're here to help.
