Chats & Threads

Record and replay chat conversations in your chatbot app. This helps you understand where your chatbot falls short and how to improve it.

Chats integrate seamlessly with traces by reconciling messages with LLM calls and agents.

You can record chats on the backend, or directly on the frontend if that's easier for you.

Set up the SDK
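
Install the lunary package and make your project key available to it. A minimal sketch, assuming the key is read from the LUNARY_PUBLIC_KEY environment variable:

# Install the SDK first: pip install lunary
# Assumption: the project key is provided via the LUNARY_PUBLIC_KEY environment variable
import lunary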

Open a thread

Start by opening a thread.

thread = lunary.open_thread()

You can resume an existing thread by passing its ID.

# Save `thread.id` somewhere
existing_thread_id = 'your-thread-id'  # Replace with your actual thread ID
thread = lunary.open_thread(existing_thread_id)

You can also add tags to a thread by passing a tags parameter:

thread = lunary.open_thread(existing_thread_id, tags=['support'])

Track messages

Now you can track messages. The supported roles are assistant, user, system, & tool.

thread.track_message({
  "role": "user",
  "content": "Hello, please help me"
})

thread.track_message({
  "role": "assistant",
  "content": "Hello, how can I help you?"
})

Track custom events

You can also track custom events.

thread.track_event("event-name")

# you can also use the following optional parameters
thread.track_event(
    "event-name",
    user_id="user1",
    user_props={"email": "hello@test.com"},
    metadata={},
)

Capture user feedback

Finally, you can track user feedback on bot replies, using the message ID returned by track_message:

msg_id = thread.track_message({
  "role": "assistant",
  "content": "Hope you like my answers :)"
})

lunary.track_feedback(msg_id, { "thumb": "up" })

To remove feedback, pass None as the value:

lunary.track_feedback(msg_id, { "thumb": None })

Reconcile with LLM calls & agents

To take full advantage of Lunary's tracing capabilities, you can reconcile your LLM and agent runs with chat messages. Lunary reconciles messages with runs automatically.
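
For example, with the OpenAI Python client (a minimal sketch, assuming the client has been instrumented with lunary.monitor so that the parent parameter in the next snippet is picked up):

from openai import OpenAI

import lunary

client = OpenAI()
lunary.monitor(client)  # trace calls made with this client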

message = {"role": "user", "content": "Hello!"}
msg_id = thread.track_message(message)

chat_completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[message],
    parent=msg_id,  # links this LLM call to the tracked message
)

thread.track_message(
    {"role": "assistant", "content": chat_completion.choices[0].message.content}
)

If you're using LangChain or agents behind your chatbot, you can inject the current message ID into the context as a parent:

msg_id = thread.track_message({ "role": "user", "content": "Hello!" })

# In your backend, inject the message ID into the context

with lunary.parent(msg_id):
    # your custom code: runs started here are attached to the message as children
    pass

Note that if you're tracking chats directly on the frontend, it's safe to pass the message ID from your frontend to your backend.
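
For example, a backend route can receive the message ID from the frontend and use it as the parent for the runs it starts. A minimal sketch using Flask; the /chat route and the message_id field are illustrative:

from flask import Flask, request

import lunary

app = Flask(__name__)

@app.route("/chat", methods=["POST"])
def chat():
    # The frontend sends along the ID it received from track_message
    msg_id = request.json["message_id"]

    with lunary.parent(msg_id):
        # Run your chain or agent here; its runs are attached to the message
        ...

    return {"ok": True}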
