AIMessage class in LangChain
Posted: Nov 15, 2024.
When building AI applications with LangChain or similar frameworks, the behavior of your large language model (LLM) is the core determinant of your app's success, so continuous observation and tuning of AI responses is essential. The `AIMessage` class is typically created automatically by LangChain during interactions, but it also gives developers the flexibility to manage AI responses and attach additional metadata.
What is the AIMessage Class in LangChain?
The `AIMessage` class represents messages generated by an AI model. It belongs to the `langchain_core.messages.ai` module and extends the `BaseMessage` class. `AIMessage` provides a structured format for AI responses, combining the response content with optional metadata, which makes workflows easier to manage.
How Does AIMessage Fit into LangChain Workflows?
LangChain workflows revolve around user prompts and AI responses. The `BaseMessage` class provides a common structure and serialization support for the different message types: `HumanMessage` represents user input, while `AIMessage` represents the AI's reply. `AIMessage` handles three main responsibilities:
- **Content Handling**: Manages the content produced by the model, ranging from simple text to complex data.
- **Metadata Integration**: Can include additional metadata, such as token counts or tool calls, for evaluation and debugging.
- **Serialization**: Supports serialization for easy storage or transfer of AI responses.
Here’s a simple example to help you get started with `AIMessage`. Begin by installing the LangChain package:
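The original install command was not preserved; assuming the standard pip packages (the `langchain-openai` extra is only needed for the `ChatOpenAI` example later):

```shell
pip install langchain langchain-openai
```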
Once installed, you can create a basic `AIMessage` like this:
The `content` parameter holds the response text. You can add extra information to a response, such as tools used or confidence levels, using `additional_kwargs`.
LangChain's chat models return their responses serialized as `AIMessage` objects. The `ChatOpenAI` class, for example, returns its response in exactly this form.
Here are some more attributes of `AIMessage`:
| Parameter | Description |
|---|---|
| `content` | The message content (string or list). |
| `additional_kwargs` | Extra metadata, such as payload information. |
| `id` | Optional unique identifier for the message. |
| `tool_calls` | List of tool calls associated with the message. |
| `response_metadata` | Metadata such as token counts or response headers. |
| `type` | Always set to `'ai'` to indicate an AI-generated message. |
Simplifying Observability with Lunary
Adding observability to your AI applications is important for performance, reliability, and user satisfaction.
Lunary offers an observability suite for AI developers, providing analytics, logging, and tracing capabilities that let you monitor and analyze your LLM interactions.
Integrating Lunary into your LangChain application is straightforward:
After creating an account on Lunary and setting up an app, you can obtain your app's tracking ID.
In your Python code, import the necessary modules and set up the `LunaryCallbackHandler`:
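A sketch, assuming the `lunary` package exposes `LunaryCallbackHandler` with an `app_id` parameter (check Lunary's current docs for the exact import path); the tracking ID is a placeholder and the call requires valid credentials:

```python
from langchain_openai import ChatOpenAI
from lunary import LunaryCallbackHandler

# Pass your app's tracking ID (or set the LUNARY_PUBLIC_KEY env var instead)
handler = LunaryCallbackHandler(app_id="your-tracking-id")

# Attach the handler so every call is logged and traced by Lunary
llm = ChatOpenAI(callbacks=[handler])

response = llm.invoke("Hello!")
print(response.content)
```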
This setup ensures that all interactions with the ChatOpenAI model are monitored and logged by Lunary.