Prompt Templates
Prompt templates are a way to store, version and collaborate on prompts.
Developers use prompt templates to:
- clean up their source code
- make edits to prompts without re-deploying code
- collaborate with non-technical teammates
- A/B test prompts
Creating a template
You can create a prompt template by clicking the "Create prompt template" button in the Prompts section of the dashboard.
Usage with OpenAI
You can use templates seamlessly with OpenAI's API through our SDKs.
This ensures the prompt is tracked automatically.
import lunary
from openai import OpenAI

client = OpenAI()

# Make sure your OpenAI instance is monitored
lunary.monitor(client)

template = lunary.render_template("template-slug", {
    "name": "John",  # Inject variables
})

result = client.chat.completions.create(**template)
Usage with LangChain's templates
You can pull templates in the LangChain format and use them directly as PromptTemplate and ChatPromptTemplate objects.
Example with a simple text template:
The get_langchain_template method returns a PromptTemplate object for simple text templates, which can be used directly in chains or to format prompts.
import lunary

template = lunary.get_langchain_template("my-template")
prompt = template.format(question="What is the capital of France?")
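Because the returned object is a standard PromptTemplate, you can also pipe it straight into a model. Below is a minimal sketch, assuming the langchain-openai package and a "my-template" template with a question variable; the chain itself is not part of the Lunary SDK:

import lunary
from langchain_openai import ChatOpenAI

# Fetch the template from Lunary and pipe it into a chat model (LCEL syntax)
template = lunary.get_langchain_template("my-template")
chain = template | ChatOpenAI()

# The template's input variables are filled in when the chain is invoked
result = chain.invoke({"question": "What is the capital of France?"})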
Example with a chat template (ChatPromptTemplate):
The get_langchain_template method returns a ChatPromptTemplate object for chat message templates, which can be used directly in chains or to format messages.
import lunary

template = lunary.get_langchain_template("my-template")
messages = template.format_messages(question="What is the capital of France?")
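The formatted messages can then be sent to any LangChain chat model. A minimal sketch, assuming ChatOpenAI as the model:

from langchain.chat_models import ChatOpenAI

# Pass the formatted messages to a chat model
chat_model = ChatOpenAI()
result = chat_model.invoke(messages)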
Manual LangChain Usage with LLM Classes
Usage with LangChain's LLM classes is similar to the OpenAI example above, but requires you to convert the messages to the LangChain format and pass the template ID in the metadata.
from langchain.chat_models import ChatOpenAI
from langchain_community.adapters.openai import convert_openai_messages
from lunary import render_template, LunaryCallbackHandler

template = render_template("template-slug", {
    "name": "John",  # Inject variables
})

chat_model = ChatOpenAI(
    model=template["model"],
    metadata={
        "templateId": template["templateId"]  # Optional: lets Lunary reconcile the logs with the template
    },
    # add any other parameters here...
    temperature=template["temperature"],
    callbacks=[LunaryCallbackHandler()],
)

# Convert messages to LangChain format
messages = convert_openai_messages(template["messages"])

result = chat_model.invoke(messages)
Manual usage
You can also use templates manually with any LLM API by accessing the relevant fields (returned in OpenAI's format).
import lunary

template = lunary.render_template("template-slug", {
    "name": "John",  # Inject variables
})

messages = template["messages"]
model = template["model"]
temperature = template["temperature"]
max_tokens = template["max_tokens"]

# ... use the fields however you want
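For example, the same fields map directly onto OpenAI's chat completions call (a sketch; any provider's SDK works the same way as long as you translate the fields accordingly):

from openai import OpenAI

client = OpenAI()

# Forward the template fields to the chat completions endpoint
result = client.chat.completions.create(
    model=model,
    messages=messages,
    temperature=temperature,
    max_tokens=max_tokens,
)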