Prompt Templates

Prompt templates are a way to store, version and collaborate on prompts.

Developers use prompt templates to:

  • clean up their source code
  • make edits to prompts without re-deploying code
  • collaborate with non-technical teammates
  • A/B test prompts

Creating a template

You can create a prompt template by clicking on the "Create prompt template" button in the Prompts section of the dashboard.
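
Templates support variables that are filled in when the template is rendered. In the examples below, passing {"name": "John"} injects a value for a {{name}} placeholder defined in the template.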

Usage with OpenAI

You can use templates seamlessly with OpenAI's API via our SDKs. This ensures each rendered prompt is tracked automatically.

import lunary
from openai import OpenAI

client = OpenAI()

# Make sure your OpenAI instance is monitored
lunary.monitor(client)

template = lunary.render_template("template-slug", {
  "name": "John", # Inject variables
})

result = client.chat.completions.create(**template)
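
Note that render_template returns a dictionary of OpenAI-compatible parameters (messages, model, temperature, and so on), which is why it can be unpacked directly into chat.completions.create with **template.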

Usage with LangChain's templates

You can pull templates in the LangChain format and use them directly as PromptTemplate and ChatPromptTemplate classes.

Example with simple text template:

The get_langchain_template method returns a PromptTemplate object for simple templates, which can be used directly in chains or to format prompts.

import lunary

template = lunary.get_langchain_template("my-template")

prompt = template.format(question="What is the capital of France?")
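
Because the returned object is a standard PromptTemplate, it can also be composed directly into a chain. A minimal sketch, assuming langchain_openai is installed and OPENAI_API_KEY is set:

from langchain_openai import ChatOpenAI

# Pipe the pulled template into a chat model (LCEL composition)
chain = template | ChatOpenAI()

result = chain.invoke({"question": "What is the capital of France?"})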

Example with a Chat template (ChatPromptTemplate):

The get_langchain_template method returns a ChatPromptTemplate object for chat message templates, which can be used directly in chains or to format messages.

import lunary

lc_template = lunary.get_langchain_template("my-template")

messages = lc_template.format_messages(question="What is the capital of France?")
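
The formatted messages can then be passed to any LangChain chat model. A minimal sketch, again assuming langchain_openai:

from langchain_openai import ChatOpenAI

chat_model = ChatOpenAI()

# The formatted messages are standard LangChain message objects
result = chat_model.invoke(messages)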

Manual LangChain Usage with LLM Classes

Using templates with LangChain's LLM classes is similar to using them with OpenAI, but you need to convert the messages to the LangChain format and pass the template ID in the metadata.

from langchain_openai import ChatOpenAI
from langchain_community.adapters.openai import convert_openai_messages
from lunary import render_template, LunaryCallbackHandler

template = render_template("template-slug", {
  "name": "John", # Inject variables
})

chat_model = ChatOpenAI(
  model=template["model"],
  metadata={
    "templateId": template["templateId"] # Optional: this allows to reconcile the logs with the template
  },
  # add any other parameters here...
  temperature=template["temperature"],
  callbacks=[LunaryCallbackHandler()]
)

# Convert messages to LangChain format
messages = convert_openai_messages(template["messages"])

result = chat_model.invoke(messages)

Manual usage

You can also use templates manually with any LLM API by accessing the relevant fields, which are returned in OpenAI's format.

import lunary

template = lunary.render_template("template-slug", {
  "name": "John", # Inject variables
})

messages = template["messages"]
model = template["model"]
temperature = template["temperature"]
max_tokens = template["max_tokens"]

# ... use the fields like you want
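
For instance, the same fields map directly onto a raw HTTP call to OpenAI's chat completions endpoint. A minimal sketch, assuming the requests library and an OPENAI_API_KEY environment variable:

import os
import requests

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    },
)

print(response.json()["choices"][0]["message"]["content"])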
