Working with the SelfAskOutputParser in LangChain

Posted: Jan 29, 2025.

The SelfAskOutputParser is a specialized output parser in LangChain designed to handle the output of self-ask style conversations with language models. In this guide, we'll explore what it is, how it works, and how to use it effectively.

What is SelfAskOutputParser?

SelfAskOutputParser is an output parser that handles responses from language models in a self-ask conversation format. It specifically looks for two types of patterns:

  1. Follow-up questions that indicate more information is needed
  2. Final answers that conclude the conversation

The parser converts the LLM's output into either an AgentAction (when a follow-up is needed) or an AgentFinish (when a final answer is provided).
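
Concretely, the two result shapes look roughly like this (a minimal sketch using the standard langchain.schema types; the values are illustrative):

from langchain.schema import AgentAction, AgentFinish

# Follow-up detected: the agent should call a tool with the follow-up question
action = AgentAction(
    tool="Intermediate Answer",          # the tool name the self-ask agent expects
    tool_input="What is the population of Paris?",
    log="Follow up: What is the population of Paris?",
)

# Final answer detected: the agent run is finished
finish = AgentFinish(
    return_values={"output": "Paris has roughly 2.1 million inhabitants."},
    log="So the final answer is: Paris has roughly 2.1 million inhabitants.",
)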

Reference

Here are the key parameters of the SelfAskOutputParser:

Parameter     | Type          | Description
finish_string | str           | The string that indicates a final answer is being given (default: "So the final answer is: ")
followups     | Sequence[str] | The strings that indicate a follow-up question (default: ("Follow up:", "Followup:"))
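
Both fields can be read off an instance, so it's easy to double-check the defaults against your installed LangChain version:

from langchain.agents.output_parsers.self_ask import SelfAskOutputParser

parser = SelfAskOutputParser()
print(parser.finish_string)  # "So the final answer is: "
print(parser.followups)      # ("Follow up:", "Followup:")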

How to use SelfAskOutputParser

Basic Usage

Here's how to create and use a SelfAskOutputParser:

from langchain.agents.output_parsers.self_ask import SelfAskOutputParser

# Create the parser
parser = SelfAskOutputParser()

# Example of parsing a follow-up question
follow_up_text = """
Thoughts: I need to know more about the weather.
Follow up: what is the temperature in SF?
"""
result = parser.parse(follow_up_text)
# Returns an AgentAction with the follow-up question

# Example of parsing a final answer
final_answer_text = """
Thoughts: I have all the information needed.
So the final answer is: The temperature is 72 degrees
"""
result = parser.parse(final_answer_text)
# Returns an AgentFinish with the final answer
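
Because parse returns the standard agent types, you can branch on the result with isinstance and pull out the interesting fields:

from langchain.schema import AgentAction, AgentFinish

result = parser.parse(follow_up_text)
if isinstance(result, AgentAction):
    print(result.tool_input)               # "what is the temperature in SF?"

result = parser.parse(final_answer_text)
if isinstance(result, AgentFinish):
    print(result.return_values["output"])  # "The temperature is 72 degrees"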

Customizing the Parser

You can customize the strings that indicate follow-ups or final answers:

# Custom strings for parsing
parser = SelfAskOutputParser(
    finish_string="The conclusion is: ",
    followups=("Question:", "Next question:")
)

# Now it will look for different indicators
text = """
Thoughts: Need more information
Question: What time is it in London?
"""
result = parser.parse(text)
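
With those settings in place, a completion ending in the custom finish string is treated as a final answer:

final_text = "Thoughts: That's everything I need.\nThe conclusion is: It is 3pm in London"
result = parser.parse(final_text)
# Returns an AgentFinish whose output is "It is 3pm in London"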

Integration with an Agent

The SelfAskOutputParser is typically used as part of an agent that needs to break down complex questions into simpler ones:

from langchain.agents import AgentType, Tool, initialize_agent
from langchain.llms import OpenAI

# Create the parser
parser = SelfAskOutputParser()

# The self-ask-with-search agent requires exactly one tool named "Intermediate Answer"
search_tool = Tool(
    name="Intermediate Answer",
    func=lambda q: "placeholder answer",  # swap in a real search function here
    description="Search for an answer to a follow-up question",
)

# Initialize an agent with the parser
llm = OpenAI(temperature=0)
agent = initialize_agent(
    tools=[search_tool],
    llm=llm,
    agent=AgentType.SELF_ASK_WITH_SEARCH,
    agent_kwargs={"output_parser": parser},  # the parser is configured on the agent itself
    verbose=True,
)

# The agent will use the parser to handle the LLM's responses
response = agent.run("What is the capital of France and what is its population?")
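
Note that the SELF_ASK_WITH_SEARCH agent type already uses SelfAskOutputParser as its default output parser, so passing your own instance through agent_kwargs is mainly useful when you've customized the follow-up or finish strings as shown earlier.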

Error Handling

The parser raises an OutputParserException when the last line of the text contains neither a follow-up marker nor the finish string:

from langchain.schema import OutputParserException

try:
    # This raises because the text doesn't match the expected format
    parser.parse("This is just some random text")
except OutputParserException as e:
    print(f"Failed to parse text: {e}")

Remember that the SelfAskOutputParser expects the LLM output to follow a specific format. Make sure your prompts guide the LLM to provide responses in the correct format with either follow-up questions or final answers clearly marked.
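
For example, a few-shot snippet along these lines (a sketch of the self-ask format, not the exact prompt that ships with LangChain) keeps the model's output in a shape the default parser can handle:

self_ask_format = """Answer the question, asking follow up questions when needed.

Question: Who lived longer, Muhammad Ali or Alan Turing?
Are follow up questions needed here: Yes.
Follow up: How old was Muhammad Ali when he died?
Intermediate answer: Muhammad Ali was 74 years old when he died.
Follow up: How old was Alan Turing when he died?
Intermediate answer: Alan Turing was 41 years old when he died.
So the final answer is: Muhammad Ali
"""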
