Few Shot Chat Prompt (#8038)

Proposal for a few shot chat message example selector

---------

Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
William FH 11 months ago committed by GitHub
parent 6dd18eee26
commit ecd4aae818


@ -7,142 +7,418 @@
"source": [
"# Few shot examples for chat models\n",
"\n",
"This notebook covers how to use few shot examples in chat models. There does not appear to be solid consensus on how best to do few shot prompting, and the optimal prompt compilation will likely vary by model. Because of this, we provide few-shot prompt templates like the [FewShotChatMessagePromptTemplate](https://api.python.langchain.com/en/latest/prompts/langchain.prompts.few_shot.FewShotChatMessagePromptTemplate.html) as a flexible starting point, and you can modify or replace them as you see fit.\n",
"\n",
"The goal of few-shot prompt templates is to dynamically select examples based on an input, and then format the examples into a final prompt to provide to the model.\n",
"\n",
"\n",
"**Note:** The following code examples are for chat models. For similar few-shot prompt examples for completion models (LLMs), see the [few-shot prompt templates](few_shot_examples) guide."
]
},
{
"cell_type": "markdown",
"id": "d716f2de-cc29-4823-9360-a808c7bfdb86",
"metadata": {
"tags": []
},
"source": [
"### Fixed Examples\n",
"\n",
"The most basic (and common) few-shot prompting technique is to use fixed prompt examples. This lets you select a chain, evaluate it, and avoid worrying about additional moving parts in production.\n",
"\n",
"The basic components of the template are:\n",
"- `examples`: A list of dictionary examples to include in the final prompt.\n",
"- `example_prompt`: converts each example into 1 or more messages through its [`format_messages`](https://api.python.langchain.com/en/latest/prompts/langchain.prompts.chat.ChatPromptTemplate.html#langchain.prompts.chat.ChatPromptTemplate.format_messages) method. A common example would be to convert each example into one human message and one AI message response, or a human message followed by a function call message.\n",
"\n",
"Below is a simple demonstration. First, import the modules for this example:"
]
},
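Before the LangChain code below, the expansion these two components perform can be sketched without any dependencies. The helper names here are hypothetical and only illustrate the idea: each example dict is passed through a per-example formatter that yields role/text pairs, and the results are concatenated in order.

```python
# Dependency-free sketch (hypothetical helpers, not LangChain API) of how a
# few-shot chat template turns example dicts into an alternating message list.

def format_example(example):
    # Mirrors "one human message and one AI message response" per example.
    return [("human", example["input"]), ("ai", example["output"])]

def few_shot_messages(examples, example_formatter):
    # Concatenate the formatted messages of every example, preserving order.
    return [msg for ex in examples for msg in example_formatter(ex)]

examples = [
    {"input": "2+2", "output": "4"},
    {"input": "2+3", "output": "5"},
]

messages = few_shot_messages(examples, format_example)
```

`FewShotChatMessagePromptTemplate` plays the role of `few_shot_messages`, and `example_prompt.format_messages` plays the role of `format_example`.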
{
"cell_type": "code",
"execution_count": 7,
"id": "91f1ca7f-a748-44c7-a1c6-a89a2d1414ba",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from langchain.schema import SystemMessage\n",
"from langchain.prompts import (\n",
" FewShotChatMessagePromptTemplate,\n",
" HumanMessagePromptTemplate,\n",
" AIMessagePromptTemplate,\n",
" SystemMessagePromptTemplate,\n",
")"
]
},
{
"cell_type": "markdown",
"id": "2844d5ed-c3cc-4bc3-9462-384fc1618b45",
"metadata": {},
"source": [
"Then, define the examples you'd like to include."
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "0fc5a02a-6249-4e92-95c3-30fff9671e8b",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"examples = [\n",
" {\"input\": \"2+2\", \"output\": \"4\"},\n",
" {\"input\": \"2+3\", \"output\": \"5\"},\n",
"]"
]
},
{
"cell_type": "markdown",
"id": "e8710ecc-2aa0-4172-a74c-250f6bc3d9e2",
"metadata": {},
"source": [
"Next, assemble them into the few-shot prompt template."
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "65e72ad1-9060-47d0-91a1-bc130c8b98ac",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Human: 2+2\n",
"AI: 4\n",
"Human: 2+3\n",
"AI: 5\n"
]
}
],
"source": [
"# This is a prompt template used to format each individual example.\n",
"example_prompt = HumanMessagePromptTemplate.from_template(\n",
" \"{input}\"\n",
") + AIMessagePromptTemplate.from_template(\"{output}\")\n",
"few_shot_prompt = FewShotChatMessagePromptTemplate(\n",
" example_prompt=example_prompt,\n",
" examples=examples,\n",
")\n",
"\n",
"print(few_shot_prompt.format())"
]
},
{
"cell_type": "markdown",
"id": "5490bd59-b28f-46a4-bbdf-0191802dd3c5",
"metadata": {},
"source": [
"Finally, assemble your final prompt and use it with a model."
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "9f86d6d9-50de-41b6-b6c7-0f9980cc0187",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"final_prompt = (\n",
" SystemMessagePromptTemplate.from_template(\"You are a wondrous wizard of math.\")\n",
" + few_shot_prompt\n",
" + HumanMessagePromptTemplate.from_template(\"{input}\")\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "97d443b1-6fae-4b36-bede-3ff7306288a3",
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=' Triangles do not have a \"square\". A square refers to a shape with 4 equal sides and 4 right angles. Triangles have 3 sides and 3 angles.\\n\\nThe area of a triangle can be calculated using the formula:\\n\\nA = 1/2 * b * h\\n\\nWhere:\\n\\nA is the area \\nb is the base (the length of one of the sides)\\nh is the height (the length from the base to the opposite vertex)\\n\\nSo the area depends on the specific dimensions of the triangle. There is no single \"square of a triangle\". The area can vary greatly between different triangles.', additional_kwargs={}, example=False)"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain.chat_models import ChatAnthropic\n",
"\n",
"chain = final_prompt | ChatAnthropic(temperature=0.0)\n",
"\n",
"chain.invoke({\"input\": \"What's the square of a triangle?\"})"
]
},
{
"cell_type": "markdown",
"id": "70ab7114-f07f-46be-8874-3705a25aba5f",
"metadata": {},
"source": [
"## Dynamic Few-shot Prompting\n",
"\n",
"Sometimes you may want to condition which examples are shown based on the input. For this, you can replace the `examples` with an `example_selector`. The other components remain the same as above! To review, the dynamic few-shot prompt template would look like:\n",
"\n",
"- `example_selector`: responsible for selecting few-shot examples (and the order in which they are returned) for a given input. These implement the [BaseExampleSelector](https://api.python.langchain.com/en/latest/prompts/langchain.prompts.example_selector.base.BaseExampleSelector.html#langchain.prompts.example_selector.base.BaseExampleSelector) interface. A common example is the vectorstore-backed [SemanticSimilarityExampleSelector](https://api.python.langchain.com/en/latest/prompts/langchain.prompts.example_selector.semantic_similarity.SemanticSimilarityExampleSelector.html#langchain.prompts.example_selector.semantic_similarity.SemanticSimilarityExampleSelector)\n",
"- `example_prompt`: convert each example into 1 or more messages through its [`format_messages`](https://api.python.langchain.com/en/latest/prompts/langchain.prompts.chat.ChatPromptTemplate.html#langchain.prompts.chat.ChatPromptTemplate.format_messages) method. A common example would be to convert each example into one human message and one AI message response, or a human message followed by a function call message.\n",
"\n",
"These once again can be composed with other messages and chat templates to assemble your final prompt."
]
},
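The selection step can be sketched without a vectorstore at all. The toy code below (an illustration, not the LangChain implementation) embeds texts as word counts and returns the `k` examples with the highest cosine similarity to the input, which is the same shape of behavior `SemanticSimilarityExampleSelector` provides over real embeddings.

```python
# Toy similarity-based example selection: bag-of-words embeddings + cosine.
from collections import Counter
from math import sqrt

def embed(text):
    # Crude stand-in for a real embedding model.
    return Counter(text.lower().replace("?", "").split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def select_examples(examples, query, k=2):
    # Rank examples by similarity of their full text to the query.
    scored = sorted(
        examples,
        key=lambda ex: cosine(embed(query), embed(ex["input"] + " " + ex["output"])),
        reverse=True,
    )
    return scored[:k]

examples = [
    {"input": "2+2", "output": "4"},
    {"input": "What did the cow say to the moon?", "output": "nothing at all"},
]
selected = select_examples(examples, "tell me about the moon", k=1)
```

A moon-related query selects the moon-related example, just as the Chroma-backed selector does later in this notebook.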
{
"cell_type": "code",
"execution_count": 15,
"id": "6f7b5e86-4ca7-4edd-bf2b-9663030b2393",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from langchain.prompts import SemanticSimilarityExampleSelector\n",
"from langchain.embeddings import OpenAIEmbeddings\n",
"from langchain.vectorstores import Chroma"
]
},
{
"cell_type": "markdown",
"id": "303b3f81-8d17-4fa2-81b1-e10bf34dd514",
"metadata": {},
"source": [
"Since we are using a vectorstore to select examples based on semantic similarity, we will want to first populate the store."
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "ad66f06a-66fd-4fcc-8166-5d0e3c801e57",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"examples = [\n",
" {\"input\": \"2+2\", \"output\": \"4\"},\n",
" {\"input\": \"2+3\", \"output\": \"5\"},\n",
" {\"input\": \"2+4\", \"output\": \"6\"},\n",
" {\"input\": \"What did the cow say to the moon?\", \"output\": \"nothing at all\"},\n",
" {\n",
" \"input\": \"Write me a poem about the moon\",\n",
" \"output\": \"One for the moon, and one for me, who are we to talk about the moon?\",\n",
" },\n",
"]\n",
"\n",
"to_vectorize = [\" \".join(example.values()) for example in examples]\n",
"embeddings = OpenAIEmbeddings()\n",
"vectorstore = Chroma.from_texts(to_vectorize, embeddings, metadatas=examples)"
]
},
{
"cell_type": "markdown",
"id": "2f7e384a-2031-432b-951c-7ea8cf9262f1",
"metadata": {},
"source": [
"#### Create the `example_selector`\n",
"\n",
"With a vectorstore created, you can create the `example_selector`. Here we will instruct it to only fetch the top 2 examples."
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "7790303a-f722-452e-8921-b14bdf20bdff",
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"[{'input': 'What did the cow say to the moon?', 'output': 'nothing at all'},\n",
" {'input': '2+4', 'output': '6'}]"
]
},
"execution_count": 17,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"example_selector = SemanticSimilarityExampleSelector(\n",
" vectorstore=vectorstore,\n",
" k=2,\n",
")\n",
"\n",
"# The prompt template will load examples by passing the input to the `select_examples` method\n",
"example_selector.select_examples({\"input\": \"horse\"})"
]
},
{
"cell_type": "markdown",
"id": "cc77c40f-3f58-40a2-b757-a2a2ea43f24a",
"metadata": {},
"source": [
"#### Create prompt template\n",
"\n",
"Assemble the prompt template, using the `example_selector` created above."
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "253c255e-41d7-45f6-9d88-c7a0ced4b1bd",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from langchain.schema import SystemMessage\n",
"from langchain.prompts import ChatPromptTemplate, HumanMessagePromptTemplate\n",
"from langchain.prompts.few_shot import FewShotChatMessagePromptTemplate\n",
"\n",
"\n",
"# Define the few-shot prompt.\n",
"few_shot_prompt = FewShotChatMessagePromptTemplate(\n",
" # The input variables select the values to pass to the example_selector\n",
" input_variables=[\"input\"],\n",
" example_selector=example_selector,\n",
" # Define how each example will be formatted.\n",
" # In this case, each example will become 2 messages:\n",
" # 1 human, and 1 AI\n",
" example_prompt=(\n",
" HumanMessagePromptTemplate.from_template(\"{input}\")\n",
" + AIMessagePromptTemplate.from_template(\"{output}\")\n",
" ),\n",
")"
]
},
{
"cell_type": "markdown",
"id": "d960a471-1e1d-4742-ae49-dd0afcdb34d5",
"metadata": {},
"source": [
"Below is an example of how this would be assembled."
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "860bf682-c469-40e9-b657-27bfe7026099",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Human: 2+3\n",
"AI: 5\n",
"Human: 2+2\n",
"AI: 4\n"
]
}
],
"source": [
"print(few_shot_prompt.format(input=\"What's 3+3?\"))"
]
},
{
"cell_type": "markdown",
"id": "339cae7d-0eb0-44a6-852f-0267c5ff72b3",
"metadata": {},
"source": [
"Assemble the final prompt template:"
]
},
{
"cell_type": "code",
"execution_count": 24,
"id": "e731cb45-f0ea-422c-be37-42af2a6cb2c4",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"final_prompt = (\n",
" SystemMessagePromptTemplate.from_template(\"You are a wondrous wizard of math.\")\n",
" + few_shot_prompt\n",
" + HumanMessagePromptTemplate.from_template(\"{input}\")\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 25,
"id": "e6cc4199-8947-42d7-91f0-375de1e15bd9",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Human: 2+3\n",
"AI: 5\n",
"Human: 2+2\n",
"AI: 4\n"
]
}
],
"source": [
"print(few_shot_prompt.format(input=\"What's 3+3?\"))"
]
},
{
"cell_type": "markdown",
"id": "2408ea69-1880-4ef5-a0fa-ffa8d2026aa9",
"metadata": {},
"source": [
"#### Use with an LLM\n",
"\n",
"Now, you can connect your model to the few-shot prompt."
]
},
{
"cell_type": "code",
"execution_count": 26,
"id": "0568cbc6-5354-47f1-ab4d-dfcc616cf583",
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=' 3 + 3 = 6', additional_kwargs={}, example=False)"
]
},
"execution_count": 26,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain.chat_models import ChatAnthropic\n",
"\n",
"chain = final_prompt | ChatAnthropic(temperature=0.0)\n",
"\n",
"chain.invoke({\"input\": \"What's 3+3?\"})"
]
}
],
@ -162,7 +438,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.2"
}
},
"nbformat": 4,

@ -15,7 +15,10 @@ from langchain.prompts.example_selector import (
NGramOverlapExampleSelector,
SemanticSimilarityExampleSelector,
)
from langchain.prompts.few_shot import (
FewShotChatMessagePromptTemplate,
FewShotPromptTemplate,
)
from langchain.prompts.few_shot_with_templates import FewShotPromptWithTemplates
from langchain.prompts.loading import load_prompt
from langchain.prompts.pipeline import PipelinePromptTemplate
@ -42,4 +45,5 @@ __all__ = [
"StringPromptTemplate",
"SystemMessagePromptTemplate",
"load_prompt",
"FewShotChatMessagePromptTemplate",
]

@ -318,14 +318,18 @@ class ChatPromptTemplate(BaseChatPromptTemplate, ABC):
input_variables: List[str]
"""List of input variables."""
messages: List[
Union[BaseMessagePromptTemplate, BaseMessage, BaseChatPromptTemplate]
]
"""List of messages consisting of either message prompt templates or messages."""
def __add__(self, other: Any) -> ChatPromptTemplate:
# Allow for easy combining
if isinstance(other, ChatPromptTemplate):
return ChatPromptTemplate(messages=self.messages + other.messages)
elif isinstance(
other, (BaseMessagePromptTemplate, BaseMessage, BaseChatPromptTemplate)
):
return ChatPromptTemplate(messages=self.messages + [other])
elif isinstance(other, str):
prompt = HumanMessagePromptTemplate.from_template(other)
@ -349,7 +353,7 @@ class ChatPromptTemplate(BaseChatPromptTemplate, ABC):
messages = values["messages"]
input_vars = set()
for message in messages:
if isinstance(message, (BaseMessagePromptTemplate, BaseChatPromptTemplate)):
input_vars.update(message.input_variables)
if "partial_variables" in values:
input_vars = input_vars - set(values["partial_variables"])
@ -475,7 +479,9 @@ class ChatPromptTemplate(BaseChatPromptTemplate, ABC):
for message_template in self.messages:
if isinstance(message_template, BaseMessage):
result.extend([message_template])
elif isinstance(
message_template, (BaseMessagePromptTemplate, BaseChatPromptTemplate)
):
rel_params = {
k: v
for k, v in kwargs.items()

@ -1,24 +1,24 @@
"""Prompt template that contains few shot examples."""
from __future__ import annotations
from typing import Any, Dict, List, Optional, Union
from pydantic import BaseModel, Extra, Field, root_validator
from langchain.prompts.base import (
DEFAULT_FORMATTER_MAPPING,
StringPromptTemplate,
check_valid_template,
)
from langchain.prompts.chat import BaseChatPromptTemplate, BaseMessagePromptTemplate
from langchain.prompts.example_selector.base import BaseExampleSelector
from langchain.prompts.prompt import PromptTemplate
from langchain.schema.messages import BaseMessage, get_buffer_string
class _FewShotPromptTemplateMixin(BaseModel):
"""Prompt template that contains few shot examples."""
examples: Optional[List[dict]] = None
"""Examples to format into the prompt.
Either this or example_selector should be provided."""
@ -27,26 +27,11 @@ class FewShotPromptTemplate(StringPromptTemplate):
"""ExampleSelector to choose the examples to format into the prompt.
Either this or examples should be provided."""
class Config:
"""Configuration for this pydantic object."""
extra = Extra.forbid
arbitrary_types_allowed = True
@root_validator(pre=True)
def check_examples_and_selector(cls, values: Dict) -> Dict:
@ -65,6 +50,58 @@ class FewShotPromptTemplate(StringPromptTemplate):
return values
def _get_examples(self, **kwargs: Any) -> List[dict]:
"""Get the examples to use for formatting the prompt.
Args:
**kwargs: Keyword arguments to be passed to the example selector.
Returns:
List of examples.
"""
if self.examples is not None:
return self.examples
elif self.example_selector is not None:
return self.example_selector.select_examples(kwargs)
else:
raise ValueError(
"One of 'examples' and 'example_selector' should be provided"
)
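The precedence implemented by `_get_examples` can be exercised with a minimal stand-in (the helper and selector below are illustrations, not the LangChain classes): static `examples` always win when set, otherwise the `example_selector` is consulted with the formatting kwargs, otherwise a `ValueError` is raised.

```python
# Minimal stand-in for the examples / example_selector precedence above.
def get_examples(examples=None, example_selector=None, **kwargs):
    if examples is not None:          # static examples always win
        return examples
    if example_selector is not None:  # otherwise consult the selector
        return example_selector.select_examples(kwargs)
    raise ValueError("One of 'examples' and 'example_selector' should be provided")

class EchoSelector:
    """Trivial selector that returns the query inputs as the sole example."""
    def select_examples(self, input_variables):
        return [input_variables]

static = get_examples(examples=[{"input": "2+2", "output": "4"}])
dynamic = get_examples(example_selector=EchoSelector(), input="hi")
```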
class FewShotPromptTemplate(_FewShotPromptTemplateMixin, StringPromptTemplate):
"""Prompt template that contains few shot examples."""
@property
def lc_serializable(self) -> bool:
"""Return whether the prompt template is lc_serializable.
Returns:
Boolean indicating whether the prompt template is lc_serializable.
"""
return False
validate_template: bool = True
"""Whether or not to try validating the template."""
input_variables: List[str]
"""A list of the names of the variables the prompt template expects."""
example_prompt: PromptTemplate
"""PromptTemplate used to format an individual example."""
suffix: str
"""A prompt template string to put after the examples."""
example_separator: str = "\n\n"
"""String separator used to join the prefix, the examples, and suffix."""
prefix: str = ""
"""A prompt template string to put before the examples."""
template_format: str = "f-string"
"""The format of the prompt template. Options are: 'f-string', 'jinja2'."""
@root_validator()
def template_is_valid(cls, values: Dict) -> Dict:
"""Check that prefix, suffix, and input variables are consistent."""
@ -82,19 +119,11 @@ class FewShotPromptTemplate(StringPromptTemplate):
extra = Extra.forbid
arbitrary_types_allowed = True
def format(self, **kwargs: Any) -> str:
"""Format the prompt with the inputs.
Args:
**kwargs: Any arguments to be passed to the prompt template.
Returns:
A formatted string.
@ -132,3 +161,184 @@ class FewShotPromptTemplate(StringPromptTemplate):
if self.example_selector:
raise ValueError("Saving an example selector is not currently supported")
return super().dict(**kwargs)
class FewShotChatMessagePromptTemplate(
BaseChatPromptTemplate, _FewShotPromptTemplateMixin
):
"""Chat prompt template that supports few-shot examples.
The high-level structure produced by this prompt template is a list of messages
consisting of prefix message(s), example message(s), and suffix message(s).
This structure enables creating a conversation with intermediate examples like:
System: You are a helpful AI Assistant
Human: What is 2+2?
AI: 4
Human: What is 2+3?
AI: 5
Human: What is 4+4?
This prompt template can be used to generate a fixed list of examples or else
to dynamically select examples based on the input.
Examples:
Prompt template with a fixed list of examples (matching the sample
conversation above):
.. code-block:: python
from langchain.schema import SystemMessage
from langchain.prompts import (
FewShotChatMessagePromptTemplate,
HumanMessagePromptTemplate,
SystemMessagePromptTemplate,
AIMessagePromptTemplate
)
examples = [
{"input": "2+2", "output": "4"},
{"input": "2+3", "output": "5"},
]
few_shot_prompt = FewShotChatMessagePromptTemplate(
examples=examples,
# This is a prompt template used to format each individual example.
example_prompt=(
HumanMessagePromptTemplate.from_template("{input}")
+ AIMessagePromptTemplate.from_template("{output}")
),
)
final_prompt = (
SystemMessagePromptTemplate.from_template(
"You are a helpful AI Assistant"
)
+ few_shot_prompt
+ HumanMessagePromptTemplate.from_template("{input}")
)
final_prompt.format(input="What is 4+4?")
Prompt template with dynamically selected examples:
.. code-block:: python
from langchain.prompts import SemanticSimilarityExampleSelector
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma
examples = [
{"input": "2+2", "output": "4"},
{"input": "2+3", "output": "5"},
{"input": "2+4", "output": "6"},
# ...
]
to_vectorize = [
" ".join(example.values())
for example in examples
]
embeddings = OpenAIEmbeddings()
vectorstore = Chroma.from_texts(
to_vectorize, embeddings, metadatas=examples
)
example_selector = SemanticSimilarityExampleSelector(
vectorstore=vectorstore
)
from langchain.schema import SystemMessage
from langchain.prompts import HumanMessagePromptTemplate
from langchain.prompts.few_shot import FewShotChatMessagePromptTemplate
few_shot_prompt = FewShotChatMessagePromptTemplate(
# Which variable(s) will be passed to the example selector.
input_variables=["input"],
example_selector=example_selector,
# Define how each example will be formatted.
# In this case, each example will become 2 messages:
# 1 human, and 1 AI
example_prompt=(
HumanMessagePromptTemplate.from_template("{input}")
+ AIMessagePromptTemplate.from_template("{output}")
),
)
# Define the overall prompt.
final_prompt = (
SystemMessagePromptTemplate.from_template(
"You are a helpful AI Assistant"
)
+ few_shot_prompt
+ HumanMessagePromptTemplate.from_template("{input}")
)
# Show the prompt
print(final_prompt.format_messages(input="What's 3+3?"))
# Use within an LLM
from langchain.chat_models import ChatAnthropic
chain = final_prompt | ChatAnthropic()
chain.invoke({"input": "What's 3+3?"})
"""
@property
def lc_serializable(self) -> bool:
"""Return whether the prompt template is lc_serializable.
Returns:
Boolean indicating whether the prompt template is lc_serializable.
"""
return False
input_variables: List[str] = Field(default_factory=list)
"""A list of the names of the variables the prompt template will use
to pass to the example_selector, if provided."""
example_prompt: Union[BaseMessagePromptTemplate, BaseChatPromptTemplate]
"""The class to format each example."""
class Config:
"""Configuration for this pydantic object."""
extra = Extra.forbid
arbitrary_types_allowed = True
def format_messages(self, **kwargs: Any) -> List[BaseMessage]:
"""Format kwargs into a list of messages.
Args:
**kwargs: keyword arguments to use for filling in templates in messages.
Returns:
A list of formatted messages with all template variables filled in.
"""
# Get the examples to use.
examples = self._get_examples(**kwargs)
examples = [
{k: e[k] for k in self.example_prompt.input_variables} for e in examples
]
# Format the examples.
messages = [
message
for example in examples
for message in self.example_prompt.format_messages(**example)
]
return messages
def format(self, **kwargs: Any) -> str:
"""Format the prompt with inputs generating a string.
Use this method to generate a string representation of a prompt consisting
of chat messages.
Useful for feeding into a string based completion language model or debugging.
Args:
**kwargs: keyword arguments to use for formatting.
Returns:
A string representation of the prompt
"""
messages = self.format_messages(**kwargs)
return get_buffer_string(messages)
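The `format` path above is just `format_messages` piped through a buffer-string join. A sketch of that rendering (the `Human:`/`AI:` prefixes mirror what `get_buffer_string` produces, but this helper is an illustration, not the library function):

```python
# Sketch of rendering a chat message list to one debug string.
ROLE_PREFIXES = {"human": "Human", "ai": "AI", "system": "System"}

def buffer_string(messages):
    # One "Prefix: text" line per message, joined with newlines.
    return "\n".join(f"{ROLE_PREFIXES[role]}: {text}" for role, text in messages)

rendered = buffer_string([("human", "2+2"), ("ai", "4")])
# rendered == "Human: 2+2\nAI: 4"
```

This is the string form printed by `few_shot_prompt.format()` in the notebook above.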

@ -1,10 +1,21 @@
"""Test few shot prompt template."""
from typing import Any, Dict, List, Sequence, Tuple
import pytest
from langchain.prompts import (
AIMessagePromptTemplate,
ChatPromptTemplate,
HumanMessagePromptTemplate,
)
from langchain.prompts.chat import SystemMessagePromptTemplate
from langchain.prompts.example_selector.base import BaseExampleSelector
from langchain.prompts.few_shot import (
FewShotChatMessagePromptTemplate,
FewShotPromptTemplate,
)
from langchain.prompts.prompt import PromptTemplate
from langchain.schema import AIMessage, HumanMessage, SystemMessage
EXAMPLE_PROMPT = PromptTemplate(
input_variables=["question", "answer"], template="{question}: {answer}"
@ -267,3 +278,93 @@ def test_prompt_jinja2_extra_input_variables(
example_prompt=example_jinja2_prompt[0],
template_format="jinja2",
)
def test_few_shot_chat_message_prompt_template() -> None:
"""Tests for few shot chat message template."""
examples = [
{"input": "2+2", "output": "4"},
{"input": "2+3", "output": "5"},
]
example_prompt = ChatPromptTemplate.from_messages(
[
HumanMessagePromptTemplate.from_template("{input}"),
AIMessagePromptTemplate.from_template("{output}"),
]
)
few_shot_prompt = FewShotChatMessagePromptTemplate(
input_variables=["input"],
example_prompt=example_prompt,
examples=examples,
)
final_prompt: ChatPromptTemplate = (
SystemMessagePromptTemplate.from_template("You are a helpful AI Assistant")
+ few_shot_prompt
+ HumanMessagePromptTemplate.from_template("{input}")
)
messages = final_prompt.format_messages(input="100 + 1")
assert messages == [
SystemMessage(content="You are a helpful AI Assistant", additional_kwargs={}),
HumanMessage(content="2+2", additional_kwargs={}, example=False),
AIMessage(content="4", additional_kwargs={}, example=False),
HumanMessage(content="2+3", additional_kwargs={}, example=False),
AIMessage(content="5", additional_kwargs={}, example=False),
HumanMessage(content="100 + 1", additional_kwargs={}, example=False),
]
class AsIsSelector(BaseExampleSelector):
"""An example selector for testing purposes.
This selector returns the examples as-is.
"""
def __init__(self, examples: Sequence[Dict[str, str]]) -> None:
"""Initializes the selector."""
self.examples = examples
def add_example(self, example: Dict[str, str]) -> Any:
"""Adds an example to the selector."""
raise NotImplementedError()
def select_examples(self, input_variables: Dict[str, str]) -> List[dict]:
"""Select which examples to use based on the inputs."""
return list(self.examples)
def test_few_shot_chat_message_prompt_template_with_selector() -> None:
"""Tests for few shot chat message template with an example selector."""
examples = [
{"input": "2+2", "output": "4"},
{"input": "2+3", "output": "5"},
]
example_selector = AsIsSelector(examples)
example_prompt = ChatPromptTemplate.from_messages(
[
HumanMessagePromptTemplate.from_template("{input}"),
AIMessagePromptTemplate.from_template("{output}"),
]
)
few_shot_prompt = FewShotChatMessagePromptTemplate(
input_variables=["input"],
example_prompt=example_prompt,
example_selector=example_selector,
)
final_prompt: ChatPromptTemplate = (
SystemMessagePromptTemplate.from_template("You are a helpful AI Assistant")
+ few_shot_prompt
+ HumanMessagePromptTemplate.from_template("{input}")
)
messages = final_prompt.format_messages(input="100 + 1")
assert messages == [
SystemMessage(content="You are a helpful AI Assistant", additional_kwargs={}),
HumanMessage(content="2+2", additional_kwargs={}, example=False),
AIMessage(content="4", additional_kwargs={}, example=False),
HumanMessage(content="2+3", additional_kwargs={}, example=False),
AIMessage(content="5", additional_kwargs={}, example=False),
HumanMessage(content="100 + 1", additional_kwargs={}, example=False),
]
