{
"cells": [
{
"cell_type": "markdown",
"id": "c79d1cbf",
"metadata": {},
"source": [
"# Prompt Composition\n",
"\n",
"This notebook shows how to compose multiple prompts together. This is useful when you want to reuse parts of prompts, and it can be done with a `PipelinePromptTemplate`. A `PipelinePromptTemplate` consists of two main parts:\n",
"\n",
"- `final_prompt`: the final prompt that is returned\n",
"- `pipeline_prompts`: a list of tuples, each consisting of a string (`name`) and a `PromptTemplate`. Each `PromptTemplate` is formatted and then passed to subsequent prompt templates as a variable with the same name as `name`."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "4eb8c5e6",
"metadata": {},
"outputs": [],
"source": [
"from langchain.prompts.pipeline import PipelinePromptTemplate\n",
"from langchain.prompts.prompt import PromptTemplate"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "67842c6e",
"metadata": {},
"outputs": [],
"source": [
"full_template = \"\"\"{introduction}\n",
"\n",
"{example}\n",
"\n",
"{start}\"\"\"\n",
"full_prompt = PromptTemplate.from_template(full_template)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "11913f4b",
"metadata": {},
"outputs": [],
"source": [
"introduction_template = \"\"\"You are impersonating {person}.\"\"\"\n",
"introduction_prompt = PromptTemplate.from_template(introduction_template)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "bc94cac0",
"metadata": {},
"outputs": [],
"source": [
"example_template = \"\"\"Here's an example of an interaction: \n",
"\n",
"Q: {example_q}\n",
"A: {example_a}\"\"\"\n",
"example_prompt = PromptTemplate.from_template(example_template)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "e89c4dd7",
"metadata": {},
"outputs": [],
"source": [
"start_template = \"\"\"Now, do this for real!\n",
"\n",
"Q: {input}\n",
"A:\"\"\"\n",
"start_prompt = PromptTemplate.from_template(start_template)"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "fa029e4b",
"metadata": {},
"outputs": [],
"source": [
"input_prompts = [\n",
" (\"introduction\", introduction_prompt),\n",
" (\"example\", example_prompt),\n",
" (\"start\", start_prompt)\n",
"]\n",
"pipeline_prompt = PipelinePromptTemplate(final_prompt=full_prompt, pipeline_prompts=input_prompts)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "674ea983",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"['example_a', 'person', 'example_q', 'input']"
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"pipeline_prompt.input_variables"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "f1fa0925",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"You are impersonating Elon Musk.\n",
"Here's an example of an interaction: \n",
"\n",
"Q: What's your favorite car?\n",
"A: Tesla\n",
"Now, do this for real!\n",
"\n",
"Q: What's your favorite social media site?\n",
"A:\n",
"\n"
]
}
],
"source": [
"print(pipeline_prompt.format(\n",
" person=\"Elon Musk\",\n",
" example_q=\"What's your favorite car?\",\n",
"    example_a=\"Tesla\",\n",
" input=\"What's your favorite social media site?\"\n",
"))"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
}
},
"nbformat": 4,
"nbformat_minor": 5
}