fix docs custom prompt template (#917)

Harrison Chase 2023-02-06 20:29:48 -08:00 committed by GitHub
parent 87fad8fc00
commit cea380174f
3 changed files with 168 additions and 80 deletions


@@ -0,0 +1,168 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "c75efab3",
"metadata": {},
"source": [
"# Create a custom prompt template\n",
"\n",
"Let's suppose we want the LLM to generate English language explanations of a function given its name. To achieve this task, we will create a custom prompt template that takes in the function name as input, and formats the prompt template to provide the source code of the function.\n",
"\n",
"## Why are custom prompt templates needed?\n",
"\n",
"LangChain provides a set of default prompt templates that can be used to generate prompts for a variety of tasks. However, there may be cases where the default prompt templates do not meet your needs. For example, you may want to create a prompt template with specific dynamic instructions for your language model. In such cases, you can create a custom prompt template.\n",
"\n",
"Take a look at the current set of default prompt templates [here](../getting_started.md)."
]
},
{
"cell_type": "markdown",
"id": "5d56ce86",
"metadata": {},
"source": [
"## Create a custom prompt template\n",
"\n",
"The only two requirements for all prompt templates are:\n",
"\n",
"1. They have a input_variables attribute that exposes what input variables this prompt template expects.\n",
"2. They expose a format method which takes in keyword arguments corresponding to the expected input_variables and returns the formatted prompt.\n",
"\n",
"Let's create a custom prompt template that takes in the function name as input, and formats the prompt template to provide the source code of the function.\n",
"\n",
"First, let's create a function that will return the source code of a function given its name."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "c831e1ce",
"metadata": {},
"outputs": [],
"source": [
"import inspect\n",
"\n",
"def get_source_code(function_name):\n",
" # Get the source code of the function\n",
" return inspect.getsource(function_name)"
]
},
{
"cell_type": "markdown",
"id": "c2c8f4ea",
"metadata": {},
"source": [
"Next, we'll create a custom prompt template that takes in the function name as input, and formats the prompt template to provide the source code of the function.\n"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "3ad1efdc",
"metadata": {},
"outputs": [],
"source": [
"from langchain.prompts import BasePromptTemplate\n",
"from pydantic import BaseModel, validator\n",
"\n",
"\n",
"class FunctionExplainerPromptTemplate(BasePromptTemplate, BaseModel):\n",
" \"\"\" A custom prompt template that takes in the function name as input, and formats the prompt template to provide the source code of the function. \"\"\"\n",
"\n",
" @validator(\"input_variables\")\n",
" def validate_input_variables(cls, v):\n",
" \"\"\" Validate that the input variables are correct. \"\"\"\n",
" if len(v) != 1 or \"function_name\" not in v:\n",
" raise ValueError(\"function_name must be the only input_variable.\")\n",
" return v\n",
"\n",
" def format(self, **kwargs) -> str:\n",
" # Get the source code of the function\n",
" source_code = get_source_code(kwargs[\"function_name\"])\n",
"\n",
" # Generate the prompt to be sent to the language model\n",
" prompt = f\"\"\"\n",
" Given the function name and source code, generate an English language explanation of the function.\n",
" Function Name: {kwargs[\"function_name\"].__name__}\n",
" Source Code:\n",
" {source_code}\n",
" Explanation:\n",
" \"\"\"\n",
" return prompt\n",
" \n",
" def _prompt_type(self):\n",
" return \"function-explainer\""
]
},
{
"cell_type": "markdown",
"id": "7fcbf6ef",
"metadata": {},
"source": [
"## Use the custom prompt template\n",
"\n",
"Now that we have created a custom prompt template, we can use it to generate prompts for our task."
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "bd836cda",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
" Given the function name and source code, generate an English language explanation of the function.\n",
" Function Name: get_source_code\n",
" Source Code:\n",
" def get_source_code(function_name):\n",
" # Get the source code of the function\n",
" return inspect.getsource(function_name)\n",
"\n",
" Explanation:\n",
" \n"
]
}
],
"source": [
"fn_explainer = FunctionExplainerPromptTemplate(input_variables=[\"function_name\"])\n",
"\n",
"# Generate a prompt for the function \"get_source_code\"\n",
"prompt = fn_explainer.format(function_name=get_source_code)\n",
"print(prompt)"
]
},
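{
"cell_type": "markdown",
"id": "9a2f6b3c",
"metadata": {},
"source": [
"As a possible next step, the formatted prompt can be passed to an LLM to generate the explanation itself. The cell below is a minimal sketch, assuming the OpenAI wrapper from langchain.llms is available and an OPENAI_API_KEY is configured in the environment."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4b7d1e8f",
"metadata": {},
"outputs": [],
"source": [
"# A minimal sketch, assuming an OpenAI API key is configured in the environment.\n",
"from langchain.llms import OpenAI\n",
"\n",
"llm = OpenAI(temperature=0)\n",
"\n",
"# Call the LLM with the formatted prompt to get the generated explanation\n",
"explanation = llm(prompt)\n",
"print(explanation)"
]
},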
{
"cell_type": "code",
"execution_count": null,
"id": "7f3161c6",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
}
},
"nbformat": 4,
"nbformat_minor": 5
}


@@ -1,75 +0,0 @@
# Create a custom prompt template
Let's suppose we want the LLM to generate an English-language explanation of a function given its name. To achieve this, we will create a custom prompt template that takes the function name as input and formats the prompt to include the function's source code.
## Why are custom prompt templates needed?
LangChain provides a set of default prompt templates that can be used to generate prompts for a variety of tasks. However, there may be cases where the default prompt templates do not meet your needs. For example, you may want to create a prompt template with specific dynamic instructions for your language model. In such cases, you can create a custom prompt template.
:::{note}
Take a look at the current set of default prompt templates [here](../getting_started.md).
:::
<!-- TODO(shreya): Add correct link here. -->
## Create a custom prompt template
The only two requirements for all prompt templates are:
1. They have an input_variables attribute that exposes what input variables this prompt template expects.
2. They expose a format method that takes keyword arguments corresponding to the expected input_variables and returns the formatted prompt.
Let's create a custom prompt template that takes the function name as input and formats the prompt to include the function's source code.
First, let's create a function that will return the source code of a function given its name.
```python
import inspect

def get_source_code(function_name):
    # Get the source code of the function
    return inspect.getsource(function_name)
```
Next, we'll create a custom prompt template that takes the function name as input and formats the prompt to include the function's source code.
```python
from langchain.prompts import BasePromptTemplate
from pydantic import BaseModel, validator


class FunctionExplainerPromptTemplate(BasePromptTemplate, BaseModel):
    """A custom prompt template that takes the function name as input and formats the prompt to include the function's source code."""

    @validator("input_variables")
    def validate_input_variables(cls, v):
        """Validate that the input variables are correct."""
        if len(v) != 1 or "function_name" not in v:
            raise ValueError("function_name must be the only input_variable.")
        return v

    def format(self, **kwargs) -> str:
        # Get the source code of the function
        source_code = get_source_code(kwargs["function_name"])

        # Generate the prompt to be sent to the language model
        prompt = f"""
        Given the function name and source code, generate an English language explanation of the function.
        Function Name: {kwargs["function_name"].__name__}
        Source Code:
        {source_code}
        Explanation:
        """
        return prompt
```
## Use the custom prompt template
Now that we have created a custom prompt template, we can use it to generate prompts for our task.
```python
fn_explainer = FunctionExplainerPromptTemplate(input_variables=["function_name"])

# Generate a prompt for the function "get_source_code"
prompt = fn_explainer.format(function_name=get_source_code)
print(prompt)
```
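The validator can also be exercised directly. The snippet below is a small sketch, assuming FunctionExplainerPromptTemplate is defined as above: constructing the template with an unexpected input variable makes pydantic surface the ValueError raised in validate_input_variables as a ValidationError.
```python
# A minimal sketch of the validation behaviour, assuming the class above is defined.
from pydantic import ValidationError

try:
    FunctionExplainerPromptTemplate(input_variables=["some_other_variable"])
except ValidationError as e:
    # pydantic reports the ValueError raised inside validate_input_variables
    print(e)
```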


@@ -19,11 +19,6 @@ The user guide here shows more advanced workflows and how to use the library in
.. toctree::
   :maxdepth: 1
   :glob: