Update `Getting Started` page of `Prompt Templates` (#3298)

Updated `Getting Started` page of `Prompt Templates` to showcase more
features provided by the class. Might need some proof reading because
apparently English is not my first language.

@@ -23,15 +23,6 @@ from langchain import PromptTemplate
template = """
I want you to act as a naming consultant for new companies.
Here are some examples of good company names:
- search engine, Google
- social media, Facebook
- video sharing, YouTube
The name should be short, catchy and easy to remember.
What is a good name for a company that makes {product}?
"""
@@ -39,6 +30,9 @@ prompt = PromptTemplate(
    input_variables=["product"],
    template=template,
)
prompt.format(product="colorful socks")
# -> I want you to act as a naming consultant for new companies.
# -> Here are some examples of good company names:
# -> - search engine, Google
# -> - social media, Facebook
# -> - video sharing, YouTube
# -> The name should be short, catchy and easy to remember.
# -> What is a good name for a company that makes colorful socks?
```
@@ -69,30 +63,81 @@ multiple_input_prompt.format(adjective="funny", content="chickens")
# -> "Tell me a funny joke about chickens."
```
If you do not wish to specify `input_variables` manually, you can also create a `PromptTemplate` using the `from_template` class method. `langchain` will automatically infer the `input_variables` based on the `template` passed.
```python
template = "Tell me a {adjective} joke about {content}."
prompt_template = PromptTemplate.from_template(template)
prompt_template.input_variables
# -> ['adjective', 'content']
prompt_template.format(adjective="funny", content="chickens")
# -> Tell me a funny joke about chickens.
```
You can create custom prompt templates that format the prompt in any way you want. For more information, see [Custom Prompt Templates](examples/custom_prompt_template.ipynb).
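As a rough illustration of the idea (a hypothetical sketch, not the pattern from that notebook: the class name, the upper-casing behavior, and the use of `StringPromptTemplate` as a base class are assumptions here), a custom template can compute parts of the prompt itself instead of relying purely on string substitution:
```python
from langchain.prompts import StringPromptTemplate


class ShoutingPromptTemplate(StringPromptTemplate):
    """Hypothetical template that upper-cases inputs before formatting them."""

    def format(self, **kwargs) -> str:
        # Transform the input before building the final prompt string
        product = str(kwargs["product"]).upper()
        return f"What is a good name for a company that makes {product}?"


shouting_prompt = ShoutingPromptTemplate(input_variables=["product"])
shouting_prompt.format(product="colorful socks")
# -> What is a good name for a company that makes COLORFUL SOCKS?
```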
## Template formats
By default, `PromptTemplate` will treat the provided template as a Python f-string. You can specify a different template format through the `template_format` argument:
```python
# Make sure jinja2 is installed before running this
jinja2_template = "Tell me a {{ adjective }} joke about {{ content }}"
prompt_template = PromptTemplate.from_template(template=jinja2_template, template_format="jinja2")
prompt_template.format(adjective="funny", content="chickens")
# -> Tell me a funny joke about chickens.
```
Currently, `PromptTemplate` only supports the `jinja2` and `f-string` templating formats. If there is another templating format you would like to use, feel free to open an issue on the [GitHub](https://github.com/hwchase17/langchain/issues) page.
## Validate template
By default, `PromptTemplate` will validate the `template` string by checking whether the `input_variables` match the variables defined in `template`. You can disable this behavior by setting `validate_template` to `False`.
```python
template = "I am learning langchain because {reason}."
prompt_template = PromptTemplate(template=template,
                                 input_variables=["reason", "foo"])  # ValueError due to extra variables
prompt_template = PromptTemplate(template=template,
                                 input_variables=["reason", "foo"],
                                 validate_template=False)  # No error
```
## Serialize prompt template
You can save your `PromptTemplate` to a file in your local filesystem. `langchain` will automatically infer the file format from the file extension. Currently, `langchain` supports saving templates to YAML and JSON files.
```python
prompt_template.save("awesome_prompt.json") # Save to JSON file
```
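Since the file format is inferred from the extension, the same call can target YAML instead; a small sketch (the file name here is arbitrary):
```python
prompt_template.save("awesome_prompt.yaml")  # Save to YAML file instead
```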
```python
from langchain.prompts import load_prompt
loaded_prompt = load_prompt("awesome_prompt.json")
assert prompt_template == loaded_prompt
```
`langchain` also supports loading prompt templates from LangChainHub, which contains a collection of useful prompts you can use in your project. You can read more about LangChainHub and the prompts available with it [here](https://github.com/hwchase17/langchain-hub).
```python
from langchain.prompts import load_prompt
prompt = load_prompt("lc://prompts/conversation/prompt.json")
prompt.format(history="", input="What is 1 + 1?")
```
You can learn more about serializing prompt templates in [How to serialize prompts](examples/prompt_serialization.ipynb).
## Pass few shot examples to a prompt template
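As a minimal, hedged sketch of the idea behind this section (the example data, prefix, and suffix below are illustrative assumptions, not content from this page), few-shot examples can be folded into a prompt with `FewShotPromptTemplate`:
```python
from langchain.prompts import FewShotPromptTemplate, PromptTemplate

# Purely illustrative examples
examples = [
    {"word": "happy", "antonym": "sad"},
    {"word": "tall", "antonym": "short"},
]

# How each individual example is rendered
example_prompt = PromptTemplate(
    input_variables=["word", "antonym"],
    template="Word: {word}\nAntonym: {antonym}",
)

few_shot_prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    prefix="Give the antonym of every input.",
    suffix="Word: {input}\nAntonym:",
    input_variables=["input"],
)

print(few_shot_prompt.format(input="big"))
# Prints the prefix, the two rendered examples, and the suffix with {input} filled in
```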
