Fix bad spellings for 'convenience' (#3936)

Found in the docs for chat prompt templates:

https://python.langchain.com/en/latest/getting_started/getting_started.html#chat-prompt-templates

and fixed similar issues in neighboring notebooks.
Samuel Dion-Girardeau authored 1 year ago · committed by GitHub
parent f04faf8496
commit c5c33786a7

@@ -349,7 +349,7 @@ result.llm_output['token_usage']
## Chat Prompt Templates
Similar to LLMs, you can make use of templating by using a `MessagePromptTemplate`. You can build a `ChatPromptTemplate` from one or more `MessagePromptTemplate`s. You can use `ChatPromptTemplate`'s `format_prompt` -- this returns a `PromptValue`, which you can convert to a string or `Message` object, depending on whether you want to use the formatted value as input to an llm or chat model.
-For convience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:
+For convenience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:
```python
from langchain.chat_models import ChatOpenAI
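
# A minimal sketch of the usage described above (the `langchain.prompts.chat`
# import path and the System/Human message prompt class names are assumptions
# drawn from the library version these docs target):
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

system_prompt = SystemMessagePromptTemplate.from_template(
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
human_prompt = HumanMessagePromptTemplate.from_template("{text}")
chat_prompt = ChatPromptTemplate.from_messages([system_prompt, human_prompt])

# format_prompt returns a PromptValue; to_messages() yields Message objects for a
# chat model, while to_string() yields a plain string for an llm
chat_prompt.format_prompt(
    input_language="English", output_language="French", text="I love programming."
).to_messages()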

@@ -18,7 +18,7 @@
"In this notebook, we will walk through the simplest form of memory: \"buffer\" memory, which just involves keeping a buffer of all prior messages. We will show how to use the modular utility functions here, then show how it can be used in a chain (both returning a string as well as a list of messages).\n",
"\n",
"## ChatMessageHistory\n",
"One of the core utility classes underpinning most (if not all) memory modules is the `ChatMessageHistory` class. This is a super lightweight wrapper which exposes convienence methods for saving Human messages, AI messages, and then fetching them all. \n",
"One of the core utility classes underpinning most (if not all) memory modules is the `ChatMessageHistory` class. This is a super lightweight wrapper which exposes convenience methods for saving Human messages, AI messages, and then fetching them all. \n",
"\n",
"You may want to use this class directly if you are managing memory outside of a chain."
]
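
For reference, the `ChatMessageHistory` usage this notebook goes on to demonstrate looks roughly like the following sketch (the `langchain.memory` import path and the `add_user_message` / `add_ai_message` method names are assumed from the library version these notebooks target):

```python
from langchain.memory import ChatMessageHistory

history = ChatMessageHistory()

# convenience methods for saving human and AI messages...
history.add_user_message("hi!")
history.add_ai_message("what's up?")

# ...and for fetching them all back
history.messages
```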

@@ -200,7 +200,7 @@
"source": [
"You can make use of templating by using a `MessagePromptTemplate`. You can build a `ChatPromptTemplate` from one or more `MessagePromptTemplates`. You can use `ChatPromptTemplate`'s `format_prompt` -- this returns a `PromptValue`, which you can convert to a string or Message object, depending on whether you want to use the formatted value as input to an llm or chat model.\n",
"\n",
"For convience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:"
"For convenience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:"
]
},
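
As a quick illustration of the `PromptValue` conversion this cell describes, a sketch assuming the same `from_template` API (the `HumanMessagePromptTemplate` class name is an assumption, not spelled out in this cell):

```python
from langchain.prompts.chat import ChatPromptTemplate, HumanMessagePromptTemplate

chat_prompt = ChatPromptTemplate.from_messages(
    [HumanMessagePromptTemplate.from_template("Tell me a joke about {topic}")]
)

prompt_value = chat_prompt.format_prompt(topic="bears")
prompt_value.to_string()    # plain string, usable as input to an llm
prompt_value.to_messages()  # Message objects, usable as input to a chat model
```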
{

@@ -79,7 +79,7 @@
"source": [
"You can make use of templating by using a `MessagePromptTemplate`. You can build a `ChatPromptTemplate` from one or more `MessagePromptTemplates`. You can use `ChatPromptTemplate`'s `format_prompt` -- this returns a `PromptValue`, which you can convert to a string or Message object, depending on whether you want to use the formatted value as input to an llm or chat model.\n",
"\n",
"For convience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:"
"For convenience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:"
]
},
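
To close the loop on using the formatted value as chat-model input, a sketch of passing the rendered messages to `ChatOpenAI` (this assumes an `OPENAI_API_KEY` in the environment and the call-style invocation of the library version these notebooks target):

```python
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import ChatPromptTemplate, HumanMessagePromptTemplate

chat = ChatOpenAI(temperature=0)
chat_prompt = ChatPromptTemplate.from_messages(
    [HumanMessagePromptTemplate.from_template("Translate this sentence to French: {text}")]
)

# render the template to Message objects and send them to the chat model
chat(chat_prompt.format_prompt(text="I love programming.").to_messages())
```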
{
