From c5c33786a79f0d68bc6b04254868f7ca3c80d509 Mon Sep 17 00:00:00 2001
From: Samuel Dion-Girardeau
Date: Mon, 1 May 2023 23:57:06 -0400
Subject: [PATCH] Fix bad spellings for 'convenience' (#3936)

Found in the docs for chat prompt templates:
https://python.langchain.com/en/latest/getting_started/getting_started.html#chat-prompt-templates
and fixed similar issues in neighboring notebooks.
---
 docs/getting_started/getting_started.md            | 2 +-
 docs/modules/memory/getting_started.ipynb          | 2 +-
 docs/modules/models/chat/getting_started.ipynb     | 2 +-
 docs/modules/models/chat/integrations/openai.ipynb | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/getting_started/getting_started.md b/docs/getting_started/getting_started.md
index cdc1cf0f..c4a6577c 100644
--- a/docs/getting_started/getting_started.md
+++ b/docs/getting_started/getting_started.md
@@ -349,7 +349,7 @@ result.llm_output['token_usage']
 
 ## Chat Prompt Templates
 Similar to LLMs, you can make use of templating by using a `MessagePromptTemplate`. You can build a `ChatPromptTemplate` from one or more `MessagePromptTemplate`s. You can use `ChatPromptTemplate`'s `format_prompt` -- this returns a `PromptValue`, which you can convert to a string or `Message` object, depending on whether you want to use the formatted value as input to an llm or chat model.
-For convience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:
+For convenience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:
 
 ```python
 from langchain.chat_models import ChatOpenAI
diff --git a/docs/modules/memory/getting_started.ipynb b/docs/modules/memory/getting_started.ipynb
index ea8f599d..e0d9a834 100644
--- a/docs/modules/memory/getting_started.ipynb
+++ b/docs/modules/memory/getting_started.ipynb
@@ -18,7 +18,7 @@
     "In this notebook, we will walk through the simplest form of memory: \"buffer\" memory, which just involves keeping a buffer of all prior messages. We will show how to use the modular utility functions here, then show how it can be used in a chain (both returning a string as well as a list of messages).\n",
     "\n",
     "## ChatMessageHistory\n",
-    "One of the core utility classes underpinning most (if not all) memory modules is the `ChatMessageHistory` class. This is a super lightweight wrapper which exposes convienence methods for saving Human messages, AI messages, and then fetching them all. \n",
+    "One of the core utility classes underpinning most (if not all) memory modules is the `ChatMessageHistory` class. This is a super lightweight wrapper which exposes convenience methods for saving Human messages, AI messages, and then fetching them all. \n",
     "\n",
     "You may want to use this class directly if you are managing memory outside of a chain."
    ]
diff --git a/docs/modules/models/chat/getting_started.ipynb b/docs/modules/models/chat/getting_started.ipynb
index cee995ec..d98b0c93 100644
--- a/docs/modules/models/chat/getting_started.ipynb
+++ b/docs/modules/models/chat/getting_started.ipynb
@@ -200,7 +200,7 @@
    "source": [
     "You can make use of templating by using a `MessagePromptTemplate`. You can build a `ChatPromptTemplate` from one or more `MessagePromptTemplates`. You can use `ChatPromptTemplate`'s `format_prompt` -- this returns a `PromptValue`, which you can convert to a string or Message object, depending on whether you want to use the formatted value as input to an llm or chat model.\n",
     "\n",
-    "For convience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:"
+    "For convenience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:"
    ]
   },
   {
diff --git a/docs/modules/models/chat/integrations/openai.ipynb b/docs/modules/models/chat/integrations/openai.ipynb
index c01fe50e..9ce4c70c 100644
--- a/docs/modules/models/chat/integrations/openai.ipynb
+++ b/docs/modules/models/chat/integrations/openai.ipynb
@@ -79,7 +79,7 @@
    "source": [
     "You can make use of templating by using a `MessagePromptTemplate`. You can build a `ChatPromptTemplate` from one or more `MessagePromptTemplates`. You can use `ChatPromptTemplate`'s `format_prompt` -- this returns a `PromptValue`, which you can convert to a string or Message object, depending on whether you want to use the formatted value as input to an llm or chat model.\n",
     "\n",
-    "For convience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:"
+    "For convenience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:"
    ]
   },
   {