From ef25904ecbaf9b49f14e1da1b39123c0bd2d261e Mon Sep 17 00:00:00 2001
From: Alex Telon
Date: Wed, 29 Mar 2023 00:03:28 +0200
Subject: [PATCH] Fixed 1 missing line in getting_started.md (#2107)

Seems like a copy paste error. The very next example does have this line.
Please tell me if I missed something in the process and should have created an issue or something first!
---
 docs/getting_started/getting_started.md | 7 ++++---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/docs/getting_started/getting_started.md b/docs/getting_started/getting_started.md
index 9988b3c4..4d276edd 100644
--- a/docs/getting_started/getting_started.md
+++ b/docs/getting_started/getting_started.md
@@ -355,13 +355,15 @@ Similar to LLMs, you can make use of templating by using a `MessagePromptTemplat
 For convience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:
 
 ```python
+from langchain.chat_models import ChatOpenAI
 from langchain.prompts.chat import (
     ChatPromptTemplate,
     SystemMessagePromptTemplate,
-    AIMessagePromptTemplate,
     HumanMessagePromptTemplate,
 )
 
+chat = ChatOpenAI(temperature=0)
+
 template="You are a helpful assistant that translates {input_language} to {output_language}."
 system_message_prompt = SystemMessagePromptTemplate.from_template(template)
 human_template="{text}"
@@ -380,11 +382,10 @@ The `LLMChain` discussed in the above section can be used with chat models as we
 
 ```python
 from langchain.chat_models import ChatOpenAI
-from langchain import PromptTemplate, LLMChain
+from langchain import LLMChain
 from langchain.prompts.chat import (
     ChatPromptTemplate,
     SystemMessagePromptTemplate,
-    AIMessagePromptTemplate,
     HumanMessagePromptTemplate,
 )
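
For reference, this is roughly how the first patched example reads end to end. This is a minimal sketch, not part of the commit: it assumes the LangChain APIs of this doc's era (`ChatPromptTemplate.from_messages`, `format_prompt(...).to_messages()`, calling the `ChatOpenAI` instance directly) plus an `OPENAI_API_KEY` in the environment, and the `HumanMessagePromptTemplate.from_template` line and the English-to-French inputs are illustrative, taken from the surrounding docs rather than from this diff.

```python
# Sketch of the patched example; assumes OPENAI_API_KEY is set in the environment.
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

# The line added by this patch, together with the ChatOpenAI import above.
chat = ChatOpenAI(temperature=0)

template = "You are a helpful assistant that translates {input_language} to {output_language}."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

# Combine the message templates into a chat prompt, format it into messages,
# and send them to the chat model (illustrative inputs).
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
messages = chat_prompt.format_prompt(
    input_language="English", output_language="French", text="I love programming."
).to_messages()
print(chat(messages))
```

Only the `ChatOpenAI` import and the `chat = ChatOpenAI(temperature=0)` line come from the patch itself; the rest of the example was already present in the document.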