#### Using `LLMChain`

The `LLMChain` is the most basic building block chain. It takes in a prompt template, formats it with the user input, and returns the response from an LLM.

To use the `LLMChain`, first create a prompt template.
```python
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
```
We can now create a very simple chain that will take user input, format the prompt with it, and then send it to the LLM.
```python
from langchain.chains import LLMChain

chain = LLMChain(llm=llm, prompt=prompt)

# Run the chain only specifying the input variable.
print(chain.run("colorful socks"))
```
<CodeOutputBlock lang="python">

```
Colorful Toes Co.
```

</CodeOutputBlock>
If there are multiple variables, you can input them all at once using a dictionary.
```python
prompt = PromptTemplate(
    input_variables=["company", "product"],
    template="What is a good name for {company} that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run({
    'company': "ABC Startup",
    'product': "colorful socks"
}))
```
<CodeOutputBlock lang="python">

```
Socktopia Colourful Creations.
```

</CodeOutputBlock>
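Instead of packing the inputs into a dictionary, you can also pass them as keyword arguments. This is a minimal sketch, assuming your LangChain version exposes `LLMChain.predict`, which formats the prompt from keyword arguments and returns the completion as a string:

```python
# Keyword-argument form of the dictionary call above; assumes LLMChain.predict
# is available and returns the completion string directly.
print(chain.predict(company="ABC Startup", product="colorful socks"))
```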
You can use a chat model in an `LLMChain` as well:
```python
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
)

human_message_prompt = HumanMessagePromptTemplate(
    prompt=PromptTemplate(
        template="What is a good name for a company that makes {product}?",
        input_variables=["product"],
    )
)
chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt])
chat = ChatOpenAI(temperature=0.9)
chain = LLMChain(llm=chat, prompt=chat_prompt_template)
print(chain.run("colorful socks"))
```
<CodeOutputBlock lang="python">

```
Rainbow Socks Co.
```

</CodeOutputBlock>
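To name several products in one go, the same chain can be run over a list of inputs. The sketch below assumes the classic `Chain.apply` API and the default `text` output key; both are assumptions about the installed LangChain version:

```python
# Run the chat-model chain over a batch of inputs; assumes Chain.apply accepts
# a list of input dicts and that each output dict uses the default "text" key.
inputs = [
    {"product": "colorful socks"},
    {"product": "artisanal coffee"},
    {"product": "mechanical keyboards"},
]
for output in chain.apply(inputs):
    print(output["text"])
```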