langchain/docs/modules/chat/examples/memory.ipynb

{
"cells": [
{
"cell_type": "markdown",
"id": "9a9350a6",
"metadata": {},
"source": [
"# Memory\n",
"This notebook goes over how to use Memory with chat models. The main difference between this and Memory for LLMs is that rather than trying to condense all previous messages into a string, we can keep them as their own unique memory object."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "110935ae",
"metadata": {},
"outputs": [],
"source": [
"from langchain.prompts import (\n",
" ChatPromptTemplate, \n",
" MessagesPlaceholder, \n",
" SystemMessagePromptTemplate, \n",
" HumanMessagePromptTemplate\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "161b6629",
"metadata": {},
"outputs": [],
"source": [
"prompt = ChatPromptTemplate.from_messages([\n",
" SystemMessagePromptTemplate.from_template(\"The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.\"),\n",
" MessagesPlaceholder(variable_name=\"history\"),\n",
" HumanMessagePromptTemplate.from_template(\"{input}\")\n",
"])"
]
},
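{
"cell_type": "markdown",
"id": "aa11bb22",
"metadata": {},
"source": [
"As an optional aside, the cell below is a minimal sketch of what the `MessagesPlaceholder` does: when the prompt is formatted, the list of messages passed in as `history` is expanded in place between the system message and the new human message. The example messages and the `example_history` variable here are purely illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "bb22cc33",
"metadata": {},
"outputs": [],
"source": [
"from langchain.schema import AIMessage, HumanMessage\n",
"\n",
"# An illustrative chat history; MessagesPlaceholder expands this list of\n",
"# message objects directly into the formatted prompt.\n",
"example_history = [\n",
"    HumanMessage(content=\"Hi there!\"),\n",
"    AIMessage(content=\"Hello! How can I assist you today?\")\n",
"]\n",
"prompt.format_prompt(history=example_history, input=\"Tell me a joke.\").to_messages()"
]
},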
{
"cell_type": "code",
"execution_count": 4,
"id": "4976fbda",
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains import ConversationChain\n",
"from langchain.chat_models import ChatOpenAI\n",
"from langchain.memory import ConversationBufferMemory"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "12a0bea6",
"metadata": {},
"outputs": [],
"source": [
"llm = ChatOpenAI(temperature=0)"
]
},
{
"cell_type": "markdown",
"id": "f6edcd6a",
"metadata": {},
"source": [
"We can now initialize the memory. Note that we set `return_messages=True` To denote that this should return a list of messages when appropriate"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "f55bea38",
"metadata": {},
"outputs": [],
"source": [
"memory = ConversationBufferMemory(return_messages=True)"
]
},
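{
"cell_type": "markdown",
"id": "cc33dd44",
"metadata": {},
"source": [
"As a quick sanity check, the sketch below uses separate, throwaway memory objects (so the `memory` above stays empty) to show what `return_messages` changes: with `return_messages=False` the history comes back as a single formatted string, while with `return_messages=True` it comes back as a list of message objects. The variable names here are just illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "dd44ee55",
"metadata": {},
"outputs": [],
"source": [
"# Throwaway memories used only to illustrate the difference.\n",
"string_memory = ConversationBufferMemory(return_messages=False)\n",
"string_memory.save_context({\"input\": \"Hi there!\"}, {\"output\": \"Hello!\"})\n",
"print(string_memory.load_memory_variables({}))  # history is a single string\n",
"\n",
"message_memory = ConversationBufferMemory(return_messages=True)\n",
"message_memory.save_context({\"input\": \"Hi there!\"}, {\"output\": \"Hello!\"})\n",
"print(message_memory.load_memory_variables({}))  # history is a list of message objects"
]
},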
{
"cell_type": "markdown",
"id": "737e8c78",
"metadata": {},
"source": [
"We can now use this in the rest of the chain."
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "80152db7",
"metadata": {},
"outputs": [],
"source": [
"conversation = ConversationChain(memory=memory, prompt=prompt, llm=llm)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "ac68e766",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'Hello! How can I assist you today?'"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"conversation.predict(input=\"Hi there!\")"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "babb33d0",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"\"That sounds like fun! I'm happy to chat with you. Is there anything specific you'd like to talk about?\""
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"conversation.predict(input=\"I'm doing well! Just having a conversation with an AI.\")"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "36f8a1dc",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"\"Sure! I am an AI language model created by OpenAI. I was trained on a large dataset of text from the internet, which allows me to understand and generate human-like language. I can answer questions, provide information, and even have conversations like this one. Is there anything else you'd like to know about me?\""
]
},
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"conversation.predict(input=\"Tell me about yourself.\")"
]
},
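{
"cell_type": "markdown",
"id": "ee55ff66",
"metadata": {},
"source": [
"Because `return_messages=True` keeps the exchange as message objects rather than a condensed string, we can read them back directly. The cell below is a small sketch that just inspects what the conversation above has stored in memory."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ff66aa77",
"metadata": {},
"outputs": [],
"source": [
"# With return_messages=True, `history` is a list of HumanMessage/AIMessage objects.\n",
"memory.load_memory_variables({})[\"history\"]"
]
},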
{
"cell_type": "code",
"execution_count": null,
"id": "79fb460b",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
}
},
"nbformat": 4,
"nbformat_minor": 5
}