{ "cells": [ { "cell_type": "markdown", "id": "46196aa3", "metadata": {}, "source": [ "## ConversationBufferMemory\n", "\n", "This notebook shows how to use `ConversationBufferMemory`. This memory allows for storing messages and then extracting them into a variable.\n", "\n", "We can first extract the history as a string." ] }, { "cell_type": "code", "execution_count": 2, "id": "3bac84f3", "metadata": {}, "outputs": [], "source": [ "from langchain.memory import ConversationBufferMemory" ] }, { "cell_type": "code", "execution_count": 3, "id": "cef35e7f", "metadata": {}, "outputs": [], "source": [ "memory = ConversationBufferMemory()\n", "memory.save_context({\"input\": \"hi\"}, {\"output\": \"whats up\"})" ] }, { "cell_type": "code", "execution_count": 4, "id": "2c9b39af", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'history': 'Human: hi\\nAI: whats up'}" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "memory.load_memory_variables({})" ] }, { "cell_type": "markdown", "id": "567f7c16", "metadata": {}, "source": [ "We can also get the history as a list of messages (this is useful if you are using this with a chat model)." 
] }, { "cell_type": "code", "execution_count": 5, "id": "a481a415", "metadata": {}, "outputs": [], "source": [ "memory = ConversationBufferMemory(return_messages=True)\n", "memory.save_context({\"input\": \"hi\"}, {\"output\": \"whats up\"})" ] }, { "cell_type": "code", "execution_count": 6, "id": "86a56348", "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'history': [HumanMessage(content='hi', additional_kwargs={}),\n", " AIMessage(content='whats up', additional_kwargs={})]}" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "memory.load_memory_variables({})" ] }, { "cell_type": "markdown", "id": "d051c1da", "metadata": {}, "source": [ "## Using in a chain\n", "Finally, let's take a look at using this in a chain (setting `verbose=True` so we can see the prompt)." ] }, { "cell_type": "code", "execution_count": 15, "id": "54301321", "metadata": {}, "outputs": [], "source": [ "from langchain.llms import OpenAI\n", "from langchain.chains import ConversationChain\n", "\n", "\n", "llm = OpenAI(temperature=0)\n", "conversation = ConversationChain(\n", "    llm=llm,\n", "    verbose=True,\n", "    memory=ConversationBufferMemory()\n", ")" ] }, { "cell_type": "code", "execution_count": 16, "id": "ae046bff", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "\n", "\u001b[1m> Entering new ConversationChain chain...\u001b[0m\n", "Prompt after formatting:\n", "\u001b[32;1m\u001b[1;3mThe following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.\n", "\n", "Current conversation:\n", "\n", "Human: Hi there!\n", "AI:\u001b[0m\n", "\n", "\u001b[1m> Finished chain.\u001b[0m\n" ] }, { "data": { "text/plain": [ "\" Hi there! It's nice to meet you. 
How can I help you today?\"" ] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "conversation.predict(input=\"Hi there!\")" ] }, { "cell_type": "code", "execution_count": 17, "id": "d8e2a6ff", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "\n", "\u001b[1m> Entering new ConversationChain chain...\u001b[0m\n", "Prompt after formatting:\n", "\u001b[32;1m\u001b[1;3mThe following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.\n", "\n", "Current conversation:\n", "Human: Hi there!\n", "AI: Hi there! It's nice to meet you. How can I help you today?\n", "Human: I'm doing well! Just having a conversation with an AI.\n", "AI:\u001b[0m\n", "\n", "\u001b[1m> Finished chain.\u001b[0m\n" ] }, { "data": { "text/plain": [ "\" That's great! It's always nice to have a conversation with someone new. What would you like to talk about?\"" ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "conversation.predict(input=\"I'm doing well! Just having a conversation with an AI.\")" ] }, { "cell_type": "code", "execution_count": 18, "id": "15eda316", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "\n", "\u001b[1m> Entering new ConversationChain chain...\u001b[0m\n", "Prompt after formatting:\n", "\u001b[32;1m\u001b[1;3mThe following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.\n", "\n", "Current conversation:\n", "Human: Hi there!\n", "AI: Hi there! It's nice to meet you. How can I help you today?\n", "Human: I'm doing well! Just having a conversation with an AI.\n", "AI: That's great! 
It's always nice to have a conversation with someone new. What would you like to talk about?\n", "Human: Tell me about yourself.\n", "AI:\u001b[0m\n", "\n", "\u001b[1m> Finished chain.\u001b[0m\n" ] }, { "data": { "text/plain": [ "\" Sure! I'm an AI created to help people with their everyday tasks. I'm programmed to understand natural language and provide helpful information. I'm also constantly learning and updating my knowledge base so I can provide more accurate and helpful answers.\"" ] }, "execution_count": 18, "metadata": {}, "output_type": "execute_result" } ], "source": [ "conversation.predict(input=\"Tell me about yourself.\")" ] }, { "cell_type": "markdown", "id": "bd0146c2", "metadata": {}, "source": [ "And that's it for getting started! There are plenty of other types of memory; check out our examples to see them all." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.1" } }, "nbformat": 4, "nbformat_minor": 5 }