{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "14f8b67b",
   "metadata": {},
   "source": [
    "# AutoGPT\n",
    "\n",
    "Implementation of https://github.com/Significant-Gravitas/Auto-GPT but with LangChain primitives (LLMs, PromptTemplates, VectorStores, Embeddings, Tools)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "192496a7",
   "metadata": {},
   "source": [
    "## Set up tools\n",
    "\n",
    "We'll set up AutoGPT with a search tool, a write-file tool, and a read-file tool"
   ]
  },
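  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "`SerpAPIWrapper` reads its key from the `SERPAPI_API_KEY` environment variable, and the OpenAI classes used below read `OPENAI_API_KEY`. A minimal sketch for supplying both interactively if they are not already set:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "from getpass import getpass\n",
    "\n",
    "# Prompt for any missing API keys rather than hardcoding them.\n",
    "for key in [\"SERPAPI_API_KEY\", \"OPENAI_API_KEY\"]:\n",
    "    if key not in os.environ:\n",
    "        os.environ[key] = getpass(f\"{key}: \")"
   ]
  },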
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "7c2c9b54",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain.utilities import SerpAPIWrapper\n",
    "from langchain.agents import Tool\n",
    "from langchain.tools.file_management.write import WriteFileTool\n",
    "from langchain.tools.file_management.read import ReadFileTool\n",
    "\n",
    "search = SerpAPIWrapper()\n",
    "tools = [\n",
    "    Tool(\n",
    "        name=\"search\",\n",
    "        func=search.run,\n",
    "        description=\"useful for when you need to answer questions about current events. You should ask targeted questions\",\n",
    "    ),\n",
    "    WriteFileTool(),\n",
    "    ReadFileTool(),\n",
    "]"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "8e39ee28",
   "metadata": {},
   "source": [
    "## Set up memory\n",
    "\n",
    "The memory here is used for the agent's intermediate steps"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "72bc204d",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain.vectorstores import FAISS\n",
    "from langchain.docstore import InMemoryDocstore\n",
    "from langchain.embeddings import OpenAIEmbeddings"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "1df7b724",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Define your embedding model\n",
    "embeddings_model = OpenAIEmbeddings()\n",
    "# Initialize the vectorstore as empty\n",
    "import faiss\n",
    "\n",
    "embedding_size = 1536\n",
    "index = faiss.IndexFlatL2(embedding_size)\n",
    "vectorstore = FAISS(embeddings_model.embed_query, index, InMemoryDocstore({}), {})"
   ]
  },
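  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The index dimension must match the embedding model's output size; 1536 is the dimensionality of OpenAI's default `text-embedding-ada-002` embeddings. A quick sanity check (this makes one embedding call, so it assumes a valid OpenAI key):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Verify the FAISS index dimension matches what the embedding model returns.\n",
    "assert len(embeddings_model.embed_query(\"test\")) == embedding_size"
   ]
  },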
  {
   "cell_type": "markdown",
   "id": "e40fd657",
   "metadata": {},
   "source": [
    "## Set up model and AutoGPT\n",
    "\n",
    "Initialize everything! We will use the ChatOpenAI model"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "id": "3393bc23",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain.experimental import AutoGPT\n",
    "from langchain.chat_models import ChatOpenAI"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "id": "709c08c2",
   "metadata": {},
   "outputs": [],
   "source": [
    "agent = AutoGPT.from_llm_and_tools(\n",
    "    ai_name=\"Tom\",\n",
    "    ai_role=\"Assistant\",\n",
    "    tools=tools,\n",
    "    llm=ChatOpenAI(temperature=0),\n",
    "    memory=vectorstore.as_retriever(),\n",
    ")\n",
    "# Set verbose to be true\n",
    "agent.chain.verbose = True"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "## Run an example\n",
    "\n",
    "Here we will make it write a weather report for SF"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "agent.run([\"write a weather report for SF today\"])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "## Chat History Memory\n",
    "\n",
    "In addition to the memory that holds the agent's immediate steps, we also have a chat history memory. By default, the agent uses `ChatMessageHistory`, but this can be changed. This is useful when you want to use a different type of memory, for example `FileChatMessageHistory`"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "from langchain.memory.chat_message_histories import FileChatMessageHistory\n",
    "\n",
    "agent = AutoGPT.from_llm_and_tools(\n",
    "    ai_name=\"Tom\",\n",
    "    ai_role=\"Assistant\",\n",
    "    tools=tools,\n",
    "    llm=ChatOpenAI(temperature=0),\n",
    "    memory=vectorstore.as_retriever(),\n",
    "    chat_history_memory=FileChatMessageHistory(\"chat_history.txt\"),\n",
    ")"
   ]
  },
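  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false
   },
   "source": [
    "Because the history is persisted to `chat_history.txt`, it can be inspected or reloaded later. A minimal sketch of reading the stored messages back:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "# Reload the persisted history; `.messages` returns the stored message objects.\n",
    "history = FileChatMessageHistory(\"chat_history.txt\")\n",
    "print(history.messages)"
   ]
  }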
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.1"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}