{
"cells": [
{
"cell_type": "markdown",
"id": "14f8b67b",
"metadata": {},
"source": [
"# AutoGPT\n",
"\n",
"An implementation of https://github.com/Significant-Gravitas/Auto-GPT, but built with LangChain primitives (LLMs, PromptTemplates, VectorStores, Embeddings, Tools)."
]
},
{
"cell_type": "markdown",
"id": "192496a7",
"metadata": {},
"source": [
"## Set up tools\n",
"\n",
"We'll set up AutoGPT with a search tool, a write-file tool, and a read-file tool."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "7c2c9b54",
"metadata": {},
"outputs": [],
"source": [
"from langchain.agents import Tool\n",
"from langchain.utilities import SerpAPIWrapper\n",
"from langchain_community.tools.file_management.read import ReadFileTool\n",
"from langchain_community.tools.file_management.write import WriteFileTool\n",
"\n",
"search = SerpAPIWrapper()\n",
"tools = [\n",
" Tool(\n",
" name=\"search\",\n",
" func=search.run,\n",
" description=\"useful for when you need to answer questions about current events. You should ask targeted questions\",\n",
" ),\n",
" WriteFileTool(),\n",
" ReadFileTool(),\n",
"]"
]
},
{
"cell_type": "markdown",
"id": "8e39ee28",
"metadata": {},
"source": [
"## Set up memory\n",
"\n",
"The memory here is used for the agent's intermediate steps."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "72bc204d",
"metadata": {},
"outputs": [],
"source": [
"from langchain.docstore import InMemoryDocstore\n",
"from langchain_community.embeddings import OpenAIEmbeddings\n",
"from langchain_community.vectorstores import FAISS"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "1df7b724",
"metadata": {},
"outputs": [],
"source": [
"# Define your embedding model\n",
"embeddings_model = OpenAIEmbeddings()\n",
"# Initialize the vectorstore as empty\n",
"import faiss\n",
"\n",
"embedding_size = 1536\n",
"index = faiss.IndexFlatL2(embedding_size)\n",
"vectorstore = FAISS(embeddings_model.embed_query, index, InMemoryDocstore({}), {})"
]
},
{
"cell_type": "markdown",
"id": "e40fd657",
"metadata": {},
"source": [
"## Set up the model and AutoGPT\n",
"\n",
"Initialize everything! We will use the ChatOpenAI model."
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "3393bc23",
"metadata": {},
"outputs": [],
"source": [
"from langchain_community.chat_models import ChatOpenAI\n",
"from langchain_experimental.autonomous_agents import AutoGPT"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "709c08c2",
"metadata": {},
"outputs": [],
"source": [
"agent = AutoGPT.from_llm_and_tools(\n",
" ai_name=\"Tom\",\n",
" ai_role=\"Assistant\",\n",
" tools=tools,\n",
" llm=ChatOpenAI(temperature=0),\n",
" memory=vectorstore.as_retriever(),\n",
")\n",
"# Set verbose to be true\n",
"agent.chain.verbose = True"
]
},
{
"cell_type": "markdown",
"id": "f0f208d9",
"metadata": {
"collapsed": false
},
"source": [
"## Run an example\n",
"\n",
"Here we will make it write a weather report for SF."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d119d788",
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"agent.run([\"write a weather report for SF today\"])"
]
},
{
"cell_type": "markdown",
"id": "f13f8322",
"metadata": {
"collapsed": false
},
"source": [
"## Chat History Memory\n",
"\n",
"In addition to the memory that holds the agent's intermediate steps, we also have a chat history memory. By default, the agent uses `ChatMessageHistory`, but this can be changed. This is useful when you want a different type of memory, for example `FileChatMessageHistory` to persist the chat history to a file."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2a81f5ad",
"metadata": {
"collapsed": false
},
"outputs": [],
"source": [
"from langchain_community.chat_message_histories import FileChatMessageHistory\n",
"\n",
"agent = AutoGPT.from_llm_and_tools(\n",
" ai_name=\"Tom\",\n",
" ai_role=\"Assistant\",\n",
" tools=tools,\n",
" llm=ChatOpenAI(temperature=0),\n",
" memory=vectorstore.as_retriever(),\n",
" chat_history_memory=FileChatMessageHistory(\"chat_history.txt\"),\n",
")"
]
},
{
"cell_type": "markdown",
"id": "b1403008",
"metadata": {
"collapsed": false
},
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
}
},
"nbformat": 4,
"nbformat_minor": 5
}