langchain/templates/rag-conversation/rag_conversation.ipynb

{
"cells": [
{
"cell_type": "markdown",
"id": "424a9d8d",
"metadata": {},
"source": [
"## Run Template\n",
"\n",
"\n",
"As shown in the README, add template and start server:\n",
"```\n",
"langchain serve add rag-conversation\n",
"langchain start\n",
"```\n",
"\n",
"We can now look at the endpoints:\n",
"\n",
"http://127.0.0.1:8000/docs#\n",
"\n",
"And specifically at our loaded template:\n",
"\n",
"http://127.0.0.1:8000/docs#/default/invoke_rag_conversation_invoke_post\n",
" \n",
"We can also use remote runnable to call it."
]
},
{
"cell_type": "code",
"execution_count": 24,
"id": "5f521923",
"metadata": {},
"outputs": [],
"source": [
"from langserve.client import RemoteRunnable\n",
"rag_app = RemoteRunnable('http://localhost:8000/rag-conversation')"
]
},
{
"cell_type": "code",
"execution_count": 26,
"id": "679bd83b",
"metadata": {},
"outputs": [],
"source": [
"question = \"How does agent memory work?\"\n",
"answer = rag_app.invoke({\n",
" \"question\": question,\n",
" \"chat_history\": [],\n",
"})"
]
},
{
"cell_type": "code",
"execution_count": 27,
"id": "94a05616",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=\"Agent memory works by utilizing both short-term memory and long-term memory mechanisms. \\n\\nShort-term memory allows the agent to learn and retain information within the current context or task. This in-context learning helps the agent handle complex tasks efficiently. \\n\\nOn the other hand, long-term memory enables the agent to retain and recall an unlimited amount of information over extended periods. This is achieved by leveraging an external vector store, such as a memory stream, which serves as a comprehensive database of the agent's past experiences in natural language. The memory stream records observations and events directly provided by the agent, and inter-agent communication can also trigger new natural language statements to be added to the memory.\\n\\nTo access and utilize the stored information, a retrieval model is employed. This model determines the context that is most relevant, recent, and important to inform the agent's behavior. By retrieving information from memory, the agent can reflect on past actions, learn from mistakes, and refine its behavior for future steps, ultimately improving the quality of its results.\")"
]
},
"execution_count": 27,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"answer"
]
},
{
"cell_type": "code",
"execution_count": 29,
"id": "ce206c8a",
"metadata": {},
"outputs": [],
"source": [
"chat_history = [(question, answer.content)]\n",
"answer = rag_app.invoke({\n",
" \"question\": \"What are the different types?\",\n",
" \"chat_history\": chat_history,\n",
"})"
]
},
{
"cell_type": "code",
"execution_count": 30,
"id": "4626f167",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='The different types of memory utilized by the agent are sensory memory, short-term memory, and long-term memory.')"
]
},
"execution_count": 30,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"answer"
]
}
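,
{
"cell_type": "markdown",
"id": "9f2c7d1a",
"metadata": {},
"source": [
"Because `RemoteRunnable` implements the standard Runnable interface, we can also try streaming the answer. This is a minimal sketch: it assumes the served chain supports token streaming and yields message chunks that expose a `content` attribute."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c3a81f0d",
"metadata": {},
"outputs": [],
"source": [
"# Stream a follow-up answer token by token (assumes the served chain supports streaming)\n",
"for chunk in rag_app.stream({\n",
" \"question\": \"How is long-term memory stored?\",\n",
" \"chat_history\": chat_history,\n",
"}):\n",
"    # Each chunk is expected to be a message chunk exposing .content\n",
"    print(chunk.content, end=\"\", flush=True)"
]
}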
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.16"
}
},
"nbformat": 4,
"nbformat_minor": 5
}