{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "16f2c32e",
   "metadata": {},
   "source": [
    "## Document Loading\n",
    "\n",
    "Load a blog post on agents."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "c9fadce0",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain.document_loaders import WebBaseLoader\n",
    "loader = WebBaseLoader(\"https://lilianweng.github.io/posts/2023-06-23-agent/\")\n",
    "text = loader.load()"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "4086be03",
   "metadata": {},
   "source": [
    "## Run Template\n",
    "\n",
    "As shown in the README, add the template and start the server:\n",
    "```shell\n",
    "langchain serve add openai-functions\n",
    "langchain start\n",
    "```\n",
    "\n",
    "We can now look at the endpoints:\n",
    "\n",
    "http://127.0.0.1:8000/docs#\n",
    "\n",
    "And specifically at our loaded template:\n",
    "\n",
    "http://127.0.0.1:8000/docs#/default/invoke_openai_functions_invoke_post\n",
    "\n",
    "We can also use a `RemoteRunnable` to call it."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "ed507784",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langserve.client import RemoteRunnable\n",
    "oai_function = RemoteRunnable('http://localhost:8000/openai-functions')"
   ]
  },
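  {
   "cell_type": "markdown",
   "id": "f3c1a2b4",
   "metadata": {},
   "source": [
    "As a minimal sketch, the `/invoke` endpoint shown in the docs above can also be called directly over HTTP. This assumes the server is running locally and the `requests` package is installed; the route expects a JSON body with an \"input\" field:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b7d9e1c3",
   "metadata": {},
   "outputs": [],
   "source": [
    "import requests\n",
    "\n",
    "# POST the first ~1500 characters of the blog post to the template's /invoke route\n",
    "response = requests.post(\n",
    "    \"http://localhost:8000/openai-functions/invoke\",\n",
    "    json={\"input\": text[0].page_content[0:1500]},\n",
    ")\n",
    "response.json()[\"output\"]"
   ]
  },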
  {
   "cell_type": "markdown",
   "id": "68046695",
   "metadata": {},
   "source": [
    "The function call will perform tagging. It will:\n",
    "\n",
    "* summarize the text\n",
    "* provide keywords\n",
    "* identify the language"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "6dace748",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "AIMessage(content='', additional_kwargs={'function_call': {'name': 'Overview', 'arguments': '{\\n \"summary\": \"This article discusses the concept of building agents with LLM (large language model) as their core controller. It explores the potentiality of LLM as a general problem solver and describes the key components of an LLM-powered autonomous agent system, including planning, memory, and tool use. The article also presents case studies and challenges related to building LLM-powered agents.\",\\n \"language\": \"English\",\\n \"keywords\": \"LLM, autonomous agents, planning, memory, tool use, case studies, challenges\"\\n}'}})"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "oai_function.invoke(text[0].page_content[0:1500])"
   ]
  },
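  {
   "cell_type": "markdown",
   "id": "d4e5f6a7",
   "metadata": {},
   "source": [
    "A minimal sketch of pulling the structured fields back out: the function-call arguments are a JSON string, so they can be parsed with `json.loads` (this assumes the response has the `Overview` shape shown above):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "e8f9a0b1",
   "metadata": {},
   "outputs": [],
   "source": [
    "import json\n",
    "\n",
    "# Invoke again and decode the function-call arguments into a dict\n",
    "result = oai_function.invoke(text[0].page_content[0:1500])\n",
    "overview = json.loads(result.additional_kwargs[\"function_call\"][\"arguments\"])\n",
    "overview[\"summary\"], overview[\"keywords\"], overview[\"language\"]"
   ]
  }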
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "langserve",
   "language": "python",
   "name": "langserve"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.16"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}