Harrison/tools priority (#554)

Co-authored-by: Yong723 <50616781+Yongtae723@users.noreply.github.com>
harrison/pinecone-try-except
Harrison Chase 1 year ago committed by GitHub
parent 4974f49bb7
commit e64ed7b975

@@ -249,18 +249,98 @@
},
{
"cell_type": "markdown",
"id": "a8d95999",
"id": "376813ed",
"metadata": {},
"source": [
"## Using tools to return directly\n",
"## Defining the priorities among Tools\n",
"When you made a Custom tool, you may want the Agent to use the custom tool more than normal tools.\n",
"\n",
"For example, you made a custom tool, which gets information on music from your database. When a user wants information on songs, You want the Agent to use `the custom tool` more than the normal `Search tool`. But the Agent might prioritize a normal Search tool.\n",
"\n",
"This can be accomplished by adding a statement such as `Use this more than the normal search if the question is about Music, like 'who is the singer of yesterday?' or 'what is the most popular song in 2022?'` to the description.\n",
"\n",
"An example is below."
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "3450512e",
"metadata": {},
"outputs": [],
"source": [
"# Import things that are needed generically\n",
"from langchain.agents import initialize_agent, Tool\n",
"from langchain.llms import OpenAI\n",
"from langchain import LLMMathChain, SerpAPIWrapper\n",
"search = SerpAPIWrapper()\n",
"tools = [\n",
" Tool(\n",
" name = \"Search\",\n",
" func=search.run,\n",
" description=\"useful for when you need to answer questions about current events\"\n",
" ),\n",
" Tool(\n",
" name=\"Music Search\",\n",
" func=lambda x: \"'All I Want For Christmas Is You' by Mariah Carey.\", #Mock Function\n",
" description=\"A Music search engine. Use this more than the normal search if the question is about Music, like 'who is the singer of yesterday?' or 'what is the most popular song in 2022?'\",\n",
" )\n",
"]\n",
"\n",
"Often, it can be desirable to have a tool output returned directly to the user, if it's called. You can do this easily with LangChain by setting the `return_direct` flag for a tool to be True."
"agent = initialize_agent(tools, OpenAI(temperature=0), agent=\"zero-shot-react-description\", verbose=True)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "4b9a7849",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
"\u001b[32;1m\u001b[1;3m I should use a music search engine to find the answer\n",
"Action: Music Search\n",
"Action Input: most famous song of christmas\u001b[0m\n",
"Observation: \u001b[33;1m\u001b[1;3m'All I Want For Christmas Is You' by Mariah Carey.\u001b[0m\n",
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
"Final Answer: 'All I Want For Christmas Is You' by Mariah Carey.\u001b[0m\n",
"\n",
"\u001b[1m> Finished chain.\u001b[0m\n"
]
},
{
"data": {
"text/plain": [
"\"'All I Want For Christmas Is You' by Mariah Carey.\""
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"agent.run(\"what is the most famous song of christmas\")"
]
},
{
"cell_type": "markdown",
"id": "bc477d43",
"metadata": {},
"source": [
"## Using tools to return directly\n",
"Often, it can be desirable to have a tool output returned directly to the user, if its called. You can do this easily with LangChain by setting the return_direct flag for a tool to be True."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "c8f65640",
"id": "3bb6185f",
"metadata": {},
"outputs": [],
"source": [
@@ -278,7 +358,7 @@
{
"cell_type": "code",
"execution_count": 4,
"id": "4bd1c4bb",
"id": "113ddb84",
"metadata": {},
"outputs": [],
"source": [
@@ -289,7 +369,7 @@
{
"cell_type": "code",
"execution_count": 5,
"id": "52fe0594",
"id": "582439a6",
"metadata": {},
"outputs": [
{
@@ -326,7 +406,7 @@
{
"cell_type": "code",
"execution_count": null,
"id": "7cc1b875",
"id": "537bc628",
"metadata": {},
"outputs": [],
"source": []
@@ -352,7 +432,7 @@
},
"vscode": {
"interpreter": {
"hash": "b1677b440931f40d89ef8be7bf03acb108ce003de0ac9b18e8d43753ea2e7103"
"hash": "cb23c3a7a387ab03496baa08507270f8e0861b23170e79d5edc545893cdca840"
}
}
},
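
The code cells for the "Using tools to return directly" section are collapsed in the hunks above, so here is a minimal sketch of what such a cell could look like. It assumes an OPENAI_API_KEY is set and uses an LLMMathChain-backed Calculator tool; the names and the example question are illustrative, not necessarily the exact cells hidden in the collapsed hunks.

from langchain import LLMMathChain
from langchain.agents import initialize_agent, Tool
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
llm_math_chain = LLMMathChain(llm=llm)  # chain that lets the LLM do arithmetic

tools = [
    Tool(
        name="Calculator",
        func=llm_math_chain.run,
        description="useful for when you need to answer questions about math",
        return_direct=True,  # the tool's output is handed back to the user as-is
    )
]

agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("whats 2**.12")  # chain stops right after the Calculator observation

With return_direct=True the AgentExecutor skips the usual final LLM call, so the Calculator's observation becomes the final answer.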
