"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/langchain-ai/langchain/blob/master/docs/extras/use_cases/apis.ipynb)\n",
"\n",
"## Use case \n",
"\n",
"Suppose you want an LLM to interact with external APIs.\n",
"\n",
"This can be very useful for retrieving context for the LLM to utilize.\n",
"\n",
"And, more generally, it allows us to interact with APIs using natural language!\n",
" \n",
"\n",
"## Overview\n",
"\n",
"There are two primary ways to interface LLMs with external APIs:\n",
" \n",
"* `Functions`: For example, [OpenAI functions](https://platform.openai.com/docs/guides/gpt/function-calling) is one popular means of doing this.\n",
"* `LLM-generated interface`: Use an LLM with access to API documentation to create an interface.\n",
"For example, [Klarna](https://www.klarna.com/international/press/klarna-brings-smoooth-shopping-to-chatgpt/) has a YAML file that describes its API and allows OpenAI to interact with it.\n",
"\n",
"Other options include:\n",
"\n",
"* [Speak](https://api.speak.com/openapi.yaml) for translation\n",
"* [XKCD](https://gist.githubusercontent.com/roaldnefs/053e505b2b7a807290908fe9aa3e1f00/raw/0a212622ebfef501163f91e23803552411ed00e4/openapi.yaml) for comics\n",
"\n",
"We can supply the specification to `get_openapi_chain` directly in order to query the API with OpenAI functions:"
]
},
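The cell that produced the Klarna output below was truncated in this copy; a minimal sketch of the call looks like this (it assumes the `OPENAI_API_KEY` environment variable is set and makes live requests to OpenAI and Klarna):

```python
from langchain.chains.openai_functions.openapi import get_openapi_chain

# Point the chain at Klarna's OpenAPI spec; the spec's operations are
# exposed to the model as OpenAI functions.
chain = get_openapi_chain(
    "https://www.klarna.com/us/shopping/public/openai/v0/api-docs/"
)
chain("What are some options for a men's large blue button down shirt")
```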
{
"cell_type": "code",
"execution_count": null,
"id": "5a218fcc",
"metadata": {},
"outputs": [],
"source": [
"!pip install langchain openai\n",
"\n",
"# Set env var OPENAI_API_KEY or load from a .env file:\n",
"# import dotenv\n",
"# dotenv.load_dotenv()"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "30b780e3",
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Attempting to load an OpenAPI 3.0.1 spec. This may result in degraded performance. Convert your OpenAPI spec to 3.1.* spec for better support.\n"
]
},
{
"data": {
"text/plain": [
"{'query': \"What are some options for a men's large blue button down shirt\",\n",
" 'response': {'products': [{'name': 'Cubavera Four Pocket Guayabera Shirt',\n",
"Let's look at the [LangSmith trace](https://smith.langchain.com/public/76a58b85-193f-4eb7-ba40-747f0d5dd56e/r):\n",
"\n",
"* [Here](https://github.com/langchain-ai/langchain/blob/7fc07ba5df99b9fa8bef837b0fafa220bc5c932c/libs/langchain/langchain/chains/openai_functions/openapi.py#L279C9-L279C19) we call the OpenAI LLM with the provided API spec.\n",
"* The prompt then tells the LLM to use the API spec with the input question:\n",
"\n",
"```\n",
"Use the provided API's to respond to this user query:\n",
"What are some options for a men's large blue button down shirt\n",
"```\n",
"\n",
"* The LLM returns the parameters for the function call `productsUsingGET`, which is [specified in the provided API spec](https://www.klarna.com/us/shopping/public/openai/v0/api-docs/):\n",
"```\n",
"function_call:\n",
" name: productsUsingGET\n",
" arguments: |-\n",
" {\n",
" \"params\": {\n",
" \"countryCode\": \"US\",\n",
" \"q\": \"men's large blue button down shirt\",\n",
"* The `Dict` above is split, and the [API is called here](https://github.com/langchain-ai/langchain/blob/7fc07ba5df99b9fa8bef837b0fafa220bc5c932c/libs/langchain/langchain/chains/openai_functions/openapi.py#L215)."
]
},
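That final step, splitting the `function_call` dict and issuing the GET request, can be illustrated with plain Python. The payload below is a hypothetical reconstruction of the trace values above, and the base URL and `/products` path come from the Klarna spec:

```python
import json
from urllib.parse import urlencode

# Hypothetical function_call payload, shaped like the trace output above:
# the model returns the function name plus a JSON string of arguments.
function_call = {
    "name": "productsUsingGET",
    "arguments": '{"params": {"countryCode": "US", "q": "men\'s large blue button down shirt"}}',
}

# Split the call into its name and decoded parameters...
name = function_call["name"]
params = json.loads(function_call["arguments"])["params"]

# ...then turn the parameters into a GET request URL.
base_url = "https://www.klarna.com/us/shopping/public/openai/v0"
url = f"{base_url}/products?{urlencode(params)}"
print(name)
print(url)
```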
{
"cell_type": "markdown",
"id": "1fe49a0d",
"metadata": {},
"source": [
"## API Chain \n",
"\n",
"We can also build our own interface to external APIs using the `APIChain` and provided API documentation."
]
},
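The source of the next cell was truncated in this copy; a sketch of the `APIChain` setup that matches its output looks like the following (it assumes `OPENAI_API_KEY` is set and calls both OpenAI and the Open-Meteo API):

```python
from langchain.chains import APIChain
from langchain.chains.api import open_meteo_docs
from langchain.llms import OpenAI

# Build the chain from an LLM plus the bundled Open-Meteo API documentation.
llm = OpenAI(temperature=0)
chain = APIChain.from_llm_and_api_docs(
    llm, open_meteo_docs.OPEN_METEO_DOCS, verbose=True
)
chain.run("What is the weather like right now in Munich, Germany in degrees Fahrenheit?")
```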
{
"cell_type": "code",
"execution_count": 8,
"id": "4ef0c3d0",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"\u001b[1m> Entering new APIChain chain...\u001b[0m\n",
"chain.run('What is the weather like right now in Munich, Germany in degrees Fahrenheit?')"
]
},
{
"cell_type": "markdown",
"id": "5b179318",
"metadata": {},
"source": [
"Note that we supply information about the API:"
]
},
{
"cell_type": "code",
"execution_count": 37,
"id": "a9e03cc2",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'BASE URL: https://api.open-meteo.com/\\n\\nAPI Documentation\\nThe API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:\\n\\nParameter\\tFormat\\tRequired\\tDefault\\tDescription\\nlatitude, longitude\\tFloating point\\tYes\\t\\tGeographical WGS84 coordinate of the location\\nhourly\\tString array\\tNo\\t\\tA list of weather variables which shou'"
]
},
"execution_count": 37,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"open_meteo_docs.OPEN_METEO_DOCS[0:500]"
]
},
{
"cell_type": "markdown",
"id": "3fab7930",
"metadata": {},
"source": [
"Under the hood, we do two things:\n",
" \n",
"* `api_request_chain`: Generate an API URL based on the input question and the `api_docs`\n",
"* `api_answer_chain`: Generate a final answer based on the API response\n",
"\n",
"We can look at the [LangSmith trace](https://smith.langchain.com/public/1e0d18ca-0d76-444c-97df-a939a6a815a7/r) to inspect this:\n",
"\n",
"* The `api_request_chain` produces the API URL from our question and the API documentation:\n",
"\n",
"![Image description](/img/api_chain.png)\n",
"\n",
"* [Here](https://github.com/langchain-ai/langchain/blob/bbd22b9b761389a5e40fc45b0570e1830aabb707/libs/langchain/langchain/chains/api/base.py#L82) we make the API request with the API URL.\n",
"* The `api_answer_chain` takes the response from the API and provides us with a natural language response:\n",
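The two-step flow can be sketched offline. The two `*_chain` functions below are stand-ins for the real LLM calls (a canned Open-Meteo URL for Munich and a verbatim echo, not the library's actual prompts), and the HTTP layer is injectable so the example runs without network access:

```python
import urllib.request

def api_request_chain(question: str, api_docs: str) -> str:
    # The real chain prompts an LLM with api_docs + question to produce a URL;
    # here we return a canned Open-Meteo URL for Munich (48.14 N, 11.58 E).
    return ("https://api.open-meteo.com/v1/forecast"
            "?latitude=48.14&longitude=11.58"
            "&current_weather=true&temperature_unit=fahrenheit")

def api_answer_chain(question: str, api_response: str) -> str:
    # The real chain prompts an LLM with the raw response to phrase an answer.
    return f"API response for the question {question!r}: {api_response}"

def run(question: str, api_docs: str, fetch=None) -> str:
    if fetch is None:  # default to a real HTTP GET
        fetch = lambda url: urllib.request.urlopen(url).read().decode()
    url = api_request_chain(question, api_docs)      # 1. question + docs -> URL
    api_response = fetch(url)                        # 2. call the API
    return api_answer_chain(question, api_response)  # 3. response -> answer

# Offline check with a faked HTTP layer:
answer = run("Weather in Munich?", "(api docs here)",
             fetch=lambda url: '{"temperature": 55.9}')
print(answer)
```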