{ "cells": [ { "cell_type": "markdown", "id": "f6574496-b360-4ffa-9523-7fd34a590164", "metadata": {}, "source": [ "# Async API for LLM\n", "\n", "LangChain provides async support for LLMs by leveraging the [asyncio](https://docs.python.org/3/library/asyncio.html) library.\n", "\n", "Async support is particularly useful for calling multiple LLMs concurrently, as these calls are network-bound. Currently, only `OpenAI` `OpenAIChat`, and `PromptLayerOpenAI` are supported, but async support for other LLMs is on the roadmap.\n", "\n", "You can use the `agenerate` method to call an OpenAI LLM asynchronously." ] }, { "cell_type": "code", "execution_count": 1, "id": "5e49e96c-0f88-466d-b3d3-ea0966bdf19e", "metadata": { "tags": [] }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "\n", "As an AI language model, I don't have feelings like humans, but I'm functioning properly. How may I assist you?\n", "\n", "\n", "I'm an AI language model, so I don't have emotions, but I'm functioning properly. How may I assist you today?\n", "\n", "\n", "As an AI language model, I do not have emotions like humans, but I'm functioning normally. How can I assist you today?\n", "\n", "\n", "I am an AI language model, so I do not have feelings, but I am here to assist you. How may I help you today?\n", "\n", "\n", "As an AI language model, I do not have feelings or emotions but I'm always ready to assist you. How may I assist you today?\n", "\n", "\n", "As an AI language model, I don't have feelings, but I'm functioning normally. How may I assist you today?\n", "\n", "\n", "As an AI language model, I don't have feelings, but I'm functioning properly. Thank you. How may I assist you today?\n", "\n", "\n", "As an AI language model, I don't have emotions, so I don't have a specific feeling or emotion. How can I assist you today?\n", "\n", "\n", "As an AI language model, I do not have feelings or emotions. 
However, I am functioning as intended and ready to assist you with any queries you may have. How can I be of assistance today?\n", "\n", "\n", "As an AI language model, I do not have feelings, but I am functioning well. Thank you for asking. How can I assist you today?\n", "\u001b[1mConcurrent executed in 0.92 seconds.\u001b[0m\n", "\n", "\n", "As an AI language model, I don't have feelings, but I'm functioning well. How can I assist you today?\n", "\n", "\n", "As an AI language model, I don't have feelings, but I'm functioning well. Thank you for asking. How may I assist you today?\n", "\n", "\n", "I'm an AI language model, so I don't have feelings, but I'm functioning well. How can I assist you today?\n", "\n", "\n", "As an AI language model, I don't have feelings, but I'm functioning well. Thank you for asking. How may I assist you today?\n", "\n", "\n", "As an AI language model, I don't have feelings, but I am functioning well. How can I assist you today?\n", "\n", "\n", "As an AI language model, I don't have feelings but I'm functioning well. How can I assist you today?\n", "\n", "\n", "As an AI language model, I do not have personal emotions. However, I am functioning well and ready to assist you with any queries or tasks you have. How may I assist you today?\n", "\n", "\n", "As an AI language model, I do not have feelings or emotions, but I'm functioning well. How can I assist you today?\n", "\n", "\n", "I am an AI language model and do not have feelings. But I am functioning properly and ready to assist you with any task. How may I help you today?\n", "\n", "\n", "As an AI language model, I do not have emotions, but I am functioning well. 
How can I assist you today?\n", "\u001b[1mSerial executed in 5.00 seconds.\u001b[0m\n" ] } ], "source": [ "import time\n", "import asyncio\n", "\n", "from langchain.llms import OpenAIChat\n", "\n", "def generate_serially():\n", "    llm = OpenAIChat(temperature=0.9)\n", "    for _ in range(10):\n", "        resp = llm.generate([\"Hello, how are you?\"])\n", "        print(resp.generations[0][0].text)\n", "\n", "\n", "async def async_generate(llm):\n", "    resp = await llm.agenerate([\"Hello, how are you?\"])\n", "    print(resp.generations[0][0].text)\n", "\n", "\n", "async def generate_concurrently():\n", "    llm = OpenAIChat(temperature=0.9)\n", "    tasks = [async_generate(llm) for _ in range(10)]\n", "    await asyncio.gather(*tasks)\n", "\n", "\n", "s = time.perf_counter()\n", "# If running this outside of Jupyter, use asyncio.run(generate_concurrently())\n", "await generate_concurrently()\n", "elapsed = time.perf_counter() - s\n", "print('\\033[1m' + f\"Concurrent executed in {elapsed:0.2f} seconds.\" + '\\033[0m')\n", "\n", "s = time.perf_counter()\n", "generate_serially()\n", "elapsed = time.perf_counter() - s\n", "print('\\033[1m' + f\"Serial executed in {elapsed:0.2f} seconds.\" + '\\033[0m')" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.10.9" } }, "nbformat": 4, "nbformat_minor": 5 }