{
"cells": [
{
"cell_type": "markdown",
"id": "6eaf7e66-f49c-42da-8d11-22ea13bef718",
"metadata": {},
"source": [
"# How to stream LLM and Chat Model responses\n",
"\n",
"LangChain provides streaming support for LLMs. Currently, we support streaming for the `OpenAI`, `ChatOpenAI`, and `ChatAnthropic` implementations, but streaming support for other LLM implementations is on the roadmap. To utilize streaming, use a [`CallbackHandler`](https://github.com/hwchase17/langchain/blob/master/langchain/callbacks/base.py) that implements `on_llm_new_token`. In this example, we are using `StreamingStdOutCallbackHandler`."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "4ac0ff54-540a-4f2b-8d9a-b590fec7fe07",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"from langchain.llms import OpenAI\n",
"from langchain.chat_models import ChatOpenAI, ChatAnthropic\n",
"from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler\n",
"from langchain.schema import HumanMessage"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "77f60a4b-f786-41f2-972e-e5bb8a48dcd5",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"Verse 1\n",
"I'm sippin' on sparkling water,\n",
"It's so refreshing and light,\n",
"It's the perfect way to quench my thirst\n",
"On a hot summer night.\n",
"\n",
"Chorus\n",
"Sparkling water, sparkling water,\n",
"It's the best way to stay hydrated,\n",
"It's so crisp and so clean,\n",
"It's the perfect way to stay refreshed.\n",
"\n",
"Verse 2\n",
"I'm sippin' on sparkling water,\n",
"It's so bubbly and bright,\n",
"It's the perfect way to cool me down\n",
"On a hot summer night.\n",
"\n",
"Chorus\n",
"Sparkling water, sparkling water,\n",
"It's the best way to stay hydrated,\n",
"It's so crisp and so clean,\n",
"It's the perfect way to stay refreshed.\n",
"\n",
"Verse 3\n",
"I'm sippin' on sparkling water,\n",
"It's so light and so clear,\n",
"It's the perfect way to keep me cool\n",
"On a hot summer night.\n",
"\n",
"Chorus\n",
"Sparkling water, sparkling water,\n",
"It's the best way to stay hydrated,\n",
"It's so crisp and so clean,\n",
"It's the perfect way to stay refreshed."
]
}
],
"source": [
"llm = OpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)\n",
"resp = llm(\"Write me a song about sparkling water.\")"
]
},
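{
"cell_type": "markdown",
"id": "streaming-custom-handler-sketch",
"metadata": {},
"source": [
"`StreamingStdOutCallbackHandler` simply prints each token as it arrives. As a rough sketch (the class name below is illustrative, not part of LangChain), a custom handler only needs to subclass `BaseCallbackHandler` and implement `on_llm_new_token`:\n",
"\n",
"```python\n",
"from langchain.callbacks.base import BaseCallbackHandler\n",
"\n",
"class TokenCollector(BaseCallbackHandler):\n",
"    \"\"\"Hypothetical handler that accumulates streamed tokens in a list.\"\"\"\n",
"\n",
"    def __init__(self):\n",
"        self.tokens = []\n",
"\n",
"    def on_llm_new_token(self, token: str, **kwargs) -> None:\n",
"        # Called once per token as the LLM streams its response.\n",
"        self.tokens.append(token)\n",
"\n",
"# Pass it the same way as any other callback handler:\n",
"# llm = OpenAI(streaming=True, callbacks=[TokenCollector()], temperature=0)\n",
"```"
]
},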
{
"cell_type": "markdown",
"id": "61fb6de7-c6c8-48d0-a48e-1204c027a23c",
"metadata": {
"tags": []
},
"source": [
"We still have access to the final `LLMResult` when using `generate`. However, `token_usage` is not currently supported for streaming."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "a35373f1-9ee6-4753-a343-5aee749b8527",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\n",
"Q: What did the fish say when it hit the wall?\n",
"A: Dam!"
]
},
{
"data": {
"text/plain": [
"LLMResult(generations=[[Generation(text='\\n\\nQ: What did the fish say when it hit the wall?\\nA: Dam!', generation_info={'finish_reason': 'stop', 'logprobs': None})]], llm_output={'token_usage': {}, 'model_name': 'text-davinci-003'})"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"llm.generate([\"Tell me a joke.\"])"
]
},
{
"cell_type": "markdown",
"id": "a93a4d61-0476-49db-8321-7de92bd74059",
"metadata": {},
"source": [
"Here's an example with the `ChatOpenAI` chat model implementation:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "22665f16-e05b-473c-a4bd-ad75744ea024",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Verse 1:\n",
"Bubbles rising to the top\n",
"A refreshing drink that never stops\n",
"Clear and crisp, it's oh so pure\n",
"Sparkling water, I can't ignore\n",
"\n",
"Chorus:\n",
"Sparkling water, oh how you shine\n",
"A taste so clean, it's simply divine\n",
"You quench my thirst, you make me feel alive\n",
"Sparkling water, you're my favorite vibe\n",
"\n",
"Verse 2:\n",
"No sugar, no calories, just H2O\n",
"A drink that's good for me, don't you know\n",
"With lemon or lime, you're even better\n",
"Sparkling water, you're my forever\n",
"\n",
"Chorus:\n",
"Sparkling water, oh how you shine\n",
"A taste so clean, it's simply divine\n",
"You quench my thirst, you make me feel alive\n",
"Sparkling water, you're my favorite vibe\n",
"\n",
"Bridge:\n",
"You're my go-to drink, day or night\n",
"You make me feel so light\n",
"I'll never give you up, you're my true love\n",
"Sparkling water, you're sent from above\n",
"\n",
"Chorus:\n",
"Sparkling water, oh how you shine\n",
"A taste so clean, it's simply divine\n",
"You quench my thirst, you make me feel alive\n",
"Sparkling water, you're my favorite vibe\n",
"\n",
"Outro:\n",
"Sparkling water, you're the one for me\n",
"I'll never let you go, can't you see\n",
"You're my drink of choice, forevermore\n",
"Sparkling water, I adore."
]
}
],
"source": [
"chat = ChatOpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)\n",
"resp = chat([HumanMessage(content=\"Write me a song about sparkling water.\")])"
]
},
{
"cell_type": "markdown",
"id": "909ae48b-0f07-4990-bbff-e627f706c93e",
"metadata": {},
"source": [
"Here is an example with the `ChatAnthropic` chat model implementation, which uses their `claude` model."
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "eadae4ba-9f21-4ec8-845d-dd43b0edc2dc",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" Here is my attempt at a song about sparkling water:\n",
"\n",
"Sparkling water, bubbles so bright, \n",
"Dancing in the glass with delight.\n",
"Refreshing and crisp, a fizzy delight,\n",
"Quenching my thirst with each sip I take.\n",
"The carbonation tickles my tongue,\n",
"As the refreshing water song is sung.\n",
"Lime or lemon, a citrus twist,\n",
"Makes sparkling water such a bliss.\n",
"Healthy and hydrating, a drink so pure,\n",
"Sparkling water, always alluring.\n",
"Bubbles ascending in a stream, \n",
"Sparkling water, you're my dream!"
]
}
],
"source": [
"chat = ChatAnthropic(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)\n",
"resp = chat([HumanMessage(content=\"Write me a song about sparkling water.\")])"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.9"
}
},
"nbformat": 4,
"nbformat_minor": 5
}