Mirror of https://github.com/hwchase17/langchain, synced 2024-11-10 01:10:59 +00:00, commit 67c5950df3
### Description

- Add support for streaming with the `Bedrock` LLM and the `BedrockChat` chat model.
- Bedrock currently supports streaming only for the `anthropic.claude-*` and `amazon.titan-*` models, so support has been built for those.
- Also increased the default `max_tokens_to_sample` for the Bedrock `anthropic` model provider from `50` to `256`, in line with the `Anthropic` defaults.
- Added examples of streaming responses to the Bedrock example notebooks.

**_NOTE:_** This PR fixes the issues mentioned in #9897 and makes that PR redundant.
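The streaming path described above comes down to decoding the JSON payload carried in each event of Bedrock's response stream; for `anthropic.claude-*` models the generated text arrives in a `completion` field. A minimal stdlib-only sketch of that decoding step (the event shape and field name are assumptions based on the Bedrock streaming API, and `parse_bedrock_stream` is a hypothetical helper, not code from this PR):

```python
import json

def parse_bedrock_stream(events):
    """Hypothetical sketch: pull text pieces out of Bedrock streaming
    events. Each event carries a JSON payload under chunk.bytes; for
    anthropic models the text is in the "completion" field."""
    for event in events:
        payload = json.loads(event["chunk"]["bytes"])
        piece = payload.get("completion", "")
        if piece:
            yield piece

# Simulated events for illustration only (not a real API response):
fake_events = [
    {"chunk": {"bytes": json.dumps({"completion": "Bonjour"}).encode()}},
    {"chunk": {"bytes": json.dumps({"completion": " le monde"}).encode()}},
]
print("".join(parse_bedrock_stream(fake_events)))  # -> Bonjour le monde
```

In the real integration, a callback handler (such as `StreamingStdOutCallbackHandler` used in the notebook below) receives each decoded piece as it arrives instead of waiting for the full completion.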
140 lines
4.4 KiB
Plaintext
{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "bf733a38-db84-4363-89e2-de6735c37230",
   "metadata": {},
   "source": [
    "# Bedrock Chat\n",
    "\n",
    "[Amazon Bedrock](https://aws.amazon.com/bedrock/) is a fully managed service that makes FMs from leading AI startups and Amazon available via an API, so you can choose from a wide range of FMs to find the model that is best suited for your use case"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "d51edc81",
   "metadata": {},
   "outputs": [],
   "source": [
    "%pip install boto3"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "d4a7c55d-b235-4ca4-a579-c90cc9570da9",
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "from langchain.chat_models import BedrockChat\n",
    "from langchain.schema import HumanMessage"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "70cf04e8-423a-4ff6-8b09-f11fb711c817",
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "chat = BedrockChat(model_id=\"anthropic.claude-v2\", model_kwargs={\"temperature\":0.1})"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "8199ef8f-eb8b-4253-9ea0-6c24a013ca4c",
   "metadata": {
    "tags": []
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "AIMessage(content=\" Voici la traduction en français : J'adore programmer.\", additional_kwargs={}, example=False)"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "messages = [\n",
    "    HumanMessage(\n",
    "        content=\"Translate this sentence from English to French. I love programming.\"\n",
    "    )\n",
    "]\n",
    "chat(messages)"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "a4a4f4d4",
   "metadata": {},
   "source": [
    "### For BedrockChat with Streaming"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c253883f",
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler\n",
    "\n",
    "chat = BedrockChat(\n",
    "    model_id=\"anthropic.claude-v2\",\n",
    "    streaming=True,\n",
    "    callbacks=[StreamingStdOutCallbackHandler()],\n",
    "    model_kwargs={\"temperature\": 0.1},\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "d9e52838",
   "metadata": {},
   "outputs": [],
   "source": [
    "messages = [\n",
    "    HumanMessage(\n",
    "        content=\"Translate this sentence from English to French. I love programming.\"\n",
    "    )\n",
    "]\n",
    "chat(messages)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.9"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}