Mirror of https://github.com/hwchase17/langchain, synced 2024-10-29 17:07:25 +00:00. Commit 67c5950df3.
### Description

- Add support for streaming with the `Bedrock` LLM and the `BedrockChat` chat model.
- Bedrock currently supports streaming only for the `anthropic.claude-*` and `amazon.titan-*` models, so support has been built for those.
- Also increased the default `max_tokens_to_sample` for the Bedrock `anthropic` model provider from `50` to `256`, in line with the `Anthropic` defaults.
- Added examples of streaming responses to the Bedrock example notebooks.

**_NOTE:_** This PR fixes the issues mentioned in #9897 and makes that PR redundant.
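From the user's side, streaming here works through LangChain's callback mechanism: constructing the model with `streaming=True` and a callback handler (e.g. `StreamingStdOutCallbackHandler`) causes each response chunk to be forwarded to the handler's `on_llm_new_token` as it arrives, instead of being buffered into one final string. The sketch below illustrates that pattern without touching AWS: `StreamingHandler` and `fake_bedrock_stream` are hypothetical stand-ins for the real callback handler and for the Bedrock response-stream loop, purely so the token-by-token flow is visible.

```python
# Hypothetical sketch of the streaming callback pattern this PR enables.
# With the real BedrockChat(streaming=True, callbacks=[handler]), LangChain
# invokes on_llm_new_token() once per chunk streamed back from Bedrock.

class StreamingHandler:
    """Collects tokens as they arrive (stand-in for a LangChain handler)."""

    def __init__(self) -> None:
        self.tokens: list[str] = []

    def on_llm_new_token(self, token: str) -> None:
        # In StreamingStdOutCallbackHandler this would print the token;
        # here we just accumulate it so the flow is observable.
        self.tokens.append(token)


def fake_bedrock_stream(chunks: list[str], handler: StreamingHandler) -> str:
    # Stand-in for the invoke-with-response-stream loop: each chunk is
    # handed to the callback the moment it "arrives", then the pieces
    # are joined into the final completion.
    for chunk in chunks:
        handler.on_llm_new_token(chunk)
    return "".join(handler.tokens)


handler = StreamingHandler()
result = fake_bedrock_stream(["Hello", ", ", "world"], handler)
print(result)  # Hello, world
```

With the actual API this PR touches, the equivalent setup would be along the lines of `BedrockChat(model_id="anthropic.claude-v2", streaming=True, callbacks=[StreamingStdOutCallbackHandler()])`, which requires AWS credentials and is therefore not shown as runnable code here.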