Mirror of https://github.com/hwchase17/langchain, synced 2024-11-06 03:20:49 +00:00

Commit 4b53440e70
- **Description:** Fixes and upgrades for the Tongyi LLM and the ChatTongyi model:
  - Fixed typos; it should be `Tongyi`, not `OpenAI`.
  - Fixed a bug in `stream_generate_with_retry`; it is now a real streaming generator.
  - Fixed a bug in `validate_environment`; `dashscope_api_key` is now handled correctly whether it is set via an environment variable or an initialization parameter.
  - Switched the `dashscope` response to incremental output by setting the `incremental_output` parameter, which removes the need for the prefix-removal trick.
  - Removed unused parameters such as `n` and `prefix_messages`.
  - Added a `_stream` method.
  - Added async method support: `_astream`, `_agenerate`, `_abatch` (see the usage sketch after this message).
- **Dependencies:** No new dependencies.
- **Tag maintainer:** @hwchase17

> PS: Some may be confused by the terms `dashscope`, `tongyi`, and `Qwen`:
> - `dashscope`: A platform that deploys LLMs and provides APIs for invoking them.
> - `tongyi`: A brand name and umbrella term for Alibaba Cloud's LLM/AI offerings.
> - `Qwen`: An open-sourced LLM deployed on `dashscope`.
>
> We use the `dashscope` SDK to interact with the `tongyi`-`Qwen` LLM.

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
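To illustrate the streaming and async paths described above, here is a minimal usage sketch. It is not part of the PR; it assumes the `dashscope` package is installed and that `DASHSCOPE_API_KEY` is available, either exported in the environment or passed as the `dashscope_api_key` init parameter, and that `ChatTongyi` is importable from `langchain.chat_models` as in LangChain versions of that period.

```python
# Minimal sketch, not from the PR: assumes `dashscope` is installed and
# DASHSCOPE_API_KEY is set (or passed via `dashscope_api_key`).
import asyncio

from langchain.chat_models import ChatTongyi
from langchain.schema.messages import HumanMessage

chat = ChatTongyi()  # picks up DASHSCOPE_API_KEY from the environment

messages = [HumanMessage(content="What is Qwen?")]

# Synchronous streaming: chunks arrive incrementally via the new `_stream`
# method and the `incremental_output` setting.
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
print()

# Asynchronous streaming via the new `_astream` support.
async def main() -> None:
    async for chunk in chat.astream(messages):
        print(chunk.content, end="", flush=True)
    print()

asyncio.run(main())
```

The synchronous loop exercises `_stream`, the `async for` exercises `_astream`, and the non-streaming async entry points (`ainvoke`, `abatch`) route through `_agenerate` and `_abatch`.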
__init__.py
test_anthropic.py
test_azureml_endpoint.py
test_azureopenai.py
test_baichuan.py
test_bedrock.py
test_ernie.py
test_fireworks.py
test_google_palm.py
test_huggingface.py
test_hunyuan.py
test_imports.py
test_javelin_ai_gateway.py
test_openai.py