6f36f0f930
Add oobabooga/text-generation-webui support as an LLM. Currently supports text-generation-webui's non-streaming API interface, allowing users who already have text-gen running to use the same models with langchain.

#### Before submitting

Simple usage, similar to existing supported LLMs:

```python
from langchain.llms import TextGen

llm = TextGen(model_url="http://localhost:5000")
```

#### Who can review?

@hwchase17 - project lead

---------

Co-authored-by: Hien Ngo <Hien.Ngo@adia.ae>