# Integrations
Besides installing this python package, you will also need to install additional packages and set environment variables depending on which chains you want to use.

Note: these packages are not included in the dependencies by default because, as we imagine scaling this package, we do not want to force dependencies that are not needed.

The following use cases require specific installs and api keys:
- OpenAI:
  - Install requirements with `pip install openai`
  - Get an OpenAI api key and either set it as an environment variable (`OPENAI_API_KEY`) or pass it to the LLM constructor as `openai_api_key`.
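Taking OpenAI as an example, the two ways of supplying a key described above can be sketched as follows. This assumes langchain's `OpenAI` LLM wrapper; the key shown is a placeholder, not a real credential.

```python
import os

# Option 1: set the key as an environment variable; the wrapper picks it up automatically.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"  # placeholder, not a real key

# Option 2: pass the key directly to the LLM constructor
# (requires `pip install openai` and langchain; commented out as a sketch):
# from langchain.llms import OpenAI
# llm = OpenAI(openai_api_key="sk-placeholder")
```

The same two options apply to the other LLM providers below, with the environment variable and constructor argument names listed for each.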
- Cohere:
  - Install requirements with `pip install cohere`
  - Get a Cohere api key and either set it as an environment variable (`COHERE_API_KEY`) or pass it to the LLM constructor as `cohere_api_key`.
- GooseAI:
  - Install requirements with `pip install openai`
  - Get a GooseAI api key and either set it as an environment variable (`GOOSEAI_API_KEY`) or pass it to the LLM constructor as `gooseai_api_key`.
- Hugging Face Hub:
  - Install requirements with `pip install huggingface_hub`
  - Get a Hugging Face Hub api token and either set it as an environment variable (`HUGGINGFACEHUB_API_TOKEN`) or pass it to the LLM constructor as `huggingfacehub_api_token`.
- Petals:
  - Install requirements with `pip install petals`
  - Get a Hugging Face api key and either set it as an environment variable (`HUGGINGFACE_API_KEY`) or pass it to the LLM constructor as `huggingface_api_key`.
- CerebriumAI:
  - Install requirements with `pip install cerebrium`
  - Get a Cerebrium api key and either set it as an environment variable (`CEREBRIUMAI_API_KEY`) or pass it to the LLM constructor as `cerebriumai_api_key`.
- PromptLayer:
  - Install requirements with `pip install promptlayer` (be sure to be on version 0.1.62 or higher)
  - Get an API key from promptlayer.com and set it using `promptlayer.api_key=<API KEY>`
- SerpAPI:
  - Install requirements with `pip install google-search-results`
  - Get a SerpAPI api key and either set it as an environment variable (`SERPAPI_API_KEY`) or pass it to the LLM constructor as `serpapi_api_key`.
- GoogleSearchAPI:
  - Install requirements with `pip install google-api-python-client`
  - Get a Google api key and either set it as an environment variable (`GOOGLE_API_KEY`) or pass it to the LLM constructor as `google_api_key`. You will also need to set the `GOOGLE_CSE_ID` environment variable to your custom search engine id. You can pass it to the LLM constructor as `google_cse_id` as well.
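GoogleSearchAPI is the one integration above that needs two values rather than one. A minimal sketch, with both set as environment variables (the values are placeholders):

```python
import os

# Both the api key and the custom search engine id are required:
os.environ["GOOGLE_API_KEY"] = "your-google-api-key"  # placeholder
os.environ["GOOGLE_CSE_ID"] = "your-cse-id"           # placeholder

# Alternatively, pass both to the constructor as google_api_key / google_cse_id
# (sketch; assumes langchain's Google search wrapper is available):
# from langchain.utilities import GoogleSearchAPIWrapper
# search = GoogleSearchAPIWrapper(google_api_key="...", google_cse_id="...")
```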
- WolframAlphaAPI:
  - Install requirements with `pip install wolframalpha`
  - Get a Wolfram Alpha api key and either set it as an environment variable (`WOLFRAM_ALPHA_APPID`) or pass it to the LLM constructor as `wolfram_alpha_appid`.
- NatBot:
  - Install requirements with `pip install playwright`
- Wikipedia:
  - Install requirements with `pip install wikipedia`
- Elasticsearch:
  - Install requirements with `pip install elasticsearch`
  - Set up an Elasticsearch backend. If you want to run it locally, this is a good guide.
- FAISS:
  - Install requirements with `pip install faiss` for Python 3.7 and `pip install faiss-cpu` for Python 3.10+.
- Manifest:
  - Install requirements with `pip install manifest-ml` (Note: this is only available in Python 3.8+ currently).
- OpenSearch:
  - Install requirements with `pip install opensearch-py`
  - If you want to set up OpenSearch locally, see here.
- DeepLake:
  - Install requirements with `pip install deeplake`
- LlamaCpp:
  - Install requirements with `pip install llama-cpp-python`
  - Download a model and convert it following the llama.cpp instructions
- Milvus:
  - Install requirements with `pip install pymilvus`
  - To set up a local cluster, take a look here.
- Zilliz:
  - Install requirements with `pip install pymilvus`
  - To get up and running, take a look here.
If you are using the `NLTKTextSplitter` or the `SpacyTextSplitter`, you will also need to install the appropriate models. For example, if you want to use the `SpacyTextSplitter`, you will need to install the `en_core_web_sm` model with `python -m spacy download en_core_web_sm`. Similarly, if you want to use the `NLTKTextSplitter`, you will need to install the `punkt` model with `python -m nltk.downloader punkt`.
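The two model downloads above can also be invoked from Python; a minimal sketch (the actual download calls are commented out, since they require spacy/nltk installed and network access):

```python
import sys

# Module invocations for the model downloads described above,
# built against the current interpreter:
spacy_download = [sys.executable, "-m", "spacy", "download", "en_core_web_sm"]
nltk_download = [sys.executable, "-m", "nltk.downloader", "punkt"]

# To actually fetch the models (requires spacy / nltk and network access):
# import subprocess
# subprocess.run(spacy_download, check=True)
# subprocess.run(nltk_download, check=True)
```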