# Vectara Integration

This PR provides integration with Vectara. Implemented here are:

* langchain/vectorstore/vectara.py
* tests/integration_tests/vectorstores/test_vectara.py
* langchain/retrievers/vectara_retriever.py

And two IPYNB notebooks to do more testing:

* docs/modules/chains/index_examples/vectara_text_generation.ipynb
* docs/modules/indexes/vectorstores/examples/vectara.ipynb

---------

Co-authored-by: Dev 2049 <dev.dev2049@gmail.com>
# Vectara

## What is Vectara?

Vectara Overview:
- Vectara is a developer-first API platform for building conversational search applications.
- To use Vectara, first sign up and create an account. Then create a corpus and an API key for indexing and searching.
- You can use Vectara's indexing API to add documents into Vectara's index
- You can use Vectara's Search API to query Vectara's index (which also supports Hybrid search implicitly).
- You can use Vectara's integration with LangChain as a vector store or via the Retriever abstraction.
## Installation and Setup
To use Vectara with LangChain, no special installation steps are required. You just have to provide your customer ID, corpus ID, and an API key created within the Vectara console to enable indexing and searching.
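One convenient way to supply these credentials is through environment variables, so they do not have to appear in code. A minimal sketch with placeholder values (the variable names are the ones the wrapper reads, as noted in the VectorStore section below):

```python
import os

# Placeholder values -- replace with the credentials from your Vectara console.
os.environ["VECTARA_CUSTOMER_ID"] = "your-customer-id"
os.environ["VECTARA_CORPUS_ID"] = "your-corpus-id"
os.environ["VECTARA_API_KEY"] = "your-api-key"
```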
## VectorStore
There exists a wrapper around the Vectara platform, allowing you to use it as a vectorstore, whether for semantic search or example selection.
To import this vectorstore:
```python
from langchain.vectorstores import Vectara
```
To create an instance of the Vectara vectorstore:
```python
vectara = Vectara(
    vectara_customer_id=customer_id,
    vectara_corpus_id=corpus_id,
    vectara_api_key=api_key
)
```
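Once created, the instance behaves like other LangChain vector stores. A minimal usage sketch, relying on the standard add_texts and similarity_search methods (the texts and query below are illustrative placeholders):

```python
# Index a couple of example documents (placeholder texts).
vectara.add_texts([
    "Vectara is a managed platform for building conversational search.",
    "LangChain provides a common interface over many vector stores.",
])

# Run a semantic search against the indexed documents.
docs = vectara.similarity_search("What is Vectara?", k=2)
for doc in docs:
    print(doc.page_content)
```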
The customer_id, corpus_id, and api_key are optional; if they are not supplied, they will be read from the environment variables `VECTARA_CUSTOMER_ID`, `VECTARA_CORPUS_ID`, and `VECTARA_API_KEY`, respectively.
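With those environment variables set, the store can be constructed without arguments, and it can also be wrapped as a retriever, matching the Retriever abstraction mentioned in the overview. A minimal sketch (the query is a placeholder):

```python
# Credentials are read from VECTARA_CUSTOMER_ID, VECTARA_CORPUS_ID and VECTARA_API_KEY.
vectara = Vectara()

# Use the same store through LangChain's retriever interface.
retriever = vectara.as_retriever()
relevant_docs = retriever.get_relevant_documents("What is Vectara?")
```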
For a more detailed walkthrough of the Vectara wrapper, see one of the two example notebooks: