# RAG Pinecone multi query

This template performs RAG using Pinecone and OpenAI with the multi-query retriever.

The multi-query retriever uses an LLM to generate multiple queries from different perspectives for a given user input query.

For each generated query, it retrieves a set of relevant documents and takes the unique union across all queries for answer synthesis.
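The "unique union" step above can be sketched in plain Python. Query generation and retrieval are stubbed out here; `docs_per_query` is a hypothetical stand-in for the per-query retrieval results, not part of the template's API:

```python
def unique_union(docs_per_query):
    """Merge per-query retrieval results, keeping each document once
    (first occurrence wins, order preserved)."""
    seen = set()
    merged = []
    for docs in docs_per_query:
        for doc in docs:
            if doc not in seen:
                seen.add(doc)
                merged.append(doc)
    return merged


# Three generated queries retrieve overlapping document sets:
results = [
    ["doc_a", "doc_b"],
    ["doc_b", "doc_c"],
    ["doc_a", "doc_d"],
]
print(unique_union(results))  # ['doc_a', 'doc_b', 'doc_c', 'doc_d']
```

The deduplicated set is then passed to the LLM for answer synthesis, so overlapping retrievals across queries do not inflate the context.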

## Pinecone

This template uses Pinecone as a vectorstore and requires that `PINECONE_API_KEY`, `PINECONE_ENVIRONMENT`, and `PINECONE_INDEX` are set.
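One way to fail fast on missing configuration is a small pre-flight check before starting the server. This helper (`missing_pinecone_vars` is a hypothetical name, not part of the template) just inspects the environment:

```python
import os

# Environment variables required by this template's Pinecone setup.
REQUIRED_VARS = ("PINECONE_API_KEY", "PINECONE_ENVIRONMENT", "PINECONE_INDEX")


def missing_pinecone_vars(env=os.environ):
    """Return the required Pinecone variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]


# Example: fail fast before launching the app.
# missing = missing_pinecone_vars()
# if missing:
#     raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
```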

## LLM

Be sure that `OPENAI_API_KEY` is set in order to use the OpenAI models.

## App

Example `server.py`:

```python
from fastapi import FastAPI
from langserve import add_routes
from rag_pinecone_multi_query.chain import chain

app = FastAPI()

# Edit this to add the chain you want to add
add_routes(app, chain, path="rag_pinecone_multi_query")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8001)
```

Run:

```shell
python app/server.py
```

Check endpoint:

http://0.0.0.0:8001/docs


See `rag_pinecone_multi_query.ipynb` for example usage:

```python
from langserve.client import RemoteRunnable

rag_app_pinecone = RemoteRunnable("http://0.0.0.0:8001/rag_pinecone_multi_query")
rag_app_pinecone.invoke("What are the different types of agent memory")
```