RAG Pinecone

This template performs RAG using Pinecone and OpenAI.

Pinecone

This connects to a hosted Pinecone vectorstore.

Be sure that you have set a few environment variables that chain.py relies on (a connection sketch follows the list):

  • PINECONE_API_KEY
  • PINECONE_ENV
  • index_name
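As a minimal sketch (not the template's exact code), the connection can be made with the pinecone-client v2 API and the langchain.vectorstores.Pinecone integration; the index name and variable names below are illustrative:

import os

import pinecone
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Pinecone

# Authenticate against the hosted Pinecone service using the env variables above.
pinecone.init(
    api_key=os.environ["PINECONE_API_KEY"],
    environment=os.environ["PINECONE_ENV"],
)

# Open the already-populated index and expose it as a retriever for the RAG chain.
# "my-index" stands in for whatever index_name is set to.
vectorstore = Pinecone.from_existing_index("my-index", OpenAIEmbeddings())
retriever = vectorstore.as_retriever()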

LLM

Be sure that OPENAI_API_KEY is set in order to access the OpenAI models.
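As a rough illustration of how the pieces fit together (again, a sketch rather than the template's exact chain), the retriever above can be wired to an OpenAI chat model with LCEL; ChatOpenAI picks up OPENAI_API_KEY from the environment:

from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.output_parser import StrOutputParser
from langchain.schema.runnable import RunnablePassthrough

prompt = ChatPromptTemplate.from_template(
    "Answer the question based only on the following context:\n"
    "{context}\n\nQuestion: {question}"
)

# Retrieved documents fill {context}; the raw question passes through to {question}.
chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI()
    | StrOutputParser()
)

chain.invoke("What does the indexed corpus say about Pinecone?")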

Installation

Create your LangServe app:

langchain serve new my-app
cd my-app

Add template:

langchain serve add rag-pinecone

Start server:

langchain start

See the rag_pinecone Jupyter notebook for various ways to connect to the template.
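For example, once the server is running you can call the template from Python with LangServe's RemoteRunnable; the URL below assumes a local server on port 8000 with the template mounted at /rag-pinecone, so adjust it to match your app:

from langserve.client import RemoteRunnable

rag_app = RemoteRunnable("http://localhost:8000/rag-pinecone")
rag_app.invoke("How does Pinecone handle metadata filtering?")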