Conversational RAG

This template performs conversational retrieval, which is one of the most popular LLM use-cases.

It passes both a conversation history and retrieved documents into an LLM for synthesis.
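Once the template is installed in an app, the packaged chain can also be invoked directly. The sketch below is a minimal example, assuming the package exposes its runnable as rag_conversation.chain and accepts "question" and "chat_history" inputs, with the chat history given as a list of (human, ai) string tuples; check rag_conversation/chain.py for the exact input schema.

```python
# Minimal sketch of calling the template's chain directly (assumed interface:
# a "question" string plus "chat_history" as a list of (human, ai) tuples).
from rag_conversation import chain as rag_conversation_chain

chat_history = [
    ("What does this template do?", "It answers questions over your Pinecone index."),
]

answer = rag_conversation_chain.invoke(
    {
        "question": "Which vectorstore does it retrieve from?",
        "chat_history": chat_history,
    }
)
print(answer)
```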

LLM

Be sure that the OPENAI_API_KEY environment variable is set in order to use the OpenAI models.
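If you would rather not export the key in your shell, a small sketch like the following sets it for the current process; the prompt-based approach is just one option.

```python
import getpass
import os

# Prompt for the OpenAI key only if it is not already present in the environment.
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API key: ")
```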

Pinecone

This template uses Pinecone as a vectorstore and requires that the PINECONE_API_KEY, PINECONE_ENVIRONMENT, and PINECONE_INDEX environment variables are set.
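The chain retrieves from an existing index, so the index needs to be populated beforehand. Below is a hedged sketch of one way to do that with the LangChain and pinecone-client APIs current when this template was published; the loader URL and chunking parameters are illustrative assumptions, and newer pinecone-client releases replace pinecone.init with a Pinecone client class.

```python
import os

import pinecone
from langchain.document_loaders import WebBaseLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Pinecone

# Connect to Pinecone using the same environment variables the template expects.
pinecone.init(
    api_key=os.environ["PINECONE_API_KEY"],
    environment=os.environ["PINECONE_ENVIRONMENT"],
)

# Load and split some example documents (the URL and chunk sizes are placeholders).
docs = WebBaseLoader("https://python.langchain.com/docs/get_started/introduction").load()
splits = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# Embed the chunks and upsert them into the index the chain will retrieve from.
Pinecone.from_documents(
    splits,
    OpenAIEmbeddings(),
    index_name=os.environ["PINECONE_INDEX"],
)
```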