Qdrant update to 1.1.1 & docs polishing (#2388)

This PR updates Qdrant to 1.1.1 and introduces local mode, so there is
no need to spin up the Qdrant server. On that occasion, the Qdrant
example notebooks were also updated to cover more cases and answer some
commonly asked questions. All of Qdrant's integration tests were
switched to local mode, so no Docker container is required to launch
them.
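
For reference, a minimal sketch of the local mode described above, assuming `OpenAIEmbeddings` is configured with an `OPENAI_API_KEY` (any other `Embeddings` implementation works just as well):

```python
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Qdrant

embeddings = OpenAIEmbeddings()

# location=":memory:" runs Qdrant in-process, so no server or Docker is needed.
qdrant = Qdrant.from_texts(
    ["Qdrant now supports local mode"],
    embeddings,
    location=":memory:",
    collection_name="demo",
)
print(qdrant.similarity_search("local mode", k=1))
```
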
Kacper Łukawski 1 year ago committed by GitHub
parent 90973c10b1
commit 585f60a5aa

@ -7,14 +7,23 @@
"source": [
"# Qdrant\n",
"\n",
"This notebook shows how to use functionality related to the Qdrant vector database."
"This notebook shows how to use functionality related to the Qdrant vector database. There are various modes of how to run Qdrant, and depending on the chosen one, there will be some subtle differences. The options include:\n",
"\n",
"- Local mode, no server required\n",
"- On-premise server deployment\n",
"- Qdrant Cloud"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "aac9563e",
"metadata": {},
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:22.282884Z",
"start_time": "2023-04-04T10:51:21.408077Z"
}
},
"outputs": [],
"source": [
"from langchain.embeddings.openai import OpenAIEmbeddings\n",
@ -27,10 +36,14 @@
"cell_type": "code",
"execution_count": 2,
"id": "a3c3999a",
"metadata": {},
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:22.520144Z",
"start_time": "2023-04-04T10:51:22.285826Z"
}
},
"outputs": [],
"source": [
"from langchain.document_loaders import TextLoader\n",
"loader = TextLoader('../../../state_of_the_union.txt')\n",
"documents = loader.load()\n",
"text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)\n",
@ -39,43 +52,536 @@
"embeddings = OpenAIEmbeddings()"
]
},
{
"cell_type": "markdown",
"id": "eeead681",
"metadata": {},
"source": [
"## Connecting to Qdrant from LangChain\n",
"\n",
"### Local mode\n",
"\n",
"Python client allows you to run the same code in local mode without running the Qdrant server. That's great for testing things out and debugging or if you plan to store just a small amount of vectors. The embeddings might be fully kepy in memory or persisted on disk.\n",
"\n",
"#### In-memory\n",
"\n",
"For some testing scenarios and quick experiments, you may prefer to keep all the data in memory only, so it gets lost when the client is destroyed - usually at the end of your script/notebook."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "dcf88bdf",
"execution_count": 3,
"id": "8429667e",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:22.525091Z",
"start_time": "2023-04-04T10:51:22.522015Z"
}
},
"outputs": [],
"source": [
"qdrant = Qdrant.from_documents(\n",
" docs, embeddings, \n",
" location=\":memory:\", # Local mode with in-memory storage only\n",
" collection_name=\"my_documents\",\n",
")"
]
},
{
"cell_type": "markdown",
"id": "59f0b954",
"metadata": {},
"source": [
"#### On-disk storage\n",
"\n",
"Local mode, without using the Qdrant server, may also store your vectors on disk so they're persisted between runs."
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "24b370e2",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:24.827567Z",
"start_time": "2023-04-04T10:51:22.529080Z"
}
},
"outputs": [],
"source": [
"qdrant = Qdrant.from_documents(\n",
" docs, embeddings, \n",
" path=\"/tmp/local_qdrant\",\n",
" collection_name=\"my_documents\",\n",
")"
]
},
{
"cell_type": "markdown",
"id": "749658ce",
"metadata": {},
"source": [
"### On-premise server deployment\n",
"\n",
"No matter if you choose to launch Qdrant locally with [a Docker container](https://qdrant.tech/documentation/install/), or select a Kubernetes deployment with [the official Helm chart](https://github.com/qdrant/qdrant-helm), the way you're going to connect to such an instance will be identical. You'll need to provide a URL pointing to the service."
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "91e7f5ce",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:24.832708Z",
"start_time": "2023-04-04T10:51:24.829905Z"
}
},
"outputs": [],
"source": [
"url = \"<---qdrant url here --->\"\n",
"qdrant = Qdrant.from_documents(\n",
" docs, embeddings, \n",
" url, prefer_grpc=True, \n",
" collection_name=\"my_documents\",\n",
")"
]
},
{
"cell_type": "markdown",
"id": "c9e21ce9",
"metadata": {},
"source": [
"### Qdrant Cloud\n",
"\n",
"If you prefer not to keep yourself busy with managing the infrastructure, you can choose to set up a fully-managed Qdrant cluster on [Qdrant Cloud](https://cloud.qdrant.io/). There is a free forever 1GB cluster included for trying out. The main difference with using a managed version of Qdrant is that you'll need to provide an API key to secure your deployment from being accessed publicly."
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "dcf88bdf",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:24.837599Z",
"start_time": "2023-04-04T10:51:24.834690Z"
}
},
"outputs": [],
"source": [
"host = \"<---host name here --->\"\n",
"url = \"<---qdrant cloud cluster url here --->\"\n",
"api_key = \"<---api key here--->\"\n",
"qdrant = Qdrant.from_documents(docs, embeddings, host=host, prefer_grpc=True, api_key=api_key)\n",
"query = \"What did the president say about Ketanji Brown Jackson\""
"qdrant = Qdrant.from_documents(\n",
" docs, embeddings, \n",
" url, prefer_grpc=True, api_key=api_key, \n",
" collection_name=\"my_documents\",\n",
")"
]
},
{
"cell_type": "markdown",
"id": "93540013",
"metadata": {},
"source": [
"## Reusing the same collection\n",
"\n",
"Both `Qdrant.from_texts` and `Qdrant.from_documents` methods are great to start using Qdrant with LangChain, but **they are going to destroy the collection and create it from scratch**! If you want to reuse the existing collection, you can always create an instance of `Qdrant` on your own and pass the `QdrantClient` instance with the connection details."
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 7,
"id": "b7b432d7",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:24.843090Z",
"start_time": "2023-04-04T10:51:24.840041Z"
}
},
"outputs": [],
"source": [
"del qdrant"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "30a87570",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:24.854117Z",
"start_time": "2023-04-04T10:51:24.845385Z"
}
},
"outputs": [],
"source": [
"import qdrant_client\n",
"\n",
"client = qdrant_client.QdrantClient(\n",
" path=\"/tmp/local_qdrant\", prefer_grpc=True\n",
")\n",
"qdrant = Qdrant(\n",
" client=client, collection_name=\"my_documents\", \n",
" embedding_function=embeddings.embed_query\n",
")"
]
},
{
"cell_type": "markdown",
"id": "1f9215c8",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T09:27:29.920258Z",
"start_time": "2023-04-04T09:27:29.913714Z"
}
},
"source": [
"## Similarity search\n",
"\n",
"The simplest scenario for using Qdrant vector store is to perform a similarity search. Under the hood, our query will be encoded with the `embedding_function` and used to find similar documents in Qdrant collection."
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "a8c513ab",
"metadata": {},
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:25.204469Z",
"start_time": "2023-04-04T10:51:24.855618Z"
}
},
"outputs": [],
"source": [
"docs = qdrant.similarity_search(query)"
"query = \"What did the president say about Ketanji Brown Jackson\"\n",
"found_docs = qdrant.similarity_search(query)"
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 10,
"id": "fc516993",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:25.220984Z",
"start_time": "2023-04-04T10:51:25.213943Z"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Tonight. I call on the Senate to: Pass the Freedom to Vote Act. Pass the John Lewis Voting Rights Act. And while youre at it, pass the Disclose Act so Americans can know who is funding our elections. \n",
"\n",
"Tonight, Id like to honor someone who has dedicated his life to serve this country: Justice Stephen Breyer—an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court. Justice Breyer, thank you for your service. \n",
"\n",
"One of the most serious constitutional responsibilities a President has is nominating someone to serve on the United States Supreme Court. \n",
"\n",
"And I did that 4 days ago, when I nominated Circuit Court of Appeals Judge Ketanji Brown Jackson. One of our nations top legal minds, who will continue Justice Breyers legacy of excellence.\n"
]
}
],
"source": [
"print(found_docs[0].page_content)"
]
},
{
"cell_type": "markdown",
"id": "1bda9bf5",
"metadata": {},
"source": [
"## Similarity search with score\n",
"\n",
"Sometimes we might want to perform the search, but also obtain a relevancy score to know how good is a particular result."
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "8804a21d",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:25.631585Z",
"start_time": "2023-04-04T10:51:25.227384Z"
}
},
"outputs": [],
"source": [
"docs[0]"
"query = \"What did the president say about Ketanji Brown Jackson\"\n",
"found_docs = qdrant.similarity_search_with_score(query)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "756a6887",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:25.642282Z",
"start_time": "2023-04-04T10:51:25.635947Z"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Tonight. I call on the Senate to: Pass the Freedom to Vote Act. Pass the John Lewis Voting Rights Act. And while youre at it, pass the Disclose Act so Americans can know who is funding our elections. \n",
"\n",
"Tonight, Id like to honor someone who has dedicated his life to serve this country: Justice Stephen Breyer—an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court. Justice Breyer, thank you for your service. \n",
"\n",
"One of the most serious constitutional responsibilities a President has is nominating someone to serve on the United States Supreme Court. \n",
"\n",
"And I did that 4 days ago, when I nominated Circuit Court of Appeals Judge Ketanji Brown Jackson. One of our nations top legal minds, who will continue Justice Breyers legacy of excellence.\n",
"\n",
"Score: 0.8153784913324512\n"
]
}
],
"source": [
"document, score = found_docs[0]\n",
"print(document.page_content)\n",
"print(f\"\\nScore: {score}\")"
]
},
{
"cell_type": "markdown",
"id": "c58c30bf",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:39:53.032744Z",
"start_time": "2023-04-04T10:39:53.028673Z"
}
},
"source": [
"## Maximum marginal relevance search (MMR)\n",
"\n",
"If you'd like to look up for some similar documents, but you'd also like to receive diverse results, MMR is method you should consider. Maximal marginal relevance optimizes for similarity to query AND diversity among selected documents."
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "76810fb6",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:26.010947Z",
"start_time": "2023-04-04T10:51:25.647687Z"
}
},
"outputs": [],
"source": [
"query = \"What did the president say about Ketanji Brown Jackson\"\n",
"found_docs = qdrant.max_marginal_relevance_search(query, k=2, fetch_k=10)"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "80c6db11",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:26.016979Z",
"start_time": "2023-04-04T10:51:26.013329Z"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1. Tonight. I call on the Senate to: Pass the Freedom to Vote Act. Pass the John Lewis Voting Rights Act. And while youre at it, pass the Disclose Act so Americans can know who is funding our elections. \n",
"\n",
"Tonight, Id like to honor someone who has dedicated his life to serve this country: Justice Stephen Breyer—an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court. Justice Breyer, thank you for your service. \n",
"\n",
"One of the most serious constitutional responsibilities a President has is nominating someone to serve on the United States Supreme Court. \n",
"\n",
"And I did that 4 days ago, when I nominated Circuit Court of Appeals Judge Ketanji Brown Jackson. One of our nations top legal minds, who will continue Justice Breyers legacy of excellence. \n",
"\n",
"2. We cant change how divided weve been. But we can change how we move forward—on COVID-19 and other issues we must face together. \n",
"\n",
"I recently visited the New York City Police Department days after the funerals of Officer Wilbert Mora and his partner, Officer Jason Rivera. \n",
"\n",
"They were responding to a 9-1-1 call when a man shot and killed them with a stolen gun. \n",
"\n",
"Officer Mora was 27 years old. \n",
"\n",
"Officer Rivera was 22. \n",
"\n",
"Both Dominican Americans whod grown up on the same streets they later chose to patrol as police officers. \n",
"\n",
"I spoke with their families and told them that we are forever in debt for their sacrifice, and we will carry on their mission to restore the trust and safety every community deserves. \n",
"\n",
"Ive worked on these issues a long time. \n",
"\n",
"I know what works: Investing in crime preventionand community police officers wholl walk the beat, wholl know the neighborhood, and who can restore trust and safety. \n",
"\n"
]
}
],
"source": [
"for i, doc in enumerate(found_docs):\n",
" print(f\"{i + 1}.\", doc.page_content, \"\\n\")"
]
},
{
"cell_type": "markdown",
"id": "691a82d6",
"metadata": {},
"source": [
"## Qdrant as a Retriever\n",
"\n",
"Qdrant, as all the other vector stores, is a LangChain Retriever, by using cosine similarity. "
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "9427195f",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:26.031451Z",
"start_time": "2023-04-04T10:51:26.018763Z"
}
},
"outputs": [
{
"data": {
"text/plain": [
"VectorStoreRetriever(vectorstore=<langchain.vectorstores.qdrant.Qdrant object at 0x7fc4e5720a00>, search_type='similarity', search_kwargs={})"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"retriever = qdrant.as_retriever()\n",
"retriever"
]
},
{
"cell_type": "markdown",
"id": "0c851b4f",
"metadata": {},
"source": [
"It might be also specified to use MMR as a search strategy, instead of similarity."
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "64348f1b",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:26.043909Z",
"start_time": "2023-04-04T10:51:26.034284Z"
}
},
"outputs": [
{
"data": {
"text/plain": [
"VectorStoreRetriever(vectorstore=<langchain.vectorstores.qdrant.Qdrant object at 0x7fc4e5720a00>, search_type='mmr', search_kwargs={})"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"retriever = qdrant.as_retriever(search_type=\"mmr\")\n",
"retriever"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "f3c70c31",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T10:51:26.495652Z",
"start_time": "2023-04-04T10:51:26.046407Z"
}
},
"outputs": [
{
"data": {
"text/plain": [
"Document(page_content='Tonight. I call on the Senate to: Pass the Freedom to Vote Act. Pass the John Lewis Voting Rights Act. And while youre at it, pass the Disclose Act so Americans can know who is funding our elections. \\n\\nTonight, Id like to honor someone who has dedicated his life to serve this country: Justice Stephen Breyer—an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court. Justice Breyer, thank you for your service. \\n\\nOne of the most serious constitutional responsibilities a President has is nominating someone to serve on the United States Supreme Court. \\n\\nAnd I did that 4 days ago, when I nominated Circuit Court of Appeals Judge Ketanji Brown Jackson. One of our nations top legal minds, who will continue Justice Breyers legacy of excellence.', metadata={'source': '../../../state_of_the_union.txt'})"
]
},
"execution_count": 17,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"query = \"What did the president say about Ketanji Brown Jackson\"\n",
"retriever.get_relevant_documents(query)[0]"
]
},
{
"cell_type": "markdown",
"id": "0358ecde",
"metadata": {},
"source": [
"## Customizing Qdrant\n",
"\n",
"Qdrant stores your vector embeddings along with the optional JSON-like payload. Payloads are optional, but since LangChain assumes the embeddings are generated from the documents, we keep the context data, so you can extract the original texts as well.\n",
"\n",
"By default, your document is going to be stored in the following payload structure:\n",
"\n",
"```json\n",
"{\n",
" \"page_content\": \"Lorem ipsum dolor sit amet\",\n",
" \"metadata\": {\n",
" \"foo\": \"bar\"\n",
" }\n",
"}\n",
"```\n",
"\n",
"You can, however, decide to use different keys for the page content and metadata. That's useful if you already have a collection that you'd like to reuse. You can always change the "
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "e4d6baf9",
"metadata": {
"ExecuteTime": {
"end_time": "2023-04-04T11:08:31.739141Z",
"start_time": "2023-04-04T11:08:30.229748Z"
}
},
"outputs": [
{
"data": {
"text/plain": [
"<langchain.vectorstores.qdrant.Qdrant at 0x7fc4e2baa230>"
]
},
"execution_count": 19,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"Qdrant.from_documents(\n",
" docs, embeddings, \n",
" location=\":memory:\",\n",
" collection_name=\"my_documents_2\",\n",
" content_payload_key=\"my_page_content_key\",\n",
" metadata_payload_key=\"my_meta\",\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a359ed74",
"id": "2300e785",
"metadata": {},
"outputs": [],
"source": []
@ -97,7 +603,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
"version": "3.10.6"
}
},
"nbformat": 4,

@ -181,6 +181,7 @@ class Qdrant(VectorStore):
cls,
documents: List[Document],
embedding: Embeddings,
location: Optional[str] = None,
url: Optional[str] = None,
port: Optional[int] = 6333,
grpc_port: int = 6334,
@ -190,6 +191,7 @@ class Qdrant(VectorStore):
prefix: Optional[str] = None,
timeout: Optional[float] = None,
host: Optional[str] = None,
path: Optional[str] = None,
collection_name: Optional[str] = None,
distance_func: str = "Cosine",
content_payload_key: str = CONTENT_KEY,
@ -201,6 +203,7 @@ class Qdrant(VectorStore):
super().from_documents(
documents,
embedding,
location=location,
url=url,
port=port,
grpc_port=grpc_port,
@ -210,6 +213,7 @@ class Qdrant(VectorStore):
prefix=prefix,
timeout=timeout,
host=host,
path=path,
collection_name=collection_name,
distance_func=distance_func,
content_payload_key=content_payload_key,
@ -224,6 +228,7 @@ class Qdrant(VectorStore):
texts: List[str],
embedding: Embeddings,
metadatas: Optional[List[dict]] = None,
location: Optional[str] = None,
url: Optional[str] = None,
port: Optional[int] = 6333,
grpc_port: int = 6334,
@ -233,6 +238,7 @@ class Qdrant(VectorStore):
prefix: Optional[str] = None,
timeout: Optional[float] = None,
host: Optional[str] = None,
path: Optional[str] = None,
collection_name: Optional[str] = None,
distance_func: str = "Cosine",
content_payload_key: str = CONTENT_KEY,
@ -247,6 +253,10 @@ class Qdrant(VectorStore):
metadatas:
An optional list of metadata. If provided it has to be of the same
length as a list of texts.
location:
If `:memory:` - use in-memory Qdrant instance.
If `str` - use it as a `url` parameter.
If `None` - use default values for `host` and `port`.
url: either host or str of "Optional[scheme], host, Optional[port],
Optional[prefix]". Default: `None`
port: Port of the REST API interface. Default: 6333
@ -266,6 +276,9 @@ class Qdrant(VectorStore):
host:
Host name of Qdrant service. If url and host are None, set to
'localhost'. Default: `None`
path:
Path in which the vectors will be stored while using local mode.
Default: `None`
collection_name:
Name of the Qdrant collection to be used. If not provided,
will be created randomly.
@ -311,6 +324,7 @@ class Qdrant(VectorStore):
distance_func = distance_func.upper()
client = qdrant_client.QdrantClient(
location=location,
url=url,
port=port,
grpc_port=grpc_port,
@ -320,6 +334,7 @@ class Qdrant(VectorStore):
prefix=prefix,
timeout=timeout,
host=host,
path=path,
**kwargs,
)
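
A hedged sketch of how the new `location` and `path` arguments map onto `qdrant_client.QdrantClient`, mirroring the docstring above (the URL is a placeholder, and each client should use only one of the storage options):

```python
import qdrant_client

# In-memory local mode: data is lost once the client is destroyed.
ephemeral = qdrant_client.QdrantClient(location=":memory:")

# On-disk local mode: vectors are persisted under the given path between runs.
persistent = qdrant_client.QdrantClient(path="/tmp/local_qdrant")

# Server mode: falls back to url/host/port, exactly as before this change.
remote = qdrant_client.QdrantClient(url="http://localhost:6333")
```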

poetry.lock generated

File diff suppressed because it is too large

@ -35,7 +35,7 @@ weaviate-client = {version = "^3", optional = true}
google-api-python-client = {version = "2.70.0", optional = true}
wolframalpha = {version = "5.0.0", optional = true}
anthropic = {version = "^0.2.4", optional = true}
qdrant-client = {version = "^1.0.4", optional = true, python = ">=3.8.1,<3.12"}
qdrant-client = {version = "^1.1.1", optional = true, python = ">=3.8.1,<3.12"}
dataclasses-json = "^0.5.7"
tensorflow-text = {version = "^2.11.0", optional = true, python = "^3.10, <3.12"}
tenacity = "^8.1.0"
@ -103,6 +103,9 @@ setuptools = "^67.6.1"
[tool.poetry.extras]
llms = ["anthropic", "cohere", "openai", "nlpcloud", "huggingface_hub", "manifest-ml", "torch", "transformers"]
qdrant = ["qdrant-client"]
openai = ["openai"]
cohere = ["cohere"]
all = ["anthropic", "cohere", "openai", "nlpcloud", "huggingface_hub", "jina", "manifest-ml", "elasticsearch", "opensearch-py", "google-search-results", "faiss-cpu", "sentence_transformers", "transformers", "spacy", "nltk", "wikipedia", "beautifulsoup4", "tiktoken", "torch", "jinja2", "pinecone-client", "weaviate-client", "redis", "google-api-python-client", "wolframalpha", "qdrant-client", "tensorflow-text", "pypdf", "networkx", "nomic", "aleph-alpha-client", "deeplake", "pgvector", "psycopg2-binary", "boto3", "pyowm"]
[tool.ruff]

@ -21,7 +21,7 @@ def test_qdrant(content_payload_key: str, metadata_payload_key: str) -> None:
docsearch = Qdrant.from_texts(
texts,
FakeEmbeddings(),
host="localhost",
location=":memory:",
content_payload_key=content_payload_key,
metadata_payload_key=metadata_payload_key,
)
@ -48,7 +48,7 @@ def test_qdrant_with_metadatas(
texts,
FakeEmbeddings(),
metadatas=metadatas,
host="localhost",
location=":memory:",
content_payload_key=content_payload_key,
metadata_payload_key=metadata_payload_key,
)
@ -64,7 +64,7 @@ def test_qdrant_similarity_search_filters() -> None:
texts,
FakeEmbeddings(),
metadatas=metadatas,
host="localhost",
location=":memory:",
)
output = docsearch.similarity_search("foo", k=1, filter={"page": 1})
assert output == [Document(page_content="bar", metadata={"page": 1})]
@ -89,7 +89,7 @@ def test_qdrant_max_marginal_relevance_search(
texts,
FakeEmbeddings(),
metadatas=metadatas,
host="localhost",
location=":memory:",
content_payload_key=content_payload_key,
metadata_payload_key=metadata_payload_key,
)
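
For completeness, a sketch of what one of the switched tests now looks like end-to-end; the `FakeEmbeddings` import path is assumed to match the existing test helpers:

```python
from langchain.docstore.document import Document
from langchain.vectorstores import Qdrant
from tests.integration_tests.vectorstores.fake_embeddings import FakeEmbeddings


def test_qdrant_local_mode() -> None:
    """Round-trip through Qdrant without any running server or Docker container."""
    docsearch = Qdrant.from_texts(
        ["foo", "bar", "baz"],
        FakeEmbeddings(),
        location=":memory:",
    )
    output = docsearch.similarity_search("foo", k=1)
    assert output == [Document(page_content="foo")]
```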
