diff --git a/examples/Search_augmented_by_query_generation_and_embeddings_reranking.ipynb b/examples/Search_augmented_by_query_generation_and_embeddings_reranking.ipynb
index dbe153fc..b75af159 100644
--- a/examples/Search_augmented_by_query_generation_and_embeddings_reranking.ipynb
+++ b/examples/Search_augmented_by_query_generation_and_embeddings_reranking.ipynb
@@ -12,7 +12,7 @@
     "There are two prominent approaches to using language models for information retrieval:\n",
     "\n",
     "1. **Mimicking Human Browsing:** [GPT triggers a search](https://openai.com/blog/chatgpt-plugins#browsing), evaluates the results, and modifies the search query if necessary. It can also follow up on specific search results to form a chain of thought, much like a human user would do.\n",
-    "2. **Retrieval with Embeddings:** Calculating [embeddings](https://platform.openai.com/docs/guides/embeddings) for your content, and then using a metric like cosine distance between the user query and the embedded data to sort and [retrieve information](https://github.com/openai/openai-cookbook/blob/main/examples/Question_answering_using_embeddings.ipynb). This technique is [used heavily](https://blog.google/products/search/search-language-understanding-bert/) by search engines like Google.\n",
+    "2. **Retrieval with Embeddings:** Calculating [embeddings](https://platform.openai.com/docs/guides/embeddings) for your content, and then using a metric like cosine distance between the user query and the embedded data to sort and [retrieve information](Question_answering_using_embeddings.ipynb). This technique is [used heavily](https://blog.google/products/search/search-language-understanding-bert/) by search engines like Google.\n",
     "\n",
     "These approaches are both promising, but each has their shortcomings: the first one can be slow due to its iterative nature and the second one requires embedding your entire knowledge base in advance, continuously embedding new content and maintaining a vector database.\n",
     "\n",
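
To make the second approach in the cell above concrete, here is a minimal sketch of retrieval with embeddings and cosine similarity. It is not code from the notebook being diffed; the `openai` v1 Python client, the `text-embedding-3-small` model, and the sample documents are illustrative assumptions.

```python
# Sketch of "Retrieval with Embeddings": embed documents and a query,
# then rank documents by cosine similarity to the query vector.
import numpy as np
from openai import OpenAI  # assumes openai>=1.0 and OPENAI_API_KEY set

client = OpenAI()

def embed(texts: list[str]) -> np.ndarray:
    """Return one embedding vector per input string."""
    response = client.embeddings.create(
        model="text-embedding-3-small",  # illustrative model choice
        input=texts,
    )
    return np.array([item.embedding for item in response.data])

documents = [
    "The 2022 FIFA World Cup was won by Argentina.",
    "Embeddings map text to vectors that capture semantic similarity.",
    "Paris is the capital of France.",
]
query = "Who won the most recent World Cup?"

doc_vectors = embed(documents)
query_vector = embed([query])[0]

# Cosine similarity is the dot product of L2-normalized vectors.
doc_norms = doc_vectors / np.linalg.norm(doc_vectors, axis=1, keepdims=True)
query_norm = query_vector / np.linalg.norm(query_vector)
scores = doc_norms @ query_norm

# Sort documents from most to least relevant to the query.
for idx in np.argsort(scores)[::-1]:
    print(f"{scores[idx]:.3f}  {documents[idx]}")
```

This is the shortcoming the cell calls out: every document must be embedded ahead of time (and re-embedded as content changes), typically with a vector database in place of the in-memory arrays used here.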