Update Integrations links (#8206)

pull/8225/head
William FH authored 1 year ago, committed by GitHub
parent a7efa95775
commit 0a16b3d84b

@ -31,7 +31,7 @@ There isn't any special setup for it.
## LLM
See a [usage example](/docs/modules/model_io/models/llms/integrations/INCLUDE_REAL_NAME.html).
See a [usage example](/docs/integrations/llms/INCLUDE_REAL_NAME).
```python
from langchain.llms import integration_class_REPLACE_ME
@ -40,7 +40,7 @@ from langchain.llms import integration_class_REPLACE_ME
## Text Embedding Models
See a [usage example](/docs/modules/data_connection/text_embedding/integrations/INCLUDE_REAL_NAME.html)
See a [usage example](/docs/integrations/text_embedding/INCLUDE_REAL_NAME)
```python
from langchain.embeddings import integration_class_REPLACE_ME
@ -49,7 +49,7 @@ from langchain.embeddings import integration_class_REPLACE_ME
## Chat Models
See a [usage example](/docs/modules/model_io/models/chat/integrations/INCLUDE_REAL_NAME.html)
See a [usage example](/docs/integrations/chat/INCLUDE_REAL_NAME)
```python
from langchain.chat_models import integration_class_REPLACE_ME
@ -57,7 +57,7 @@ from langchain.chat_models import integration_class_REPLACE_ME
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/INCLUDE_REAL_NAME.html).
See a [usage example](/docs/integrations/document_loaders/INCLUDE_REAL_NAME).
```python
from langchain.document_loaders import integration_class_REPLACE_ME

@ -11,7 +11,7 @@
"\n",
"[PromptLayer](https://promptlayer.com) is a an LLM observability platform that lets you visualize requests, version prompts, and track usage. In this guide we will go over how to setup the `PromptLayerCallbackHandler`. \n",
"\n",
"While PromptLayer does have LLMs that integrate directly with LangChain (eg [`PromptLayerOpenAI`](https://python.langchain.com/docs/modules/model_io/models/llms/integrations/promptlayer_openai)), this callback is the recommended way to integrate PromptLayer with LangChain.\n",
"While PromptLayer does have LLMs that integrate directly with LangChain (eg [`PromptLayerOpenAI`](https://python.langchain.com/docs/integrations/llms/promptlayer_openai)), this callback is the recommended way to integrate PromptLayer with LangChain.\n",
"\n",
"See [our docs](https://docs.promptlayer.com/languages/langchain) for more information."
]
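
As an illustrative aside to the PromptLayer hunk above, here is a minimal sketch of wiring up the `PromptLayerCallbackHandler`. It assumes the handler accepts an optional `pl_tags` list and that `PROMPTLAYER_API_KEY` (plus a regular OpenAI key) is read from the environment; the tag and prompt strings are made up.

```python
import os

from langchain.callbacks import PromptLayerCallbackHandler
from langchain.llms import OpenAI

os.environ["PROMPTLAYER_API_KEY"] = "<your-promptlayer-api-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-api-key>"

# Attach the callback so every request made through this LLM is logged
# to PromptLayer, tagged for later filtering in the dashboard.
llm = OpenAI(callbacks=[PromptLayerCallbackHandler(pl_tags=["langchain"])])
print(llm("Tell me a short joke about tracing LLM calls."))
```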

@ -13,7 +13,7 @@
"\n",
"## Prerequisites\n",
"\n",
"You need to have an existing dataset on the Apify platform. If you don't have one, please first check out [this notebook](/docs/modules/agents/tools/integrations/apify.html) on how to use Apify to extract content from documentation, knowledge bases, help centers, or blogs."
"You need to have an existing dataset on the Apify platform. If you don't have one, please first check out [this notebook](/docs/integrations/tools/apify.html) on how to use Apify to extract content from documentation, knowledge bases, help centers, or blogs."
]
},
{

@ -36,7 +36,7 @@
"## Deployments\n",
"With Azure OpenAI, you set up your own deployments of the common GPT-3 and Codex models. When calling the API, you need to specify the deployment you want to use.\n",
"\n",
"_**Note**: These docs are for the Azure text completion models. Models like GPT-4 are chat models. They have a slightly different interface, and can be accessed via the `AzureChatOpenAI` class. For docs on Azure chat see [Azure Chat OpenAI documentation](/docs/modules/model_io/models/chat/integrations/azure_chat_openai)._\n",
"_**Note**: These docs are for the Azure text completion models. Models like GPT-4 are chat models. They have a slightly different interface, and can be accessed via the `AzureChatOpenAI` class. For docs on Azure chat see [Azure Chat OpenAI documentation](/docs/integrations/chat/azure_chat_openai)._\n",
"\n",
"Let's say your deployment name is `text-davinci-002-prod`. In the `openai` Python API, you can specify this deployment with the `engine` parameter. For example:\n",
"\n",

@ -22,7 +22,7 @@ Have `docker desktop` installed.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/airbyte_json.html).
See a [usage example](/docs/integrations/document_loaders/airbyte_json).
```python
from langchain.document_loaders import AirbyteJSONLoader

@ -25,4 +25,4 @@ pip install pyairtable
from langchain.document_loaders import AirtableLoader
```
See an [example](/docs/modules/data_connection/document_loaders/integrations/airtable.html).
See an [example](/docs/integrations/document_loaders/airtable.html).

@ -21,7 +21,7 @@ ALEPH_ALPHA_API_KEY = getpass()
## LLM
See a [usage example](/docs/modules/model_io/models/llms/integrations/aleph_alpha.html).
See a [usage example](/docs/integrations/llms/aleph_alpha).
```python
from langchain.llms import AlephAlpha
@ -29,7 +29,7 @@ from langchain.llms import AlephAlpha
## Text Embedding Models
See a [usage example](/docs/modules/data_connection/text_embedding/integrations/aleph_alpha.html).
See a [usage example](/docs/integrations/text_embedding/aleph_alpha).
```python
from langchain.embeddings import AlephAlphaSymmetricSemanticEmbedding, AlephAlphaAsymmetricSemanticEmbedding

@ -6,7 +6,7 @@ API Gateway handles all the tasks involved in accepting and processing up to hun
## LLM
See a [usage example](/docs/modules/model_io/models/llms/integrations/amazon_api_gateway_example.html).
See a [usage example](/docs/integrations/llms/amazon_api_gateway_example).
```python
from langchain.llms import AmazonAPIGateway

@ -12,4 +12,4 @@ To import this vectorstore:
from langchain.vectorstores import AnalyticDB
```
For a more detailed walkthrough of the AnalyticDB wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/analyticdb.html)
For a more detailed walkthrough of the AnalyticDB wrapper, see [this notebook](/docs/integrations/vectorstores/analyticdb.html)

@ -11,7 +11,7 @@ pip install annoy
## Vectorstore
See a [usage example](/docs/modules/data_connection/vectorstores/integrations/annoy.html).
See a [usage example](/docs/integrations/vectorstores/annoy).
```python
from langchain.vectorstores import Annoy

@ -32,7 +32,7 @@ You can use the `ApifyWrapper` to run Actors on the Apify platform.
from langchain.utilities import ApifyWrapper
```
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/modules/agents/tools/integrations/apify.html).
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/integrations/tools/apify.html).
### Loader
@ -43,4 +43,4 @@ You can also use our `ApifyDatasetLoader` to get data from an Apify dataset.
from langchain.document_loaders import ApifyDatasetLoader
```
For a more detailed walkthrough of this loader, see [this notebook](/docs/modules/data_connection/document_loaders/integrations/apify_dataset.html).
For a more detailed walkthrough of this loader, see [this notebook](/docs/integrations/document_loaders/apify_dataset.html).
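
A compact sketch tying the two Apify hunks above together: run an Actor with `ApifyWrapper`, map the results to documents, and load an existing dataset with `ApifyDatasetLoader`. It assumes `APIFY_API_TOKEN` is set and that `call_actor` accepts `actor_id`, `run_input`, and a `dataset_mapping_function`; the Actor id, start URL, dataset id, and field names are illustrative.

```python
from langchain.docstore.document import Document
from langchain.document_loaders import ApifyDatasetLoader
from langchain.utilities import ApifyWrapper

apify = ApifyWrapper()  # reads APIFY_API_TOKEN from the environment

# Run an Actor and map each dataset item onto a LangChain Document.
loader = apify.call_actor(
    actor_id="apify/website-content-crawler",
    run_input={"startUrls": [{"url": "https://python.langchain.com"}]},
    dataset_mapping_function=lambda item: Document(
        page_content=item["text"] or "", metadata={"source": item["url"]}
    ),
)
docs = loader.load()

# An existing dataset can also be loaded directly by its id.
existing_docs = ApifyDatasetLoader(
    dataset_id="<your-dataset-id>",
    dataset_mapping_function=lambda item: Document(
        page_content=item["text"] or "", metadata={"source": item["url"]}
    ),
).load()
```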

@ -21,7 +21,7 @@ pip install pymupdf
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/arxiv.html).
See a [usage example](/docs/integrations/document_loaders/arxiv).
```python
from langchain.document_loaders import ArxivLoader
@ -29,7 +29,7 @@ from langchain.document_loaders import ArxivLoader
## Retriever
See a [usage example](/docs/modules/data_connection/retrievers/integrations/arxiv.html).
See a [usage example](/docs/integrations/retrievers/arxiv).
```python
from langchain.retrievers import ArxivRetriever

@ -24,4 +24,4 @@ To import this vectorstore:
from langchain.vectorstores import AtlasDB
```
For a more detailed walkthrough of the AtlasDB wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/atlas.html)
For a more detailed walkthrough of the AtlasDB wrapper, see [this notebook](/docs/integrations/vectorstores/atlas.html)

@ -18,4 +18,4 @@ whether for semantic search or example selection.
from langchain.vectorstores import AwaDB
```
For a more detailed walkthrough of the AwaDB wrapper, see [here](/docs/modules/data_connection/vectorstores/integrations/awadb.html).
For a more detailed walkthrough of the AwaDB wrapper, see [here](/docs/integrations/vectorstores/awadb.html).

@ -16,9 +16,9 @@ pip install boto3
## Document Loader
See a [usage example for S3DirectoryLoader](/docs/modules/data_connection/document_loaders/integrations/aws_s3_directory.html).
See a [usage example for S3DirectoryLoader](/docs/integrations/document_loaders/aws_s3_directory.html).
See a [usage example for S3FileLoader](/docs/modules/data_connection/document_loaders/integrations/aws_s3_file.html).
See a [usage example for S3FileLoader](/docs/integrations/document_loaders/aws_s3_file.html).
```python
from langchain.document_loaders import S3DirectoryLoader, S3FileLoader

@ -9,7 +9,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/azlyrics.html).
See a [usage example](/docs/integrations/document_loaders/azlyrics).
```python
from langchain.document_loaders import AZLyricsLoader

@ -23,13 +23,13 @@ pip install azure-storage-blob
## Document Loader
See a [usage example for the Azure Blob Storage](/docs/modules/data_connection/document_loaders/integrations/azure_blob_storage_container.html).
See a [usage example for the Azure Blob Storage](/docs/integrations/document_loaders/azure_blob_storage_container.html).
```python
from langchain.document_loaders import AzureBlobStorageContainerLoader
```
See a [usage example for the Azure Files](/docs/modules/data_connection/document_loaders/integrations/azure_blob_storage_file.html).
See a [usage example for the Azure Files](/docs/integrations/document_loaders/azure_blob_storage_file.html).
```python
from langchain.document_loaders import AzureBlobStorageFileLoader

@ -17,7 +17,7 @@ See [set up instructions](https://learn.microsoft.com/en-us/azure/search/search-
## Retriever
See a [usage example](/docs/modules/data_connection/retrievers/integrations/azure_cognitive_search.html).
See a [usage example](/docs/integrations/retrievers/azure_cognitive_search).
```python
from langchain.retrievers import AzureCognitiveSearchRetriever

@ -27,7 +27,7 @@ os.environ["OPENAI_API_VERSION"] = "2023-05-15"
## LLM
See a [usage example](/docs/modules/model_io/models/llms/integrations/azure_openai_example.html).
See a [usage example](/docs/integrations/llms/azure_openai_example).
```python
from langchain.llms import AzureOpenAI
@ -35,7 +35,7 @@ from langchain.llms import AzureOpenAI
## Text Embedding Models
See a [usage example](/docs/modules/data_connection/text_embedding/integrations/azureopenai.html)
See a [usage example](/docs/integrations/text_embedding/azureopenai)
```python
from langchain.embeddings import OpenAIEmbeddings
@ -43,7 +43,7 @@ from langchain.embeddings import OpenAIEmbeddings
## Chat Models
See a [usage example](/docs/modules/model_io/models/chat/integrations/azure_chat_openai.html)
See a [usage example](/docs/integrations/chat/azure_chat_openai)
```python
from langchain.chat_models import AzureChatOpenAI

@ -10,7 +10,7 @@ pip install boto3
## LLM
See a [usage example](/docs/modules/model_io/models/llms/integrations/bedrock.html).
See a [usage example](/docs/integrations/llms/bedrock).
```python
from langchain import Bedrock
@ -18,7 +18,7 @@ from langchain import Bedrock
## Text Embedding Models
See a [usage example](/docs/modules/data_connection/text_embedding/integrations/bedrock.html).
See a [usage example](/docs/integrations/text_embedding/bedrock).
```python
from langchain.embeddings import BedrockEmbeddings
```

@ -10,7 +10,7 @@ pip install bilibili-api-python
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/bilibili.html).
See a [usage example](/docs/integrations/document_loaders/bilibili).
```python
from langchain.document_loaders import BiliBiliLoader

@ -14,7 +14,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/blackboard.html).
See a [usage example](/docs/integrations/document_loaders/blackboard).
```python
from langchain.document_loaders import BlackboardLoader

@ -21,7 +21,7 @@ To get access to the Brave Search API, you need to [create an account and get an
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/brave_search.html).
See a [usage example](/docs/integrations/document_loaders/brave_search).
```python
from langchain.document_loaders import BraveSearchLoader
@ -29,7 +29,7 @@ from langchain.document_loaders import BraveSearchLoader
## Tool
See a [usage example](/docs/modules/agents/tools/integrations/brave_search.html).
See a [usage example](/docs/integrations/tools/brave_search).
```python
from langchain.tools import BraveSearch

@ -18,7 +18,7 @@ pip install cassio
## Vector Store
See a [usage example](/docs/modules/data_connection/vectorstores/integrations/cassandra.html).
See a [usage example](/docs/integrations/vectorstores/cassandra).
```python
from langchain.vectorstores import Cassandra
@ -28,7 +28,7 @@ from langchain.memory import CassandraChatMessageHistory
## Memory
See a [usage example](/docs/modules/memory/integrations/cassandra_chat_message_history.html).
See a [usage example](/docs/modules/memory/integrations/cassandra_chat_message_history).
```python
from langchain.memory import CassandraChatMessageHistory

@ -10,7 +10,7 @@ We need the [API Key](https://docs.chaindesk.ai/api-reference/authentication).
## Retriever
See a [usage example](/docs/modules/data_connection/retrievers/integrations/chaindesk.html).
See a [usage example](/docs/integrations/retrievers/chaindesk).
```python
from langchain.retrievers import ChaindeskRetriever

@ -18,11 +18,11 @@ whether for semantic search or example selection.
from langchain.vectorstores import Chroma
```
For a more detailed walkthrough of the Chroma wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/chroma.html)
For a more detailed walkthrough of the Chroma wrapper, see [this notebook](/docs/integrations/vectorstores/chroma.html)
## Retriever
See a [usage example](/docs/modules/data_connection/retrievers/how_to/self_query/chroma_self_query.html).
See a [usage example](/docs/modules/data_connection/retrievers/how_to/self_query/chroma_self_query).
```python
from langchain.retrievers import SelfQueryRetriever

@ -25,7 +25,7 @@ from langchain.llms import Clarifai
llm = Clarifai(pat=CLARIFAI_PAT, user_id=USER_ID, app_id=APP_ID, model_id=MODEL_ID)
```
For more details, the docs on the Clarifai LLM wrapper provide a [detailed walkthrough](/docs/modules/model_io/models/llms/integrations/clarifai.html).
For more details, the docs on the Clarifai LLM wrapper provide a [detailed walkthrough](/docs/integrations/llms/clarifai.html).
### Text Embedding Models
@ -37,7 +37,7 @@ There is a Clarifai Embedding model in LangChain, which you can access with:
from langchain.embeddings import ClarifaiEmbeddings
embeddings = ClarifaiEmbeddings(pat=CLARIFAI_PAT, user_id=USER_ID, app_id=APP_ID, model_id=MODEL_ID)
```
For more details, the docs on the Clarifai Embeddings wrapper provide a [detailed walkthrough](/docs/modules/data_connection/text_embedding/integrations/clarifai.html).
For more details, the docs on the Clarifai Embeddings wrapper provide a [detailed walkthrough](/docs/integrations/text_embedding/clarifai.html).
## Vectorstore
@ -49,4 +49,4 @@ You can also add data directly from LangChain, and the auto-indexing will
from langchain.vectorstores import Clarifai
clarifai_vector_db = Clarifai.from_texts(user_id=USER_ID, app_id=APP_ID, texts=texts, pat=CLARIFAI_PAT, number_of_docs=NUMBER_OF_DOCS, metadatas = metadatas)
```
For more details, the docs on the Clarifai vector store provide a [detailed walkthrough](/docs/modules/data_connection/text_embedding/integrations/clarifai.html).
For more details, the docs on the Clarifai vector store provide a [detailed walkthrough](/docs/integrations/text_embedding/clarifai.html).
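
As a small follow-up to the Clarifai vector store hunk above, a hedged usage sketch: build an index with `Clarifai.from_texts(...)` (as shown in the hunk) and query it through the standard LangChain vector-store interface. The credentials, texts, and query are placeholders.

```python
from langchain.vectorstores import Clarifai

USER_ID, APP_ID, CLARIFAI_PAT = "<user-id>", "<app-id>", "<personal-access-token>"

# Index a couple of illustrative texts, then run a similarity search.
texts = [
    "Clarifai hosts models and vector search.",
    "LangChain chains LLM calls together.",
]
clarifai_vector_db = Clarifai.from_texts(
    user_id=USER_ID, app_id=APP_ID, texts=texts, pat=CLARIFAI_PAT, number_of_docs=2
)
docs = clarifai_vector_db.similarity_search("What is Clarifai?", k=2)
for doc in docs:
    print(doc.page_content)
```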

@ -15,7 +15,7 @@ Get a [Cohere api key](https://dashboard.cohere.ai/) and set it as an environmen
## LLM
There exists a Cohere LLM wrapper, which you can access with
See a [usage example](/docs/modules/model_io/models/llms/integrations/cohere.html).
See a [usage example](/docs/integrations/llms/cohere).
```python
from langchain.llms import Cohere
@ -27,11 +27,11 @@ There exists a Cohere Embedding model, which you can access with
```python
from langchain.embeddings import CohereEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/cohere.html)
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/cohere.html)
## Retriever
See a [usage example](/docs/modules/data_connection/retrievers/integrations/cohere-reranker.html).
See a [usage example](/docs/integrations/retrievers/cohere-reranker).
```python
from langchain.retrievers.document_compressors import CohereRerank
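
To flesh out the Cohere hunks above, a hedged end-to-end sketch covering the LLM and embedding wrappers. It assumes `COHERE_API_KEY` is read from the environment and that the `model` name shown is available on your account; the prompts are illustrative.

```python
import os

from langchain.embeddings import CohereEmbeddings
from langchain.llms import Cohere

os.environ["COHERE_API_KEY"] = "<your-cohere-api-key>"

# LLM wrapper: the model name is illustrative.
llm = Cohere(model="command")
print(llm("Say hello in one short sentence."))

# Embedding wrapper: embed a single query string.
embeddings = CohereEmbeddings()
vector = embeddings.embed_query("hello world")
print(len(vector))
```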

@ -9,7 +9,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/college_confidential.html).
See a [usage example](/docs/integrations/document_loaders/college_confidential).
```python
from langchain.document_loaders import CollegeConfidentialLoader

@ -15,7 +15,7 @@ See [instructions](https://support.atlassian.com/atlassian-account/docs/manage-a
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/confluence.html).
See a [usage example](/docs/integrations/document_loaders/confluence).
```python
from langchain.document_loaders import ConfluenceLoader

@ -54,4 +54,4 @@ llm = CTransformers(model='marella/gpt-2-ggml', config=config)
See [Documentation](https://github.com/marella/ctransformers#config) for a list of available parameters.
For a more detailed walkthrough of this, see [this notebook](/docs/modules/model_io/models/llms/integrations/ctransformers.html).
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/llms/ctransformers.html).
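
Building on the CTransformers hunk above, a brief hedged sketch showing a `config` dict and a generation call; the config keys follow the ctransformers project documentation and the prompt is illustrative.

```python
from langchain.llms import CTransformers

# Generation parameters are passed through the `config` dict.
config = {"max_new_tokens": 64, "temperature": 0.8, "repetition_penalty": 1.1}
llm = CTransformers(model="marella/gpt-2-ggml", config=config)
print(llm("AI is going to"))
```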

@ -12,7 +12,7 @@ We must initialize the loader with the Datadog API key and APP key, and we need
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/datadog_logs.html).
See a [usage example](/docs/integrations/document_loaders/datadog_logs).
```python
from langchain.document_loaders import DatadogLogsLoader

@ -16,7 +16,7 @@ The DataForSEO utility wraps the API. To import this utility, use:
from langchain.utilities import DataForSeoAPIWrapper
```
For a detailed walkthrough of this wrapper, see [this notebook](/docs/modules/agents/tools/integrations/dataforseo.ipynb).
For a detailed walkthrough of this wrapper, see [this notebook](/docs/integrations/tools/dataforseo.ipynb).
### Tool

@ -27,4 +27,4 @@ from langchain.vectorstores import DeepLake
```
For a more detailed walkthrough of the Deep Lake wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/deeplake.html)
For a more detailed walkthrough of the Deep Lake wrapper, see [this notebook](/docs/integrations/vectorstores/deeplake.html)

@ -11,7 +11,7 @@ Read [instructions](https://docs.diffbot.com/reference/authentication) how to ge
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/diffbot.html).
See a [usage example](/docs/integrations/document_loaders/diffbot).
```python
from langchain.document_loaders import DiffbotLoader

@ -23,7 +23,7 @@ with Discord. That email will have a download button using which you would be ab
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/discord.html).
See a [usage example](/docs/integrations/document_loaders/discord).
```python
from langchain.document_loaders import DiscordChatLoader

@ -13,7 +13,7 @@ pip install lxml
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/docugami.html).
See a [usage example](/docs/integrations/document_loaders/docugami).
```python
from langchain.document_loaders import DocugamiLoader

@ -12,7 +12,7 @@ pip install duckdb
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/duckdb.html).
See a [usage example](/docs/integrations/document_loaders/duckdb).
```python
from langchain.document_loaders import DuckDBLoader

@ -17,7 +17,7 @@ pip install elasticsearch
>The name of the actual ranking function is BM25. The fuller name, Okapi BM25, includes the name of the first system to use it, which was the Okapi information retrieval system, implemented at London's City University in the 1980s and 1990s. BM25 and its newer variants, e.g. BM25F (a version of BM25 that can take document structure and anchor text into account), represent TF-IDF-like retrieval functions used in document retrieval.
See a [usage example](/docs/modules/data_connection/retrievers/integrations/elastic_search_bm25.html).
See a [usage example](/docs/integrations/retrievers/elastic_search_bm25).
```python
from langchain.retrievers import ElasticSearchBM25Retriever

@ -13,7 +13,7 @@ pip install html2text
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/evernote.html).
See a [usage example](/docs/integrations/document_loaders/evernote).
```python
from langchain.document_loaders import EverNoteLoader

@ -14,7 +14,7 @@ pip install pandas
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/facebook_chat.html).
See a [usage example](/docs/integrations/document_loaders/facebook_chat).
```python
from langchain.document_loaders import FacebookChatLoader

@ -14,7 +14,7 @@ The `file key` can be pulled from the URL. https://www.figma.com/file/{filekey}
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/figma.html).
See a [usage example](/docs/integrations/document_loaders/figma).
```python
from langchain.document_loaders import FigmaFileLoader

@ -12,7 +12,7 @@ pip install GitPython
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/git.html).
See a [usage example](/docs/integrations/document_loaders/git).
```python
from langchain.document_loaders import GitLoader

@ -8,7 +8,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/gitbook.html).
See a [usage example](/docs/integrations/document_loaders/gitbook).
```python
from langchain.document_loaders import GitbookLoader

@ -20,7 +20,7 @@ There exists a GoldenQueryAPIWrapper utility which wraps this API. To import thi
from langchain.utilities.golden_query import GoldenQueryAPIWrapper
```
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/modules/agents/tools/integrations/golden_query.html).
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/integrations/tools/golden_query.html).
### Tool

@ -13,7 +13,7 @@ pip install google-cloud-bigquery
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/google_bigquery.html).
See a [usage example](/docs/integrations/document_loaders/google_bigquery).
```python
from langchain.document_loaders import BigQueryLoader

@ -14,12 +14,12 @@ pip install google-cloud-storage
There are two loaders for the `Google Cloud Storage`: the `Directory` and the `File` loaders.
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/google_cloud_storage_directory.html).
See a [usage example](/docs/integrations/document_loaders/google_cloud_storage_directory).
```python
from langchain.document_loaders import GCSDirectoryLoader
```
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/google_cloud_storage_file.html).
See a [usage example](/docs/integrations/document_loaders/google_cloud_storage_file).
```python
from langchain.document_loaders import GCSFileLoader

@ -14,7 +14,7 @@ pip install google-api-python-client google-auth-httplib2 google-auth-oauthlib
## Document Loader
See a [usage example and authorizing instructions](/docs/modules/data_connection/document_loaders/integrations/google_drive.html).
See a [usage example and authorizing instructions](/docs/integrations/document_loaders/google_drive.html).
```python

@ -18,7 +18,7 @@ There exists a GoogleSearchAPIWrapper utility which wraps this API. To import th
from langchain.utilities import GoogleSearchAPIWrapper
```
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/modules/agents/tools/integrations/google_search.html).
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/integrations/tools/google_search.html).
### Tool
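
As a hedged illustration of the `GoogleSearchAPIWrapper` hunk above: the wrapper expects `GOOGLE_API_KEY` and `GOOGLE_CSE_ID` in the environment, and it can be exposed to an agent via a `Tool`; the query, tool name, and description are illustrative.

```python
import os

from langchain.agents import Tool
from langchain.utilities import GoogleSearchAPIWrapper

os.environ["GOOGLE_API_KEY"] = "<your-google-api-key>"
os.environ["GOOGLE_CSE_ID"] = "<your-custom-search-engine-id>"

search = GoogleSearchAPIWrapper()
print(search.run("LangChain documentation"))

# Wrap the utility as a Tool so an agent can call it.
google_tool = Tool(
    name="Google Search",
    description="Search Google for recent results.",
    func=search.run,
)
```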

@ -59,7 +59,7 @@ So the final answer is: El Palmar, Spain
'El Palmar, Spain'
```
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/modules/agents/tools/integrations/google_serper.html).
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/integrations/tools/google_serper.html).
### Tool

@ -45,4 +45,4 @@ model("Once upon a time, ", callbacks=callbacks)
You can find links to model file downloads in the [pyllamacpp](https://github.com/nomic-ai/pyllamacpp) repository.
For a more detailed walkthrough of this, see [this notebook](/docs/modules/model_io/models/llms/integrations/gpt4all.html)
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/llms/gpt4all.html)

@ -8,7 +8,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/gutenberg.html).
See a [usage example](/docs/integrations/document_loaders/gutenberg).
```python
from langchain.document_loaders import GutenbergLoader

@ -11,7 +11,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/hacker_news.html).
See a [usage example](/docs/integrations/document_loaders/hacker_news).
```python
from langchain.document_loaders import HNLoader

@ -16,7 +16,7 @@ pip install psycopg2
## Vector Store
See a [usage example](/docs/modules/data_connection/vectorstores/integrations/hologres.html).
See a [usage example](/docs/integrations/vectorstores/hologres).
```python
from langchain.vectorstores import Hologres

@ -30,7 +30,7 @@ To use the wrapper for a model hosted on Hugging Face Hub:
```python
from langchain.llms import HuggingFaceHub
```
For a more detailed walkthrough of the Hugging Face Hub wrapper, see [this notebook](/docs/modules/model_io/models/llms/integrations/huggingface_hub.html)
For a more detailed walkthrough of the Hugging Face Hub wrapper, see [this notebook](/docs/integrations/llms/huggingface_hub.html)
### Embeddings
@ -47,7 +47,7 @@ To use the wrapper for a model hosted on Hugging Face Hub:
```python
from langchain.embeddings import HuggingFaceHubEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/huggingfacehub.html)
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/huggingfacehub.html)
### Tokenizer
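
To round out the Hugging Face Hub hunks above, a hedged sketch of the LLM wrapper used inside a simple chain; it assumes `HUGGINGFACEHUB_API_TOKEN` is set, and the `repo_id`, `model_kwargs`, and question are illustrative.

```python
import os

from langchain import LLMChain, PromptTemplate
from langchain.llms import HuggingFaceHub

os.environ["HUGGINGFACEHUB_API_TOKEN"] = "<your-hf-token>"

llm = HuggingFaceHub(
    repo_id="google/flan-t5-xl",
    model_kwargs={"temperature": 0.5, "max_length": 64},
)
prompt = PromptTemplate(
    input_variables=["question"],
    template="Question: {question}\nAnswer:",
)
chain = LLMChain(prompt=prompt, llm=llm)
print(chain.run("Who won the FIFA World Cup in 1998?"))
```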

@ -9,7 +9,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/ifixit.html).
See a [usage example](/docs/integrations/document_loaders/ifixit).
```python
from langchain.document_loaders import IFixitLoader

@ -8,7 +8,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/imsdb.html).
See a [usage example](/docs/integrations/document_loaders/imsdb).
```python

@ -15,7 +15,7 @@ There exists a Jina Embeddings wrapper, which you can access with
```python
from langchain.embeddings import JinaEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/jina.html)
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/jina.html)
## Deployment

@ -20,4 +20,4 @@ To import this vectorstore:
from langchain.vectorstores import LanceDB
```
For a more detailed walkthrough of the LanceDB wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/lancedb.html)
For a more detailed walkthrough of the LanceDB wrapper, see [this notebook](/docs/integrations/vectorstores/lancedb.html)

@ -15,7 +15,7 @@ There exists a LlamaCpp LLM wrapper, which you can access with
```python
from langchain.llms import LlamaCpp
```
For a more detailed walkthrough of this, see [this notebook](/docs/modules/model_io/models/llms/integrations/llamacpp.html)
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/llms/llamacpp.html)
### Embeddings
@ -23,4 +23,4 @@ There exists a LlamaCpp Embeddings wrapper, which you can access with
```python
from langchain.embeddings import LlamaCppEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/llamacpp.html)
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/llamacpp.html)
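
A hedged sketch for the LlamaCpp hunks above, covering both the LLM and the embeddings wrapper; the `model_path` is a placeholder for a local GGML model file and the prompts are illustrative.

```python
from langchain.embeddings import LlamaCppEmbeddings
from langchain.llms import LlamaCpp

MODEL_PATH = "./models/ggml-model-q4_0.bin"  # placeholder local model file

llm = LlamaCpp(model_path=MODEL_PATH)
print(llm("Q: Name the planets in the solar system. A:"))

embeddings = LlamaCppEmbeddings(model_path=MODEL_PATH)
vector = embeddings.embed_query("hello world")
print(len(vector))
```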

@ -28,4 +28,4 @@ To import this vectorstore:
from langchain.vectorstores import Marqo
```
For a more detailed walkthrough of the Marqo wrapper and some of its unique features, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/marqo.html)
For a more detailed walkthrough of the Marqo wrapper and some of its unique features, see [this notebook](/docs/integrations/vectorstores/marqo.html)

@ -23,7 +23,7 @@ pip install -qU mwparserfromhell
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/mediawikidump.html).
See a [usage example](/docs/integrations/document_loaders/mediawikidump).
```python

@ -10,11 +10,11 @@ First, you need to install a python package.
pip install o365
```
Then follow instructions [here](/docs/modules/data_connection/document_loaders/integrations/microsoft_onedrive.html).
Then follow instructions [here](/docs/integrations/document_loaders/microsoft_onedrive.html).
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/microsoft_onedrive.html).
See a [usage example](/docs/integrations/document_loaders/microsoft_onedrive).
```python

@ -8,7 +8,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/microsoft_powerpoint.html).
See a [usage example](/docs/integrations/document_loaders/microsoft_powerpoint).
```python

@ -8,7 +8,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/microsoft_word.html).
See a [usage example](/docs/integrations/document_loaders/microsoft_word).
```python

@ -17,4 +17,4 @@ To import this vectorstore:
from langchain.vectorstores import Milvus
```
For a more detailed walkthrough of the Milvus wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/milvus.html)
For a more detailed walkthrough of the Milvus wrapper, see [this notebook](/docs/integrations/vectorstores/milvus.html)

@ -17,4 +17,4 @@ There exists a modelscope Embeddings wrapper, which you can access with
from langchain.embeddings import ModelScopeEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/modelscope_hub.html)
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/modelscope_hub.html)

@ -11,7 +11,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/modern_treasury.html).
See a [usage example](/docs/integrations/document_loaders/modern_treasury).
```python

@ -62,4 +62,4 @@ To import this vectorstore:
from langchain.vectorstores import MyScale
```
For a more detailed walkthrough of the MyScale wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/myscale.html)
For a more detailed walkthrough of the MyScale wrapper, see [this notebook](/docs/integrations/vectorstores/myscale.html)

@ -12,14 +12,14 @@ All instructions are in examples below.
We have two different loaders: `NotionDirectoryLoader` and `NotionDBLoader`.
See a [usage example for the NotionDirectoryLoader](/docs/modules/data_connection/document_loaders/integrations/notion.html).
See a [usage example for the NotionDirectoryLoader](/docs/integrations/document_loaders/notion.html).
```python
from langchain.document_loaders import NotionDirectoryLoader
```
See a [usage example for the NotionDBLoader](/docs/modules/data_connection/document_loaders/integrations/notiondb.html).
See a [usage example for the NotionDBLoader](/docs/integrations/document_loaders/notiondb.html).
```python

@ -10,7 +10,7 @@ All instructions are in examples below.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/obsidian.html).
See a [usage example](/docs/integrations/document_loaders/obsidian).
```python

@ -32,7 +32,7 @@ If you are using a model hosted on `Azure`, you should use a different wrapper for that:
```python
from langchain.llms import AzureOpenAI
```
For a more detailed walkthrough of the `Azure` wrapper, see [this notebook](/docs/modules/model_io/models/llms/integrations/azure_openai_example.html)
For a more detailed walkthrough of the `Azure` wrapper, see [this notebook](/docs/integrations/llms/azure_openai_example.html)
@ -41,7 +41,7 @@ For a more detailed walkthrough of the `Azure` wrapper, see [this notebook](/doc
```python
from langchain.embeddings import OpenAIEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/openai.html)
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/openai.html)
## Tokenizer
@ -58,7 +58,7 @@ For a more detailed walkthrough of this, see [this notebook](/docs/modules/data_
## Chain
See a [usage example](/docs/modules/chains/additional/moderation.html).
See a [usage example](/docs/modules/chains/additional/moderation).
```python
from langchain.chains import OpenAIModerationChain
@ -66,7 +66,7 @@ from langchain.chains import OpenAIModerationChain
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/chatgpt_loader.html).
See a [usage example](/docs/integrations/document_loaders/chatgpt_loader).
```python
from langchain.document_loaders.chatgpt import ChatGPTLoader
@ -74,7 +74,7 @@ from langchain.document_loaders.chatgpt import ChatGPTLoader
## Retriever
See a [usage example](/docs/modules/data_connection/retrievers/integrations/chatgpt-plugin.html).
See a [usage example](/docs/integrations/retrievers/chatgpt-plugin).
```python
from langchain.retrievers import ChatGPTPluginRetriever
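
As an aside to the moderation chain referenced in the OpenAI hunks above, a minimal hedged usage sketch; it assumes `OPENAI_API_KEY` is set and the input string is illustrative.

```python
from langchain.chains import OpenAIModerationChain

# Runs the input through the OpenAI moderation endpoint and returns it,
# or a policy-violation message if the content is flagged.
moderation_chain = OpenAIModerationChain()
print(moderation_chain.run("This is a perfectly harmless sentence."))
```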

@ -67,4 +67,4 @@ llm("What is the difference between a duck and a goose? And why there are so man
### Usage
For a more detailed walkthrough of the OpenLLM Wrapper, see the
[example notebook](/docs/modules/model_io/models/llms/integrations/openllm.html)
[example notebook](/docs/integrations/llms/openllm.html)

@ -18,4 +18,4 @@ To import this vectorstore:
from langchain.vectorstores import OpenSearchVectorSearch
```
For a more detailed walkthrough of the OpenSearch wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/opensearch.html)
For a more detailed walkthrough of the OpenSearch wrapper, see [this notebook](/docs/integrations/vectorstores/opensearch.html)

@ -29,7 +29,7 @@ There exists an OpenWeatherMapAPIWrapper utility which wraps this API. To import
from langchain.utilities.openweathermap import OpenWeatherMapAPIWrapper
```
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/modules/agents/tools/integrations/openweathermap.html).
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/integrations/tools/openweathermap.html).
### Tool

@ -26,4 +26,4 @@ from langchain.vectorstores.pgvector import PGVector
### Usage
For a more detailed walkthrough of the PGVector Wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/pgvector.html)
For a more detailed walkthrough of the PGVector Wrapper, see [this notebook](/docs/integrations/vectorstores/pgvector.html)

@ -19,4 +19,4 @@ whether for semantic search or example selection.
from langchain.vectorstores import Pinecone
```
For a more detailed walkthrough of the Pinecone vectorstore, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/pinecone.html)
For a more detailed walkthrough of the Pinecone vectorstore, see [this notebook](/docs/integrations/vectorstores/pinecone.html)

@ -46,4 +46,4 @@ This LLM is identical to the [OpenAI](/docs/ecosystem/integrations/openai.html)
- you can add `return_pl_id` when instantiating it to return a PromptLayer request id to use [while tracking requests](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
PromptLayer also provides native wrappers for [`PromptLayerChatOpenAI`](/docs/modules/model_io/models/chat/integrations/promptlayer_chatopenai.html) and `PromptLayerOpenAIChat`
PromptLayer also provides native wrappers for [`PromptLayerChatOpenAI`](/docs/integrations/chat/promptlayer_chatopenai.html) and `PromptLayerOpenAIChat`
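
A hedged sketch of the `return_pl_id` flag described above; it assumes the request id is surfaced in each generation's `generation_info` under `pl_request_id`, as the PromptLayer docs of that era describe, and the prompt is illustrative.

```python
from langchain.llms import PromptLayerOpenAI

llm = PromptLayerOpenAI(return_pl_id=True)
result = llm.generate(["Tell me a joke"])

for generation_list in result.generations:
    for generation in generation_list:
        # Use this id with the PromptLayer tracking APIs.
        print(generation.generation_info["pl_request_id"])
```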

@ -16,7 +16,7 @@ view these connections from the dashboard and retrieve data using the server-sid
1. Create an account in the [dashboard](https://dashboard.psychic.dev/).
2. Use the [react library](https://docs.psychic.dev/sidekick-link) to add the Psychic link modal to your frontend react app. You will use this to connect the SaaS apps.
3. Once you have created a connection, you can use the `PsychicLoader` by following the [example notebook](/docs/modules/data_connection/document_loaders/integrations/psychic.html)
3. Once you have created a connection, you can use the `PsychicLoader` by following the [example notebook](/docs/integrations/document_loaders/psychic.html)
## Advantages vs Other Document Loaders

@ -17,4 +17,4 @@ To import this vectorstore:
from langchain.vectorstores import Qdrant
```
For a more detailed walkthrough of the Qdrant wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/qdrant.html)
For a more detailed walkthrough of the Qdrant wrapper, see [this notebook](/docs/integrations/vectorstores/qdrant.html)

@ -14,7 +14,7 @@ Make a [Reddit Application](https://www.reddit.com/prefs/apps/) and initialize t
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/reddit.html).
See a [usage example](/docs/integrations/document_loaders/reddit).
```python

@ -92,7 +92,7 @@ To import this vectorstore:
from langchain.vectorstores import Redis
```
For a more detailed walkthrough of the Redis vectorstore wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/redis.html).
For a more detailed walkthrough of the Redis vectorstore wrapper, see [this notebook](/docs/integrations/vectorstores/redis.html).
### Retriever

@ -10,7 +10,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/roam.html).
See a [usage example](/docs/integrations/document_loaders/roam).
```python
from langchain.document_loaders import RoamLoader

@ -12,7 +12,7 @@ pip install rockset
## Vector Store
See a [usage example](/docs/modules/data_connection/vectorstores/integrations/rockset.html).
See a [usage example](/docs/integrations/vectorstores/rockset).
```python
from langchain.vectorstores import RocksetDB

@ -15,7 +15,7 @@ custom LLMs, you can use the `SelfHostedPipeline` parent class.
from langchain.llms import SelfHostedPipeline, SelfHostedHuggingFaceLLM
```
For a more detailed walkthrough of the Self-hosted LLMs, see [this notebook](/docs/modules/model_io/models/llms/integrations/runhouse.html)
For a more detailed walkthrough of the Self-hosted LLMs, see [this notebook](/docs/integrations/llms/runhouse.html)
## Self-hosted Embeddings
There are several ways to use self-hosted embeddings with LangChain via Runhouse.
@ -26,4 +26,4 @@ the `SelfHostedEmbedding` class.
from langchain.llms import SelfHostedPipeline, SelfHostedHuggingFaceLLM
```
For a more detailed walkthrough of the Self-hosted Embeddings, see [this notebook](/docs/modules/data_connection/text_embedding/integrations/self-hosted.html)
For a more detailed walkthrough of the Self-hosted Embeddings, see [this notebook](/docs/integrations/text_embedding/self-hosted.html)

@ -40,7 +40,7 @@ We have to set up the following required parameters of the `SagemakerEndpoint` call:
## LLM
See a [usage example](/docs/modules/model_io/models/llms/integrations/sagemaker.html).
See a [usage example](/docs/integrations/llms/sagemaker).
```python
from langchain import SagemakerEndpoint
@ -49,7 +49,7 @@ from langchain.llms.sagemaker_endpoint import LLMContentHandler
## Text Embedding Models
See a [usage example](/docs/modules/data_connection/text_embedding/integrations/sagemaker-endpoint.html).
See a [usage example](/docs/integrations/text_embedding/sagemaker-endpoint).
```python
from langchain.embeddings import SagemakerEndpointEmbeddings
from langchain.llms.sagemaker_endpoint import ContentHandlerBase
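
To make the required `SagemakerEndpoint` parameters above concrete, a hedged sketch with a content handler; the endpoint name, region, and the JSON payload/response shapes are assumptions that depend on the model actually deployed behind the endpoint.

```python
import json

from langchain import SagemakerEndpoint
from langchain.llms.sagemaker_endpoint import LLMContentHandler


class ContentHandler(LLMContentHandler):
    content_type = "application/json"
    accepts = "application/json"

    def transform_input(self, prompt: str, model_kwargs: dict) -> bytes:
        # Payload shape mirrors a typical Hugging Face text-generation endpoint.
        return json.dumps({"inputs": prompt, "parameters": model_kwargs}).encode("utf-8")

    def transform_output(self, output: bytes) -> str:
        response_json = json.loads(output.read().decode("utf-8"))
        return response_json[0]["generated_text"]


llm = SagemakerEndpoint(
    endpoint_name="my-text-generation-endpoint",  # placeholder
    region_name="us-east-1",                      # placeholder
    model_kwargs={"temperature": 0.7, "max_new_tokens": 64},
    content_handler=ContentHandler(),
)
print(llm("Write a one-line summary of Amazon SageMaker."))
```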

@ -17,7 +17,7 @@ There exists a SerpAPI utility which wraps this API. To import this utility:
from langchain.utilities import SerpAPIWrapper
```
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/modules/agents/tools/integrations/serpapi.html).
For a more detailed walkthrough of this wrapper, see [this notebook](/docs/integrations/tools/serpapi.html).
### Tool
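
As a hedged illustration of the SerpAPI hunk above: the wrapper reads `SERPAPI_API_KEY` from the environment and can be wrapped as a `Tool` for agents; the query, tool name, and description are illustrative.

```python
import os

from langchain.agents import Tool
from langchain.utilities import SerpAPIWrapper

os.environ["SERPAPI_API_KEY"] = "<your-serpapi-api-key>"

search = SerpAPIWrapper()
print(search.run("Obama's first name?"))

# Expose the wrapper as a Tool for an agent.
serp_tool = Tool(
    name="Search",
    description="Useful for answering questions about current events.",
    func=search.run,
)
```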

@ -13,7 +13,7 @@ pip install singlestoredb
## Vector Store
See a [usage example](/docs/modules/data_connection/vectorstores/integrations/singlestoredb.html).
See a [usage example](/docs/integrations/vectorstores/singlestoredb).
```python
from langchain.vectorstores import SingleStoreDB

@ -19,4 +19,4 @@ To import this vectorstore:
from langchain.vectorstores import SKLearnVectorStore
```
For a more detailed walkthrough of the SKLearnVectorStore wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/sklearn.html).
For a more detailed walkthrough of the SKLearnVectorStore wrapper, see [this notebook](/docs/integrations/vectorstores/sklearn.html).

@ -10,7 +10,7 @@ There isn't any special setup for it.
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/slack.html).
See a [usage example](/docs/integrations/document_loaders/slack).
```python
from langchain.document_loaders import SlackDirectoryLoader

@ -4,11 +4,11 @@
## Installation and Setup
See [setup instructions](/docs/modules/data_connection/document_loaders/integrations/spreedly.html).
See [setup instructions](/docs/integrations/document_loaders/spreedly.html).
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/spreedly.html).
See a [usage example](/docs/integrations/document_loaders/spreedly).
```python
from langchain.document_loaders import SpreedlyLoader

@ -14,7 +14,7 @@ pip install pymysql
## Vector Store
See a [usage example](/docs/modules/data_connection/vectorstores/integrations/starrocks.html).
See a [usage example](/docs/integrations/vectorstores/starrocks).
```python
from langchain.vectorstores import StarRocks

@ -5,11 +5,11 @@
## Installation and Setup
See [setup instructions](/docs/modules/data_connection/document_loaders/integrations/stripe.html).
See [setup instructions](/docs/integrations/document_loaders/stripe.html).
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/stripe.html).
See a [usage example](/docs/integrations/document_loaders/stripe).
```python
from langchain.document_loaders import StripeLoader

@ -19,4 +19,4 @@ To import this vectorstore:
from langchain.vectorstores import Tair
```
For a more detailed walkthrough of the Tair wrapper, see [this notebook](/docs/modules/data_connection/vectorstores/integrations/tair.html)
For a more detailed walkthrough of the Tair wrapper, see [this notebook](/docs/integrations/vectorstores/tair.html)

@ -5,11 +5,11 @@
## Installation and Setup
See [setup instructions](/docs/modules/data_connection/document_loaders/integrations/telegram.html).
See [setup instructions](/docs/integrations/document_loaders/telegram.html).
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/telegram.html).
See a [usage example](/docs/integrations/document_loaders/telegram).
```python
from langchain.document_loaders import TelegramChatFileLoader

@ -12,7 +12,7 @@ pip install tigrisdb openapi-schema-pydantic openai tiktoken
## Vector Store
See a [usage example](/docs/modules/data_connection/vectorstores/integrations/tigris.html).
See a [usage example](/docs/integrations/vectorstores/tigris).
```python
from langchain.vectorstores import Tigris

@ -9,7 +9,7 @@ We need the `API key`. See [instructions how to get it](https://2markdown.com/lo
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/tomarkdown.html).
See a [usage example](/docs/integrations/document_loaders/tomarkdown).
```python
from langchain.document_loaders import ToMarkdownLoader

@ -10,12 +10,12 @@
pip install py-trello beautifulsoup4
```
See [setup instructions](/docs/modules/data_connection/document_loaders/integrations/trello.html).
See [setup instructions](/docs/integrations/document_loaders/trello.html).
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/trello.html).
See a [usage example](/docs/integrations/document_loaders/trello).
```python
from langchain.document_loaders import TrelloLoader

@ -14,7 +14,7 @@ We must initialize the loader with the `Twitter API` token, and we need to set u
## Document Loader
See a [usage example](/docs/modules/data_connection/document_loaders/integrations/twitter.html).
See a [usage example](/docs/integrations/document_loaders/twitter).
```python
from langchain.document_loaders import TwitterTweetLoader

@ -15,7 +15,7 @@ pip install typesense openapi-schema-pydantic openai tiktoken
## Vector Store
See a [usage example](/docs/modules/data_connection/vectorstores/integrations/typesense.html).
See a [usage example](/docs/integrations/vectorstores/typesense).
```python
from langchain.vectorstores import Typesense

Some files were not shown because too many files have changed in this diff.