mirror of https://github.com/hwchase17/langchain
synced 2024-11-13 19:10:52 +00:00
docs: integrations updates 17 (#27015)

Added missing provider pages. Added missing descriptions and links. Fixed the IPEX-LLM titles so the ToC now sorts properly for these titles.

Co-authored-by: Erick Friis <erick@langchain.dev>
This commit is contained in:
parent: 8d27325dbc
commit: ea9a59bcf5
docs/docs/integrations/providers/baai.mdx (new file, 44 lines added)
@ -0,0 +1,44 @@
# BAAI

>[Beijing Academy of Artificial Intelligence (BAAI) (Wikipedia)](https://en.wikipedia.org/wiki/Beijing_Academy_of_Artificial_Intelligence),
> also known as `Zhiyuan Institute`, is a Chinese non-profit artificial
> intelligence (AI) research laboratory. `BAAI` conducts AI research
> and is dedicated to promoting collaboration among academia and industry,
> as well as fostering top talent and a focus on long-term research on
> the fundamentals of AI technology. As a collaborative hub, BAAI's founding
> members include leading AI companies, universities, and research institutes.

## Embedding Models

### HuggingFaceBgeEmbeddings

>[BGE models on HuggingFace](https://huggingface.co/BAAI/bge-large-en-v1.5)
> are among [the best open-source embedding models](https://huggingface.co/spaces/mteb/leaderboard).

See a [usage example](/docs/integrations/text_embedding/bge_huggingface).

```python
from langchain_community.embeddings import HuggingFaceBgeEmbeddings
```

### IpexLLMBgeEmbeddings

>[IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch
> library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU,
> discrete GPU such as Arc, Flex and Max) with very low latency.

See a [usage example running the model on an Intel CPU](/docs/integrations/text_embedding/ipex_llm).
See a [usage example running the model on an Intel GPU](/docs/integrations/text_embedding/ipex_llm_gpu).

```python
from langchain_community.embeddings import IpexLLMBgeEmbeddings
```

### QuantizedBgeEmbeddings

See a [usage example](/docs/integrations/text_embedding/itrex).

```python
from langchain_community.embeddings import QuantizedBgeEmbeddings
```
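As a quick illustration of the wrapper this new page documents, here is a minimal sketch of using `HuggingFaceBgeEmbeddings`. It is an editorial addition, not part of the commit; it assumes `langchain-community` and `sentence-transformers` are installed, and the keyword arguments shown are common choices rather than requirements.

```python
# Minimal sketch (assumes langchain-community + sentence-transformers are installed).
# The model name is the one linked from the page; device and normalization settings
# are illustrative, not required.
from langchain_community.embeddings import HuggingFaceBgeEmbeddings

embeddings = HuggingFaceBgeEmbeddings(
    model_name="BAAI/bge-large-en-v1.5",
    model_kwargs={"device": "cpu"},
    encode_kwargs={"normalize_embeddings": True},
)

query_vector = embeddings.embed_query("What does BAAI research?")  # one query -> list[float]
doc_vectors = embeddings.embed_documents(["first passage", "second passage"])  # batch of texts
```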
@ -1,20 +1,35 @@
-# Jina
+# Jina AI

-This page covers how to use the Jina Embeddings within LangChain.
-It is broken into two parts: installation and setup, and then references to specific Jina wrappers.
+>[Jina AI](https://jina.ai/about-us) is a search AI company. `Jina` helps businesses and developers unlock multimodal data with a better search.

 ## Installation and Setup
 - Get a Jina AI API token from [here](https://jina.ai/embeddings/) and set it as an environment variable (`JINA_API_TOKEN`)

-There exists a Jina Embeddings wrapper, which you can access with
+## Chat Models

 ```python
-from langchain_community.embeddings import JinaEmbeddings
-
-# you can pass jina_api_key; if none is passed it will be taken from the `JINA_API_TOKEN` environment variable
-embeddings = JinaEmbeddings(jina_api_key='jina_**', model_name='jina-embeddings-v2-base-en')
+from langchain_community.chat_models import JinaChat
 ```

+See a [usage example](/docs/integrations/chat/jinachat).
+
+## Embedding Models
+
 You can check the list of available models from [here](https://jina.ai/embeddings/)

-For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/jina)
+```python
+from langchain_community.embeddings import JinaEmbeddings
+```
+
+See a [usage example](/docs/integrations/text_embedding/jina).
+
+## Document Transformers
+
+### Jina Rerank
+
+```python
+from langchain_community.document_compressors import JinaRerank
+```
+
+See a [usage example](/docs/integrations/document_transformers/jina_rerank).
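Since the reworked page now points at three separate integrations, here is a brief editorial sketch (not part of the diff) of how they are commonly used together. It assumes `langchain-community` is installed and that Jina API credentials are already configured in the environment; arguments beyond those shown on the page are illustrative.

```python
# Illustrative sketch only (not from the commit). Assumes `langchain-community`
# is installed and Jina API credentials are configured via the environment.
from langchain_community.chat_models import JinaChat
from langchain_community.document_compressors import JinaRerank
from langchain_community.embeddings import JinaEmbeddings
from langchain_core.documents import Document

# Embedding model: if no jina_api_key is passed, the JINA_API_TOKEN environment
# variable is used (per the setup note above).
embeddings = JinaEmbeddings(model_name="jina-embeddings-v2-base-en")
query_vector = embeddings.embed_query("What does Jina AI build?")

# Chat model: chat models accept a plain string, which is wrapped as a human message.
chat = JinaChat()
reply = chat.invoke("Summarize multimodal search in one sentence.")

# Reranker: reorders candidate documents by relevance to a query.
reranker = JinaRerank()
ranked = reranker.compress_documents(
    documents=[
        Document(page_content="Jina AI builds embedding and reranking models."),
        Document(page_content="An unrelated note about cooking."),
    ],
    query="What does Jina AI build?",
)
```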
@ -4,7 +4,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
-  "# Local BGE Embeddings with IPEX-LLM on Intel CPU\n",
+  "# IPEX-LLM: Local BGE Embeddings on Intel CPU\n",
   "\n",
   "> [IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max) with very low latency.\n",
   "\n",
@ -92,10 +92,24 @@
   }
  ],
  "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
   "language_info": {
-   "name": "python"
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
- "nbformat_minor": 2
+ "nbformat_minor": 4
 }
@ -4,7 +4,7 @@
  "cell_type": "markdown",
  "metadata": {},
  "source": [
-  "# Local BGE Embeddings with IPEX-LLM on Intel GPU\n",
+  "# IPEX-LLM: Local BGE Embeddings on Intel GPU\n",
   "\n",
   "> [IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max) with very low latency.\n",
   "\n",
@ -155,10 +155,24 @@
   }
  ],
  "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
   "language_info": {
-   "name": "python"
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
- "nbformat_minor": 2
+ "nbformat_minor": 4
 }
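Both retitled notebooks document the same `IpexLLMBgeEmbeddings` class, differing only in the target device. A minimal editorial sketch follows (not part of the diff); it assumes `langchain-community` and `ipex-llm` are installed, and the keyword arguments mirror common BGE settings rather than a required configuration.

```python
# Illustrative sketch (not from the commit). Assumes `langchain-community` and
# `ipex-llm` are installed; the arguments shown are examples only.
from langchain_community.embeddings import IpexLLMBgeEmbeddings

embeddings = IpexLLMBgeEmbeddings(
    model_name="BAAI/bge-large-en-v1.5",
    model_kwargs={"device": "cpu"},  # the GPU notebook's case would target "xpu" instead
    encode_kwargs={"normalize_embeddings": True},
)

vector = embeddings.embed_query("IPEX-LLM accelerates embedding models on Intel hardware.")
```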
@ -5,7 +5,11 @@
  "id": "1c0cf975",
  "metadata": {},
  "source": [
-  "# Jina"
+  "# Jina\n",
+  "\n",
+  "You can check the list of available models from [here](https://jina.ai/embeddings/).\n",
+  "\n",
+  "## Installation and setup"
  ]
 },
 {
@ -231,7 +235,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-  "version": "3.9.13"
+  "version": "3.10.12"
  }
 },
 "nbformat": 4,
@ -209,15 +209,25 @@
 },
 {
  "cell_type": "markdown",
-  "id": "5f5751e3-2e98-485f-8164-db8094039c25",
+  "id": "4e3fd064-aa86-448d-8db3-3c55eaa5bc15",
  "metadata": {},
  "source": [
-   "API references:\n",
-   "\n",
-   "- [QuerySQLDataBaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.QuerySQLDataBaseTool.html)\n",
-   "- [InfoSQLDatabaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.InfoSQLDatabaseTool.html)\n",
-   "- [ListSQLDatabaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.ListSQLDatabaseTool.html)\n",
-   "- [QuerySQLCheckerTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.QuerySQLCheckerTool.html)"
+   "You can use the individual tools directly:"
+  ]
+ },
+ {
+  "cell_type": "code",
+  "execution_count": null,
+  "id": "7fa8d00c-750c-4803-9b66-057d12b26b06",
+  "metadata": {},
+  "outputs": [],
+  "source": [
+   "from langchain_community.tools.sql_database.tool import (\n",
+   "    InfoSQLDatabaseTool,\n",
+   "    ListSQLDatabaseTool,\n",
+   "    QuerySQLCheckerTool,\n",
+   "    QuerySQLDataBaseTool,\n",
+   ")"
  ]
 },
 {
@ -604,7 +614,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-  "version": "3.10.4"
+  "version": "3.10.12"
  }
 },
 "nbformat": 4,
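Since the new cell only shows the imports, here is a short editorial sketch (not part of the diff) of how these SQL tools are typically instantiated against a `SQLDatabase`. The SQLite URI is a placeholder, and the example assumes `langchain-community` plus a SQLAlchemy-compatible driver are installed.

```python
# Editorial sketch, not part of the diff. Assumes `langchain-community` is installed
# and that "sqlite:///example.db" is a placeholder path to an existing SQLite file.
from langchain_community.tools.sql_database.tool import (
    InfoSQLDatabaseTool,
    ListSQLDatabaseTool,
    QuerySQLDataBaseTool,
)
from langchain_community.utilities import SQLDatabase

db = SQLDatabase.from_uri("sqlite:///example.db")  # placeholder connection string

list_tables = ListSQLDatabaseTool(db=db)
table_info = InfoSQLDatabaseTool(db=db)
run_query = QuerySQLDataBaseTool(db=db)

tables = list_tables.invoke("")       # comma-separated table names
schema = table_info.invoke(tables)    # CREATE statements plus sample rows
rows = run_query.invoke("SELECT 1;")  # raw query result as a string
# QuerySQLCheckerTool additionally needs an LLM (llm=...) to validate queries before running them.
```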