docs: integrations updates 17 (#27015)

Added missed provider pages. Added missed descriptions and links.
I fixed the Ipex-LLM titles, so the ToC is now sorted properly for these
titles.

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
Leonid Ganeline 2024-10-08 10:03:18 -07:00 committed by GitHub
parent 8d27325dbc
commit ea9a59bcf5
6 changed files with 126 additions and 25 deletions

View File

@ -0,0 +1,44 @@
# BAAI
>[Beijing Academy of Artificial Intelligence (BAAI) (Wikipedia)](https://en.wikipedia.org/wiki/Beijing_Academy_of_Artificial_Intelligence),
> also known as the `Zhiyuan Institute`, is a Chinese non-profit artificial
> intelligence (AI) research laboratory. `BAAI` conducts AI research
> and is dedicated to promoting collaboration between academia and industry,
> fostering top talent, and maintaining a focus on long-term research into
> the fundamentals of AI technology. As a collaborative hub, BAAI's founding
> members include leading AI companies, universities, and research institutes.
## Embedding Models
### HuggingFaceBgeEmbeddings
>[BGE models on HuggingFace](https://huggingface.co/BAAI/bge-large-en-v1.5)
> are among [the best open-source embedding models](https://huggingface.co/spaces/mteb/leaderboard).
See a [usage example](/docs/integrations/text_embedding/bge_huggingface).
```python
from langchain_community.embeddings import HuggingFaceBgeEmbeddings
```
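BGE embeddings are typically used unit-normalized, so relevance between a query and a document reduces to a dot product of their vectors. A minimal pure-Python sketch of that comparison step (illustrative toy vectors, no model download; real BGE vectors are higher-dimensional, e.g. 1024 for `bge-large-en-v1.5`):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity; equals the plain dot product when inputs are unit-normalized."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Illustrative 3-d "embeddings" standing in for model output
query_vec = [0.6, 0.8, 0.0]
doc_vec = [0.6, 0.8, 0.0]
other_vec = [0.0, 0.0, 1.0]

print(cosine_similarity(query_vec, doc_vec))    # identical direction -> 1.0
print(cosine_similarity(query_vec, other_vec))  # orthogonal -> 0.0
```

In practice you would pass the vectors returned by the embedding class's `embed_query` and `embed_documents` methods instead of these toy lists.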
### IpexLLMBgeEmbeddings
>[IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch
> library for running LLMs on Intel CPUs and GPUs (e.g., a local PC with an iGPU,
> or discrete GPUs such as Arc, Flex and Max) with very low latency.
See a [usage example running model on Intel CPU](/docs/integrations/text_embedding/ipex_llm).
See a [usage example running model on Intel GPU](/docs/integrations/text_embedding/ipex_llm_gpu).
```python
from langchain_community.embeddings import IpexLLMBgeEmbeddings
```
### QuantizedBgeEmbeddings
See a [usage example](/docs/integrations/text_embedding/itrex).
```python
from langchain_community.embeddings import QuantizedBgeEmbeddings
```

View File

@ -1,20 +1,35 @@
# Jina
# Jina AI
This page covers how to use the Jina Embeddings within LangChain.
It is broken into two parts: installation and setup, and then references to specific Jina wrappers.
>[Jina AI](https://jina.ai/about-us) is a search AI company. `Jina` helps businesses and developers unlock multimodal data with better search.
## Installation and Setup
- Get a Jina AI API token from [here](https://jina.ai/embeddings/) and set it as an environment variable (`JINA_API_TOKEN`).
There exists a Jina Embeddings wrapper, which you can access with
## Chat Models
```python
from langchain_community.embeddings import JinaEmbeddings
# you can pass jina_api_key; if none is passed, it is taken from the `JINA_API_TOKEN` environment variable
embeddings = JinaEmbeddings(jina_api_key='jina_**', model_name='jina-embeddings-v2-base-en')
from langchain_community.chat_models import JinaChat
```
See a [usage example](/docs/integrations/chat/jinachat).
## Embedding Models
You can check the list of available models [here](https://jina.ai/embeddings/).
For a more detailed walkthrough, see [this notebook](/docs/integrations/text_embedding/jina).
```python
from langchain_community.embeddings import JinaEmbeddings
```
See a [usage example](/docs/integrations/text_embedding/jina).
## Document Transformers
### Jina Rerank
```python
from langchain_community.document_compressors import JinaRerank
```
See a [usage example](/docs/integrations/document_transformers/jina_rerank).
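A reranker scores each candidate document against the query and returns the candidates reordered by score. A minimal pure-Python sketch of that reordering step (toy word-overlap scoring, not the Jina Rerank API or model):

```python
def rerank(query: str, documents: list[str], top_n: int = 2) -> list[tuple[str, float]]:
    """Toy reranker: score = fraction of query words appearing in the document,
    returning the top_n documents sorted by descending score."""
    query_words = set(query.lower().split())

    def score(doc: str) -> float:
        doc_words = set(doc.lower().split())
        return len(query_words & doc_words) / len(query_words)

    scored = [(doc, score(doc)) for doc in documents]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_n]

docs = [
    "Jina AI builds multimodal search tools",
    "Bananas are rich in potassium",
    "Search and retrieval with embeddings",
]
print(rerank("multimodal search with Jina", docs))
```

`JinaRerank` applies the same pattern, but with a learned cross-encoder relevance model in place of the toy scoring function.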

View File

@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Local BGE Embeddings with IPEX-LLM on Intel CPU\n",
"# IPEX-LLM: Local BGE Embeddings on Intel CPU\n",
"\n",
"> [IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max) with very low latency.\n",
"\n",
@ -92,10 +92,24 @@
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"name": "python"
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}

View File

@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Local BGE Embeddings with IPEX-LLM on Intel GPU\n",
"# IPEX-LLM: Local BGE Embeddings on Intel GPU\n",
"\n",
"> [IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max) with very low latency.\n",
"\n",
@ -155,10 +155,24 @@
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"name": "python"
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}

View File

@ -5,7 +5,11 @@
"id": "1c0cf975",
"metadata": {},
"source": [
"# Jina"
"# Jina\n",
"\n",
"You can check the list of available models from [here](https://jina.ai/embeddings/).\n",
"\n",
"## Installation and setup"
]
},
{
@ -231,7 +235,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.13"
"version": "3.10.12"
}
},
"nbformat": 4,

View File

@ -209,15 +209,25 @@
},
{
"cell_type": "markdown",
"id": "5f5751e3-2e98-485f-8164-db8094039c25",
"id": "4e3fd064-aa86-448d-8db3-3c55eaa5bc15",
"metadata": {},
"source": [
"API references:\n",
"\n",
"- [QuerySQLDataBaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.QuerySQLDataBaseTool.html)\n",
"- [InfoSQLDatabaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.InfoSQLDatabaseTool.html)\n",
"- [ListSQLDatabaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.ListSQLDatabaseTool.html)\n",
"- [QuerySQLCheckerTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.QuerySQLCheckerTool.html)"
"You can use the individual tools directly:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7fa8d00c-750c-4803-9b66-057d12b26b06",
"metadata": {},
"outputs": [],
"source": [
"from langchain_community.tools.sql_database.tool import (\n",
" InfoSQLDatabaseTool,\n",
" ListSQLDatabaseTool,\n",
" QuerySQLCheckerTool,\n",
" QuerySQLDataBaseTool,\n",
")"
]
},
{
@ -604,7 +614,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.4"
"version": "3.10.12"
}
},
"nbformat": 4,