From ea9a59bcf52eda76126a1f713f5eb4924494b215 Mon Sep 17 00:00:00 2001
From: Leonid Ganeline
Date: Tue, 8 Oct 2024 10:03:18 -0700
Subject: [PATCH] docs: `integrations` updates 17 (#27015)

Added missing provider pages.
Added missing descriptions and links.
Fixed the `Ipex-LLM` titles, so the ToC is now sorted properly for these titles.

---------

Co-authored-by: Erick Friis
---
 docs/docs/integrations/providers/baai.mdx | 44 +++++++++++++++++++
 docs/docs/integrations/providers/jina.mdx | 33 ++++++++++----
 .../text_embedding/ipex_llm.ipynb | 20 +++++++--
 .../text_embedding/ipex_llm_gpu.ipynb | 20 +++++++--
 .../integrations/text_embedding/jina.ipynb | 8 +++-
 .../integrations/tools/sql_database.ipynb | 26 +++++++----
 6 files changed, 126 insertions(+), 25 deletions(-)
 create mode 100644 docs/docs/integrations/providers/baai.mdx

diff --git a/docs/docs/integrations/providers/baai.mdx b/docs/docs/integrations/providers/baai.mdx
new file mode 100644
index 0000000000..58ff1152ef
--- /dev/null
+++ b/docs/docs/integrations/providers/baai.mdx
@@ -0,0 +1,44 @@
+# BAAI
+
+>[Beijing Academy of Artificial Intelligence (BAAI) (Wikipedia)](https://en.wikipedia.org/wiki/Beijing_Academy_of_Artificial_Intelligence),
+> also known as `Zhiyuan Institute`, is a Chinese non-profit artificial
+> intelligence (AI) research laboratory. `BAAI` conducts AI research
+> and is dedicated to promoting collaboration between academia and industry,
+> as well as fostering top talent and focusing on long-term research into
+> the fundamentals of AI technology. As a collaborative hub, BAAI's founding
+> members include leading AI companies, universities, and research institutes.
+
+
+## Embedding Models
+
+### HuggingFaceBgeEmbeddings
+
+>[BGE models on HuggingFace](https://huggingface.co/BAAI/bge-large-en-v1.5)
+> are among [the best open-source embedding models](https://huggingface.co/spaces/mteb/leaderboard).
+
+See a [usage example](/docs/integrations/text_embedding/bge_huggingface).
+
+```python
+from langchain_community.embeddings import HuggingFaceBgeEmbeddings
+```
+
+### IpexLLMBgeEmbeddings
+
+>[IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch
+> library for running LLMs on Intel CPU and GPU (e.g., a local PC with an iGPU,
+> or a discrete GPU such as Arc, Flex, and Max) with very low latency.
+
+See a [usage example running the model on an Intel CPU](/docs/integrations/text_embedding/ipex_llm).
+See a [usage example running the model on an Intel GPU](/docs/integrations/text_embedding/ipex_llm_gpu).
+
+```python
+from langchain_community.embeddings import IpexLLMBgeEmbeddings
+```
+
+### QuantizedBgeEmbeddings
+
+See a [usage example](/docs/integrations/text_embedding/itrex).
+
+```python
+from langchain_community.embeddings import QuantizedBgeEmbeddings
+```
diff --git a/docs/docs/integrations/providers/jina.mdx b/docs/docs/integrations/providers/jina.mdx
index 057ace079f..5262889c37 100644
--- a/docs/docs/integrations/providers/jina.mdx
+++ b/docs/docs/integrations/providers/jina.mdx
@@ -1,20 +1,35 @@
-# Jina
+# Jina AI
 
-This page covers how to use the Jina Embeddings within LangChain.
-It is broken into two parts: installation and setup, and then references to specific Jina wrappers.
+>[Jina AI](https://jina.ai/about-us) is a search AI company. `Jina` helps businesses and developers unlock multimodal data with better search.
 
 ## Installation and Setup
 
 - Get a Jina AI API token from [here](https://jina.ai/embeddings/) and set it as an environment variable (`JINA_API_TOKEN`)
 
-There exists a Jina Embeddings wrapper, which you can access with
+## Chat Models
 
 ```python
-from langchain_community.embeddings import JinaEmbeddings
-
-# you can pas jina_api_key, if none is passed it will be taken from `JINA_API_TOKEN` environment variable
-embeddings = JinaEmbeddings(jina_api_key='jina_**', model_name='jina-embeddings-v2-base-en')
+from langchain_community.chat_models import JinaChat
 ```
 
+See a [usage example](/docs/integrations/chat/jinachat).
+
+## Embedding Models
+
 You can check the list of available models from [here](https://jina.ai/embeddings/)
-For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/jina)
+
+```python
+from langchain_community.embeddings import JinaEmbeddings
+```
+
+See a [usage example](/docs/integrations/text_embedding/jina).
+
+## Document Transformers
+
+### Jina Rerank
+
+```python
+from langchain_community.document_compressors import JinaRerank
+```
+
+See a [usage example](/docs/integrations/document_transformers/jina_rerank).
+
diff --git a/docs/docs/integrations/text_embedding/ipex_llm.ipynb b/docs/docs/integrations/text_embedding/ipex_llm.ipynb
index e8d7c2dd0e..17f9fca8ca 100644
--- a/docs/docs/integrations/text_embedding/ipex_llm.ipynb
+++ b/docs/docs/integrations/text_embedding/ipex_llm.ipynb
@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Local BGE Embeddings with IPEX-LLM on Intel CPU\n",
+    "# IPEX-LLM: Local BGE Embeddings on Intel CPU\n",
     "\n",
     "> [IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max) with very low latency.\n",
     "\n",
@@ -92,10 +92,24 @@
    }
   ],
  "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
   "language_info": {
-   "name": "python"
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
- "nbformat_minor": 2
+ "nbformat_minor": 4
 }
diff --git a/docs/docs/integrations/text_embedding/ipex_llm_gpu.ipynb b/docs/docs/integrations/text_embedding/ipex_llm_gpu.ipynb
index ca3ad124e6..70498d979d 100644
--- a/docs/docs/integrations/text_embedding/ipex_llm_gpu.ipynb
+++ b/docs/docs/integrations/text_embedding/ipex_llm_gpu.ipynb
@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Local BGE Embeddings with IPEX-LLM on Intel GPU\n",
+    "# IPEX-LLM: Local BGE Embeddings on Intel GPU\n",
     "\n",
     "> [IPEX-LLM](https://github.com/intel-analytics/ipex-llm) is a PyTorch library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max) with very low latency.\n",
     "\n",
@@ -155,10 +155,24 @@
    }
   ],
  "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3 (ipykernel)",
+   "language": "python",
+   "name": "python3"
+  },
   "language_info": {
-   "name": "python"
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
- "nbformat_minor": 2
+ "nbformat_minor": 4
 }
diff --git a/docs/docs/integrations/text_embedding/jina.ipynb b/docs/docs/integrations/text_embedding/jina.ipynb
index e77fe213d3..aa75ba8766 100644
--- a/docs/docs/integrations/text_embedding/jina.ipynb
+++ b/docs/docs/integrations/text_embedding/jina.ipynb
@@ -5,7 +5,11 @@
    "id": "1c0cf975",
    "metadata": {},
    "source": [
-    "# Jina"
+    "# Jina\n",
+    "\n",
+    "You can check the list of available models from [here](https://jina.ai/embeddings/).\n",
+    "\n",
+    "## Installation and Setup"
    ]
   },
   {
@@ -231,7 +235,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.13"
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
diff --git a/docs/docs/integrations/tools/sql_database.ipynb b/docs/docs/integrations/tools/sql_database.ipynb
index ef3d63e2b2..f5be93f2c5 100644
--- a/docs/docs/integrations/tools/sql_database.ipynb
+++ b/docs/docs/integrations/tools/sql_database.ipynb
@@ -209,15 +209,25 @@
   },
   {
    "cell_type": "markdown",
-   "id": "5f5751e3-2e98-485f-8164-db8094039c25",
+   "id": "4e3fd064-aa86-448d-8db3-3c55eaa5bc15",
    "metadata": {},
    "source": [
-    "API references:\n",
-    "\n",
-    "- [QuerySQLDataBaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.QuerySQLDataBaseTool.html)\n",
-    "- [InfoSQLDatabaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.InfoSQLDatabaseTool.html)\n",
-    "- [ListSQLDatabaseTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.ListSQLDatabaseTool.html)\n",
-    "- [QuerySQLCheckerTool](https://python.langchain.com/api_reference/community/tools/langchain_community.tools.sql_database.tool.QuerySQLCheckerTool.html)"
+    "You can use the individual tools directly:"
    ]
   },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "7fa8d00c-750c-4803-9b66-057d12b26b06",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from langchain_community.tools.sql_database.tool import (\n",
+    "    InfoSQLDatabaseTool,\n",
+    "    ListSQLDatabaseTool,\n",
+    "    QuerySQLCheckerTool,\n",
+    "    QuerySQLDataBaseTool,\n",
+    ")"
+   ]
+  },
   {
@@ -604,7 +614,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.10.4"
+   "version": "3.10.12"
  }
 },
 "nbformat": 4,
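Reviewer note on the `sql_database.ipynb` change: the new cell imports the four tools but does not show them in action. For readers without a LangChain environment, the division of labor among `ListSQLDatabaseTool`, `InfoSQLDatabaseTool`, and `QuerySQLDataBaseTool` can be sketched with the standard-library `sqlite3` module. This is an illustrative analogue only — `list_tables`, `table_info`, and `run_query` are hypothetical helpers, not LangChain APIs, and the table data is invented:

```python
import sqlite3

# Tiny in-memory database standing in for the notebook's example DB
# (table name and rows here are made up for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Artist (ArtistId INTEGER PRIMARY KEY, Name TEXT)")
conn.execute("INSERT INTO Artist (Name) VALUES ('AC/DC'), ('Accept')")

def list_tables(conn):
    # Rough analogue of ListSQLDatabaseTool: enumerate user tables.
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
    ).fetchall()
    return [name for (name,) in rows]

def table_info(conn, table):
    # Rough analogue of InfoSQLDatabaseTool: return the CREATE TABLE DDL.
    row = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND name = ?",
        (table,),
    ).fetchone()
    return row[0] if row else None

def run_query(conn, query):
    # Rough analogue of QuerySQLDataBaseTool: execute and fetch all rows.
    return conn.execute(query).fetchall()

print(list_tables(conn))  # ['Artist']
print(run_query(conn, "SELECT Name FROM Artist ORDER BY Name"))  # [('AC/DC',), ('Accept',)]
```

The real tools add agent-facing niceties on top of this pattern (string-formatted results, error handling, and, for `QuerySQLCheckerTool`, an LLM pass that validates the query before execution).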