diff --git a/docs/docs/integrations/chat/llamacpp.ipynb b/docs/docs/integrations/chat/llamacpp.ipynb
index 6ce74e1846..85aedff9cb 100644
--- a/docs/docs/integrations/chat/llamacpp.ipynb
+++ b/docs/docs/integrations/chat/llamacpp.ipynb
@@ -4,9 +4,23 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "# ChatLlamaCpp\n",
-   "\n",
-   "This notebook provides a quick overview for getting started with chat model intergrated with [llama cpp python](https://github.com/abetlen/llama-cpp-python)."
+   "# Llama.cpp\n",
+   "\n",
+   ">The [llama.cpp python](https://github.com/abetlen/llama-cpp-python) library provides simple Python bindings for `@ggerganov`'s\n",
+   ">[llama.cpp](https://github.com/ggerganov/llama.cpp).\n",
+   ">\n",
+   ">This package provides:\n",
+   ">\n",
+   "> - Low-level access to C API via ctypes interface.\n",
+   "> - High-level Python API for text completion\n",
+   "> - `OpenAI`-like API\n",
+   "> - `LangChain` compatibility\n",
+   "> - `LlamaIndex` compatibility\n",
+   "> - OpenAI compatible web server\n",
+   "> - Local Copilot replacement\n",
+   "> - Function Calling support\n",
+   "> - Vision API support\n",
+   "> - Multiple Models\n"
   ]
  },
  {
@@ -410,7 +424,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.11.8"
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
diff --git a/docs/docs/integrations/chat/octoai.ipynb b/docs/docs/integrations/chat/octoai.ipynb
index 8c2a1bc853..a0bbe98be8 100644
--- a/docs/docs/integrations/chat/octoai.ipynb
+++ b/docs/docs/integrations/chat/octoai.ipynb
@@ -99,7 +99,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.11.7"
+   "version": "3.10.12"
   },
   "vscode": {
    "interpreter": {
diff --git a/docs/docs/integrations/chat/perplexity.ipynb b/docs/docs/integrations/chat/perplexity.ipynb
index f74a410115..4b8e151750 100644
--- a/docs/docs/integrations/chat/perplexity.ipynb
+++ b/docs/docs/integrations/chat/perplexity.ipynb
@@ -17,7 +17,7 @@
   "source": [
    "# ChatPerplexity\n",
    "\n",
-   "This notebook covers how to get started with Perplexity chat models."
+   "This notebook covers how to get started with `Perplexity` chat models."
   ]
  },
  {
@@ -37,17 +37,31 @@
    "from langchain_core.prompts import ChatPromptTemplate"
   ]
  },
+ {
+  "cell_type": "markdown",
+  "id": "b26e2035-2f81-4451-ba44-fa2e2d5aeb62",
+  "metadata": {},
+  "source": [
+   "The code provided assumes that your PPLX_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:"
+  ]
+ },
+ {
+  "cell_type": "code",
+  "execution_count": null,
+  "id": "d986aac6-1bae-4608-8514-d3ba5b35b10e",
+  "metadata": {},
+  "outputs": [],
+  "source": [
+   "chat = ChatPerplexity(\n",
+   "    temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\"\n",
+   ")"
+  ]
+ },
  {
   "cell_type": "markdown",
   "id": "97a8ce3a",
   "metadata": {},
   "source": [
-   "The code provided assumes that your PPLX_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:\n",
-   "\n",
-   "```python\n",
-   "chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\")\n",
-   "```\n",
-   "\n",
    "You can check a list of available models [here](https://docs.perplexity.ai/docs/model-cards). For reproducibility, we can set the API key dynamically by taking it as an input in this notebook."
   ]
  },
@@ -221,7 +235,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.18"
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
diff --git a/docs/docs/integrations/providers/llamacpp.mdx b/docs/docs/integrations/providers/llamacpp.mdx
index 48b12913c2..de7d40a1c5 100644
--- a/docs/docs/integrations/providers/llamacpp.mdx
+++ b/docs/docs/integrations/providers/llamacpp.mdx
@@ -1,26 +1,50 @@
 # Llama.cpp
 
-This page covers how to use [llama.cpp](https://github.com/ggerganov/llama.cpp) within LangChain.
-It is broken into two parts: installation and setup, and then references to specific Llama-cpp wrappers.
+>The [llama.cpp python](https://github.com/abetlen/llama-cpp-python) library provides simple Python bindings for `@ggerganov`'s
+>[llama.cpp](https://github.com/ggerganov/llama.cpp).
+>
+>This package provides:
+>
+> - Low-level access to C API via ctypes interface.
+> - High-level Python API for text completion
+> - `OpenAI`-like API
+> - `LangChain` compatibility
+> - `LlamaIndex` compatibility
+> - OpenAI compatible web server
+> - Local Copilot replacement
+> - Function Calling support
+> - Vision API support
+> - Multiple Models
 
 ## Installation and Setup
-- Install the Python package with `pip install llama-cpp-python`
+
+- Install the Python package
+  ```bash
+  pip install llama-cpp-python
+  ```
 - Download one of the [supported models](https://github.com/ggerganov/llama.cpp#description) and convert them to the llama.cpp format per the [instructions](https://github.com/ggerganov/llama.cpp)
 
-## Wrappers
-### LLM
+## Chat models
+
+See a [usage example](/docs/integrations/chat/llamacpp).
+
+```python
+from langchain_community.chat_models import ChatLlamaCpp
+```
+
+## LLMs
+
+See a [usage example](/docs/integrations/llms/llamacpp).
-There exists a LlamaCpp LLM wrapper, which you can access with
 
 ```python
 from langchain_community.llms import LlamaCpp
 ```
 
-For a more detailed walkthrough of this, see [this notebook](/docs/integrations/llms/llamacpp)
-### Embeddings
+## Embedding models
+
+See a [usage example](/docs/integrations/text_embedding/llamacpp).
 
-There exists a LlamaCpp Embeddings wrapper, which you can access with
 
 ```python
 from langchain_community.embeddings import LlamaCppEmbeddings
 ```
-For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/llamacpp)
diff --git a/docs/docs/integrations/providers/maritalk.mdx b/docs/docs/integrations/providers/maritalk.mdx
new file mode 100644
index 0000000000..6b0dcda545
--- /dev/null
+++ b/docs/docs/integrations/providers/maritalk.mdx
@@ -0,0 +1,21 @@
+# MariTalk
+
+>[MariTalk](https://www.maritaca.ai/en) is an LLM-based chatbot trained to meet the needs of Brazil.
+
+## Installation and Setup
+
+You need to get a MariTalk API key.
+
+You also need to install the `httpx` Python package.
+
+```bash
+pip install httpx
+```
+
+## Chat models
+
+See a [usage example](/docs/integrations/chat/maritalk).
+
+```python
+from langchain_community.chat_models import ChatMaritalk
+```
diff --git a/docs/docs/integrations/providers/mlx.mdx b/docs/docs/integrations/providers/mlx.mdx
new file mode 100644
index 0000000000..dc859305cd
--- /dev/null
+++ b/docs/docs/integrations/providers/mlx.mdx
@@ -0,0 +1,34 @@
+# MLX
+
+>[MLX](https://ml-explore.github.io/mlx/build/html/index.html) is a `NumPy`-like array framework
+> designed for efficient and flexible machine learning on `Apple` silicon,
+> brought to you by `Apple machine learning research`.
+
+
+## Installation and Setup
+
+Install several Python packages:
+
+```bash
+pip install mlx-lm transformers huggingface_hub
+```
+
+
+## Chat models
+
+
+See a [usage example](/docs/integrations/chat/mlx).
+
+```python
+from langchain_community.chat_models.mlx import ChatMLX
+```
+
+## LLMs
+
+### MLX Local Pipelines
+
+See a [usage example](/docs/integrations/llms/mlx_pipelines).
+
+```python
+from langchain_community.llms.mlx_pipeline import MLXPipeline
+```
diff --git a/docs/docs/integrations/providers/octoai.mdx b/docs/docs/integrations/providers/octoai.mdx
new file mode 100644
index 0000000000..d4a064c7c7
--- /dev/null
+++ b/docs/docs/integrations/providers/octoai.mdx
@@ -0,0 +1,37 @@
+# OctoAI
+
+>[OctoAI](https://docs.octoai.cloud/docs) offers easy access to efficient compute
+> and enables users to integrate their choice of AI models into applications.
+> The `OctoAI` compute service helps you run, tune, and scale AI applications easily.
+
+
+## Installation and Setup
+
+- Install the `openai` Python package:
+  ```bash
+  pip install openai
+  ```
+- Register on `OctoAI` and get an API Token from [your OctoAI account page](https://octoai.cloud/settings).
+
+
+## Chat models
+
+See a [usage example](/docs/integrations/chat/octoai).
+
+```python
+from langchain_community.chat_models import ChatOctoAI
+```
+
+## LLMs
+
+See a [usage example](/docs/integrations/llms/octoai).
+
+```python
+from langchain_community.llms.octoai_endpoint import OctoAIEndpoint
+```
+
+## Embedding models
+
+```python
+from langchain_community.embeddings.octoai_embeddings import OctoAIEmbeddings
+```
diff --git a/docs/docs/integrations/providers/perplexity.mdx b/docs/docs/integrations/providers/perplexity.mdx
new file mode 100644
index 0000000000..9e89994f54
--- /dev/null
+++ b/docs/docs/integrations/providers/perplexity.mdx
@@ -0,0 +1,25 @@
+# Perplexity
+
+>[Perplexity](https://www.perplexity.ai/pro) is the most powerful way to search
+> the internet with unlimited Pro Search, upgraded AI models, unlimited file upload,
+> image generation, and API credits.
+>
+> You can check a [list of available models](https://docs.perplexity.ai/docs/model-cards).
+
+## Installation and Setup
+
+Install the `openai` Python package:
+
+```bash
+pip install openai
+```
+
+Get your API key from [here](https://docs.perplexity.ai/docs/getting-started).
+
+## Chat models
+
+See a [usage example](/docs/integrations/chat/perplexity).
+
+```python
+from langchain_community.chat_models import ChatPerplexity
+```
diff --git a/docs/docs/integrations/text_embedding/llamacpp.ipynb b/docs/docs/integrations/text_embedding/llamacpp.ipynb
index d57f58716a..375ee76f6b 100644
--- a/docs/docs/integrations/text_embedding/llamacpp.ipynb
+++ b/docs/docs/integrations/text_embedding/llamacpp.ipynb
@@ -4,9 +4,23 @@
   "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "# Llama-cpp\n",
+   "# Llama.cpp\n",
    "\n",
-   "This notebook goes over how to use Llama-cpp embeddings within LangChain"
+   ">The [llama.cpp python](https://github.com/abetlen/llama-cpp-python) library provides simple Python bindings for `@ggerganov`'s\n",
+   ">[llama.cpp](https://github.com/ggerganov/llama.cpp).\n",
+   ">\n",
+   ">This package provides:\n",
+   ">\n",
+   "> - Low-level access to C API via ctypes interface.\n",
+   "> - High-level Python API for text completion\n",
+   "> - `OpenAI`-like API\n",
+   "> - `LangChain` compatibility\n",
+   "> - `LlamaIndex` compatibility\n",
+   "> - OpenAI compatible web server\n",
+   "> - Local Copilot replacement\n",
+   "> - Function Calling support\n",
+   "> - Vision API support\n",
+   "> - Multiple Models\n"
   ]
  },
  {
@@ -80,9 +94,9 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.1"
+   "version": "3.10.12"
   }
  },
  "nbformat": 4,
-  "nbformat_minor": 2
+  "nbformat_minor": 4
 }