diff --git a/docs/docs/integrations/chat/anthropic.ipynb b/docs/docs/integrations/chat/anthropic.ipynb index 00ad314f45..5c99085bb9 100644 --- a/docs/docs/integrations/chat/anthropic.ipynb +++ b/docs/docs/integrations/chat/anthropic.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "a016701c", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Anthropic\n", + "---" + ] + }, { "cell_type": "markdown", "id": "bf733a38-db84-4363-89e2-de6735c37230", "metadata": {}, "source": [ - "# Anthropic\n", + "# ChatAnthropic\n", "\n", "This notebook covers how to get started with Anthropic chat models." ] diff --git a/docs/docs/integrations/chat/anyscale.ipynb b/docs/docs/integrations/chat/anyscale.ipynb index 674549a656..3d2e9e80cb 100644 --- a/docs/docs/integrations/chat/anyscale.ipynb +++ b/docs/docs/integrations/chat/anyscale.ipynb @@ -1,12 +1,22 @@ { "cells": [ + { + "cell_type": "raw", + "id": "31895fc4", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Anyscale\n", + "---" + ] + }, { "attachments": {}, "cell_type": "markdown", "id": "642fd21c-600a-47a1-be96-6e1438b421a9", "metadata": {}, "source": [ - "# Anyscale\n", + "# ChatAnyscale\n", "\n", "This notebook demonstrates the use of `langchain.chat_models.ChatAnyscale` for [Anyscale Endpoints](https://endpoints.anyscale.com/).\n", "\n", @@ -33,7 +43,7 @@ "metadata": {}, "outputs": [ { - "name": "stdin", + "name": "stdout", "output_type": "stream", "text": [ " ········\n" diff --git a/docs/docs/integrations/chat/azure_chat_openai.ipynb b/docs/docs/integrations/chat/azure_chat_openai.ipynb index b4568ca2fd..a6bee8f49c 100644 --- a/docs/docs/integrations/chat/azure_chat_openai.ipynb +++ b/docs/docs/integrations/chat/azure_chat_openai.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "641f8cb0", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Azure OpenAI\n", + "---" + ] + }, { "cell_type": "markdown", "id": "38f26d7a", "metadata": {}, "source": [ - 
"# Azure OpenAI\n", + "# AzureChatOpenAI\n", "\n", ">[Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/overview) provides REST API access to OpenAI's powerful language models including the GPT-4, GPT-3.5-Turbo, and Embeddings model series. These models can be easily adapted to your specific task including but not limited to content generation, summarization, semantic search, and natural language to code translation. Users can access the service through REST APIs, Python SDK, or a web-based interface in the Azure OpenAI Studio.\n", "\n", diff --git a/docs/docs/integrations/chat/azureml_chat_endpoint.ipynb b/docs/docs/integrations/chat/azureml_chat_endpoint.ipynb index 4444f7fdf6..2e0a09aff4 100644 --- a/docs/docs/integrations/chat/azureml_chat_endpoint.ipynb +++ b/docs/docs/integrations/chat/azureml_chat_endpoint.ipynb @@ -1,10 +1,19 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Azure ML Endpoint\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# Azure ML Endpoint\n", + "# AzureMLChatOnlineEndpoint\n", "\n", ">[Azure Machine Learning](https://azure.microsoft.com/en-us/products/machine-learning/) is a platform used to build, train, and deploy machine learning models. Users can explore the types of models to deploy in the Model Catalog, which provides Azure Foundation Models and OpenAI Models. `Azure Foundation Models` include various open-source models and popular Hugging Face models. 
Users can also import models of their liking into AzureML.\n", ">\n", diff --git a/docs/docs/integrations/chat/baichuan.ipynb b/docs/docs/integrations/chat/baichuan.ipynb index 9f9376beee..462a6aa2bc 100644 --- a/docs/docs/integrations/chat/baichuan.ipynb +++ b/docs/docs/integrations/chat/baichuan.ipynb @@ -1,10 +1,19 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Baichuan Chat\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# Baichuan Chat\n", + "# ChatBaichuan\n", "\n", "Baichuan chat models API by Baichuan Intelligent Technology. For more information, see [https://platform.baichuan-ai.com/docs/api](https://platform.baichuan-ai.com/docs/api)" ] @@ -63,7 +72,9 @@ "outputs": [ { "data": { - "text/plain": "AIMessage(content='首先,我们需要确定闰年的二月有多少天。闰年的二月有29天。\\n\\n然后,我们可以计算你的月薪:\\n\\n日薪 = 月薪 / (当月天数)\\n\\n所以,你的月薪 = 日薪 * 当月天数\\n\\n将数值代入公式:\\n\\n月薪 = 8元/天 * 29天 = 232元\\n\\n因此,你在闰年的二月的月薪是232元。')" + "text/plain": [ + "AIMessage(content='首先,我们需要确定闰年的二月有多少天。闰年的二月有29天。\\n\\n然后,我们可以计算你的月薪:\\n\\n日薪 = 月薪 / (当月天数)\\n\\n所以,你的月薪 = 日薪 * 当月天数\\n\\n将数值代入公式:\\n\\n月薪 = 8元/天 * 29天 = 232元\\n\\n因此,你在闰年的二月的月薪是232元。')" + ] }, "execution_count": 3, "metadata": {}, @@ -76,16 +87,23 @@ }, { "cell_type": "markdown", - "source": [ - "## For ChatBaichuan with Streaming" - ], "metadata": { "collapsed": false - } + }, + "source": [ + "## For ChatBaichuan with Streaming" + ] }, { "cell_type": "code", "execution_count": 5, + "metadata": { + "ExecuteTime": { + "end_time": "2023-10-17T15:14:25.870044Z", + "start_time": "2023-10-17T15:14:25.863381Z" + }, + "collapsed": false + }, "outputs": [], "source": [ "chat = ChatBaichuan(\n", @@ -93,22 +111,24 @@ " baichuan_secret_key=\"YOUR_SECRET_KEY\",\n", " streaming=True,\n", ")" - ], - "metadata": { - "collapsed": false, - "ExecuteTime": { - "end_time": "2023-10-17T15:14:25.870044Z", - "start_time": "2023-10-17T15:14:25.863381Z" - } - } + ] }, { "cell_type": "code", 
"execution_count": 6, + "metadata": { + "ExecuteTime": { + "end_time": "2023-10-17T15:14:27.153546Z", + "start_time": "2023-10-17T15:14:25.868470Z" + }, + "collapsed": false + }, "outputs": [ { "data": { - "text/plain": "AIMessageChunk(content='首先,我们需要确定闰年的二月有多少天。闰年的二月有29天。\\n\\n然后,我们可以计算你的月薪:\\n\\n日薪 = 月薪 / (当月天数)\\n\\n所以,你的月薪 = 日薪 * 当月天数\\n\\n将数值代入公式:\\n\\n月薪 = 8元/天 * 29天 = 232元\\n\\n因此,你在闰年的二月的月薪是232元。')" + "text/plain": [ + "AIMessageChunk(content='首先,我们需要确定闰年的二月有多少天。闰年的二月有29天。\\n\\n然后,我们可以计算你的月薪:\\n\\n日薪 = 月薪 / (当月天数)\\n\\n所以,你的月薪 = 日薪 * 当月天数\\n\\n将数值代入公式:\\n\\n月薪 = 8元/天 * 29天 = 232元\\n\\n因此,你在闰年的二月的月薪是232元。')" + ] }, "execution_count": 6, "metadata": {}, @@ -117,14 +137,7 @@ ], "source": [ "chat([HumanMessage(content=\"我日薪8块钱,请问在闰年的二月,我月薪多少\")])" - ], - "metadata": { - "collapsed": false, - "ExecuteTime": { - "end_time": "2023-10-17T15:14:27.153546Z", - "start_time": "2023-10-17T15:14:25.868470Z" - } - } + ] } ], "metadata": { diff --git a/docs/docs/integrations/chat/baidu_qianfan_endpoint.ipynb b/docs/docs/integrations/chat/baidu_qianfan_endpoint.ipynb index 57749548a8..65f7826815 100644 --- a/docs/docs/integrations/chat/baidu_qianfan_endpoint.ipynb +++ b/docs/docs/integrations/chat/baidu_qianfan_endpoint.ipynb @@ -1,11 +1,20 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Baidu Qianfan\n", + "---" + ] + }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ - "# Baidu Qianfan\n", + "# QianfanChatEndpoint\n", "\n", "Baidu AI Cloud Qianfan Platform is a one-stop large model development and service operation platform for enterprise developers. 
Qianfan provides not only the Wenxin Yiyan (ERNIE-Bot) model and various third-party open-source models, but also a range of AI development tools and a complete development environment, making it easy for customers to develop large model applications.\n", "\n", diff --git a/docs/docs/integrations/chat/bedrock.ipynb b/docs/docs/integrations/chat/bedrock.ipynb index 02dfb5b9fb..3957c9c1e4 100644 --- a/docs/docs/integrations/chat/bedrock.ipynb +++ b/docs/docs/integrations/chat/bedrock.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "fbc66410", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Bedrock Chat\n", + "---" + ] + }, { "cell_type": "markdown", "id": "bf733a38-db84-4363-89e2-de6735c37230", "metadata": {}, "source": [ - "# Bedrock Chat\n", + "# BedrockChat\n", "\n", ">[Amazon Bedrock](https://aws.amazon.com/bedrock/) is a fully managed service that offers a choice of \n", "> high-performing foundation models (FMs) from leading AI companies like `AI21 Labs`, `Anthropic`, `Cohere`, \n", diff --git a/docs/docs/integrations/chat/cohere.ipynb b/docs/docs/integrations/chat/cohere.ipynb index e9b90af3d2..8f05b1c667 100644 --- a/docs/docs/integrations/chat/cohere.ipynb +++ b/docs/docs/integrations/chat/cohere.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "53fbf15f", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Cohere\n", + "---" + ] + }, { "cell_type": "markdown", "id": "bf733a38-db84-4363-89e2-de6735c37230", "metadata": {}, "source": [ - "# Cohere\n", + "# ChatCohere\n", "\n", "This notebook covers how to get started with Cohere chat models."
] diff --git a/docs/docs/integrations/chat/ernie.ipynb b/docs/docs/integrations/chat/ernie.ipynb index bcd28fd9cf..d98fcdb592 100644 --- a/docs/docs/integrations/chat/ernie.ipynb +++ b/docs/docs/integrations/chat/ernie.ipynb @@ -1,10 +1,19 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Ernie Bot Chat\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# ERNIE-Bot Chat\n", + "# ErnieBotChat\n", "\n", "[ERNIE-Bot](https://cloud.baidu.com/doc/WENXINWORKSHOP/s/jlil56u11) is a large language model developed by Baidu, covering a huge amount of Chinese data.\n", "This notebook covers how to get started with ErnieBot chat models.\n", diff --git a/docs/docs/integrations/chat/everlyai.ipynb b/docs/docs/integrations/chat/everlyai.ipynb index 3310f8f213..3f18b36f7e 100644 --- a/docs/docs/integrations/chat/everlyai.ipynb +++ b/docs/docs/integrations/chat/everlyai.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "5e45f35c", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: EverlyAI\n", + "---" + ] + }, { "cell_type": "markdown", "id": "642fd21c-600a-47a1-be96-6e1438b421a9", "metadata": {}, "source": [ - "# EverlyAI\n", + "# ChatEverlyAI\n", "\n", ">[EverlyAI](https://everlyai.xyz) allows you to run your ML models at scale in the cloud. 
It also provides API access to [several LLM models](https://everlyai.xyz).\n", "\n", diff --git a/docs/docs/integrations/chat/fireworks.ipynb b/docs/docs/integrations/chat/fireworks.ipynb index 6a5b0ad01d..a0a3932bc5 100644 --- a/docs/docs/integrations/chat/fireworks.ipynb +++ b/docs/docs/integrations/chat/fireworks.ipynb @@ -1,12 +1,22 @@ { "cells": [ + { + "cell_type": "raw", + "id": "529aeba9", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Fireworks\n", + "---" + ] + }, { "attachments": {}, "cell_type": "markdown", "id": "642fd21c-600a-47a1-be96-6e1438b421a9", "metadata": {}, "source": [ - "# Fireworks\n", + "# ChatFireworks\n", "\n", ">[Fireworks](https://app.fireworks.ai/) accelerates product development on generative AI by creating an innovative AI experiment and production platform. \n", "\n", diff --git a/docs/docs/integrations/chat/google_vertex_ai_palm.ipynb b/docs/docs/integrations/chat/google_vertex_ai_palm.ipynb index af44d316e9..436e2fd142 100644 --- a/docs/docs/integrations/chat/google_vertex_ai_palm.ipynb +++ b/docs/docs/integrations/chat/google_vertex_ai_palm.ipynb @@ -1,11 +1,20 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Google Cloud Vertex AI\n", + "---" + ] + }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ - "# Google Cloud Vertex AI \n", + "# ChatVertexAI\n", "\n", "Note: This is separate from the Google PaLM integration. Google has chosen to offer an enterprise version of PaLM through GCP, and this supports the models made available through there. 
\n", "\n", diff --git a/docs/docs/integrations/chat/hunyuan.ipynb b/docs/docs/integrations/chat/hunyuan.ipynb index 20779607dc..2cb334bfb9 100644 --- a/docs/docs/integrations/chat/hunyuan.ipynb +++ b/docs/docs/integrations/chat/hunyuan.ipynb @@ -1,10 +1,19 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Tencent Hunyuan\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# Tencent Hunyuan\n", + "# ChatHunyuan\n", "\n", "Hunyuan chat model API by Tencent. For more information, see [https://cloud.tencent.com/document/product/1729](https://cloud.tencent.com/document/product/1729)" ] @@ -54,7 +63,9 @@ "outputs": [ { "data": { - "text/plain": "AIMessage(content=\"J'aime programmer.\")" + "text/plain": [ + "AIMessage(content=\"J'aime programmer.\")" + ] }, "execution_count": 3, "metadata": {}, @@ -73,16 +84,23 @@ }, { "cell_type": "markdown", - "source": [ - "## For ChatHunyuan with Streaming" - ], "metadata": { "collapsed": false - } + }, + "source": [ + "## For ChatHunyuan with Streaming" + ] }, { "cell_type": "code", "execution_count": 2, + "metadata": { + "ExecuteTime": { + "end_time": "2023-10-19T10:20:41.507720Z", + "start_time": "2023-10-19T10:20:41.496456Z" + }, + "collapsed": false + }, "outputs": [], "source": [ "chat = ChatHunyuan(\n", @@ -91,22 +109,24 @@ " hunyuan_secret_key=\"YOUR_SECRET_KEY\",\n", " streaming=True,\n", ")" - ], - "metadata": { - "collapsed": false, - "ExecuteTime": { - "end_time": "2023-10-19T10:20:41.507720Z", - "start_time": "2023-10-19T10:20:41.496456Z" - } - } + ] }, { "cell_type": "code", "execution_count": 3, + "metadata": { + "ExecuteTime": { + "end_time": "2023-10-19T10:20:46.275673Z", + "start_time": "2023-10-19T10:20:44.241097Z" + }, + "collapsed": false + }, "outputs": [ { "data": { - "text/plain": "AIMessageChunk(content=\"J'aime programmer.\")" + "text/plain": [ + "AIMessageChunk(content=\"J'aime programmer.\")" + ] }, "execution_count": 3, 
"metadata": {}, @@ -121,26 +141,19 @@ " )\n", " ]\n", ")" - ], - "metadata": { - "collapsed": false, - "ExecuteTime": { - "end_time": "2023-10-19T10:20:46.275673Z", - "start_time": "2023-10-19T10:20:44.241097Z" - } - } + ] }, { "cell_type": "code", "execution_count": null, - "outputs": [], - "source": [], "metadata": { - "collapsed": false, "ExecuteTime": { "start_time": "2023-10-19T10:19:56.233477Z" - } - } + }, + "collapsed": false + }, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/docs/docs/integrations/chat/konko.ipynb b/docs/docs/integrations/chat/konko.ipynb index 2250a242e5..6e4e19bf3d 100644 --- a/docs/docs/integrations/chat/konko.ipynb +++ b/docs/docs/integrations/chat/konko.ipynb @@ -1,10 +1,19 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Konko\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# Konko\n", + "# ChatKonko\n", "\n", ">[Konko](https://www.konko.ai/) API is a fully managed Web API designed to help application developers:\n", "\n", diff --git a/docs/docs/integrations/chat/litellm.ipynb b/docs/docs/integrations/chat/litellm.ipynb index bd3c8ef282..a93d595bfb 100644 --- a/docs/docs/integrations/chat/litellm.ipynb +++ b/docs/docs/integrations/chat/litellm.ipynb @@ -1,12 +1,22 @@ { "cells": [ + { + "cell_type": "raw", + "id": "59148044", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: LiteLLM\n", + "---" + ] + }, { "attachments": {}, "cell_type": "markdown", "id": "bf733a38-db84-4363-89e2-de6735c37230", "metadata": {}, "source": [ - "# 🚅 LiteLLM\n", + "# ChatLiteLLM\n", "\n", "[LiteLLM](https://github.com/BerriAI/litellm) is a library that simplifies calling Anthropic, Azure, Huggingface, Replicate, etc. 
\n", "\n", diff --git a/docs/docs/integrations/chat/llama2_chat.ipynb b/docs/docs/integrations/chat/llama2_chat.ipynb index 48493973f6..98cce09dfa 100644 --- a/docs/docs/integrations/chat/llama2_chat.ipynb +++ b/docs/docs/integrations/chat/llama2_chat.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "7320f16b", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Llama 2 Chat\n", + "---" + ] + }, { "cell_type": "markdown", "id": "90a1faf2", "metadata": {}, "source": [ - "# Llama-2 Chat\n", + "# Llama2Chat\n", "\n", "This notebook shows how to augment Llama-2 `LLM`s with the `Llama2Chat` wrapper to support the [Llama-2 chat prompt format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2). Several `LLM` implementations in LangChain can be used as interface to Llama-2 chat models. These include [HuggingFaceTextGenInference](https://python.langchain.com/docs/integrations/llms/huggingface_textgen_inference), [LlamaCpp](https://python.langchain.com/docs/use_cases/question_answering/how_to/local_retrieval_qa), [GPT4All](https://python.langchain.com/docs/integrations/llms/gpt4all), ..., to mention a few examples. 
\n", "\n", @@ -721,7 +731,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.18" + "version": "3.11.4" } }, "nbformat": 4, diff --git a/docs/docs/integrations/chat/llama_api.ipynb b/docs/docs/integrations/chat/llama_api.ipynb index 329904a6bc..e75cd5b4b4 100644 --- a/docs/docs/integrations/chat/llama_api.ipynb +++ b/docs/docs/integrations/chat/llama_api.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "71b5cfca", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Llama API\n", + "---" + ] + }, { "cell_type": "markdown", "id": "90a1faf2", "metadata": {}, "source": [ - "# Llama API\n", + "# ChatLlamaAPI\n", "\n", "This notebook shows how to use LangChain with [LlamaAPI](https://llama-api.com/) - a hosted version of Llama2 that adds in support for function calling." ] diff --git a/docs/docs/integrations/chat/minimax.ipynb b/docs/docs/integrations/chat/minimax.ipynb index 8b4d683d0f..e10eeb0d2a 100644 --- a/docs/docs/integrations/chat/minimax.ipynb +++ b/docs/docs/integrations/chat/minimax.ipynb @@ -1,11 +1,20 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: MiniMax\n", + "---" + ] + }, { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ - "# MiniMax\n", + "# MiniMaxChat\n", "\n", "[Minimax](https://api.minimax.chat) is a Chinese startup that provides LLM service for companies and individuals.\n", "\n", diff --git a/docs/docs/integrations/chat/ollama.ipynb b/docs/docs/integrations/chat/ollama.ipynb index 7fb4b2984c..7f069112e6 100644 --- a/docs/docs/integrations/chat/ollama.ipynb +++ b/docs/docs/integrations/chat/ollama.ipynb @@ -1,10 +1,19 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Ollama\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# Ollama\n", + "# ChatOllama\n", "\n", "[Ollama](https://ollama.ai/) allows you to run 
open-source large language models, such as LLaMA2, locally.\n", "\n", diff --git a/docs/docs/integrations/chat/ollama_functions.ipynb b/docs/docs/integrations/chat/ollama_functions.ipynb index a4f365bf3a..707b8d74cc 100644 --- a/docs/docs/integrations/chat/ollama_functions.ipynb +++ b/docs/docs/integrations/chat/ollama_functions.ipynb @@ -1,10 +1,19 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Ollama Functions\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# Ollama Functions\n", + "# OllamaFunctions\n", "\n", "This notebook shows how to use an experimental wrapper around Ollama that gives it the same API as OpenAI Functions.\n", "\n", diff --git a/docs/docs/integrations/chat/openai.ipynb b/docs/docs/integrations/chat/openai.ipynb index 4c66ab7657..5fa123d0c9 100644 --- a/docs/docs/integrations/chat/openai.ipynb +++ b/docs/docs/integrations/chat/openai.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "afaf8039", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: OpenAI\n", + "---" + ] + }, { "cell_type": "markdown", "id": "e49f1e0d", "metadata": {}, "source": [ - "# OpenAI\n", + "# ChatOpenAI\n", "\n", "This notebook covers how to get started with OpenAI chat models." 
] diff --git a/docs/docs/integrations/chat/pai_eas_chat_endpoint.ipynb b/docs/docs/integrations/chat/pai_eas_chat_endpoint.ipynb index 55bde5e531..395d64775f 100644 --- a/docs/docs/integrations/chat/pai_eas_chat_endpoint.ipynb +++ b/docs/docs/integrations/chat/pai_eas_chat_endpoint.ipynb @@ -1,10 +1,19 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: AliCloud PAI EAS\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ - "# AliCloud PAI EAS\n", + "# PaiEasChatEndpoint\n", "Machine Learning Platform for AI of Alibaba Cloud is a machine learning or deep learning engineering platform intended for enterprises and developers. It provides easy-to-use, cost-effective, high-performance, and easy-to-scale plug-ins that can be applied to various industry scenarios. With over 140 built-in optimization algorithms, Machine Learning Platform for AI provides whole-process AI engineering capabilities including data labeling (PAI-iTAG), model building (PAI-Designer and PAI-DSW), model training (PAI-DLC), compilation optimization, and inference deployment (PAI-EAS). PAI-EAS supports different types of hardware resources, including CPUs and GPUs, and features high throughput and low latency. It allows you to deploy large-scale complex models with a few clicks and perform elastic scale-ins and scale-outs in real time. It also provides a comprehensive O&M and monitoring system." 
] }, diff --git a/docs/docs/integrations/chat/promptlayer_chatopenai.ipynb b/docs/docs/integrations/chat/promptlayer_chatopenai.ipynb index 4b20a5852d..623bfbe1ae 100644 --- a/docs/docs/integrations/chat/promptlayer_chatopenai.ipynb +++ b/docs/docs/integrations/chat/promptlayer_chatopenai.ipynb @@ -1,12 +1,22 @@ { "cells": [ + { + "cell_type": "raw", + "id": "ce3672d3", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: PromptLayer ChatOpenAI\n", + "---" + ] + }, { "attachments": {}, "cell_type": "markdown", "id": "959300d4", "metadata": {}, "source": [ - "# PromptLayer ChatOpenAI\n", + "# PromptLayerChatOpenAI\n", "\n", "This example showcases how to connect to [PromptLayer](https://www.promptlayer.com) to start recording your ChatOpenAI requests." ] diff --git a/docs/docs/integrations/chat/tongyi.ipynb b/docs/docs/integrations/chat/tongyi.ipynb index f3c64f2830..3de68b1e5c 100644 --- a/docs/docs/integrations/chat/tongyi.ipynb +++ b/docs/docs/integrations/chat/tongyi.ipynb @@ -1,5 +1,14 @@ { "cells": [ + { + "cell_type": "raw", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Tongyi Qwen\n", + "---" + ] + }, { "cell_type": "markdown", "metadata": { @@ -9,7 +18,7 @@ } }, "source": [ - "# Tongyi Qwen\n", + "# ChatTongyi\n", "Tongyi Qwen is a large language model developed by Alibaba's Damo Academy. It is capable of understanding user intent through natural language understanding and semantic analysis, based on user input in natural language. It provides services and assistance to users in different domains and tasks. 
By providing clear and detailed instructions, you can obtain results that better align with your expectations.\n", "In this notebook, we will introduce how to use langchain with [Tongyi](https://www.aliyun.com/product/dashscope) mainly in `Chat` corresponding\n", " to the package `langchain/chat_models` in langchain" @@ -41,7 +50,7 @@ }, "outputs": [ { - "name": "stdin", + "name": "stdout", "output_type": "stream", "text": [ " ········\n" diff --git a/docs/docs/integrations/chat/vllm.ipynb b/docs/docs/integrations/chat/vllm.ipynb index 5cc825d6d8..11023a201b 100644 --- a/docs/docs/integrations/chat/vllm.ipynb +++ b/docs/docs/integrations/chat/vllm.ipynb @@ -1,5 +1,15 @@ { "cells": [ + { + "cell_type": "raw", + "id": "eb65deaa", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: vLLM Chat\n", + "---" + ] + }, { "cell_type": "markdown", "id": "eb7e5679-aa06-47e4-a1a3-b6b70e604017", "metadata": {}, diff --git a/docs/docs/integrations/chat/volcengine_maas.ipynb b/docs/docs/integrations/chat/volcengine_maas.ipynb index 32dd0c16d0..e7c39c6b6f 100644 --- a/docs/docs/integrations/chat/volcengine_maas.ipynb +++ b/docs/docs/integrations/chat/volcengine_maas.ipynb @@ -1,5 +1,15 @@ { "cells": [ + { + "cell_type": "raw", + "id": "66107bdd", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: Volc Engine Maas\n", + "---" + ] + }, { "cell_type": "markdown", "id": "404758628c7b20f6", "metadata": { "collapsed": false }, "source": [ - "# Volc Engine Maas\n", + "# VolcEngineMaasChat\n", "\n", "This notebook provides you with a guide on how to get started with volc engine maas chat models."
] @@ -86,7 +96,9 @@ "outputs": [ { "data": { - "text/plain": "AIMessage(content='好的,这是一个笑话:\\n\\n为什么鸟儿不会玩电脑游戏?\\n\\n因为它们没有翅膀!')" + "text/plain": [ + "AIMessage(content='好的,这是一个笑话:\\n\\n为什么鸟儿不会玩电脑游戏?\\n\\n因为它们没有翅膀!')" + ] }, "execution_count": 26, "metadata": {}, @@ -141,7 +153,9 @@ "outputs": [ { "data": { - "text/plain": "AIMessage(content='好的,这是一个笑话:\\n\\n三岁的女儿说她会造句了,妈妈让她用“年轻”造句,女儿说:“妈妈减肥,一年轻了好几斤”。')" + "text/plain": [ + "AIMessage(content='好的,这是一个笑话:\\n\\n三岁的女儿说她会造句了,妈妈让她用“年轻”造句,女儿说:“妈妈减肥,一年轻了好几斤”。')" + ] }, "execution_count": 28, "metadata": {}, diff --git a/docs/docs/integrations/chat/yandex.ipynb b/docs/docs/integrations/chat/yandex.ipynb index 598c937956..0e1ced9b63 100644 --- a/docs/docs/integrations/chat/yandex.ipynb +++ b/docs/docs/integrations/chat/yandex.ipynb @@ -1,11 +1,21 @@ { "cells": [ + { + "cell_type": "raw", + "id": "b4154fbe", + "metadata": {}, + "source": [ + "---\n", + "sidebar_label: YandexGPT\n", + "---" + ] + }, { "cell_type": "markdown", "id": "af63c9db-e4bd-4d3b-a4d7-7927f5541734", "metadata": {}, "source": [ - "# YandexGPT\n", + "# ChatYandexGPT\n", "\n", "This notebook goes over how to use Langchain with [YandexGPT](https://cloud.yandex.com/en/services/yandexgpt) chat model.\n", "\n",
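The same two-step change repeats across every notebook in this diff: prepend a raw cell carrying Docusaurus `sidebar_label` front matter, and retitle the first markdown H1 to the integration's class name. A minimal sketch of how that edit could be scripted — the function name is illustrative (not part of this PR), and the cell dict mirrors the nbformat v4 JSON shown in the hunks above:

```python
def add_sidebar_label(nb: dict, label: str, class_name: str) -> dict:
    """Apply the mechanical edit this diff makes by hand to one notebook dict."""
    # Prepend a raw cell holding the Docusaurus front matter.
    front_matter = {
        "cell_type": "raw",
        "metadata": {},
        "source": ["---\n", f"sidebar_label: {label}\n", "---"],
    }
    nb["cells"].insert(0, front_matter)
    # Retitle the first markdown cell whose source starts with an H1.
    for cell in nb["cells"]:
        src = cell.get("source", [])
        if cell["cell_type"] == "markdown" and src and src[0].startswith("# "):
            src[0] = f"# {class_name}\n"  # e.g. "# ChatOpenAI\n"
            break
    return nb
```

Reading and writing the `.ipynb` around this is plain `json.load`/`json.dump`; nbformat serializes with `indent=1`, so dumping with the same indent keeps the resulting diffs small.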