diff --git a/docs/docs/integrations/llms/anthropic.ipynb b/docs/docs/integrations/llms/anthropic.ipynb index 77c359e871..7db347cc1f 100644 --- a/docs/docs/integrations/llms/anthropic.ipynb +++ b/docs/docs/integrations/llms/anthropic.ipynb @@ -23,7 +23,7 @@ "# AnthropicLLM\n", "\n", ":::caution\n", - "You are currently on a page documenting the use of Anthropic legacy Claude 2 models as [text completion models](/docs/concepts/#llms). The latest and most popular Anthropic models are [chat completion models](/docs/concepts/#chat-models).\n", + "You are currently on a page documenting the use of Anthropic legacy Claude 2 models as [text completion models](/docs/concepts/#llms). The latest and most popular Anthropic models are [chat completion models](/docs/concepts/#chat-models), and the text completion models have been deprecated.\n", "\n", "You are probably looking for [this page instead](/docs/integrations/chat/anthropic/).\n", ":::\n", @@ -115,14 +115,6 @@ "\n", "chain.invoke({\"question\": \"What is LangChain?\"})" ] - }, - { - "cell_type": "code", - "execution_count": null, - "id": "a52f765c", - "metadata": {}, - "outputs": [], - "source": [] } ], "metadata": { diff --git a/docs/docs/integrations/llms/cohere.ipynb b/docs/docs/integrations/llms/cohere.ipynb index bfd83a92ac..5fa336b389 100644 --- a/docs/docs/integrations/llms/cohere.ipynb +++ b/docs/docs/integrations/llms/cohere.ipynb @@ -15,7 +15,14 @@ "\n", ">[Cohere](https://cohere.ai/about) is a Canadian startup that provides natural language processing models that help companies improve human-machine interactions.\n", "\n", - "Head to the [API reference](https://api.python.langchain.com/en/latest/llms/langchain_community.llms.cohere.Cohere.html) for detailed documentation of all attributes and methods." 
+ "Head to the [API reference](https://api.python.langchain.com/en/latest/llms/langchain_community.llms.cohere.Cohere.html) for detailed documentation of all attributes and methods.\n", + "\n", + "## Overview\n", + "### Integration details\n", + "\n", + "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.2/docs/integrations/llms/cohere/) | Package downloads | Package latest |\n", + "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n", + "| [Cohere](https://api.python.langchain.com/en/latest/llms/langchain_community.llms.cohere.Cohere.html) | [langchain_community](https://api.python.langchain.com/en/latest/community_api_reference.html) | ❌ | beta | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain_community?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain_community?style=flat-square&label=%20) |\n" ] }, { @@ -29,34 +36,43 @@ "\n", "The integration lives in the `langchain-community` package. We also need to install the `cohere` package itself. 
\n", "\n", - "```bash\n", - "pip install -U langchain-community langchain-cohere\n", - "```\n", + "### Credentials\n", "\n", - "We'll also need to get a [Cohere API key](https://cohere.com/) and set the `COHERE_API_KEY` environment variable:" + "We'll need to get a [Cohere API key](https://cohere.com/) and set the `COHERE_API_KEY` environment variable:" ] }, { "cell_type": "code", - "execution_count": 2, + "execution_count": null, "id": "3f5dc9d7-65e3-4b5b-9086-3327d016cfe0", "metadata": { "tags": [] }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - " ········\n" - ] - } - ], + "outputs": [], "source": [ "import getpass\n", "import os\n", "\n", - "os.environ[\"COHERE_API_KEY\"] = getpass.getpass()" + "if \"COHERE_API_KEY\" not in os.environ:\n", + " os.environ[\"COHERE_API_KEY\"] = getpass.getpass()" ] }, { + "cell_type": "markdown", + "id": "ff211537", + "metadata": {}, + "source": [ + "### Installation" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "318454f9", + "metadata": {}, + "outputs": [], + "source": [ + "%pip install -U langchain-community langchain-cohere" ] }, { @@ -83,7 +99,7 @@ "id": "0b4e02bf-5beb-48af-a2a2-52cbcd8ebed6", "metadata": {}, "source": [ - "## Usage\n", + "## Invocation\n", "\n", "Cohere supports all [LLM](/docs/how_to#llms) functionality:" ] }, @@ -199,6 +215,8 @@ "id": "39198f7d-6fc8-4662-954a-37ad38c4bec4", "metadata": {}, "source": [ + "## Chaining\n", + "\n", "You can also easily combine with a prompt template for easy structuring of user input. 
We can do this using [LCEL](/docs/concepts#langchain-expression-language-lcel)" ] }, @@ -237,12 +255,14 @@ ] }, { - "cell_type": "code", - "execution_count": null, - "id": "4797d719", + "cell_type": "markdown", + "id": "ac5fcbed", "metadata": {}, - "outputs": [], - "source": [] + "source": [ + "## API reference\n", + "\n", + "For detailed documentation of all `Cohere` LLM features and configurations, head to the API reference: https://api.python.langchain.com/en/latest/llms/langchain_community.llms.cohere.Cohere.html" + ] } ], "metadata": { diff --git a/docs/docs/integrations/llms/fireworks.ipynb b/docs/docs/integrations/llms/fireworks.ipynb index 532ab6874c..9127ff4be9 100644 --- a/docs/docs/integrations/llms/fireworks.ipynb +++ b/docs/docs/integrations/llms/fireworks.ipynb @@ -15,7 +15,14 @@ "\n", ">[Fireworks](https://app.fireworks.ai/) accelerates product development on generative AI by creating an innovative AI experiment and production platform. \n", "\n", - "This example goes over how to use LangChain to interact with `Fireworks` models." 
+ "This example goes over how to use LangChain to interact with `Fireworks` models.\n", + "\n", + "## Overview\n", + "### Integration details\n", + "\n", + "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.1/docs/integrations/llms/fireworks/) | Package downloads | Package latest |\n", + "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n", + "| [Fireworks](https://api.python.langchain.com/en/latest/llms/langchain_fireworks.llms.Fireworks.html#langchain_fireworks.llms.Fireworks) | [langchain_fireworks](https://api.python.langchain.com/en/latest/fireworks_api_reference.html) | ❌ | ❌ | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain_fireworks?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain_fireworks?style=flat-square&label=%20) |" ] }, { @@ -24,47 +31,72 @@ "id": "fb345268", "metadata": {}, "outputs": [], + "source": [] + }, + { + "cell_type": "markdown", + "id": "ccff689e", + "metadata": {}, "source": [ - "%pip install -qU langchain-fireworks" + "## Setup\n", + "\n", + "### Credentials\n", + "\n", + "Sign in to [Fireworks AI](http://fireworks.ai) to get an API key to access our models, and make sure it is set as the `FIREWORKS_API_KEY` environment variable.\n", + "Then set up your model using a model ID. If the model is not set, the default model is fireworks-llama-v2-7b-chat. See the full, most up-to-date model list on [fireworks.ai](https://fireworks.ai)." 
] }, { "cell_type": "code", - "execution_count": 2, - "id": "60b6dbb2", + "execution_count": 3, + "id": "9ca87a2e", "metadata": {}, "outputs": [], "source": [ - "from langchain_fireworks import Fireworks" + "import getpass\n", + "import os\n", + "\n", + "if \"FIREWORKS_API_KEY\" not in os.environ:\n", + " os.environ[\"FIREWORKS_API_KEY\"] = getpass.getpass(\"Fireworks API Key:\")" ] }, { "cell_type": "markdown", - "id": "ccff689e", + "id": "e42ced7e", "metadata": {}, "source": [ - "# Setup\n", + "### Installation\n", "\n", - "1. Make sure the `langchain-fireworks` package is installed in your environment.\n", - "2. Sign in to [Fireworks AI](http://fireworks.ai) for the an API Key to access our models, and make sure it is set as the `FIREWORKS_API_KEY` environment variable.\n", - "3. Set up your model using a model id. If the model is not set, the default model is fireworks-llama-v2-7b-chat. See the full, most up-to-date model list on [fireworks.ai](https://fireworks.ai)." + "You need to install the `langchain-fireworks` Python package to run the rest of this notebook." 
] }, { "cell_type": "code", - "execution_count": 3, - "id": "9ca87a2e", + "execution_count": null, + "id": "ca824723", + "metadata": {}, + "outputs": [], + "source": [ + "%pip install -qU langchain-fireworks" + ] + }, + { + "cell_type": "markdown", + "id": "acc24d0c", + "metadata": {}, + "source": [ + "## Instantiation" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d285fd7f", "metadata": {}, "outputs": [], "source": [ - "import getpass\n", - "import os\n", - "\n", "from langchain_fireworks import Fireworks\n", "\n", - "if \"FIREWORKS_API_KEY\" not in os.environ:\n", - " os.environ[\"FIREWORKS_API_KEY\"] = getpass.getpass(\"Fireworks API Key:\")\n", - "\n", "# Initialize a Fireworks model\n", "llm = Fireworks(\n", " model=\"accounts/fireworks/models/mixtral-8x7b-instruct\",\n", @@ -74,10 +106,10 @@ }, { "cell_type": "markdown", - "id": "acc24d0c", + "id": "a4c29f7b", "metadata": {}, "source": [ - "# Calling the Model Directly\n", + "## Invocation\n", "\n", "You can call the model directly with string prompts to get completions." ] @@ -98,11 +130,18 @@ } ], "source": [ - "# Single prompt\n", "output = llm.invoke(\"Who's the best quarterback in the NFL?\")\n", "print(output)" ] }, + { + "cell_type": "markdown", + "id": "b0283343", + "metadata": {}, + "source": [ + "### Invoking with multiple prompts" + ] + }, { "cell_type": "code", "execution_count": 5, @@ -128,6 +167,14 @@ "print(output.generations)" ] }, + { + "cell_type": "markdown", + "id": "f18f5717", + "metadata": {}, + "source": [ + "### Invoking with additional parameters" + ] + }, { "cell_type": "code", "execution_count": 7, @@ -158,7 +205,7 @@ "id": "137662a6", "metadata": {}, "source": [ - "# Simple Chain with Non-Chat Model" + "## Chaining" ] }, { @@ -206,6 +253,8 @@ "id": "d0a29826", "metadata": {}, "source": [ + "## Streaming\n", + "\n", "You can stream the output, if you want." 
] }, @@ -233,12 +282,14 @@ ] }, { - "cell_type": "code", - "execution_count": null, - "id": "fcc0eecb", + "cell_type": "markdown", + "id": "692c5e76", "metadata": {}, - "outputs": [], - "source": [] + "source": [ + "## API reference\n", + "\n", + "For detailed documentation of all `Fireworks` LLM features and configurations, head to the API reference: https://api.python.langchain.com/en/latest/llms/langchain_fireworks.llms.Fireworks.html#langchain_fireworks.llms.Fireworks" + ] } ], "metadata": { diff --git a/docs/docs/integrations/llms/openai.ipynb b/docs/docs/integrations/llms/openai.ipynb index 92dce56dac..398a48d26d 100644 --- a/docs/docs/integrations/llms/openai.ipynb +++ b/docs/docs/integrations/llms/openai.ipynb @@ -19,127 +19,139 @@ ] }, { - "cell_type": "code", - "execution_count": 1, - "id": "5d71df86-8a17-4283-83d7-4e46e7c06c44", - "metadata": { - "tags": [] - }, - "outputs": [], + "cell_type": "markdown", + "id": "74312161", + "metadata": {}, "source": [ - "# get a token: https://platform.openai.com/account/api-keys\n", + "## Overview\n", + "\n", + "### Integration details\n", + "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/v0.2/docs/integrations/llms/openai) | Package downloads | Package latest |\n", + "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n", + "| [OpenAI](https://api.python.langchain.com/en/latest/llms/langchain_openai.llms.base.OpenAI.html) | [langchain-openai](https://api.python.langchain.com/en/latest/openai_api_reference.html) | ❌ | beta | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain-openai?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain-openai?style=flat-square&label=%20) |\n", + "\n", + "\n", - "from getpass import getpass\n", + "## Setup\n", "\n", - "OPENAI_API_KEY = getpass()" + "To access OpenAI models you'll need to create an OpenAI account, get an API key, and install the `langchain-openai` integration 
package.\n", + "\n", + "### Credentials\n", + "\n", + "Head to https://platform.openai.com to sign up for OpenAI and generate an API key. Once you've done this, set the `OPENAI_API_KEY` environment variable:" ] }, { "cell_type": "code", - "execution_count": 2, - "id": "5472a7cd-af26-48ca-ae9b-5f6ae73c74d2", - "metadata": { - "tags": [] - }, - "outputs": [], + "execution_count": null, + "id": "efcdb2b6", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Enter your OpenAI API key: ········\n" + ] + } + ], "source": [ + "import getpass\n", "import os\n", "\n", - "os.environ[\"OPENAI_API_KEY\"] = OPENAI_API_KEY" + "if \"OPENAI_API_KEY\" not in os.environ:\n", + " os.environ[\"OPENAI_API_KEY\"] = getpass.getpass(\"Enter your OpenAI API key: \")" ] }, { "cell_type": "markdown", - "id": "129a3275", + "id": "f5d528fa", "metadata": {}, "source": [ - "Should you need to specify your organization ID, you can use the following cell. However, it is not required if you are only part of a single organization or intend to use your default organization. 
You can check your default organization [here](https://platform.openai.com/account/api-keys).\n", - "\n", - "To specify your organization, you can use this:\n", - "```python\n", - "OPENAI_ORGANIZATION = getpass()\n", - "\n", - "os.environ[\"OPENAI_ORGANIZATION\"] = OPENAI_ORGANIZATION\n", - "```" + "If you want automated, best-in-class tracing of your model calls, you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting the cell below:" ] }, { "cell_type": "code", - "execution_count": 1, - "id": "6fb585dd", - "metadata": { - "tags": [] - }, + "execution_count": null, + "id": "52fa46e8", + "metadata": {}, "outputs": [], "source": [ - "from langchain_core.prompts import PromptTemplate\n", - "from langchain_openai import OpenAI" + "# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n", + "# os.environ[\"LANGSMITH_TRACING\"] = \"true\"" ] }, { - "cell_type": "code", - "execution_count": 2, - "id": "035dea0f", - "metadata": { - "tags": [] - }, - "outputs": [], + "cell_type": "markdown", + "id": "0fad78d8", + "metadata": {}, "source": [ - "template = \"\"\"Question: {question}\n", - "\n", - "Answer: Let's think step by step.\"\"\"\n", + "### Installation\n", "\n", - "prompt = PromptTemplate.from_template(template)" + "The LangChain OpenAI integration lives in the `langchain-openai` package:" ] }, { "cell_type": "code", - "execution_count": 3, - "id": "3f3458d9", - "metadata": { - "tags": [] - }, + "execution_count": null, + "id": "2e300149", + "metadata": {}, "outputs": [], "source": [ - "llm = OpenAI()" + "%pip install -qU langchain-openai" ] }, { "cell_type": "markdown", - "id": "4fc152cd", + "id": "129a3275", "metadata": {}, "source": [ - "If you manually want to specify your OpenAI API key and/or organization ID, you can use the following:\n", + "Should you need to specify your organization ID, you can use the following cell. 
However, it is not required if you are only part of a single organization or intend to use your default organization. You can check your default organization [here](https://platform.openai.com/account/api-keys).\n", + "\n", + "To specify your organization, you can use this:\n", + "```python\n", + "OPENAI_ORGANIZATION = getpass.getpass()\n", + "\n", + "os.environ[\"OPENAI_ORGANIZATION\"] = OPENAI_ORGANIZATION\n", + "```\n", + "\n", + "## Instantiation\n", + "\n", + "Now we can instantiate our model object and generate completions:" ] }, { "cell_type": "code", - "execution_count": 4, - "id": "a641dbd9", + "execution_count": 1, + "id": "6fb585dd", "metadata": { "tags": [] }, "outputs": [], "source": [ - "llm_chain = prompt | llm" + "from langchain_openai import OpenAI\n", + "\n", + "llm = OpenAI()" ] }, { + "cell_type": "markdown", + "id": "464003c1", + "metadata": {}, + "source": [ + "## Invocation" ] }, { "cell_type": "code", "execution_count": 5, - "id": "9f844993", - "metadata": { - "tags": [] - }, + "id": "85b49da0", + "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "' Justin Bieber was born on March 1, 1994. The Super Bowl is typically played in late January or early February. So, we need to look at the Super Bowl from 1994. In 1994, the Super Bowl was Super Bowl XXVIII, played on January 30, 1994. The winning team of that Super Bowl was the Dallas Cowboys.'" + "\"\\n\\nI'm an AI language model created by OpenAI, so I don't have feelings or emotions. But thank you for asking! 
How can I assist you today?\"" ] }, "execution_count": 5, @@ -148,9 +160,37 @@ } ], "source": [ - "question = \"What NFL team won the Super Bowl in the year Justin Beiber was born?\"\n", + "llm.invoke(\"Hello how are you?\")" ] }, { + "cell_type": "markdown", + "id": "2b7e0dfc", + "metadata": {}, + "source": [ + "## Chaining" ] }, { "cell_type": "code", "execution_count": 3, "id": "a641dbd9", "metadata": { "tags": [] }, "outputs": [], "source": [ - "from langchain_core.prompts import PromptTemplate\n", "\n", - "llm_chain.invoke(question)" + "prompt = PromptTemplate.from_template(\"How to say {input} in {output_language}:\\n\")\n", + "\n", + "chain = prompt | llm\n", + "chain.invoke(\n", + " {\n", + " \"output_language\": \"German\",\n", + " \"input\": \"I love programming.\",\n", + " }\n", + ")" ] }, { @@ -158,6 +198,8 @@ "id": "58a9ddb1", "metadata": {}, "source": [ + "## Using a proxy\n", + "\n", "If you are behind an explicit proxy, you can specify the http_client to pass through" ] }, @@ -168,11 +210,24 @@ "metadata": {}, "outputs": [], "source": [ - "pip install httpx\n", + "%pip install httpx\n", "\n", "import httpx\n", "\n", - "openai = OpenAI(model_name=\"gpt-3.5-turbo-instruct\", http_client=httpx.Client(proxies=\"http://proxy.yourcompany.com:8080\"))" + "openai = OpenAI(\n", + " model_name=\"gpt-3.5-turbo-instruct\",\n", + " http_client=httpx.Client(proxies=\"http://proxy.yourcompany.com:8080\"),\n", + ")" ] }, { + "cell_type": "markdown", + "id": "73e207dd", + "metadata": {}, + "source": [ + "## API reference\n", + "\n", + "For detailed documentation of all `OpenAI` LLM features and configurations, head to the API reference: https://api.python.langchain.com/en/latest/llms/langchain_openai.llms.base.OpenAI.html" ] } ], "metadata": { @@ -192,7 +247,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.9.6" + "version": "3.11.9" }, "vscode": { "interpreter": { diff --git 
a/libs/cli/langchain_cli/integration_template/docs/llms.ipynb b/libs/cli/langchain_cli/integration_template/docs/llms.ipynb index 6ccf227fb7..98be1ccb7f 100644 --- a/libs/cli/langchain_cli/integration_template/docs/llms.ipynb +++ b/libs/cli/langchain_cli/integration_template/docs/llms.ipynb @@ -34,7 +34,7 @@ "\n", "## Setup\n", "\n", - "- [ ] TODO: Update with relevant info.\n", + "- TODO: Update with relevant info.\n", "\n", "To access __ModuleName__ models you'll need to create a/an __ModuleName__ account, get an API key, and install the `__package_name__` integration package.\n", "\n",