diff --git a/docs/modules/agents.rst b/docs/modules/agents.rst
index 87c831de..20592ceb 100644
--- a/docs/modules/agents.rst
+++ b/docs/modules/agents.rst
@@ -10,7 +10,7 @@
 but potentially an unknown chain that depends on the user's input.
 In these types of chains, there is a “agent” which has access to a suite of tools.
 Depending on the user input, the agent can then decide which, if any, of these tools to call.
 
-In this section of documentation, we first start with a Getting Started notebook to over over how to use all things related to agents in an end-to-end manner.
+In this section of documentation, we first start with a Getting Started notebook to cover how to use all things related to agents in an end-to-end manner.
 
 .. toctree::
    :maxdepth: 1
diff --git a/docs/modules/chains/examples/moderation.ipynb b/docs/modules/chains/examples/moderation.ipynb
index 4a372003..9846c072 100644
--- a/docs/modules/chains/examples/moderation.ipynb
+++ b/docs/modules/chains/examples/moderation.ipynb
@@ -1,6 +1,7 @@
 {
  "cells": [
   {
+   "attachments": {},
    "cell_type": "markdown",
    "id": "b83e61ed",
    "metadata": {},
@@ -13,7 +14,7 @@
    "In this notebook, we will show:\n",
    "\n",
    "1. How to run any piece of text through a moderation chain.\n",
-   "2. How to append a Moderation chain to a LLMChain."
+   "2. How to append a Moderation chain to an LLMChain."
   ]
  },
  {
diff --git a/docs/modules/models/llms/examples/llm_serialization.ipynb b/docs/modules/models/llms/examples/llm_serialization.ipynb
index 6dfde6e4..0edc32f9 100644
--- a/docs/modules/models/llms/examples/llm_serialization.ipynb
+++ b/docs/modules/models/llms/examples/llm_serialization.ipynb
@@ -27,7 +27,7 @@
   "metadata": {},
   "source": [
    "## Loading\n",
-   "First, lets go over loading a LLM from disk. LLMs can be saved on disk in two formats: json or yaml. No matter the extension, they are loaded in the same way."
+   "First, let's go over loading an LLM from disk. LLMs can be saved on disk in two formats: json or yaml. No matter the extension, they are loaded in the same way."
   ]
  },
  {
@@ -112,7 +112,7 @@
   "metadata": {},
   "source": [
    "## Saving\n",
-   "If you want to go from a LLM in memory to a serialized version of it, you can do so easily by calling the `.save` method. Again, this supports both json and yaml."
+   "If you want to go from an LLM in memory to a serialized version of it, you can do so easily by calling the `.save` method. Again, this supports both json and yaml."
   ]
  },
 {