cleanup warnings (#8379)

Bagatur 2023-07-27 13:43:05 -07:00 committed by GitHub
parent 41524304bf
commit 55beab326c
10 changed files with 10 additions and 10 deletions


@@ -1,6 +1,6 @@
# Sequential
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
The next step after calling a language model is to make a series of calls to a language model. This is particularly useful when you want to take the output from one call and use it as the input to another.
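Independent of any particular LangChain class, the pattern can be sketched in a few lines of plain Python (the `fake_llm` function below is a stand-in for a real model call, not part of the library):

```python
# A minimal sketch of a sequential chain: the output of one step feeds
# the input of the next. fake_llm stands in for a real language-model
# call and is purely illustrative.
def fake_llm(prompt: str) -> str:
    return f"response to: {prompt}"

def sequential_chain(steps, initial_input: str) -> str:
    """Run each step on the previous step's output."""
    text = initial_input
    for step in steps:
        text = step(text)
    return text

steps = [
    lambda t: fake_llm(f"Write a play synopsis about {t}"),
    lambda t: fake_llm(f"Write a review of this synopsis: {t}"),
]
result = sequential_chain(steps, "a robot detective")
```

LangChain's sequential chain classes wrap this same loop, adding prompt templating and input/output variable bookkeeping.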


@@ -4,7 +4,7 @@ If you're building with LLMs, at some point something will break, and you'll nee
Here are a few different tools and functionalities to aid in debugging.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
## Tracing


@@ -2,7 +2,7 @@
This notebook covers how to load data from an .ipynb notebook into a format suitable for LangChain.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
```python
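# A .ipynb notebook is plain JSON, so the essence of loading one can be
# sketched with the standard library alone. This illustrates the file
# format; it is not LangChain's actual NotebookLoader implementation.
import json

def load_notebook_cells(raw_ipynb: str) -> list:
    """Return the source text of each cell in a notebook JSON string."""
    nb = json.loads(raw_ipynb)
    return ["".join(cell.get("source", [])) for cell in nb.get("cells", [])]

raw = '{"cells": [{"cell_type": "code", "source": ["print(1)"]}], "nbformat": 4}'
cells = load_notebook_cells(raw)
```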


@@ -10,7 +10,7 @@ The LLMAgent is used in an AgentExecutor. This AgentExecutor can largely be thou
In this notebook we walk through how to create a custom LLM agent.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
## Set up environment


@@ -10,7 +10,7 @@ The LLMAgent is used in an AgentExecutor. This AgentExecutor can largely be thou
In this notebook we walk through how to create a custom LLM agent.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
## Set up environment


@@ -3,7 +3,7 @@ We'll show:
1. How to run any piece of text through a moderation chain.
2. How to append a Moderation chain to an LLMChain.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
```python
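# Sketch of appending a moderation step to a chain: the chain's output is
# passed through a moderation check before being returned. The keyword
# check below is a stand-in for a real moderation endpoint, and all names
# here are illustrative, not LangChain's API.
def fake_chain(prompt: str) -> str:
    return f"output for {prompt}"

BANNED = {"violence", "hate"}

def moderate(text: str) -> str:
    if any(word in text.lower() for word in BANNED):
        return "Text was found that violates the moderation policy."
    return text

def chain_with_moderation(prompt: str) -> str:
    return moderate(fake_chain(prompt))
```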


@@ -5,7 +5,7 @@ One of the core utility classes underpinning most (if not all) memory modules is
You may want to use this class directly if you are managing memory outside of a chain.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
```python
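# Sketch of a ChatMessageHistory-style class: a thin wrapper that records
# alternating user/AI messages. The method and field names are
# illustrative and may differ from LangChain's actual API.
class MessageHistory:
    def __init__(self):
        self.messages = []

    def add_user_message(self, text: str) -> None:
        self.messages.append(("human", text))

    def add_ai_message(self, text: str) -> None:
        self.messages.append(("ai", text))

history = MessageHistory()
history.add_user_message("hi!")
history.add_ai_message("whats up?")
```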


@@ -1,7 +1,7 @@
### Use Case
In this tutorial, we'll configure few-shot examples for self-ask with search.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
## Using an example set
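
The notion of an example set can be sketched in plain Python: a list of example dicts, a shared per-example template, and the formatted examples prefixed to the new question (the template and field names below are illustrative, not LangChain's `FewShotPromptTemplate` API):

```python
# Each example is a dict; a shared template turns it into text, and the
# formatted examples are joined ahead of the new question (few-shot style).
examples = [
    {"question": "Who lived longer, A or B?", "answer": "A"},
    {"question": "When was the founder of X born?", "answer": "1950"},
]

example_template = "Question: {question}\nAnswer: {answer}"

def few_shot_prompt(examples, new_question: str) -> str:
    shots = "\n\n".join(example_template.format(**ex) for ex in examples)
    return f"{shots}\n\nQuestion: {new_question}\nAnswer:"

prompt = few_shot_prompt(examples, "Who was the father of C?")
```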


@@ -77,7 +77,7 @@ For example, in OpenAI [Chat Completion API](https://platform.openai.com/docs/gu
LangChain provides several prompt templates to make constructing and working with prompts easy. You are encouraged to use these chat-related prompt templates instead of `PromptTemplate` when querying chat models, to fully exploit the potential of the underlying chat model.
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
```python
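# Sketch of a chat prompt template: role-tagged message templates are
# formatted into the message list a chat model expects. This mirrors the
# idea of a chat prompt template without depending on LangChain itself;
# the function and template names are illustrative.
system_template = "You are a helpful assistant that translates {input_language} to {output_language}."
human_template = "{text}"

def format_chat_prompt(**kwargs) -> list:
    # str.format ignores unused keyword arguments, so each message
    # template picks out only the variables it needs.
    return [
        {"role": "system", "content": system_template.format(**kwargs)},
        {"role": "user", "content": human_template.format(**kwargs)},
    ]

messages = format_chat_prompt(
    input_language="English", output_language="French", text="I love programming."
)
```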


@@ -2,7 +2,7 @@
One common use case for wanting to partial a prompt template is if you get some of the variables before others. For example, suppose you have a prompt template that requires two variables, `foo` and `baz`. If you get the `foo` value early on in the chain, but the `baz` value later, it can be annoying to wait until you have both variables in the same place to pass them to the prompt template. Instead, you can partial the prompt template with the `foo` value, and then pass the partialed prompt template along and just use that. Below is an example of doing this:
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! Instead, edit the notebook w/the location & name as this file. -->
```python
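# Sketch of partialing a prompt template in plain Python: bind the `foo`
# value early and supply `baz` later. functools.partial on str.format
# captures the idea; LangChain's PromptTemplate offers a similar
# .partial() method.
from functools import partial

template = "{foo}{baz}"
partial_prompt = partial(template.format, foo="foo")
# Later, when the baz value becomes available:
result = partial_prompt(baz="baz")
```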