forked from Archives/langchain
Compare commits: main...searx-asyn (306 commits)
CONTRIBUTING.md → .github/CONTRIBUTING.md (vendored, 2 changes)

@@ -73,6 +73,8 @@ poetry install -E all

 This will install all requirements for running the package, examples, linting, formatting, tests, and coverage. Note the `-E all` flag will install all optional dependencies necessary for integration testing.

+❗Note: If you're running Poetry 1.4.1 and receive a `WheelFileValidationError` for `debugpy` during installation, you can try either downgrading to Poetry 1.4.0 or disabling "modern installation" (`poetry config installer.modern-installation false`) and re-install requirements. See [this `debugpy` issue](https://github.com/microsoft/debugpy/issues/1246) for more details.
+
 Now, you should be able to run the common tasks in the following section.

 ## ✅Common Tasks
.gitignore (vendored, 7 changes)

@@ -106,6 +106,7 @@ celerybeat.pid

 # Environments
 .env
+.envrc
 .venv
 .venvs
 env/

@@ -134,3 +135,9 @@ dmypy.json

 # macOS display setting files
 .DS_Store
+
+# Wandb directory
+wandb/
+
+# asdf tool versions
+.tool-versions
@@ -32,7 +32,7 @@ This library is aimed at assisting in the development of those types of applicat

 **🤖 Agents**

-- [Documentation](https://langchain.readthedocs.io/en/latest/use_cases/agents.html)
+- [Documentation](https://langchain.readthedocs.io/en/latest/modules/agents.html)
 - End-to-end Example: [GPT+WolframAlpha](https://huggingface.co/spaces/JavaFXpert/Chat-GPT-LangChain)

 ## 📖 Documentation

@@ -42,7 +42,7 @@ Please see [here](https://langchain.readthedocs.io/en/latest/?) for full documen

 - Getting started (installation, setting up the environment, simple examples)
 - How-To examples (demos, integrations, helper functions)
 - Reference (full API docs)
-Resources (high-level explanation of core concepts)
+- Resources (high-level explanation of core concepts)

 ## 🚀 What can this help with?

@@ -79,4 +79,4 @@ For more information on these concepts, please see our [full documentation](http

 As an open source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infra, or better documentation.

-For detailed information on how to contribute, see [here](CONTRIBUTING.md).
+For detailed information on how to contribute, see [here](.github/CONTRIBUTING.md).
@@ -23,13 +23,14 @@ with open("../pyproject.toml") as f:

 # -- Project information -----------------------------------------------------

 project = "🦜🔗 LangChain"
-copyright = "2022, Harrison Chase"
+copyright = "2023, Harrison Chase"
 author = "Harrison Chase"

 version = data["tool"]["poetry"]["version"]
 release = version

 html_title = project + " " + version
+html_last_updated_fmt = "%b %d, %Y"

 # -- General configuration ---------------------------------------------------

@@ -45,6 +46,7 @@ extensions = [

 "sphinx.ext.viewcode",
 "sphinxcontrib.autodoc_pydantic",
 "myst_nb",
+"sphinx_copybutton",
 "sphinx_panels",
 "IPython.sphinxext.ipython_console_highlighting",
 ]
@@ -37,3 +37,6 @@ A minimal example on how to run LangChain on Vercel using Flask.

 ## [SteamShip](https://github.com/steamship-core/steamship-langchain/)
 This repository contains LangChain adapters for Steamship, enabling LangChain developers to rapidly deploy their apps on Steamship.
 This includes: production-ready endpoints, horizontal scaling across dependencies, persistent storage of app state, multi-tenancy support, etc.
+
+## [Langchain-serve](https://github.com/jina-ai/langchain-serve)
+This repository allows users to serve local chains and agents as RESTful, gRPC, or Websocket APIs thanks to [Jina](https://docs.jina.ai/). Deploy your chains & agents with ease and enjoy independent scaling, serverless and autoscaling APIs, as well as a Streamlit playground on Jina AI Cloud.
@@ -1,19 +1,21 @@

 # AtlasDB

-This page covers how to Nomic's Atlas ecosystem within LangChain.
+This page covers how to use Nomic's Atlas ecosystem within LangChain.
 It is broken into two parts: installation and setup, and then references to specific Atlas wrappers.

 ## Installation and Setup
 - Install the Python package with `pip install nomic`
 - Nomic is also included in langchains poetry extras `poetry install -E all`
--
+
 ## Wrappers

 ### VectorStore

 There exists a wrapper around the Atlas neural database, allowing you to use it as a vectorstore.
 This vectorstore also gives you full access to the underlying AtlasProject object, which will allow you to use the full range of Atlas map interactions, such as bulk tagging and automatic topic modeling.
-Please see [the Nomic docs](https://docs.nomic.ai/atlas_api.html) for more detailed information.
+Please see [the Atlas docs](https://docs.nomic.ai/atlas_api.html) for more detailed information.

@@ -22,4 +24,4 @@ To import this vectorstore:

 from langchain.vectorstores import AtlasDB
 ```

-For a more detailed walkthrough of the Chroma wrapper, see [this notebook](../modules/indexes/examples/vectorstores.ipynb)
+For a more detailed walkthrough of the AtlasDB wrapper, see [this notebook](../modules/indexes/vectorstores/examples/atlas.ipynb)
@@ -5,7 +5,7 @@ It is broken into two parts: installation and setup, and then references to spec

 ## Installation and Setup

-- Install with `pip3 install banana-dev`
+- Install with `pip install banana-dev`
 - Get a Banana api key and set it as an environment variable (`BANANA_API_KEY`)

 ## Define your Banana Template
@@ -17,4 +17,4 @@ To import this vectorstore:

 from langchain.vectorstores import Chroma
 ```

-For a more detailed walkthrough of the Chroma wrapper, see [this notebook](../modules/indexes/examples/vectorstores.ipynb)
+For a more detailed walkthrough of the Chroma wrapper, see [this notebook](../modules/indexes/vectorstores/getting_started.ipynb)
@@ -22,4 +22,4 @@ There exists an Cohere Embeddings wrapper, which you can access with

 ```python
 from langchain.embeddings import CohereEmbeddings
 ```
-For a more detailed walkthrough of this, see [this notebook](../modules/indexes/examples/embeddings.ipynb)
+For a more detailed walkthrough of this, see [this notebook](../modules/models/text_embedding/examples/cohere.ipynb)
@@ -22,4 +22,4 @@ from langchain.vectorstores import DeepLake

 ```

-For a more detailed walkthrough of the Deep Lake wrapper, see [this notebook](../modules/indexes/vectorstore_examples/deeplake.ipynb)
+For a more detailed walkthrough of the Deep Lake wrapper, see [this notebook](../modules/indexes/vectorstores/examples/deeplake.ipynb)
@@ -18,7 +18,7 @@ There exists a GoogleSearchAPIWrapper utility which wraps this API. To import th

 from langchain.utilities import GoogleSearchAPIWrapper
 ```

-For a more detailed walkthrough of this wrapper, see [this notebook](../modules/utils/examples/google_search.ipynb).
+For a more detailed walkthrough of this wrapper, see [this notebook](../modules/agents/tools/examples/google_search.ipynb).

 ### Tool

@@ -29,4 +29,4 @@ from langchain.agents import load_tools

 tools = load_tools(["google-search"])
 ```

-For more information on this, see [this page](../modules/agents/tools.md)
+For more information on this, see [this page](../modules/agents/tools/getting_started.md)
@@ -34,7 +34,8 @@ search = GoogleSerperAPIWrapper()

 tools = [
     Tool(
         name="Intermediate Answer",
-        func=search.run
+        func=search.run,
+        description="useful for when you need to ask with search"
     )
 ]

@@ -57,7 +58,7 @@ So the final answer is: El Palmar, Spain

 'El Palmar, Spain'
 ```

-For a more detailed walkthrough of this wrapper, see [this notebook](../modules/utils/examples/google_serper.ipynb).
+For a more detailed walkthrough of this wrapper, see [this notebook](../modules/agents/tools/examples/google_serper.ipynb).

 ### Tool

@@ -68,4 +69,4 @@ from langchain.agents import load_tools

 tools = load_tools(["google-serper"])
 ```

-For more information on this, see [this page](../modules/agents/tools.md)
+For more information on this, see [this page](../modules/agents/tools/getting_started.md)
@@ -1,6 +1,6 @@

 # Graphsignal

-This page covers how to use the Graphsignal to trace and monitor LangChain.
+This page covers how to use the Graphsignal ecosystem to trace and monitor LangChain.

 ## Installation and Setup
@@ -1,6 +1,6 @@

 # Helicone

-This page covers how to use the [Helicone](https://helicone.ai) within LangChain.
+This page covers how to use the [Helicone](https://helicone.ai) ecosystem within LangChain.

 ## What is Helicone?
@@ -30,7 +30,7 @@ To use a the wrapper for a model hosted on Hugging Face Hub:

 ```python
 from langchain.llms import HuggingFaceHub
 ```
-For a more detailed walkthrough of the Hugging Face Hub wrapper, see [this notebook](../modules/llms/integrations/huggingface_hub.ipynb)
+For a more detailed walkthrough of the Hugging Face Hub wrapper, see [this notebook](../modules/models/llms/integrations/huggingface_hub.ipynb)

 ### Embeddings

@@ -47,7 +47,7 @@ To use a the wrapper for a model hosted on Hugging Face Hub:

 ```python
 from langchain.embeddings import HuggingFaceHubEmbeddings
 ```
-For a more detailed walkthrough of this, see [this notebook](../modules/indexes/examples/embeddings.ipynb)
+For a more detailed walkthrough of this, see [this notebook](../modules/models/text_embedding/examples/huggingfacehub.ipynb)

 ### Tokenizer

@@ -59,7 +59,7 @@ You can also use it to count tokens when splitting documents with

 from langchain.text_splitter import CharacterTextSplitter
 CharacterTextSplitter.from_huggingface_tokenizer(...)
 ```
-For a more detailed walkthrough of this, see [this notebook](../modules/indexes/examples/textsplitter.ipynb)
+For a more detailed walkthrough of this, see [this notebook](../modules/indexes/text_splitters/examples/huggingface_length_function.ipynb)

 ### Datasets
docs/ecosystem/jina.md (new file, +18)

# Jina

This page covers how to use the Jina ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific Jina wrappers.

## Installation and Setup
- Install the Python SDK with `pip install jina`
- Get a Jina AI Cloud auth token from [here](https://cloud.jina.ai/settings/tokens) and set it as an environment variable (`JINA_AUTH_TOKEN`)

## Wrappers

### Embeddings

There exists a Jina Embeddings wrapper, which you can access with
```python
from langchain.embeddings import JinaEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](../modules/indexes/examples/embeddings.ipynb)
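As a quick orientation for readers of this diff, here is a minimal usage sketch of the wrapper added above. The `model_name` value is an illustrative assumption and does not come from this change; only the import and the auth-token environment variable are taken from the new page.

```python
import os

from langchain.embeddings import JinaEmbeddings

# Assumes JINA_AUTH_TOKEN is set as described above; the model name below is
# hypothetical and only meant to show the shape of the call.
os.environ.setdefault("JINA_AUTH_TOKEN", "<your-token>")

embeddings = JinaEmbeddings(model_name="ViT-B-32::openai")

# Standard LangChain Embeddings interface.
query_vector = embeddings.embed_query("What is LangChain?")
doc_vectors = embeddings.embed_documents(["LangChain is a framework for LLM apps."])
```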
docs/ecosystem/milvus.md (new file, +20)

# Milvus

This page covers how to use the Milvus ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific Milvus wrappers.

## Installation and Setup
- Install the Python SDK with `pip install pymilvus`

## Wrappers

### VectorStore

There exists a wrapper around Milvus indexes, allowing you to use it as a vectorstore,
whether for semantic search or example selection.

To import this vectorstore:
```python
from langchain.vectorstores import Milvus
```

For a more detailed walkthrough of the Milvus wrapper, see [this notebook](../modules/indexes/vectorstores/examples/milvus.ipynb)
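A minimal sketch of using the wrapper described in the new page above; the host, port, texts, and collection setup are illustrative assumptions rather than part of this diff, and they presume a Milvus server is already running locally.

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Milvus

# Assumes a local Milvus server; connection values are placeholders.
vector_db = Milvus.from_texts(
    texts=[
        "LangChain integrates with Milvus.",
        "Milvus is a vector database.",
    ],
    embedding=OpenAIEmbeddings(),
    connection_args={"host": "127.0.0.1", "port": "19530"},
)

docs = vector_db.similarity_search("What does LangChain integrate with?")
print(docs[0].page_content)
```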
@@ -21,7 +21,7 @@ If you are using a model hosted on Azure, you should use different wrapper for t

 ```python
 from langchain.llms import AzureOpenAI
 ```
-For a more detailed walkthrough of the Azure wrapper, see [this notebook](../modules/llms/integrations/azure_openai_example.ipynb)
+For a more detailed walkthrough of the Azure wrapper, see [this notebook](../modules/models/llms/integrations/azure_openai_example.ipynb)

@@ -31,7 +31,7 @@ There exists an OpenAI Embeddings wrapper, which you can access with

 ```python
 from langchain.embeddings import OpenAIEmbeddings
 ```
-For a more detailed walkthrough of this, see [this notebook](../modules/indexes/examples/embeddings.ipynb)
+For a more detailed walkthrough of this, see [this notebook](../modules/models/text_embedding/examples/openai.ipynb)

 ### Tokenizer

@@ -44,7 +44,7 @@ You can also use it to count tokens when splitting documents with

 from langchain.text_splitter import CharacterTextSplitter
 CharacterTextSplitter.from_tiktoken_encoder(...)
 ```
-For a more detailed walkthrough of this, see [this notebook](../modules/indexes/examples/textsplitter.ipynb)
+For a more detailed walkthrough of this, see [this notebook](../modules/indexes/text_splitters/examples/tiktoken.ipynb)

 ### Moderation
 You can also access the OpenAI content moderation endpoint with
@@ -18,4 +18,4 @@ To import this vectorstore:

 from langchain.vectorstores import OpenSearchVectorSearch
 ```

-For a more detailed walkthrough of the OpenSearch wrapper, see [this notebook](../modules/indexes/vectorstore_examples/opensearch.ipynb)
+For a more detailed walkthrough of the OpenSearch wrapper, see [this notebook](../modules/indexes/vectorstores/examples/opensearch.ipynb)
docs/ecosystem/pgvector.md (new file, +29)

# PGVector

This page covers how to use the Postgres [PGVector](https://github.com/pgvector/pgvector) ecosystem within LangChain
It is broken into two parts: installation and setup, and then references to specific PGVector wrappers.

## Installation
- Install the Python package with `pip install pgvector`

## Setup
1. The first step is to create a database with the `pgvector` extension installed.

   Follow the steps at [PGVector Installation Steps](https://github.com/pgvector/pgvector#installation) to install the database and the extension. The docker image is the easiest way to get started.

## Wrappers

### VectorStore

There exists a wrapper around Postgres vector databases, allowing you to use it as a vectorstore,
whether for semantic search or example selection.

To import this vectorstore:
```python
from langchain.vectorstores.pgvector import PGVector
```

### Usage

For a more detailed walkthrough of the PGVector Wrapper, see [this notebook](../modules/indexes/vectorstores/examples/pgvector.ipynb)
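A minimal sketch of the wrapper introduced above, assuming a Postgres instance with the `pgvector` extension enabled as described in the Setup step. The connection string, collection name, and texts are placeholders, not values from this diff.

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores.pgvector import PGVector

# Placeholder connection; substitute your own pgvector-enabled Postgres.
CONNECTION_STRING = "postgresql+psycopg2://postgres:postgres@localhost:5432/vectordb"

store = PGVector.from_texts(
    texts=["LangChain can store embeddings in Postgres via pgvector."],
    embedding=OpenAIEmbeddings(),
    collection_name="langchain_demo",
    connection_string=CONNECTION_STRING,
)

print(store.similarity_search("Where can embeddings be stored?", k=1))
```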
@@ -17,4 +17,4 @@ To import this vectorstore:

 from langchain.vectorstores import Pinecone
 ```

-For a more detailed walkthrough of the Pinecone wrapper, see [this notebook](../modules/indexes/examples/vectorstores.ipynb)
+For a more detailed walkthrough of the Pinecone wrapper, see [this notebook](../modules/indexes/vectorstores/examples/pinecone.ipynb)
@@ -25,7 +25,25 @@ from langchain.llms import PromptLayerOpenAI

 llm = PromptLayerOpenAI(pl_tags=["langchain-requests", "chatbot"])
 ```

+To get the PromptLayer request id, use the argument `return_pl_id` when instantiating the LLM
+```python
+from langchain.llms import PromptLayerOpenAI
+llm = PromptLayerOpenAI(return_pl_id=True)
+```
+This will add the PromptLayer request ID in the `generation_info` field of the `Generation` returned when using `.generate` or `.agenerate`
+
+For example:
+```python
+llm_results = llm.generate(["hello world"])
+for res in llm_results.generations:
+    print("pl request id: ", res[0].generation_info["pl_request_id"])
+```
+You can use the PromptLayer request ID to add a prompt, score, or other metadata to your request. [Read more about it here](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
+
-This LLM is identical to the [OpenAI LLM](./openai), except that
+This LLM is identical to the [OpenAI LLM](./openai.md), except that
 - all your requests will be logged to your PromptLayer account
 - you can add `pl_tags` when instantiating to tag your requests on PromptLayer
+- you can add `return_pl_id` when instantiating to return a PromptLayer request id to use [while tracking requests](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9).
+
+PromptLayer also provides native wrappers for [`PromptLayerChatOpenAI`](../modules/models/chat/integrations/promptlayer_chatopenai.ipynb) and `PromptLayerOpenAIChat`
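Putting the two added snippets together, a runnable end-to-end sketch. It assumes `OPENAI_API_KEY` and `PROMPTLAYER_API_KEY` are set in the environment; the tag values are the illustrative ones from the page above.

```python
from langchain.llms import PromptLayerOpenAI

# Assumes OPENAI_API_KEY and PROMPTLAYER_API_KEY are set in the environment.
llm = PromptLayerOpenAI(pl_tags=["langchain-requests", "chatbot"], return_pl_id=True)

llm_results = llm.generate(["Tell me a joke"])
for res in llm_results.generations:
    # With return_pl_id=True, the request ID is attached to generation_info.
    print("pl request id:", res[0].generation_info["pl_request_id"])
```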
docs/ecosystem/qdrant.md (new file, +20)

# Qdrant

This page covers how to use the Qdrant ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific Qdrant wrappers.

## Installation and Setup
- Install the Python SDK with `pip install qdrant-client`

## Wrappers

### VectorStore

There exists a wrapper around Qdrant indexes, allowing you to use it as a vectorstore,
whether for semantic search or example selection.

To import this vectorstore:
```python
from langchain.vectorstores import Qdrant
```

For a more detailed walkthrough of the Qdrant wrapper, see [this notebook](../modules/indexes/vectorstores/examples/qdrant.ipynb)
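A minimal sketch of the wrapper described above. It assumes a Qdrant server is reachable locally (e.g. started from the official Docker image); the host, collection name, and texts are illustrative assumptions rather than part of this diff.

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Qdrant

# Assumes a locally running Qdrant server; values below are placeholders.
qdrant = Qdrant.from_texts(
    texts=["Qdrant can back LangChain as a vectorstore."],
    embedding=OpenAIEmbeddings(),
    host="localhost",
    collection_name="langchain_demo",
)

print(qdrant.similarity_search("What can back LangChain?", k=1))
```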
docs/ecosystem/replicate.md (new file, +47)

# Replicate
This page covers how to run models on Replicate within LangChain.

## Installation and Setup
- Create a [Replicate](https://replicate.com) account. Get your API key and set it as an environment variable (`REPLICATE_API_TOKEN`)
- Install the [Replicate python client](https://github.com/replicate/replicate-python) with `pip install replicate`

## Calling a model

Find a model on the [Replicate explore page](https://replicate.com/explore), and then paste in the model name and version in this format: `owner-name/model-name:version`

For example, for this [flan-t5 model](https://replicate.com/daanelson/flan-t5), click on the API tab. The model name/version would be: `daanelson/flan-t5:04e422a9b85baed86a4f24981d7f9953e20c5fd82f6103b74ebc431588e1cec8`

Only the `model` param is required, but any other model parameters can also be passed in with the format `input={model_param: value, ...}`

For example, if we were running stable diffusion and wanted to change the image dimensions:

```
Replicate(model="stability-ai/stable-diffusion:db21e45d3f7023abc2a46ee38a23973f6dce16bb082a930b0c49861f96d1e5bf", input={'image_dimensions': '512x512'})
```

*Note that only the first output of a model will be returned.*

From here, we can initialize our model:

```python
llm = Replicate(model="daanelson/flan-t5:04e422a9b85baed86a4f24981d7f9953e20c5fd82f6103b74ebc431588e1cec8")
```

And run it:

```python
prompt = """
Answer the following yes/no question by reasoning step by step.
Can a dog drive a car?
"""
llm(prompt)
```

We can call any Replicate model (not just LLMs) using this syntax. For example, we can call [Stable Diffusion](https://replicate.com/stability-ai/stable-diffusion):

```python
text2image = Replicate(model="stability-ai/stable-diffusion:db21e45d3f7023abc2a46ee38a23973f6dce16bb082a930b0c49861f96d1e5bf",
                       input={'image_dimensions': '512x512'})

image_output = text2image("A cat riding a motorcycle by Picasso")
```
@@ -15,7 +15,7 @@ custom LLMs, you can use the `SelfHostedPipeline` parent class.

 from langchain.llms import SelfHostedPipeline, SelfHostedHuggingFaceLLM
 ```

-For a more detailed walkthrough of the Self-hosted LLMs, see [this notebook](../modules/llms/integrations/self_hosted_examples.ipynb)
+For a more detailed walkthrough of the Self-hosted LLMs, see [this notebook](../modules/models/llms/integrations/self_hosted_examples.ipynb)

 ## Self-hosted Embeddings
 There are several ways to use self-hosted embeddings with LangChain via Runhouse.

@@ -26,6 +26,4 @@ the `SelfHostedEmbedding` class.

 from langchain.llms import SelfHostedPipeline, SelfHostedHuggingFaceLLM
 ```

-For a more detailed walkthrough of the Self-hosted Embeddings, see [this notebook](../modules/indexes/examples/embeddings.ipynb)
-
-##
+For a more detailed walkthrough of the Self-hosted Embeddings, see [this notebook](../modules/models/text_embedding/examples/self-hosted.ipynb)
@@ -5,31 +5,66 @@ It is broken into two parts: installation and setup, and then references to the

 ## Installation and Setup

-- You can find a list of public SearxNG instances [here](https://searx.space/).
-- It recommended to use a self-hosted instance to avoid abuse on the public instances. Also note that public instances often have a limit on the number of requests.
-- To run a self-hosted instance see [this page](https://searxng.github.io/searxng/admin/installation.html) for more information.
-- To use the tool you need to provide the searx host url by:
-    1. passing the named parameter `searx_host` when creating the instance.
-    2. exporting the environment variable `SEARXNG_HOST`.
+While it is possible to utilize the wrapper in conjunction with [public searx
+instances](https://searx.space/) these instances frequently do not permit API
+access (see note on output format below) and have limitations on the frequency
+of requests. It is recommended to opt for a self-hosted instance instead.
+
+### Self Hosted Instance:
+
+See [this page](https://searxng.github.io/searxng/admin/installation.html) for installation instructions.
+
+When you install SearxNG, the only active output format by default is the HTML format.
+You need to activate the `json` format to use the API. This can be done by adding the following line to the `settings.yml` file:
+```yaml
+search:
+    formats:
+        - html
+        - json
+```
+You can make sure that the API is working by issuing a curl request to the API endpoint:
+
+`curl -kLX GET --data-urlencode q='langchain' -d format=json http://localhost:8888`
+
+This should return a JSON object with the results.

 ## Wrappers

 ### Utility

+To use the wrapper we need to pass the host of the SearxNG instance to the wrapper with:
+    1. the named parameter `searx_host` when creating the instance.
+    2. exporting the environment variable `SEARXNG_HOST`.
+
 You can use the wrapper to get results from a SearxNG instance.

 ```python
 from langchain.utilities import SearxSearchWrapper
+s = SearxSearchWrapper(searx_host="http://localhost:8888")
+s.run("what is a large language model?")
 ```

 ### Tool

-You can also easily load this wrapper as a Tool (to use with an Agent).
+You can also load this wrapper as a Tool (to use with an Agent).

 You can do this with:

 ```python
 from langchain.agents import load_tools
-tools = load_tools(["searx-search"], searx_host="https://searx.example.com")
+tools = load_tools(["searx-search"],
+                   searx_host="http://localhost:8888",
+                   engines=["github"])
 ```

-For more information on this, see [this page](../modules/agents/tools.md)
+Note that we could _optionally_ pass custom engines to use.
+
+If you want to obtain results with metadata as *json* you can use:
+```python
+tools = load_tools(["searx-search-results-json"],
+                   searx_host="http://localhost:8888",
+                   num_results=5)
+```
+
+For more information on tools, see [this page](../modules/agents/tools/getting_started.md)
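Beyond the tool form shown in the new page, the utility can also return structured results directly. The sketch below assumes a self-hosted SearxNG instance with the `json` format enabled as configured above; the `results` call and its keyword arguments mirror the `searx-search-results-json` tool and are assumptions about the wrapper's utility interface rather than content of this diff.

```python
from langchain.utilities import SearxSearchWrapper

# Assumes a local SearxNG instance with the json output format enabled.
search = SearxSearchWrapper(searx_host="http://localhost:8888")

hits = search.results("large language models", num_results=5, engines=["github"])
for hit in hits:
    # Each hit is a dict of result metadata.
    print(hit.get("title"), "->", hit.get("link"))
```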
@@ -17,7 +17,7 @@ There exists a SerpAPI utility which wraps this API. To import this utility:

 from langchain.utilities import SerpAPIWrapper
 ```

-For a more detailed walkthrough of this wrapper, see [this notebook](../modules/utils/examples/serpapi.ipynb).
+For a more detailed walkthrough of this wrapper, see [this notebook](../modules/agents/tools/examples/serpapi.ipynb).

 ### Tool

@@ -28,4 +28,4 @@ from langchain.agents import load_tools

 tools = load_tools(["serpapi"])
 ```

-For more information on this, see [this page](../modules/agents/tools.md)
+For more information on this, see [this page](../modules/agents/tools/getting_started.md)
@@ -17,9 +17,12 @@ This page is broken into two parts: installation and setup, and then references

 - `poppler-utils`
 - `tesseract-ocr`
 - `libreoffice`
-- If you are parsing PDFs, run the following to install the `detectron2` model, which
+- If you are parsing PDFs using the `"hi_res"` strategy, run the following to install the `detectron2` model, which
   `unstructured` uses for layout detection:
     - `pip install "detectron2@git+https://github.com/facebookresearch/detectron2.git@v0.6#egg=detectron2"`
+- If `detectron2` is not installed, `unstructured` will fallback to processing PDFs
+  using the `"fast"` strategy, which uses `pdfminer` directly and doesn't require
+  `detectron2`.

 ## Wrappers
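To illustrate the `"hi_res"` versus `"fast"` distinction introduced above, a minimal sketch using the `unstructured` library directly. The `strategy` keyword and the file path are assumptions about the library's partitioning API of that era, not values taken from this diff.

```python
from unstructured.partition.pdf import partition_pdf

# "hi_res" uses the detectron2-backed layout model; "fast" falls back to
# pdfminer. The file path below is a placeholder.
elements = partition_pdf(filename="example.pdf", strategy="hi_res")

print([el.category for el in elements][:10])
```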
docs/ecosystem/wandb_tracking.ipynb (new file, +625)

The notebook's markdown and code cells:

# Weights & Biases

This notebook goes over how to track your LangChain experiments into one centralized Weights and Biases dashboard. To learn more about prompt engineering and the callback please refer to this Report which explains both alongside the resultant dashboards you can expect to see.

Run in Colab: https://colab.research.google.com/drive/1DXH4beT4HFaRKy_Vm4PoxhXVDRf7Ym8L?usp=sharing

View Report: https://wandb.ai/a-sh0ts/langchain_callback_demo/reports/Prompt-Engineering-LLMs-with-LangChain-and-W-B--VmlldzozNjk1NTUw#👋-how-to-build-a-callback-in-langchain-for-better-prompt-engineering

```python
!pip install wandb
!pip install pandas
!pip install textstat
!pip install spacy
!python -m spacy download en_core_web_sm
```

```python
import os

os.environ["WANDB_API_KEY"] = ""
# os.environ["OPENAI_API_KEY"] = ""
# os.environ["SERPAPI_API_KEY"] = ""
```

```python
from datetime import datetime

from langchain.callbacks import WandbCallbackHandler, StdOutCallbackHandler
from langchain.callbacks.base import CallbackManager
from langchain.llms import OpenAI
```

Callback Handler that logs to Weights and Biases.

Parameters:
- job_type (str): The type of job.
- project (str): The project to log to.
- entity (str): The entity to log to.
- tags (list): The tags to log.
- group (str): The group to log to.
- name (str): The name of the run.
- notes (str): The notes to log.
- visualize (bool): Whether to visualize the run.
- complexity_metrics (bool): Whether to log complexity metrics.
- stream_logs (bool): Whether to stream callback actions to W&B

Default values for WandbCallbackHandler(...):
- visualize: bool = False
- complexity_metrics: bool = False
- stream_logs: bool = False

NOTE: For beta workflows we have made the default analysis based on textstat and the visualizations based on spacy.

```python
"""Main function.

This function is used to try the callback handler.
Scenarios:
1. OpenAI LLM
2. Chain with multiple SubChains on multiple generations
3. Agent with Tools
"""
session_group = datetime.now().strftime("%m.%d.%Y_%H.%M.%S")
wandb_callback = WandbCallbackHandler(
    job_type="inference",
    project="langchain_callback_demo",
    group=f"minimal_{session_group}",
    name="llm",
    tags=["test"],
)
manager = CallbackManager([StdOutCallbackHandler(), wandb_callback])
llm = OpenAI(temperature=0, callback_manager=manager, verbose=True)
```

Defaults for WandbCallbackHandler.flush_tracker(...):
- reset: bool = True
- finish: bool = False

The `flush_tracker` function is used to log LangChain sessions to Weights & Biases. It takes in the LangChain module or agent, and logs at minimum the prompts and generations alongside the serialized form of the LangChain module to the specified Weights & Biases project. By default we reset the session as opposed to concluding the session outright.

```python
# SCENARIO 1 - LLM
llm_result = llm.generate(["Tell me a joke", "Tell me a poem"] * 3)
wandb_callback.flush_tracker(llm, name="simple_sequential")
```
```python
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
```

```python
# SCENARIO 2 - Chain
template = """You are a playwright. Given the title of play, it is your job to write a synopsis for that title.
Title: {title}
Playwright: This is a synopsis for the above play:"""
prompt_template = PromptTemplate(input_variables=["title"], template=template)
synopsis_chain = LLMChain(llm=llm, prompt=prompt_template, callback_manager=manager)

test_prompts = [
    {
        "title": "documentary about good video games that push the boundary of game design"
    },
    {"title": "cocaine bear vs heroin wolf"},
    {"title": "the best in class mlops tooling"},
]
synopsis_chain.apply(test_prompts)
wandb_callback.flush_tracker(synopsis_chain, name="agent")
```

```python
from langchain.agents import initialize_agent, load_tools
```

The last cell runs the agent scenario ("Agent with Tools"); its recorded output:

```
> Entering new AgentExecutor chain...
 I need to find out who Leo DiCaprio's girlfriend is and then calculate her age raised to the 0.43 power.
Action: Search
Action Input: "Leo DiCaprio girlfriend"
Observation: DiCaprio had a steady girlfriend in Camila Morrone. He had been with the model turned actress for nearly five years, as they were first said to be dating at the end of 2017. And the now 26-year-old Morrone is no stranger to Hollywood.
Thought: I need to calculate her age raised to the 0.43 power.
Action: Calculator
Action Input: 26^0.43
Observation: Answer: 4.059182145592686

Thought: I now know the final answer.
Final Answer: Leo DiCaprio's girlfriend is Camila Morrone and her current age raised to the 0.43 power is 4.059182145592686.

> Finished chain.
```
|
||||||
|
],
|
||||||
|
"text/plain": [
|
||||||
|
"<IPython.core.display.HTML object>"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "display_data"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/html": [
|
||||||
|
"Find logs at: <code>./wandb/run-20230318_150550-wzy59zjq/logs</code>"
|
||||||
|
],
|
||||||
|
"text/plain": [
|
||||||
|
"<IPython.core.display.HTML object>"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "display_data"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"# SCENARIO 3 - Agent with Tools\n",
|
||||||
|
"tools = load_tools([\"serpapi\", \"llm-math\"], llm=llm, callback_manager=manager)\n",
|
||||||
|
"agent = initialize_agent(\n",
|
||||||
|
" tools,\n",
|
||||||
|
" llm,\n",
|
||||||
|
" agent=\"zero-shot-react-description\",\n",
|
||||||
|
" callback_manager=manager,\n",
|
||||||
|
" verbose=True,\n",
|
||||||
|
")\n",
|
||||||
|
"agent.run(\n",
|
||||||
|
" \"Who is Leo DiCaprio's girlfriend? What is her current age raised to the 0.43 power?\"\n",
|
||||||
|
")\n",
|
||||||
|
"wandb_callback.flush_tracker(agent, reset=False, finish=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"colab": {
|
||||||
|
"provenance": []
|
||||||
|
},
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 1
|
||||||
|
}
|
@ -30,4 +30,4 @@ To import this vectorstore:
from langchain.vectorstores import Weaviate
```

For a more detailed walkthrough of the Weaviate wrapper, see [this notebook](../modules/indexes/examples/vectorstores.ipynb)
For a more detailed walkthrough of the Weaviate wrapper, see [this notebook](../modules/indexes/vectorstores/getting_started.ipynb)
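To make the import above concrete, here is a minimal, hedged sketch of querying a Weaviate-backed vector store. It assumes a Weaviate instance reachable at `http://localhost:8080` and an existing class named `Paragraph` with a `content` text field; those names are illustrative placeholders, not part of the original docs.

```python
import weaviate
from langchain.vectorstores import Weaviate

# Hypothetical local Weaviate endpoint and schema; adjust to your deployment.
client = weaviate.Client("http://localhost:8080")
vectorstore = Weaviate(client, "Paragraph", "content")

# Retrieve the documents most similar to the query.
docs = vectorstore.similarity_search("What did the president say about the economy?")
print(docs[0].page_content)
```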
|
||||||
|
@ -20,7 +20,7 @@ There exists a WolframAlphaAPIWrapper utility which wraps this API. To import th
from langchain.utilities.wolfram_alpha import WolframAlphaAPIWrapper
```

For a more detailed walkthrough of this wrapper, see [this notebook](../modules/utils/examples/wolfram_alpha.ipynb).
For a more detailed walkthrough of this wrapper, see [this notebook](../modules/agents/tools/examples/wolfram_alpha.ipynb).

### Tool

@ -31,4 +31,4 @@ from langchain.agents import load_tools
tools = load_tools(["wolfram-alpha"])
```

For more information on this, see [this page](../modules/agents/tools.md)
For more information on this, see [this page](../modules/agents/tools/getting_started.md)
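As a quick, hedged illustration of the wrapper described above (it assumes a Wolfram Alpha App ID exported as `WOLFRAM_ALPHA_APPID`; the key value shown is a placeholder):

```python
import os
from langchain.utilities.wolfram_alpha import WolframAlphaAPIWrapper

os.environ["WOLFRAM_ALPHA_APPID"] = "..."  # placeholder App ID

# Query Wolfram Alpha directly through the wrapper.
wolfram = WolframAlphaAPIWrapper()
print(wolfram.run("Solve 2x + 5 = -3x + 7"))
```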
|
||||||
|
@ -158,14 +158,14 @@ Open Source

---

.. link-button:: https://github.com/jerryjliu/gpt_index
.. link-button:: https://github.com/jerryjliu/llama_index
    :type: url
    :text: GPT Index
    :text: LlamaIndex
    :classes: stretched-link btn-lg

+++

GPT Index is a project consisting of a set of data structures that are created using GPT-3 and can be traversed using GPT-3 in order to answer queries.
LlamaIndex (formerly GPT Index) is a project consisting of a set of data structures that are created using GPT-3 and can be traversed using GPT-3 in order to answer queries.

---

@ -322,5 +322,14 @@ Proprietary

By Zahid Khawaja, this demo utilizes question answering to answer questions about a given website. A followup added this for `YouTube videos <https://twitter.com/chillzaza_/status/1593739682013220865?s=20&t=EhU8jl0KyCPJ7vE9Rnz-cQ>`_, and then another followup added it for `Wikipedia <https://twitter.com/chillzaza_/status/1594847151238037505?s=20&t=EhU8jl0KyCPJ7vE9Rnz-cQ>`_.

---

.. link-button:: https://mynd.so
    :type: url
    :text: Mynd
    :classes: stretched-link btn-lg

+++

A journaling app for self-care that uses AI to uncover insights and patterns over time.
|
||||||
|
|
||||||
|
@ -36,7 +36,7 @@ os.environ["OPENAI_API_KEY"] = "..."
|
|||||||
```
|
```
|
||||||
|
|
||||||
|
|
||||||
## Building a Language Model Application
|
## Building a Language Model Application: LLMs
|
||||||
|
|
||||||
Now that we have installed LangChain and set up our environment, we can start building our language model application.
|
Now that we have installed LangChain and set up our environment, we can start building our language model application.
|
||||||
|
|
||||||
@ -66,7 +66,7 @@ llm = OpenAI(temperature=0.9)
|
|||||||
We can now call it on some input!
|
We can now call it on some input!
|
||||||
|
|
||||||
```python
|
```python
|
||||||
text = "What would be a good company name a company that makes colorful socks?"
|
text = "What would be a good company name for a company that makes colorful socks?"
|
||||||
print(llm(text))
|
print(llm(text))
|
||||||
```
|
```
|
||||||
|
|
||||||
@ -74,7 +74,7 @@ print(llm(text))
|
|||||||
Feetful of Fun
|
Feetful of Fun
|
||||||
```
|
```
|
||||||
|
|
||||||
For more details on how to use LLMs within LangChain, see the [LLM getting started guide](../modules/llms/getting_started.ipynb).
|
For more details on how to use LLMs within LangChain, see the [LLM getting started guide](../modules/models/llms/getting_started.ipynb).
|
||||||
`````
|
`````
|
||||||
|
|
||||||
|
|
||||||
@ -111,7 +111,7 @@ What is a good name for a company that makes colorful socks?
|
|||||||
```
|
```
|
||||||
|
|
||||||
|
|
||||||
[For more details, check out the getting started guide for prompts.](../modules/prompts/getting_started.ipynb)
|
[For more details, check out the getting started guide for prompts.](../modules/prompts/chat_prompt_template.ipynb)
|
||||||
|
|
||||||
`````
|
`````
|
||||||
|
|
||||||
@ -160,7 +160,7 @@ This is one of the simpler types of chains, but understanding how it works will
|
|||||||
`````
|
`````
|
||||||
|
|
||||||
|
|
||||||
`````{dropdown} Agents: Dynamically call chains based on user input
|
`````{dropdown} Agents: Dynamically Call Chains Based on User Input
|
||||||
|
|
||||||
So far the chains we've looked at run in a predetermined order.
|
So far the chains we've looked at run in a predetermined order.
|
||||||
|
|
||||||
@ -210,35 +210,31 @@ tools = load_tools(["serpapi", "llm-math"], llm=llm)
|
|||||||
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
|
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
|
||||||
|
|
||||||
# Now let's test it out!
|
# Now let's test it out!
|
||||||
agent.run("Who is Olivia Wilde's boyfriend? What is his current age raised to the 0.23 power?")
|
agent.run("What was the high temperature in SF yesterday in Fahrenheit? What is that number raised to the .023 power?")
|
||||||
```
|
```
|
||||||
|
|
||||||
```pycon
|
```pycon
|
||||||
Entering new AgentExecutor chain...
|
> Entering new AgentExecutor chain...
|
||||||
I need to find out who Olivia Wilde's boyfriend is and then calculate his age raised to the 0.23 power.
|
I need to find the temperature first, then use the calculator to raise it to the .023 power.
|
||||||
Action: Search
|
Action: Search
|
||||||
Action Input: "Olivia Wilde boyfriend"
|
Action Input: "High temperature in SF yesterday"
|
||||||
Observation: Jason Sudeikis
|
Observation: San Francisco Temperature Yesterday. Maximum temperature yesterday: 57 °F (at 1:56 pm) Minimum temperature yesterday: 49 °F (at 1:56 am) Average temperature ...
|
||||||
Thought: I need to find out Jason Sudeikis' age
|
Thought: I now have the temperature, so I can use the calculator to raise it to the .023 power.
|
||||||
Action: Search
|
|
||||||
Action Input: "Jason Sudeikis age"
|
|
||||||
Observation: 47 years
|
|
||||||
Thought: I need to calculate 47 raised to the 0.23 power
|
|
||||||
Action: Calculator
|
Action: Calculator
|
||||||
Action Input: 47^0.23
|
Action Input: 57^.023
|
||||||
Observation: Answer: 2.4242784855673896
|
Observation: Answer: 1.0974509573251117
|
||||||
|
|
||||||
Thought: I now know the final answer
|
Thought: I now know the final answer
|
||||||
Final Answer: Jason Sudeikis, Olivia Wilde's boyfriend, is 47 years old and his age raised to the 0.23 power is 2.4242784855673896.
|
Final Answer: The high temperature in SF yesterday in Fahrenheit raised to the .023 power is 1.0974509573251117.
|
||||||
> Finished AgentExecutor chain.
|
|
||||||
"Jason Sudeikis, Olivia Wilde's boyfriend, is 47 years old and his age raised to the 0.23 power is 2.4242784855673896."
|
> Finished chain.
|
||||||
```
|
```
|
||||||
|
|
||||||
|
|
||||||
`````
|
`````
|
||||||
|
|
||||||
|
|
||||||
`````{dropdown} Memory: Add state to chains and agents
`````{dropdown} Memory: Add State to Chains and Agents

So far, all the chains and agents we've gone through have been stateless. But often, you may want a chain or agent to have some concept of "memory" so that it may remember information about its previous interactions. The clearest and simplest example of this is when designing a chatbot - you want it to remember previous messages so it can use context from that to have a better conversation. This would be a type of "short-term memory". On the more complex side, you could imagine a chain/agent remembering key pieces of information over time - this would be a form of "long-term memory". For more concrete ideas on the latter, see this [awesome paper](https://memprompt.com/).
|
||||||
|
|
||||||
@ -288,3 +284,217 @@ AI:
|
|||||||
> Finished chain.
|
> Finished chain.
|
||||||
" That's great! What would you like to talk about?"
|
" That's great! What would you like to talk about?"
|
||||||
```
|
```
|
||||||
|
`````
|
||||||
|
|
||||||
|
## Building a Language Model Application: Chat Models
|
||||||
|
|
||||||
|
Similarly, you can use chat models instead of LLMs. Chat models are a variation on language models. While chat models use language models under the hood, the interface they expose is a bit different: rather than expose a "text in, text out" API, they expose an interface where "chat messages" are the inputs and outputs.
|
||||||
|
|
||||||
|
Chat model APIs are fairly new, so we are still figuring out the correct abstractions.
|
||||||
|
|
||||||
|
|
||||||
|
`````{dropdown} Get Message Completions from a Chat Model
|
||||||
|
You can get chat completions by passing one or more messages to the chat model. The response will be a message. The types of messages currently supported in LangChain are `AIMessage`, `HumanMessage`, `SystemMessage`, and `ChatMessage` -- `ChatMessage` takes in an arbitrary role parameter. Most of the time, you'll just be dealing with `HumanMessage`, `AIMessage`, and `SystemMessage`.
|
||||||
|
|
||||||
|
```python
|
||||||
|
from langchain.chat_models import ChatOpenAI
|
||||||
|
from langchain.schema import (
|
||||||
|
AIMessage,
|
||||||
|
HumanMessage,
|
||||||
|
SystemMessage
|
||||||
|
)
|
||||||
|
|
||||||
|
chat = ChatOpenAI(temperature=0)
|
||||||
|
```
|
||||||
|
|
||||||
|
You can get completions by passing in a single message.
|
||||||
|
|
||||||
|
```python
|
||||||
|
chat([HumanMessage(content="Translate this sentence from English to French. I love programming.")])
|
||||||
|
# -> AIMessage(content="J'aime programmer.", additional_kwargs={})
|
||||||
|
```
|
||||||
|
|
||||||
|
You can also pass in multiple messages for OpenAI's gpt-3.5-turbo and gpt-4 models.
|
||||||
|
|
||||||
|
```python
|
||||||
|
messages = [
|
||||||
|
SystemMessage(content="You are a helpful assistant that translates English to French."),
|
||||||
|
HumanMessage(content="Translate this sentence from English to French. I love programming.")
|
||||||
|
]
|
||||||
|
chat(messages)
|
||||||
|
# -> AIMessage(content="J'aime programmer.", additional_kwargs={})
|
||||||
|
```
|
||||||
|
|
||||||
|
You can go one step further and generate completions for multiple sets of messages using `generate`. This returns an `LLMResult` with an additional `message` parameter:
|
||||||
|
```python
|
||||||
|
batch_messages = [
|
||||||
|
[
|
||||||
|
SystemMessage(content="You are a helpful assistant that translates English to French."),
|
||||||
|
HumanMessage(content="Translate this sentence from English to French. I love programming.")
|
||||||
|
],
|
||||||
|
[
|
||||||
|
SystemMessage(content="You are a helpful assistant that translates English to French."),
|
||||||
|
HumanMessage(content="Translate this sentence from English to French. I love artificial intelligence.")
|
||||||
|
],
|
||||||
|
]
|
||||||
|
result = chat.generate(batch_messages)
|
||||||
|
result
|
||||||
|
# -> LLMResult(generations=[[ChatGeneration(text="J'aime programmer.", generation_info=None, message=AIMessage(content="J'aime programmer.", additional_kwargs={}))], [ChatGeneration(text="J'aime l'intelligence artificielle.", generation_info=None, message=AIMessage(content="J'aime l'intelligence artificielle.", additional_kwargs={}))]], llm_output={'token_usage': {'prompt_tokens': 71, 'completion_tokens': 18, 'total_tokens': 89}})
|
||||||
|
```
|
||||||
|
|
||||||
|
You can recover things like token usage from this LLMResult:
|
||||||
|
```
|
||||||
|
result.llm_output['token_usage']
|
||||||
|
# -> {'prompt_tokens': 71, 'completion_tokens': 18, 'total_tokens': 89}
|
||||||
|
```
|
||||||
|
`````
|
||||||
|
|
||||||
|
`````{dropdown} Chat Prompt Templates
|
||||||
|
Similar to LLMs, you can make use of templating by using a `MessagePromptTemplate`. You can build a `ChatPromptTemplate` from one or more `MessagePromptTemplate`s. You can use `ChatPromptTemplate`'s `format_prompt` -- this returns a `PromptValue`, which you can convert to a string or `Message` object, depending on whether you want to use the formatted value as input to an LLM or chat model.
|
||||||
|
|
||||||
|
For convenience, there is a `from_template` method exposed on the template. If you were to use this template, this is what it would look like:
|
||||||
|
|
||||||
|
```python
|
||||||
|
from langchain.chat_models import ChatOpenAI
|
||||||
|
from langchain.prompts.chat import (
|
||||||
|
ChatPromptTemplate,
|
||||||
|
SystemMessagePromptTemplate,
|
||||||
|
HumanMessagePromptTemplate,
|
||||||
|
)
|
||||||
|
|
||||||
|
chat = ChatOpenAI(temperature=0)
|
||||||
|
|
||||||
|
template="You are a helpful assistant that translates {input_language} to {output_language}."
|
||||||
|
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
|
||||||
|
human_template="{text}"
|
||||||
|
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
|
||||||
|
|
||||||
|
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
|
||||||
|
|
||||||
|
# get a chat completion from the formatted messages
|
||||||
|
chat(chat_prompt.format_prompt(input_language="English", output_language="French", text="I love programming.").to_messages())
|
||||||
|
# -> AIMessage(content="J'aime programmer.", additional_kwargs={})
|
||||||
|
```
|
||||||
|
`````
|
||||||
|
|
||||||
|
`````{dropdown} Chains with Chat Models
|
||||||
|
The `LLMChain` discussed in the above section can be used with chat models as well:
|
||||||
|
|
||||||
|
```python
|
||||||
|
from langchain.chat_models import ChatOpenAI
|
||||||
|
from langchain import LLMChain
|
||||||
|
from langchain.prompts.chat import (
|
||||||
|
ChatPromptTemplate,
|
||||||
|
SystemMessagePromptTemplate,
|
||||||
|
HumanMessagePromptTemplate,
|
||||||
|
)
|
||||||
|
|
||||||
|
chat = ChatOpenAI(temperature=0)
|
||||||
|
|
||||||
|
template="You are a helpful assistant that translates {input_language} to {output_language}."
|
||||||
|
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
|
||||||
|
human_template="{text}"
|
||||||
|
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
|
||||||
|
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
|
||||||
|
|
||||||
|
chain = LLMChain(llm=chat, prompt=chat_prompt)
|
||||||
|
chain.run(input_language="English", output_language="French", text="I love programming.")
|
||||||
|
# -> "J'aime programmer."
|
||||||
|
```
|
||||||
|
`````
|
||||||
|
|
||||||
|
`````{dropdown} Agents with Chat Models
|
||||||
|
Agents can also be used with chat models; you can initialize one using `"chat-zero-shot-react-description"` as the agent type.
|
||||||
|
|
||||||
|
```python
|
||||||
|
from langchain.agents import load_tools
|
||||||
|
from langchain.agents import initialize_agent
|
||||||
|
from langchain.chat_models import ChatOpenAI
|
||||||
|
from langchain.llms import OpenAI
|
||||||
|
|
||||||
|
# First, let's load the language model we're going to use to control the agent.
|
||||||
|
chat = ChatOpenAI(temperature=0)
|
||||||
|
|
||||||
|
# Next, let's load some tools to use. Note that the `llm-math` tool uses an LLM, so we need to pass that in.
|
||||||
|
llm = OpenAI(temperature=0)
|
||||||
|
tools = load_tools(["serpapi", "llm-math"], llm=llm)
|
||||||
|
|
||||||
|
|
||||||
|
# Finally, let's initialize an agent with the tools, the language model, and the type of agent we want to use.
|
||||||
|
agent = initialize_agent(tools, chat, agent="chat-zero-shot-react-description", verbose=True)
|
||||||
|
|
||||||
|
# Now let's test it out!
|
||||||
|
agent.run("Who is Olivia Wilde's boyfriend? What is his current age raised to the 0.23 power?")
|
||||||
|
```
|
||||||
|
|
||||||
|
```pycon
|
||||||
|
|
||||||
|
> Entering new AgentExecutor chain...
|
||||||
|
Thought: I need to use a search engine to find Olivia Wilde's boyfriend and a calculator to raise his age to the 0.23 power.
|
||||||
|
Action:
|
||||||
|
{
|
||||||
|
"action": "Search",
|
||||||
|
"action_input": "Olivia Wilde boyfriend"
|
||||||
|
}
|
||||||
|
|
||||||
|
Observation: Sudeikis and Wilde's relationship ended in November 2020. Wilde was publicly served with court documents regarding child custody while she was presenting Don't Worry Darling at CinemaCon 2022. In January 2021, Wilde began dating singer Harry Styles after meeting during the filming of Don't Worry Darling.
|
||||||
|
Thought:I need to use a search engine to find Harry Styles' current age.
|
||||||
|
Action:
|
||||||
|
{
|
||||||
|
"action": "Search",
|
||||||
|
"action_input": "Harry Styles age"
|
||||||
|
}
|
||||||
|
|
||||||
|
Observation: 29 years
|
||||||
|
Thought:Now I need to calculate 29 raised to the 0.23 power.
|
||||||
|
Action:
|
||||||
|
{
|
||||||
|
"action": "Calculator",
|
||||||
|
"action_input": "29^0.23"
|
||||||
|
}
|
||||||
|
|
||||||
|
Observation: Answer: 2.169459462491557
|
||||||
|
|
||||||
|
Thought:I now know the final answer.
|
||||||
|
Final Answer: 2.169459462491557
|
||||||
|
|
||||||
|
> Finished chain.
|
||||||
|
'2.169459462491557'
|
||||||
|
```
|
||||||
|
`````
|
||||||
|
|
||||||
|
`````{dropdown} Memory: Add State to Chains and Agents
|
||||||
|
You can use Memory with chains and agents initialized with chat models. The main difference between this and Memory for LLMs is that rather than trying to condense all previous messages into a string, we can keep them as their own unique memory object.
|
||||||
|
|
||||||
|
```python
|
||||||
|
from langchain.prompts import (
|
||||||
|
ChatPromptTemplate,
|
||||||
|
MessagesPlaceholder,
|
||||||
|
SystemMessagePromptTemplate,
|
||||||
|
HumanMessagePromptTemplate
|
||||||
|
)
|
||||||
|
from langchain.chains import ConversationChain
|
||||||
|
from langchain.chat_models import ChatOpenAI
|
||||||
|
from langchain.memory import ConversationBufferMemory
|
||||||
|
|
||||||
|
prompt = ChatPromptTemplate.from_messages([
|
||||||
|
SystemMessagePromptTemplate.from_template("The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know."),
|
||||||
|
MessagesPlaceholder(variable_name="history"),
|
||||||
|
HumanMessagePromptTemplate.from_template("{input}")
|
||||||
|
])
|
||||||
|
|
||||||
|
llm = ChatOpenAI(temperature=0)
|
||||||
|
memory = ConversationBufferMemory(return_messages=True)
|
||||||
|
conversation = ConversationChain(memory=memory, prompt=prompt, llm=llm)
|
||||||
|
|
||||||
|
conversation.predict(input="Hi there!")
|
||||||
|
# -> 'Hello! How can I assist you today?'
|
||||||
|
|
||||||
|
|
||||||
|
conversation.predict(input="I'm doing well! Just having a conversation with an AI.")
|
||||||
|
# -> "That sounds like fun! I'm happy to chat with you. Is there anything specific you'd like to talk about?"
|
||||||
|
|
||||||
|
conversation.predict(input="Tell me about yourself.")
|
||||||
|
# -> "Sure! I am an AI language model created by OpenAI. I was trained on a large dataset of text from the internet, which allows me to understand and generate human-like language. I can answer questions, provide information, and even have conversations like this one. Is there anything else you'd like to know about me?"
|
||||||
|
```
|
||||||
|
`````
|
||||||
|
@ -32,7 +32,7 @@ This induces the to model to think about what action to take, then take it.
|
|||||||
Resources:
|
Resources:
|
||||||
|
|
||||||
- [Paper](https://arxiv.org/pdf/2210.03629.pdf)
|
- [Paper](https://arxiv.org/pdf/2210.03629.pdf)
|
||||||
- [LangChain Example](./modules/agents/implementations/react.ipynb)
|
- [LangChain Example](modules/agents/agents/examples/react.ipynb)
|
||||||
|
|
||||||
## Self-ask
|
## Self-ask
|
||||||
|
|
||||||
@ -42,7 +42,7 @@ In this method, the model explicitly asks itself follow-up questions, which are
|
|||||||
Resources:
|
Resources:
|
||||||
|
|
||||||
- [Paper](https://ofir.io/self-ask.pdf)
|
- [Paper](https://ofir.io/self-ask.pdf)
|
||||||
- [LangChain Example](./modules/agents/implementations/self_ask_with_search.ipynb)
|
- [LangChain Example](modules/agents/agents/examples/self_ask_with_search.ipynb)
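To make the self-ask pattern above concrete, here is a minimal sketch of wiring it up in LangChain. It assumes a SerpAPI key is available in the environment (`SERPAPI_API_KEY`); the question string is only an example.

```python
from langchain import OpenAI, SerpAPIWrapper
from langchain.agents import initialize_agent, Tool

llm = OpenAI(temperature=0)
search = SerpAPIWrapper()

# The self-ask-with-search agent uses a single search tool, named "Intermediate Answer".
tools = [
    Tool(
        name="Intermediate Answer",
        func=search.run,
        description="useful for when you need to ask with search",
    )
]

self_ask_with_search = initialize_agent(tools, llm, agent="self-ask-with-search", verbose=True)
self_ask_with_search.run("What is the hometown of the reigning men's U.S. Open champion?")
```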
|
||||||
|
|
||||||
## Prompt Chaining
|
## Prompt Chaining
|
||||||
|
|
||||||
|
@ -1,28 +1,14 @@
Welcome to LangChain
==========================

Large language models (LLMs) are emerging as a transformative technology, enabling
developers to build applications that they previously could not.
But using these LLMs in isolation is often not enough to
create a truly powerful app - the real power comes when you are able to
combine them with other sources of computation or knowledge.

This library is aimed at assisting in the development of those types of applications. Common examples of these types of applications include:

**❓ Question Answering over specific documents**

- `Documentation <./use_cases/question_answering.html>`_
- End-to-end Example: `Question Answering over Notion Database <https://github.com/hwchase17/notion-qa>`_

**💬 Chatbots**

- `Documentation <./use_cases/chatbots.html>`_
- End-to-end Example: `Chat-LangChain <https://github.com/hwchase17/chat-langchain>`_

**🤖 Agents**

- `Documentation <./use_cases/agents.html>`_
- End-to-end Example: `GPT+WolframAlpha <https://huggingface.co/spaces/JavaFXpert/Chat-GPT-LangChain>`_

LangChain is a framework for developing applications powered by language models. We believe that the most powerful and differentiated applications will not only call out to a language model via an API, but will also:

- *Be data-aware*: connect a language model to other sources of data
- *Be agentic*: allow a language model to interact with its environment

The LangChain framework is designed with the above principles in mind.

This is the Python specific portion of the documentation. For a purely conceptual guide to LangChain, see `here <https://docs.langchain.com/docs/>`_. For the JavaScript documentation, see `here <https://js.langchain.com/docs/>`_.

Getting Started
----------------
|
||||||
@ -46,22 +32,17 @@ There are several main modules that LangChain provides support for.
|
|||||||
For each module we provide some examples to get started, how-to guides, reference docs, and conceptual guides.
|
For each module we provide some examples to get started, how-to guides, reference docs, and conceptual guides.
|
||||||
These modules are, in increasing order of complexity:
|
These modules are, in increasing order of complexity:
|
||||||
|
|
||||||
|
- `Models <./modules/models.html>`_: The various model types and model integrations LangChain supports.
|
||||||
|
|
||||||
- `Prompts <./modules/prompts.html>`_: This includes prompt management, prompt optimization, and prompt serialization.
|
- `Prompts <./modules/prompts.html>`_: This includes prompt management, prompt optimization, and prompt serialization.
|
||||||
|
|
||||||
- `LLMs <./modules/llms.html>`_: This includes a generic interface for all LLMs, and common utilities for working with LLMs.
|
- `Memory <./modules/memory.html>`_: Memory is the concept of persisting state between calls of a chain/agent. LangChain provides a standard interface for memory, a collection of memory implementations, and examples of chains/agents that use memory.
|
||||||
|
|
||||||
- `Document Loaders <./modules/document_loaders.html>`_: This includes a standard interface for loading documents, as well as specific integrations to all types of text data sources.
|
|
||||||
|
|
||||||
- `Utils <./modules/utils.html>`_: Language models are often more powerful when interacting with other sources of knowledge or computation. This can include Python REPLs, embeddings, search engines, and more. LangChain provides a large collection of common utils to use in your application.
|
|
||||||
|
|
||||||
- `Chains <./modules/chains.html>`_: Chains go beyond just a single LLM call, and are sequences of calls (whether to an LLM or a different utility). LangChain provides a standard interface for chains, lots of integrations with other tools, and end-to-end chains for common applications.
|
|
||||||
|
|
||||||
- `Indexes <./modules/indexes.html>`_: Language models are often more powerful when combined with your own text data - this module covers best practices for doing exactly that.
|
- `Indexes <./modules/indexes.html>`_: Language models are often more powerful when combined with your own text data - this module covers best practices for doing exactly that.
|
||||||
|
|
||||||
- `Agents <./modules/agents.html>`_: Agents involve an LLM making decisions about which Actions to take, taking that Action, seeing an Observation, and repeating that until done. LangChain provides a standard interface for agents, a selection of agents to choose from, and examples of end to end agents.
|
- `Chains <./modules/chains.html>`_: Chains go beyond just a single LLM call, and are sequences of calls (whether to an LLM or a different utility). LangChain provides a standard interface for chains, lots of integrations with other tools, and end-to-end chains for common applications.
|
||||||
|
|
||||||
- `Memory <./modules/memory.html>`_: Memory is the concept of persisting state between calls of a chain/agent. LangChain provides a standard interface for memory, a collection of memory implementations, and examples of chains/agents that use memory.
|
- `Agents <./modules/agents.html>`_: Agents involve an LLM making decisions about which Actions to take, taking that Action, seeing an Observation, and repeating that until done. LangChain provides a standard interface for agents, a selection of agents to choose from, and examples of end to end agents.
|
||||||
|
|
||||||
|
|
||||||
.. toctree::
|
.. toctree::
|
||||||
@ -70,37 +51,34 @@ These modules are, in increasing order of complexity:
|
|||||||
:name: modules
|
:name: modules
|
||||||
:hidden:
|
:hidden:
|
||||||
|
|
||||||
./modules/prompts.md
|
./modules/models.rst
|
||||||
./modules/llms.md
|
./modules/prompts.rst
|
||||||
./modules/document_loaders.md
|
|
||||||
./modules/utils.md
|
|
||||||
./modules/indexes.md
|
./modules/indexes.md
|
||||||
|
./modules/memory.md
|
||||||
./modules/chains.md
|
./modules/chains.md
|
||||||
./modules/agents.md
|
./modules/agents.md
|
||||||
./modules/memory.md
|
|
||||||
|
|
||||||
Use Cases
|
Use Cases
|
||||||
----------
|
----------
|
||||||
|
|
||||||
The above modules can be used in a variety of ways. LangChain also provides guidance and assistance in this. Below are some of the common use cases LangChain supports.
|
The above modules can be used in a variety of ways. LangChain also provides guidance and assistance in this. Below are some of the common use cases LangChain supports.
|
||||||
|
|
||||||
- `Agents <./use_cases/agents.html>`_: Agents are systems that use a language model to interact with other tools. These can be used to do more grounded question/answering, interact with APIs, or even take actions.
|
- `Personal Assistants <./use_cases/personal_assistants.html>`_: The main LangChain use case. Personal assistants need to take actions, remember interactions, and have knowledge about your data.
|
||||||
|
|
||||||
|
- `Question Answering <./use_cases/question_answering.html>`_: The second big LangChain use case. Answering questions over specific documents, only utilizing the information in those documents to construct an answer.
|
||||||
|
|
||||||
- `Chatbots <./use_cases/chatbots.html>`_: Since language models are good at producing text, that makes them ideal for creating chatbots.
|
- `Chatbots <./use_cases/chatbots.html>`_: Since language models are good at producing text, that makes them ideal for creating chatbots.
|
||||||
|
|
||||||
- `Data Augmented Generation <./use_cases/combine_docs.html>`_: Data Augmented Generation involves specific types of chains that first interact with an external datasource to fetch data to use in the generation step. Examples of this include summarization of long pieces of text and question/answering over specific data sources.
|
- `Querying Tabular Data <./use_cases/tabular.html>`_: If you want to understand how to use LLMs to query data that is stored in a tabular format (csvs, SQL, dataframes, etc) you should read this page.
|
||||||
|
|
||||||
- `Question Answering <./use_cases/question_answering.html>`_: Answering questions over specific documents, only utilizing the information in those documents to construct an answer. A type of Data Augmented Generation.
|
- `Interacting with APIs <./use_cases/apis.html>`_: Enabling LLMs to interact with APIs is extremely powerful in order to give them more up-to-date information and allow them to take actions.
|
||||||
|
|
||||||
|
- `Extraction <./use_cases/extraction.html>`_: Extract structured information from text.
|
||||||
|
|
||||||
- `Summarization <./use_cases/summarization.html>`_: Summarizing longer documents into shorter, more condensed chunks of information. A type of Data Augmented Generation.
|
- `Summarization <./use_cases/summarization.html>`_: Summarizing longer documents into shorter, more condensed chunks of information. A type of Data Augmented Generation.
|
||||||
|
|
||||||
- `Evaluation <./use_cases/evaluation.html>`_: Generative models are notoriously hard to evaluate with traditional metrics. One new way of evaluating them is using language models themselves to do the evaluation. LangChain provides some prompts/chains for assisting in this.
|
- `Evaluation <./use_cases/evaluation.html>`_: Generative models are notoriously hard to evaluate with traditional metrics. One new way of evaluating them is using language models themselves to do the evaluation. LangChain provides some prompts/chains for assisting in this.
|
||||||
|
|
||||||
- `Generate similar examples <./use_cases/generate_examples.html>`_: Generating similar examples to a given input. This is a common use case for many applications, and LangChain provides some prompts/chains for assisting in this.
|
|
||||||
|
|
||||||
- `Compare models <./use_cases/model_laboratory.html>`_: Experimenting with different prompts, models, and chains is a big part of developing the best possible application. The ModelLaboratory makes it easy to do so.
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
.. toctree::
|
.. toctree::
|
||||||
:maxdepth: 1
|
:maxdepth: 1
|
||||||
@ -108,14 +86,14 @@ The above modules can be used in a variety of ways. LangChain also provides guid
|
|||||||
:name: use_cases
|
:name: use_cases
|
||||||
:hidden:
|
:hidden:
|
||||||
|
|
||||||
./use_cases/agents.md
|
./use_cases/personal_assistants.md
|
||||||
./use_cases/chatbots.md
|
|
||||||
./use_cases/generate_examples.ipynb
|
|
||||||
./use_cases/combine_docs.md
|
|
||||||
./use_cases/question_answering.md
|
./use_cases/question_answering.md
|
||||||
|
./use_cases/chatbots.md
|
||||||
|
./use_cases/tabular.rst
|
||||||
|
./use_cases/apis.md
|
||||||
./use_cases/summarization.md
|
./use_cases/summarization.md
|
||||||
|
./use_cases/extraction.md
|
||||||
./use_cases/evaluation.rst
|
./use_cases/evaluation.rst
|
||||||
./use_cases/model_laboratory.ipynb
|
|
||||||
|
|
||||||
|
|
||||||
Reference Docs
|
Reference Docs
|
||||||
@ -166,10 +144,12 @@ Additional collection of resources we think may be useful as you develop your ap
|
|||||||
|
|
||||||
- `Deployments <./deployments.html>`_: A collection of instructions, code snippets, and template repositories for deploying LangChain apps.
|
- `Deployments <./deployments.html>`_: A collection of instructions, code snippets, and template repositories for deploying LangChain apps.
|
||||||
|
|
||||||
- `Discord <https://discord.gg/6adMQxSpJS>`_: Join us on our Discord to discuss all things LangChain!
|
|
||||||
|
|
||||||
- `Tracing <./tracing.html>`_: A guide on using tracing in LangChain to visualize the execution of chains and agents.
|
- `Tracing <./tracing.html>`_: A guide on using tracing in LangChain to visualize the execution of chains and agents.
|
||||||
|
|
||||||
|
- `Model Laboratory <./model_laboratory.html>`_: Experimenting with different prompts, models, and chains is a big part of developing the best possible application. The ModelLaboratory makes it easy to do so.
|
||||||
|
|
||||||
|
- `Discord <https://discord.gg/6adMQxSpJS>`_: Join us on our Discord to discuss all things LangChain!
|
||||||
|
|
||||||
- `Production Support <https://forms.gle/57d8AmXBYp8PP8tZA>`_: As you move your LangChains into production, we'd love to offer more comprehensive support. Please fill out this form and we'll set up a dedicated support Slack channel.
|
- `Production Support <https://forms.gle/57d8AmXBYp8PP8tZA>`_: As you move your LangChains into production, we'd love to offer more comprehensive support. Please fill out this form and we'll set up a dedicated support Slack channel.
|
||||||
|
|
||||||
|
|
||||||
@ -184,5 +164,6 @@ Additional collection of resources we think may be useful as you develop your ap
|
|||||||
./gallery.rst
|
./gallery.rst
|
||||||
./deployments.md
|
./deployments.md
|
||||||
./tracing.md
|
./tracing.md
|
||||||
|
./use_cases/model_laboratory.ipynb
|
||||||
Discord <https://discord.gg/6adMQxSpJS>
|
Discord <https://discord.gg/6adMQxSpJS>
|
||||||
Production Support <https://forms.gle/57d8AmXBYp8PP8tZA>
|
Production Support <https://forms.gle/57d8AmXBYp8PP8tZA>
|
||||||
|
@ -1,30 +1,52 @@
|
|||||||
Agents
|
Agents
|
||||||
==========================
|
==========================
|
||||||
|
|
||||||
|
.. note::
|
||||||
|
`Conceptual Guide <https://docs.langchain.com/docs/components/agents>`_
|
||||||
|
|
||||||
|
|
||||||
Some applications will require not just a predetermined chain of calls to LLMs/other tools,
but potentially an unknown chain that depends on the user input.
but potentially an unknown chain that depends on the user's input.
In these types of chains, there is an “agent” which has access to a suite of tools.
Depending on the user input, the agent can then decide which, if any, of these tools to call.

The following sections of documentation are provided:
In this section of documentation, we first start with a Getting Started notebook to go over how to use all things related to agents in an end-to-end manner.
|
||||||
|
|
||||||
- `Getting Started <./agents/getting_started.html>`_: A notebook to help you get started working with agents as quickly as possible.
|
|
||||||
|
|
||||||
- `Key Concepts <./agents/key_concepts.html>`_: A conceptual guide going over the various concepts related to agents.
|
|
||||||
|
|
||||||
- `How-To Guides <./agents/how_to_guides.html>`_: A collection of how-to guides. These highlight how to integrate various types of tools, how to work with different types of agent, and how to customize agents.
|
|
||||||
|
|
||||||
- `Reference <../reference/modules/agents.html>`_: API reference documentation for all Agent classes.
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
.. toctree::
|
.. toctree::
|
||||||
:maxdepth: 1
|
:maxdepth: 1
|
||||||
:caption: Agents
|
|
||||||
:name: Agents
|
|
||||||
:hidden:
|
:hidden:
|
||||||
|
|
||||||
./agents/getting_started.ipynb
|
./agents/getting_started.ipynb
|
||||||
./agents/key_concepts.md
|
|
||||||
./agents/how_to_guides.rst
|
|
||||||
Reference<../reference/modules/agents.rst>
|
We then split the documentation into the following sections:
|
||||||
|
|
||||||
|
**Tools**
|
||||||
|
|
||||||
|
An overview of the various tools LangChain supports.
|
||||||
|
|
||||||
|
|
||||||
|
**Agents**
|
||||||
|
|
||||||
|
An overview of the different agent types.
|
||||||
|
|
||||||
|
|
||||||
|
**Toolkits**
|
||||||
|
|
||||||
|
An overview of toolkits, and examples of the different ones LangChain supports.
|
||||||
|
|
||||||
|
|
||||||
|
**Agent Executor**
|
||||||
|
|
||||||
|
An overview of the Agent Executor class and examples of how to use it.
|
||||||
|
|
||||||
|
Go Deeper
|
||||||
|
---------
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:maxdepth: 1
|
||||||
|
|
||||||
|
./agents/tools.rst
|
||||||
|
./agents/agents.rst
|
||||||
|
./agents/toolkits.rst
|
||||||
|
./agents/agent_executors.rst
|
||||||
|
17
docs/modules/agents/agent_executors.rst
Normal file
17
docs/modules/agents/agent_executors.rst
Normal file
@ -0,0 +1,17 @@
|
|||||||
|
Agent Executors
|
||||||
|
===============
|
||||||
|
|
||||||
|
.. note::
|
||||||
|
`Conceptual Guide <https://docs.langchain.com/docs/components/agents/agent-executor>`_
|
||||||
|
|
||||||
|
Agent executors take an agent and tools and use the agent to decide which tools to call and in what order.

In this part of the documentation, we cover other functionality related to agent executors.
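As a hedged sketch of the executor pattern described above (the tool selection and question are illustrative placeholders):

```python
from langchain.agents import ZeroShotAgent, AgentExecutor, load_tools
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
tools = load_tools(["llm-math"], llm=llm)

# Build an agent, then wrap it in an executor that runs the decide/act/observe loop.
agent = ZeroShotAgent.from_llm_and_tools(llm=llm, tools=tools)
agent_executor = AgentExecutor.from_agent_and_tools(agent=agent, tools=tools, verbose=True)

agent_executor.run("What is 3 raised to the 0.5 power?")
```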
|
||||||
|
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:maxdepth: 1
|
||||||
|
:glob:
|
||||||
|
|
||||||
|
./agent_executors/examples/*
|
||||||
|
|
@ -5,7 +5,7 @@
|
|||||||
"id": "68b24990",
|
"id": "68b24990",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"# Agents and Vectorstores\n",
|
"# How to combine agents and vectorstores\n",
|
||||||
"\n",
|
"\n",
|
||||||
"This notebook covers how to combine agents and vectorstores. The use case for this is that you've ingested your data into a vectorstore and want to interact with it in an agentic manner.\n",
|
"This notebook covers how to combine agents and vectorstores. The use case for this is that you've ingested your data into a vectorstore and want to interact with it in an agentic manner.\n",
|
||||||
"\n",
|
"\n",
|
||||||
@ -22,7 +22,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 20,
|
"execution_count": 16,
|
||||||
"id": "2e87c10a",
|
"id": "2e87c10a",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -30,13 +30,30 @@
|
|||||||
"from langchain.embeddings.openai import OpenAIEmbeddings\n",
|
"from langchain.embeddings.openai import OpenAIEmbeddings\n",
|
||||||
"from langchain.vectorstores import Chroma\n",
|
"from langchain.vectorstores import Chroma\n",
|
||||||
"from langchain.text_splitter import CharacterTextSplitter\n",
|
"from langchain.text_splitter import CharacterTextSplitter\n",
|
||||||
"from langchain import OpenAI, VectorDBQA\n",
|
"from langchain.llms import OpenAI\n",
|
||||||
|
"from langchain.chains import RetrievalQA\n",
|
||||||
"llm = OpenAI(temperature=0)"
|
"llm = OpenAI(temperature=0)"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 37,
|
"execution_count": 17,
|
||||||
|
"id": "0b7b772b",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from pathlib import Path\n",
|
||||||
|
"relevant_parts = []\n",
|
||||||
|
"for p in Path(\".\").absolute().parts:\n",
|
||||||
|
" relevant_parts.append(p)\n",
|
||||||
|
" if relevant_parts[-3:] == [\"langchain\", \"docs\", \"modules\"]:\n",
|
||||||
|
" break\n",
|
||||||
|
"doc_path = str(Path(*relevant_parts) / \"state_of_the_union.txt\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 18,
|
||||||
"id": "f2675861",
|
"id": "f2675861",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [
|
"outputs": [
|
||||||
@ -51,7 +68,7 @@
|
|||||||
],
|
],
|
||||||
"source": [
|
"source": [
|
||||||
"from langchain.document_loaders import TextLoader\n",
|
"from langchain.document_loaders import TextLoader\n",
|
||||||
"loader = TextLoader('../../state_of_the_union.txt')\n",
|
"loader = TextLoader(doc_path)\n",
|
||||||
"documents = loader.load()\n",
|
"documents = loader.load()\n",
|
||||||
"text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)\n",
|
"text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)\n",
|
||||||
"texts = text_splitter.split_documents(documents)\n",
|
"texts = text_splitter.split_documents(documents)\n",
|
||||||
@ -62,17 +79,17 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 38,
|
"execution_count": 4,
|
||||||
"id": "bc5403d4",
|
"id": "bc5403d4",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"state_of_union = VectorDBQA.from_chain_type(llm=llm, chain_type=\"stuff\", vectorstore=docsearch)"
|
"state_of_union = RetrievalQA.from_chain_type(llm=llm, chain_type=\"stuff\", retriever=docsearch.as_retriever())"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 39,
|
"execution_count": 5,
|
||||||
"id": "1431cded",
|
"id": "1431cded",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -82,7 +99,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 40,
|
"execution_count": 6,
|
||||||
"id": "915d3ff3",
|
"id": "915d3ff3",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -92,7 +109,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 41,
|
"execution_count": 7,
|
||||||
"id": "96a2edf8",
|
"id": "96a2edf8",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [
|
"outputs": [
|
||||||
@ -109,7 +126,7 @@
|
|||||||
"docs = loader.load()\n",
|
"docs = loader.load()\n",
|
||||||
"ruff_texts = text_splitter.split_documents(docs)\n",
|
"ruff_texts = text_splitter.split_documents(docs)\n",
|
||||||
"ruff_db = Chroma.from_documents(ruff_texts, embeddings, collection_name=\"ruff\")\n",
|
"ruff_db = Chroma.from_documents(ruff_texts, embeddings, collection_name=\"ruff\")\n",
|
||||||
"ruff = VectorDBQA.from_chain_type(llm=llm, chain_type=\"stuff\", vectorstore=ruff_db)"
|
"ruff = RetrievalQA.from_chain_type(llm=llm, chain_type=\"stuff\", retriever=ruff_db.as_retriever())"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@ -264,9 +281,9 @@
|
|||||||
"id": "9161ba91",
|
"id": "9161ba91",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"You can also set `return_direct=True` if you intend to use the agent as a router and just want to directly return the result of the VectorDBQaChain.\n",
|
"You can also set `return_direct=True` if you intend to use the agent as a router and just want to directly return the result of the RetrievalQAChain.\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Notice that in the above examples the agent did some extra work after querying the VectorDBQAChain. You can avoid that and just return the result directly."
|
"Notice that in the above examples the agent did some extra work after querying the RetrievalQAChain. You can avoid that and just return the result directly."
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
@ -5,7 +5,7 @@
|
|||||||
"id": "6fb92deb-d89e-439b-855d-c7f2607d794b",
|
"id": "6fb92deb-d89e-439b-855d-c7f2607d794b",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"# Async API for Agent\n",
|
"# How to use the async API for Agents\n",
|
||||||
"\n",
|
"\n",
|
||||||
"LangChain provides async support for Agents by leveraging the [asyncio](https://docs.python.org/3/library/asyncio.html) library.\n",
|
"LangChain provides async support for Agents by leveraging the [asyncio](https://docs.python.org/3/library/asyncio.html) library.\n",
|
||||||
"\n",
|
"\n",
|
@ -5,7 +5,7 @@
|
|||||||
"id": "b253f4d5",
|
"id": "b253f4d5",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"# ChatGPT Clone\n",
|
"# How to create ChatGPT Clone\n",
|
||||||
"\n",
|
"\n",
|
||||||
"This chain replicates ChatGPT by combining (1) a specific prompt, and (2) the concept of memory.\n",
|
"This chain replicates ChatGPT by combining (1) a specific prompt, and (2) the concept of memory.\n",
|
||||||
"\n",
|
"\n",
|
||||||
@ -14,7 +14,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 1,
|
"execution_count": 2,
|
||||||
"id": "a99acd89",
|
"id": "a99acd89",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [
|
"outputs": [
|
||||||
@ -38,18 +38,17 @@
|
|||||||
"Human: I want you to act as a Linux terminal. I will type commands and you will reply with what the terminal should show. I want you to only reply with the terminal output inside one unique code block, and nothing else. Do not write explanations. Do not type commands unless I instruct you to do so. When I need to tell you something in English I will do so by putting text inside curly brackets {like this}. My first command is pwd.\n",
|
"Human: I want you to act as a Linux terminal. I will type commands and you will reply with what the terminal should show. I want you to only reply with the terminal output inside one unique code block, and nothing else. Do not write explanations. Do not type commands unless I instruct you to do so. When I need to tell you something in English I will do so by putting text inside curly brackets {like this}. My first command is pwd.\n",
|
||||||
"Assistant:\u001b[0m\n",
|
"Assistant:\u001b[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished LLMChain chain.\u001b[0m\n",
|
"\u001b[1m> Finished chain.\u001b[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"```\n",
|
"```\n",
|
||||||
"$ pwd\n",
|
"/home/user\n",
|
||||||
"/\n",
|
|
||||||
"```\n"
|
"```\n"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
"source": [
|
"source": [
|
||||||
"from langchain import OpenAI, ConversationChain, LLMChain, PromptTemplate\n",
|
"from langchain import OpenAI, ConversationChain, LLMChain, PromptTemplate\n",
|
||||||
"from langchain.chains.conversation.memory import ConversationalBufferWindowMemory\n",
|
"from langchain.memory import ConversationBufferWindowMemory\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"template = \"\"\"Assistant is a large language model trained by OpenAI.\n",
|
"template = \"\"\"Assistant is a large language model trained by OpenAI.\n",
|
||||||
@ -74,7 +73,7 @@
|
|||||||
" llm=OpenAI(temperature=0), \n",
|
" llm=OpenAI(temperature=0), \n",
|
||||||
" prompt=prompt, \n",
|
" prompt=prompt, \n",
|
||||||
" verbose=True, \n",
|
" verbose=True, \n",
|
||||||
" memory=ConversationalBufferWindowMemory(k=2),\n",
|
" memory=ConversationBufferWindowMemory(k=2),\n",
|
||||||
")\n",
|
")\n",
|
||||||
"\n",
|
"\n",
|
||||||
"output = chatgpt_chain.predict(human_input=\"I want you to act as a Linux terminal. I will type commands and you will reply with what the terminal should show. I want you to only reply with the terminal output inside one unique code block, and nothing else. Do not write explanations. Do not type commands unless I instruct you to do so. When I need to tell you something in English I will do so by putting text inside curly brackets {like this}. My first command is pwd.\")\n",
|
"output = chatgpt_chain.predict(human_input=\"I want you to act as a Linux terminal. I will type commands and you will reply with what the terminal should show. I want you to only reply with the terminal output inside one unique code block, and nothing else. Do not write explanations. Do not type commands unless I instruct you to do so. When I need to tell you something in English I will do so by putting text inside curly brackets {like this}. My first command is pwd.\")\n",
|
||||||
@ -961,7 +960,7 @@
|
|||||||
"name": "python",
|
"name": "python",
|
||||||
"nbconvert_exporter": "python",
|
"nbconvert_exporter": "python",
|
||||||
"pygments_lexer": "ipython3",
|
"pygments_lexer": "ipython3",
|
||||||
"version": "3.10.9"
|
"version": "3.9.1"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"nbformat": 4,
|
"nbformat": 4,
|
@ -5,7 +5,7 @@
|
|||||||
"id": "5436020b",
|
"id": "5436020b",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"# Intermediate Steps\n",
|
"# How to access intermediate steps\n",
|
||||||
"\n",
|
"\n",
|
||||||
"In order to get more visibility into what an agent is doing, we can also return intermediate steps. This comes in the form of an extra key in the return value, which is a list of (action, observation) tuples."
|
"In order to get more visibility into what an agent is doing, we can also return intermediate steps. This comes in the form of an extra key in the return value, which is a list of (action, observation) tuples."
|
||||||
]
|
]
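The renamed notebook above documents returning intermediate steps; as a hedged sketch of the feature it describes (the question string is a placeholder, and a SerpAPI key is assumed for the search tool):

```python
from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
tools = load_tools(["serpapi", "llm-math"], llm=llm)

# return_intermediate_steps adds an "intermediate_steps" key to the output dict.
agent = initialize_agent(
    tools, llm, agent="zero-shot-react-description", return_intermediate_steps=True
)

response = agent({"input": "Who is the current UK prime minister? What is their age raised to the 0.5 power?"})
print(response["intermediate_steps"])  # list of (AgentAction, observation) tuples
```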
|
@ -5,7 +5,7 @@
|
|||||||
"id": "75c041b7",
|
"id": "75c041b7",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"# Max Iterations\n",
|
"# How to cap the max number of iterations\n",
|
||||||
"\n",
|
"\n",
|
||||||
"This notebook walks through how to cap an agent at taking a certain number of steps. This can be useful to ensure that they do not go haywire and take too many steps."
|
"This notebook walks through how to cap an agent at taking a certain number of steps. This can be useful to ensure that they do not go haywire and take too many steps."
|
||||||
]
|
]
|
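The max-iterations notebook above caps how many steps an agent may take. A minimal sketch, assuming the `max_iterations` parameter accepted by `initialize_agent` / `AgentExecutor` (the deliberately unhelpful tool and the prompt are illustrative):

```python
from langchain import OpenAI
from langchain.agents import initialize_agent, Tool

# A tool that never gives a useful answer, so the agent would loop without a cap.
tools = [
    Tool(
        name="Jester",
        func=lambda query: "foo",
        description="call this to get an answer",
    )
]

agent = initialize_agent(
    tools,
    OpenAI(temperature=0),
    agent="zero-shot-react-description",
    verbose=True,
    max_iterations=2,  # stop after two (action, observation) steps
)

# Instead of looping forever, the run stops early and returns a best-effort answer.
agent.run("Use the Jester tool three times before giving a final answer.")
```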
@ -0,0 +1,548 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "fa6802ac",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# How to add SharedMemory to an Agent and its Tools\n",
|
||||||
|
"\n",
|
||||||
|
"This notebook goes over adding memory to **both** of an Agent and its tools. Before going through this notebook, please walk through the following notebooks, as this will build on top of both of them:\n",
|
||||||
|
"\n",
|
||||||
|
"- [Adding memory to an LLM Chain](../../memory/examples/adding_memory.ipynb)\n",
|
||||||
|
"- [Custom Agents](custom_agent.ipynb)\n",
|
||||||
|
"\n",
|
||||||
|
"We are going to create a custom Agent. The agent has access to a conversation memory, search tool, and a summarization tool. And, the summarization tool also needs access to the conversation memory."
|
||||||
|
]
|
||||||
|
},
|
||||||
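The cell above describes the setup: a custom agent with conversation memory, plus a summarization tool that reads the same memory. A condensed sketch of the read-only-memory piece, using the same `ReadOnlySharedMemory` wrapper this notebook imports (the prompt text is abbreviated for illustration):

```python
from langchain import LLMChain, OpenAI, PromptTemplate
from langchain.memory import ConversationBufferMemory, ReadOnlySharedMemory

# One memory object holds the conversation; the agent reads and writes it.
memory = ConversationBufferMemory(memory_key="chat_history")

# Tools get a read-only view, so calling a tool never rewrites the agent's history.
readonly = ReadOnlySharedMemory(memory=memory)

summary_prompt = PromptTemplate(
    input_variables=["input", "chat_history"],
    template="Conversation so far:\n{chat_history}\n\nWrite a summary of the conversation for {input}:",
)
summary_chain = LLMChain(llm=OpenAI(), prompt=summary_prompt, memory=readonly)
```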
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"id": "8db95912",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.agents import ZeroShotAgent, Tool, AgentExecutor\n",
|
||||||
|
"from langchain.memory import ConversationBufferMemory, ReadOnlySharedMemory\n",
|
||||||
|
"from langchain import OpenAI, LLMChain, PromptTemplate\n",
|
||||||
|
"from langchain.utilities import GoogleSearchAPIWrapper"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
|
"id": "06b7187b",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"template = \"\"\"This is a conversation between a human and a bot:\n",
|
||||||
|
"\n",
|
||||||
|
"{chat_history}\n",
|
||||||
|
"\n",
|
||||||
|
"Write a summary of the conversation for {input}:\n",
|
||||||
|
"\"\"\"\n",
|
||||||
|
"\n",
|
||||||
|
"prompt = PromptTemplate(\n",
|
||||||
|
" input_variables=[\"input\", \"chat_history\"], \n",
|
||||||
|
" template=template\n",
|
||||||
|
")\n",
|
||||||
|
"memory = ConversationBufferMemory(memory_key=\"chat_history\")\n",
|
||||||
|
"readonlymemory = ReadOnlySharedMemory(memory=memory)\n",
|
||||||
|
"summry_chain = LLMChain(\n",
|
||||||
|
" llm=OpenAI(), \n",
|
||||||
|
" prompt=prompt, \n",
|
||||||
|
" verbose=True, \n",
|
||||||
|
" memory=readonlymemory, # use the read-only memory to prevent the tool from modifying the memory\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "97ad8467",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"search = GoogleSearchAPIWrapper()\n",
|
||||||
|
"tools = [\n",
|
||||||
|
" Tool(\n",
|
||||||
|
" name = \"Search\",\n",
|
||||||
|
" func=search.run,\n",
|
||||||
|
" description=\"useful for when you need to answer questions about current events\"\n",
|
||||||
|
" ),\n",
|
||||||
|
" Tool(\n",
|
||||||
|
" name = \"Summary\",\n",
|
||||||
|
" func=summry_chain.run,\n",
|
||||||
|
" description=\"useful for when you summarize a conversation. The input to this tool should be a string, representing who will read this summary.\"\n",
|
||||||
|
" )\n",
|
||||||
|
"]"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 4,
|
||||||
|
"id": "e3439cd6",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"prefix = \"\"\"Have a conversation with a human, answering the following questions as best you can. You have access to the following tools:\"\"\"\n",
|
||||||
|
"suffix = \"\"\"Begin!\"\n",
|
||||||
|
"\n",
|
||||||
|
"{chat_history}\n",
|
||||||
|
"Question: {input}\n",
|
||||||
|
"{agent_scratchpad}\"\"\"\n",
|
||||||
|
"\n",
|
||||||
|
"prompt = ZeroShotAgent.create_prompt(\n",
|
||||||
|
" tools, \n",
|
||||||
|
" prefix=prefix, \n",
|
||||||
|
" suffix=suffix, \n",
|
||||||
|
" input_variables=[\"input\", \"chat_history\", \"agent_scratchpad\"]\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "0021675b",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"We can now construct the LLMChain, with the Memory object, and then create the agent."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 5,
|
||||||
|
"id": "c56a0e73",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"llm_chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)\n",
|
||||||
|
"agent = ZeroShotAgent(llm_chain=llm_chain, tools=tools, verbose=True)\n",
|
||||||
|
"agent_chain = AgentExecutor.from_agent_and_tools(agent=agent, tools=tools, verbose=True, memory=memory)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 6,
|
||||||
|
"id": "ca4bc1fb",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I should research ChatGPT to answer this question.\n",
|
||||||
|
"Action: Search\n",
|
||||||
|
"Action Input: \"ChatGPT\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mNov 30, 2022 ... We've trained a model called ChatGPT which interacts in a conversational way. The dialogue format makes it possible for ChatGPT to answer ... ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large ... ChatGPT. We've trained a model called ChatGPT which interacts in a conversational way. The dialogue format makes it possible for ChatGPT to answer ... Feb 2, 2023 ... ChatGPT, the popular chatbot from OpenAI, is estimated to have reached 100 million monthly active users in January, just two months after ... 2 days ago ... ChatGPT recently launched a new version of its own plagiarism detection tool, with hopes that it will squelch some of the criticism around how ... An API for accessing new AI models developed by OpenAI. Feb 19, 2023 ... ChatGPT is an AI chatbot system that OpenAI released in November to show off and test what a very large, powerful AI system can accomplish. You ... ChatGPT is fine-tuned from GPT-3.5, a language model trained to produce text. ChatGPT was optimized for dialogue by using Reinforcement Learning with Human ... 3 days ago ... Visual ChatGPT connects ChatGPT and a series of Visual Foundation Models to enable sending and receiving images during chatting. Dec 1, 2022 ... ChatGPT is a natural language processing tool driven by AI technology that allows you to have human-like conversations and much more with a ...\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Final Answer: ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large language models and is optimized for dialogue by using Reinforcement Learning with Human-in-the-Loop. It is also capable of sending and receiving images during chatting.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large language models and is optimized for dialogue by using Reinforcement Learning with Human-in-the-Loop. It is also capable of sending and receiving images during chatting.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 6,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"What is ChatGPT?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "45627664",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"To test the memory of this agent, we can ask a followup question that relies on information in the previous exchange to be answered correctly."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 7,
|
||||||
|
"id": "eecc0462",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I need to find out who developed ChatGPT\n",
|
||||||
|
"Action: Search\n",
|
||||||
|
"Action Input: Who developed ChatGPT\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large ... Feb 15, 2023 ... Who owns Chat GPT? Chat GPT is owned and developed by AI research and deployment company, OpenAI. The organization is headquartered in San ... Feb 8, 2023 ... ChatGPT is an AI chatbot developed by San Francisco-based startup OpenAI. OpenAI was co-founded in 2015 by Elon Musk and Sam Altman and is ... Dec 7, 2022 ... ChatGPT is an AI chatbot designed and developed by OpenAI. The bot works by generating text responses based on human-user input, like questions ... Jan 12, 2023 ... In 2019, Microsoft invested $1 billion in OpenAI, the tiny San Francisco company that designed ChatGPT. And in the years since, it has quietly ... Jan 25, 2023 ... The inside story of ChatGPT: How OpenAI founder Sam Altman built the world's hottest technology with billions from Microsoft. Dec 3, 2022 ... ChatGPT went viral on social media for its ability to do anything from code to write essays. · The company that created the AI chatbot has a ... Jan 17, 2023 ... While many Americans were nursing hangovers on New Year's Day, 22-year-old Edward Tian was working feverishly on a new app to combat misuse ... ChatGPT is a language model created by OpenAI, an artificial intelligence research laboratory consisting of a team of researchers and engineers focused on ... 1 day ago ... Everyone is talking about ChatGPT, developed by OpenAI. This is such a great tool that has helped to make AI more accessible to a wider ...\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: ChatGPT was developed by OpenAI.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'ChatGPT was developed by OpenAI.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 7,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"Who developed it?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 8,
|
||||||
|
"id": "c34424cf",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I need to simplify the conversation for a 5 year old.\n",
|
||||||
|
"Action: Summary\n",
|
||||||
|
"Action Input: My daughter 5 years old\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
|
||||||
|
"Prompt after formatting:\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThis is a conversation between a human and a bot:\n",
|
||||||
|
"\n",
|
||||||
|
"Human: What is ChatGPT?\n",
|
||||||
|
"AI: ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large language models and is optimized for dialogue by using Reinforcement Learning with Human-in-the-Loop. It is also capable of sending and receiving images during chatting.\n",
|
||||||
|
"Human: Who developed it?\n",
|
||||||
|
"AI: ChatGPT was developed by OpenAI.\n",
|
||||||
|
"\n",
|
||||||
|
"Write a summary of the conversation for My daughter 5 years old:\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m\n",
|
||||||
|
"The conversation was about ChatGPT, an artificial intelligence chatbot. It was created by OpenAI and can send and receive images while chatting.\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Final Answer: ChatGPT is an artificial intelligence chatbot created by OpenAI that can send and receive images while chatting.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'ChatGPT is an artificial intelligence chatbot created by OpenAI that can send and receive images while chatting.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 8,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"Thanks. Summarize the conversation, for my daughter 5 years old.\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "4ebd8326",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Confirm that the memory was correctly updated."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 9,
|
||||||
|
"id": "b91f8c85",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"Human: What is ChatGPT?\n",
|
||||||
|
"AI: ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large language models and is optimized for dialogue by using Reinforcement Learning with Human-in-the-Loop. It is also capable of sending and receiving images during chatting.\n",
|
||||||
|
"Human: Who developed it?\n",
|
||||||
|
"AI: ChatGPT was developed by OpenAI.\n",
|
||||||
|
"Human: Thanks. Summarize the conversation, for my daughter 5 years old.\n",
|
||||||
|
"AI: ChatGPT is an artificial intelligence chatbot created by OpenAI that can send and receive images while chatting.\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"print(agent_chain.memory.buffer)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "cc3d0aa4",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"For comparison, below is a bad example that uses the same memory for both the Agent and the tool."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 10,
|
||||||
|
"id": "3359d043",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"## This is a bad practice for using the memory.\n",
|
||||||
|
"## Use the ReadOnlySharedMemory class, as shown above.\n",
|
||||||
|
"\n",
|
||||||
|
"template = \"\"\"This is a conversation between a human and a bot:\n",
|
||||||
|
"\n",
|
||||||
|
"{chat_history}\n",
|
||||||
|
"\n",
|
||||||
|
"Write a summary of the conversation for {input}:\n",
|
||||||
|
"\"\"\"\n",
|
||||||
|
"\n",
|
||||||
|
"prompt = PromptTemplate(\n",
|
||||||
|
" input_variables=[\"input\", \"chat_history\"], \n",
|
||||||
|
" template=template\n",
|
||||||
|
")\n",
|
||||||
|
"memory = ConversationBufferMemory(memory_key=\"chat_history\")\n",
|
||||||
|
"summry_chain = LLMChain(\n",
|
||||||
|
" llm=OpenAI(), \n",
|
||||||
|
" prompt=prompt, \n",
|
||||||
|
" verbose=True, \n",
|
||||||
|
" memory=memory, # <--- this is the only change\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"search = GoogleSearchAPIWrapper()\n",
|
||||||
|
"tools = [\n",
|
||||||
|
" Tool(\n",
|
||||||
|
" name = \"Search\",\n",
|
||||||
|
" func=search.run,\n",
|
||||||
|
" description=\"useful for when you need to answer questions about current events\"\n",
|
||||||
|
" ),\n",
|
||||||
|
" Tool(\n",
|
||||||
|
" name = \"Summary\",\n",
|
||||||
|
" func=summry_chain.run,\n",
|
||||||
|
" description=\"useful for when you summarize a conversation. The input to this tool should be a string, representing who will read this summary.\"\n",
|
||||||
|
" )\n",
|
||||||
|
"]\n",
|
||||||
|
"\n",
|
||||||
|
"prefix = \"\"\"Have a conversation with a human, answering the following questions as best you can. You have access to the following tools:\"\"\"\n",
|
||||||
|
"suffix = \"\"\"Begin!\"\n",
|
||||||
|
"\n",
|
||||||
|
"{chat_history}\n",
|
||||||
|
"Question: {input}\n",
|
||||||
|
"{agent_scratchpad}\"\"\"\n",
|
||||||
|
"\n",
|
||||||
|
"prompt = ZeroShotAgent.create_prompt(\n",
|
||||||
|
" tools, \n",
|
||||||
|
" prefix=prefix, \n",
|
||||||
|
" suffix=suffix, \n",
|
||||||
|
" input_variables=[\"input\", \"chat_history\", \"agent_scratchpad\"]\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"llm_chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)\n",
|
||||||
|
"agent = ZeroShotAgent(llm_chain=llm_chain, tools=tools, verbose=True)\n",
|
||||||
|
"agent_chain = AgentExecutor.from_agent_and_tools(agent=agent, tools=tools, verbose=True, memory=memory)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 11,
|
||||||
|
"id": "970d23df",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I should research ChatGPT to answer this question.\n",
|
||||||
|
"Action: Search\n",
|
||||||
|
"Action Input: \"ChatGPT\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mNov 30, 2022 ... We've trained a model called ChatGPT which interacts in a conversational way. The dialogue format makes it possible for ChatGPT to answer ... ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large ... ChatGPT. We've trained a model called ChatGPT which interacts in a conversational way. The dialogue format makes it possible for ChatGPT to answer ... Feb 2, 2023 ... ChatGPT, the popular chatbot from OpenAI, is estimated to have reached 100 million monthly active users in January, just two months after ... 2 days ago ... ChatGPT recently launched a new version of its own plagiarism detection tool, with hopes that it will squelch some of the criticism around how ... An API for accessing new AI models developed by OpenAI. Feb 19, 2023 ... ChatGPT is an AI chatbot system that OpenAI released in November to show off and test what a very large, powerful AI system can accomplish. You ... ChatGPT is fine-tuned from GPT-3.5, a language model trained to produce text. ChatGPT was optimized for dialogue by using Reinforcement Learning with Human ... 3 days ago ... Visual ChatGPT connects ChatGPT and a series of Visual Foundation Models to enable sending and receiving images during chatting. Dec 1, 2022 ... ChatGPT is a natural language processing tool driven by AI technology that allows you to have human-like conversations and much more with a ...\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Final Answer: ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large language models and is optimized for dialogue by using Reinforcement Learning with Human-in-the-Loop. It is also capable of sending and receiving images during chatting.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large language models and is optimized for dialogue by using Reinforcement Learning with Human-in-the-Loop. It is also capable of sending and receiving images during chatting.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 11,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"What is ChatGPT?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 12,
|
||||||
|
"id": "d9ea82f0",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I need to find out who developed ChatGPT\n",
|
||||||
|
"Action: Search\n",
|
||||||
|
"Action Input: Who developed ChatGPT\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large ... Feb 15, 2023 ... Who owns Chat GPT? Chat GPT is owned and developed by AI research and deployment company, OpenAI. The organization is headquartered in San ... Feb 8, 2023 ... ChatGPT is an AI chatbot developed by San Francisco-based startup OpenAI. OpenAI was co-founded in 2015 by Elon Musk and Sam Altman and is ... Dec 7, 2022 ... ChatGPT is an AI chatbot designed and developed by OpenAI. The bot works by generating text responses based on human-user input, like questions ... Jan 12, 2023 ... In 2019, Microsoft invested $1 billion in OpenAI, the tiny San Francisco company that designed ChatGPT. And in the years since, it has quietly ... Jan 25, 2023 ... The inside story of ChatGPT: How OpenAI founder Sam Altman built the world's hottest technology with billions from Microsoft. Dec 3, 2022 ... ChatGPT went viral on social media for its ability to do anything from code to write essays. · The company that created the AI chatbot has a ... Jan 17, 2023 ... While many Americans were nursing hangovers on New Year's Day, 22-year-old Edward Tian was working feverishly on a new app to combat misuse ... ChatGPT is a language model created by OpenAI, an artificial intelligence research laboratory consisting of a team of researchers and engineers focused on ... 1 day ago ... Everyone is talking about ChatGPT, developed by OpenAI. This is such a great tool that has helped to make AI more accessible to a wider ...\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: ChatGPT was developed by OpenAI.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'ChatGPT was developed by OpenAI.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 12,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"Who developed it?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 13,
|
||||||
|
"id": "5b1f9223",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I need to simplify the conversation for a 5 year old.\n",
|
||||||
|
"Action: Summary\n",
|
||||||
|
"Action Input: My daughter 5 years old\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
|
||||||
|
"Prompt after formatting:\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThis is a conversation between a human and a bot:\n",
|
||||||
|
"\n",
|
||||||
|
"Human: What is ChatGPT?\n",
|
||||||
|
"AI: ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large language models and is optimized for dialogue by using Reinforcement Learning with Human-in-the-Loop. It is also capable of sending and receiving images during chatting.\n",
|
||||||
|
"Human: Who developed it?\n",
|
||||||
|
"AI: ChatGPT was developed by OpenAI.\n",
|
||||||
|
"\n",
|
||||||
|
"Write a summary of the conversation for My daughter 5 years old:\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m\n",
|
||||||
|
"The conversation was about ChatGPT, an artificial intelligence chatbot developed by OpenAI. It is designed to have conversations with humans and can also send and receive images.\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Final Answer: ChatGPT is an artificial intelligence chatbot developed by OpenAI that can have conversations with humans and send and receive images.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'ChatGPT is an artificial intelligence chatbot developed by OpenAI that can have conversations with humans and send and receive images.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 13,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"Thanks. Summarize the conversation, for my daughter 5 years old.\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "d07415da",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"The final answer is not wrong, but we see the 3rd Human input is actually from the agent in the memory because the memory was modified by the summary tool."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 14,
|
||||||
|
"id": "32f97b21",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"Human: What is ChatGPT?\n",
|
||||||
|
"AI: ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3 family of large language models and is optimized for dialogue by using Reinforcement Learning with Human-in-the-Loop. It is also capable of sending and receiving images during chatting.\n",
|
||||||
|
"Human: Who developed it?\n",
|
||||||
|
"AI: ChatGPT was developed by OpenAI.\n",
|
||||||
|
"Human: My daughter 5 years old\n",
|
||||||
|
"AI: \n",
|
||||||
|
"The conversation was about ChatGPT, an artificial intelligence chatbot developed by OpenAI. It is designed to have conversations with humans and can also send and receive images.\n",
|
||||||
|
"Human: Thanks. Summarize the conversation, for my daughter 5 years old.\n",
|
||||||
|
"AI: ChatGPT is an artificial intelligence chatbot developed by OpenAI that can have conversations with humans and send and receive images.\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"print(agent_chain.memory.buffer)"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
|
35
docs/modules/agents/agents.rst
Normal file
@ -0,0 +1,35 @@
|
|||||||
|
Agents
|
||||||
|
=============
|
||||||
|
|
||||||
|
.. note::
|
||||||
|
`Conceptual Guide <https://docs.langchain.com/docs/components/agents/agent>`_
|
||||||
|
|
||||||
|
|
||||||
|
In this part of the documentation we cover the different types of agents, regardless of which specific tools they are used with.
|
||||||
|
|
||||||
|
For a high-level overview of the different agent types, see the documentation below.
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:maxdepth: 1
|
||||||
|
:glob:
|
||||||
|
|
||||||
|
./agents/agent_types.md
|
||||||
|
|
||||||
|
For documentation on how to create a custom agent, see below.
|
||||||
|
|
||||||
|
We also have documentation for an in-depth dive into each agent type.
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:maxdepth: 1
|
||||||
|
:glob:
|
||||||
|
|
||||||
|
./agents/custom_agent.ipynb
|
||||||
|
|
||||||
|
We also have documentation for an in-depth dive into each agent type.
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:maxdepth: 1
|
||||||
|
:glob:
|
||||||
|
|
||||||
|
./agents/examples/*
|
||||||
|
|
@ -1,12 +1,9 @@
|
|||||||
# Agents
|
# Agent Types
|
||||||
|
|
||||||
Agents use an LLM to determine which actions to take and in what order.
|
Agents use an LLM to determine which actions to take and in what order.
|
||||||
An action can either be using a tool and observing its output, or returning to the user.
|
An action can either be using a tool and observing its output, or returning a response to the user.
|
||||||
For a list of easily loadable tools, see [here](tools.md).
|
|
||||||
Here are the agents available in LangChain.
|
Here are the agents available in LangChain.
|
||||||
|
|
||||||
For a tutorial on how to load agents, see [here](getting_started.ipynb).
|
|
||||||
|
|
||||||
## `zero-shot-react-description`
|
## `zero-shot-react-description`
|
||||||
|
|
||||||
This agent uses the ReAct framework to determine which tool to use
|
This agent uses the ReAct framework to determine which tool to use
|
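The renamed "Agent Types" page above lists `zero-shot-react-description` first. For reference, a minimal sketch of loading that agent type with `initialize_agent` (the SerpAPI tool, the question, and a configured API key are assumptions for illustration):

```python
from langchain import OpenAI
from langchain.agents import initialize_agent, Tool
from langchain.utilities import SerpAPIWrapper

search = SerpAPIWrapper()
tools = [
    Tool(
        name="Search",
        func=search.run,
        description="useful for when you need to answer questions about current events",
    )
]

# The agent picks an action from the tool descriptions alone, using the ReAct framework.
agent = initialize_agent(
    tools, OpenAI(temperature=0), agent="zero-shot-react-description", verbose=True
)
agent.run("What is LangChain?")
```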
@ -0,0 +1,309 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "4658d71a",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Conversation Agent (for Chat Models)\n",
|
||||||
|
"\n",
|
||||||
|
"This notebook walks through using an agent optimized for conversation, using ChatModels. Other agents are often optimized for using tools to figure out the best response, which is not ideal in a conversational setting where you may want the agent to be able to chat with the user as well.\n",
|
||||||
|
"\n",
|
||||||
|
"This is accomplished with a specific type of agent (`chat-conversational-react-description`) which expects to be used with a memory component."
|
||||||
|
]
|
||||||
|
},
|
||||||
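The cell above names the agent type and notes that it expects a memory component. Condensed from the cells that follow, a sketch of wiring that memory in (the `return_messages=True` flag matters because chat models consume message objects rather than one flat string; the tool and greeting are illustrative):

```python
from langchain.agents import initialize_agent, Tool
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.utilities import SerpAPIWrapper

search = SerpAPIWrapper()
tools = [
    Tool(
        name="Current Search",
        func=search.run,
        description="useful for when you need to answer questions about current events",
    )
]

# Keep the history as a list of messages, keyed the way the conversational prompt expects.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

agent_chain = initialize_agent(
    tools,
    ChatOpenAI(temperature=0),
    agent="chat-conversational-react-description",
    verbose=True,
    memory=memory,
)
agent_chain.run(input="hi, i am bob")
```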
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"id": "f4f5d1a8",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import os\n",
|
||||||
|
"os.environ[\"LANGCHAIN_HANDLER\"] = \"langchain\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
|
"id": "f65308ab",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.agents import Tool\n",
|
||||||
|
"from langchain.memory import ConversationBufferMemory\n",
|
||||||
|
"from langchain.chat_models import ChatOpenAI\n",
|
||||||
|
"from langchain.utilities import SerpAPIWrapper\n",
|
||||||
|
"from langchain.agents import initialize_agent"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "5fb14d6d",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"search = SerpAPIWrapper()\n",
|
||||||
|
"tools = [\n",
|
||||||
|
" Tool(\n",
|
||||||
|
" name = \"Current Search\",\n",
|
||||||
|
" func=search.run,\n",
|
||||||
|
" description=\"useful for when you need to answer questions about current events or the current state of the world. the input to this should be a single search term.\"\n",
|
||||||
|
" ),\n",
|
||||||
|
"]"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 4,
|
||||||
|
"id": "dddc34c4",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"memory = ConversationBufferMemory(memory_key=\"chat_history\", return_messages=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 5,
|
||||||
|
"id": "cafe9bc1",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"llm=ChatOpenAI(temperature=0)\n",
|
||||||
|
"agent_chain = initialize_agent(tools, llm, agent=\"chat-conversational-react-description\", verbose=True, memory=memory)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 6,
|
||||||
|
"id": "dc70b454",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m{\n",
|
||||||
|
" \"action\": \"Final Answer\",\n",
|
||||||
|
" \"action_input\": \"Hello Bob! How can I assist you today?\"\n",
|
||||||
|
"}\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'Hello Bob! How can I assist you today?'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 6,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"hi, i am bob\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 7,
|
||||||
|
"id": "3dcf7953",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m{\n",
|
||||||
|
" \"action\": \"Final Answer\",\n",
|
||||||
|
" \"action_input\": \"Your name is Bob.\"\n",
|
||||||
|
"}\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'Your name is Bob.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 7,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"what's my name?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 8,
|
||||||
|
"id": "aa05f566",
|
||||||
|
"metadata": {
|
||||||
|
"scrolled": false
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m{\n",
|
||||||
|
" \"action\": \"Current Search\",\n",
|
||||||
|
" \"action_input\": \"Thai food dinner recipes\"\n",
|
||||||
|
"}\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m59 easy Thai recipes for any night of the week · Marion Grasby's Thai spicy chilli and basil fried rice · Thai curry noodle soup · Marion Grasby's ...\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m{\n",
|
||||||
|
" \"action\": \"Final Answer\",\n",
|
||||||
|
" \"action_input\": \"Here are some Thai food dinner recipes you can make this week: Thai spicy chilli and basil fried rice, Thai curry noodle soup, and many more. You can find 59 easy Thai recipes for any night of the week on Marion Grasby's website.\"\n",
|
||||||
|
"}\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"Here are some Thai food dinner recipes you can make this week: Thai spicy chilli and basil fried rice, Thai curry noodle soup, and many more. You can find 59 easy Thai recipes for any night of the week on Marion Grasby's website.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 8,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(\"what are some good dinners to make this week, if i like thai food?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 9,
|
||||||
|
"id": "c5d8b7ea",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m```json\n",
|
||||||
|
"{\n",
|
||||||
|
" \"action\": \"Current Search\",\n",
|
||||||
|
" \"action_input\": \"who won the world cup in 1978\"\n",
|
||||||
|
"}\n",
|
||||||
|
"```\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mThe Argentina national football team represents Argentina in men's international football and is administered by the Argentine Football Association, the governing body for football in Argentina. Nicknamed La Albiceleste, they are the reigning world champions, having won the most recent World Cup in 2022.\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m```json\n",
|
||||||
|
"{\n",
|
||||||
|
" \"action\": \"Final Answer\",\n",
|
||||||
|
" \"action_input\": \"The last letter in your name is 'b'. The Argentina national football team won the World Cup in 1978.\"\n",
|
||||||
|
"}\n",
|
||||||
|
"```\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"The last letter in your name is 'b'. The Argentina national football team won the World Cup in 1978.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 9,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"tell me the last letter in my name, and also tell me who won the world cup in 1978?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 10,
|
||||||
|
"id": "f608889b",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m{\n",
|
||||||
|
" \"action\": \"Current Search\",\n",
|
||||||
|
" \"action_input\": \"weather in pomfret\"\n",
|
||||||
|
"}\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mMostly cloudy with gusty winds developing during the afternoon. A few flurries or snow showers possible. High near 40F. Winds NNW at 20 to 30 mph.\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m{\n",
|
||||||
|
" \"action\": \"Final Answer\",\n",
|
||||||
|
" \"action_input\": \"The weather in Pomfret is mostly cloudy with gusty winds developing during the afternoon. A few flurries or snow showers are possible. High near 40F. Winds NNW at 20 to 30 mph.\"\n",
|
||||||
|
"}\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'The weather in Pomfret is mostly cloudy with gusty winds developing during the afternoon. A few flurries or snow showers are possible. High near 40F. Winds NNW at 20 to 30 mph.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 10,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_chain.run(input=\"whats the weather like in pomfret?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"id": "0084efd6",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
|
@ -20,7 +20,7 @@
|
|||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"from langchain.agents import Tool\n",
|
"from langchain.agents import Tool\n",
|
||||||
"from langchain.chains.conversation.memory import ConversationBufferMemory\n",
|
"from langchain.memory import ConversationBufferMemory\n",
|
||||||
"from langchain import OpenAI\n",
|
"from langchain import OpenAI\n",
|
||||||
"from langchain.utilities import GoogleSearchAPIWrapper\n",
|
"from langchain.utilities import GoogleSearchAPIWrapper\n",
|
||||||
"from langchain.agents import initialize_agent"
|
"from langchain.agents import initialize_agent"
|
||||||
@ -272,7 +272,7 @@
|
|||||||
"name": "python",
|
"name": "python",
|
||||||
"nbconvert_exporter": "python",
|
"nbconvert_exporter": "python",
|
||||||
"pygments_lexer": "ipython3",
|
"pygments_lexer": "ipython3",
|
||||||
"version": "3.10.9"
|
"version": "3.9.1"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"nbformat": 4,
|
"nbformat": 4,
|
253
docs/modules/agents/agents/examples/mrkl_chat.ipynb
Normal file
@ -0,0 +1,253 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "f1390152",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# MRKL Chat\n",
|
||||||
|
"\n",
|
||||||
|
"This notebook showcases using an agent to replicate the MRKL chain using an agent optimized for chat models."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "39ea3638",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"This uses the example Chinook database.\n",
|
||||||
|
"To set it up follow the instructions on https://database.guide/2-sample-databases-sqlite/, placing the `.db` file in a notebooks folder at the root of this repository."
|
||||||
|
]
|
||||||
|
},
|
||||||
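The note above points at the Chinook sample database. A small sketch of the database piece on its own, using the same `SQLDatabase` / `SQLDatabaseChain` classes this notebook imports (the file path and question are illustrative and assume the `.db` file was placed as described):

```python
from langchain import OpenAI, SQLDatabase, SQLDatabaseChain

# Point at the downloaded Chinook SQLite file (path is an assumption).
db = SQLDatabase.from_uri("sqlite:///./notebooks/Chinook.db")

# A chain that writes the SQL query, runs it, and phrases the result.
db_chain = SQLDatabaseChain(llm=OpenAI(temperature=0), database=db, verbose=True)
db_chain.run("How many employees are there?")
```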
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"id": "ac561cc4",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain import OpenAI, LLMMathChain, SerpAPIWrapper, SQLDatabase, SQLDatabaseChain\n",
|
||||||
|
"from langchain.agents import initialize_agent, Tool\n",
|
||||||
|
"from langchain.chat_models import ChatOpenAI"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
|
"id": "07e96d99",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"llm = ChatOpenAI(temperature=0)\n",
|
||||||
|
"llm1 = OpenAI(temperature=0)\n",
|
||||||
|
"search = SerpAPIWrapper()\n",
|
||||||
|
"llm_math_chain = LLMMathChain(llm=llm1, verbose=True)\n",
|
||||||
|
"db = SQLDatabase.from_uri(\"sqlite:///../../../../notebooks/Chinook.db\")\n",
|
||||||
|
"db_chain = SQLDatabaseChain(llm=llm1, database=db, verbose=True)\n",
|
||||||
|
"tools = [\n",
|
||||||
|
" Tool(\n",
|
||||||
|
" name = \"Search\",\n",
|
||||||
|
" func=search.run,\n",
|
||||||
|
" description=\"useful for when you need to answer questions about current events. You should ask targeted questions\"\n",
|
||||||
|
" ),\n",
|
||||||
|
" Tool(\n",
|
||||||
|
" name=\"Calculator\",\n",
|
||||||
|
" func=llm_math_chain.run,\n",
|
||||||
|
" description=\"useful for when you need to answer questions about math\"\n",
|
||||||
|
" ),\n",
|
||||||
|
" Tool(\n",
|
||||||
|
" name=\"FooBar DB\",\n",
|
||||||
|
" func=db_chain.run,\n",
|
||||||
|
" description=\"useful for when you need to answer questions about FooBar. Input should be in the form of a question containing full context\"\n",
|
||||||
|
" )\n",
|
||||||
|
"]"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "a069c4b6",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"mrkl = initialize_agent(tools, llm, agent=\"chat-zero-shot-react-description\", verbose=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 4,
|
||||||
|
"id": "e603cd7d",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: The first question requires a search, while the second question requires a calculator.\n",
|
||||||
|
"Action:\n",
|
||||||
|
"```\n",
|
||||||
|
"{\n",
|
||||||
|
" \"action\": \"Search\",\n",
|
||||||
|
" \"action_input\": \"Who is Leo DiCaprio's girlfriend?\"\n",
|
||||||
|
"}\n",
|
||||||
|
"```\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mCamila Morrone\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mFor the second question, I need to use the calculator tool to raise her current age to the 0.43 power.\n",
|
||||||
|
"Action:\n",
|
||||||
|
"```\n",
|
||||||
|
"{\n",
|
||||||
|
" \"action\": \"Calculator\",\n",
|
||||||
|
" \"action_input\": \"22.0^(0.43)\"\n",
|
||||||
|
"}\n",
|
||||||
|
"```\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new LLMMathChain chain...\u001b[0m\n",
|
||||||
|
"22.0^(0.43)\u001b[32;1m\u001b[1;3m\n",
|
||||||
|
"```python\n",
|
||||||
|
"import math\n",
|
||||||
|
"print(math.pow(22.0, 0.43))\n",
|
||||||
|
"```\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"Answer: \u001b[33;1m\u001b[1;3m3.777824273683966\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3mAnswer: 3.777824273683966\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mI now know the final answer.\n",
|
||||||
|
"Final Answer: Camila Morrone, 3.777824273683966.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'Camila Morrone, 3.777824273683966.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 4,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"mrkl.run(\"Who is Leo DiCaprio's girlfriend? What is her current age raised to the 0.43 power?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 5,
|
||||||
|
"id": "a5c07010",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mQuestion: What is the full name of the artist who recently released an album called 'The Storm Before the Calm' and are they in the FooBar database? If so, what albums of theirs are in the FooBar database?\n",
|
||||||
|
"Thought: I should use the Search tool to find the answer to the first part of the question and then use the FooBar DB tool to find the answer to the second part of the question.\n",
|
||||||
|
"Action:\n",
|
||||||
|
"```\n",
|
||||||
|
"{\n",
|
||||||
|
" \"action\": \"Search\",\n",
|
||||||
|
" \"action_input\": \"Who recently released an album called 'The Storm Before the Calm'\"\n",
|
||||||
|
"}\n",
|
||||||
|
"```\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mAlanis Morissette\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mNow that I have the name of the artist, I can use the FooBar DB tool to find their albums in the database.\n",
|
||||||
|
"Action:\n",
|
||||||
|
"```\n",
|
||||||
|
"{\n",
|
||||||
|
" \"action\": \"FooBar DB\",\n",
|
||||||
|
" \"action_input\": \"What albums does Alanis Morissette have in the database?\"\n",
|
||||||
|
"}\n",
|
||||||
|
"```\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new SQLDatabaseChain chain...\u001b[0m\n",
|
||||||
|
"What albums does Alanis Morissette have in the database? \n",
|
||||||
|
"SQLQuery:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "stderr",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"/Users/harrisonchase/workplace/langchain/langchain/sql_database.py:141: SAWarning: Dialect sqlite+pysqlite does *not* support Decimal objects natively, and SQLAlchemy must convert from floating point - rounding errors and other issues may occur. Please consider storing Decimal numbers as strings or integers on this platform for lossless storage.\n",
|
||||||
|
" sample_rows = connection.execute(command)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\u001b[32;1m\u001b[1;3m SELECT Title FROM Album WHERE ArtistId IN (SELECT ArtistId FROM Artist WHERE Name = 'Alanis Morissette') LIMIT 5;\u001b[0m\n",
|
||||||
|
"SQLResult: \u001b[33;1m\u001b[1;3m[('Jagged Little Pill',)]\u001b[0m\n",
|
||||||
|
"Answer:\u001b[32;1m\u001b[1;3m Alanis Morissette has the album 'Jagged Little Pill' in the database.\u001b[0m\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"Observation: \u001b[38;5;200m\u001b[1;3m Alanis Morissette has the album 'Jagged Little Pill' in the database.\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mI have found the answer to both parts of the question.\n",
|
||||||
|
"Final Answer: The artist who recently released an album called 'The Storm Before the Calm' is Alanis Morissette. The album 'Jagged Little Pill' is in the FooBar database.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"The artist who recently released an album called 'The Storm Before the Calm' is Alanis Morissette. The album 'Jagged Little Pill' is in the FooBar database.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 5,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"mrkl.run(\"What is the full name of the artist who recently released an album called 'The Storm Before the Calm' and are they in the FooBar database? If so, what albums of theirs are in the FooBar database?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"id": "af016a70",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
|
@@ -12,7 +12,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 4,
+   "execution_count": 1,
    "id": "4e272b47",
    "metadata": {},
    "outputs": [],
@@ -24,11 +24,13 @@
    "tools = [\n",
    "    Tool(\n",
    "        name=\"Search\",\n",
-   "        func=docstore.search\n",
+   "        func=docstore.search,\n",
+   "        description=\"useful for when you need to ask with search\"\n",
    "    ),\n",
    "    Tool(\n",
    "        name=\"Lookup\",\n",
-   "        func=docstore.lookup\n",
+   "        func=docstore.lookup,\n",
+   "        description=\"useful for when you need to ask with lookup\"\n",
    "    )\n",
    "]\n",
    "\n",
@@ -38,7 +40,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 5,
+   "execution_count": 2,
    "id": "8078c8f1",
    "metadata": {},
    "outputs": [
@@ -59,7 +61,8 @@
    "Observation 2: \u001b[36;1m\u001b[1;3mWilliam James Crowe Jr. (January 2, 1925 – October 18, 2007) was a United States Navy admiral and diplomat who served as the 11th chairman of the Joint Chiefs of Staff under Presidents Ronald Reagan and George H. W. Bush, and as the ambassador to the United Kingdom and Chair of the Intelligence Oversight Board under President Bill Clinton.\u001b[0m\n",
    "Thought 3:\u001b[32;1m\u001b[1;3m The President William J. Crowe served as the ambassador to the United Kingdom under is Bill Clinton.\n",
    "Action 3: Finish[Bill Clinton]\u001b[0m\n",
-   "\u001b[1m> Finished AgentExecutor chain.\u001b[0m\n"
+   "\n",
+   "\u001b[1m> Finished chain.\u001b[0m\n"
    ]
   },
   {
@@ -68,7 +71,7 @@
    "'Bill Clinton'"
    ]
   },
-  "execution_count": 5,
+  "execution_count": 2,
   "metadata": {},
   "output_type": "execute_result"
  }
@@ -77,11 +80,19 @@
    "question = \"Author David Chanoff has collaborated with a U.S. Navy admiral who served as the ambassador to the United Kingdom under which President?\"\n",
    "react.run(question)"
   ]
+ },
+ {
+  "cell_type": "code",
+  "execution_count": null,
+  "id": "cc4a6efe",
+  "metadata": {},
+  "outputs": [],
+  "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
-  "display_name": "Python 3.9.0 64-bit ('llm-env')",
+  "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
@@ -95,7 +106,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-  "version": "3.9.0"
+  "version": "3.9.1"
  },
  "vscode": {
   "interpreter": {
@@ -52,7 +52,8 @@
    "tools = [\n",
    "    Tool(\n",
    "        name=\"Intermediate Answer\",\n",
-   "        func=search.run\n",
+   "        func=search.run,\n",
+   "        description=\"useful for when you need to ask with search\"\n",
    "    )\n",
    "]\n",
    "\n",
@@ -63,7 +64,7 @@
  ],
  "metadata": {
   "kernelspec": {
-   "display_name": "Python 3.9.0 64-bit ('llm-env')",
+   "display_name": "Python 3 (ipykernel)",
    "language": "python",
    "name": "python3"
   },
@@ -77,7 +78,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.0"
+   "version": "3.10.9"
   },
   "vscode": {
    "interpreter": {
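Taken together, the hunks above only add a trailing comma and a `description` keyword to each `Tool`; for a zero-shot-style agent the description is what the LLM reads when deciding which tool to call, so tools without one are hard for the agent to select. As plain Python, the updated tool list from the self-ask-with-search notebook would read roughly as below. This is a sketch extracted from the diff, assuming `search` is the `SerpAPIWrapper` instance created earlier in that notebook; the `agent="self-ask-with-search"` string is the built-in agent type, whereas the notebooks further down load the same agent from the hub via `agent_path`.

from langchain import OpenAI, SerpAPIWrapper
from langchain.agents import Tool, initialize_agent

llm = OpenAI(temperature=0)
search = SerpAPIWrapper()

tools = [
    Tool(
        name="Intermediate Answer",
        func=search.run,
        # The description added by this commit; the agent uses it to pick the tool.
        description="useful for when you need to ask with search"
    )
]

self_ask_with_search = initialize_agent(tools, llm, agent="self-ask-with-search", verbose=True)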
@@ -1,130 +0,0 @@

# Loading from LangChainHub

This notebook covers how to load agents from [LangChainHub](https://github.com/hwchase17/langchain-hub).

In [1]:
from langchain import OpenAI, SerpAPIWrapper
from langchain.agents import initialize_agent, Tool

llm = OpenAI(temperature=0)
search = SerpAPIWrapper()
tools = [
    Tool(
        name="Intermediate Answer",
        func=search.run
    )
]

self_ask_with_search = initialize_agent(tools, llm, agent_path="lc://agents/self-ask-with-search/agent.json", verbose=True)
self_ask_with_search.run("What is the hometown of the reigning men's U.S. Open champion?")

No `_type` key found, defaulting to `prompt`.

> Entering new AgentExecutor chain...
 Yes.
Follow up: Who is the reigning men's U.S. Open champion?
Intermediate answer: 2016 · SUI · Stan Wawrinka ; 2017 · ESP · Rafael Nadal ; 2018 · SRB · Novak Djokovic ; 2019 · ESP · Rafael Nadal.
So the reigning men's U.S. Open champion is Rafael Nadal.
Follow up: What is Rafael Nadal's hometown?
Intermediate answer: In 2016, he once again showed his deep ties to Mallorca and opened the Rafa Nadal Academy in his hometown of Manacor.
So the final answer is: Manacor, Mallorca, Spain.

> Finished chain.

Out[1]: 'Manacor, Mallorca, Spain.'

# Pinning Dependencies

Specific versions of LangChainHub agents can be pinned with the `lc@<ref>://` syntax.

In [2]:
self_ask_with_search = initialize_agent(tools, llm, agent_path="lc@2826ef9e8acdf88465e1e5fc8a7bf59e0f9d0a85://agents/self-ask-with-search/agent.json", verbose=True)

No `_type` key found, defaulting to `prompt`.
@@ -1,154 +0,0 @@

# Serialization

This notebook goes over how to serialize agents. For this notebook, it is important to understand the distinction we draw between `agents` and `tools`. An agent is the LLM powered decision maker that decides which actions to take and in which order. Tools are various instruments (functions) an agent has access to, through which an agent can interact with the outside world. When people generally use agents, they primarily talk about using an agent WITH tools. However, when we talk about serialization of agents, we are talking about the agent by itself. We plan to add support for serializing an agent WITH tools sometime in the future.

Let's start by creating an agent with tools as we normally do:

In [1]:
from langchain.agents import load_tools
from langchain.agents import initialize_agent
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
tools = load_tools(["serpapi", "llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)

Let's now serialize the agent. To be explicit that we are serializing ONLY the agent, we will call the `save_agent` method.

In [2]:
agent.save_agent('agent.json')

In [3]:
!cat agent.json

{
    "llm_chain": {
        "memory": null,
        "verbose": false,
        "prompt": {
            "input_variables": [
                "input",
                "agent_scratchpad"
            ],
            "output_parser": null,
            "template": "Answer the following questions as best you can. You have access to the following tools:\n\nSearch: A search engine. Useful for when you need to answer questions about current events. Input should be a search query.\nCalculator: Useful for when you need to answer questions about math.\n\nUse the following format:\n\nQuestion: the input question you must answer\nThought: you should always think about what to do\nAction: the action to take, should be one of [Search, Calculator]\nAction Input: the input to the action\nObservation: the result of the action\n... (this Thought/Action/Action Input/Observation can repeat N times)\nThought: I now know the final answer\nFinal Answer: the final answer to the original input question\n\nBegin!\n\nQuestion: {input}\nThought:{agent_scratchpad}",
            "template_format": "f-string",
            "validate_template": true,
            "_type": "prompt"
        },
        "llm": {
            "model_name": "text-davinci-003",
            "temperature": 0.0,
            "max_tokens": 256,
            "top_p": 1,
            "frequency_penalty": 0,
            "presence_penalty": 0,
            "n": 1,
            "best_of": 1,
            "request_timeout": null,
            "logit_bias": {},
            "_type": "openai"
        },
        "output_key": "text",
        "_type": "llm_chain"
    },
    "allowed_tools": [
        "Search",
        "Calculator"
    ],
    "return_values": [
        "output"
    ],
    "_type": "zero-shot-react-description"
}

We can now load the agent back in

In [4]:
agent = initialize_agent(tools, llm, agent_path="agent.json", verbose=True)
@@ -1,6 +1,15 @@
 How-To Guides
 =============
 
+There are three types of examples in this section:
+
+1. Agent Overview: how-to-guides for generic agent functionality
+2. Agent Toolkits: how-to-guides for specific agent toolkits (agents optimized for interacting with a certain resource)
+3. Agent Types: how-to-guides for working with the different agent types
+
+Agent Overview
+---------------
+
 The first category of how-to guides here cover specific parts of working with agents.
 
 `Load From Hub <./examples/load_from_hub.html>`_: This notebook covers how to load agents from `LangChainHub <https://github.com/hwchase17/langchain-hub>`_.
@@ -21,7 +30,48 @@
 
 `Asynchronous <./examples/async_agent.html>`_: Covering asynchronous functionality.
 
-The next set of examples are all end-to-end agents for specific applications.
+.. toctree::
+   :maxdepth: 1
+   :glob:
+   :hidden:
+
+   ./examples/*
+
+
+Agent Toolkits
+---------------
+
+The next set of examples covers agents with toolkits.
+As opposed to the examples above, these examples are not intended to show off an agent `type`,
+but rather to show off an agent applied to particular use case.
+
+`SQLDatabase Agent <./agent_toolkits/sql_database.html>`_: This notebook covers how to interact with an arbitrary SQL database using an agent.
+
+`JSON Agent <./agent_toolkits/json.html>`_: This notebook covers how to interact with a JSON dictionary using an agent.
+
+`OpenAPI Agent <./agent_toolkits/openapi.html>`_: This notebook covers how to interact with an arbitrary OpenAPI endpoint using an agent.
+
+`VectorStore Agent <./agent_toolkits/vectorstore.html>`_: This notebook covers how to interact with VectorStores using an agent.
+
+`Python Agent <./agent_toolkits/python.html>`_: This notebook covers how to produce and execute python code using an agent.
+
+`Pandas DataFrame Agent <./agent_toolkits/pandas.html>`_: This notebook covers how to do question answering over a pandas dataframe using an agent. Under the hood this calls the Python agent..
+
+`CSV Agent <./agent_toolkits/csv.html>`_: This notebook covers how to do question answering over a csv file. Under the hood this calls the Pandas DataFrame agent.
+
+.. toctree::
+   :maxdepth: 1
+   :glob:
+   :hidden:
+
+   ./agent_toolkits/*
+
+
+Agent Types
+---------------
+
+The final set of examples are all end-to-end example of different agent types.
 In all examples there is an Agent with a particular set of tools.
 
 - Tools: A tool can be anything that takes in a string and returns a string. This means that you can use both the primitives AND the chains found in `this <../chains.html>`_ documentation. LangChain also provides a list of easily loadable tools. For detailed information on those, please see `this documentation <./tools.html>`_
@@ -51,12 +101,7 @@
 
 
 
-.. toctree::
-   :maxdepth: 1
-   :glob:
-   :hidden:
-
-   ./examples/*
-
 .. toctree::
    :maxdepth: 1
@@ -64,3 +109,5 @@
    :hidden:
 
    ./implementations/*
+
+
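The toolkit guides added above all follow the same shape: build an LLM, hand the resource to a `create_*_agent` helper, then call `run`. A minimal sketch of that shape using the Pandas DataFrame agent from the notebooks added later in this branch (it assumes `titanic.csv` is available locally and an OpenAI API key is configured; this is an illustration, not part of the diff):

import pandas as pd

from langchain.agents import create_pandas_dataframe_agent
from langchain.llms import OpenAI

# Load the dataframe the agent will be allowed to query.
df = pd.read_csv("titanic.csv")

# The helper wires up a Python-executing agent scoped to `df`.
agent = create_pandas_dataframe_agent(OpenAI(temperature=0), df, verbose=True)

agent.run("how many rows are there?")

The same call pattern applies to the SQL, JSON, OpenAPI and CSV toolkits listed above, only the `create_*_agent` helper and the wrapped resource change.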
@@ -1,87 +0,0 @@
"""Run NatBot."""
import time

from langchain.chains.natbot.base import NatBotChain
from langchain.chains.natbot.crawler import Crawler


def run_cmd(cmd: str, _crawler: Crawler) -> None:
    """Run command."""
    cmd = cmd.split("\n")[0]

    if cmd.startswith("SCROLL UP"):
        _crawler.scroll("up")
    elif cmd.startswith("SCROLL DOWN"):
        _crawler.scroll("down")
    elif cmd.startswith("CLICK"):
        commasplit = cmd.split(",")
        id = commasplit[0].split(" ")[1]
        _crawler.click(id)
    elif cmd.startswith("TYPE"):
        spacesplit = cmd.split(" ")
        id = spacesplit[1]
        text_pieces = spacesplit[2:]
        text = " ".join(text_pieces)
        # Strip leading and trailing double quotes
        text = text[1:-1]

        if cmd.startswith("TYPESUBMIT"):
            text += "\n"
        _crawler.type(id, text)

    time.sleep(2)


if __name__ == "__main__":
    objective = "Make a reservation for 2 at 7pm at bistro vida in menlo park"
    print("\nWelcome to natbot! What is your objective?")
    i = input()
    if len(i) > 0:
        objective = i
    quiet = False
    nat_bot_chain = NatBotChain.from_default(objective)
    _crawler = Crawler()
    _crawler.go_to_page("google.com")
    try:
        while True:
            browser_content = "\n".join(_crawler.crawl())
            llm_command = nat_bot_chain.execute(_crawler.page.url, browser_content)
            if not quiet:
                print("URL: " + _crawler.page.url)
                print("Objective: " + objective)
                print("----------------\n" + browser_content + "\n----------------\n")
            if len(llm_command) > 0:
                print("Suggested command: " + llm_command)

            command = input()
            if command == "r" or command == "":
                run_cmd(llm_command, _crawler)
            elif command == "g":
                url = input("URL:")
                _crawler.go_to_page(url)
            elif command == "u":
                _crawler.scroll("up")
                time.sleep(1)
            elif command == "d":
                _crawler.scroll("down")
                time.sleep(1)
            elif command == "c":
                id = input("id:")
                _crawler.click(id)
                time.sleep(1)
            elif command == "t":
                id = input("id:")
                text = input("text:")
                _crawler.type(id, text)
                time.sleep(1)
            elif command == "o":
                objective = input("Objective:")
            else:
                print(
                    "(g) to visit url\n(u) scroll up\n(d) scroll down\n(c) to click"
                    "\n(t) to type\n(h) to view commands again"
                    "\n(r/enter) to run suggested command\n(o) change objective"
                )
    except KeyboardInterrupt:
        print("\n[!] Ctrl+C detected, exiting gracefully.")
        exit(0)
@@ -1,10 +0,0 @@
# Key Concepts

## Agents
Agents use an LLM to determine which actions to take and in what order.
For more detailed information on agents, and different types of agents in LangChain, see [this documentation](agents.md).

## Tools
Tools are functions that agents can use to interact with the world.
These tools can be generic utilities (e.g. search), other chains, or even other agents.
For more detailed information on tools, and different types of tools in LangChain, see [this documentation](tools.md).
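As the removed page notes, a tool does not have to be a plain utility; any callable that maps a string to a string will do, including another chain. A hedged sketch of wrapping an `LLMChain` as a `Tool` follows; the chain, prompt, and description here are illustrative and not taken from the diff:

from langchain.agents import Tool
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0)

# An ordinary chain: condenses whatever text it is given.
summarize_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate(
        input_variables=["text"],
        template="Summarize the following text in one sentence:\n{text}",
    ),
)

# Exposed to an agent as a tool: string in, string out.
summarize_tool = Tool(
    name="Summarize",
    func=summarize_chain.run,
    description="useful for condensing a long passage into one sentence",
)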
docs/modules/agents/toolkits.rst (new file, 18 lines)

@@ -0,0 +1,18 @@
Toolkits
==============

.. note::
   `Conceptual Guide <https://docs.langchain.com/docs/components/agents/toolkit>`_


This section of documentation covers agents with toolkits - eg an agent applied to a particular use case.

See below for a full list of agent toolkits


.. toctree::
   :maxdepth: 1
   :glob:

   ./toolkits/examples/*
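Concretely, a toolkit bundles a set of related tools and is handed to a matching `create_*_agent` helper. A sketch of that flow using the JSON toolkit from the notebook added below (it assumes `openai_openapi.yml` has been downloaded next to the script):

import yaml

from langchain.agents import create_json_agent
from langchain.agents.agent_toolkits import JsonToolkit
from langchain.llms.openai import OpenAI
from langchain.tools.json.tool import JsonSpec

# Load a large spec and wrap it so the agent can explore it key by key.
with open("openai_openapi.yml") as f:
    data = yaml.load(f, Loader=yaml.FullLoader)

json_toolkit = JsonToolkit(spec=JsonSpec(dict_=data, max_value_length=4000))

json_agent_executor = create_json_agent(
    llm=OpenAI(temperature=0),
    toolkit=json_toolkit,
    verbose=True,
)

json_agent_executor.run(
    "What are the required parameters in the request body to the /completions endpoint?"
)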
docs/modules/agents/toolkits/examples/csv.ipynb (new file, 202 lines)

# CSV Agent

This notebook shows how to use agents to interact with a csv. It is mostly optimized for question answering.

**NOTE: this agent calls the Pandas DataFrame agent under the hood, which in turn calls the Python agent, which executes LLM generated Python code - this can be bad if the LLM generated Python code is harmful. Use cautiously.**

In [1]:
from langchain.agents import create_csv_agent

In [2]:
from langchain.llms import OpenAI

In [11]:
agent = create_csv_agent(OpenAI(temperature=0), 'titanic.csv', verbose=True)

In [12]:
agent.run("how many rows are there?")

> Entering new AgentExecutor chain...
Thought: I need to count the number of rows
Action: python_repl_ast
Action Input: len(df)
Observation: 891
Thought: I now know the final answer
Final Answer: There are 891 rows in the dataframe.
> Finished chain.

Out[12]: 'There are 891 rows in the dataframe.'

In [6]:
agent.run("how many people have more than 3 sibligngs")

> Entering new AgentExecutor chain...
Thought: I need to count the number of people with more than 3 siblings
Action: python_repl_ast
Action Input: df[df['SibSp'] > 3].shape[0]
Observation: 30
Thought: I now know the final answer
Final Answer: 30 people have more than 3 siblings.
> Finished chain.

Out[6]: '30 people have more than 3 siblings.'

In [7]:
agent.run("whats the square root of the average age?")

> Entering new AgentExecutor chain...
Thought: I need to calculate the average age first
Action: python_repl_ast
Action Input: df['Age'].mean()
Observation: 29.69911764705882
Thought: I can now calculate the square root
Action: python_repl_ast
Action Input: math.sqrt(df['Age'].mean())
Observation: name 'math' is not defined
Thought: I need to import the math library
Action: python_repl_ast
Action Input: import math
Observation: None
Thought: I can now calculate the square root
Action: python_repl_ast
Action Input: math.sqrt(df['Age'].mean())
Observation: 5.449689683556195
Thought: I now know the final answer
Final Answer: 5.449689683556195
> Finished chain.

Out[7]: '5.449689683556195'
docs/modules/agents/toolkits/examples/json.ipynb (new file, 190 lines)

# JSON Agent

This notebook showcases an agent designed to interact with large JSON/dict objects. This is useful when you want to answer questions about a JSON blob that's too large to fit in the context window of an LLM. The agent is able to iteratively explore the blob to find what it needs to answer the user's question.

In the below example, we are using the OpenAPI spec for the OpenAI API, which you can find [here](https://github.com/openai/openai-openapi/blob/master/openapi.yaml).

We will use the JSON agent to answer some questions about the API spec.

## Initialization

In [1]:
import os
import yaml

from langchain.agents import (
    create_json_agent,
    AgentExecutor
)
from langchain.agents.agent_toolkits import JsonToolkit
from langchain.chains import LLMChain
from langchain.llms.openai import OpenAI
from langchain.requests import RequestsWrapper
from langchain.tools.json.tool import JsonSpec

In [2]:
with open("openai_openapi.yml") as f:
    data = yaml.load(f, Loader=yaml.FullLoader)
json_spec = JsonSpec(dict_=data, max_value_length=4000)
json_toolkit = JsonToolkit(spec=json_spec)

json_agent_executor = create_json_agent(
    llm=OpenAI(temperature=0),
    toolkit=json_toolkit,
    verbose=True
)

## Example: getting the required POST parameters for a request

In [5]:
json_agent_executor.run("What are the required parameters in the request body to the /completions endpoint?")

> Entering new AgentExecutor chain...
Action: json_spec_list_keys
Action Input: data
Observation: ['openapi', 'info', 'servers', 'tags', 'paths', 'components', 'x-oaiMeta']
Thought: I should look at the paths key to see what endpoints exist
Action: json_spec_list_keys
Action Input: data["paths"]
Observation: ['/engines', '/engines/{engine_id}', '/completions', '/edits', '/images/generations', '/images/edits', '/images/variations', '/embeddings', '/engines/{engine_id}/search', '/files', '/files/{file_id}', '/files/{file_id}/content', '/answers', '/classifications', '/fine-tunes', '/fine-tunes/{fine_tune_id}', '/fine-tunes/{fine_tune_id}/cancel', '/fine-tunes/{fine_tune_id}/events', '/models', '/models/{model}', '/moderations']
Thought: I should look at the /completions endpoint to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]
Observation: ['post']
Thought: I should look at the post key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]
Observation: ['operationId', 'tags', 'summary', 'requestBody', 'responses', 'x-oaiMeta']
Thought: I should look at the requestBody key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]
Observation: ['required', 'content']
Thought: I should look at the required key to see what parameters are required
Action: json_spec_get_value
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["required"]
Observation: True
Thought: I should look at the content key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]
Observation: ['application/json']
Thought: I should look at the application/json key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]["application/json"]
Observation: ['schema']
Thought: I should look at the schema key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]["application/json"]["schema"]
Observation: ['$ref']
Thought: I should look at the $ref key to see what parameters are required
Action: json_spec_get_value
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]["application/json"]["schema"]["$ref"]
Observation: #/components/schemas/CreateCompletionRequest
Thought: I should look at the CreateCompletionRequest schema to see what parameters are required
Action: json_spec_list_keys
Action Input: data["components"]["schemas"]["CreateCompletionRequest"]
Observation: ['type', 'properties', 'required']
Thought: I should look at the required key to see what parameters are required
Action: json_spec_get_value
Action Input: data["components"]["schemas"]["CreateCompletionRequest"]["required"]
Observation: ['model']
Thought: I now know the final answer
Final Answer: The required parameters in the request body to the /completions endpoint are 'model'.
> Finished chain.

Out[5]: "The required parameters in the request body to the /completions endpoint are 'model'."
docs/modules/agents/toolkits/examples/openai_openapi.yml (new file, 3124 lines)
File diff suppressed because it is too large.
docs/modules/agents/toolkits/examples/openapi.ipynb (new file, 242 lines)

# OpenAPI Agent

This notebook showcases an agent designed to interact with an OpenAPI spec and make a correct API request based on the information it has gathered from the spec.

In the below example, we are using the OpenAPI spec for the OpenAI API, which you can find [here](https://github.com/openai/openai-openapi/blob/master/openapi.yaml).

## Initialization

In [1]:
import os
import yaml

from langchain.agents import create_openapi_agent
from langchain.agents.agent_toolkits import OpenAPIToolkit
from langchain.llms.openai import OpenAI
from langchain.requests import RequestsWrapper
from langchain.tools.json.tool import JsonSpec

In [2]:
with open("openai_openapi.yml") as f:
    data = yaml.load(f, Loader=yaml.FullLoader)
json_spec=JsonSpec(dict_=data, max_value_length=4000)
headers = {
    "Authorization": f"Bearer {os.getenv('OPENAI_API_KEY')}"
}
requests_wrapper=RequestsWrapper(headers=headers)
openapi_toolkit = OpenAPIToolkit.from_llm(OpenAI(temperature=0), json_spec, requests_wrapper, verbose=True)
openapi_agent_executor = create_openapi_agent(
    llm=OpenAI(temperature=0),
    toolkit=openapi_toolkit,
    verbose=True
)

## Example: agent capable of analyzing OpenAPI spec and making requests

In [3]:
openapi_agent_executor.run("Make a post request to openai /completions. The prompt should be 'tell me a joke.'")

> Entering new AgentExecutor chain...
Action: json_explorer
Action Input: What is the base url for the API?

> Entering new AgentExecutor chain...
Action: json_spec_list_keys
Action Input: data
Observation: ['openapi', 'info', 'servers', 'tags', 'paths', 'components', 'x-oaiMeta']
Thought: I should look at the servers key to see what the base url is
Action: json_spec_list_keys
Action Input: data["servers"][0]
Observation: ValueError('Value at path `data["servers"][0]` is not a dict, get the value directly.')
Thought: I should get the value of the servers key
Action: json_spec_get_value
Action Input: data["servers"][0]
Observation: {'url': 'https://api.openai.com/v1'}
Thought: I now know the base url for the API
Final Answer: The base url for the API is https://api.openai.com/v1

> Finished chain.

Observation: The base url for the API is https://api.openai.com/v1
Thought: I should find the path for the /completions endpoint.
Action: json_explorer
Action Input: What is the path for the /completions endpoint?

> Entering new AgentExecutor chain...
Action: json_spec_list_keys
Action Input: data
Observation: ['openapi', 'info', 'servers', 'tags', 'paths', 'components', 'x-oaiMeta']
Thought: I should look at the paths key to see what endpoints exist
Action: json_spec_list_keys
Action Input: data["paths"]
Observation: ['/engines', '/engines/{engine_id}', '/completions', '/edits', '/images/generations', '/images/edits', '/images/variations', '/embeddings', '/engines/{engine_id}/search', '/files', '/files/{file_id}', '/files/{file_id}/content', '/answers', '/classifications', '/fine-tunes', '/fine-tunes/{fine_tune_id}', '/fine-tunes/{fine_tune_id}/cancel', '/fine-tunes/{fine_tune_id}/events', '/models', '/models/{model}', '/moderations']
Thought: I now know the path for the /completions endpoint
Final Answer: data["paths"][2]

> Finished chain.

Observation: data["paths"][2]
Thought: I should find the required parameters for the POST request.
Action: json_explorer
Action Input: What are the required parameters for a POST request to the /completions endpoint?

> Entering new AgentExecutor chain...
Action: json_spec_list_keys
Action Input: data
Observation: ['openapi', 'info', 'servers', 'tags', 'paths', 'components', 'x-oaiMeta']
Thought: I should look at the paths key to see what endpoints exist
Action: json_spec_list_keys
Action Input: data["paths"]
Observation: ['/engines', '/engines/{engine_id}', '/completions', '/edits', '/images/generations', '/images/edits', '/images/variations', '/embeddings', '/engines/{engine_id}/search', '/files', '/files/{file_id}', '/files/{file_id}/content', '/answers', '/classifications', '/fine-tunes', '/fine-tunes/{fine_tune_id}', '/fine-tunes/{fine_tune_id}/cancel', '/fine-tunes/{fine_tune_id}/events', '/models', '/models/{model}', '/moderations']
Thought: I should look at the /completions endpoint to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]
Observation: ['post']
Thought: I should look at the post key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]
Observation: ['operationId', 'tags', 'summary', 'requestBody', 'responses', 'x-oaiMeta']
Thought: I should look at the requestBody key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]
Observation: ['required', 'content']
Thought: I should look at the content key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]
Observation: ['application/json']
Thought: I should look at the application/json key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]["application/json"]
Observation: ['schema']
Thought: I should look at the schema key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]["application/json"]["schema"]
Observation: ['$ref']
Thought: I should look at the $ref key to see what parameters are required
Action: json_spec_list_keys
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]["application/json"]["schema"]["$ref"]
Observation: ValueError('Value at path `data["paths"]["/completions"]["post"]["requestBody"]["content"]["application/json"]["schema"]["$ref"]` is not a dict, get the value directly.')
Thought: I should look at the $ref key to get the value directly
Action: json_spec_get_value
Action Input: data["paths"]["/completions"]["post"]["requestBody"]["content"]["application/json"]["schema"]["$ref"]
Observation: #/components/schemas/CreateCompletionRequest
Thought: I should look at the CreateCompletionRequest schema to see what parameters are required
Action: json_spec_list_keys
Action Input: data["components"]["schemas"]["CreateCompletionRequest"]
Observation: ['type', 'properties', 'required']
Thought: I should look at the required key to see what parameters are required
Action: json_spec_get_value
Action Input: data["components"]["schemas"]["CreateCompletionRequest"]["required"]
Observation: ['model']
Thought: I now know the final answer
Final Answer: The required parameters for a POST request to the /completions endpoint are 'model'.

> Finished chain.

Observation: The required parameters for a POST request to the /completions endpoint are 'model'.
Thought: I now know the parameters needed to make the request.
Action: requests_post
Action Input: { "url": "https://api.openai.com/v1/completions", "data": { "model": "davinci", "prompt": "tell me a joke" } }
Observation: {"id":"cmpl-6oeEcNETfq8TOuIUQvAct6NrBXihs","object":"text_completion","created":1677529082,"model":"davinci","choices":[{"text":"\n\n\n\nLove is a battlefield\n\n\n\nIt's me...And some","index":0,"logprobs":null,"finish_reason":"length"}],"usage":{"prompt_tokens":4,"completion_tokens":16,"total_tokens":20}}
Thought: I now know the final answer.
Final Answer: Love is a battlefield. It's me...And some.

> Finished chain.

Out[3]: "Love is a battlefield. It's me...And some."
204
docs/modules/agents/toolkits/examples/pandas.ipynb
Normal file
204
docs/modules/agents/toolkits/examples/pandas.ipynb
Normal file
@ -0,0 +1,204 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "c81da886",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Pandas Dataframe Agent\n",
|
||||||
|
"\n",
|
||||||
|
"This notebook shows how to use agents to interact with a pandas dataframe. It is mostly optimized for question answering.\n",
|
||||||
|
"\n",
|
||||||
|
"**NOTE: this agent calls the Python agent under the hood, which executes LLM generated Python code - this can be bad if the LLM generated Python code is harmful. Use cautiously.**"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 7,
|
||||||
|
"id": "0cdd9bf5",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.agents import create_pandas_dataframe_agent"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
|
"id": "051ebe84",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.llms import OpenAI\n",
|
||||||
|
"import pandas as pd\n",
|
||||||
|
"\n",
|
||||||
|
"df = pd.read_csv('titanic.csv')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "4185ff46",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"agent = create_pandas_dataframe_agent(OpenAI(temperature=0), df, verbose=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 4,
|
||||||
|
"id": "a9207a2e",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I need to count the number of rows\n",
|
||||||
|
"Action: python_repl_ast\n",
|
||||||
|
"Action Input: len(df)\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m891\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: There are 891 rows in the dataframe.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'There are 891 rows in the dataframe.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 4,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent.run(\"how many rows are there?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 5,
|
||||||
|
"id": "bd43617c",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I need to count the number of people with more than 3 siblings\n",
|
||||||
|
"Action: python_repl_ast\n",
|
||||||
|
"Action Input: df[df['SibSp'] > 3].shape[0]\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m30\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: 30 people have more than 3 siblings.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'30 people have more than 3 siblings.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 5,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent.run(\"how many people have more than 3 sibligngs\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 6,
|
||||||
|
"id": "94e64b58",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mThought: I need to calculate the average age first\n",
|
||||||
|
"Action: python_repl_ast\n",
|
||||||
|
"Action Input: df['Age'].mean()\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m29.69911764705882\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I can now calculate the square root\n",
|
||||||
|
"Action: python_repl_ast\n",
|
||||||
|
"Action Input: math.sqrt(df['Age'].mean())\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mname 'math' is not defined\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I need to import the math library\n",
|
||||||
|
"Action: python_repl_ast\n",
|
||||||
|
"Action Input: import math\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mNone\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I can now calculate the square root\n",
|
||||||
|
"Action: python_repl_ast\n",
|
||||||
|
"Action Input: math.sqrt(df['Age'].mean())\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m5.449689683556195\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: 5.449689683556195\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'5.449689683556195'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 6,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent.run(\"whats the square root of the average age?\")"
|
||||||
|
]
|
||||||
|
},
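For reference, the three answers the agent produced above can be reproduced directly with pandas and the standard library. This is a minimal sketch, assuming the same titanic.csv loaded earlier in the notebook; the expressions are the ones the agent itself generated in its Action Inputs.

import math
import pandas as pd

df = pd.read_csv("titanic.csv")

print(len(df))                        # total rows: 891, matching the agent's first answer
print(df[df["SibSp"] > 3].shape[0])   # passengers with more than 3 siblings/spouses aboard: 30
print(math.sqrt(df["Age"].mean()))    # square root of the mean age: ~5.4497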
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"id": "eba13b4d",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
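Taken together, the setup cells in the pandas notebook above reduce to the short sketch below. It assumes an OpenAI API key is configured in the environment and that titanic.csv is in the working directory; all imports and calls are the ones shown in the diff above.

import pandas as pd
from langchain.agents import create_pandas_dataframe_agent
from langchain.llms import OpenAI

# Load the dataframe the agent will answer questions about.
df = pd.read_csv("titanic.csv")

# temperature=0 keeps the generated pandas code deterministic; verbose=True prints the reasoning chain.
agent = create_pandas_dataframe_agent(OpenAI(temperature=0), df, verbose=True)

agent.run("how many rows are there?")

Because the agent executes the LLM-generated code in-process, it should only be pointed at data and environments where arbitrary code execution is acceptable.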
228 docs/modules/agents/toolkits/examples/python.ipynb Normal file
@@ -0,0 +1,228 @@
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "82a4c2cc-20ea-4b20-a565-63e905dee8ff",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Python Agent\n",
|
||||||
|
"\n",
|
||||||
|
"This notebook showcases an agent designed to write and execute python code to answer a question."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"id": "f98e9c90-5c37-4fb9-af3e-d09693af8543",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.agents.agent_toolkits import create_python_agent\n",
|
||||||
|
"from langchain.tools.python.tool import PythonREPLTool\n",
|
||||||
|
"from langchain.python import PythonREPL\n",
|
||||||
|
"from langchain.llms.openai import OpenAI"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
|
"id": "cc422f53-c51c-4694-a834-72ecd1e68363",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"agent_executor = create_python_agent(\n",
|
||||||
|
" llm=OpenAI(temperature=0, max_tokens=1000),\n",
|
||||||
|
" tool=PythonREPLTool(),\n",
|
||||||
|
" verbose=True\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "c16161de",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Fibonacci Example\n",
|
||||||
|
"This example was created by [John Wiseman](https://twitter.com/lemonodor/status/1628270074074398720?s=20)."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "25cd4f92-ea9b-4fe6-9838-a4f85f81eebe",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I need to calculate the 10th fibonacci number\n",
|
||||||
|
"Action: Python REPL\n",
|
||||||
|
"Action Input: def fibonacci(n):\n",
|
||||||
|
" if n == 0:\n",
|
||||||
|
" return 0\n",
|
||||||
|
" elif n == 1:\n",
|
||||||
|
" return 1\n",
|
||||||
|
" else:\n",
|
||||||
|
" return fibonacci(n-1) + fibonacci(n-2)\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I need to call the function with 10 as the argument\n",
|
||||||
|
"Action: Python REPL\n",
|
||||||
|
"Action Input: fibonacci(10)\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: 55\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'55'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 3,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"What is the 10th fibonacci number?\")"
|
||||||
|
]
|
||||||
|
},
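The recursive function the agent wrote in the trace above can be checked locally; this is simply the agent's own definition re-run as plain Python, with no LangChain involved.

def fibonacci(n):
    if n == 0:
        return 0
    elif n == 1:
        return 1
    else:
        return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(10))  # 55, matching the agent's final answer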
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "7caa30de",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Training neural net\n",
|
||||||
|
"This example was created by [Samee Ur Rehman](https://twitter.com/sameeurehman/status/1630130518133207046?s=20)."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 4,
|
||||||
|
"id": "4b9f60e7-eb6a-4f14-8604-498d863d4482",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I need to write a neural network in PyTorch and train it on the given data.\n",
|
||||||
|
"Action: Python REPL\n",
|
||||||
|
"Action Input: \n",
|
||||||
|
"import torch\n",
|
||||||
|
"\n",
|
||||||
|
"# Define the model\n",
|
||||||
|
"model = torch.nn.Sequential(\n",
|
||||||
|
" torch.nn.Linear(1, 1)\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"# Define the loss\n",
|
||||||
|
"loss_fn = torch.nn.MSELoss()\n",
|
||||||
|
"\n",
|
||||||
|
"# Define the optimizer\n",
|
||||||
|
"optimizer = torch.optim.SGD(model.parameters(), lr=0.01)\n",
|
||||||
|
"\n",
|
||||||
|
"# Define the data\n",
|
||||||
|
"x_data = torch.tensor([[1.0], [2.0], [3.0], [4.0]])\n",
|
||||||
|
"y_data = torch.tensor([[2.0], [4.0], [6.0], [8.0]])\n",
|
||||||
|
"\n",
|
||||||
|
"# Train the model\n",
|
||||||
|
"for epoch in range(1000):\n",
|
||||||
|
" # Forward pass\n",
|
||||||
|
" y_pred = model(x_data)\n",
|
||||||
|
"\n",
|
||||||
|
" # Compute and print loss\n",
|
||||||
|
" loss = loss_fn(y_pred, y_data)\n",
|
||||||
|
" if (epoch+1) % 100 == 0:\n",
|
||||||
|
" print(f'Epoch {epoch+1}: loss = {loss.item():.4f}')\n",
|
||||||
|
"\n",
|
||||||
|
" # Zero the gradients\n",
|
||||||
|
" optimizer.zero_grad()\n",
|
||||||
|
"\n",
|
||||||
|
" # Backward pass\n",
|
||||||
|
" loss.backward()\n",
|
||||||
|
"\n",
|
||||||
|
" # Update the weights\n",
|
||||||
|
" optimizer.step()\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mEpoch 100: loss = 0.0013\n",
|
||||||
|
"Epoch 200: loss = 0.0007\n",
|
||||||
|
"Epoch 300: loss = 0.0004\n",
|
||||||
|
"Epoch 400: loss = 0.0002\n",
|
||||||
|
"Epoch 500: loss = 0.0001\n",
|
||||||
|
"Epoch 600: loss = 0.0001\n",
|
||||||
|
"Epoch 700: loss = 0.0000\n",
|
||||||
|
"Epoch 800: loss = 0.0000\n",
|
||||||
|
"Epoch 900: loss = 0.0000\n",
|
||||||
|
"Epoch 1000: loss = 0.0000\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: The prediction for x = 5 is 10.0.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'The prediction for x = 5 is 10.0.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 4,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"\"\"Understand, write a single neuron neural network in PyTorch.\n",
|
||||||
|
"Take synthetic data for y=2x. Train for 1000 epochs and print every 100 epochs.\n",
|
||||||
|
"Return prediction for x = 5\"\"\")"
|
||||||
|
]
|
||||||
|
},
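The PyTorch program the agent generated in the trace above stops after training, so the reported prediction for x = 5 never appears as code. The sketch below reproduces the agent's script and adds that final evaluation step; it assumes only that torch is installed and is not LangChain-specific.

import torch

# Single linear neuron for the synthetic relationship y = 2x.
model = torch.nn.Sequential(torch.nn.Linear(1, 1))
loss_fn = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x_data = torch.tensor([[1.0], [2.0], [3.0], [4.0]])
y_data = torch.tensor([[2.0], [4.0], [6.0], [8.0]])

for epoch in range(1000):
    y_pred = model(x_data)
    loss = loss_fn(y_pred, y_data)
    if (epoch + 1) % 100 == 0:
        print(f"Epoch {epoch + 1}: loss = {loss.item():.4f}")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The step missing from the trace: evaluate the trained model at x = 5 (should be close to 10).
with torch.no_grad():
    print(model(torch.tensor([[5.0]])).item())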
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"id": "eb654671",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
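For convenience, the Python-agent setup spread across the cells above collapses into the sketch below; it assumes an OpenAI API key is configured and uses only the imports that appear in the notebook diff.

from langchain.agents.agent_toolkits import create_python_agent
from langchain.tools.python.tool import PythonREPLTool
from langchain.llms.openai import OpenAI

# Mirrors the LLM configuration used in the notebook above.
agent_executor = create_python_agent(
    llm=OpenAI(temperature=0, max_tokens=1000),
    tool=PythonREPLTool(),
    verbose=True,
)

agent_executor.run("What is the 10th fibonacci number?")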
527 docs/modules/agents/toolkits/examples/sql_database.ipynb Normal file
@@ -0,0 +1,527 @@
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "0e499e90-7a6d-4fab-8aab-31a4df417601",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# SQL Database Agent\n",
|
||||||
|
"\n",
|
||||||
|
"This notebook showcases an agent designed to interact with a sql databases. The agent builds off of [SQLDatabaseChain](https://langchain.readthedocs.io/en/latest/modules/chains/examples/sqlite.html) and is designed to answer more general questions about a database, as well as recover from errors.\n",
|
||||||
|
"\n",
|
||||||
|
"Note that, as this agent is in active development, all answers might not be correct. Additionally, it is not guaranteed that the agent won't perform DML statements on your database given certain questions. Be careful running it on sensitive data!\n",
|
||||||
|
"\n",
|
||||||
|
"This uses the example Chinook database. To set it up follow the instructions on https://database.guide/2-sample-databases-sqlite/, placing the .db file in a notebooks folder at the root of this repository."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "ec927ac6-9b2a-4e8a-9a6e-3e429191875c",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"source": [
|
||||||
|
"## Initialization"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"id": "53422913-967b-4f2a-8022-00269c1be1b1",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.agents import create_sql_agent\n",
|
||||||
|
"from langchain.agents.agent_toolkits import SQLDatabaseToolkit\n",
|
||||||
|
"from langchain.sql_database import SQLDatabase\n",
|
||||||
|
"from langchain.llms.openai import OpenAI\n",
|
||||||
|
"from langchain.agents import AgentExecutor"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
|
"id": "090f3699-79c6-4ce1-ab96-a94f0121fd64",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"db = SQLDatabase.from_uri(\"sqlite:///../../../../notebooks/Chinook.db\")\n",
|
||||||
|
"toolkit = SQLDatabaseToolkit(db=db)\n",
|
||||||
|
"\n",
|
||||||
|
"agent_executor = create_sql_agent(\n",
|
||||||
|
" llm=OpenAI(temperature=0),\n",
|
||||||
|
" toolkit=toolkit,\n",
|
||||||
|
" verbose=True\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "36ae48c7-cb08-4fef-977e-c7d4b96a464b",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Example: describing a table"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "ff70e83d-5ad0-4fc7-bb96-27d82ac166d7",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mAction: list_tables_sql_db\n",
|
||||||
|
"Action Input: \"\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[38;5;200m\u001b[1;3mArtist, Invoice, Playlist, Genre, Album, PlaylistTrack, Track, InvoiceLine, MediaType, Employee, Customer\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should look at the schema of the playlisttrack table\n",
|
||||||
|
"Action: schema_sql_db\n",
|
||||||
|
"Action Input: \"PlaylistTrack\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m\n",
|
||||||
|
"CREATE TABLE \"PlaylistTrack\" (\n",
|
||||||
|
"\t\"PlaylistId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"TrackId\" INTEGER NOT NULL, \n",
|
||||||
|
"\tPRIMARY KEY (\"PlaylistId\", \"TrackId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"TrackId\") REFERENCES \"Track\" (\"TrackId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"PlaylistId\") REFERENCES \"Playlist\" (\"PlaylistId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'PlaylistTrack' LIMIT 3;\n",
|
||||||
|
"PlaylistId TrackId\n",
|
||||||
|
"1 3402\n",
|
||||||
|
"1 3389\n",
|
||||||
|
"1 3390\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: The PlaylistTrack table has two columns, PlaylistId and TrackId, and is linked to the Playlist and Track tables.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'The PlaylistTrack table has two columns, PlaylistId and TrackId, and is linked to the Playlist and Track tables.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 3,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"Describe the playlisttrack table\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "9abcfe8e-1868-42a4-8345-ad2d9b44c681",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Example: describing a table, recovering from an error\n",
|
||||||
|
"\n",
|
||||||
|
"In this example, the agent tries to search for a table that doesn't exist, but finds the next best result"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 15,
|
||||||
|
"id": "bea76658-a65b-47e2-b294-6d52c5556246",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mAction: list_tables_sql_db\n",
|
||||||
|
"Action Input: \"\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[38;5;200m\u001b[1;3mGenre, PlaylistTrack, MediaType, Invoice, InvoiceLine, Track, Playlist, Customer, Album, Employee, Artist\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should look at the schema of the PlaylistSong table\n",
|
||||||
|
"Action: schema_sql_db\n",
|
||||||
|
"Action Input: \"PlaylistSong\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3mError: table_names {'PlaylistSong'} not found in database\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should check the spelling of the table\n",
|
||||||
|
"Action: list_tables_sql_db\n",
|
||||||
|
"Action Input: \"\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[38;5;200m\u001b[1;3mGenre, PlaylistTrack, MediaType, Invoice, InvoiceLine, Track, Playlist, Customer, Album, Employee, Artist\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m The table is called PlaylistTrack\n",
|
||||||
|
"Action: schema_sql_db\n",
|
||||||
|
"Action Input: \"PlaylistTrack\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m\n",
|
||||||
|
"CREATE TABLE \"PlaylistTrack\" (\n",
|
||||||
|
"\t\"PlaylistId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"TrackId\" INTEGER NOT NULL, \n",
|
||||||
|
"\tPRIMARY KEY (\"PlaylistId\", \"TrackId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"TrackId\") REFERENCES \"Track\" (\"TrackId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"PlaylistId\") REFERENCES \"Playlist\" (\"PlaylistId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'PlaylistTrack' LIMIT 3;\n",
|
||||||
|
"PlaylistId TrackId\n",
|
||||||
|
"1 3402\n",
|
||||||
|
"1 3389\n",
|
||||||
|
"1 3390\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: The PlaylistTrack table contains two columns, PlaylistId and TrackId, which are both integers and are used to link Playlist and Track tables.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'The PlaylistTrack table contains two columns, PlaylistId and TrackId, which are both integers and are used to link Playlist and Track tables.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 15,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"Describe the playlistsong table\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "6fbc26af-97e4-4a21-82aa-48bdc992da26",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Example: running queries"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 8,
|
||||||
|
"id": "17bea710-4a23-4de0-b48e-21d57be48293",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mAction: list_tables_sql_db\n",
|
||||||
|
"Action Input: \"\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[38;5;200m\u001b[1;3mInvoice, MediaType, Artist, InvoiceLine, Genre, Playlist, Employee, Album, PlaylistTrack, Track, Customer\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should look at the schema of the relevant tables to see what columns I can use.\n",
|
||||||
|
"Action: schema_sql_db\n",
|
||||||
|
"Action Input: \"Invoice, Customer\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m\n",
|
||||||
|
"CREATE TABLE \"Customer\" (\n",
|
||||||
|
"\t\"CustomerId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"FirstName\" NVARCHAR(40) NOT NULL, \n",
|
||||||
|
"\t\"LastName\" NVARCHAR(20) NOT NULL, \n",
|
||||||
|
"\t\"Company\" NVARCHAR(80), \n",
|
||||||
|
"\t\"Address\" NVARCHAR(70), \n",
|
||||||
|
"\t\"City\" NVARCHAR(40), \n",
|
||||||
|
"\t\"State\" NVARCHAR(40), \n",
|
||||||
|
"\t\"Country\" NVARCHAR(40), \n",
|
||||||
|
"\t\"PostalCode\" NVARCHAR(10), \n",
|
||||||
|
"\t\"Phone\" NVARCHAR(24), \n",
|
||||||
|
"\t\"Fax\" NVARCHAR(24), \n",
|
||||||
|
"\t\"Email\" NVARCHAR(60) NOT NULL, \n",
|
||||||
|
"\t\"SupportRepId\" INTEGER, \n",
|
||||||
|
"\tPRIMARY KEY (\"CustomerId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"SupportRepId\") REFERENCES \"Employee\" (\"EmployeeId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'Customer' LIMIT 3;\n",
|
||||||
|
"CustomerId FirstName LastName Company Address City State Country PostalCode Phone Fax Email SupportRepId\n",
|
||||||
|
"1 Luís Gonçalves Embraer - Empresa Brasileira de Aeronáutica S.A. Av. Brigadeiro Faria Lima, 2170 São José dos Campos SP Brazil 12227-000 +55 (12) 3923-5555 +55 (12) 3923-5566 luisg@embraer.com.br 3\n",
|
||||||
|
"2 Leonie Köhler None Theodor-Heuss-Straße 34 Stuttgart None Germany 70174 +49 0711 2842222 None leonekohler@surfeu.de 5\n",
|
||||||
|
"3 François Tremblay None 1498 rue Bélanger Montréal QC Canada H2G 1A7 +1 (514) 721-4711 None ftremblay@gmail.com 3\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"CREATE TABLE \"Invoice\" (\n",
|
||||||
|
"\t\"InvoiceId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"CustomerId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"InvoiceDate\" DATETIME NOT NULL, \n",
|
||||||
|
"\t\"BillingAddress\" NVARCHAR(70), \n",
|
||||||
|
"\t\"BillingCity\" NVARCHAR(40), \n",
|
||||||
|
"\t\"BillingState\" NVARCHAR(40), \n",
|
||||||
|
"\t\"BillingCountry\" NVARCHAR(40), \n",
|
||||||
|
"\t\"BillingPostalCode\" NVARCHAR(10), \n",
|
||||||
|
"\t\"Total\" NUMERIC(10, 2) NOT NULL, \n",
|
||||||
|
"\tPRIMARY KEY (\"InvoiceId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"CustomerId\") REFERENCES \"Customer\" (\"CustomerId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'Invoice' LIMIT 3;\n",
|
||||||
|
"InvoiceId CustomerId InvoiceDate BillingAddress BillingCity BillingState BillingCountry BillingPostalCode Total\n",
|
||||||
|
"1 2 2009-01-01 00:00:00 Theodor-Heuss-Straße 34 Stuttgart None Germany 70174 1.98\n",
|
||||||
|
"2 4 2009-01-02 00:00:00 Ullevålsveien 14 Oslo None Norway 0171 3.96\n",
|
||||||
|
"3 8 2009-01-03 00:00:00 Grétrystraat 63 Brussels None Belgium 1000 5.94\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should query the Invoice and Customer tables to get the total sales per country.\n",
|
||||||
|
"Action: query_sql_db\n",
|
||||||
|
"Action Input: SELECT c.Country, SUM(i.Total) AS TotalSales FROM Invoice i INNER JOIN Customer c ON i.CustomerId = c.CustomerId GROUP BY c.Country ORDER BY TotalSales DESC LIMIT 10\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m[('USA', 523.0600000000003), ('Canada', 303.9599999999999), ('France', 195.09999999999994), ('Brazil', 190.09999999999997), ('Germany', 156.48), ('United Kingdom', 112.85999999999999), ('Czech Republic', 90.24000000000001), ('Portugal', 77.23999999999998), ('India', 75.25999999999999), ('Chile', 46.62)]\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: The customers from the USA spent the most, with a total of $523.06.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'The customers from the USA spent the most, with a total of $523.06.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 8,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"List the total sales per country. Which country's customers spent the most?\")"
|
||||||
|
]
|
||||||
|
},
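The aggregate the agent computed above can be cross-checked without LangChain by running its generated SQL directly against the Chinook database. This is a minimal sketch using the standard-library sqlite3 module; the database path is assumed to match the SQLDatabase.from_uri(...) call earlier in this notebook.

import sqlite3

conn = sqlite3.connect("../../../../notebooks/Chinook.db")

query = """
SELECT c.Country, SUM(i.Total) AS TotalSales
FROM Invoice i
INNER JOIN Customer c ON i.CustomerId = c.CustomerId
GROUP BY c.Country
ORDER BY TotalSales DESC
LIMIT 10
"""

# Each row is a (country, total) tuple; the USA should top the list at roughly 523.06.
for country, total in conn.execute(query):
    print(f"{country}: {total:.2f}")

conn.close()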
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 7,
|
||||||
|
"id": "474dddda-c067-4eeb-98b1-e763ee78b18c",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mAction: list_tables_sql_db\n",
|
||||||
|
"Action Input: \"\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[38;5;200m\u001b[1;3mInvoice, MediaType, Artist, InvoiceLine, Genre, Playlist, Employee, Album, PlaylistTrack, Track, Customer\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should look at the schema of the Playlist and PlaylistTrack tables to see what columns I can use.\n",
|
||||||
|
"Action: schema_sql_db\n",
|
||||||
|
"Action Input: \"Playlist, PlaylistTrack\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m\n",
|
||||||
|
"CREATE TABLE \"Playlist\" (\n",
|
||||||
|
"\t\"PlaylistId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"Name\" NVARCHAR(120), \n",
|
||||||
|
"\tPRIMARY KEY (\"PlaylistId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'Playlist' LIMIT 3;\n",
|
||||||
|
"PlaylistId Name\n",
|
||||||
|
"1 Music\n",
|
||||||
|
"2 Movies\n",
|
||||||
|
"3 TV Shows\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"CREATE TABLE \"PlaylistTrack\" (\n",
|
||||||
|
"\t\"PlaylistId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"TrackId\" INTEGER NOT NULL, \n",
|
||||||
|
"\tPRIMARY KEY (\"PlaylistId\", \"TrackId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"TrackId\") REFERENCES \"Track\" (\"TrackId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"PlaylistId\") REFERENCES \"Playlist\" (\"PlaylistId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'PlaylistTrack' LIMIT 3;\n",
|
||||||
|
"PlaylistId TrackId\n",
|
||||||
|
"1 3402\n",
|
||||||
|
"1 3389\n",
|
||||||
|
"1 3390\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I can use a SELECT statement to get the total number of tracks in each playlist.\n",
|
||||||
|
"Action: query_checker_sql_db\n",
|
||||||
|
"Action Input: SELECT Playlist.Name, COUNT(PlaylistTrack.TrackId) AS TotalTracks FROM Playlist INNER JOIN PlaylistTrack ON Playlist.PlaylistId = PlaylistTrack.PlaylistId GROUP BY Playlist.Name\u001b[0m\n",
|
||||||
|
"Observation: \u001b[31;1m\u001b[1;3m\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT Playlist.Name, COUNT(PlaylistTrack.TrackId) AS TotalTracks FROM Playlist INNER JOIN PlaylistTrack ON Playlist.PlaylistId = PlaylistTrack.PlaylistId GROUP BY Playlist.Name\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m The query looks correct, I can now execute it.\n",
|
||||||
|
"Action: query_sql_db\n",
|
||||||
|
"Action Input: SELECT Playlist.Name, COUNT(PlaylistTrack.TrackId) AS TotalTracks FROM Playlist INNER JOIN PlaylistTrack ON Playlist.PlaylistId = PlaylistTrack.PlaylistId GROUP BY Playlist.Name LIMIT 10\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m[('90’s Music', 1477), ('Brazilian Music', 39), ('Classical', 75), ('Classical 101 - Deep Cuts', 25), ('Classical 101 - Next Steps', 25), ('Classical 101 - The Basics', 25), ('Grunge', 15), ('Heavy Metal Classic', 26), ('Music', 6580), ('Music Videos', 1)]\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Final Answer: The total number of tracks in each playlist are: '90’s Music' (1477), 'Brazilian Music' (39), 'Classical' (75), 'Classical 101 - Deep Cuts' (25), 'Classical 101 - Next Steps' (25), 'Classical 101 - The Basics' (25), 'Grunge' (15), 'Heavy Metal Classic' (26), 'Music' (6580), 'Music Videos' (1).\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"The total number of tracks in each playlist are: '90’s Music' (1477), 'Brazilian Music' (39), 'Classical' (75), 'Classical 101 - Deep Cuts' (25), 'Classical 101 - Next Steps' (25), 'Classical 101 - The Basics' (25), 'Grunge' (15), 'Heavy Metal Classic' (26), 'Music' (6580), 'Music Videos' (1).\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 7,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"Show the total number of tracks in each playlist. The Playlist name should be included in the result.\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "7c7503b5-d9d9-4faa-b064-29fcdb5ff213",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Recovering from an error\n",
|
||||||
|
"\n",
|
||||||
|
"In this example, the agent is able to recover from an error after initially trying to access an attribute (`Track.ArtistId`) which doesn't exist."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 16,
|
||||||
|
"id": "9fe4901e-f9e1-4022-b6bc-80e2b2d6a3a4",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mAction: list_tables_sql_db\n",
|
||||||
|
"Action Input: \"\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[38;5;200m\u001b[1;3mMediaType, Track, Invoice, Album, Playlist, Customer, Employee, InvoiceLine, PlaylistTrack, Genre, Artist\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should look at the schema of the Artist, InvoiceLine, and Track tables to see what columns I can use.\n",
|
||||||
|
"Action: schema_sql_db\n",
|
||||||
|
"Action Input: \"Artist, InvoiceLine, Track\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m\n",
|
||||||
|
"CREATE TABLE \"Artist\" (\n",
|
||||||
|
"\t\"ArtistId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"Name\" NVARCHAR(120), \n",
|
||||||
|
"\tPRIMARY KEY (\"ArtistId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'Artist' LIMIT 3;\n",
|
||||||
|
"ArtistId Name\n",
|
||||||
|
"1 AC/DC\n",
|
||||||
|
"2 Accept\n",
|
||||||
|
"3 Aerosmith\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"CREATE TABLE \"Track\" (\n",
|
||||||
|
"\t\"TrackId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"Name\" NVARCHAR(200) NOT NULL, \n",
|
||||||
|
"\t\"AlbumId\" INTEGER, \n",
|
||||||
|
"\t\"MediaTypeId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"GenreId\" INTEGER, \n",
|
||||||
|
"\t\"Composer\" NVARCHAR(220), \n",
|
||||||
|
"\t\"Milliseconds\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"Bytes\" INTEGER, \n",
|
||||||
|
"\t\"UnitPrice\" NUMERIC(10, 2) NOT NULL, \n",
|
||||||
|
"\tPRIMARY KEY (\"TrackId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"MediaTypeId\") REFERENCES \"MediaType\" (\"MediaTypeId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"GenreId\") REFERENCES \"Genre\" (\"GenreId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"AlbumId\") REFERENCES \"Album\" (\"AlbumId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'Track' LIMIT 3;\n",
|
||||||
|
"TrackId Name AlbumId MediaTypeId GenreId Composer Milliseconds Bytes UnitPrice\n",
|
||||||
|
"1 For Those About To Rock (We Salute You) 1 1 1 Angus Young, Malcolm Young, Brian Johnson 343719 11170334 0.99\n",
|
||||||
|
"2 Balls to the Wall 2 2 1 None 342562 5510424 0.99\n",
|
||||||
|
"3 Fast As a Shark 3 2 1 F. Baltes, S. Kaufman, U. Dirkscneider & W. Hoffman 230619 3990994 0.99\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"CREATE TABLE \"InvoiceLine\" (\n",
|
||||||
|
"\t\"InvoiceLineId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"InvoiceId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"TrackId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"UnitPrice\" NUMERIC(10, 2) NOT NULL, \n",
|
||||||
|
"\t\"Quantity\" INTEGER NOT NULL, \n",
|
||||||
|
"\tPRIMARY KEY (\"InvoiceLineId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"TrackId\") REFERENCES \"Track\" (\"TrackId\"), \n",
|
||||||
|
"\tFOREIGN KEY(\"InvoiceId\") REFERENCES \"Invoice\" (\"InvoiceId\")\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT * FROM 'InvoiceLine' LIMIT 3;\n",
|
||||||
|
"InvoiceLineId InvoiceId TrackId UnitPrice Quantity\n",
|
||||||
|
"1 1 2 0.99 1\n",
|
||||||
|
"2 1 4 0.99 1\n",
|
||||||
|
"3 2 6 0.99 1\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should query the database to get the top 3 best selling artists.\n",
|
||||||
|
"Action: query_sql_db\n",
|
||||||
|
"Action Input: SELECT Artist.Name, SUM(InvoiceLine.Quantity) AS TotalQuantity FROM Artist INNER JOIN Track ON Artist.ArtistId = Track.ArtistId INNER JOIN InvoiceLine ON Track.TrackId = InvoiceLine.TrackId GROUP BY Artist.Name ORDER BY TotalQuantity DESC LIMIT 3\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mError: (sqlite3.OperationalError) no such column: Track.ArtistId\n",
|
||||||
|
"[SQL: SELECT Artist.Name, SUM(InvoiceLine.Quantity) AS TotalQuantity FROM Artist INNER JOIN Track ON Artist.ArtistId = Track.ArtistId INNER JOIN InvoiceLine ON Track.TrackId = InvoiceLine.TrackId GROUP BY Artist.Name ORDER BY TotalQuantity DESC LIMIT 3]\n",
|
||||||
|
"(Background on this error at: https://sqlalche.me/e/14/e3q8)\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I should double check my query before executing it.\n",
|
||||||
|
"Action: query_checker_sql_db\n",
|
||||||
|
"Action Input: SELECT Artist.Name, SUM(InvoiceLine.Quantity) AS TotalQuantity FROM Artist INNER JOIN Track ON Artist.ArtistId = Track.ArtistId INNER JOIN InvoiceLine ON Track.TrackId = InvoiceLine.TrackId GROUP BY Artist.Name ORDER BY TotalQuantity DESC LIMIT 3\u001b[0m\n",
|
||||||
|
"Observation: \u001b[31;1m\u001b[1;3m\n",
|
||||||
|
"\n",
|
||||||
|
"SELECT Artist.Name, SUM(InvoiceLine.Quantity) AS TotalQuantity \n",
|
||||||
|
"FROM Artist \n",
|
||||||
|
"INNER JOIN Track ON Artist.ArtistId = Track.ArtistId \n",
|
||||||
|
"INNER JOIN InvoiceLine ON Track.TrackId = InvoiceLine.TrackId \n",
|
||||||
|
"GROUP BY Artist.Name \n",
|
||||||
|
"ORDER BY TotalQuantity DESC \n",
|
||||||
|
"LIMIT 3;\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Action: query_sql_db\n",
|
||||||
|
"Action Input: SELECT Artist.Name, SUM(InvoiceLine.Quantity) AS TotalQuantity FROM Artist INNER JOIN Album ON Artist.ArtistId = Album.ArtistId INNER JOIN Track ON Album.AlbumId = Track.AlbumId INNER JOIN InvoiceLine ON Track.TrackId = InvoiceLine.TrackId GROUP BY Artist.Name ORDER BY TotalQuantity DESC LIMIT 3\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m[('Iron Maiden', 140), ('U2', 107), ('Metallica', 91)]\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Final Answer: The top 3 best selling artists are Iron Maiden, U2, and Metallica.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'The top 3 best selling artists are Iron Maiden, U2, and Metallica.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 16,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"Who are the top 3 best selling artists?\")"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.10.9"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
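As with the other toolkits, the initialization cells of the SQL notebook above boil down to the sketch below. It assumes the Chinook SQLite file sits at the relative path used in the notebook and that an OpenAI API key is configured; every import and call is taken from the diff above.

from langchain.agents import create_sql_agent
from langchain.agents.agent_toolkits import SQLDatabaseToolkit
from langchain.sql_database import SQLDatabase
from langchain.llms.openai import OpenAI

db = SQLDatabase.from_uri("sqlite:///../../../../notebooks/Chinook.db")
toolkit = SQLDatabaseToolkit(db=db)

# temperature=0 keeps the generated SQL deterministic; verbose=True prints each tool call.
agent_executor = create_sql_agent(
    llm=OpenAI(temperature=0),
    toolkit=toolkit,
    verbose=True,
)

agent_executor.run("Describe the playlisttrack table")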
892 docs/modules/agents/toolkits/examples/titanic.csv Normal file
@@ -0,0 +1,892 @@
PassengerId,Survived,Pclass,Name,Sex,Age,SibSp,Parch,Ticket,Fare,Cabin,Embarked
|
||||||
|
1,0,3,"Braund, Mr. Owen Harris",male,22,1,0,A/5 21171,7.25,,S
|
||||||
|
2,1,1,"Cumings, Mrs. John Bradley (Florence Briggs Thayer)",female,38,1,0,PC 17599,71.2833,C85,C
|
||||||
|
3,1,3,"Heikkinen, Miss. Laina",female,26,0,0,STON/O2. 3101282,7.925,,S
|
||||||
|
4,1,1,"Futrelle, Mrs. Jacques Heath (Lily May Peel)",female,35,1,0,113803,53.1,C123,S
|
||||||
|
5,0,3,"Allen, Mr. William Henry",male,35,0,0,373450,8.05,,S
|
||||||
|
6,0,3,"Moran, Mr. James",male,,0,0,330877,8.4583,,Q
|
||||||
|
7,0,1,"McCarthy, Mr. Timothy J",male,54,0,0,17463,51.8625,E46,S
|
||||||
|
8,0,3,"Palsson, Master. Gosta Leonard",male,2,3,1,349909,21.075,,S
|
||||||
|
9,1,3,"Johnson, Mrs. Oscar W (Elisabeth Vilhelmina Berg)",female,27,0,2,347742,11.1333,,S
|
||||||
|
10,1,2,"Nasser, Mrs. Nicholas (Adele Achem)",female,14,1,0,237736,30.0708,,C
|
||||||
|
11,1,3,"Sandstrom, Miss. Marguerite Rut",female,4,1,1,PP 9549,16.7,G6,S
|
||||||
|
12,1,1,"Bonnell, Miss. Elizabeth",female,58,0,0,113783,26.55,C103,S
|
||||||
|
13,0,3,"Saundercock, Mr. William Henry",male,20,0,0,A/5. 2151,8.05,,S
|
||||||
|
14,0,3,"Andersson, Mr. Anders Johan",male,39,1,5,347082,31.275,,S
|
||||||
|
15,0,3,"Vestrom, Miss. Hulda Amanda Adolfina",female,14,0,0,350406,7.8542,,S
|
||||||
|
16,1,2,"Hewlett, Mrs. (Mary D Kingcome) ",female,55,0,0,248706,16,,S
|
||||||
|
17,0,3,"Rice, Master. Eugene",male,2,4,1,382652,29.125,,Q
|
||||||
|
18,1,2,"Williams, Mr. Charles Eugene",male,,0,0,244373,13,,S
|
||||||
|
19,0,3,"Vander Planke, Mrs. Julius (Emelia Maria Vandemoortele)",female,31,1,0,345763,18,,S
|
||||||
|
20,1,3,"Masselmani, Mrs. Fatima",female,,0,0,2649,7.225,,C
|
||||||
|
21,0,2,"Fynney, Mr. Joseph J",male,35,0,0,239865,26,,S
|
||||||
|
22,1,2,"Beesley, Mr. Lawrence",male,34,0,0,248698,13,D56,S
|
||||||
|
23,1,3,"McGowan, Miss. Anna ""Annie""",female,15,0,0,330923,8.0292,,Q
|
||||||
|
24,1,1,"Sloper, Mr. William Thompson",male,28,0,0,113788,35.5,A6,S
|
||||||
|
25,0,3,"Palsson, Miss. Torborg Danira",female,8,3,1,349909,21.075,,S
|
||||||
|
26,1,3,"Asplund, Mrs. Carl Oscar (Selma Augusta Emilia Johansson)",female,38,1,5,347077,31.3875,,S
|
||||||
|
27,0,3,"Emir, Mr. Farred Chehab",male,,0,0,2631,7.225,,C
|
||||||
|
28,0,1,"Fortune, Mr. Charles Alexander",male,19,3,2,19950,263,C23 C25 C27,S
|
||||||
|
29,1,3,"O'Dwyer, Miss. Ellen ""Nellie""",female,,0,0,330959,7.8792,,Q
|
||||||
|
30,0,3,"Todoroff, Mr. Lalio",male,,0,0,349216,7.8958,,S
|
||||||
|
31,0,1,"Uruchurtu, Don. Manuel E",male,40,0,0,PC 17601,27.7208,,C
|
||||||
|
32,1,1,"Spencer, Mrs. William Augustus (Marie Eugenie)",female,,1,0,PC 17569,146.5208,B78,C
|
||||||
|
33,1,3,"Glynn, Miss. Mary Agatha",female,,0,0,335677,7.75,,Q
|
||||||
|
34,0,2,"Wheadon, Mr. Edward H",male,66,0,0,C.A. 24579,10.5,,S
|
||||||
|
35,0,1,"Meyer, Mr. Edgar Joseph",male,28,1,0,PC 17604,82.1708,,C
|
||||||
|
36,0,1,"Holverson, Mr. Alexander Oskar",male,42,1,0,113789,52,,S
|
||||||
|
37,1,3,"Mamee, Mr. Hanna",male,,0,0,2677,7.2292,,C
|
||||||
|
38,0,3,"Cann, Mr. Ernest Charles",male,21,0,0,A./5. 2152,8.05,,S
|
||||||
|
39,0,3,"Vander Planke, Miss. Augusta Maria",female,18,2,0,345764,18,,S
|
||||||
|
40,1,3,"Nicola-Yarred, Miss. Jamila",female,14,1,0,2651,11.2417,,C
|
||||||
|
41,0,3,"Ahlin, Mrs. Johan (Johanna Persdotter Larsson)",female,40,1,0,7546,9.475,,S
|
||||||
|
42,0,2,"Turpin, Mrs. William John Robert (Dorothy Ann Wonnacott)",female,27,1,0,11668,21,,S
|
||||||
|
43,0,3,"Kraeff, Mr. Theodor",male,,0,0,349253,7.8958,,C
|
||||||
|
44,1,2,"Laroche, Miss. Simonne Marie Anne Andree",female,3,1,2,SC/Paris 2123,41.5792,,C
|
||||||
|
45,1,3,"Devaney, Miss. Margaret Delia",female,19,0,0,330958,7.8792,,Q
|
||||||
|
46,0,3,"Rogers, Mr. William John",male,,0,0,S.C./A.4. 23567,8.05,,S
|
||||||
|
47,0,3,"Lennon, Mr. Denis",male,,1,0,370371,15.5,,Q
|
||||||
|
48,1,3,"O'Driscoll, Miss. Bridget",female,,0,0,14311,7.75,,Q
|
||||||
|
49,0,3,"Samaan, Mr. Youssef",male,,2,0,2662,21.6792,,C
|
||||||
|
50,0,3,"Arnold-Franchi, Mrs. Josef (Josefine Franchi)",female,18,1,0,349237,17.8,,S
|
||||||
|
51,0,3,"Panula, Master. Juha Niilo",male,7,4,1,3101295,39.6875,,S
|
||||||
|
52,0,3,"Nosworthy, Mr. Richard Cater",male,21,0,0,A/4. 39886,7.8,,S
|
||||||
|
53,1,1,"Harper, Mrs. Henry Sleeper (Myna Haxtun)",female,49,1,0,PC 17572,76.7292,D33,C
|
||||||
|
54,1,2,"Faunthorpe, Mrs. Lizzie (Elizabeth Anne Wilkinson)",female,29,1,0,2926,26,,S
|
||||||
|
55,0,1,"Ostby, Mr. Engelhart Cornelius",male,65,0,1,113509,61.9792,B30,C
|
||||||
|
56,1,1,"Woolner, Mr. Hugh",male,,0,0,19947,35.5,C52,S
|
||||||
|
57,1,2,"Rugg, Miss. Emily",female,21,0,0,C.A. 31026,10.5,,S
|
||||||
|
58,0,3,"Novel, Mr. Mansouer",male,28.5,0,0,2697,7.2292,,C
|
||||||
|
59,1,2,"West, Miss. Constance Mirium",female,5,1,2,C.A. 34651,27.75,,S
|
||||||
|
60,0,3,"Goodwin, Master. William Frederick",male,11,5,2,CA 2144,46.9,,S
|
||||||
|
61,0,3,"Sirayanian, Mr. Orsen",male,22,0,0,2669,7.2292,,C
|
||||||
|
62,1,1,"Icard, Miss. Amelie",female,38,0,0,113572,80,B28,
|
||||||
|
63,0,1,"Harris, Mr. Henry Birkhardt",male,45,1,0,36973,83.475,C83,S
|
||||||
|
64,0,3,"Skoog, Master. Harald",male,4,3,2,347088,27.9,,S
|
||||||
|
65,0,1,"Stewart, Mr. Albert A",male,,0,0,PC 17605,27.7208,,C
|
||||||
|
66,1,3,"Moubarek, Master. Gerios",male,,1,1,2661,15.2458,,C
|
||||||
|
67,1,2,"Nye, Mrs. (Elizabeth Ramell)",female,29,0,0,C.A. 29395,10.5,F33,S
|
||||||
|
68,0,3,"Crease, Mr. Ernest James",male,19,0,0,S.P. 3464,8.1583,,S
|
||||||
|
69,1,3,"Andersson, Miss. Erna Alexandra",female,17,4,2,3101281,7.925,,S
|
||||||
|
70,0,3,"Kink, Mr. Vincenz",male,26,2,0,315151,8.6625,,S
|
||||||
|
71,0,2,"Jenkin, Mr. Stephen Curnow",male,32,0,0,C.A. 33111,10.5,,S
|
||||||
|
72,0,3,"Goodwin, Miss. Lillian Amy",female,16,5,2,CA 2144,46.9,,S
|
||||||
|
73,0,2,"Hood, Mr. Ambrose Jr",male,21,0,0,S.O.C. 14879,73.5,,S
|
||||||
|
74,0,3,"Chronopoulos, Mr. Apostolos",male,26,1,0,2680,14.4542,,C
|
||||||
|
75,1,3,"Bing, Mr. Lee",male,32,0,0,1601,56.4958,,S
|
||||||
|
76,0,3,"Moen, Mr. Sigurd Hansen",male,25,0,0,348123,7.65,F G73,S
|
||||||
|
77,0,3,"Staneff, Mr. Ivan",male,,0,0,349208,7.8958,,S
|
||||||
|
78,0,3,"Moutal, Mr. Rahamin Haim",male,,0,0,374746,8.05,,S
|
||||||
|
79,1,2,"Caldwell, Master. Alden Gates",male,0.83,0,2,248738,29,,S
|
||||||
|
80,1,3,"Dowdell, Miss. Elizabeth",female,30,0,0,364516,12.475,,S
|
||||||
|
81,0,3,"Waelens, Mr. Achille",male,22,0,0,345767,9,,S
|
||||||
|
82,1,3,"Sheerlinck, Mr. Jan Baptist",male,29,0,0,345779,9.5,,S
|
||||||
|
83,1,3,"McDermott, Miss. Brigdet Delia",female,,0,0,330932,7.7875,,Q
|
||||||
|
84,0,1,"Carrau, Mr. Francisco M",male,28,0,0,113059,47.1,,S
|
||||||
|
85,1,2,"Ilett, Miss. Bertha",female,17,0,0,SO/C 14885,10.5,,S
|
||||||
|
86,1,3,"Backstrom, Mrs. Karl Alfred (Maria Mathilda Gustafsson)",female,33,3,0,3101278,15.85,,S
|
||||||
|
87,0,3,"Ford, Mr. William Neal",male,16,1,3,W./C. 6608,34.375,,S
|
||||||
|
88,0,3,"Slocovski, Mr. Selman Francis",male,,0,0,SOTON/OQ 392086,8.05,,S
|
||||||
|
89,1,1,"Fortune, Miss. Mabel Helen",female,23,3,2,19950,263,C23 C25 C27,S
|
||||||
|
90,0,3,"Celotti, Mr. Francesco",male,24,0,0,343275,8.05,,S
|
||||||
|
91,0,3,"Christmann, Mr. Emil",male,29,0,0,343276,8.05,,S
|
||||||
|
92,0,3,"Andreasson, Mr. Paul Edvin",male,20,0,0,347466,7.8542,,S
|
||||||
|
93,0,1,"Chaffee, Mr. Herbert Fuller",male,46,1,0,W.E.P. 5734,61.175,E31,S
|
||||||
|
94,0,3,"Dean, Mr. Bertram Frank",male,26,1,2,C.A. 2315,20.575,,S
|
||||||
|
95,0,3,"Coxon, Mr. Daniel",male,59,0,0,364500,7.25,,S
|
||||||
|
96,0,3,"Shorney, Mr. Charles Joseph",male,,0,0,374910,8.05,,S
|
||||||
|
97,0,1,"Goldschmidt, Mr. George B",male,71,0,0,PC 17754,34.6542,A5,C
|
||||||
|
98,1,1,"Greenfield, Mr. William Bertram",male,23,0,1,PC 17759,63.3583,D10 D12,C
|
||||||
|
99,1,2,"Doling, Mrs. John T (Ada Julia Bone)",female,34,0,1,231919,23,,S
|
||||||
|
100,0,2,"Kantor, Mr. Sinai",male,34,1,0,244367,26,,S
|
||||||
|
101,0,3,"Petranec, Miss. Matilda",female,28,0,0,349245,7.8958,,S
|
||||||
|
102,0,3,"Petroff, Mr. Pastcho (""Pentcho"")",male,,0,0,349215,7.8958,,S
|
||||||
|
103,0,1,"White, Mr. Richard Frasar",male,21,0,1,35281,77.2875,D26,S
|
||||||
|
104,0,3,"Johansson, Mr. Gustaf Joel",male,33,0,0,7540,8.6542,,S
|
||||||
|
105,0,3,"Gustafsson, Mr. Anders Vilhelm",male,37,2,0,3101276,7.925,,S
|
||||||
|
106,0,3,"Mionoff, Mr. Stoytcho",male,28,0,0,349207,7.8958,,S
|
||||||
|
107,1,3,"Salkjelsvik, Miss. Anna Kristine",female,21,0,0,343120,7.65,,S
|
||||||
|
108,1,3,"Moss, Mr. Albert Johan",male,,0,0,312991,7.775,,S
|
||||||
|
109,0,3,"Rekic, Mr. Tido",male,38,0,0,349249,7.8958,,S
|
||||||
|
110,1,3,"Moran, Miss. Bertha",female,,1,0,371110,24.15,,Q
|
||||||
|
111,0,1,"Porter, Mr. Walter Chamberlain",male,47,0,0,110465,52,C110,S
|
||||||
|
112,0,3,"Zabour, Miss. Hileni",female,14.5,1,0,2665,14.4542,,C
|
||||||
|
113,0,3,"Barton, Mr. David John",male,22,0,0,324669,8.05,,S
|
||||||
|
114,0,3,"Jussila, Miss. Katriina",female,20,1,0,4136,9.825,,S
|
||||||
|
115,0,3,"Attalah, Miss. Malake",female,17,0,0,2627,14.4583,,C
|
||||||
|
116,0,3,"Pekoniemi, Mr. Edvard",male,21,0,0,STON/O 2. 3101294,7.925,,S
|
||||||
|
117,0,3,"Connors, Mr. Patrick",male,70.5,0,0,370369,7.75,,Q
|
||||||
|
118,0,2,"Turpin, Mr. William John Robert",male,29,1,0,11668,21,,S
|
||||||
|
119,0,1,"Baxter, Mr. Quigg Edmond",male,24,0,1,PC 17558,247.5208,B58 B60,C
|
||||||
|
120,0,3,"Andersson, Miss. Ellis Anna Maria",female,2,4,2,347082,31.275,,S
|
||||||
|
121,0,2,"Hickman, Mr. Stanley George",male,21,2,0,S.O.C. 14879,73.5,,S
|
||||||
|
122,0,3,"Moore, Mr. Leonard Charles",male,,0,0,A4. 54510,8.05,,S
|
||||||
|
123,0,2,"Nasser, Mr. Nicholas",male,32.5,1,0,237736,30.0708,,C
|
||||||
|
124,1,2,"Webber, Miss. Susan",female,32.5,0,0,27267,13,E101,S
|
||||||
|
125,0,1,"White, Mr. Percival Wayland",male,54,0,1,35281,77.2875,D26,S
|
||||||
|
126,1,3,"Nicola-Yarred, Master. Elias",male,12,1,0,2651,11.2417,,C
|
||||||
|
127,0,3,"McMahon, Mr. Martin",male,,0,0,370372,7.75,,Q
|
||||||
|
128,1,3,"Madsen, Mr. Fridtjof Arne",male,24,0,0,C 17369,7.1417,,S
|
||||||
|
129,1,3,"Peter, Miss. Anna",female,,1,1,2668,22.3583,F E69,C
|
||||||
|
130,0,3,"Ekstrom, Mr. Johan",male,45,0,0,347061,6.975,,S
|
||||||
|
131,0,3,"Drazenoic, Mr. Jozef",male,33,0,0,349241,7.8958,,C
|
||||||
|
132,0,3,"Coelho, Mr. Domingos Fernandeo",male,20,0,0,SOTON/O.Q. 3101307,7.05,,S
|
||||||
|
133,0,3,"Robins, Mrs. Alexander A (Grace Charity Laury)",female,47,1,0,A/5. 3337,14.5,,S
|
||||||
|
134,1,2,"Weisz, Mrs. Leopold (Mathilde Francoise Pede)",female,29,1,0,228414,26,,S
|
||||||
|
135,0,2,"Sobey, Mr. Samuel James Hayden",male,25,0,0,C.A. 29178,13,,S
|
||||||
|
136,0,2,"Richard, Mr. Emile",male,23,0,0,SC/PARIS 2133,15.0458,,C
|
||||||
|
137,1,1,"Newsom, Miss. Helen Monypeny",female,19,0,2,11752,26.2833,D47,S
|
||||||
|
138,0,1,"Futrelle, Mr. Jacques Heath",male,37,1,0,113803,53.1,C123,S
|
||||||
|
139,0,3,"Osen, Mr. Olaf Elon",male,16,0,0,7534,9.2167,,S
|
||||||
|
140,0,1,"Giglio, Mr. Victor",male,24,0,0,PC 17593,79.2,B86,C
|
||||||
|
141,0,3,"Boulos, Mrs. Joseph (Sultana)",female,,0,2,2678,15.2458,,C
|
||||||
|
142,1,3,"Nysten, Miss. Anna Sofia",female,22,0,0,347081,7.75,,S
|
||||||
|
143,1,3,"Hakkarainen, Mrs. Pekka Pietari (Elin Matilda Dolck)",female,24,1,0,STON/O2. 3101279,15.85,,S
|
||||||
|
144,0,3,"Burke, Mr. Jeremiah",male,19,0,0,365222,6.75,,Q
|
||||||
|
145,0,2,"Andrew, Mr. Edgardo Samuel",male,18,0,0,231945,11.5,,S
|
||||||
|
146,0,2,"Nicholls, Mr. Joseph Charles",male,19,1,1,C.A. 33112,36.75,,S
|
||||||
|
147,1,3,"Andersson, Mr. August Edvard (""Wennerstrom"")",male,27,0,0,350043,7.7958,,S
|
||||||
|
148,0,3,"Ford, Miss. Robina Maggie ""Ruby""",female,9,2,2,W./C. 6608,34.375,,S
|
||||||
|
149,0,2,"Navratil, Mr. Michel (""Louis M Hoffman"")",male,36.5,0,2,230080,26,F2,S
|
||||||
|
150,0,2,"Byles, Rev. Thomas Roussel Davids",male,42,0,0,244310,13,,S
|
||||||
|
151,0,2,"Bateman, Rev. Robert James",male,51,0,0,S.O.P. 1166,12.525,,S
|
||||||
|
152,1,1,"Pears, Mrs. Thomas (Edith Wearne)",female,22,1,0,113776,66.6,C2,S
|
||||||
|
153,0,3,"Meo, Mr. Alfonzo",male,55.5,0,0,A.5. 11206,8.05,,S
|
||||||
|
154,0,3,"van Billiard, Mr. Austin Blyler",male,40.5,0,2,A/5. 851,14.5,,S
|
||||||
|
155,0,3,"Olsen, Mr. Ole Martin",male,,0,0,Fa 265302,7.3125,,S
|
||||||
|
156,0,1,"Williams, Mr. Charles Duane",male,51,0,1,PC 17597,61.3792,,C
|
||||||
|
157,1,3,"Gilnagh, Miss. Katherine ""Katie""",female,16,0,0,35851,7.7333,,Q
|
||||||
|
158,0,3,"Corn, Mr. Harry",male,30,0,0,SOTON/OQ 392090,8.05,,S
|
||||||
|
159,0,3,"Smiljanic, Mr. Mile",male,,0,0,315037,8.6625,,S
|
||||||
|
160,0,3,"Sage, Master. Thomas Henry",male,,8,2,CA. 2343,69.55,,S
|
||||||
|
161,0,3,"Cribb, Mr. John Hatfield",male,44,0,1,371362,16.1,,S
|
||||||
|
162,1,2,"Watt, Mrs. James (Elizabeth ""Bessie"" Inglis Milne)",female,40,0,0,C.A. 33595,15.75,,S
|
||||||
|
163,0,3,"Bengtsson, Mr. John Viktor",male,26,0,0,347068,7.775,,S
|
||||||
|
164,0,3,"Calic, Mr. Jovo",male,17,0,0,315093,8.6625,,S
|
||||||
|
165,0,3,"Panula, Master. Eino Viljami",male,1,4,1,3101295,39.6875,,S
|
||||||
|
166,1,3,"Goldsmith, Master. Frank John William ""Frankie""",male,9,0,2,363291,20.525,,S
|
||||||
|
167,1,1,"Chibnall, Mrs. (Edith Martha Bowerman)",female,,0,1,113505,55,E33,S
|
||||||
|
168,0,3,"Skoog, Mrs. William (Anna Bernhardina Karlsson)",female,45,1,4,347088,27.9,,S
|
||||||
|
169,0,1,"Baumann, Mr. John D",male,,0,0,PC 17318,25.925,,S
|
||||||
|
170,0,3,"Ling, Mr. Lee",male,28,0,0,1601,56.4958,,S
|
||||||
|
171,0,1,"Van der hoef, Mr. Wyckoff",male,61,0,0,111240,33.5,B19,S
|
||||||
|
172,0,3,"Rice, Master. Arthur",male,4,4,1,382652,29.125,,Q
|
||||||
|
173,1,3,"Johnson, Miss. Eleanor Ileen",female,1,1,1,347742,11.1333,,S
|
||||||
|
174,0,3,"Sivola, Mr. Antti Wilhelm",male,21,0,0,STON/O 2. 3101280,7.925,,S
|
||||||
|
175,0,1,"Smith, Mr. James Clinch",male,56,0,0,17764,30.6958,A7,C
|
||||||
|
176,0,3,"Klasen, Mr. Klas Albin",male,18,1,1,350404,7.8542,,S
|
||||||
|
177,0,3,"Lefebre, Master. Henry Forbes",male,,3,1,4133,25.4667,,S
|
||||||
|
178,0,1,"Isham, Miss. Ann Elizabeth",female,50,0,0,PC 17595,28.7125,C49,C
|
||||||
|
179,0,2,"Hale, Mr. Reginald",male,30,0,0,250653,13,,S
|
||||||
|
180,0,3,"Leonard, Mr. Lionel",male,36,0,0,LINE,0,,S
|
||||||
|
181,0,3,"Sage, Miss. Constance Gladys",female,,8,2,CA. 2343,69.55,,S
|
||||||
|
182,0,2,"Pernot, Mr. Rene",male,,0,0,SC/PARIS 2131,15.05,,C
|
||||||
|
183,0,3,"Asplund, Master. Clarence Gustaf Hugo",male,9,4,2,347077,31.3875,,S
|
||||||
|
184,1,2,"Becker, Master. Richard F",male,1,2,1,230136,39,F4,S
|
||||||
|
185,1,3,"Kink-Heilmann, Miss. Luise Gretchen",female,4,0,2,315153,22.025,,S
|
||||||
|
186,0,1,"Rood, Mr. Hugh Roscoe",male,,0,0,113767,50,A32,S
|
||||||
|
187,1,3,"O'Brien, Mrs. Thomas (Johanna ""Hannah"" Godfrey)",female,,1,0,370365,15.5,,Q
|
||||||
|
188,1,1,"Romaine, Mr. Charles Hallace (""Mr C Rolmane"")",male,45,0,0,111428,26.55,,S
|
||||||
|
189,0,3,"Bourke, Mr. John",male,40,1,1,364849,15.5,,Q
|
||||||
|
190,0,3,"Turcin, Mr. Stjepan",male,36,0,0,349247,7.8958,,S
|
||||||
|
191,1,2,"Pinsky, Mrs. (Rosa)",female,32,0,0,234604,13,,S
|
||||||
|
192,0,2,"Carbines, Mr. William",male,19,0,0,28424,13,,S
|
||||||
|
193,1,3,"Andersen-Jensen, Miss. Carla Christine Nielsine",female,19,1,0,350046,7.8542,,S
|
||||||
|
194,1,2,"Navratil, Master. Michel M",male,3,1,1,230080,26,F2,S
|
||||||
|
195,1,1,"Brown, Mrs. James Joseph (Margaret Tobin)",female,44,0,0,PC 17610,27.7208,B4,C
|
||||||
|
196,1,1,"Lurette, Miss. Elise",female,58,0,0,PC 17569,146.5208,B80,C
|
||||||
|
197,0,3,"Mernagh, Mr. Robert",male,,0,0,368703,7.75,,Q
|
||||||
|
198,0,3,"Olsen, Mr. Karl Siegwart Andreas",male,42,0,1,4579,8.4042,,S
|
||||||
|
199,1,3,"Madigan, Miss. Margaret ""Maggie""",female,,0,0,370370,7.75,,Q
|
||||||
|
200,0,2,"Yrois, Miss. Henriette (""Mrs Harbeck"")",female,24,0,0,248747,13,,S
|
||||||
|
201,0,3,"Vande Walle, Mr. Nestor Cyriel",male,28,0,0,345770,9.5,,S
|
||||||
|
202,0,3,"Sage, Mr. Frederick",male,,8,2,CA. 2343,69.55,,S
|
||||||
|
203,0,3,"Johanson, Mr. Jakob Alfred",male,34,0,0,3101264,6.4958,,S
|
||||||
|
204,0,3,"Youseff, Mr. Gerious",male,45.5,0,0,2628,7.225,,C
|
||||||
|
205,1,3,"Cohen, Mr. Gurshon ""Gus""",male,18,0,0,A/5 3540,8.05,,S
|
||||||
|
206,0,3,"Strom, Miss. Telma Matilda",female,2,0,1,347054,10.4625,G6,S
|
||||||
|
207,0,3,"Backstrom, Mr. Karl Alfred",male,32,1,0,3101278,15.85,,S
|
||||||
|
208,1,3,"Albimona, Mr. Nassef Cassem",male,26,0,0,2699,18.7875,,C
|
||||||
|
209,1,3,"Carr, Miss. Helen ""Ellen""",female,16,0,0,367231,7.75,,Q
|
||||||
|
210,1,1,"Blank, Mr. Henry",male,40,0,0,112277,31,A31,C
|
||||||
|
211,0,3,"Ali, Mr. Ahmed",male,24,0,0,SOTON/O.Q. 3101311,7.05,,S
|
||||||
|
212,1,2,"Cameron, Miss. Clear Annie",female,35,0,0,F.C.C. 13528,21,,S
|
||||||
|
213,0,3,"Perkin, Mr. John Henry",male,22,0,0,A/5 21174,7.25,,S
|
||||||
|
214,0,2,"Givard, Mr. Hans Kristensen",male,30,0,0,250646,13,,S
|
||||||
|
215,0,3,"Kiernan, Mr. Philip",male,,1,0,367229,7.75,,Q
|
||||||
|
216,1,1,"Newell, Miss. Madeleine",female,31,1,0,35273,113.275,D36,C
|
||||||
|
217,1,3,"Honkanen, Miss. Eliina",female,27,0,0,STON/O2. 3101283,7.925,,S
|
||||||
|
218,0,2,"Jacobsohn, Mr. Sidney Samuel",male,42,1,0,243847,27,,S
|
||||||
|
219,1,1,"Bazzani, Miss. Albina",female,32,0,0,11813,76.2917,D15,C
|
||||||
|
220,0,2,"Harris, Mr. Walter",male,30,0,0,W/C 14208,10.5,,S
|
||||||
|
221,1,3,"Sunderland, Mr. Victor Francis",male,16,0,0,SOTON/OQ 392089,8.05,,S
|
||||||
|
222,0,2,"Bracken, Mr. James H",male,27,0,0,220367,13,,S
|
||||||
|
223,0,3,"Green, Mr. George Henry",male,51,0,0,21440,8.05,,S
|
||||||
|
224,0,3,"Nenkoff, Mr. Christo",male,,0,0,349234,7.8958,,S
|
||||||
|
225,1,1,"Hoyt, Mr. Frederick Maxfield",male,38,1,0,19943,90,C93,S
|
||||||
|
226,0,3,"Berglund, Mr. Karl Ivar Sven",male,22,0,0,PP 4348,9.35,,S
|
||||||
|
227,1,2,"Mellors, Mr. William John",male,19,0,0,SW/PP 751,10.5,,S
|
||||||
|
228,0,3,"Lovell, Mr. John Hall (""Henry"")",male,20.5,0,0,A/5 21173,7.25,,S
|
||||||
|
229,0,2,"Fahlstrom, Mr. Arne Jonas",male,18,0,0,236171,13,,S
|
||||||
|
230,0,3,"Lefebre, Miss. Mathilde",female,,3,1,4133,25.4667,,S
|
||||||
|
231,1,1,"Harris, Mrs. Henry Birkhardt (Irene Wallach)",female,35,1,0,36973,83.475,C83,S
|
||||||
|
232,0,3,"Larsson, Mr. Bengt Edvin",male,29,0,0,347067,7.775,,S
|
||||||
|
233,0,2,"Sjostedt, Mr. Ernst Adolf",male,59,0,0,237442,13.5,,S
|
||||||
|
234,1,3,"Asplund, Miss. Lillian Gertrud",female,5,4,2,347077,31.3875,,S
|
||||||
|
235,0,2,"Leyson, Mr. Robert William Norman",male,24,0,0,C.A. 29566,10.5,,S
|
||||||
|
236,0,3,"Harknett, Miss. Alice Phoebe",female,,0,0,W./C. 6609,7.55,,S
|
||||||
|
237,0,2,"Hold, Mr. Stephen",male,44,1,0,26707,26,,S
|
||||||
|
238,1,2,"Collyer, Miss. Marjorie ""Lottie""",female,8,0,2,C.A. 31921,26.25,,S
|
||||||
|
239,0,2,"Pengelly, Mr. Frederick William",male,19,0,0,28665,10.5,,S
|
||||||
|
240,0,2,"Hunt, Mr. George Henry",male,33,0,0,SCO/W 1585,12.275,,S
|
||||||
|
241,0,3,"Zabour, Miss. Thamine",female,,1,0,2665,14.4542,,C
|
||||||
|
242,1,3,"Murphy, Miss. Katherine ""Kate""",female,,1,0,367230,15.5,,Q
|
||||||
|
243,0,2,"Coleridge, Mr. Reginald Charles",male,29,0,0,W./C. 14263,10.5,,S
|
||||||
|
244,0,3,"Maenpaa, Mr. Matti Alexanteri",male,22,0,0,STON/O 2. 3101275,7.125,,S
|
||||||
|
245,0,3,"Attalah, Mr. Sleiman",male,30,0,0,2694,7.225,,C
|
||||||
|
246,0,1,"Minahan, Dr. William Edward",male,44,2,0,19928,90,C78,Q
|
||||||
|
247,0,3,"Lindahl, Miss. Agda Thorilda Viktoria",female,25,0,0,347071,7.775,,S
|
||||||
|
248,1,2,"Hamalainen, Mrs. William (Anna)",female,24,0,2,250649,14.5,,S
|
||||||
|
249,1,1,"Beckwith, Mr. Richard Leonard",male,37,1,1,11751,52.5542,D35,S
|
||||||
|
250,0,2,"Carter, Rev. Ernest Courtenay",male,54,1,0,244252,26,,S
|
||||||
|
251,0,3,"Reed, Mr. James George",male,,0,0,362316,7.25,,S
|
||||||
|
252,0,3,"Strom, Mrs. Wilhelm (Elna Matilda Persson)",female,29,1,1,347054,10.4625,G6,S
|
||||||
|
253,0,1,"Stead, Mr. William Thomas",male,62,0,0,113514,26.55,C87,S
|
||||||
|
254,0,3,"Lobb, Mr. William Arthur",male,30,1,0,A/5. 3336,16.1,,S
|
||||||
|
255,0,3,"Rosblom, Mrs. Viktor (Helena Wilhelmina)",female,41,0,2,370129,20.2125,,S
|
||||||
|
256,1,3,"Touma, Mrs. Darwis (Hanne Youssef Razi)",female,29,0,2,2650,15.2458,,C
|
||||||
|
257,1,1,"Thorne, Mrs. Gertrude Maybelle",female,,0,0,PC 17585,79.2,,C
|
||||||
|
258,1,1,"Cherry, Miss. Gladys",female,30,0,0,110152,86.5,B77,S
|
||||||
|
259,1,1,"Ward, Miss. Anna",female,35,0,0,PC 17755,512.3292,,C
|
||||||
|
260,1,2,"Parrish, Mrs. (Lutie Davis)",female,50,0,1,230433,26,,S
|
||||||
|
261,0,3,"Smith, Mr. Thomas",male,,0,0,384461,7.75,,Q
|
||||||
|
262,1,3,"Asplund, Master. Edvin Rojj Felix",male,3,4,2,347077,31.3875,,S
|
||||||
|
263,0,1,"Taussig, Mr. Emil",male,52,1,1,110413,79.65,E67,S
|
||||||
|
264,0,1,"Harrison, Mr. William",male,40,0,0,112059,0,B94,S
|
||||||
|
265,0,3,"Henry, Miss. Delia",female,,0,0,382649,7.75,,Q
|
||||||
|
266,0,2,"Reeves, Mr. David",male,36,0,0,C.A. 17248,10.5,,S
|
||||||
|
267,0,3,"Panula, Mr. Ernesti Arvid",male,16,4,1,3101295,39.6875,,S
|
||||||
|
268,1,3,"Persson, Mr. Ernst Ulrik",male,25,1,0,347083,7.775,,S
|
||||||
|
269,1,1,"Graham, Mrs. William Thompson (Edith Junkins)",female,58,0,1,PC 17582,153.4625,C125,S
|
||||||
|
270,1,1,"Bissette, Miss. Amelia",female,35,0,0,PC 17760,135.6333,C99,S
|
||||||
|
271,0,1,"Cairns, Mr. Alexander",male,,0,0,113798,31,,S
|
||||||
|
272,1,3,"Tornquist, Mr. William Henry",male,25,0,0,LINE,0,,S
|
||||||
|
273,1,2,"Mellinger, Mrs. (Elizabeth Anne Maidment)",female,41,0,1,250644,19.5,,S
|
||||||
|
274,0,1,"Natsch, Mr. Charles H",male,37,0,1,PC 17596,29.7,C118,C
|
||||||
|
275,1,3,"Healy, Miss. Hanora ""Nora""",female,,0,0,370375,7.75,,Q
|
||||||
|
276,1,1,"Andrews, Miss. Kornelia Theodosia",female,63,1,0,13502,77.9583,D7,S
|
||||||
|
277,0,3,"Lindblom, Miss. Augusta Charlotta",female,45,0,0,347073,7.75,,S
|
||||||
|
278,0,2,"Parkes, Mr. Francis ""Frank""",male,,0,0,239853,0,,S
|
||||||
|
279,0,3,"Rice, Master. Eric",male,7,4,1,382652,29.125,,Q
|
||||||
|
280,1,3,"Abbott, Mrs. Stanton (Rosa Hunt)",female,35,1,1,C.A. 2673,20.25,,S
|
||||||
|
281,0,3,"Duane, Mr. Frank",male,65,0,0,336439,7.75,,Q
|
||||||
|
282,0,3,"Olsson, Mr. Nils Johan Goransson",male,28,0,0,347464,7.8542,,S
|
||||||
|
283,0,3,"de Pelsmaeker, Mr. Alfons",male,16,0,0,345778,9.5,,S
|
||||||
|
284,1,3,"Dorking, Mr. Edward Arthur",male,19,0,0,A/5. 10482,8.05,,S
|
||||||
|
285,0,1,"Smith, Mr. Richard William",male,,0,0,113056,26,A19,S
|
||||||
|
286,0,3,"Stankovic, Mr. Ivan",male,33,0,0,349239,8.6625,,C
|
||||||
|
287,1,3,"de Mulder, Mr. Theodore",male,30,0,0,345774,9.5,,S
|
||||||
|
288,0,3,"Naidenoff, Mr. Penko",male,22,0,0,349206,7.8958,,S
|
||||||
|
289,1,2,"Hosono, Mr. Masabumi",male,42,0,0,237798,13,,S
|
||||||
|
290,1,3,"Connolly, Miss. Kate",female,22,0,0,370373,7.75,,Q
|
||||||
|
291,1,1,"Barber, Miss. Ellen ""Nellie""",female,26,0,0,19877,78.85,,S
|
||||||
|
292,1,1,"Bishop, Mrs. Dickinson H (Helen Walton)",female,19,1,0,11967,91.0792,B49,C
|
||||||
|
293,0,2,"Levy, Mr. Rene Jacques",male,36,0,0,SC/Paris 2163,12.875,D,C
|
||||||
|
294,0,3,"Haas, Miss. Aloisia",female,24,0,0,349236,8.85,,S
|
||||||
|
295,0,3,"Mineff, Mr. Ivan",male,24,0,0,349233,7.8958,,S
|
||||||
|
296,0,1,"Lewy, Mr. Ervin G",male,,0,0,PC 17612,27.7208,,C
|
||||||
|
297,0,3,"Hanna, Mr. Mansour",male,23.5,0,0,2693,7.2292,,C
|
||||||
|
298,0,1,"Allison, Miss. Helen Loraine",female,2,1,2,113781,151.55,C22 C26,S
|
||||||
|
299,1,1,"Saalfeld, Mr. Adolphe",male,,0,0,19988,30.5,C106,S
|
||||||
|
300,1,1,"Baxter, Mrs. James (Helene DeLaudeniere Chaput)",female,50,0,1,PC 17558,247.5208,B58 B60,C
|
||||||
|
301,1,3,"Kelly, Miss. Anna Katherine ""Annie Kate""",female,,0,0,9234,7.75,,Q
|
||||||
|
302,1,3,"McCoy, Mr. Bernard",male,,2,0,367226,23.25,,Q
|
||||||
|
303,0,3,"Johnson, Mr. William Cahoone Jr",male,19,0,0,LINE,0,,S
|
||||||
|
304,1,2,"Keane, Miss. Nora A",female,,0,0,226593,12.35,E101,Q
|
||||||
|
305,0,3,"Williams, Mr. Howard Hugh ""Harry""",male,,0,0,A/5 2466,8.05,,S
|
||||||
|
306,1,1,"Allison, Master. Hudson Trevor",male,0.92,1,2,113781,151.55,C22 C26,S
|
||||||
|
307,1,1,"Fleming, Miss. Margaret",female,,0,0,17421,110.8833,,C
|
||||||
|
308,1,1,"Penasco y Castellana, Mrs. Victor de Satode (Maria Josefa Perez de Soto y Vallejo)",female,17,1,0,PC 17758,108.9,C65,C
|
||||||
|
309,0,2,"Abelson, Mr. Samuel",male,30,1,0,P/PP 3381,24,,C
|
||||||
|
310,1,1,"Francatelli, Miss. Laura Mabel",female,30,0,0,PC 17485,56.9292,E36,C
|
||||||
|
311,1,1,"Hays, Miss. Margaret Bechstein",female,24,0,0,11767,83.1583,C54,C
|
||||||
|
312,1,1,"Ryerson, Miss. Emily Borie",female,18,2,2,PC 17608,262.375,B57 B59 B63 B66,C
|
||||||
|
313,0,2,"Lahtinen, Mrs. William (Anna Sylfven)",female,26,1,1,250651,26,,S
|
||||||
|
314,0,3,"Hendekovic, Mr. Ignjac",male,28,0,0,349243,7.8958,,S
|
||||||
|
315,0,2,"Hart, Mr. Benjamin",male,43,1,1,F.C.C. 13529,26.25,,S
|
||||||
|
316,1,3,"Nilsson, Miss. Helmina Josefina",female,26,0,0,347470,7.8542,,S
|
||||||
|
317,1,2,"Kantor, Mrs. Sinai (Miriam Sternin)",female,24,1,0,244367,26,,S
|
||||||
|
318,0,2,"Moraweck, Dr. Ernest",male,54,0,0,29011,14,,S
|
||||||
|
319,1,1,"Wick, Miss. Mary Natalie",female,31,0,2,36928,164.8667,C7,S
|
||||||
|
320,1,1,"Spedden, Mrs. Frederic Oakley (Margaretta Corning Stone)",female,40,1,1,16966,134.5,E34,C
|
||||||
|
321,0,3,"Dennis, Mr. Samuel",male,22,0,0,A/5 21172,7.25,,S
|
||||||
|
322,0,3,"Danoff, Mr. Yoto",male,27,0,0,349219,7.8958,,S
|
||||||
|
323,1,2,"Slayter, Miss. Hilda Mary",female,30,0,0,234818,12.35,,Q
|
||||||
|
324,1,2,"Caldwell, Mrs. Albert Francis (Sylvia Mae Harbaugh)",female,22,1,1,248738,29,,S
|
||||||
|
325,0,3,"Sage, Mr. George John Jr",male,,8,2,CA. 2343,69.55,,S
|
||||||
|
326,1,1,"Young, Miss. Marie Grice",female,36,0,0,PC 17760,135.6333,C32,C
|
||||||
|
327,0,3,"Nysveen, Mr. Johan Hansen",male,61,0,0,345364,6.2375,,S
|
||||||
|
328,1,2,"Ball, Mrs. (Ada E Hall)",female,36,0,0,28551,13,D,S
|
||||||
|
329,1,3,"Goldsmith, Mrs. Frank John (Emily Alice Brown)",female,31,1,1,363291,20.525,,S
|
||||||
|
330,1,1,"Hippach, Miss. Jean Gertrude",female,16,0,1,111361,57.9792,B18,C
|
||||||
|
331,1,3,"McCoy, Miss. Agnes",female,,2,0,367226,23.25,,Q
|
||||||
|
332,0,1,"Partner, Mr. Austen",male,45.5,0,0,113043,28.5,C124,S
|
||||||
|
333,0,1,"Graham, Mr. George Edward",male,38,0,1,PC 17582,153.4625,C91,S
|
||||||
|
334,0,3,"Vander Planke, Mr. Leo Edmondus",male,16,2,0,345764,18,,S
|
||||||
|
335,1,1,"Frauenthal, Mrs. Henry William (Clara Heinsheimer)",female,,1,0,PC 17611,133.65,,S
|
||||||
|
336,0,3,"Denkoff, Mr. Mitto",male,,0,0,349225,7.8958,,S
|
||||||
|
337,0,1,"Pears, Mr. Thomas Clinton",male,29,1,0,113776,66.6,C2,S
|
||||||
|
338,1,1,"Burns, Miss. Elizabeth Margaret",female,41,0,0,16966,134.5,E40,C
|
||||||
|
339,1,3,"Dahl, Mr. Karl Edwart",male,45,0,0,7598,8.05,,S
|
||||||
|
340,0,1,"Blackwell, Mr. Stephen Weart",male,45,0,0,113784,35.5,T,S
|
||||||
|
341,1,2,"Navratil, Master. Edmond Roger",male,2,1,1,230080,26,F2,S
|
||||||
|
342,1,1,"Fortune, Miss. Alice Elizabeth",female,24,3,2,19950,263,C23 C25 C27,S
|
||||||
|
343,0,2,"Collander, Mr. Erik Gustaf",male,28,0,0,248740,13,,S
|
||||||
|
344,0,2,"Sedgwick, Mr. Charles Frederick Waddington",male,25,0,0,244361,13,,S
|
||||||
|
345,0,2,"Fox, Mr. Stanley Hubert",male,36,0,0,229236,13,,S
|
||||||
|
346,1,2,"Brown, Miss. Amelia ""Mildred""",female,24,0,0,248733,13,F33,S
|
||||||
|
347,1,2,"Smith, Miss. Marion Elsie",female,40,0,0,31418,13,,S
|
||||||
|
348,1,3,"Davison, Mrs. Thomas Henry (Mary E Finck)",female,,1,0,386525,16.1,,S
|
||||||
|
349,1,3,"Coutts, Master. William Loch ""William""",male,3,1,1,C.A. 37671,15.9,,S
|
||||||
|
350,0,3,"Dimic, Mr. Jovan",male,42,0,0,315088,8.6625,,S
|
||||||
|
351,0,3,"Odahl, Mr. Nils Martin",male,23,0,0,7267,9.225,,S
|
||||||
|
352,0,1,"Williams-Lambert, Mr. Fletcher Fellows",male,,0,0,113510,35,C128,S
|
||||||
|
353,0,3,"Elias, Mr. Tannous",male,15,1,1,2695,7.2292,,C
|
||||||
|
354,0,3,"Arnold-Franchi, Mr. Josef",male,25,1,0,349237,17.8,,S
|
||||||
|
355,0,3,"Yousif, Mr. Wazli",male,,0,0,2647,7.225,,C
|
||||||
|
356,0,3,"Vanden Steen, Mr. Leo Peter",male,28,0,0,345783,9.5,,S
|
||||||
|
357,1,1,"Bowerman, Miss. Elsie Edith",female,22,0,1,113505,55,E33,S
|
||||||
|
358,0,2,"Funk, Miss. Annie Clemmer",female,38,0,0,237671,13,,S
|
||||||
|
359,1,3,"McGovern, Miss. Mary",female,,0,0,330931,7.8792,,Q
|
||||||
|
360,1,3,"Mockler, Miss. Helen Mary ""Ellie""",female,,0,0,330980,7.8792,,Q
|
||||||
|
361,0,3,"Skoog, Mr. Wilhelm",male,40,1,4,347088,27.9,,S
|
||||||
|
362,0,2,"del Carlo, Mr. Sebastiano",male,29,1,0,SC/PARIS 2167,27.7208,,C
|
||||||
|
363,0,3,"Barbara, Mrs. (Catherine David)",female,45,0,1,2691,14.4542,,C
|
||||||
|
364,0,3,"Asim, Mr. Adola",male,35,0,0,SOTON/O.Q. 3101310,7.05,,S
|
||||||
|
365,0,3,"O'Brien, Mr. Thomas",male,,1,0,370365,15.5,,Q
|
||||||
|
366,0,3,"Adahl, Mr. Mauritz Nils Martin",male,30,0,0,C 7076,7.25,,S
|
||||||
|
367,1,1,"Warren, Mrs. Frank Manley (Anna Sophia Atkinson)",female,60,1,0,110813,75.25,D37,C
|
||||||
|
368,1,3,"Moussa, Mrs. (Mantoura Boulos)",female,,0,0,2626,7.2292,,C
|
||||||
|
369,1,3,"Jermyn, Miss. Annie",female,,0,0,14313,7.75,,Q
|
||||||
|
370,1,1,"Aubart, Mme. Leontine Pauline",female,24,0,0,PC 17477,69.3,B35,C
|
||||||
|
371,1,1,"Harder, Mr. George Achilles",male,25,1,0,11765,55.4417,E50,C
|
||||||
|
372,0,3,"Wiklund, Mr. Jakob Alfred",male,18,1,0,3101267,6.4958,,S
|
||||||
|
373,0,3,"Beavan, Mr. William Thomas",male,19,0,0,323951,8.05,,S
|
||||||
|
374,0,1,"Ringhini, Mr. Sante",male,22,0,0,PC 17760,135.6333,,C
|
||||||
|
375,0,3,"Palsson, Miss. Stina Viola",female,3,3,1,349909,21.075,,S
|
||||||
|
376,1,1,"Meyer, Mrs. Edgar Joseph (Leila Saks)",female,,1,0,PC 17604,82.1708,,C
|
||||||
|
377,1,3,"Landergren, Miss. Aurora Adelia",female,22,0,0,C 7077,7.25,,S
|
||||||
|
378,0,1,"Widener, Mr. Harry Elkins",male,27,0,2,113503,211.5,C82,C
|
||||||
|
379,0,3,"Betros, Mr. Tannous",male,20,0,0,2648,4.0125,,C
|
||||||
|
380,0,3,"Gustafsson, Mr. Karl Gideon",male,19,0,0,347069,7.775,,S
|
||||||
|
381,1,1,"Bidois, Miss. Rosalie",female,42,0,0,PC 17757,227.525,,C
|
||||||
|
382,1,3,"Nakid, Miss. Maria (""Mary"")",female,1,0,2,2653,15.7417,,C
|
||||||
|
383,0,3,"Tikkanen, Mr. Juho",male,32,0,0,STON/O 2. 3101293,7.925,,S
|
||||||
|
384,1,1,"Holverson, Mrs. Alexander Oskar (Mary Aline Towner)",female,35,1,0,113789,52,,S
|
||||||
|
385,0,3,"Plotcharsky, Mr. Vasil",male,,0,0,349227,7.8958,,S
|
||||||
|
386,0,2,"Davies, Mr. Charles Henry",male,18,0,0,S.O.C. 14879,73.5,,S
|
||||||
|
387,0,3,"Goodwin, Master. Sidney Leonard",male,1,5,2,CA 2144,46.9,,S
|
||||||
|
388,1,2,"Buss, Miss. Kate",female,36,0,0,27849,13,,S
|
||||||
|
389,0,3,"Sadlier, Mr. Matthew",male,,0,0,367655,7.7292,,Q
|
||||||
|
390,1,2,"Lehmann, Miss. Bertha",female,17,0,0,SC 1748,12,,C
|
||||||
|
391,1,1,"Carter, Mr. William Ernest",male,36,1,2,113760,120,B96 B98,S
|
||||||
|
392,1,3,"Jansson, Mr. Carl Olof",male,21,0,0,350034,7.7958,,S
|
||||||
|
393,0,3,"Gustafsson, Mr. Johan Birger",male,28,2,0,3101277,7.925,,S
|
||||||
|
394,1,1,"Newell, Miss. Marjorie",female,23,1,0,35273,113.275,D36,C
|
||||||
|
395,1,3,"Sandstrom, Mrs. Hjalmar (Agnes Charlotta Bengtsson)",female,24,0,2,PP 9549,16.7,G6,S
|
||||||
|
396,0,3,"Johansson, Mr. Erik",male,22,0,0,350052,7.7958,,S
|
||||||
|
397,0,3,"Olsson, Miss. Elina",female,31,0,0,350407,7.8542,,S
|
||||||
|
398,0,2,"McKane, Mr. Peter David",male,46,0,0,28403,26,,S
|
||||||
|
399,0,2,"Pain, Dr. Alfred",male,23,0,0,244278,10.5,,S
|
||||||
|
400,1,2,"Trout, Mrs. William H (Jessie L)",female,28,0,0,240929,12.65,,S
|
||||||
|
401,1,3,"Niskanen, Mr. Juha",male,39,0,0,STON/O 2. 3101289,7.925,,S
|
||||||
|
402,0,3,"Adams, Mr. John",male,26,0,0,341826,8.05,,S
|
||||||
|
403,0,3,"Jussila, Miss. Mari Aina",female,21,1,0,4137,9.825,,S
|
||||||
|
404,0,3,"Hakkarainen, Mr. Pekka Pietari",male,28,1,0,STON/O2. 3101279,15.85,,S
|
||||||
|
405,0,3,"Oreskovic, Miss. Marija",female,20,0,0,315096,8.6625,,S
|
||||||
|
406,0,2,"Gale, Mr. Shadrach",male,34,1,0,28664,21,,S
|
||||||
|
407,0,3,"Widegren, Mr. Carl/Charles Peter",male,51,0,0,347064,7.75,,S
|
||||||
|
408,1,2,"Richards, Master. William Rowe",male,3,1,1,29106,18.75,,S
|
||||||
|
409,0,3,"Birkeland, Mr. Hans Martin Monsen",male,21,0,0,312992,7.775,,S
|
||||||
|
410,0,3,"Lefebre, Miss. Ida",female,,3,1,4133,25.4667,,S
|
||||||
|
411,0,3,"Sdycoff, Mr. Todor",male,,0,0,349222,7.8958,,S
|
||||||
|
412,0,3,"Hart, Mr. Henry",male,,0,0,394140,6.8583,,Q
|
||||||
|
413,1,1,"Minahan, Miss. Daisy E",female,33,1,0,19928,90,C78,Q
|
||||||
|
414,0,2,"Cunningham, Mr. Alfred Fleming",male,,0,0,239853,0,,S
|
||||||
|
415,1,3,"Sundman, Mr. Johan Julian",male,44,0,0,STON/O 2. 3101269,7.925,,S
|
||||||
|
416,0,3,"Meek, Mrs. Thomas (Annie Louise Rowley)",female,,0,0,343095,8.05,,S
|
||||||
|
417,1,2,"Drew, Mrs. James Vivian (Lulu Thorne Christian)",female,34,1,1,28220,32.5,,S
|
||||||
|
418,1,2,"Silven, Miss. Lyyli Karoliina",female,18,0,2,250652,13,,S
|
||||||
|
419,0,2,"Matthews, Mr. William John",male,30,0,0,28228,13,,S
|
||||||
|
420,0,3,"Van Impe, Miss. Catharina",female,10,0,2,345773,24.15,,S
|
||||||
|
421,0,3,"Gheorgheff, Mr. Stanio",male,,0,0,349254,7.8958,,C
|
||||||
|
422,0,3,"Charters, Mr. David",male,21,0,0,A/5. 13032,7.7333,,Q
|
||||||
|
423,0,3,"Zimmerman, Mr. Leo",male,29,0,0,315082,7.875,,S
|
||||||
|
424,0,3,"Danbom, Mrs. Ernst Gilbert (Anna Sigrid Maria Brogren)",female,28,1,1,347080,14.4,,S
|
||||||
|
425,0,3,"Rosblom, Mr. Viktor Richard",male,18,1,1,370129,20.2125,,S
|
||||||
|
426,0,3,"Wiseman, Mr. Phillippe",male,,0,0,A/4. 34244,7.25,,S
|
||||||
|
427,1,2,"Clarke, Mrs. Charles V (Ada Maria Winfield)",female,28,1,0,2003,26,,S
|
||||||
|
428,1,2,"Phillips, Miss. Kate Florence (""Mrs Kate Louise Phillips Marshall"")",female,19,0,0,250655,26,,S
|
||||||
|
429,0,3,"Flynn, Mr. James",male,,0,0,364851,7.75,,Q
|
||||||
|
430,1,3,"Pickard, Mr. Berk (Berk Trembisky)",male,32,0,0,SOTON/O.Q. 392078,8.05,E10,S
|
||||||
|
431,1,1,"Bjornstrom-Steffansson, Mr. Mauritz Hakan",male,28,0,0,110564,26.55,C52,S
|
||||||
|
432,1,3,"Thorneycroft, Mrs. Percival (Florence Kate White)",female,,1,0,376564,16.1,,S
|
||||||
|
433,1,2,"Louch, Mrs. Charles Alexander (Alice Adelaide Slow)",female,42,1,0,SC/AH 3085,26,,S
|
||||||
|
434,0,3,"Kallio, Mr. Nikolai Erland",male,17,0,0,STON/O 2. 3101274,7.125,,S
|
||||||
|
435,0,1,"Silvey, Mr. William Baird",male,50,1,0,13507,55.9,E44,S
|
||||||
|
436,1,1,"Carter, Miss. Lucile Polk",female,14,1,2,113760,120,B96 B98,S
|
||||||
|
437,0,3,"Ford, Miss. Doolina Margaret ""Daisy""",female,21,2,2,W./C. 6608,34.375,,S
|
||||||
|
438,1,2,"Richards, Mrs. Sidney (Emily Hocking)",female,24,2,3,29106,18.75,,S
|
||||||
|
439,0,1,"Fortune, Mr. Mark",male,64,1,4,19950,263,C23 C25 C27,S
|
||||||
|
440,0,2,"Kvillner, Mr. Johan Henrik Johannesson",male,31,0,0,C.A. 18723,10.5,,S
|
||||||
|
441,1,2,"Hart, Mrs. Benjamin (Esther Ada Bloomfield)",female,45,1,1,F.C.C. 13529,26.25,,S
|
||||||
|
442,0,3,"Hampe, Mr. Leon",male,20,0,0,345769,9.5,,S
|
||||||
|
443,0,3,"Petterson, Mr. Johan Emil",male,25,1,0,347076,7.775,,S
|
||||||
|
444,1,2,"Reynaldo, Ms. Encarnacion",female,28,0,0,230434,13,,S
|
||||||
|
445,1,3,"Johannesen-Bratthammer, Mr. Bernt",male,,0,0,65306,8.1125,,S
|
||||||
|
446,1,1,"Dodge, Master. Washington",male,4,0,2,33638,81.8583,A34,S
|
||||||
|
447,1,2,"Mellinger, Miss. Madeleine Violet",female,13,0,1,250644,19.5,,S
|
||||||
|
448,1,1,"Seward, Mr. Frederic Kimber",male,34,0,0,113794,26.55,,S
|
||||||
|
449,1,3,"Baclini, Miss. Marie Catherine",female,5,2,1,2666,19.2583,,C
|
||||||
|
450,1,1,"Peuchen, Major. Arthur Godfrey",male,52,0,0,113786,30.5,C104,S
|
||||||
|
451,0,2,"West, Mr. Edwy Arthur",male,36,1,2,C.A. 34651,27.75,,S
|
||||||
|
452,0,3,"Hagland, Mr. Ingvald Olai Olsen",male,,1,0,65303,19.9667,,S
|
||||||
|
453,0,1,"Foreman, Mr. Benjamin Laventall",male,30,0,0,113051,27.75,C111,C
|
||||||
|
454,1,1,"Goldenberg, Mr. Samuel L",male,49,1,0,17453,89.1042,C92,C
|
||||||
|
455,0,3,"Peduzzi, Mr. Joseph",male,,0,0,A/5 2817,8.05,,S
|
||||||
|
456,1,3,"Jalsevac, Mr. Ivan",male,29,0,0,349240,7.8958,,C
|
||||||
|
457,0,1,"Millet, Mr. Francis Davis",male,65,0,0,13509,26.55,E38,S
|
||||||
|
458,1,1,"Kenyon, Mrs. Frederick R (Marion)",female,,1,0,17464,51.8625,D21,S
|
||||||
|
459,1,2,"Toomey, Miss. Ellen",female,50,0,0,F.C.C. 13531,10.5,,S
|
||||||
|
460,0,3,"O'Connor, Mr. Maurice",male,,0,0,371060,7.75,,Q
|
||||||
|
461,1,1,"Anderson, Mr. Harry",male,48,0,0,19952,26.55,E12,S
|
||||||
|
462,0,3,"Morley, Mr. William",male,34,0,0,364506,8.05,,S
|
||||||
|
463,0,1,"Gee, Mr. Arthur H",male,47,0,0,111320,38.5,E63,S
|
||||||
|
464,0,2,"Milling, Mr. Jacob Christian",male,48,0,0,234360,13,,S
|
||||||
|
465,0,3,"Maisner, Mr. Simon",male,,0,0,A/S 2816,8.05,,S
|
||||||
|
466,0,3,"Goncalves, Mr. Manuel Estanslas",male,38,0,0,SOTON/O.Q. 3101306,7.05,,S
|
||||||
|
467,0,2,"Campbell, Mr. William",male,,0,0,239853,0,,S
|
||||||
|
468,0,1,"Smart, Mr. John Montgomery",male,56,0,0,113792,26.55,,S
|
||||||
|
469,0,3,"Scanlan, Mr. James",male,,0,0,36209,7.725,,Q
|
||||||
|
470,1,3,"Baclini, Miss. Helene Barbara",female,0.75,2,1,2666,19.2583,,C
|
||||||
|
471,0,3,"Keefe, Mr. Arthur",male,,0,0,323592,7.25,,S
|
||||||
|
472,0,3,"Cacic, Mr. Luka",male,38,0,0,315089,8.6625,,S
|
||||||
|
473,1,2,"West, Mrs. Edwy Arthur (Ada Mary Worth)",female,33,1,2,C.A. 34651,27.75,,S
|
||||||
|
474,1,2,"Jerwan, Mrs. Amin S (Marie Marthe Thuillard)",female,23,0,0,SC/AH Basle 541,13.7917,D,C
|
||||||
|
475,0,3,"Strandberg, Miss. Ida Sofia",female,22,0,0,7553,9.8375,,S
|
||||||
|
476,0,1,"Clifford, Mr. George Quincy",male,,0,0,110465,52,A14,S
|
||||||
|
477,0,2,"Renouf, Mr. Peter Henry",male,34,1,0,31027,21,,S
|
||||||
|
478,0,3,"Braund, Mr. Lewis Richard",male,29,1,0,3460,7.0458,,S
|
||||||
|
479,0,3,"Karlsson, Mr. Nils August",male,22,0,0,350060,7.5208,,S
|
||||||
|
480,1,3,"Hirvonen, Miss. Hildur E",female,2,0,1,3101298,12.2875,,S
|
||||||
|
481,0,3,"Goodwin, Master. Harold Victor",male,9,5,2,CA 2144,46.9,,S
|
||||||
|
482,0,2,"Frost, Mr. Anthony Wood ""Archie""",male,,0,0,239854,0,,S
|
||||||
|
483,0,3,"Rouse, Mr. Richard Henry",male,50,0,0,A/5 3594,8.05,,S
|
||||||
|
484,1,3,"Turkula, Mrs. (Hedwig)",female,63,0,0,4134,9.5875,,S
|
||||||
|
485,1,1,"Bishop, Mr. Dickinson H",male,25,1,0,11967,91.0792,B49,C
|
||||||
|
486,0,3,"Lefebre, Miss. Jeannie",female,,3,1,4133,25.4667,,S
|
||||||
|
487,1,1,"Hoyt, Mrs. Frederick Maxfield (Jane Anne Forby)",female,35,1,0,19943,90,C93,S
|
||||||
|
488,0,1,"Kent, Mr. Edward Austin",male,58,0,0,11771,29.7,B37,C
|
||||||
|
489,0,3,"Somerton, Mr. Francis William",male,30,0,0,A.5. 18509,8.05,,S
|
||||||
|
490,1,3,"Coutts, Master. Eden Leslie ""Neville""",male,9,1,1,C.A. 37671,15.9,,S
|
||||||
|
491,0,3,"Hagland, Mr. Konrad Mathias Reiersen",male,,1,0,65304,19.9667,,S
|
||||||
|
492,0,3,"Windelov, Mr. Einar",male,21,0,0,SOTON/OQ 3101317,7.25,,S
|
||||||
|
493,0,1,"Molson, Mr. Harry Markland",male,55,0,0,113787,30.5,C30,S
|
||||||
|
494,0,1,"Artagaveytia, Mr. Ramon",male,71,0,0,PC 17609,49.5042,,C
|
||||||
|
495,0,3,"Stanley, Mr. Edward Roland",male,21,0,0,A/4 45380,8.05,,S
|
||||||
|
496,0,3,"Yousseff, Mr. Gerious",male,,0,0,2627,14.4583,,C
|
||||||
|
497,1,1,"Eustis, Miss. Elizabeth Mussey",female,54,1,0,36947,78.2667,D20,C
|
||||||
|
498,0,3,"Shellard, Mr. Frederick William",male,,0,0,C.A. 6212,15.1,,S
|
||||||
|
499,0,1,"Allison, Mrs. Hudson J C (Bessie Waldo Daniels)",female,25,1,2,113781,151.55,C22 C26,S
|
||||||
|
500,0,3,"Svensson, Mr. Olof",male,24,0,0,350035,7.7958,,S
|
||||||
|
501,0,3,"Calic, Mr. Petar",male,17,0,0,315086,8.6625,,S
|
||||||
|
502,0,3,"Canavan, Miss. Mary",female,21,0,0,364846,7.75,,Q
|
||||||
|
503,0,3,"O'Sullivan, Miss. Bridget Mary",female,,0,0,330909,7.6292,,Q
|
||||||
|
504,0,3,"Laitinen, Miss. Kristina Sofia",female,37,0,0,4135,9.5875,,S
|
||||||
|
505,1,1,"Maioni, Miss. Roberta",female,16,0,0,110152,86.5,B79,S
|
||||||
|
506,0,1,"Penasco y Castellana, Mr. Victor de Satode",male,18,1,0,PC 17758,108.9,C65,C
|
||||||
|
507,1,2,"Quick, Mrs. Frederick Charles (Jane Richards)",female,33,0,2,26360,26,,S
|
||||||
|
508,1,1,"Bradley, Mr. George (""George Arthur Brayton"")",male,,0,0,111427,26.55,,S
|
||||||
|
509,0,3,"Olsen, Mr. Henry Margido",male,28,0,0,C 4001,22.525,,S
|
||||||
|
510,1,3,"Lang, Mr. Fang",male,26,0,0,1601,56.4958,,S
|
||||||
|
511,1,3,"Daly, Mr. Eugene Patrick",male,29,0,0,382651,7.75,,Q
|
||||||
|
512,0,3,"Webber, Mr. James",male,,0,0,SOTON/OQ 3101316,8.05,,S
|
||||||
|
513,1,1,"McGough, Mr. James Robert",male,36,0,0,PC 17473,26.2875,E25,S
|
||||||
|
514,1,1,"Rothschild, Mrs. Martin (Elizabeth L. Barrett)",female,54,1,0,PC 17603,59.4,,C
|
||||||
|
515,0,3,"Coleff, Mr. Satio",male,24,0,0,349209,7.4958,,S
|
||||||
|
516,0,1,"Walker, Mr. William Anderson",male,47,0,0,36967,34.0208,D46,S
|
||||||
|
517,1,2,"Lemore, Mrs. (Amelia Milley)",female,34,0,0,C.A. 34260,10.5,F33,S
|
||||||
|
518,0,3,"Ryan, Mr. Patrick",male,,0,0,371110,24.15,,Q
|
||||||
|
519,1,2,"Angle, Mrs. William A (Florence ""Mary"" Agnes Hughes)",female,36,1,0,226875,26,,S
|
||||||
|
520,0,3,"Pavlovic, Mr. Stefo",male,32,0,0,349242,7.8958,,S
|
||||||
|
521,1,1,"Perreault, Miss. Anne",female,30,0,0,12749,93.5,B73,S
|
||||||
|
522,0,3,"Vovk, Mr. Janko",male,22,0,0,349252,7.8958,,S
|
||||||
|
523,0,3,"Lahoud, Mr. Sarkis",male,,0,0,2624,7.225,,C
|
||||||
|
524,1,1,"Hippach, Mrs. Louis Albert (Ida Sophia Fischer)",female,44,0,1,111361,57.9792,B18,C
|
||||||
|
525,0,3,"Kassem, Mr. Fared",male,,0,0,2700,7.2292,,C
|
||||||
|
526,0,3,"Farrell, Mr. James",male,40.5,0,0,367232,7.75,,Q
|
||||||
|
527,1,2,"Ridsdale, Miss. Lucy",female,50,0,0,W./C. 14258,10.5,,S
|
||||||
|
528,0,1,"Farthing, Mr. John",male,,0,0,PC 17483,221.7792,C95,S
|
||||||
|
529,0,3,"Salonen, Mr. Johan Werner",male,39,0,0,3101296,7.925,,S
|
||||||
|
530,0,2,"Hocking, Mr. Richard George",male,23,2,1,29104,11.5,,S
|
||||||
|
531,1,2,"Quick, Miss. Phyllis May",female,2,1,1,26360,26,,S
|
||||||
|
532,0,3,"Toufik, Mr. Nakli",male,,0,0,2641,7.2292,,C
|
||||||
|
533,0,3,"Elias, Mr. Joseph Jr",male,17,1,1,2690,7.2292,,C
|
||||||
|
534,1,3,"Peter, Mrs. Catherine (Catherine Rizk)",female,,0,2,2668,22.3583,,C
|
||||||
|
535,0,3,"Cacic, Miss. Marija",female,30,0,0,315084,8.6625,,S
|
||||||
|
536,1,2,"Hart, Miss. Eva Miriam",female,7,0,2,F.C.C. 13529,26.25,,S
|
||||||
|
537,0,1,"Butt, Major. Archibald Willingham",male,45,0,0,113050,26.55,B38,S
|
||||||
|
538,1,1,"LeRoy, Miss. Bertha",female,30,0,0,PC 17761,106.425,,C
|
||||||
|
539,0,3,"Risien, Mr. Samuel Beard",male,,0,0,364498,14.5,,S
|
||||||
|
540,1,1,"Frolicher, Miss. Hedwig Margaritha",female,22,0,2,13568,49.5,B39,C
|
||||||
|
541,1,1,"Crosby, Miss. Harriet R",female,36,0,2,WE/P 5735,71,B22,S
|
||||||
|
542,0,3,"Andersson, Miss. Ingeborg Constanzia",female,9,4,2,347082,31.275,,S
|
||||||
|
543,0,3,"Andersson, Miss. Sigrid Elisabeth",female,11,4,2,347082,31.275,,S
|
||||||
|
544,1,2,"Beane, Mr. Edward",male,32,1,0,2908,26,,S
|
||||||
|
545,0,1,"Douglas, Mr. Walter Donald",male,50,1,0,PC 17761,106.425,C86,C
|
||||||
|
546,0,1,"Nicholson, Mr. Arthur Ernest",male,64,0,0,693,26,,S
|
||||||
|
547,1,2,"Beane, Mrs. Edward (Ethel Clarke)",female,19,1,0,2908,26,,S
|
||||||
|
548,1,2,"Padro y Manent, Mr. Julian",male,,0,0,SC/PARIS 2146,13.8625,,C
|
||||||
|
549,0,3,"Goldsmith, Mr. Frank John",male,33,1,1,363291,20.525,,S
|
||||||
|
550,1,2,"Davies, Master. John Morgan Jr",male,8,1,1,C.A. 33112,36.75,,S
|
||||||
|
551,1,1,"Thayer, Mr. John Borland Jr",male,17,0,2,17421,110.8833,C70,C
|
||||||
|
552,0,2,"Sharp, Mr. Percival James R",male,27,0,0,244358,26,,S
|
||||||
|
553,0,3,"O'Brien, Mr. Timothy",male,,0,0,330979,7.8292,,Q
|
||||||
|
554,1,3,"Leeni, Mr. Fahim (""Philip Zenni"")",male,22,0,0,2620,7.225,,C
|
||||||
|
555,1,3,"Ohman, Miss. Velin",female,22,0,0,347085,7.775,,S
|
||||||
|
556,0,1,"Wright, Mr. George",male,62,0,0,113807,26.55,,S
|
||||||
|
557,1,1,"Duff Gordon, Lady. (Lucille Christiana Sutherland) (""Mrs Morgan"")",female,48,1,0,11755,39.6,A16,C
|
||||||
|
558,0,1,"Robbins, Mr. Victor",male,,0,0,PC 17757,227.525,,C
|
||||||
|
559,1,1,"Taussig, Mrs. Emil (Tillie Mandelbaum)",female,39,1,1,110413,79.65,E67,S
|
||||||
|
560,1,3,"de Messemaeker, Mrs. Guillaume Joseph (Emma)",female,36,1,0,345572,17.4,,S
|
||||||
|
561,0,3,"Morrow, Mr. Thomas Rowan",male,,0,0,372622,7.75,,Q
|
||||||
|
562,0,3,"Sivic, Mr. Husein",male,40,0,0,349251,7.8958,,S
|
||||||
|
563,0,2,"Norman, Mr. Robert Douglas",male,28,0,0,218629,13.5,,S
|
||||||
|
564,0,3,"Simmons, Mr. John",male,,0,0,SOTON/OQ 392082,8.05,,S
|
||||||
|
565,0,3,"Meanwell, Miss. (Marion Ogden)",female,,0,0,SOTON/O.Q. 392087,8.05,,S
|
||||||
|
566,0,3,"Davies, Mr. Alfred J",male,24,2,0,A/4 48871,24.15,,S
|
||||||
|
567,0,3,"Stoytcheff, Mr. Ilia",male,19,0,0,349205,7.8958,,S
|
||||||
|
568,0,3,"Palsson, Mrs. Nils (Alma Cornelia Berglund)",female,29,0,4,349909,21.075,,S
|
||||||
|
569,0,3,"Doharr, Mr. Tannous",male,,0,0,2686,7.2292,,C
|
||||||
|
570,1,3,"Jonsson, Mr. Carl",male,32,0,0,350417,7.8542,,S
|
||||||
|
571,1,2,"Harris, Mr. George",male,62,0,0,S.W./PP 752,10.5,,S
|
||||||
|
572,1,1,"Appleton, Mrs. Edward Dale (Charlotte Lamson)",female,53,2,0,11769,51.4792,C101,S
|
||||||
|
573,1,1,"Flynn, Mr. John Irwin (""Irving"")",male,36,0,0,PC 17474,26.3875,E25,S
|
||||||
|
574,1,3,"Kelly, Miss. Mary",female,,0,0,14312,7.75,,Q
|
||||||
|
575,0,3,"Rush, Mr. Alfred George John",male,16,0,0,A/4. 20589,8.05,,S
|
||||||
|
576,0,3,"Patchett, Mr. George",male,19,0,0,358585,14.5,,S
|
||||||
|
577,1,2,"Garside, Miss. Ethel",female,34,0,0,243880,13,,S
|
||||||
|
578,1,1,"Silvey, Mrs. William Baird (Alice Munger)",female,39,1,0,13507,55.9,E44,S
|
||||||
|
579,0,3,"Caram, Mrs. Joseph (Maria Elias)",female,,1,0,2689,14.4583,,C
|
||||||
|
580,1,3,"Jussila, Mr. Eiriik",male,32,0,0,STON/O 2. 3101286,7.925,,S
|
||||||
|
581,1,2,"Christy, Miss. Julie Rachel",female,25,1,1,237789,30,,S
|
||||||
|
582,1,1,"Thayer, Mrs. John Borland (Marian Longstreth Morris)",female,39,1,1,17421,110.8833,C68,C
|
||||||
|
583,0,2,"Downton, Mr. William James",male,54,0,0,28403,26,,S
|
||||||
|
584,0,1,"Ross, Mr. John Hugo",male,36,0,0,13049,40.125,A10,C
|
||||||
|
585,0,3,"Paulner, Mr. Uscher",male,,0,0,3411,8.7125,,C
|
||||||
|
586,1,1,"Taussig, Miss. Ruth",female,18,0,2,110413,79.65,E68,S
|
||||||
|
587,0,2,"Jarvis, Mr. John Denzil",male,47,0,0,237565,15,,S
|
||||||
|
588,1,1,"Frolicher-Stehli, Mr. Maxmillian",male,60,1,1,13567,79.2,B41,C
|
||||||
|
589,0,3,"Gilinski, Mr. Eliezer",male,22,0,0,14973,8.05,,S
|
||||||
|
590,0,3,"Murdlin, Mr. Joseph",male,,0,0,A./5. 3235,8.05,,S
|
||||||
|
591,0,3,"Rintamaki, Mr. Matti",male,35,0,0,STON/O 2. 3101273,7.125,,S
|
||||||
|
592,1,1,"Stephenson, Mrs. Walter Bertram (Martha Eustis)",female,52,1,0,36947,78.2667,D20,C
|
||||||
|
593,0,3,"Elsbury, Mr. William James",male,47,0,0,A/5 3902,7.25,,S
|
||||||
|
594,0,3,"Bourke, Miss. Mary",female,,0,2,364848,7.75,,Q
|
||||||
|
595,0,2,"Chapman, Mr. John Henry",male,37,1,0,SC/AH 29037,26,,S
|
||||||
|
596,0,3,"Van Impe, Mr. Jean Baptiste",male,36,1,1,345773,24.15,,S
|
||||||
|
597,1,2,"Leitch, Miss. Jessie Wills",female,,0,0,248727,33,,S
|
||||||
|
598,0,3,"Johnson, Mr. Alfred",male,49,0,0,LINE,0,,S
|
||||||
|
599,0,3,"Boulos, Mr. Hanna",male,,0,0,2664,7.225,,C
|
||||||
|
600,1,1,"Duff Gordon, Sir. Cosmo Edmund (""Mr Morgan"")",male,49,1,0,PC 17485,56.9292,A20,C
|
||||||
|
601,1,2,"Jacobsohn, Mrs. Sidney Samuel (Amy Frances Christy)",female,24,2,1,243847,27,,S
|
||||||
|
602,0,3,"Slabenoff, Mr. Petco",male,,0,0,349214,7.8958,,S
|
||||||
|
603,0,1,"Harrington, Mr. Charles H",male,,0,0,113796,42.4,,S
|
||||||
|
604,0,3,"Torber, Mr. Ernst William",male,44,0,0,364511,8.05,,S
|
||||||
|
605,1,1,"Homer, Mr. Harry (""Mr E Haven"")",male,35,0,0,111426,26.55,,C
|
||||||
|
606,0,3,"Lindell, Mr. Edvard Bengtsson",male,36,1,0,349910,15.55,,S
|
||||||
|
607,0,3,"Karaic, Mr. Milan",male,30,0,0,349246,7.8958,,S
|
||||||
|
608,1,1,"Daniel, Mr. Robert Williams",male,27,0,0,113804,30.5,,S
|
||||||
|
609,1,2,"Laroche, Mrs. Joseph (Juliette Marie Louise Lafargue)",female,22,1,2,SC/Paris 2123,41.5792,,C
|
||||||
|
610,1,1,"Shutes, Miss. Elizabeth W",female,40,0,0,PC 17582,153.4625,C125,S
|
||||||
|
611,0,3,"Andersson, Mrs. Anders Johan (Alfrida Konstantia Brogren)",female,39,1,5,347082,31.275,,S
|
||||||
|
612,0,3,"Jardin, Mr. Jose Neto",male,,0,0,SOTON/O.Q. 3101305,7.05,,S
|
||||||
|
613,1,3,"Murphy, Miss. Margaret Jane",female,,1,0,367230,15.5,,Q
|
||||||
|
614,0,3,"Horgan, Mr. John",male,,0,0,370377,7.75,,Q
|
||||||
|
615,0,3,"Brocklebank, Mr. William Alfred",male,35,0,0,364512,8.05,,S
|
||||||
|
616,1,2,"Herman, Miss. Alice",female,24,1,2,220845,65,,S
|
||||||
|
617,0,3,"Danbom, Mr. Ernst Gilbert",male,34,1,1,347080,14.4,,S
|
||||||
|
618,0,3,"Lobb, Mrs. William Arthur (Cordelia K Stanlick)",female,26,1,0,A/5. 3336,16.1,,S
|
||||||
|
619,1,2,"Becker, Miss. Marion Louise",female,4,2,1,230136,39,F4,S
|
||||||
|
620,0,2,"Gavey, Mr. Lawrence",male,26,0,0,31028,10.5,,S
|
||||||
|
621,0,3,"Yasbeck, Mr. Antoni",male,27,1,0,2659,14.4542,,C
|
||||||
|
622,1,1,"Kimball, Mr. Edwin Nelson Jr",male,42,1,0,11753,52.5542,D19,S
|
||||||
|
623,1,3,"Nakid, Mr. Sahid",male,20,1,1,2653,15.7417,,C
|
||||||
|
624,0,3,"Hansen, Mr. Henry Damsgaard",male,21,0,0,350029,7.8542,,S
|
||||||
|
625,0,3,"Bowen, Mr. David John ""Dai""",male,21,0,0,54636,16.1,,S
|
||||||
|
626,0,1,"Sutton, Mr. Frederick",male,61,0,0,36963,32.3208,D50,S
|
||||||
|
627,0,2,"Kirkland, Rev. Charles Leonard",male,57,0,0,219533,12.35,,Q
|
||||||
|
628,1,1,"Longley, Miss. Gretchen Fiske",female,21,0,0,13502,77.9583,D9,S
|
||||||
|
629,0,3,"Bostandyeff, Mr. Guentcho",male,26,0,0,349224,7.8958,,S
|
||||||
|
630,0,3,"O'Connell, Mr. Patrick D",male,,0,0,334912,7.7333,,Q
|
||||||
|
631,1,1,"Barkworth, Mr. Algernon Henry Wilson",male,80,0,0,27042,30,A23,S
|
||||||
|
632,0,3,"Lundahl, Mr. Johan Svensson",male,51,0,0,347743,7.0542,,S
|
||||||
|
633,1,1,"Stahelin-Maeglin, Dr. Max",male,32,0,0,13214,30.5,B50,C
|
||||||
|
634,0,1,"Parr, Mr. William Henry Marsh",male,,0,0,112052,0,,S
|
||||||
|
635,0,3,"Skoog, Miss. Mabel",female,9,3,2,347088,27.9,,S
|
||||||
|
636,1,2,"Davis, Miss. Mary",female,28,0,0,237668,13,,S
|
||||||
|
637,0,3,"Leinonen, Mr. Antti Gustaf",male,32,0,0,STON/O 2. 3101292,7.925,,S
|
||||||
|
638,0,2,"Collyer, Mr. Harvey",male,31,1,1,C.A. 31921,26.25,,S
|
||||||
|
639,0,3,"Panula, Mrs. Juha (Maria Emilia Ojala)",female,41,0,5,3101295,39.6875,,S
|
||||||
|
640,0,3,"Thorneycroft, Mr. Percival",male,,1,0,376564,16.1,,S
|
||||||
|
641,0,3,"Jensen, Mr. Hans Peder",male,20,0,0,350050,7.8542,,S
|
||||||
|
642,1,1,"Sagesser, Mlle. Emma",female,24,0,0,PC 17477,69.3,B35,C
|
||||||
|
643,0,3,"Skoog, Miss. Margit Elizabeth",female,2,3,2,347088,27.9,,S
|
||||||
|
644,1,3,"Foo, Mr. Choong",male,,0,0,1601,56.4958,,S
|
||||||
|
645,1,3,"Baclini, Miss. Eugenie",female,0.75,2,1,2666,19.2583,,C
|
||||||
|
646,1,1,"Harper, Mr. Henry Sleeper",male,48,1,0,PC 17572,76.7292,D33,C
|
||||||
|
647,0,3,"Cor, Mr. Liudevit",male,19,0,0,349231,7.8958,,S
|
||||||
|
648,1,1,"Simonius-Blumer, Col. Oberst Alfons",male,56,0,0,13213,35.5,A26,C
|
||||||
|
649,0,3,"Willey, Mr. Edward",male,,0,0,S.O./P.P. 751,7.55,,S
|
||||||
|
650,1,3,"Stanley, Miss. Amy Zillah Elsie",female,23,0,0,CA. 2314,7.55,,S
|
||||||
|
651,0,3,"Mitkoff, Mr. Mito",male,,0,0,349221,7.8958,,S
|
||||||
|
652,1,2,"Doling, Miss. Elsie",female,18,0,1,231919,23,,S
|
||||||
|
653,0,3,"Kalvik, Mr. Johannes Halvorsen",male,21,0,0,8475,8.4333,,S
|
||||||
|
654,1,3,"O'Leary, Miss. Hanora ""Norah""",female,,0,0,330919,7.8292,,Q
|
||||||
|
655,0,3,"Hegarty, Miss. Hanora ""Nora""",female,18,0,0,365226,6.75,,Q
|
||||||
|
656,0,2,"Hickman, Mr. Leonard Mark",male,24,2,0,S.O.C. 14879,73.5,,S
|
||||||
|
657,0,3,"Radeff, Mr. Alexander",male,,0,0,349223,7.8958,,S
|
||||||
|
658,0,3,"Bourke, Mrs. John (Catherine)",female,32,1,1,364849,15.5,,Q
|
||||||
|
659,0,2,"Eitemiller, Mr. George Floyd",male,23,0,0,29751,13,,S
|
||||||
|
660,0,1,"Newell, Mr. Arthur Webster",male,58,0,2,35273,113.275,D48,C
|
||||||
|
661,1,1,"Frauenthal, Dr. Henry William",male,50,2,0,PC 17611,133.65,,S
|
||||||
|
662,0,3,"Badt, Mr. Mohamed",male,40,0,0,2623,7.225,,C
|
||||||
|
663,0,1,"Colley, Mr. Edward Pomeroy",male,47,0,0,5727,25.5875,E58,S
|
||||||
|
664,0,3,"Coleff, Mr. Peju",male,36,0,0,349210,7.4958,,S
|
||||||
|
665,1,3,"Lindqvist, Mr. Eino William",male,20,1,0,STON/O 2. 3101285,7.925,,S
|
||||||
|
666,0,2,"Hickman, Mr. Lewis",male,32,2,0,S.O.C. 14879,73.5,,S
|
||||||
|
667,0,2,"Butler, Mr. Reginald Fenton",male,25,0,0,234686,13,,S
|
||||||
|
668,0,3,"Rommetvedt, Mr. Knud Paust",male,,0,0,312993,7.775,,S
|
||||||
|
669,0,3,"Cook, Mr. Jacob",male,43,0,0,A/5 3536,8.05,,S
|
||||||
|
670,1,1,"Taylor, Mrs. Elmer Zebley (Juliet Cummins Wright)",female,,1,0,19996,52,C126,S
|
||||||
|
671,1,2,"Brown, Mrs. Thomas William Solomon (Elizabeth Catherine Ford)",female,40,1,1,29750,39,,S
|
||||||
|
672,0,1,"Davidson, Mr. Thornton",male,31,1,0,F.C. 12750,52,B71,S
|
||||||
|
673,0,2,"Mitchell, Mr. Henry Michael",male,70,0,0,C.A. 24580,10.5,,S
|
||||||
|
674,1,2,"Wilhelms, Mr. Charles",male,31,0,0,244270,13,,S
|
||||||
|
675,0,2,"Watson, Mr. Ennis Hastings",male,,0,0,239856,0,,S
|
||||||
|
676,0,3,"Edvardsson, Mr. Gustaf Hjalmar",male,18,0,0,349912,7.775,,S
|
||||||
|
677,0,3,"Sawyer, Mr. Frederick Charles",male,24.5,0,0,342826,8.05,,S
|
||||||
|
678,1,3,"Turja, Miss. Anna Sofia",female,18,0,0,4138,9.8417,,S
|
||||||
|
679,0,3,"Goodwin, Mrs. Frederick (Augusta Tyler)",female,43,1,6,CA 2144,46.9,,S
|
||||||
|
680,1,1,"Cardeza, Mr. Thomas Drake Martinez",male,36,0,1,PC 17755,512.3292,B51 B53 B55,C
|
||||||
|
681,0,3,"Peters, Miss. Katie",female,,0,0,330935,8.1375,,Q
|
||||||
|
682,1,1,"Hassab, Mr. Hammad",male,27,0,0,PC 17572,76.7292,D49,C
|
||||||
|
683,0,3,"Olsvigen, Mr. Thor Anderson",male,20,0,0,6563,9.225,,S
|
||||||
|
684,0,3,"Goodwin, Mr. Charles Edward",male,14,5,2,CA 2144,46.9,,S
|
||||||
|
685,0,2,"Brown, Mr. Thomas William Solomon",male,60,1,1,29750,39,,S
|
||||||
|
686,0,2,"Laroche, Mr. Joseph Philippe Lemercier",male,25,1,2,SC/Paris 2123,41.5792,,C
|
||||||
|
687,0,3,"Panula, Mr. Jaako Arnold",male,14,4,1,3101295,39.6875,,S
|
||||||
|
688,0,3,"Dakic, Mr. Branko",male,19,0,0,349228,10.1708,,S
|
||||||
|
689,0,3,"Fischer, Mr. Eberhard Thelander",male,18,0,0,350036,7.7958,,S
|
||||||
|
690,1,1,"Madill, Miss. Georgette Alexandra",female,15,0,1,24160,211.3375,B5,S
|
||||||
|
691,1,1,"Dick, Mr. Albert Adrian",male,31,1,0,17474,57,B20,S
|
||||||
|
692,1,3,"Karun, Miss. Manca",female,4,0,1,349256,13.4167,,C
|
||||||
|
693,1,3,"Lam, Mr. Ali",male,,0,0,1601,56.4958,,S
|
||||||
|
694,0,3,"Saad, Mr. Khalil",male,25,0,0,2672,7.225,,C
|
||||||
|
695,0,1,"Weir, Col. John",male,60,0,0,113800,26.55,,S
|
||||||
|
696,0,2,"Chapman, Mr. Charles Henry",male,52,0,0,248731,13.5,,S
|
||||||
|
697,0,3,"Kelly, Mr. James",male,44,0,0,363592,8.05,,S
|
||||||
|
698,1,3,"Mullens, Miss. Katherine ""Katie""",female,,0,0,35852,7.7333,,Q
|
||||||
|
699,0,1,"Thayer, Mr. John Borland",male,49,1,1,17421,110.8833,C68,C
|
||||||
|
700,0,3,"Humblen, Mr. Adolf Mathias Nicolai Olsen",male,42,0,0,348121,7.65,F G63,S
|
||||||
|
701,1,1,"Astor, Mrs. John Jacob (Madeleine Talmadge Force)",female,18,1,0,PC 17757,227.525,C62 C64,C
|
||||||
|
702,1,1,"Silverthorne, Mr. Spencer Victor",male,35,0,0,PC 17475,26.2875,E24,S
|
||||||
|
703,0,3,"Barbara, Miss. Saiide",female,18,0,1,2691,14.4542,,C
|
||||||
|
704,0,3,"Gallagher, Mr. Martin",male,25,0,0,36864,7.7417,,Q
|
||||||
|
705,0,3,"Hansen, Mr. Henrik Juul",male,26,1,0,350025,7.8542,,S
|
||||||
|
706,0,2,"Morley, Mr. Henry Samuel (""Mr Henry Marshall"")",male,39,0,0,250655,26,,S
|
||||||
|
707,1,2,"Kelly, Mrs. Florence ""Fannie""",female,45,0,0,223596,13.5,,S
|
||||||
|
708,1,1,"Calderhead, Mr. Edward Pennington",male,42,0,0,PC 17476,26.2875,E24,S
|
||||||
|
709,1,1,"Cleaver, Miss. Alice",female,22,0,0,113781,151.55,,S
|
||||||
|
710,1,3,"Moubarek, Master. Halim Gonios (""William George"")",male,,1,1,2661,15.2458,,C
|
||||||
|
711,1,1,"Mayne, Mlle. Berthe Antonine (""Mrs de Villiers"")",female,24,0,0,PC 17482,49.5042,C90,C
|
||||||
|
712,0,1,"Klaber, Mr. Herman",male,,0,0,113028,26.55,C124,S
|
||||||
|
713,1,1,"Taylor, Mr. Elmer Zebley",male,48,1,0,19996,52,C126,S
|
||||||
|
714,0,3,"Larsson, Mr. August Viktor",male,29,0,0,7545,9.4833,,S
|
||||||
|
715,0,2,"Greenberg, Mr. Samuel",male,52,0,0,250647,13,,S
|
||||||
|
716,0,3,"Soholt, Mr. Peter Andreas Lauritz Andersen",male,19,0,0,348124,7.65,F G73,S
|
||||||
|
717,1,1,"Endres, Miss. Caroline Louise",female,38,0,0,PC 17757,227.525,C45,C
|
||||||
|
718,1,2,"Troutt, Miss. Edwina Celia ""Winnie""",female,27,0,0,34218,10.5,E101,S
|
||||||
|
719,0,3,"McEvoy, Mr. Michael",male,,0,0,36568,15.5,,Q
|
||||||
|
720,0,3,"Johnson, Mr. Malkolm Joackim",male,33,0,0,347062,7.775,,S
|
||||||
|
721,1,2,"Harper, Miss. Annie Jessie ""Nina""",female,6,0,1,248727,33,,S
|
||||||
|
722,0,3,"Jensen, Mr. Svend Lauritz",male,17,1,0,350048,7.0542,,S
|
||||||
|
723,0,2,"Gillespie, Mr. William Henry",male,34,0,0,12233,13,,S
|
||||||
|
724,0,2,"Hodges, Mr. Henry Price",male,50,0,0,250643,13,,S
|
||||||
|
725,1,1,"Chambers, Mr. Norman Campbell",male,27,1,0,113806,53.1,E8,S
|
||||||
|
726,0,3,"Oreskovic, Mr. Luka",male,20,0,0,315094,8.6625,,S
|
||||||
|
727,1,2,"Renouf, Mrs. Peter Henry (Lillian Jefferys)",female,30,3,0,31027,21,,S
|
||||||
|
728,1,3,"Mannion, Miss. Margareth",female,,0,0,36866,7.7375,,Q
|
||||||
|
729,0,2,"Bryhl, Mr. Kurt Arnold Gottfrid",male,25,1,0,236853,26,,S
|
||||||
|
730,0,3,"Ilmakangas, Miss. Pieta Sofia",female,25,1,0,STON/O2. 3101271,7.925,,S
|
||||||
|
731,1,1,"Allen, Miss. Elisabeth Walton",female,29,0,0,24160,211.3375,B5,S
|
||||||
|
732,0,3,"Hassan, Mr. Houssein G N",male,11,0,0,2699,18.7875,,C
|
||||||
|
733,0,2,"Knight, Mr. Robert J",male,,0,0,239855,0,,S
|
||||||
|
734,0,2,"Berriman, Mr. William John",male,23,0,0,28425,13,,S
|
||||||
|
735,0,2,"Troupiansky, Mr. Moses Aaron",male,23,0,0,233639,13,,S
|
||||||
|
736,0,3,"Williams, Mr. Leslie",male,28.5,0,0,54636,16.1,,S
|
||||||
|
737,0,3,"Ford, Mrs. Edward (Margaret Ann Watson)",female,48,1,3,W./C. 6608,34.375,,S
|
||||||
|
738,1,1,"Lesurer, Mr. Gustave J",male,35,0,0,PC 17755,512.3292,B101,C
|
||||||
|
739,0,3,"Ivanoff, Mr. Kanio",male,,0,0,349201,7.8958,,S
|
||||||
|
740,0,3,"Nankoff, Mr. Minko",male,,0,0,349218,7.8958,,S
|
||||||
|
741,1,1,"Hawksford, Mr. Walter James",male,,0,0,16988,30,D45,S
|
||||||
|
742,0,1,"Cavendish, Mr. Tyrell William",male,36,1,0,19877,78.85,C46,S
|
||||||
|
743,1,1,"Ryerson, Miss. Susan Parker ""Suzette""",female,21,2,2,PC 17608,262.375,B57 B59 B63 B66,C
|
||||||
|
744,0,3,"McNamee, Mr. Neal",male,24,1,0,376566,16.1,,S
|
||||||
|
745,1,3,"Stranden, Mr. Juho",male,31,0,0,STON/O 2. 3101288,7.925,,S
|
||||||
|
746,0,1,"Crosby, Capt. Edward Gifford",male,70,1,1,WE/P 5735,71,B22,S
|
||||||
|
747,0,3,"Abbott, Mr. Rossmore Edward",male,16,1,1,C.A. 2673,20.25,,S
|
||||||
|
748,1,2,"Sinkkonen, Miss. Anna",female,30,0,0,250648,13,,S
|
||||||
|
749,0,1,"Marvin, Mr. Daniel Warner",male,19,1,0,113773,53.1,D30,S
|
||||||
|
750,0,3,"Connaghton, Mr. Michael",male,31,0,0,335097,7.75,,Q
|
||||||
|
751,1,2,"Wells, Miss. Joan",female,4,1,1,29103,23,,S
|
||||||
|
752,1,3,"Moor, Master. Meier",male,6,0,1,392096,12.475,E121,S
|
||||||
|
753,0,3,"Vande Velde, Mr. Johannes Joseph",male,33,0,0,345780,9.5,,S
|
||||||
|
754,0,3,"Jonkoff, Mr. Lalio",male,23,0,0,349204,7.8958,,S
|
||||||
|
755,1,2,"Herman, Mrs. Samuel (Jane Laver)",female,48,1,2,220845,65,,S
|
||||||
|
756,1,2,"Hamalainen, Master. Viljo",male,0.67,1,1,250649,14.5,,S
|
||||||
|
757,0,3,"Carlsson, Mr. August Sigfrid",male,28,0,0,350042,7.7958,,S
|
||||||
|
758,0,2,"Bailey, Mr. Percy Andrew",male,18,0,0,29108,11.5,,S
|
||||||
|
759,0,3,"Theobald, Mr. Thomas Leonard",male,34,0,0,363294,8.05,,S
|
||||||
|
760,1,1,"Rothes, the Countess. of (Lucy Noel Martha Dyer-Edwards)",female,33,0,0,110152,86.5,B77,S
|
||||||
|
761,0,3,"Garfirth, Mr. John",male,,0,0,358585,14.5,,S
|
||||||
|
762,0,3,"Nirva, Mr. Iisakki Antino Aijo",male,41,0,0,SOTON/O2 3101272,7.125,,S
|
||||||
|
763,1,3,"Barah, Mr. Hanna Assi",male,20,0,0,2663,7.2292,,C
|
||||||
|
764,1,1,"Carter, Mrs. William Ernest (Lucile Polk)",female,36,1,2,113760,120,B96 B98,S
|
||||||
|
765,0,3,"Eklund, Mr. Hans Linus",male,16,0,0,347074,7.775,,S
|
||||||
|
766,1,1,"Hogeboom, Mrs. John C (Anna Andrews)",female,51,1,0,13502,77.9583,D11,S
|
||||||
|
767,0,1,"Brewe, Dr. Arthur Jackson",male,,0,0,112379,39.6,,C
|
||||||
|
768,0,3,"Mangan, Miss. Mary",female,30.5,0,0,364850,7.75,,Q
|
||||||
|
769,0,3,"Moran, Mr. Daniel J",male,,1,0,371110,24.15,,Q
|
||||||
|
770,0,3,"Gronnestad, Mr. Daniel Danielsen",male,32,0,0,8471,8.3625,,S
|
||||||
|
771,0,3,"Lievens, Mr. Rene Aime",male,24,0,0,345781,9.5,,S
|
||||||
|
772,0,3,"Jensen, Mr. Niels Peder",male,48,0,0,350047,7.8542,,S
|
||||||
|
773,0,2,"Mack, Mrs. (Mary)",female,57,0,0,S.O./P.P. 3,10.5,E77,S
|
||||||
|
774,0,3,"Elias, Mr. Dibo",male,,0,0,2674,7.225,,C
|
||||||
|
775,1,2,"Hocking, Mrs. Elizabeth (Eliza Needs)",female,54,1,3,29105,23,,S
|
||||||
|
776,0,3,"Myhrman, Mr. Pehr Fabian Oliver Malkolm",male,18,0,0,347078,7.75,,S
|
||||||
|
777,0,3,"Tobin, Mr. Roger",male,,0,0,383121,7.75,F38,Q
|
||||||
|
778,1,3,"Emanuel, Miss. Virginia Ethel",female,5,0,0,364516,12.475,,S
|
||||||
|
779,0,3,"Kilgannon, Mr. Thomas J",male,,0,0,36865,7.7375,,Q
|
||||||
|
780,1,1,"Robert, Mrs. Edward Scott (Elisabeth Walton McMillan)",female,43,0,1,24160,211.3375,B3,S
|
||||||
|
781,1,3,"Ayoub, Miss. Banoura",female,13,0,0,2687,7.2292,,C
|
||||||
|
782,1,1,"Dick, Mrs. Albert Adrian (Vera Gillespie)",female,17,1,0,17474,57,B20,S
|
||||||
|
783,0,1,"Long, Mr. Milton Clyde",male,29,0,0,113501,30,D6,S
|
||||||
|
784,0,3,"Johnston, Mr. Andrew G",male,,1,2,W./C. 6607,23.45,,S
|
||||||
|
785,0,3,"Ali, Mr. William",male,25,0,0,SOTON/O.Q. 3101312,7.05,,S
|
||||||
|
786,0,3,"Harmer, Mr. Abraham (David Lishin)",male,25,0,0,374887,7.25,,S
|
||||||
|
787,1,3,"Sjoblom, Miss. Anna Sofia",female,18,0,0,3101265,7.4958,,S
|
||||||
|
788,0,3,"Rice, Master. George Hugh",male,8,4,1,382652,29.125,,Q
|
||||||
|
789,1,3,"Dean, Master. Bertram Vere",male,1,1,2,C.A. 2315,20.575,,S
|
||||||
|
790,0,1,"Guggenheim, Mr. Benjamin",male,46,0,0,PC 17593,79.2,B82 B84,C
|
||||||
|
791,0,3,"Keane, Mr. Andrew ""Andy""",male,,0,0,12460,7.75,,Q
|
||||||
|
792,0,2,"Gaskell, Mr. Alfred",male,16,0,0,239865,26,,S
|
||||||
|
793,0,3,"Sage, Miss. Stella Anna",female,,8,2,CA. 2343,69.55,,S
|
||||||
|
794,0,1,"Hoyt, Mr. William Fisher",male,,0,0,PC 17600,30.6958,,C
|
||||||
|
795,0,3,"Dantcheff, Mr. Ristiu",male,25,0,0,349203,7.8958,,S
|
||||||
|
796,0,2,"Otter, Mr. Richard",male,39,0,0,28213,13,,S
|
||||||
|
797,1,1,"Leader, Dr. Alice (Farnham)",female,49,0,0,17465,25.9292,D17,S
|
||||||
|
798,1,3,"Osman, Mrs. Mara",female,31,0,0,349244,8.6833,,S
|
||||||
|
799,0,3,"Ibrahim Shawah, Mr. Yousseff",male,30,0,0,2685,7.2292,,C
|
||||||
|
800,0,3,"Van Impe, Mrs. Jean Baptiste (Rosalie Paula Govaert)",female,30,1,1,345773,24.15,,S
|
||||||
|
801,0,2,"Ponesell, Mr. Martin",male,34,0,0,250647,13,,S
|
||||||
|
802,1,2,"Collyer, Mrs. Harvey (Charlotte Annie Tate)",female,31,1,1,C.A. 31921,26.25,,S
|
||||||
|
803,1,1,"Carter, Master. William Thornton II",male,11,1,2,113760,120,B96 B98,S
|
||||||
|
804,1,3,"Thomas, Master. Assad Alexander",male,0.42,0,1,2625,8.5167,,C
|
||||||
|
805,1,3,"Hedman, Mr. Oskar Arvid",male,27,0,0,347089,6.975,,S
|
||||||
|
806,0,3,"Johansson, Mr. Karl Johan",male,31,0,0,347063,7.775,,S
|
||||||
|
807,0,1,"Andrews, Mr. Thomas Jr",male,39,0,0,112050,0,A36,S
|
||||||
|
808,0,3,"Pettersson, Miss. Ellen Natalia",female,18,0,0,347087,7.775,,S
|
||||||
|
809,0,2,"Meyer, Mr. August",male,39,0,0,248723,13,,S
|
||||||
|
810,1,1,"Chambers, Mrs. Norman Campbell (Bertha Griggs)",female,33,1,0,113806,53.1,E8,S
|
||||||
|
811,0,3,"Alexander, Mr. William",male,26,0,0,3474,7.8875,,S
|
||||||
|
812,0,3,"Lester, Mr. James",male,39,0,0,A/4 48871,24.15,,S
|
||||||
|
813,0,2,"Slemen, Mr. Richard James",male,35,0,0,28206,10.5,,S
|
||||||
|
814,0,3,"Andersson, Miss. Ebba Iris Alfrida",female,6,4,2,347082,31.275,,S
|
||||||
|
815,0,3,"Tomlin, Mr. Ernest Portage",male,30.5,0,0,364499,8.05,,S
|
||||||
|
816,0,1,"Fry, Mr. Richard",male,,0,0,112058,0,B102,S
|
||||||
|
817,0,3,"Heininen, Miss. Wendla Maria",female,23,0,0,STON/O2. 3101290,7.925,,S
|
||||||
|
818,0,2,"Mallet, Mr. Albert",male,31,1,1,S.C./PARIS 2079,37.0042,,C
|
||||||
|
819,0,3,"Holm, Mr. John Fredrik Alexander",male,43,0,0,C 7075,6.45,,S
|
||||||
|
820,0,3,"Skoog, Master. Karl Thorsten",male,10,3,2,347088,27.9,,S
|
||||||
|
821,1,1,"Hays, Mrs. Charles Melville (Clara Jennings Gregg)",female,52,1,1,12749,93.5,B69,S
|
||||||
|
822,1,3,"Lulic, Mr. Nikola",male,27,0,0,315098,8.6625,,S
|
||||||
|
823,0,1,"Reuchlin, Jonkheer. John George",male,38,0,0,19972,0,,S
|
||||||
|
824,1,3,"Moor, Mrs. (Beila)",female,27,0,1,392096,12.475,E121,S
|
||||||
|
825,0,3,"Panula, Master. Urho Abraham",male,2,4,1,3101295,39.6875,,S
|
||||||
|
826,0,3,"Flynn, Mr. John",male,,0,0,368323,6.95,,Q
|
||||||
|
827,0,3,"Lam, Mr. Len",male,,0,0,1601,56.4958,,S
|
||||||
|
828,1,2,"Mallet, Master. Andre",male,1,0,2,S.C./PARIS 2079,37.0042,,C
|
||||||
|
829,1,3,"McCormack, Mr. Thomas Joseph",male,,0,0,367228,7.75,,Q
|
||||||
|
830,1,1,"Stone, Mrs. George Nelson (Martha Evelyn)",female,62,0,0,113572,80,B28,
|
||||||
|
831,1,3,"Yasbeck, Mrs. Antoni (Selini Alexander)",female,15,1,0,2659,14.4542,,C
|
||||||
|
832,1,2,"Richards, Master. George Sibley",male,0.83,1,1,29106,18.75,,S
|
||||||
|
833,0,3,"Saad, Mr. Amin",male,,0,0,2671,7.2292,,C
|
||||||
|
834,0,3,"Augustsson, Mr. Albert",male,23,0,0,347468,7.8542,,S
|
||||||
|
835,0,3,"Allum, Mr. Owen George",male,18,0,0,2223,8.3,,S
|
||||||
|
836,1,1,"Compton, Miss. Sara Rebecca",female,39,1,1,PC 17756,83.1583,E49,C
|
||||||
|
837,0,3,"Pasic, Mr. Jakob",male,21,0,0,315097,8.6625,,S
|
||||||
|
838,0,3,"Sirota, Mr. Maurice",male,,0,0,392092,8.05,,S
|
||||||
|
839,1,3,"Chip, Mr. Chang",male,32,0,0,1601,56.4958,,S
|
||||||
|
840,1,1,"Marechal, Mr. Pierre",male,,0,0,11774,29.7,C47,C
|
||||||
|
841,0,3,"Alhomaki, Mr. Ilmari Rudolf",male,20,0,0,SOTON/O2 3101287,7.925,,S
|
||||||
|
842,0,2,"Mudd, Mr. Thomas Charles",male,16,0,0,S.O./P.P. 3,10.5,,S
|
||||||
|
843,1,1,"Serepeca, Miss. Augusta",female,30,0,0,113798,31,,C
|
||||||
|
844,0,3,"Lemberopolous, Mr. Peter L",male,34.5,0,0,2683,6.4375,,C
|
||||||
|
845,0,3,"Culumovic, Mr. Jeso",male,17,0,0,315090,8.6625,,S
|
||||||
|
846,0,3,"Abbing, Mr. Anthony",male,42,0,0,C.A. 5547,7.55,,S
|
||||||
|
847,0,3,"Sage, Mr. Douglas Bullen",male,,8,2,CA. 2343,69.55,,S
|
||||||
|
848,0,3,"Markoff, Mr. Marin",male,35,0,0,349213,7.8958,,C
|
||||||
|
849,0,2,"Harper, Rev. John",male,28,0,1,248727,33,,S
|
||||||
|
850,1,1,"Goldenberg, Mrs. Samuel L (Edwiga Grabowska)",female,,1,0,17453,89.1042,C92,C
|
||||||
|
851,0,3,"Andersson, Master. Sigvard Harald Elias",male,4,4,2,347082,31.275,,S
|
||||||
|
852,0,3,"Svensson, Mr. Johan",male,74,0,0,347060,7.775,,S
|
||||||
|
853,0,3,"Boulos, Miss. Nourelain",female,9,1,1,2678,15.2458,,C
|
||||||
|
854,1,1,"Lines, Miss. Mary Conover",female,16,0,1,PC 17592,39.4,D28,S
|
||||||
|
855,0,2,"Carter, Mrs. Ernest Courtenay (Lilian Hughes)",female,44,1,0,244252,26,,S
|
||||||
|
856,1,3,"Aks, Mrs. Sam (Leah Rosen)",female,18,0,1,392091,9.35,,S
|
||||||
|
857,1,1,"Wick, Mrs. George Dennick (Mary Hitchcock)",female,45,1,1,36928,164.8667,,S
|
||||||
|
858,1,1,"Daly, Mr. Peter Denis ",male,51,0,0,113055,26.55,E17,S
|
||||||
|
859,1,3,"Baclini, Mrs. Solomon (Latifa Qurban)",female,24,0,3,2666,19.2583,,C
|
||||||
|
860,0,3,"Razi, Mr. Raihed",male,,0,0,2629,7.2292,,C
|
||||||
|
861,0,3,"Hansen, Mr. Claus Peter",male,41,2,0,350026,14.1083,,S
|
||||||
|
862,0,2,"Giles, Mr. Frederick Edward",male,21,1,0,28134,11.5,,S
|
||||||
|
863,1,1,"Swift, Mrs. Frederick Joel (Margaret Welles Barron)",female,48,0,0,17466,25.9292,D17,S
|
||||||
|
864,0,3,"Sage, Miss. Dorothy Edith ""Dolly""",female,,8,2,CA. 2343,69.55,,S
|
||||||
|
865,0,2,"Gill, Mr. John William",male,24,0,0,233866,13,,S
|
||||||
|
866,1,2,"Bystrom, Mrs. (Karolina)",female,42,0,0,236852,13,,S
|
||||||
|
867,1,2,"Duran y More, Miss. Asuncion",female,27,1,0,SC/PARIS 2149,13.8583,,C
|
||||||
|
868,0,1,"Roebling, Mr. Washington Augustus II",male,31,0,0,PC 17590,50.4958,A24,S
|
||||||
|
869,0,3,"van Melkebeke, Mr. Philemon",male,,0,0,345777,9.5,,S
|
||||||
|
870,1,3,"Johnson, Master. Harold Theodor",male,4,1,1,347742,11.1333,,S
|
||||||
|
871,0,3,"Balkic, Mr. Cerin",male,26,0,0,349248,7.8958,,S
|
||||||
|
872,1,1,"Beckwith, Mrs. Richard Leonard (Sallie Monypeny)",female,47,1,1,11751,52.5542,D35,S
|
||||||
|
873,0,1,"Carlsson, Mr. Frans Olof",male,33,0,0,695,5,B51 B53 B55,S
|
||||||
|
874,0,3,"Vander Cruyssen, Mr. Victor",male,47,0,0,345765,9,,S
|
||||||
|
875,1,2,"Abelson, Mrs. Samuel (Hannah Wizosky)",female,28,1,0,P/PP 3381,24,,C
|
||||||
|
876,1,3,"Najib, Miss. Adele Kiamie ""Jane""",female,15,0,0,2667,7.225,,C
|
||||||
|
877,0,3,"Gustafsson, Mr. Alfred Ossian",male,20,0,0,7534,9.8458,,S
|
||||||
|
878,0,3,"Petroff, Mr. Nedelio",male,19,0,0,349212,7.8958,,S
|
||||||
|
879,0,3,"Laleff, Mr. Kristo",male,,0,0,349217,7.8958,,S
|
||||||
|
880,1,1,"Potter, Mrs. Thomas Jr (Lily Alexenia Wilson)",female,56,0,1,11767,83.1583,C50,C
|
||||||
|
881,1,2,"Shelley, Mrs. William (Imanita Parrish Hall)",female,25,0,1,230433,26,,S
|
||||||
|
882,0,3,"Markun, Mr. Johann",male,33,0,0,349257,7.8958,,S
|
||||||
|
883,0,3,"Dahlberg, Miss. Gerda Ulrika",female,22,0,0,7552,10.5167,,S
|
||||||
|
884,0,2,"Banfield, Mr. Frederick James",male,28,0,0,C.A./SOTON 34068,10.5,,S
|
||||||
|
885,0,3,"Sutehall, Mr. Henry Jr",male,25,0,0,SOTON/OQ 392076,7.05,,S
|
||||||
|
886,0,3,"Rice, Mrs. William (Margaret Norton)",female,39,0,5,382652,29.125,,Q
|
||||||
|
887,0,2,"Montvila, Rev. Juozas",male,27,0,0,211536,13,,S
|
||||||
|
888,1,1,"Graham, Miss. Margaret Edith",female,19,0,0,112053,30,B42,S
|
||||||
|
889,0,3,"Johnston, Miss. Catherine Helen ""Carrie""",female,,1,2,W./C. 6607,23.45,,S
|
||||||
|
890,1,1,"Behr, Mr. Karl Howell",male,26,0,0,111369,30,C148,C
|
||||||
|
891,0,3,"Dooley, Mr. Patrick",male,32,0,0,370376,7.75,,Q
|
|
417
docs/modules/agents/toolkits/examples/vectorstore.ipynb
Normal file
@@ -0,0 +1,417 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "18ada398-dce6-4049-9b56-fc0ede63da9c",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Vectorstore Agent\n",
|
||||||
|
"\n",
|
||||||
|
"This notebook showcases an agent designed to retrieve information from one or more vectorstores, either with or without sources."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "eecb683b-3a46-4b9d-81a3-7caefbfec1a1",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Create the Vectorstores"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"id": "9bfd0ed8-a5eb-443e-8e92-90be8cabb0a7",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.embeddings.openai import OpenAIEmbeddings\n",
|
||||||
|
"from langchain.vectorstores import Chroma\n",
|
||||||
|
"from langchain.text_splitter import CharacterTextSplitter\n",
|
||||||
|
"from langchain import OpenAI, VectorDBQA\n",
|
||||||
|
"llm = OpenAI(temperature=0)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "345bb078-4ec1-4e3a-827b-cd238c49054d",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"Running Chroma using direct local API.\n",
|
||||||
|
"Using DuckDB in-memory for database. Data will be transient.\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"from langchain.document_loaders import TextLoader\n",
|
||||||
|
"loader = TextLoader('../../../state_of_the_union.txt')\n",
|
||||||
|
"documents = loader.load()\n",
|
||||||
|
"text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)\n",
|
||||||
|
"texts = text_splitter.split_documents(documents)\n",
|
||||||
|
"\n",
|
||||||
|
"embeddings = OpenAIEmbeddings()\n",
|
||||||
|
"state_of_union_store = Chroma.from_documents(texts, embeddings, collection_name=\"state-of-union\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "5f50eb82-e1a5-4252-8306-8ec1b478d9b4",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"Running Chroma using direct local API.\n",
|
||||||
|
"Using DuckDB in-memory for database. Data will be transient.\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"from langchain.document_loaders import WebBaseLoader\n",
|
||||||
|
"loader = WebBaseLoader(\"https://beta.ruff.rs/docs/faq/\")\n",
|
||||||
|
"docs = loader.load()\n",
|
||||||
|
"ruff_texts = text_splitter.split_documents(docs)\n",
|
||||||
|
"ruff_store = Chroma.from_documents(ruff_texts, embeddings, collection_name=\"ruff\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "f4814175-964d-42f1-aa9d-22801ce1e912",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Initialize Toolkit and Agent\n",
|
||||||
|
"\n",
|
||||||
|
"First, we'll create an agent with a single vectorstore."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 4,
|
||||||
|
"id": "5b3b3206",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.agents.agent_toolkits import (\n",
|
||||||
|
" create_vectorstore_agent,\n",
|
||||||
|
" VectorStoreToolkit,\n",
|
||||||
|
" VectorStoreInfo,\n",
|
||||||
|
")\n",
|
||||||
|
"vectorstore_info = VectorStoreInfo(\n",
|
||||||
|
" name=\"state_of_union_address\",\n",
|
||||||
|
" description=\"the most recent state of the Union adress\",\n",
|
||||||
|
" vectorstore=state_of_union_store\n",
|
||||||
|
")\n",
|
||||||
|
"toolkit = VectorStoreToolkit(vectorstore_info=vectorstore_info)\n",
|
||||||
|
"agent_executor = create_vectorstore_agent(\n",
|
||||||
|
" llm=llm,\n",
|
||||||
|
" toolkit=toolkit,\n",
|
||||||
|
" verbose=True\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "8a38ad10",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Examples"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 5,
|
||||||
|
"id": "3f2f455c",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I need to find the answer in the state of the union address\n",
|
||||||
|
"Action: state_of_union_address\n",
|
||||||
|
"Action Input: What did biden say about ketanji brown jackson\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m Biden said that Ketanji Brown Jackson is one of the nation's top legal minds and that she will continue Justice Breyer's legacy of excellence.\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: Biden said that Ketanji Brown Jackson is one of the nation's top legal minds and that she will continue Justice Breyer's legacy of excellence.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"Biden said that Ketanji Brown Jackson is one of the nation's top legal minds and that she will continue Justice Breyer's legacy of excellence.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 5,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"What did biden say about ketanji brown jackson is the state of the union address?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 6,
|
||||||
|
"id": "d61e1e63",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I need to use the state_of_union_address_with_sources tool to answer this question.\n",
|
||||||
|
"Action: state_of_union_address_with_sources\n",
|
||||||
|
"Action Input: What did biden say about ketanji brown jackson\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m{\"answer\": \" Biden said that he nominated Circuit Court of Appeals Judge Ketanji Brown Jackson to the United States Supreme Court, and that she is one of the nation's top legal minds who will continue Justice Breyer's legacy of excellence.\\n\", \"sources\": \"../../state_of_the_union.txt\"}\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: Biden said that he nominated Circuit Court of Appeals Judge Ketanji Brown Jackson to the United States Supreme Court, and that she is one of the nation's top legal minds who will continue Justice Breyer's legacy of excellence. Sources: ../../state_of_the_union.txt\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"Biden said that he nominated Circuit Court of Appeals Judge Ketanji Brown Jackson to the United States Supreme Court, and that she is one of the nation's top legal minds who will continue Justice Breyer's legacy of excellence. Sources: ../../state_of_the_union.txt\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 6,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"What did biden say about ketanji brown jackson is the state of the union address? List the source.\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "7ca07707",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Multiple Vectorstores\n",
|
||||||
|
"We can also easily use this initialize an agent with multiple vectorstores and use the agent to route between them. To do this. This agent is optimized for routing, so it is a different toolkit and initializer."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 7,
|
||||||
|
"id": "c3209fd3",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.agents.agent_toolkits import (\n",
|
||||||
|
" create_vectorstore_router_agent,\n",
|
||||||
|
" VectorStoreRouterToolkit,\n",
|
||||||
|
" VectorStoreInfo,\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 8,
|
||||||
|
"id": "815c4f39-308d-4949-b992-1361036e6e09",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"ruff_vectorstore_info = VectorStoreInfo(\n",
|
||||||
|
" name=\"ruff\",\n",
|
||||||
|
" description=\"Information about the Ruff python linting library\",\n",
|
||||||
|
" vectorstore=ruff_store\n",
|
||||||
|
")\n",
|
||||||
|
"router_toolkit = VectorStoreRouterToolkit(\n",
|
||||||
|
" vectorstores=[vectorstore_info, ruff_vectorstore_info],\n",
|
||||||
|
" llm=llm\n",
|
||||||
|
")\n",
|
||||||
|
"agent_executor = create_vectorstore_agent(\n",
|
||||||
|
" llm=llm,\n",
|
||||||
|
" toolkit=router_toolkit,\n",
|
||||||
|
" verbose=True\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "71680984-edaf-4a63-90f5-94edbd263550",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Examples"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 9,
|
||||||
|
"id": "3cd1bf3e-e3df-4e69-bbe1-71c64b1af947",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I need to use the state_of_union_address tool to answer this question.\n",
|
||||||
|
"Action: state_of_union_address\n",
|
||||||
|
"Action Input: What did biden say about ketanji brown jackson\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m Biden said that Ketanji Brown Jackson is one of the nation's top legal minds and that she will continue Justice Breyer's legacy of excellence.\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: Biden said that Ketanji Brown Jackson is one of the nation's top legal minds and that she will continue Justice Breyer's legacy of excellence.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"Biden said that Ketanji Brown Jackson is one of the nation's top legal minds and that she will continue Justice Breyer's legacy of excellence.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 9,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"What did biden say about ketanji brown jackson is the state of the union address?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 10,
|
||||||
|
"id": "c5998b8d",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I need to find out what tool ruff uses to run over Jupyter Notebooks\n",
|
||||||
|
"Action: ruff\n",
|
||||||
|
"Action Input: What tool does ruff use to run over Jupyter Notebooks?\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m Ruff is integrated into nbQA, a tool for running linters and code formatters over Jupyter Notebooks. After installing ruff and nbqa, you can run Ruff over a notebook like so: > nbqa ruff Untitled.ipynb\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: Ruff is integrated into nbQA, a tool for running linters and code formatters over Jupyter Notebooks. After installing ruff and nbqa, you can run Ruff over a notebook like so: > nbqa ruff Untitled.ipynb\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'Ruff is integrated into nbQA, a tool for running linters and code formatters over Jupyter Notebooks. After installing ruff and nbqa, you can run Ruff over a notebook like so: > nbqa ruff Untitled.ipynb'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 10,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"What tool does ruff use to run over Jupyter Notebooks?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 11,
|
||||||
|
"id": "744e9b51-fbd9-4778-b594-ea957d0f3467",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I need to find out what tool ruff uses and if the president mentioned it in the state of the union.\n",
|
||||||
|
"Action: ruff\n",
|
||||||
|
"Action Input: What tool does ruff use to run over Jupyter Notebooks?\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3m Ruff is integrated into nbQA, a tool for running linters and code formatters over Jupyter Notebooks. After installing ruff and nbqa, you can run Ruff over a notebook like so: > nbqa ruff Untitled.ipynb\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I need to find out if the president mentioned nbQA in the state of the union.\n",
|
||||||
|
"Action: state_of_union_address\n",
|
||||||
|
"Action Input: Did the president mention nbQA in the state of the union?\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m No, the president did not mention nbQA in the state of the union.\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Final Answer: No, the president did not mention nbQA in the state of the union.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'No, the president did not mention nbQA in the state of the union.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 11,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent_executor.run(\"What tool does ruff use to run over Jupyter Notebooks? Did the president mention that tool in the state of the union?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"id": "92203aa9-f63a-4ce1-b562-fadf4474ad9d",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
|
38
docs/modules/agents/tools.rst
Normal file
38
docs/modules/agents/tools.rst
Normal file
@ -0,0 +1,38 @@
|
|||||||
|
Tools
|
||||||
|
=============
|
||||||
|
|
||||||
|
.. note::
|
||||||
|
`Conceptual Guide <https://docs.langchain.com/docs/components/agents/tool>`_
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
Tools are ways for an agent to interact with the outside world.
|
||||||
|
|
||||||
|
For an overview of what tools are, how to use them, and a full list of examples, please see the getting started documentation.
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:maxdepth: 1
|
||||||
|
:glob:
|
||||||
|
|
||||||
|
./tools/getting_started.md
|
||||||
|
|
||||||
|
Next, we have some examples of customizing and generically working with tools.
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:maxdepth: 1
|
||||||
|
:glob:
|
||||||
|
|
||||||
|
./tools/custom_tools.ipynb
|
||||||
|
./tools/multi_input_tool.ipynb
|
||||||
|
|
||||||
|
|
||||||
|
In this documentation we cover generic tooling functionality (e.g. how to create your own)
|
||||||
|
as well as examples of tools and how to use them; a minimal custom-tool sketch is included at the end of this page.
|
||||||
|
|
||||||
|
|
||||||
|
.. toctree::
|
||||||
|
:maxdepth: 1
|
||||||
|
:glob:
|
||||||
|
|
||||||
|
./tools/examples/*
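As a pointer for the "create your own" material mentioned above, here is a minimal, hedged sketch (not part of the committed .rst source) of wrapping a plain Python function in a `Tool` and handing it to an agent. Only constructs that appear elsewhere in these docs (`Tool`, `initialize_agent`) are assumed; the `word_count` helper and the prompt are illustrative.

```python
# Illustrative sketch only: wrap a plain Python function in a Tool
# and hand it to a zero-shot agent.
from langchain.llms import OpenAI
from langchain.agents import Tool, initialize_agent

llm = OpenAI(temperature=0)

def word_count(text: str) -> str:
    # Stand-in for a real utility the agent might call.
    return f"{len(text.split())} words"

tools = [
    Tool(
        name="Word Counter",
        func=word_count,
        description="useful for when you need to count the words in a piece of text",
    )
]

agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("How many words are in 'LangChain agents can use custom tools'?")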
|
||||||
|
|
120
docs/modules/agents/tools/examples/chatgpt_plugins.ipynb
Normal file
120
docs/modules/agents/tools/examples/chatgpt_plugins.ipynb
Normal file
@ -0,0 +1,120 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "3f34700b",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# ChatGPT Plugins\n",
|
||||||
|
"\n",
|
||||||
|
"This example shows how to use ChatGPT Plugins within LangChain abstractions.\n",
|
||||||
|
"\n",
|
||||||
|
"Note 1: This currently only works for plugins with no auth.\n",
|
||||||
|
"\n",
|
||||||
|
"Note 2: There are almost certainly other ways to do this, this is just a first pass. If you have better ideas, please open a PR!"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"id": "d41405b5",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.chat_models import ChatOpenAI\n",
|
||||||
|
"from langchain.agents import load_tools, initialize_agent\n",
|
||||||
|
"from langchain.tools import AIPluginTool"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
|
"id": "d9e61df5",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"tool = AIPluginTool.from_plugin_url(\"https://www.klarna.com/.well-known/ai-plugin.json\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "edc0ea0e",
|
||||||
|
"metadata": {
|
||||||
|
"scrolled": false
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mI need to check the Klarna Shopping API to see if it has information on available t shirts.\n",
|
||||||
|
"Action: KlarnaProducts\n",
|
||||||
|
"Action Input: None\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3mUsage Guide: Use the Klarna plugin to get relevant product suggestions for any shopping or researching purpose. The query to be sent should not include stopwords like articles, prepositions and determinants. The api works best when searching for words that are related to products, like their name, brand, model or category. Links will always be returned and should be shown to the user.\n",
|
||||||
|
"\n",
|
||||||
|
"OpenAPI Spec: {'openapi': '3.0.1', 'info': {'version': 'v0', 'title': 'Open AI Klarna product Api'}, 'servers': [{'url': 'https://www.klarna.com/us/shopping'}], 'tags': [{'name': 'open-ai-product-endpoint', 'description': 'Open AI Product Endpoint. Query for products.'}], 'paths': {'/public/openai/v0/products': {'get': {'tags': ['open-ai-product-endpoint'], 'summary': 'API for fetching Klarna product information', 'operationId': 'productsUsingGET', 'parameters': [{'name': 'q', 'in': 'query', 'description': 'query, must be between 2 and 100 characters', 'required': True, 'schema': {'type': 'string'}}, {'name': 'size', 'in': 'query', 'description': 'number of products returned', 'required': False, 'schema': {'type': 'integer'}}, {'name': 'budget', 'in': 'query', 'description': 'maximum price of the matching product in local currency, filters results', 'required': False, 'schema': {'type': 'integer'}}], 'responses': {'200': {'description': 'Products found', 'content': {'application/json': {'schema': {'$ref': '#/components/schemas/ProductResponse'}}}}, '503': {'description': 'one or more services are unavailable'}}, 'deprecated': False}}}, 'components': {'schemas': {'Product': {'type': 'object', 'properties': {'attributes': {'type': 'array', 'items': {'type': 'string'}}, 'name': {'type': 'string'}, 'price': {'type': 'string'}, 'url': {'type': 'string'}}, 'title': 'Product'}, 'ProductResponse': {'type': 'object', 'properties': {'products': {'type': 'array', 'items': {'$ref': '#/components/schemas/Product'}}}, 'title': 'ProductResponse'}}}}\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mI need to use the Klarna Shopping API to search for t shirts.\n",
|
||||||
|
"Action: requests_get\n",
|
||||||
|
"Action Input: https://www.klarna.com/us/shopping/public/openai/v0/products?q=t%20shirts\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m{\"products\":[{\"name\":\"Lacoste Men's Pack of Plain T-Shirts\",\"url\":\"https://www.klarna.com/us/shopping/pl/cl10001/3202043025/Clothing/Lacoste-Men-s-Pack-of-Plain-T-Shirts/?utm_source=openai\",\"price\":\"$26.60\",\"attributes\":[\"Material:Cotton\",\"Target Group:Man\",\"Color:White,Black\"]},{\"name\":\"Hanes Men's Ultimate 6pk. Crewneck T-Shirts\",\"url\":\"https://www.klarna.com/us/shopping/pl/cl10001/3201808270/Clothing/Hanes-Men-s-Ultimate-6pk.-Crewneck-T-Shirts/?utm_source=openai\",\"price\":\"$13.82\",\"attributes\":[\"Material:Cotton\",\"Target Group:Man\",\"Color:White\"]},{\"name\":\"Nike Boy's Jordan Stretch T-shirts\",\"url\":\"https://www.klarna.com/us/shopping/pl/cl359/3201863202/Children-s-Clothing/Nike-Boy-s-Jordan-Stretch-T-shirts/?utm_source=openai\",\"price\":\"$14.99\",\"attributes\":[\"Material:Cotton\",\"Color:White,Green\",\"Model:Boy\",\"Size (Small-Large):S,XL,L,M\"]},{\"name\":\"Polo Classic Fit Cotton V-Neck T-Shirts 3-Pack\",\"url\":\"https://www.klarna.com/us/shopping/pl/cl10001/3203028500/Clothing/Polo-Classic-Fit-Cotton-V-Neck-T-Shirts-3-Pack/?utm_source=openai\",\"price\":\"$29.95\",\"attributes\":[\"Material:Cotton\",\"Target Group:Man\",\"Color:White,Blue,Black\"]},{\"name\":\"adidas Comfort T-shirts Men's 3-pack\",\"url\":\"https://www.klarna.com/us/shopping/pl/cl10001/3202640533/Clothing/adidas-Comfort-T-shirts-Men-s-3-pack/?utm_source=openai\",\"price\":\"$14.99\",\"attributes\":[\"Material:Cotton\",\"Target Group:Man\",\"Color:White,Black\",\"Neckline:Round\"]}]}\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mThe available t shirts in Klarna are Lacoste Men's Pack of Plain T-Shirts, Hanes Men's Ultimate 6pk. Crewneck T-Shirts, Nike Boy's Jordan Stretch T-shirts, Polo Classic Fit Cotton V-Neck T-Shirts 3-Pack, and adidas Comfort T-shirts Men's 3-pack.\n",
|
||||||
|
"Final Answer: The available t shirts in Klarna are Lacoste Men's Pack of Plain T-Shirts, Hanes Men's Ultimate 6pk. Crewneck T-Shirts, Nike Boy's Jordan Stretch T-shirts, Polo Classic Fit Cotton V-Neck T-Shirts 3-Pack, and adidas Comfort T-shirts Men's 3-pack.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"The available t shirts in Klarna are Lacoste Men's Pack of Plain T-Shirts, Hanes Men's Ultimate 6pk. Crewneck T-Shirts, Nike Boy's Jordan Stretch T-shirts, Polo Classic Fit Cotton V-Neck T-Shirts 3-Pack, and adidas Comfort T-shirts Men's 3-pack.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 3,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"llm = ChatOpenAI(temperature=0,)\n",
|
||||||
|
"tools = load_tools([\"requests\"] )\n",
|
||||||
|
"tools += [tool]\n",
|
||||||
|
"\n",
|
||||||
|
"agent_chain = initialize_agent(tools, llm, agent=\"zero-shot-react-description\", verbose=True)\n",
|
||||||
|
"agent_chain.run(\"what t shirts are available in klarna?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"id": "e49318a4",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
|
@ -121,7 +121,8 @@
|
|||||||
"tools = [\n",
|
"tools = [\n",
|
||||||
" Tool(\n",
|
" Tool(\n",
|
||||||
" name=\"Intermediate Answer\",\n",
|
" name=\"Intermediate Answer\",\n",
|
||||||
" func=search.run\n",
|
" func=search.run,\n",
|
||||||
|
" description=\"useful for when you need to ask with search\"\n",
|
||||||
" )\n",
|
" )\n",
|
||||||
"]\n",
|
"]\n",
|
||||||
"\n",
|
"\n",
|
132
docs/modules/agents/tools/examples/human_tools.ipynb
Normal file
132
docs/modules/agents/tools/examples/human_tools.ipynb
Normal file
@ -0,0 +1,132 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Human as a tool\n",
|
||||||
|
"\n",
|
||||||
|
"Human are AGI so they can certainly be used as a tool to help out AI agent \n",
|
||||||
|
"when it is confused."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import sys\n",
|
||||||
|
"from langchain.chat_models import ChatOpenAI\n",
|
||||||
|
"from langchain.llms import OpenAI\n",
|
||||||
|
"from langchain.agents import load_tools, initialize_agent\n",
|
||||||
|
"\n",
|
||||||
|
"llm = ChatOpenAI(temperature=0.0)\n",
|
||||||
|
"math_llm = OpenAI(temperature=0.0)\n",
|
||||||
|
"tools = load_tools(\n",
|
||||||
|
" [\"human\", \"llm-math\"], \n",
|
||||||
|
" llm=math_llm,\n",
|
||||||
|
")\n",
|
||||||
|
"\n",
|
||||||
|
"agent_chain = initialize_agent(\n",
|
||||||
|
" tools,\n",
|
||||||
|
" llm,\n",
|
||||||
|
" agent=\"zero-shot-react-description\",\n",
|
||||||
|
" verbose=True,\n",
|
||||||
|
")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"In the above code you can see the tool takes input directly from command line.\n",
|
||||||
|
"You can customize `prompt_func` and `input_func` according to your need."
|
||||||
|
]
|
||||||
|
},
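A hedged sketch of that customization, assuming the `HumanInputRun` tool in `langchain.tools.human.tool` accepts `prompt_func` and `input_func` fields; the multi-line input helper below is hypothetical.

```python
# Hedged sketch: swap in custom prompt/input functions for the human tool.
from langchain.tools.human.tool import HumanInputRun

def custom_prompt(query: str) -> None:
    # Show the agent's question with a little framing.
    print("\n[The agent is asking for help]")
    print(query)
    print("(Type your answer; finish with an empty line.)")

def multiline_input() -> str:
    # Collect lines until the user enters a blank line.
    lines = []
    while True:
        line = input()
        if not line:
            break
        lines.append(line)
    return "\n".join(lines)

human_tool = HumanInputRun(prompt_func=custom_prompt, input_func=multiline_input)
# The customized tool can then be passed to initialize_agent alongside the others.
```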
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3mI don't know Eric Zhu, so I should ask a human for guidance.\n",
|
||||||
|
"Action: Human\n",
|
||||||
|
"Action Input: \"Do you know when Eric Zhu's birthday is?\"\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"Do you know when Eric Zhu's birthday is?\n",
|
||||||
|
"last week\n",
|
||||||
|
"\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mlast week\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mThat's not very helpful. I should ask for more information.\n",
|
||||||
|
"Action: Human\n",
|
||||||
|
"Action Input: \"Do you know the specific date of Eric Zhu's birthday?\"\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"Do you know the specific date of Eric Zhu's birthday?\n",
|
||||||
|
"august 1st\n",
|
||||||
|
"\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3maugust 1st\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mNow that I have the date, I can check if it's a leap year or not.\n",
|
||||||
|
"Action: Calculator\n",
|
||||||
|
"Action Input: \"Is 2021 a leap year?\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[33;1m\u001b[1;3mAnswer: False\n",
|
||||||
|
"\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3mI have all the information I need to answer the original question.\n",
|
||||||
|
"Final Answer: Eric Zhu's birthday is on August 1st and it is not a leap year in 2021.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"\"Eric Zhu's birthday is on August 1st and it is not a leap year in 2021.\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 3,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"\n",
|
||||||
|
"agent_chain.run(\"What is Eric Zhu's birthday?\")\n",
|
||||||
|
"# Answer with \"last week\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 2
|
||||||
|
}
|
@ -11,10 +11,10 @@
|
|||||||
"\n",
|
"\n",
|
||||||
"From https://github.com/SidU/teams-langchain-js/wiki/Connecting-IFTTT-Services.\n",
|
"From https://github.com/SidU/teams-langchain-js/wiki/Connecting-IFTTT-Services.\n",
|
||||||
"\n",
|
"\n",
|
||||||
"# Creating a webhook\n",
|
"## Creating a webhook\n",
|
||||||
"- Go to https://ifttt.com/create\n",
|
"- Go to https://ifttt.com/create\n",
|
||||||
"\n",
|
"\n",
|
||||||
"# Configuring the \"If This\"\n",
|
"## Configuring the \"If This\"\n",
|
||||||
"- Click on the \"If This\" button in the IFTTT interface.\n",
|
"- Click on the \"If This\" button in the IFTTT interface.\n",
|
||||||
"- Search for \"Webhooks\" in the search bar.\n",
|
"- Search for \"Webhooks\" in the search bar.\n",
|
||||||
"- Choose the first option for \"Receive a web request with a JSON payload.\"\n",
|
"- Choose the first option for \"Receive a web request with a JSON payload.\"\n",
|
||||||
@ -24,7 +24,7 @@
|
|||||||
"Event Name.\n",
|
"Event Name.\n",
|
||||||
"- Click the \"Create Trigger\" button to save your settings and create your webhook.\n",
|
"- Click the \"Create Trigger\" button to save your settings and create your webhook.\n",
|
||||||
"\n",
|
"\n",
|
||||||
"# Configuring the \"Then That\"\n",
|
"## Configuring the \"Then That\"\n",
|
||||||
"- Tap on the \"Then That\" button in the IFTTT interface.\n",
|
"- Tap on the \"Then That\" button in the IFTTT interface.\n",
|
||||||
"- Search for the service you want to connect, such as Spotify.\n",
|
"- Search for the service you want to connect, such as Spotify.\n",
|
||||||
"- Choose an action from the service, such as \"Add track to a playlist\".\n",
|
"- Choose an action from the service, such as \"Add track to a playlist\".\n",
|
||||||
@ -38,7 +38,7 @@
|
|||||||
"- Congratulations! You have successfully connected the Webhook to the desired\n",
|
"- Congratulations! You have successfully connected the Webhook to the desired\n",
|
||||||
"service, and you're ready to start receiving data and triggering actions 🎉\n",
|
"service, and you're ready to start receiving data and triggering actions 🎉\n",
|
||||||
"\n",
|
"\n",
|
||||||
"# Finishing up\n",
|
"## Finishing up\n",
|
||||||
"- To get your webhook URL go to https://ifttt.com/maker_webhooks/settings\n",
|
"- To get your webhook URL go to https://ifttt.com/maker_webhooks/settings\n",
|
||||||
"- Copy the IFTTT key value from there. The URL is of the form\n",
|
"- Copy the IFTTT key value from there. The URL is of the form\n",
|
||||||
"https://maker.ifttt.com/use/YOUR_IFTTT_KEY. Grab the YOUR_IFTTT_KEY value.\n"
|
"https://maker.ifttt.com/use/YOUR_IFTTT_KEY. Grab the YOUR_IFTTT_KEY value.\n"
|
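Once the webhook is configured, a hedged sketch of driving it from LangChain might look like the following. It assumes the `IFTTTWebhook` tool (`langchain.tools.ifttt`) takes `name`, `description`, and `url`; the `IFTTTKey` environment variable name and the `spotify` event name are illustrative.

```python
# Illustrative sketch: expose the IFTTT webhook configured above as a tool.
import os
from langchain.tools.ifttt import IFTTTWebhook

key = os.environ["IFTTTKey"]  # the YOUR_IFTTT_KEY value grabbed in the last step
url = f"https://maker.ifttt.com/trigger/spotify/json/with/key/{key}"

tool = IFTTTWebhook(
    name="Spotify",
    description="Add a song to a Spotify playlist",
    url=url,
)
tool.run("taylor swift")  # the input is passed along to the webhook
```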
128
docs/modules/agents/tools/examples/openweathermap.ipynb
Normal file
128
docs/modules/agents/tools/examples/openweathermap.ipynb
Normal file
@ -0,0 +1,128 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"attachments": {},
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "245a954a",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# OpenWeatherMap API\n",
|
||||||
|
"\n",
|
||||||
|
"This notebook goes over how to use the OpenWeatherMap component to fetch weather information.\n",
|
||||||
|
"\n",
|
||||||
|
"First, you need to sign up for an OpenWeatherMap API key:\n",
|
||||||
|
"\n",
|
||||||
|
"1. Go to OpenWeatherMap and sign up for an API key [here](https://openweathermap.org/api/)\n",
|
||||||
|
"2. pip install pyowm\n",
|
||||||
|
"\n",
|
||||||
|
"Then we will need to set some environment variables:\n",
|
||||||
|
"1. Save your API KEY into OPENWEATHERMAP_API_KEY env variable"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"id": "961b3689",
|
||||||
|
"metadata": {
|
||||||
|
"vscode": {
|
||||||
|
"languageId": "shellscript"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"pip install pyowm"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 35,
|
||||||
|
"id": "34bb5968",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import os\n",
|
||||||
|
"os.environ[\"OPENWEATHERMAP_API_KEY\"] = \"\""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 36,
|
||||||
|
"id": "ac4910f8",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.utilities import OpenWeatherMapAPIWrapper"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 37,
|
||||||
|
"id": "84b8f773",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"weather = OpenWeatherMapAPIWrapper()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 38,
|
||||||
|
"id": "9651f324-e74a-4f08-a28a-89db029f66f8",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"weather_data = weather.run(\"London,GB\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 39,
|
||||||
|
"id": "028f4cba",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"In London,GB, the current weather is as follows:\n",
|
||||||
|
"Detailed status: overcast clouds\n",
|
||||||
|
"Wind speed: 4.63 m/s, direction: 150°\n",
|
||||||
|
"Humidity: 67%\n",
|
||||||
|
"Temperature: \n",
|
||||||
|
" - Current: 5.35°C\n",
|
||||||
|
" - High: 6.26°C\n",
|
||||||
|
" - Low: 3.49°C\n",
|
||||||
|
" - Feels like: 1.95°C\n",
|
||||||
|
"Rain: {}\n",
|
||||||
|
"Heat index: None\n",
|
||||||
|
"Cloud cover: 100%\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"print(weather_data)"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.11.2"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
|
84
docs/modules/agents/tools/examples/requests.ipynb
Normal file
84
docs/modules/agents/tools/examples/requests.ipynb
Normal file
File diff suppressed because one or more lines are too long
@ -12,7 +12,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 1,
|
"execution_count": 2,
|
||||||
"id": "e6860c2d",
|
"id": "e6860c2d",
|
||||||
"metadata": {
|
"metadata": {
|
||||||
"pycharm": {
|
"pycharm": {
|
||||||
@ -28,7 +28,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 2,
|
"execution_count": 3,
|
||||||
"id": "dadbcfcd",
|
"id": "dadbcfcd",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -238,6 +238,92 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"agent.run(\"What is the weather in Pomfret?\")"
|
"agent.run(\"What is the weather in Pomfret?\")"
|
||||||
]
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "eabad3af",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## SearxNG Meta Search Engine\n",
|
||||||
|
"\n",
|
||||||
|
"Here we will be using a self hosted SearxNG meta search engine."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 4,
|
||||||
|
"id": "b196c704",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"tools = load_tools([\"searx-search\"], searx_host=\"http://localhost:8888\", llm=llm)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 6,
|
||||||
|
"id": "9023eeaa",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"agent = initialize_agent(tools, llm, agent=\"zero-shot-react-description\", verbose=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 8,
|
||||||
|
"id": "3aad92c1",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I should look up the current weather\n",
|
||||||
|
"Action: SearX Search\n",
|
||||||
|
"Action Input: \"weather in Pomfret\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3mMainly cloudy with snow showers around in the morning. High around 40F. Winds NNW at 5 to 10 mph. Chance of snow 40%. Snow accumulations less than one inch.\n",
|
||||||
|
"\n",
|
||||||
|
"10 Day Weather - Pomfret, MD As of 1:37 pm EST Today 49°/ 41° 52% Mon 27 | Day 49° 52% SE 14 mph Cloudy with occasional rain showers. High 49F. Winds SE at 10 to 20 mph. Chance of rain 50%....\n",
|
||||||
|
"\n",
|
||||||
|
"10 Day Weather - Pomfret, VT As of 3:51 am EST Special Weather Statement Today 39°/ 32° 37% Wed 01 | Day 39° 37% NE 4 mph Cloudy with snow showers developing for the afternoon. High 39F....\n",
|
||||||
|
"\n",
|
||||||
|
"Pomfret, CT ; Current Weather. 1:06 AM. 35°F · RealFeel® 32° ; TODAY'S WEATHER FORECAST. 3/3. 44°Hi. RealFeel® 50° ; TONIGHT'S WEATHER FORECAST. 3/3. 32°Lo.\n",
|
||||||
|
"\n",
|
||||||
|
"Pomfret, MD Forecast Today Hourly Daily Morning 41° 1% Afternoon 43° 0% Evening 35° 3% Overnight 34° 2% Don't Miss Finally, Here’s Why We Get More Colds and Flu When It’s Cold Coast-To-Coast...\n",
|
||||||
|
"\n",
|
||||||
|
"Pomfret, MD Weather Forecast | AccuWeather Current Weather 5:35 PM 35° F RealFeel® 36° RealFeel Shade™ 36° Air Quality Excellent Wind E 3 mph Wind Gusts 5 mph Cloudy More Details WinterCast...\n",
|
||||||
|
"\n",
|
||||||
|
"Pomfret, VT Weather Forecast | AccuWeather Current Weather 11:21 AM 23° F RealFeel® 27° RealFeel Shade™ 25° Air Quality Fair Wind ESE 3 mph Wind Gusts 7 mph Cloudy More Details WinterCast...\n",
|
||||||
|
"\n",
|
||||||
|
"Pomfret Center, CT Weather Forecast | AccuWeather Daily Current Weather 6:50 PM 39° F RealFeel® 36° Air Quality Fair Wind NW 6 mph Wind Gusts 16 mph Mostly clear More Details WinterCast...\n",
|
||||||
|
"\n",
|
||||||
|
"12:00 pm · Feels Like36° · WindN 5 mph · Humidity43% · UV Index3 of 10 · Cloud Cover65% · Rain Amount0 in ...\n",
|
||||||
|
"\n",
|
||||||
|
"Pomfret Center, CT Weather Conditions | Weather Underground star Popular Cities San Francisco, CA 49 °F Clear Manhattan, NY 37 °F Fair Schiller Park, IL (60176) warning39 °F Mostly Cloudy...\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer\n",
|
||||||
|
"Final Answer: The current weather in Pomfret is mainly cloudy with snow showers around in the morning. The temperature is around 40F with winds NNW at 5 to 10 mph. Chance of snow is 40%.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'The current weather in Pomfret is mainly cloudy with snow showers around in the morning. The temperature is around 40F with winds NNW at 5 to 10 mph. Chance of snow is 40%.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 8,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent.run(\"What is the weather in Pomfret\")"
|
||||||
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
"metadata": {
|
"metadata": {
|
||||||
@ -256,7 +342,7 @@
|
|||||||
"name": "python",
|
"name": "python",
|
||||||
"nbconvert_exporter": "python",
|
"nbconvert_exporter": "python",
|
||||||
"pygments_lexer": "ipython3",
|
"pygments_lexer": "ipython3",
|
||||||
"version": "3.9.1"
|
"version": "3.9.11"
|
||||||
},
|
},
|
||||||
"vscode": {
|
"vscode": {
|
||||||
"interpreter": {
|
"interpreter": {
|
@ -73,7 +73,7 @@
|
|||||||
"jukit_cell_id": "OHyurqUPbS"
|
"jukit_cell_id": "OHyurqUPbS"
|
||||||
},
|
},
|
||||||
"source": [
|
"source": [
|
||||||
"# Custom Parameters\n",
|
"## Custom Parameters\n",
|
||||||
"\n",
|
"\n",
|
||||||
"SearxNG supports up to [139 search engines](https://docs.searxng.org/admin/engines/configured_engines.html#configured-engines). You can also customize the Searx wrapper with arbitrary named parameters that will be passed to the Searx search API . In the below example we will making a more interesting use of custom search parameters from searx search api."
|
"SearxNG supports up to [139 search engines](https://docs.searxng.org/admin/engines/configured_engines.html#configured-engines). You can also customize the Searx wrapper with arbitrary named parameters that will be passed to the Searx search API . In the below example we will making a more interesting use of custom search parameters from searx search api."
|
||||||
]
|
]
|
||||||
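The concrete cell is not visible in this truncated diff, so here is a hedged sketch of what passing custom named parameters can look like, assuming the `SearxSearchWrapper` from `langchain.utilities` and a SearxNG instance on localhost; the query and engine choice are illustrative.

```python
# Illustrative sketch: extra named parameters are forwarded to the Searx API.
from langchain.utilities import SearxSearchWrapper

search = SearxSearchWrapper(searx_host="http://localhost:8888", k=5)
# Restrict the query to a specific engine via a custom parameter.
print(search.run("large language models", engines=["wiki"]))
```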
@ -161,7 +161,7 @@
|
|||||||
"jukit_cell_id": "d0x164ssV1"
|
"jukit_cell_id": "d0x164ssV1"
|
||||||
},
|
},
|
||||||
"source": [
|
"source": [
|
||||||
"# Obtaining results with metadata"
|
"## Obtaining results with metadata"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
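For the metadata section named in this hunk, a hedged sketch reusing the wrapper from the previous snippet; it assumes the wrapper's `results` method, which returns a list of result dicts with fields such as snippet, title, and link.

```python
# Illustrative sketch: fetch structured results instead of a single string.
results = search.results("large language models", num_results=5)
for r in results:
    # Each result is a dict; typical keys include "snippet", "title", and "link".
    print(r.get("title"), "-", r.get("link"))
```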
{
|
{
|
||||||
@ -600,7 +600,7 @@
|
|||||||
"name": "python",
|
"name": "python",
|
||||||
"nbconvert_exporter": "python",
|
"nbconvert_exporter": "python",
|
||||||
"pygments_lexer": "ipython3",
|
"pygments_lexer": "ipython3",
|
||||||
"version": "3.9.11"
|
"version": "3.9.1"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"nbformat": 4,
|
"nbformat": 4,
|
326
docs/modules/agents/tools/examples/zapier.ipynb
Normal file
326
docs/modules/agents/tools/examples/zapier.ipynb
Normal file
@ -0,0 +1,326 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "16763ed3",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Zapier Natural Language Actions API\n",
|
||||||
|
"\\\n",
|
||||||
|
"Full docs here: https://nla.zapier.com/api/v1/docs\n",
|
||||||
|
"\n",
|
||||||
|
"**Zapier Natural Language Actions** gives you access to the 5k+ apps, 20k+ actions on Zapier's platform through a natural language API interface.\n",
|
||||||
|
"\n",
|
||||||
|
"NLA supports apps like Gmail, Salesforce, Trello, Slack, Asana, HubSpot, Google Sheets, Microsoft Teams, and thousands more apps: https://zapier.com/apps\n",
|
||||||
|
"\n",
|
||||||
|
"Zapier NLA handles ALL the underlying API auth and translation from natural language --> underlying API call --> return simplified output for LLMs. The key idea is you, or your users, expose a set of actions via an oauth-like setup window, which you can then query and execute via a REST API.\n",
|
||||||
|
"\n",
|
||||||
|
"NLA offers both API Key and OAuth for signing NLA API requests.\n",
|
||||||
|
"\n",
|
||||||
|
"1. Server-side (API Key): for quickly getting started, testing, and production scenarios where LangChain will only use actions exposed in the developer's Zapier account (and will use the developer's connected accounts on Zapier.com)\n",
|
||||||
|
"\n",
|
||||||
|
"2. User-facing (Oauth): for production scenarios where you are deploying an end-user facing application and LangChain needs access to end-user's exposed actions and connected accounts on Zapier.com\n",
|
||||||
|
"\n",
|
||||||
|
"This quick start will focus on the server-side use case for brevity. Review [full docs](https://nla.zapier.com/api/v1/docs) or reach out to nla@zapier.com for user-facing oauth developer support.\n",
|
||||||
|
"\n",
|
||||||
|
"This example goes over how to use the Zapier integration with a `SimpleSequentialChain`, then an `Agent`.\n",
|
||||||
|
"In code, below:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 1,
|
||||||
|
"id": "a363309c",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%load_ext autoreload\n",
|
||||||
|
"%autoreload 2"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
|
"id": "5cf33377",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import os\n",
|
||||||
|
"\n",
|
||||||
|
"# get from https://platform.openai.com/\n",
|
||||||
|
"os.environ[\"OPENAI_API_KEY\"] = os.environ.get(\"OPENAI_API_KEY\", \"\")\n",
|
||||||
|
"\n",
|
||||||
|
"# get from https://nla.zapier.com/demo/provider/debug (under User Information, after logging in): \n",
|
||||||
|
"os.environ[\"ZAPIER_NLA_API_KEY\"] = os.environ.get(\"ZAPIER_NLA_API_KEY\", \"\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "4881b484-1b97-478f-b206-aec407ceff66",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Example with Agent\n",
|
||||||
|
"Zapier tools can be used with an agent. See the example below."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 3,
|
||||||
|
"id": "b2044b17-c941-4ffb-8a03-027a35e2df81",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.llms import OpenAI\n",
|
||||||
|
"from langchain.agents import initialize_agent\n",
|
||||||
|
"from langchain.agents.agent_toolkits import ZapierToolkit\n",
|
||||||
|
"from langchain.utilities.zapier import ZapierNLAWrapper"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 4,
|
||||||
|
"id": "7b505eeb",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"## step 0. expose gmail 'find email' and slack 'send channel message' actions\n",
|
||||||
|
"\n",
|
||||||
|
"# first go here, log in, expose (enable) the two actions: https://nla.zapier.com/demo/start -- for this example, can leave all fields \"Have AI guess\"\n",
|
||||||
|
"# in an oauth scenario, you'd get your own <provider> id (instead of 'demo') which you route your users through first"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 5,
|
||||||
|
"id": "cab18227-c232-4214-9256-bb8dd352266c",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"llm = OpenAI(temperature=0)\n",
|
||||||
|
"zapier = ZapierNLAWrapper()\n",
|
||||||
|
"toolkit = ZapierToolkit.from_zapier_nla_wrapper(zapier)\n",
|
||||||
|
"agent = initialize_agent(toolkit.get_tools(), llm, agent=\"zero-shot-react-description\", verbose=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 6,
|
||||||
|
"id": "f94713de-b64d-465f-a087-00288b5f80ec",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
|
||||||
|
"\u001b[32;1m\u001b[1;3m I need to find the email and summarize it.\n",
|
||||||
|
"Action: Gmail: Find Email\n",
|
||||||
|
"Action Input: Find the latest email from Silicon Valley Bank\u001b[0m\n",
|
||||||
|
"Observation: \u001b[31;1m\u001b[1;3m{\"from__name\": \"Silicon Valley Bridge Bank, N.A.\", \"from__email\": \"sreply@svb.com\", \"body_plain\": \"Dear Clients, After chaotic, tumultuous & stressful days, we have clarity on path for SVB, FDIC is fully insuring all deposits & have an ask for clients & partners as we rebuild. Tim Mayopoulos <https://eml.svb.com/NjEwLUtBSy0yNjYAAAGKgoxUeBCLAyF_NxON97X4rKEaNBLG\", \"reply_to__email\": \"sreply@svb.com\", \"subject\": \"Meet the new CEO Tim Mayopoulos\", \"date\": \"Tue, 14 Mar 2023 23:42:29 -0500 (CDT)\", \"message_url\": \"https://mail.google.com/mail/u/0/#inbox/186e393b13cfdf0a\", \"attachment_count\": \"0\", \"to__emails\": \"ankush@langchain.dev\", \"message_id\": \"186e393b13cfdf0a\", \"labels\": \"IMPORTANT, CATEGORY_UPDATES, INBOX\"}\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I need to summarize the email and send it to the #test-zapier channel in Slack.\n",
|
||||||
|
"Action: Slack: Send Channel Message\n",
|
||||||
|
"Action Input: Send a slack message to the #test-zapier channel with the text \"Silicon Valley Bank has announced that Tim Mayopoulos is the new CEO. FDIC is fully insuring all deposits and they have an ask for clients and partners as they rebuild.\"\u001b[0m\n",
|
||||||
|
"Observation: \u001b[36;1m\u001b[1;3m{\"message__text\": \"Silicon Valley Bank has announced that Tim Mayopoulos is the new CEO. FDIC is fully insuring all deposits and they have an ask for clients and partners as they rebuild.\", \"message__permalink\": \"https://langchain.slack.com/archives/C04TSGU0RA7/p1678859932375259\", \"channel\": \"C04TSGU0RA7\", \"message__bot_profile__name\": \"Zapier\", \"message__team\": \"T04F8K3FZB5\", \"message__bot_id\": \"B04TRV4R74K\", \"message__bot_profile__deleted\": \"false\", \"message__bot_profile__app_id\": \"A024R9PQM\", \"ts_time\": \"2023-03-15T05:58:52Z\", \"message__bot_profile__icons__image_36\": \"https://avatars.slack-edge.com/2022-08-02/3888649620612_f864dc1bb794cf7d82b0_36.png\", \"message__blocks[]block_id\": \"kdZZ\", \"message__blocks[]elements[]type\": \"['rich_text_section']\"}\u001b[0m\n",
|
||||||
|
"Thought:\u001b[32;1m\u001b[1;3m I now know the final answer.\n",
|
||||||
|
"Final Answer: I have sent a summary of the last email from Silicon Valley Bank to the #test-zapier channel in Slack.\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'I have sent a summary of the last email from Silicon Valley Bank to the #test-zapier channel in Slack.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 6,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"agent.run(\"Summarize the last email I received regarding Silicon Valley Bank. Send the summary to the #test-zapier channel in slack.\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "bcdea831",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Example with SimpleSequentialChain\n",
|
||||||
|
"If you need more explicit control, use a chain, like below."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 7,
|
||||||
|
"id": "10a46e7e",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from langchain.llms import OpenAI\n",
|
||||||
|
"from langchain.chains import LLMChain, TransformChain, SimpleSequentialChain\n",
|
||||||
|
"from langchain.prompts import PromptTemplate\n",
|
||||||
|
"from langchain.tools.zapier.tool import ZapierNLARunAction\n",
|
||||||
|
"from langchain.utilities.zapier import ZapierNLAWrapper"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 8,
|
||||||
|
"id": "b9358048",
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"## step 0. expose gmail 'find email' and slack 'send direct message' actions\n",
|
||||||
|
"\n",
|
||||||
|
"# first go here, log in, expose (enable) the two actions: https://nla.zapier.com/demo/start -- for this example, can leave all fields \"Have AI guess\"\n",
|
||||||
|
"# in an oauth scenario, you'd get your own <provider> id (instead of 'demo') which you route your users through first\n",
|
||||||
|
"\n",
|
||||||
|
"actions = ZapierNLAWrapper().list()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 9,
|
||||||
|
"id": "4e80f461",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"## step 1. gmail find email\n",
|
||||||
|
"\n",
|
||||||
|
"GMAIL_SEARCH_INSTRUCTIONS = \"Grab the latest email from Silicon Valley Bank\"\n",
|
||||||
|
"\n",
|
||||||
|
"def nla_gmail(inputs):\n",
|
||||||
|
" action = next((a for a in actions if a[\"description\"].startswith(\"Gmail: Find Email\")), None)\n",
|
||||||
|
" return {\"email_data\": ZapierNLARunAction(action_id=action[\"id\"], zapier_description=action[\"description\"], params_schema=action[\"params\"]).run(inputs[\"instructions\"])}\n",
|
||||||
|
"gmail_chain = TransformChain(input_variables=[\"instructions\"], output_variables=[\"email_data\"], transform=nla_gmail)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 10,
|
||||||
|
"id": "46893233",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"## step 2. generate draft reply\n",
|
||||||
|
"\n",
|
||||||
|
"template = \"\"\"You are an assisstant who drafts replies to an incoming email. Output draft reply in plain text (not JSON).\n",
|
||||||
|
"\n",
|
||||||
|
"Incoming email:\n",
|
||||||
|
"{email_data}\n",
|
||||||
|
"\n",
|
||||||
|
"Draft email reply:\"\"\"\n",
|
||||||
|
"\n",
|
||||||
|
"prompt_template = PromptTemplate(input_variables=[\"email_data\"], template=template)\n",
|
||||||
|
"reply_chain = LLMChain(llm=OpenAI(temperature=.7), prompt=prompt_template)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 11,
|
||||||
|
"id": "cd85c4f8",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"## step 3. send draft reply via a slack direct message\n",
|
||||||
|
"\n",
|
||||||
|
"SLACK_HANDLE = \"@Ankush Gola\"\n",
|
||||||
|
"\n",
|
||||||
|
"def nla_slack(inputs):\n",
|
||||||
|
" action = next((a for a in actions if a[\"description\"].startswith(\"Slack: Send Direct Message\")), None)\n",
|
||||||
|
" instructions = f'Send this to {SLACK_HANDLE} in Slack: {inputs[\"draft_reply\"]}'\n",
|
||||||
|
" return {\"slack_data\": ZapierNLARunAction(action_id=action[\"id\"], zapier_description=action[\"description\"], params_schema=action[\"params\"]).run(instructions)}\n",
|
||||||
|
"slack_chain = TransformChain(input_variables=[\"draft_reply\"], output_variables=[\"slack_data\"], transform=nla_slack)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 12,
|
||||||
|
"id": "4829cab4",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new SimpleSequentialChain chain...\u001b[0m\n",
|
||||||
|
"\u001b[36;1m\u001b[1;3m{\"from__name\": \"Silicon Valley Bridge Bank, N.A.\", \"from__email\": \"sreply@svb.com\", \"body_plain\": \"Dear Clients, After chaotic, tumultuous & stressful days, we have clarity on path for SVB, FDIC is fully insuring all deposits & have an ask for clients & partners as we rebuild. Tim Mayopoulos <https://eml.svb.com/NjEwLUtBSy0yNjYAAAGKgoxUeBCLAyF_NxON97X4rKEaNBLG\", \"reply_to__email\": \"sreply@svb.com\", \"subject\": \"Meet the new CEO Tim Mayopoulos\", \"date\": \"Tue, 14 Mar 2023 23:42:29 -0500 (CDT)\", \"message_url\": \"https://mail.google.com/mail/u/0/#inbox/186e393b13cfdf0a\", \"attachment_count\": \"0\", \"to__emails\": \"ankush@langchain.dev\", \"message_id\": \"186e393b13cfdf0a\", \"labels\": \"IMPORTANT, CATEGORY_UPDATES, INBOX\"}\u001b[0m\n",
|
||||||
|
"\u001b[33;1m\u001b[1;3m\n",
|
||||||
|
"Dear Silicon Valley Bridge Bank, \n",
|
||||||
|
"\n",
|
||||||
|
"Thank you for your email and the update regarding your new CEO Tim Mayopoulos. We appreciate your dedication to keeping your clients and partners informed and we look forward to continuing our relationship with you. \n",
|
||||||
|
"\n",
|
||||||
|
"Best regards, \n",
|
||||||
|
"[Your Name]\u001b[0m\n",
|
||||||
|
"\u001b[38;5;200m\u001b[1;3m{\"message__text\": \"Dear Silicon Valley Bridge Bank, \\n\\nThank you for your email and the update regarding your new CEO Tim Mayopoulos. We appreciate your dedication to keeping your clients and partners informed and we look forward to continuing our relationship with you. \\n\\nBest regards, \\n[Your Name]\", \"message__permalink\": \"https://langchain.slack.com/archives/D04TKF5BBHU/p1678859968241629\", \"channel\": \"D04TKF5BBHU\", \"message__bot_profile__name\": \"Zapier\", \"message__team\": \"T04F8K3FZB5\", \"message__bot_id\": \"B04TRV4R74K\", \"message__bot_profile__deleted\": \"false\", \"message__bot_profile__app_id\": \"A024R9PQM\", \"ts_time\": \"2023-03-15T05:59:28Z\", \"message__blocks[]block_id\": \"p7i\", \"message__blocks[]elements[]elements[]type\": \"[['text']]\", \"message__blocks[]elements[]type\": \"['rich_text_section']\"}\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"'{\"message__text\": \"Dear Silicon Valley Bridge Bank, \\\\n\\\\nThank you for your email and the update regarding your new CEO Tim Mayopoulos. We appreciate your dedication to keeping your clients and partners informed and we look forward to continuing our relationship with you. \\\\n\\\\nBest regards, \\\\n[Your Name]\", \"message__permalink\": \"https://langchain.slack.com/archives/D04TKF5BBHU/p1678859968241629\", \"channel\": \"D04TKF5BBHU\", \"message__bot_profile__name\": \"Zapier\", \"message__team\": \"T04F8K3FZB5\", \"message__bot_id\": \"B04TRV4R74K\", \"message__bot_profile__deleted\": \"false\", \"message__bot_profile__app_id\": \"A024R9PQM\", \"ts_time\": \"2023-03-15T05:59:28Z\", \"message__blocks[]block_id\": \"p7i\", \"message__blocks[]elements[]elements[]type\": \"[[\\'text\\']]\", \"message__blocks[]elements[]type\": \"[\\'rich_text_section\\']\"}'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 12,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"## finally, execute\n",
|
||||||
|
"\n",
|
||||||
|
"overall_chain = SimpleSequentialChain(chains=[gmail_chain, reply_chain, slack_chain], verbose=True)\n",
|
||||||
|
"overall_chain.run(GMAIL_SEARCH_INSTRUCTIONS)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"id": "09ff954e-45f2-4595-92ea-91627abde4a0",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": []
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3 (ipykernel)",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python3"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.9.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 5
|
||||||
|
}
|
@@ -1,4 +1,4 @@
-# Tools
+# Getting Started
 
 Tools are functions that agents can use to interact with the world.
 These tools can be generic utilities (e.g. search), other chains, or even other agents.
@@ -118,7 +118,7 @@ Below is a list of all supported tools and relevant information:
 - Notes: Uses the Google Custom Search API
 - Requires LLM: No
 - Extra Parameters: `google_api_key`, `google_cse_id`
-- For more information on this, see [this page](../../ecosystem/google_search.md)
+- For more information on this, see [this page](../../../ecosystem/google_search.md)
 
 **searx-search**
 
@@ -135,4 +135,28 @@ Below is a list of all supported tools and relevant information:
 - Notes: Calls the [serper.dev](https://serper.dev) Google Search API and then parses results.
 - Requires LLM: No
 - Extra Parameters: `serper_api_key`
-- For more information on this, see [this page](../../ecosystem/google_serper.md)
+- For more information on this, see [this page](../../../ecosystem/google_serper.md)
+
+**wikipedia**
+
+- Tool Name: Wikipedia
+- Tool Description: A wrapper around Wikipedia. Useful for when you need to answer general questions about people, places, companies, historical events, or other subjects. Input should be a search query.
+- Notes: Uses the [wikipedia](https://pypi.org/project/wikipedia/) Python package to call the MediaWiki API and then parses results.
+- Requires LLM: No
+- Extra Parameters: `top_k_results`
+
+**podcast-api**
+
+- Tool Name: Podcast API
+- Tool Description: Use the Listen Notes Podcast API to search all podcasts or episodes. The input should be a question in natural language that this API can answer.
+- Notes: A natural language connection to the Listen Notes Podcast API (`https://www.PodcastAPI.com`), specifically the `/search/` endpoint.
+- Requires LLM: Yes
+- Extra Parameters: `listen_api_key` (your api key to access this endpoint)
+
+**openweathermap-api**
+
+- Tool Name: OpenWeatherMap
+- Tool Description: A wrapper around OpenWeatherMap API. Useful for fetching current weather information for a specified location. Input should be a location string (e.g. 'London,GB').
+- Notes: A connection to the OpenWeatherMap API (https://api.openweathermap.org), specifically the `/data/2.5/weather` endpoint.
+- Requires LLM: No
+- Extra Parameters: `openweathermap_api_key` (your API key to access this endpoint)
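For orientation, a minimal sketch of loading the three newly documented tools. It assumes the branch's `load_tools` helper recognizes these tool-name strings and forwards the listed extra parameters as keyword arguments; the loader itself is not part of this diff, and the key values below are placeholders.

```python
# Hypothetical usage sketch; tool names and keyword arguments are taken from the
# table above, not from the loader code (which is not shown in this diff).
from langchain.agents import load_tools
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)  # podcast-api is marked "Requires LLM: Yes"

tools = load_tools(
    ["wikipedia", "podcast-api", "openweathermap-api"],
    llm=llm,
    top_k_results=1,                         # wikipedia extra parameter
    listen_api_key="YOUR_LISTEN_API_KEY",    # podcast-api extra parameter (placeholder)
    openweathermap_api_key="YOUR_OWM_KEY",   # openweathermap-api extra parameter (placeholder)
)
```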
@@ -1,16 +1,18 @@
 Chains
 ==========================
 
+.. note::
+   `Conceptual Guide <https://docs.langchain.com/docs/components/chains>`_
+
 
 Using an LLM in isolation is fine for some simple applications,
-but many more complex ones require chaining LLMs - either with eachother or with other experts.
-LangChain provides a standard interface for Chains, as well as some common implementations of chains for easy use.
+but many more complex ones require chaining LLMs - either with each other or with other experts.
+LangChain provides a standard interface for Chains, as well as some common implementations of chains for ease of use.
 
 The following sections of documentation are provided:
 
 - `Getting Started <./chains/getting_started.html>`_: A getting started guide for chains, to get you up and running quickly.
 
-- `Key Concepts <./chains/key_concepts.html>`_: A conceptual guide going over the various concepts related to chains.
 
 - `How-To Guides <./chains/how_to_guides.html>`_: A collection of how-to guides. These highlight how to use various types of chains.
 
 - `Reference <../reference/modules/chains.html>`_: API reference documentation for all Chain classes.
@@ -25,5 +27,4 @@ The following sections of documentation are provided:
 
    ./chains/getting_started.ipynb
    ./chains/how_to_guides.rst
-   ./chains/key_concepts.rst
    Reference<../reference/modules/chains.rst>
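The "standard interface" this index refers to is easiest to see with the smallest chain in the library. A minimal sketch, mirroring the `LLMChain` example that appears later in this diff; it assumes an OpenAI API key is configured in the environment.

```python
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Prompt -> LLM is the simplest possible chain; larger chains compose pieces like this.
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
chain = LLMChain(llm=OpenAI(temperature=0.9), prompt=prompt)
print(chain.run("colorful socks"))  # chains with a single input can be run on a bare string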
@ -149,6 +149,33 @@
|
|||||||
"chain.run(\"Search for 'Avatar'\")"
|
"chain.run(\"Search for 'Avatar'\")"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Listen API Example"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import os\n",
|
||||||
|
"from langchain.llms import OpenAI\n",
|
||||||
|
"from langchain.chains.api import podcast_docs\n",
|
||||||
|
"from langchain.chains import APIChain\n",
|
||||||
|
"\n",
|
||||||
|
"# Get api key here: https://www.listennotes.com/api/pricing/\n",
|
||||||
|
"listen_api_key = 'xxx'\n",
|
||||||
|
"\n",
|
||||||
|
"llm = OpenAI(temperature=0)\n",
|
||||||
|
"headers = {\"X-ListenAPI-Key\": listen_api_key}\n",
|
||||||
|
"chain = APIChain.from_llm_and_api_docs(llm, podcast_docs.PODCAST_DOCS, headers=headers, verbose=True)\n",
|
||||||
|
"chain.run(\"Search for 'silicon valley bank' podcast episodes, audio length is more than 30 minutes, return only 1 results\")\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": null,
|
"execution_count": null,
|
||||||
@ -173,7 +200,7 @@
|
|||||||
"name": "python",
|
"name": "python",
|
||||||
"nbconvert_exporter": "python",
|
"nbconvert_exporter": "python",
|
||||||
"pygments_lexer": "ipython3",
|
"pygments_lexer": "ipython3",
|
||||||
"version": "3.10.9"
|
"version": "3.9.1"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"nbformat": 4,
|
"nbformat": 4,
|
||||||
|
docs/modules/chains/examples/llm_summarization_checker.ipynb (new file, 1124 lines): diff suppressed because it is too large.
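Since that notebook is suppressed in this view, the following is only a rough, hypothetical sketch of what using the new summarization-checker chain might look like. It assumes the chain follows the same `llm=` constructor and `run()` interface as the other chains in this diff; the real notebook's examples and arguments may differ.

```python
from langchain.llms import OpenAI
from langchain.chains import LLMSummarizationCheckerChain

llm = OpenAI(temperature=0)

# Hypothetical input text; the actual examples live in the suppressed notebook.
text = "Mammals can lay eggs. The platypus and the echidna are two egg-laying mammals."

checker_chain = LLMSummarizationCheckerChain(llm=llm, verbose=True)
print(checker_chain.run(text))
```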
@ -71,17 +71,17 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Entering new PALChain chain...\u001b[0m\n",
|
"\u001B[1m> Entering new PALChain chain...\u001B[0m\n",
|
||||||
"\u001b[32;1m\u001b[1;3mdef solution():\n",
|
"\u001B[32;1m\u001B[1;3mdef solution():\n",
|
||||||
" \"\"\"Jan has three times the number of pets as Marcia. Marcia has two more pets than Cindy. If Cindy has four pets, how many total pets do the three have?\"\"\"\n",
|
" \"\"\"Jan has three times the number of pets as Marcia. Marcia has two more pets than Cindy. If Cindy has four pets, how many total pets do the three have?\"\"\"\n",
|
||||||
" cindy_pets = 4\n",
|
" cindy_pets = 4\n",
|
||||||
" marcia_pets = cindy_pets + 2\n",
|
" marcia_pets = cindy_pets + 2\n",
|
||||||
" jan_pets = marcia_pets * 3\n",
|
" jan_pets = marcia_pets * 3\n",
|
||||||
" total_pets = cindy_pets + marcia_pets + jan_pets\n",
|
" total_pets = cindy_pets + marcia_pets + jan_pets\n",
|
||||||
" result = total_pets\n",
|
" result = total_pets\n",
|
||||||
" return result\u001b[0m\n",
|
" return result\u001B[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished chain.\u001b[0m\n"
|
"\u001B[1m> Finished chain.\u001B[0m\n"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@ -139,8 +139,8 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Entering new PALChain chain...\u001b[0m\n",
|
"\u001B[1m> Entering new PALChain chain...\u001B[0m\n",
|
||||||
"\u001b[32;1m\u001b[1;3m# Put objects into a list to record ordering\n",
|
"\u001B[32;1m\u001B[1;3m# Put objects into a list to record ordering\n",
|
||||||
"objects = []\n",
|
"objects = []\n",
|
||||||
"objects += [('booklet', 'blue')] * 2\n",
|
"objects += [('booklet', 'blue')] * 2\n",
|
||||||
"objects += [('booklet', 'purple')] * 2\n",
|
"objects += [('booklet', 'purple')] * 2\n",
|
||||||
@ -151,9 +151,9 @@
|
|||||||
"\n",
|
"\n",
|
||||||
"# Count number of purple objects\n",
|
"# Count number of purple objects\n",
|
||||||
"num_purple = len([object for object in objects if object[1] == 'purple'])\n",
|
"num_purple = len([object for object in objects if object[1] == 'purple'])\n",
|
||||||
"answer = num_purple\u001b[0m\n",
|
"answer = num_purple\u001B[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished PALChain chain.\u001b[0m\n"
|
"\u001B[1m> Finished PALChain chain.\u001B[0m\n"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@ -212,8 +212,8 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Entering new PALChain chain...\u001b[0m\n",
|
"\u001B[1m> Entering new PALChain chain...\u001B[0m\n",
|
||||||
"\u001b[32;1m\u001b[1;3m# Put objects into a list to record ordering\n",
|
"\u001B[32;1m\u001B[1;3m# Put objects into a list to record ordering\n",
|
||||||
"objects = []\n",
|
"objects = []\n",
|
||||||
"objects += [('booklet', 'blue')] * 2\n",
|
"objects += [('booklet', 'blue')] * 2\n",
|
||||||
"objects += [('booklet', 'purple')] * 2\n",
|
"objects += [('booklet', 'purple')] * 2\n",
|
||||||
@ -224,9 +224,9 @@
|
|||||||
"\n",
|
"\n",
|
||||||
"# Count number of purple objects\n",
|
"# Count number of purple objects\n",
|
||||||
"num_purple = len([object for object in objects if object[1] == 'purple'])\n",
|
"num_purple = len([object for object in objects if object[1] == 'purple'])\n",
|
||||||
"answer = num_purple\u001b[0m\n",
|
"answer = num_purple\u001B[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished chain.\u001b[0m\n"
|
"\u001B[1m> Finished chain.\u001B[0m\n"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
|
@ -377,11 +377,20 @@
|
|||||||
"\tFOREIGN KEY(\"GenreId\") REFERENCES \"Genre\" (\"GenreId\"), \n",
|
"\tFOREIGN KEY(\"GenreId\") REFERENCES \"Genre\" (\"GenreId\"), \n",
|
||||||
"\tFOREIGN KEY(\"AlbumId\") REFERENCES \"Album\" (\"AlbumId\")\n",
|
"\tFOREIGN KEY(\"AlbumId\") REFERENCES \"Album\" (\"AlbumId\")\n",
|
||||||
")\n",
|
")\n",
|
||||||
"\n",
|
"/*\n",
|
||||||
"SELECT * FROM 'Track' LIMIT 2;\n",
|
"2 rows from Track table:\n",
|
||||||
"TrackId Name AlbumId MediaTypeId GenreId Composer Milliseconds Bytes UnitPrice\n",
|
"TrackId\tName\tAlbumId\tMediaTypeId\tGenreId\tComposer\tMilliseconds\tBytes\tUnitPrice\n",
|
||||||
"1 For Those About To Rock (We Salute You) 1 1 1 Angus Young, Malcolm Young, Brian Johnson 343719 11170334 0.99\n",
|
"1\tFor Those About To Rock (We Salute You)\t1\t1\t1\tAngus Young, Malcolm Young, Brian Johnson\t343719\t11170334\t0.99\n",
|
||||||
"2 Balls to the Wall 2 2 1 None 342562 5510424 0.99\n"
|
"2\tBalls to the Wall\t2\t2\t1\tNone\t342562\t5510424\t0.99\n",
|
||||||
|
"*/\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "stderr",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"/home/jon/projects/langchain/langchain/sql_database.py:135: SAWarning: Dialect sqlite+pysqlite does *not* support Decimal objects natively, and SQLAlchemy must convert from floating point - rounding errors and other issues may occur. Please consider storing Decimal numbers as strings or integers on this platform for lossless storage.\n",
|
||||||
|
" sample_rows = connection.execute(command)\n"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
@ -434,6 +443,134 @@
|
|||||||
"db_chain.run(\"What are some example tracks by Bach?\")"
|
"db_chain.run(\"What are some example tracks by Bach?\")"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "ef94e948",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"### Custom Table Info\n",
|
||||||
|
"In some cases, it can be useful to provide custom table information instead of using the automatically generated table definitions and the first `sample_rows_in_table_info` sample rows. For example, if you know that the first few rows of a table are uninformative, it could help to manually provide example rows that are more diverse or provide more information to the model. It is also possible to limit the columns that will be visible to the model if there are unnecessary columns. \n",
|
||||||
|
"\n",
|
||||||
|
"This information can be provided as a dictionary with table names as the keys and table information as the values. For example, let's provide a custom definition and sample rows for the Track table with only a few columns:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 16,
|
||||||
|
"id": "2ad33ab1",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"custom_table_info = {\n",
|
||||||
|
" \"Track\": \"\"\"CREATE TABLE Track (\n",
|
||||||
|
"\t\"TrackId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"Name\" NVARCHAR(200) NOT NULL,\n",
|
||||||
|
"\t\"Composer\" NVARCHAR(220),\n",
|
||||||
|
"\tPRIMARY KEY (\"TrackId\")\n",
|
||||||
|
")\n",
|
||||||
|
"/*\n",
|
||||||
|
"3 rows from Track table:\n",
|
||||||
|
"TrackId\tName\tComposer\n",
|
||||||
|
"1\tFor Those About To Rock (We Salute You)\tAngus Young, Malcolm Young, Brian Johnson\n",
|
||||||
|
"2\tBalls to the Wall\tNone\n",
|
||||||
|
"3\tMy favorite song ever\tThe coolest composer of all time\n",
|
||||||
|
"*/\"\"\"\n",
|
||||||
|
"}"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 17,
|
||||||
|
"id": "db144352",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"CREATE TABLE \"Playlist\" (\n",
|
||||||
|
"\t\"PlaylistId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"Name\" NVARCHAR(120), \n",
|
||||||
|
"\tPRIMARY KEY (\"PlaylistId\")\n",
|
||||||
|
")\n",
|
||||||
|
"/*\n",
|
||||||
|
"2 rows from Playlist table:\n",
|
||||||
|
"PlaylistId\tName\n",
|
||||||
|
"1\tMusic\n",
|
||||||
|
"2\tMovies\n",
|
||||||
|
"*/\n",
|
||||||
|
"\n",
|
||||||
|
"CREATE TABLE Track (\n",
|
||||||
|
"\t\"TrackId\" INTEGER NOT NULL, \n",
|
||||||
|
"\t\"Name\" NVARCHAR(200) NOT NULL,\n",
|
||||||
|
"\t\"Composer\" NVARCHAR(220),\n",
|
||||||
|
"\tPRIMARY KEY (\"TrackId\")\n",
|
||||||
|
")\n",
|
||||||
|
"/*\n",
|
||||||
|
"3 rows from Track table:\n",
|
||||||
|
"TrackId\tName\tComposer\n",
|
||||||
|
"1\tFor Those About To Rock (We Salute You)\tAngus Young, Malcolm Young, Brian Johnson\n",
|
||||||
|
"2\tBalls to the Wall\tNone\n",
|
||||||
|
"3\tMy favorite song ever\tThe coolest composer of all time\n",
|
||||||
|
"*/\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"db = SQLDatabase.from_uri(\n",
|
||||||
|
" \"sqlite:///../../../../notebooks/Chinook.db\",\n",
|
||||||
|
" include_tables=['Track', 'Playlist'],\n",
|
||||||
|
" sample_rows_in_table_info=2,\n",
|
||||||
|
" custom_table_info=custom_table_info)\n",
|
||||||
|
"\n",
|
||||||
|
"print(db.table_info)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "5fc6f507",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Note how our custom table definition and sample rows for `Track` overrides the `sample_rows_in_table_info` parameter. Tables that are not overridden by `custom_table_info`, in this example `Playlist`, will have their table info gathered automatically as usual."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 18,
|
||||||
|
"id": "dfbda4e6",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new SQLDatabaseChain chain...\u001b[0m\n",
|
||||||
|
"What are some example tracks by Bach? \n",
|
||||||
|
"SQLQuery:\u001b[32;1m\u001b[1;3m SELECT Name, Composer FROM Track WHERE Composer LIKE '%Bach%' LIMIT 5;\u001b[0m\n",
|
||||||
|
"SQLResult: \u001b[33;1m\u001b[1;3m[('American Woman', 'B. Cummings/G. Peterson/M.J. Kale/R. Bachman'), ('Concerto for 2 Violins in D Minor, BWV 1043: I. Vivace', 'Johann Sebastian Bach'), ('Aria Mit 30 Veränderungen, BWV 988 \"Goldberg Variations\": Aria', 'Johann Sebastian Bach'), ('Suite for Solo Cello No. 1 in G Major, BWV 1007: I. Prélude', 'Johann Sebastian Bach'), ('Toccata and Fugue in D Minor, BWV 565: I. Toccata', 'Johann Sebastian Bach')]\u001b[0m\n",
|
||||||
|
"Answer:\u001b[32;1m\u001b[1;3m Some example tracks by Bach are 'American Woman', 'Concerto for 2 Violins in D Minor, BWV 1043: I. Vivace', 'Aria Mit 30 Veränderungen, BWV 988 \"Goldberg Variations\": Aria', 'Suite for Solo Cello No. 1 in G Major, BWV 1007: I. Prélude', and 'Toccata and Fugue in D Minor, BWV 565: I. Toccata'.\u001b[0m\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"' Some example tracks by Bach are \\'American Woman\\', \\'Concerto for 2 Violins in D Minor, BWV 1043: I. Vivace\\', \\'Aria Mit 30 Veränderungen, BWV 988 \"Goldberg Variations\": Aria\\', \\'Suite for Solo Cello No. 1 in G Major, BWV 1007: I. Prélude\\', and \\'Toccata and Fugue in D Minor, BWV 565: I. Toccata\\'.'"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 18,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"db_chain = SQLDatabaseChain(llm=llm, database=db, verbose=True)\n",
|
||||||
|
"db_chain.run(\"What are some example tracks by Bach?\")"
|
||||||
|
]
|
||||||
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "markdown",
|
"cell_type": "markdown",
|
||||||
"id": "c12ae15a",
|
"id": "c12ae15a",
|
||||||
|
@ -39,7 +39,7 @@
|
|||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"SparkleSmile Toothpaste\n",
|
"SparkleSmile Toothpaste\n",
|
||||||
"\u001b[1mConcurrent executed in 1.54 seconds.\u001b[0m\n",
|
"\u001B[1mConcurrent executed in 1.54 seconds.\u001B[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"BrightSmile Toothpaste Co.\n",
|
"BrightSmile Toothpaste Co.\n",
|
||||||
@ -55,7 +55,7 @@
|
|||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"BrightSmile Toothpaste.\n",
|
"BrightSmile Toothpaste.\n",
|
||||||
"\u001b[1mSerial executed in 6.38 seconds.\u001b[0m\n"
|
"\u001B[1mSerial executed in 6.38 seconds.\u001B[0m\n"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
@ -42,13 +42,13 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
|
"\u001B[1m> Entering new LLMChain chain...\u001B[0m\n",
|
||||||
"Prompt after formatting:\n",
|
"Prompt after formatting:\n",
|
||||||
"\u001b[32;1m\u001b[1;3mQuestion: What NFL team won the Super Bowl in the year Justin Beiber was born?\n",
|
"\u001B[32;1m\u001B[1;3mQuestion: What NFL team won the Super Bowl in the year Justin Beiber was born?\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Answer: Let's think step by step.\u001b[0m\n",
|
"Answer: Let's think step by step.\u001B[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished LLMChain chain.\u001b[0m\n"
|
"\u001B[1m> Finished LLMChain chain.\u001B[0m\n"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@ -95,11 +95,11 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
|
"\u001B[1m> Entering new LLMChain chain...\u001B[0m\n",
|
||||||
"Prompt after formatting:\n",
|
"Prompt after formatting:\n",
|
||||||
"\u001b[32;1m\u001b[1;3mWrite a sad poem about ducks.\u001b[0m\n",
|
"\u001B[32;1m\u001B[1;3mWrite a sad poem about ducks.\u001B[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished LLMChain chain.\u001b[0m\n"
|
"\u001B[1m> Finished LLMChain chain.\u001B[0m\n"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
|
@ -36,6 +36,25 @@
|
|||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 1,
|
"execution_count": 1,
|
||||||
|
"id": "7a886879",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"cannot find .env file\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"%load_ext dotenv\n",
|
||||||
|
"%dotenv"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 2,
|
||||||
"id": "3f2f9b8c",
|
"id": "3f2f9b8c",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -47,7 +66,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 2,
|
"execution_count": 3,
|
||||||
"id": "b8237d1a",
|
"id": "b8237d1a",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -64,7 +83,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 3,
|
"execution_count": 4,
|
||||||
"id": "4a391730",
|
"id": "4a391730",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -82,7 +101,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 4,
|
"execution_count": 5,
|
||||||
"id": "9368bd63",
|
"id": "9368bd63",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -94,7 +113,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 5,
|
"execution_count": 6,
|
||||||
"id": "d39e15f5",
|
"id": "d39e15f5",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [
|
"outputs": [
|
||||||
@ -107,22 +126,20 @@
|
|||||||
"\u001b[1m> Entering new SimpleSequentialChain chain...\u001b[0m\n",
|
"\u001b[1m> Entering new SimpleSequentialChain chain...\u001b[0m\n",
|
||||||
"\u001b[36;1m\u001b[1;3m\n",
|
"\u001b[36;1m\u001b[1;3m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Tragedy at Sunset on the Beach follows the story of a young couple, Jack and Annie, who have just started to explore the possibility of a relationship together. After a day spent in the sun and sand, they decide to take a romantic stroll down the beach as the sun sets. \n",
|
"Tragedy at Sunset on the Beach is a story of a young couple, Jack and Sarah, who are in love and looking forward to their future together. On the night of their anniversary, they decide to take a walk on the beach at sunset. As they are walking, they come across a mysterious figure, who tells them that their love will be tested in the near future. \n",
|
||||||
"\n",
|
"\n",
|
||||||
"However, their romantic evening quickly turns tragic when they stumble upon a body lying in the sand. As they approach to investigate, they are shocked to discover that it is Jack's long-lost brother, who has been missing for several years. \n",
|
"The figure then tells the couple that the sun will soon set, and with it, a tragedy will strike. If Jack and Sarah can stay together and pass the test, they will be granted everlasting love. However, if they fail, their love will be lost forever.\n",
|
||||||
"\n",
|
"\n",
|
||||||
"The story follows Jack and Annie as they navigate their way through the tragedy and their newfound relationship. With the help of their friends, family, and the beach's inhabitants, Jack and Annie must come to terms with their deep-seated emotions and the reality of the situation. \n",
|
"The play follows the couple as they struggle to stay together and battle the forces that threaten to tear them apart. Despite the tragedy that awaits them, they remain devoted to one another and fight to keep their love alive. In the end, the couple must decide whether to take a chance on their future together or succumb to the tragedy of the sunset.\u001b[0m\n",
|
||||||
"\n",
|
|
||||||
"Ultimately, the play explores themes of family, love, and loss, as Jack and Annie's story unfolds against the beautiful backdrop of the beach at sunset.\u001b[0m\n",
|
|
||||||
"\u001b[33;1m\u001b[1;3m\n",
|
"\u001b[33;1m\u001b[1;3m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Tragedy at Sunset on the Beach is an emotionally complex tale of family, love, and loss. Told against the beautiful backdrop of a beach at sunset, the story follows Jack and Annie, a young couple just beginning to explore a relationship together. When they stumble upon the body of Jack's long-lost brother on the beach, they must face the reality of the tragedy and come to terms with their deep-seated emotions. \n",
|
"Tragedy at Sunset on the Beach is an emotionally gripping story of love, hope, and sacrifice. Through the story of Jack and Sarah, the audience is taken on a journey of self-discovery and the power of love to overcome even the greatest of obstacles. \n",
|
||||||
"\n",
|
"\n",
|
||||||
"The playwright has crafted a heartfelt and thought-provoking story, one that probes into the depths of the human experience. The cast of characters is well-rounded and fully realized, and the dialogue is natural and emotional. The direction and choreography are top-notch, and the scenic design is breathtaking. \n",
|
"The play's talented cast brings the characters to life, allowing us to feel the depths of their emotion and the intensity of their struggle. With its compelling story and captivating performances, this play is sure to draw in audiences and leave them on the edge of their seats. \n",
|
||||||
"\n",
|
"\n",
|
||||||
"Overall, Tragedy at Sunset on the Beach is a powerful and moving story about the fragility of life and the strength of love. It is sure to tug at your heartstrings and leave you with a newfound appreciation of life's precious moments. Highly recommended.\u001b[0m\n",
|
"The play's setting of the beach at sunset adds a touch of poignancy and romanticism to the story, while the mysterious figure serves to keep the audience enthralled. Overall, Tragedy at Sunset on the Beach is an engaging and thought-provoking play that is sure to leave audiences feeling inspired and hopeful.\u001b[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished SimpleSequentialChain chain.\u001b[0m\n"
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
@ -132,7 +149,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 6,
|
"execution_count": 7,
|
||||||
"id": "c6649a01",
|
"id": "c6649a01",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [
|
"outputs": [
|
||||||
@ -142,11 +159,11 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Tragedy at Sunset on the Beach is an emotionally complex tale of family, love, and loss. Told against the beautiful backdrop of a beach at sunset, the story follows Jack and Annie, a young couple just beginning to explore a relationship together. When they stumble upon the body of Jack's long-lost brother on the beach, they must face the reality of the tragedy and come to terms with their deep-seated emotions. \n",
|
"Tragedy at Sunset on the Beach is an emotionally gripping story of love, hope, and sacrifice. Through the story of Jack and Sarah, the audience is taken on a journey of self-discovery and the power of love to overcome even the greatest of obstacles. \n",
|
||||||
"\n",
|
"\n",
|
||||||
"The playwright has crafted a heartfelt and thought-provoking story, one that probes into the depths of the human experience. The cast of characters is well-rounded and fully realized, and the dialogue is natural and emotional. The direction and choreography are top-notch, and the scenic design is breathtaking. \n",
|
"The play's talented cast brings the characters to life, allowing us to feel the depths of their emotion and the intensity of their struggle. With its compelling story and captivating performances, this play is sure to draw in audiences and leave them on the edge of their seats. \n",
|
||||||
"\n",
|
"\n",
|
||||||
"Overall, Tragedy at Sunset on the Beach is a powerful and moving story about the fragility of life and the strength of love. It is sure to tug at your heartstrings and leave you with a newfound appreciation of life's precious moments. Highly recommended.\n"
|
"The play's setting of the beach at sunset adds a touch of poignancy and romanticism to the story, while the mysterious figure serves to keep the audience enthralled. Overall, Tragedy at Sunset on the Beach is an engaging and thought-provoking play that is sure to leave audiences feeling inspired and hopeful.\n"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
@ -167,7 +184,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 7,
|
"execution_count": 8,
|
||||||
"id": "02016a51",
|
"id": "02016a51",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -185,7 +202,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 8,
|
"execution_count": 9,
|
||||||
"id": "8bd38cc2",
|
"id": "8bd38cc2",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -203,7 +220,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 9,
|
"execution_count": 10,
|
||||||
"id": "524523af",
|
"id": "524523af",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
@ -220,7 +237,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 10,
|
"execution_count": 11,
|
||||||
"id": "3fd3a7be",
|
"id": "3fd3a7be",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [
|
"outputs": [
|
||||||
@ -231,14 +248,8 @@
|
|||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Entering new SequentialChain chain...\u001b[0m\n",
|
"\u001b[1m> Entering new SequentialChain chain...\u001b[0m\n",
|
||||||
"\u001b[1mChain 0\u001b[0m:\n",
|
|
||||||
"{'synopsis': \" \\n\\nTragedy at Sunset on the Beach is a dark and gripping drama set in Victorian England. The play follows the story of two lovers, Emma and Edward, whose passionate relationship is threatened by the strict rules and regulations of the time.\\n\\nThe two are deeply in love, but Edward is from a wealthy family and Emma is from a lower class background. Despite the obstacles, the two are determined to be together and decide to elope.\\n\\nOn the night of their planned escape, Emma and Edward meet at the beach at sunset to declare their love for one another and begin a new life together. However, their plans are disrupted when Emma's father discovers their plan and appears on the beach with a gun.\\n\\nIn a heartbreaking scene, Emma's father orders Edward to leave, but Edward refuses and fights for their love. In a fit of rage, Emma's father shoots Edward, killing him instantly. \\n\\nThe tragedy of the play lies in the fact that Emma and Edward are denied their chance at a happy ending due to the rigid social conventions of Victorian England. The audience is left with a heavy heart as the play ends with Emma standing alone on the beach, mourning the loss of her beloved.\"}\n",
|
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1mChain 1\u001b[0m:\n",
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
"{'review': \"\\n\\nTragedy at Sunset on the Beach is an emotionally charged production that will leave audiences heartsick. The play follows the ill-fated love story of Emma and Edward, two star-crossed lovers whose passionate relationship is tragically thwarted by Victorian England's societal conventions. The performance is captivating from start to finish, as the audience is taken on an emotional rollercoaster of love, loss, and heartbreak.\\n\\nThe acting is powerful and sincere, and the performances of the two leads are particularly stirring. Emma and Edward are both portrayed with such tenderness and emotion that it's hard not to feel their pain as they fight for their forbidden love. The climactic scene, in which Edward is shot by Emma's father, is especially heartbreaking and will leave audience members on the edge of their seats.\\n\\nOverall, Tragedy at Sunset on the Beach is a powerful and moving work of theatre. It is a tragedy of impossible love, and a vivid reminder of the devastating consequences of social injustice. The play is sure to leave a lasting impression on anyone who experiences it.\"}\n",
|
|
||||||
"\n",
|
|
||||||
"\n",
|
|
||||||
"\u001b[1m> Finished SequentialChain chain.\u001b[0m\n"
|
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
@ -246,10 +257,91 @@
|
|||||||
"review = overall_chain({\"title\":\"Tragedy at sunset on the beach\", \"era\": \"Victorian England\"})"
|
"review = overall_chain({\"title\":\"Tragedy at sunset on the beach\", \"era\": \"Victorian England\"})"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "d2fac817",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"### Memory in Sequential Chains\n",
|
||||||
|
"Sometimes you may want to pass along some context to use in each step of the chain or in a later part of the chain, but maintaining and chaining together the input/output variables can quickly get messy. Using `SimpleMemory` is a convenient way to do manage this and clean up your chains.\n",
|
||||||
|
"\n",
|
||||||
|
"For example, using the previous playwright SequentialChain, lets say you wanted to include some context about date, time and location of the play, and using the generated synopsis and review, create some social media post text. You could add these new context variables as `input_variables`, or we can add a `SimpleMemory` to the chain to manage this context:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"id": "b2cf3098",
|
||||||
|
"metadata": {},
|
||||||
|
"source": []
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 12,
|
||||||
|
"id": "6b7b3a7a",
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Entering new SequentialChain chain...\u001b[0m\n",
|
||||||
|
"\n",
|
||||||
|
"\u001b[1m> Finished chain.\u001b[0m\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"data": {
|
||||||
|
"text/plain": [
|
||||||
|
"{'title': 'Tragedy at sunset on the beach',\n",
|
||||||
|
" 'era': 'Victorian England',\n",
|
||||||
|
" 'time': 'December 25th, 8pm PST',\n",
|
||||||
|
" 'location': 'Theater in the Park',\n",
|
||||||
|
" 'social_post_text': \"\\nSpend your Christmas night with us at Theater in the Park and experience the heartbreaking story of love and loss that is 'A Walk on the Beach'. Set in Victorian England, this romantic tragedy follows the story of Frances and Edward, a young couple whose love is tragically cut short. Don't miss this emotional and thought-provoking production that is sure to leave you in tears. #AWalkOnTheBeach #LoveAndLoss #TheaterInThePark #VictorianEngland\"}"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"execution_count": 12,
|
||||||
|
"metadata": {},
|
||||||
|
"output_type": "execute_result"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"from langchain.chains import SequentialChain\n",
|
||||||
|
"from langchain.memory import SimpleMemory\n",
|
||||||
|
"\n",
|
||||||
|
"llm = OpenAI(temperature=.7)\n",
|
||||||
|
"template = \"\"\"You are a social media manager for a theater company. Given the title of play, the era it is set in, the date,time and location, the synopsis of the play, and the review of the play, it is your job to write a social media post for that play.\n",
|
||||||
|
"\n",
|
||||||
|
"Here is some context about the time and location of the play:\n",
|
||||||
|
"Date and Time: {time}\n",
|
||||||
|
"Location: {location}\n",
|
||||||
|
"\n",
|
||||||
|
"Play Synopsis:\n",
|
||||||
|
"{synopsis}\n",
|
||||||
|
"Review from a New York Times play critic of the above play:\n",
|
||||||
|
"{review}\n",
|
||||||
|
"\n",
|
||||||
|
"Social Media Post:\n",
|
||||||
|
"\"\"\"\n",
|
||||||
|
"prompt_template = PromptTemplate(input_variables=[\"synopsis\", \"review\", \"time\", \"location\"], template=template)\n",
|
||||||
|
"social_chain = LLMChain(llm=llm, prompt=prompt_template, output_key=\"social_post_text\")\n",
|
||||||
|
"\n",
|
||||||
|
"overall_chain = SequentialChain(\n",
|
||||||
|
" memory=SimpleMemory(memories={\"time\": \"December 25th, 8pm PST\", \"location\": \"Theater in the Park\"}),\n",
|
||||||
|
" chains=[synopsis_chain, review_chain, social_chain],\n",
|
||||||
|
" input_variables=[\"era\", \"title\"],\n",
|
||||||
|
" # Here we return multiple variables\n",
|
||||||
|
" output_variables=[\"social_post_text\"],\n",
|
||||||
|
" verbose=True)\n",
|
||||||
|
"\n",
|
||||||
|
"overall_chain({\"title\":\"Tragedy at sunset on the beach\", \"era\": \"Victorian England\"})"
|
||||||
|
]
|
||||||
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": null,
|
"execution_count": null,
|
||||||
"id": "6be70d27",
|
"id": "ee9bc09c",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": []
|
"source": []
|
||||||
@ -271,7 +363,7 @@
|
|||||||
"name": "python",
|
"name": "python",
|
||||||
"nbconvert_exporter": "python",
|
"nbconvert_exporter": "python",
|
||||||
"pygments_lexer": "ipython3",
|
"pygments_lexer": "ipython3",
|
||||||
"version": "3.10.9"
|
"version": "3.9.1"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"nbformat": 4,
|
"nbformat": 4,
|
||||||
|
@ -136,13 +136,13 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
|
"\u001B[1m> Entering new LLMChain chain...\u001B[0m\n",
|
||||||
"Prompt after formatting:\n",
|
"Prompt after formatting:\n",
|
||||||
"\u001b[32;1m\u001b[1;3mQuestion: whats 2 + 2\n",
|
"\u001B[32;1m\u001B[1;3mQuestion: whats 2 + 2\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Answer: Let's think step by step.\u001b[0m\n",
|
"Answer: Let's think step by step.\u001B[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished chain.\u001b[0m\n"
|
"\u001B[1m> Finished chain.\u001B[0m\n"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@ -319,13 +319,13 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Entering new LLMChain chain...\u001b[0m\n",
|
"\u001B[1m> Entering new LLMChain chain...\u001B[0m\n",
|
||||||
"Prompt after formatting:\n",
|
"Prompt after formatting:\n",
|
||||||
"\u001b[32;1m\u001b[1;3mQuestion: whats 2 + 2\n",
|
"\u001B[32;1m\u001B[1;3mQuestion: whats 2 + 2\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Answer: Let's think step by step.\u001b[0m\n",
|
"Answer: Let's think step by step.\u001B[0m\n",
|
||||||
"\n",
|
"\n",
|
||||||
"\u001b[1m> Finished chain.\u001b[0m\n"
|
"\u001B[1m> Finished chain.\u001B[0m\n"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
|
@ -31,7 +31,7 @@
|
|||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"with open('../../state_of_the_union.txt') as f:\n",
|
"with open(\"../../state_of_the_union.txt\") as f:\n",
|
||||||
" state_of_the_union = f.read()"
|
" state_of_the_union = f.read()"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
@ -122,7 +122,7 @@
|
|||||||
"name": "python",
|
"name": "python",
|
||||||
"nbconvert_exporter": "python",
|
"nbconvert_exporter": "python",
|
||||||
"pygments_lexer": "ipython3",
|
"pygments_lexer": "ipython3",
|
||||||
"version": "3.10.9"
|
"version": "3.9.1"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"nbformat": 4,
|
"nbformat": 4,
|
||||||
|
@@ -1,33 +0,0 @@
-Generic Chains
---------------
-
-A chain is made up of links, which can be either primitives or other chains.
-Primitives can be either `prompts <../prompts.html>`_, `llms <../llms.html>`_, `utils <../utils.html>`_, or other chains.
-The examples here are all generic end-to-end chains that are meant to be used to construct other chains rather than serving a specific purpose.
-
-**LLMChain**
-
-- **Links Used**: PromptTemplate, LLM
-- **Notes**: This chain is the simplest chain, and is widely used by almost every other chain. This chain takes arbitrary user input, creates a prompt with it from the PromptTemplate, passes that to the LLM, and then returns the output of the LLM as the final output.
-- `Example Notebook <./generic/llm_chain.html>`_
-
-**Transformation Chain**
-
-- **Links Used**: TransformationChain
-- **Notes**: This notebook shows how to use the Transformation Chain, which takes an arbitrary python function and applies it to inputs/outputs of other chains.
-- `Example Notebook <./generic/transformation.html>`_
-
-**Sequential Chain**
-
-- **Links Used**: Sequential
-- **Notes**: This notebook shows how to combine calling multiple other chains in sequence.
-- `Example Notebook <./generic/sequential_chains.html>`_
-
-.. toctree::
-   :maxdepth: 1
-   :glob:
-   :caption: Generic Chains
-   :name: generic
-   :hidden:
-
-   ./generic/*
@ -9,13 +9,13 @@
|
|||||||
"In this tutorial, we will learn about creating simple chains in LangChain. We will learn how to create a chain, add components to it, and run it.\n",
|
"In this tutorial, we will learn about creating simple chains in LangChain. We will learn how to create a chain, add components to it, and run it.\n",
|
||||||
"\n",
|
"\n",
|
||||||
"In this tutorial, we will cover:\n",
|
"In this tutorial, we will cover:\n",
|
||||||
"- Using the simple LLM chain\n",
|
"- Using a simple LLM chain\n",
|
||||||
"- Creating sequential chains\n",
|
"- Creating sequential chains\n",
|
||||||
"- Creating a custom chain\n",
|
"- Creating a custom chain\n",
|
||||||
"\n",
|
"\n",
|
||||||
"## Why do we need chains?\n",
|
"## Why do we need chains?\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Chains allow us to combine multiple components together to create a single, coherent application. For example, we can create a chain that takes user input, format it with a PromptTemplate, and then passes the formatted response to an LLM. We can build more complex chains by combining multiple chains together, or by combining chains with other components.\n"
|
"Chains allow us to combine multiple components together to create a single, coherent application. For example, we can create a chain that takes user input, formats it with a PromptTemplate, and then passes the formatted response to an LLM. We can build more complex chains by combining multiple chains together, or by combining chains with other components.\n"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@ -32,7 +32,9 @@
|
|||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 1,
|
"execution_count": 1,
|
||||||
"metadata": {},
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"from langchain.prompts import PromptTemplate\n",
|
"from langchain.prompts import PromptTemplate\n",
|
||||||
@ -55,7 +57,9 @@
|
|||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": 2,
|
"execution_count": 2,
|
||||||
"metadata": {},
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
"outputs": [
|
"outputs": [
|
||||||
{
|
{
|
||||||
"name": "stdout",
|
"name": "stdout",
|
||||||
@ -63,7 +67,7 @@
|
|||||||
"text": [
|
"text": [
|
||||||
"\n",
|
"\n",
|
||||||
"\n",
|
"\n",
|
||||||
"Vibrancy Socks.\n"
|
"Rainbow Socks Co.\n"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
@ -75,6 +79,48 @@
|
|||||||
"print(chain.run(\"colorful socks\"))"
|
"print(chain.run(\"colorful socks\"))"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"You can use a chat model in an `LLMChain` as well:"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": 5,
|
||||||
|
"metadata": {
|
||||||
|
"tags": []
|
||||||
|
},
|
||||||
|
"outputs": [
|
||||||
|
{
|
||||||
|
"name": "stdout",
|
||||||
|
"output_type": "stream",
|
||||||
|
"text": [
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"Rainbow Threads\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"source": [
|
||||||
|
"from langchain.chat_models import ChatOpenAI\n",
|
||||||
|
"from langchain.prompts.chat import (\n",
|
||||||
|
" ChatPromptTemplate,\n",
|
||||||
|
" HumanMessagePromptTemplate,\n",
|
||||||
|
")\n",
|
||||||
|
"human_message_prompt = HumanMessagePromptTemplate(\n",
|
||||||
|
" prompt=PromptTemplate(\n",
|
||||||
|
" template=\"What is a good name for a company that makes {product}?\",\n",
|
||||||
|
" input_variables=[\"product\"],\n",
|
||||||
|
" )\n",
|
||||||
|
" )\n",
|
||||||
|
"chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt])\n",
|
||||||
|
"chat = ChatOpenAI(temperature=0.9)\n",
|
||||||
|
"chain = LLMChain(llm=chat, prompt=chat_prompt_template)\n",
|
||||||
|
"print(chain.run(\"colorful socks\"))"
|
||||||
|
]
|
||||||
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "markdown",
|
"cell_type": "markdown",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
@ -88,7 +134,7 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"## Combine chains with the `SequentialChain`\n",
|
"## Combine chains with the `SequentialChain`\n",
|
||||||
"\n",
|
"\n",
|
||||||
"The next step after calling a language model is make a series of calls to a language model. We can do this using sequential chains, which are chains that execute their links in a predefined order. Specifically, we will use the `SimpleSequentialChain`. This is the simplest form of sequential chains, where each step has a singular input/output, and the output of one step is the input to the next.\n",
|
"The next step after calling a language model is to make a series of calls to a language model. We can do this using sequential chains, which are chains that execute their links in a predefined order. Specifically, we will use the `SimpleSequentialChain`. This is the simplest type of a sequential chain, where each step has a single input/output, and the output of one step is the input to the next.\n",
|
||||||
"\n",
|
"\n",
|
||||||
"In this tutorial, our sequential chain will:\n",
|
"In this tutorial, our sequential chain will:\n",
|
||||||
"1. First, create a company name for a product. We will reuse the `LLMChain` we'd previously initialized to create this company name.\n",
|
"1. First, create a company name for a product. We will reuse the `LLMChain` we'd previously initialized to create this company name.\n",
|
||||||
@ -156,7 +202,7 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"## Create a custom chain with the `Chain` class\n",
|
"## Create a custom chain with the `Chain` class\n",
|
||||||
"\n",
|
"\n",
|
||||||
"LangChain provides many chains out of the box, but sometimes you may want to create a custom chains for your specific use case. For this example, we will create a custom chain that concatenates the outputs of 2 `LLMChain`s.\n",
|
"LangChain provides many chains out of the box, but sometimes you may want to create a custom chain for your specific use case. For this example, we will create a custom chain that concatenates the outputs of 2 `LLMChain`s.\n",
|
||||||
"\n",
|
"\n",
|
||||||
"In order to create a custom chain:\n",
|
"In order to create a custom chain:\n",
|
||||||
"1. Start by subclassing the `Chain` class,\n",
|
"1. Start by subclassing the `Chain` class,\n",
|
||||||
@ -274,5 +320,5 @@
|
|||||||
}
|
}
|
||||||
},
|
},
|
||||||
"nbformat": 4,
|
"nbformat": 4,
|
||||||
"nbformat_minor": 2
|
"nbformat_minor": 4
|
||||||
}
|
}
|
||||||
|
@@ -2,23 +2,37 @@ How-To Guides
 =============
 
 A chain is made up of links, which can be either primitives or other chains.
-Primitives can be either `prompts <../prompts.html>`_, `llms <../llms.html>`_, `utils <../utils.html>`_, or other chains.
-The examples here are all end-to-end chains for specific applications.
-They are broken up into three categories:
+Primitives can be either `prompts <../prompts.html>`_, `models <../models.html>`_, arbitrary functions, or other chains.
+The examples here are broken up into three sections:
 
-1. `Generic Chains <./generic_how_to.html>`_: Generic chains, that are meant to help build other chains rather than serve a particular purpose.
-2. `Utility Chains <./utility_how_to.html>`_: Chains consisting of an LLMChain interacting with a specific util.
-3. `Asynchronous <./async_chain.html>`_: Covering asynchronous functionality.
+**Generic Functionality**
+
+Covers both generic chains (that are useful in a wide variety of applications) as well as generic functionality related to those chains.
 
 .. toctree::
    :maxdepth: 1
    :glob:
-   :hidden:
 
-   ./generic_how_to.rst
-   ./utility_how_to.rst
-   ./async_chain.ipynb
+   ./generic/*
 
-In addition to different types of chains, we also have the following how-to guides for working with chains in general:
+**Index-related Chains**
+
+Chains related to working with indexes.
+
+.. toctree::
+   :maxdepth: 1
+   :glob:
+
+   ./index_examples/*
+
+
+**All other chains**
+
+All other types of chains!
+
+.. toctree::
+   :maxdepth: 1
+   :glob:
+
+   ./examples/*
 
-`Load From Hub <./generic/from_hub.html>`_: This notebook covers how to load chains from `LangChainHub <https://github.com/hwchase17/langchain-hub>`_.
Some files were not shown because too many files have changed in this diff.