update modules sidebar (#13141)

pull/13148/head
Bagatur 8 months ago committed by GitHub
parent 84e65533e9
commit b298f550fe

@@ -45,20 +45,14 @@ These docs focus on the Python LangChain library. [Head here](https://js.langcha
 ## Modules
-LangChain provides standard, extendable interfaces and integrations for the following modules, listed from least to most complex:
+LangChain provides standard, extendable interfaces and integrations for the following modules:
 #### [Model I/O](/docs/modules/model_io/)
 Interface with language models
 #### [Retrieval](/docs/modules/data_connection/)
 Interface with application-specific data
-#### [Chains](/docs/modules/chains/)
-Construct sequences of calls
 #### [Agents](/docs/modules/agents/)
 Let chains choose which tools to use given high-level directives
-#### [Memory](/docs/modules/memory/)
-Persist application state between runs of a chain
-#### [Callbacks](/docs/modules/callbacks/)
-Log and stream intermediate steps of any chain
 ## Examples, ecosystem, and resources
 ### [Use cases](/docs/use_cases/question_answering/)

@@ -4,16 +4,18 @@ sidebar_class_name: hidden
 # Modules
-LangChain provides standard, extendable interfaces and external integrations for the following modules, listed from least to most complex:
+LangChain provides standard, extendable interfaces and external integrations for the following main modules:
 #### [Model I/O](/docs/modules/model_io/)
 Interface with language models
 #### [Retrieval](/docs/modules/data_connection/)
 Interface with application-specific data
-#### [Chains](/docs/modules/chains/)
-Construct sequences of calls
 #### [Agents](/docs/modules/agents/)
 Let chains choose which tools to use given high-level directives
+## Additional
+#### [Chains](/docs/modules/chains/)
+Common, building block compositions
 #### [Memory](/docs/modules/memory/)
 Persist application state between runs of a chain
 #### [Callbacks](/docs/modules/callbacks/)

@@ -6,7 +6,6 @@
 "metadata": {},
 "source": [
 "---\n",
-"sidebar_position: 1\n",
 "title: Chat models\n",
 "---"
 ]

@@ -5,10 +5,10 @@ sidebar_position: 1
 LangChain provides interfaces and integrations for two types of models:
-- [LLMs](/docs/modules/model_io/models/llms/): Models that take a text string as input and return a text string
 - [Chat models](/docs/modules/model_io/models/chat/): Models that are backed by a language model but take a list of Chat Messages as input and return a Chat Message
+- [LLMs](/docs/modules/model_io/models/llms/): Models that take a text string as input and return a text string
-## LLMs vs chat models
+## LLMs vs Chat models
 LLMs and chat models are subtly but importantly different. LLMs in LangChain refer to pure text completion models.
 The APIs they wrap take a string prompt as input and output a string completion. OpenAI's GPT-3 is implemented as an LLM.
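The string-in/string-out versus messages-in/message-out contract described in this hunk can be sketched with stub functions. This is an illustration only, under stated assumptions: `fakeLLM`, `fakeChatModel`, and the message shape are made-up names for this sketch, not LangChain APIs.

```javascript
// An "LLM" in the sense above: maps a prompt string to a completion string.
const fakeLLM = (prompt) => `Completion for: ${prompt}`;

// A "chat model" in the sense above: maps a list of chat messages
// to a single chat message ({ role, content } is our assumed shape).
const fakeChatModel = (messages) => ({
  role: "ai",
  content: `Reply to ${messages.length} message(s)`,
});

const completion = fakeLLM("Tell me a joke");
const reply = fakeChatModel([{ role: "human", content: "Tell me a joke" }]);
```

The difference is purely in the interface shape; as the surrounding text notes, a chat model is still backed by a language model underneath.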

@@ -6,7 +6,6 @@
 "metadata": {},
 "source": [
 "---\n",
-"sidebar_position: 0\n",
 "title: LLMs\n",
 "---"
 ]

@@ -46,8 +46,22 @@ module.exports = {
     {
       type: "category",
       label: "Modules",
-      collapsed: true,
-      items: [{ type: "autogenerated", dirName: "modules" } ],
+      collapsed: false,
+      items: [
+        { type: "category", label: "Model I/O", collapsed: true, items: [{ type: "autogenerated", dirName: "modules/model_io" }], link: { type: 'doc', id: "modules/model_io/index" } },
+        { type: "category", label: "Retrieval", collapsed: true, items: [{ type: "autogenerated", dirName: "modules/data_connection" }], link: { type: 'doc', id: "modules/data_connection/index" } },
+        { type: "category", label: "Agents", collapsed: true, items: [{ type: "autogenerated", dirName: "modules/agents" }], link: { type: 'doc', id: "modules/agents/index" } },
+        {
+          type: "category",
+          label: "More",
+          collapsed: true,
+          items: [
+            { type: "category", label: "Chains", collapsed: true, items: [{ type: "autogenerated", dirName: "modules/chains" }], link: { type: 'doc', id: "modules/chains/index" } },
+            { type: "category", label: "Memory", collapsed: true, items: [{ type: "autogenerated", dirName: "modules/memory" }], link: { type: 'doc', id: "modules/memory/index" } },
+            { type: "category", label: "Callbacks", collapsed: true, items: [{ type: "autogenerated", dirName: "modules/callbacks" }], link: { type: 'doc', id: "modules/callbacks/index" } },
+          ]
+        }
+      ],
       link: {
         type: 'doc',
         id: "modules/index"
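The nesting this hunk introduces can be seen more clearly by building the same object programmatically. A minimal sketch, assuming the category/autogenerated/link fields shown in the diff; the `category` helper is ours, not part of the PR:

```javascript
// Helper for the repeated Docusaurus category shape used in the diff:
// an autogenerated item list for a docs directory, linked to its index doc.
const category = (label, dirName, id) => ({
  type: "category",
  label,
  collapsed: true,
  items: [{ type: "autogenerated", dirName }],
  link: { type: "doc", id },
});

// The "Modules" sidebar entry after this change: three top-level modules,
// with Chains, Memory, and Callbacks demoted into a collapsed "More" group.
const modulesSidebar = {
  type: "category",
  label: "Modules",
  collapsed: false,
  items: [
    category("Model I/O", "modules/model_io", "modules/model_io/index"),
    category("Retrieval", "modules/data_connection", "modules/data_connection/index"),
    category("Agents", "modules/agents", "modules/agents/index"),
    {
      type: "category",
      label: "More",
      collapsed: true,
      items: [
        category("Chains", "modules/chains", "modules/chains/index"),
        category("Memory", "modules/memory", "modules/memory/index"),
        category("Callbacks", "modules/callbacks", "modules/callbacks/index"),
      ],
    },
  ],
  link: { type: "doc", id: "modules/index" },
};
```

The net effect matches the updated docs pages above: the sidebar now expands by default and surfaces Model I/O, Retrieval, and Agents, while the remaining modules sit one level down.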
