docs: clean up init_chat_model (#26551)

This commit is contained in:
Bagatur 2024-09-16 15:08:22 -07:00 committed by GitHub
parent 3bcd641bc1
commit 99abd254fb
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
4 changed files with 1337 additions and 1205 deletions


@@ -15,43 +15,15 @@
"\n",
"Make sure you have the integration packages installed for any model providers you want to support. E.g. you should have `langchain-openai` installed to init an OpenAI model.\n",
"\n",
":::\n",
"\n",
-":::info Requires ``langchain >= 0.2.8``\n",
-"\n",
-"This functionality was added in ``langchain-core == 0.2.8``. Please make sure your package is up to date.\n",
-"\n",
-":::"
]
},
{
"cell_type": "code",
-"execution_count": 1,
+"execution_count": null,
"id": "165b0de6-9ae3-4e3d-aa98-4fc8a97c4a06",
-"metadata": {
-"execution": {
-"iopub.execute_input": "2024-09-10T20:22:32.858670Z",
-"iopub.status.busy": "2024-09-10T20:22:32.858278Z",
-"iopub.status.idle": "2024-09-10T20:22:33.009452Z",
-"shell.execute_reply": "2024-09-10T20:22:33.007022Z"
-}
-},
-"outputs": [
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"zsh:1: 0.2.8 not found\r\n"
-]
-},
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"Note: you may need to restart the kernel to use updated packages.\n"
-]
-}
-],
+"metadata": {},
+"outputs": [],
"source": [
"%pip install -qU langchain>=0.2.8 langchain-openai langchain-anthropic langchain-google-vertexai"
]
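Aside: the removed cell output above (`zsh:1: 0.2.8 not found`) is a shell quoting issue, not a packaging one. Left unquoted, zsh parses `>=0.2.8` as a redirection rather than part of the requirement specifier. Quoting the specifier avoids it (install-command fragment, not run here):

```
pip install -qU "langchain>=0.2.8" langchain-openai langchain-anthropic langchain-google-vertexai
```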


@@ -98,48 +98,40 @@ def init_chat_model(
Must have the integration package corresponding to the model provider installed.
-.. versionadded:: 0.2.7
-.. versionchanged:: 0.2.8
-Support for ``configurable_fields`` and ``config_prefix`` added.
-.. versionchanged:: 0.2.12
-Support for Ollama via langchain-ollama package added. Previously
-langchain-community version of Ollama (now deprecated) was installed by default.
Args:
model: The name of the model, e.g. "gpt-4o", "claude-3-opus-20240229".
model_provider: The model provider. Supported model_provider values and the
corresponding integration package:
- openai (langchain-openai)
- anthropic (langchain-anthropic)
- azure_openai (langchain-openai)
- google_vertexai (langchain-google-vertexai)
- google_genai (langchain-google-genai)
- bedrock (langchain-aws)
- cohere (langchain-cohere)
- fireworks (langchain-fireworks)
- together (langchain-together)
- mistralai (langchain-mistralai)
- huggingface (langchain-huggingface)
- groq (langchain-groq)
- ollama (langchain-ollama) [support added in langchain==0.2.12]
Will attempt to infer model_provider from model if not specified. The
following providers will be inferred based on these model prefixes:
- gpt-3... or gpt-4... -> openai
- claude... -> anthropic
- amazon.... -> bedrock
- gemini... -> google_vertexai
- command... -> cohere
- accounts/fireworks... -> fireworks
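The prefix-to-provider inference above can be sketched as a plain lookup. This is a hypothetical helper for illustration only; the library's internal implementation may differ:

```python
def infer_model_provider(model: str):
    """Illustrative sketch of prefix-based provider inference."""
    prefixes = [
        (("gpt-3", "gpt-4"), "openai"),
        (("claude",), "anthropic"),
        (("amazon.",), "bedrock"),
        (("gemini",), "google_vertexai"),
        (("command",), "cohere"),
        (("accounts/fireworks",), "fireworks"),
    ]
    for names, provider in prefixes:
        # str.startswith accepts a tuple of candidate prefixes
        if model.startswith(names):
            return provider
    return None  # caller must pass model_provider explicitly
```

When no prefix matches, `init_chat_model` raises rather than guessing, which is why `model_provider` must be given explicitly for unrecognized model names.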
configurable_fields: Which model parameters are
configurable:
- None: No configurable fields.
- "any": All fields are configurable. *See Security Note below.*
- Union[List[str], Tuple[str, ...]]: Specified fields are configurable.
Fields are assumed to have config_prefix stripped if there is a
config_prefix. If model is specified, then defaults to None. If model is
@@ -168,7 +160,9 @@ def init_chat_model(
ValueError: If model_provider cannot be inferred or isn't supported.
ImportError: If the model provider integration package is not installed.
-Initialize non-configurable models:
+.. dropdown:: Init non-configurable model
+:open:
.. code-block:: python
# pip install langchain langchain-openai langchain-anthropic langchain-google-vertexai
@@ -183,7 +177,8 @@ def init_chat_model(
gemini_15.invoke("what's your name")
-Create a partially configurable model with no default model:
+.. dropdown:: Partially configurable model with no default
.. code-block:: python
# pip install langchain langchain-openai langchain-anthropic
@@ -204,7 +199,8 @@ def init_chat_model(
)
# claude-3.5 sonnet response
-Create a fully configurable model with a default model and a config prefix:
+.. dropdown:: Fully configurable model with a default
.. code-block:: python
# pip install langchain langchain-openai langchain-anthropic
@@ -233,7 +229,8 @@ def init_chat_model(
)
# Claude-3.5 sonnet response with temperature 0.6
-Bind tools to a configurable model:
+.. dropdown:: Bind tools to a configurable model
You can call any ChatModel declarative methods on a configurable model in the
same way that you would with a normal model.
@@ -270,6 +267,18 @@ def init_chat_model(
config={"configurable": {"model": "claude-3-5-sonnet-20240620"}}
)
# Claude-3.5 sonnet response with tools
+.. versionadded:: 0.2.7
+.. versionchanged:: 0.2.8
+Support for ``configurable_fields`` and ``config_prefix`` added.
+.. versionchanged:: 0.2.12
+Support for Ollama via langchain-ollama package added. Previously
+langchain-community version of Ollama (now deprecated) was installed by default.
""" # noqa: E501
if not model and not configurable_fields:
configurable_fields = ("model", "model_provider")
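The guard above can be restated on its own: when neither a fixed model nor explicit configurable fields are given, the model name and provider become runtime-configurable by default. An illustrative restatement of just that guard, not the library's full logic:

```python
def default_configurable_fields(model=None, configurable_fields=None):
    # Mirrors the guard in init_chat_model: with no fixed model and no
    # explicit fields, "model" and "model_provider" become configurable.
    if not model and not configurable_fields:
        configurable_fields = ("model", "model_provider")
    return configurable_fields
```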

poetry.lock (generated): 2414 changed lines; diff suppressed because it is too large.


@@ -9,14 +9,13 @@ repository = "https://www.github.com/langchain-ai/langchain"
[tool.poetry.dependencies]
-python = ">=3.8.1,<4.0"
+python = ">=3.9,<4.0"
[tool.poetry.group.docs.dependencies]
-autodoc_pydantic = "^1"
-sphinx = "^7"
-myst-parser = "^3"
-sphinx-autobuild = "^2021"
-pydata-sphinx-theme = "^0.14"
+autodoc_pydantic = "^2"
+sphinx = ">=7"
+sphinx-autobuild = ">=2024"
+pydata-sphinx-theme = ">=0.15"
toml = "^0.10.2"
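The dependency bumps above mix Poetry's caret (`^`) specifiers with plain `>=` ranges. A caret requirement allows upgrades up to, but not including, the next breaking version: it bumps the leftmost non-zero component and zeroes the rest, so `^7` means `>=7,<8` while `^0.14` means `>=0.14,<0.15`. A sketch of that rule, assuming Poetry's documented caret semantics:

```python
def caret_upper_bound(version: str) -> str:
    """Exclusive upper bound implied by a Poetry caret requirement,
    e.g. ^0.14 -> 0.15, ^1.2.3 -> 2.0.0."""
    parts = [int(p) for p in version.split(".")]
    for i, p in enumerate(parts):
        if p != 0:
            # Bump the leftmost non-zero component, zero everything after it.
            bumped = parts[:i] + [p + 1] + [0] * (len(parts) - i - 1)
            return ".".join(str(n) for n in bumped)
    # All components zero: bump the last component.
    return ".".join(str(n) for n in parts[:-1] + [parts[-1] + 1])
```

This is why `pydata-sphinx-theme = "^0.14"` would never pick up 0.15.x, and the switch to `>=0.15` style constraints removes those upper bounds entirely.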
[tool.poetry.group.lint.dependencies]