langchain[patch]: init_chat_model() to import ChatOllama from langchain-ollama and fallback on langchain-community (#24821)

Description: init_chat_model() should import ChatOllama from
`langchain-ollama`. If that fails, fall back to `langchain-community`.
Jerron Lim 2024-07-31 02:16:10 +08:00 committed by GitHub
parent 3a7f3d46c3
commit d8f3ea82db


@@ -118,7 +118,7 @@ def init_chat_model(
         - mistralai (langchain-mistralai)
         - huggingface (langchain-huggingface)
         - groq (langchain-groq)
-        - ollama (langchain-community)
+        - ollama (langchain-ollama)

     Will attempt to infer model_provider from model if not specified. The
     following providers will be inferred based on these model prefixes:
@@ -336,8 +336,12 @@ def _init_chat_model_helper(
         return ChatFireworks(model=model, **kwargs)
     elif model_provider == "ollama":
-        _check_pkg("langchain_community")
-        from langchain_community.chat_models import ChatOllama
+        try:
+            _check_pkg("langchain_ollama")
+            from langchain_ollama import ChatOllama
+        except ImportError:
+            _check_pkg("langchain_community")
+            from langchain_community.chat_models import ChatOllama
         return ChatOllama(model=model, **kwargs)
     elif model_provider == "together":
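The try/except import fallback used in this patch can be sketched in isolation. The snippet below is a minimal, generalized version of the pattern, not the actual langchain helper: `_check_pkg` is re-implemented here with stdlib `importlib`, and `import_with_fallback` is a hypothetical name introduced for illustration.

```python
import importlib
import importlib.util


def _check_pkg(pkg: str) -> None:
    """Raise ImportError if `pkg` is not importable (stand-in for the
    helper of the same name in langchain)."""
    if importlib.util.find_spec(pkg) is None:
        raise ImportError(f"Unable to import {pkg}. Please install it first.")


def import_with_fallback(preferred: str, fallback: str, attr: str):
    """Return `attr` from the `preferred` package if it is installed,
    otherwise from the `fallback` package -- the same shape as preferring
    langchain_ollama over langchain_community.chat_models."""
    try:
        _check_pkg(preferred)
        module = importlib.import_module(preferred)
    except ImportError:
        _check_pkg(fallback)
        module = importlib.import_module(fallback)
    return getattr(module, attr)
```

For example, `import_with_fallback("some_missing_pkg", "json", "dumps")` falls through to the stdlib `json` module and returns its `dumps` function. Catching `ImportError` (rather than checking versions) keeps the old `langchain-community` path working for users who have not installed the new dedicated package.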