mirror of https://github.com/hwchase17/langchain
community[patch]: Added support for Ollama's num_predict option in ChatOllama (#16633)
Just a simple default addition to the options payload for an Ollama generate call to support a max_new_tokens parameter. Should fix issue: https://github.com/langchain-ai/langchain/issues/14715
parent
6a75ef74ca
commit
6543e585a5