mirror of https://github.com/hwchase17/langchain
add kwargs support for Baseten models (#8091)
This bugfix PR adds kwargs support to Baseten model invocations so that, for example, the following script works properly:

```python
chatgpt_chain = LLMChain(
    llm=Baseten(model="MODEL_ID"),
    prompt=prompt,
    verbose=False,
    memory=ConversationBufferWindowMemory(k=2),
    llm_kwargs={"max_length": 4096},
)
```
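The pattern this fix enables can be illustrated with a minimal, self-contained sketch. This is not LangChain's or Baseten's actual implementation; the `BasetenLike` and `FakeModel` classes are hypothetical stand-ins showing how stored `llm_kwargs` get merged into each model invocation, with call-time kwargs taking precedence:

```python
class FakeModel:
    """Hypothetical stand-in for a remote Baseten deployment; echoes its request."""

    def predict(self, request):
        return {"echo": request}


class BasetenLike:
    """Minimal wrapper: stores default kwargs at construction time and
    merges them into every invocation, as the fix allows via llm_kwargs."""

    def __init__(self, model, **default_kwargs):
        self.model = model
        self.default_kwargs = default_kwargs

    def __call__(self, prompt, **kwargs):
        # Call-time kwargs override the stored defaults.
        request = {"prompt": prompt, **self.default_kwargs, **kwargs}
        return self.model.predict(request)


llm = BasetenLike(FakeModel(), max_length=4096)
result = llm("Hello")
# The forwarded request now carries max_length alongside the prompt.
print(result)
```

The double-merge (`**self.default_kwargs, **kwargs`) is the key design point: defaults set once on the wrapper apply to every call, while any kwarg passed at invocation time wins.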
parent 8dcabd9205
commit 95bcf68802