mirror of https://github.com/hwchase17/langchain
improves huggingface_hub example (#988)
The provided example uses the default `max_length` of `20` tokens, which causes the example generation to get cut off. 20 tokens is far too short to show chain-of-thought (CoT) reasoning, so I raised it to `64`. Without knowing the Hugging Face API well, it can be hard to figure out where those `model_kwargs` come from, and `max_length` is a critical one.
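A minimal sketch of the pattern this commit adjusts: passing `max_length` through `model_kwargs` when constructing langchain's `HuggingFaceHub` wrapper. The `repo_id` below is an assumption for illustration; the actual example may use a different model.

```python
# max_length caps the number of generated tokens; the Hub default of 20
# truncates multi-step (CoT) answers, so the example bumps it to 64.
model_kwargs = {"temperature": 0.5, "max_length": 64}

# Usage (requires the langchain package and a Hugging Face API token;
# the repo_id here is an illustrative assumption):
# from langchain import HuggingFaceHub
# llm = HuggingFaceHub(repo_id="google/flan-t5-xl", model_kwargs=model_kwargs)
# print(llm("What is the capital of France? Think step by step."))
```

These `model_kwargs` are forwarded to the Hugging Face Inference API as generation parameters, which is why they are not documented on the wrapper itself.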
parent: c2d1d903fa
commit: e9799d6821