mirror of https://github.com/hwchase17/langchain (synced 2024-10-29 17:07:25 +00:00)
Commit e9799d6821
The provided example uses the default `max_length` of 20 tokens, which causes the example generation to get cut off. 20 tokens is far too short to show chain-of-thought (CoT) reasoning, so I raised it to 64. Without knowing the Hugging Face API well, it can be hard to figure out where those `model_kwargs` come from, and `max_length` is a critical one.
Files in this directory:
- azure_openai_example.ipynb
- huggingface_hub.ipynb
- manifest.ipynb