mirror of https://github.com/hwchase17/langchain
llamacpp: wrong default value passed for `f16_kv` (#3320)
Fixes the default `f16_kv` value in the llamacpp wrapper, which was being
passed with an incorrect default that diverged from the upstream library.
See:
ba3959eafd/llama_cpp/llama.py (L33)
Fixes #3241
Fixes #3301
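A minimal sketch of the kind of mismatch this commit fixes: a wrapper that collects keyword arguments for `llama_cpp.Llama` but declares its own default for `f16_kv` that differs from the upstream default (`True`, per the referenced `llama.py` line). The function and constant names here are illustrative, not langchain's actual code.

```python
# Upstream default as documented in llama_cpp/llama.py (L33): f16_kv=True.
LLAMA_CPP_DEFAULT_F16_KV = True


def build_model_kwargs(f16_kv: bool = True, n_ctx: int = 512) -> dict:
    """Collect kwargs to forward to llama_cpp.Llama.

    Hypothetical helper: before the fix, the wrapper defaulted f16_kv
    to False, silently overriding the library's own default whenever
    the caller did not set it explicitly.
    """
    return {"f16_kv": f16_kv, "n_ctx": n_ctx}


# With the corrected default, an unconfigured wrapper now matches upstream.
kwargs = build_model_kwargs()
assert kwargs["f16_kv"] == LLAMA_CPP_DEFAULT_F16_KV
```

The point of the fix is simply that wrapper defaults should mirror the wrapped library's defaults unless there is a deliberate reason to diverge.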
parent
3a1bdce3f5
commit
77bb6c99f7