mirror of https://github.com/hwchase17/langchain
set default embedding max token size (#2330)
#991 already implemented this convenient feature to prevent exceeding the max token limit of the embedding model.

> By default, this function is deactivated so as not to change the previous behavior. If you specify something like 8191 here, it will work as desired.

According to the author, this is not set by default. The default model in OpenAIEmbeddings has a max token size of 8191, and no other OpenAI model has a larger token limit, so I believe it is better to make this the default value; otherwise users may encounter this error and find it hard to solve.
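The idea behind the feature can be sketched as follows: before embedding, split the tokenized input into pieces that each fit within the model's context length. This is a minimal illustration only; the constant mirrors the 8191-token limit mentioned above, but the function name and signature are illustrative assumptions, not the actual langchain/OpenAIEmbeddings API.

```python
# Illustrative sketch (not the real langchain API): split a token
# sequence into chunks that respect the embedding model's token limit.
EMBEDDING_CTX_LENGTH = 8191  # max tokens for OpenAI's embedding models, per the commit


def chunk_tokens(tokens, max_len=EMBEDDING_CTX_LENGTH):
    """Split a token sequence into pieces no longer than max_len."""
    return [tokens[i:i + max_len] for i in range(0, len(tokens), max_len)]


# Each chunk is embedded separately; the per-chunk embeddings can then
# be combined (e.g. averaged) into a single vector for the full input.
```

With the limit set by default, inputs longer than 8191 tokens are chunked automatically instead of raising an API error.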
parent: 0316900d2f
commit: e131156805