forked from Archives/langchain
fix: token counting for chat openai. (#2543)
I noticed that the value returned by `get_num_tokens_from_messages` in `ChatOpenAI`
is always one less than the count reported by OpenAI's API. Upon checking
the official documentation, I found that it had been updated, so I made
the corresponding correction here.
The method now returns the same value as OpenAI's API.
d972e7482e (diff-2d4485035b3a3469802dbad11d7b4f834df0ea0e2790f418976b303bc82c1874L474)
parent: 8cded3fdad
commit: c9f93f5f74
@@ -400,5 +400,5 @@ class ChatOpenAI(BaseChatModel):
                 if key == "name":
                     num_tokens += tokens_per_name
         # every reply is primed with <im_start>assistant
-        num_tokens += 2
+        num_tokens += 3
         return num_tokens
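For context, the counting logic this diff touches can be sketched as below. This is a hedged, self-contained illustration, not the actual `ChatOpenAI` implementation: a whitespace tokenizer stands in for tiktoken, and the `tokens_per_message`/`tokens_per_name` defaults are illustrative assumptions, not values taken from this diff. Only the final `num_tokens += 3` reflects the change in this commit.

```python
def count_tokens(text: str) -> int:
    # Stand-in tokenizer for illustration only; the real code uses
    # len(encoding.encode(text)) with a tiktoken encoding.
    return len(text.split())


def get_num_tokens_from_messages(
    messages: list[dict],
    tokens_per_message: int = 4,  # illustrative assumption
    tokens_per_name: int = 1,     # illustrative assumption
) -> int:
    num_tokens = 0
    for message in messages:
        num_tokens += tokens_per_message
        for key, value in message.items():
            num_tokens += count_tokens(value)
            if key == "name":
                num_tokens += tokens_per_name
    # every reply is primed with <im_start>assistant; this commit
    # changed the priming constant from 2 to 3 to match OpenAI's docs
    num_tokens += 3
    return num_tokens
```

With the old constant of 2, every result came out one token short of the API's reported usage, which is exactly the off-by-one the commit message describes.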