fix: token counting for chat openai. (#2543)

I noticed that the value of get_num_tokens_from_messages in `ChatOpenAI`
is always one less than the response from OpenAI's API. Upon checking
the official documentation, I found that it had been updated, so I made
the necessary corrections.
Now I get the same value as the one reported by OpenAI's API.


d972e7482e (diff-2d4485035b3a3469802dbad11d7b4f834df0ea0e2790f418976b303bc82c1874L474)
tmyjoe 2023-04-07 23:27:03 +09:00 committed by GitHub
parent 8cded3fdad
commit c9f93f5f74

@@ -400,5 +400,5 @@ class ChatOpenAI(BaseChatModel):
                 if key == "name":
                     num_tokens += tokens_per_name
         # every reply is primed with <im_start>assistant
-        num_tokens += 2
+        num_tokens += 3
         return num_tokens
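To see why the constant matters, here is a minimal, runnable sketch of the per-message accounting the diff touches. The real `ChatOpenAI.get_num_tokens_from_messages` uses a `tiktoken` encoder; the `encode` function below is a whitespace-splitting stand-in (an assumption, not the real tokenizer) so the arithmetic can be followed on its own. The constants `tokens_per_message = 4` and `tokens_per_name = -1` follow the convention in OpenAI's token-counting documentation for early `gpt-3.5-turbo` models.

```python
def encode(text: str) -> list[str]:
    # Stand-in tokenizer for illustration only; the real implementation
    # would use tiktoken.encoding_for_model(model_name).encode(text).
    return text.split()


def num_tokens_from_messages(messages: list[dict]) -> int:
    tokens_per_message = 4  # each message is wrapped as <im_start>{role}\n{content}<im_end>\n
    tokens_per_name = -1    # a "name" field replaces the role, saving one token
    num_tokens = 0
    for message in messages:
        num_tokens += tokens_per_message
        for key, value in message.items():
            num_tokens += len(encode(value))
            if key == "name":
                num_tokens += tokens_per_name
    # Every reply is primed with <im_start>assistant; this commit bumps
    # the priming constant from 2 to 3 so the total matches the usage
    # figure returned by OpenAI's API.
    num_tokens += 3
    return num_tokens
```

With the old constant of 2, every call under-counted by exactly one token relative to the API's reported usage, which is the off-by-one the commit message describes.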