Allow no token limit for ChatGPT API (#1481)

The chat endpoint defaults to no limit (inf) when max_tokens is omitted, so unlike
the regular completions API, we don't need to calculate a value based on the
prompt length.
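
For context, a minimal sketch of the underlying behavior (an assumption based on the pre-1.0 `openai` Python client, not part of this change): if the request simply leaves out max_tokens, the chat endpoint generates until a stop sequence or the model's context window ends the reply.

```python
# Sketch, assuming the legacy (pre-1.0) openai Python client.
# Omitting max_tokens leaves the response length effectively unbounded server-side.
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Write a haiku about the sea."}],
    # no max_tokens here -> endpoint default applies (no explicit limit)
)
print(response["choices"][0]["message"]["content"])
```
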
kahkeng 1 year ago committed by GitHub
parent 312c319d8b
commit df6865cd52

@@ -619,6 +619,9 @@ class OpenAIChat(BaseLLM, BaseModel):
             if "stop" in params:
                 raise ValueError("`stop` found in both the input and default params.")
             params["stop"] = stop
+        if params.get("max_tokens") == -1:
+            # for ChatGPT api, omitting max_tokens is equivalent to having no limit
+            del params["max_tokens"]
         return messages, params
 
     def _generate(
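
A hypothetical usage sketch, not part of the diff (the import path and the model_kwargs route are assumptions from the LangChain wrapper of that era): passing max_tokens=-1 makes _get_chat_params drop the key, so the request goes out without max_tokens and the endpoint's default of no limit applies.

```python
# Sketch, assuming the OpenAIChat wrapper as of this commit.
# max_tokens=-1 acts as a "no limit" sentinel: the key is deleted before the
# request is sent, so the ChatGPT endpoint uses its own default.
from langchain.llms import OpenAIChat

llm = OpenAIChat(model_name="gpt-3.5-turbo", model_kwargs={"max_tokens": -1})
print(llm("Summarize the plot of Hamlet in one paragraph."))
```
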
