Mirror of https://github.com/hwchase17/langchain (synced 2024-11-08 07:10:35 +00:00)
fix: prevent adding an empty string to the result queue in AsyncIteratorCallbackHandler (#7180)
- Description: Modify `AsyncIteratorCallbackHandler.on_llm_new_token` so that it does not add an empty string to the result queue.
- Tag maintainer: @agola11

When using `AsyncIteratorCallbackHandler` with `OpenAIFunctionsAgent`, if the LLM responds with a `function_call` instead of a direct answer, `AsyncIteratorCallbackHandler.on_llm_new_token` is called with an empty string. See also: `langchain.chat_models.openai.ChatOpenAI._generate`.

An alternative solution would be to modify `langchain.chat_models.openai.ChatOpenAI._generate` so that it does not call `run_manager.on_llm_new_token` when the token is an empty string. I am not sure which solution is better. @hwchase17

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
Parent commit: db98c44f8f
This commit: 8045870a0f
```diff
@@ -31,6 +31,7 @@ class AsyncIteratorCallbackHandler(AsyncCallbackHandler):
         self.done.clear()

     async def on_llm_new_token(self, token: str, **kwargs: Any) -> None:
-        self.queue.put_nowait(token)
+        if token is not None and token != "":
+            self.queue.put_nowait(token)

     async def on_llm_end(self, response: LLMResult, **kwargs: Any) -> None:
```
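To illustrate the behavior this patch guards against, here is a minimal, self-contained sketch of a queue-based token handler in the same style. It is not the actual langchain class; `TokenStreamHandler` and `main` are hypothetical names used only for this example. It shows how empty tokens, such as those emitted during a `function_call` response, are skipped instead of being enqueued.

```python
import asyncio


class TokenStreamHandler:
    """Minimal sketch (not the real AsyncIteratorCallbackHandler) of a
    queue-based token callback that mirrors the patched behavior."""

    def __init__(self) -> None:
        self.queue: asyncio.Queue = asyncio.Queue()
        self.done = asyncio.Event()

    async def on_llm_new_token(self, token: str, **kwargs) -> None:
        # The fix: only enqueue non-empty tokens.
        if token is not None and token != "":
            self.queue.put_nowait(token)

    async def on_llm_end(self, **kwargs) -> None:
        self.done.set()


async def main() -> list:
    handler = TokenStreamHandler()
    # Simulate a function_call response: empty tokens mixed with text.
    for token in ["", "Hello", "", " world"]:
        await handler.on_llm_new_token(token)
    await handler.on_llm_end()
    received = []
    while not handler.queue.empty():
        received.append(handler.queue.get_nowait())
    return received


print(asyncio.run(main()))  # ['Hello', ' world']
```

Without the `if` guard, the empty strings would be placed on the queue and surface as spurious items to whoever is iterating over the stream.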