mirror of https://github.com/hwchase17/langchain
fix: prevent adding an empty string to the result queue in AsyncIteratorCallbackHandler (#7180)
- Description: Modify AsyncIteratorCallbackHandler.on_llm_new_token so that it does not add an empty string to the result queue.
- Tag maintainer: @agola11

When using AsyncIteratorCallbackHandler with OpenAIFunctionsAgent, if the LLM responds with a function_call instead of a direct answer, AsyncIteratorCallbackHandler.on_llm_new_token is called with an empty string (see also langchain.chat_models.openai.ChatOpenAI._generate). An alternative solution would be to modify langchain.chat_models.openai.ChatOpenAI._generate so that run_manager.on_llm_new_token is not called when the token is an empty string. I am not sure which solution is better. @hwchase17

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
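A minimal sketch of the idea behind the fix (not the actual LangChain source; the class here is a simplified stand-in that only models the token queue):

```python
import asyncio


class AsyncIteratorCallbackHandler:
    """Simplified stand-in for langchain's streaming callback handler:
    tokens are pushed onto an asyncio.Queue for an async iterator to consume."""

    def __init__(self) -> None:
        self.queue: asyncio.Queue[str] = asyncio.Queue()

    async def on_llm_new_token(self, token: str, **kwargs) -> None:
        # When the LLM returns a function_call instead of a direct answer,
        # this callback can be invoked with an empty string. Skip those
        # tokens so consumers of the queue never see empty chunks.
        if token is not None and token != "":
            self.queue.put_nowait(token)
```

With this guard, a function_call response (which streams `""` tokens) leaves the queue untouched, while normal answer tokens are enqueued as before.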
parent: db98c44f8f
commit: 8045870a0f