Fix an unusual issue that occurs when using OpenAIChat for llm_math (#1410)
Fix an issue that occurs when using OpenAIChat for llm_math, following the code style of the "Final Answer:" handling in Mrkl. I found this issue when trying OpenAIChat with llm_math: when I ask the question in Chinese, the model generates output in a format like "\n\nQuestion: What is the square of 29?\nAnswer: 841", i.e. it first translates the question and then answers it, so the completion no longer starts with "Answer:". Below is my screenshot: <img width="945" alt="snapshot" src="https://user-images.githubusercontent.com/82029664/222642193-10ecca77-db7b-4759-bc46-32a8f8ddc48f.png">
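For illustration, here is a minimal standalone sketch (the helper `extract_answer` and the sample string are hypothetical and not part of the commit) showing why the existing `startswith("Answer:")` check misses such output and how the added substring branch recovers the answer:

```python
# Hypothetical, standalone reproduction of the parsing change; not the actual chain code.
def extract_answer(t: str) -> str:
    """Mimic the plain-text answer parsing in LLMMathChain."""
    t = t.strip()
    if t.startswith("Answer:"):
        return t
    elif "Answer:" in t:
        # OpenAIChat may restate or translate the question first, e.g.
        # "Question: What is the square of 29?\nAnswer: 841",
        # so keep only what follows the last "Answer:".
        return "Answer: " + t.split("Answer:")[-1]
    raise ValueError(f"unknown format from LLM: {t}")


output = "\n\nQuestion: What is the square of 29?\nAnswer: 841"
print(extract_answer(output))  # -> "Answer:  841" (extra space comes from the split remainder)
```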
parent b8a7828d1f
commit 68ce68f290
@@ -62,6 +62,8 @@ class LLMMathChain(Chain, BaseModel):
             answer = "Answer: " + output
         elif t.startswith("Answer:"):
             answer = t
+        elif "Answer:" in t:
+            answer = "Answer: " + t.split("Answer:")[-1]
         else:
             raise ValueError(f"unknown format from LLM: {t}")
         return {self.output_key: answer}