ignore the first turn to apply "history" mechanism (#14118)
Applying the "history" mechanism on the very first turn generates a meaningless "system: " string in the chat history used to build the condensed question; this increases the probability of producing an improper condensed question that misunderstands the user's question. Below is a case:

- Original question: Can you explain the arguments of Meilisearch?
- Condensed question:
  - What are the benefits of using Meilisearch? (by CodeLlama)
  - What are the reasons for using Meilisearch? (by GPT-4)

The condensed questions, whether from CodeLlama or GPT-4, differ from the original one.

The fix checks the content of each dialogue turn and appends to the history string only when the content is not empty. Since there is nothing before the first turn, the "history" mechanism is effectively ignored on the very first turn. With this change, the condensed question becomes "What are the arguments for using Meilisearch?".

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com>
Co-authored-by: ccurme <chester.curme@gmail.com>
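A minimal, self-contained sketch of the issue and the fix (the `Msg` class, the `_ROLE_MAP` contents, and the `skip_empty` flag are hypothetical stand-ins for langchain's `BaseMessage`, role map, and the patched behavior):

```python
from dataclasses import dataclass


@dataclass
class Msg:
    type: str      # e.g. "system", "human", "ai"
    content: str


# Stand-in for langchain's _ROLE_MAP; unknown types fall back to "{type}: ".
_ROLE_MAP = {"human": "Human: ", "ai": "Assistant: "}


def get_chat_history(turns, skip_empty=True):
    buffer = ""
    for turn in turns:
        if skip_empty and len(turn.content) == 0:
            continue  # the fix: an empty turn contributes nothing to the history
        prefix = _ROLE_MAP.get(turn.type, f"{turn.type}: ")
        buffer += f"\n{prefix}{turn.content}"
    return buffer


turns = [
    Msg("system", ""),  # empty first turn
    Msg("human", "Can you explain the arguments of Meilisearch?"),
]

print(repr(get_chat_history(turns, skip_empty=False)))  # '\nsystem: \nHuman: ...' (stray "system: " line)
print(repr(get_chat_history(turns)))                    # '\nHuman: ...' (empty first turn ignored)
```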
parent 236e957abb
commit df357f82ca
@@ -42,8 +42,11 @@ def _get_chat_history(chat_history: List[CHAT_TURN_TYPE]) -> str:
     buffer = ""
     for dialogue_turn in chat_history:
         if isinstance(dialogue_turn, BaseMessage):
-            role_prefix = _ROLE_MAP.get(dialogue_turn.type, f"{dialogue_turn.type}: ")
-            buffer += f"\n{role_prefix}{dialogue_turn.content}"
+            if len(dialogue_turn.content) > 0:
+                role_prefix = _ROLE_MAP.get(
+                    dialogue_turn.type, f"{dialogue_turn.type}: "
+                )
+                buffer += f"\n{role_prefix}{dialogue_turn.content}"
         elif isinstance(dialogue_turn, tuple):
             human = "Human: " + dialogue_turn[0]
             ai = "Assistant: " + dialogue_turn[1]
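A hedged usage sketch of the patched helper; the import paths assume the langchain module layout at the time of this change (`_get_chat_history` is a private helper of `ConversationalRetrievalChain`, not a public API) and may differ in newer releases:

```python
from langchain.chains.conversational_retrieval.base import _get_chat_history
from langchain.schema import AIMessage, HumanMessage, SystemMessage

chat_history = [
    SystemMessage(content=""),  # empty first turn
    HumanMessage(content="Can you explain the arguments of Meilisearch?"),
    AIMessage(content="(assistant reply about Meilisearch's launch arguments)"),
]

# With this patch the empty SystemMessage adds nothing to the buffer, so the
# condense-question prompt no longer begins with a stray "system: " line.
print(_get_chat_history(chat_history))
```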