experimental[patch]: Fix LLM graph transformer default prompt (#18856)

Some LLMs do not allow multiple user messages in sequence.
Tomaz Bratanic, 4 months ago (committed by GitHub)
parent 19721246f5
commit cda43c5a11

```diff
@@ -52,14 +52,12 @@ default_prompt = ChatPromptTemplate.from_messages(
         (
             "human",
             (
+                "Tip: Make sure to answer in the correct format and do "
+                "not include any explanations. "
                 "Use the given format to extract information from the "
                 "following input: {input}"
             ),
         ),
-        (
-            "human",
-            "Tip: Make sure to answer in the correct format and do not include any ",
-        ),
     ]
 )
```
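The patch works around the provider constraint by folding the formatting tip into the one existing human message instead of sending it as a second, back-to-back human message. The general technique can be sketched with plain Python and no LangChain dependency; `merge_consecutive_roles` is a hypothetical helper, not part of LangChain's API, and the message texts below are illustrative:

```python
def merge_consecutive_roles(messages):
    """Collapse runs of same-role messages into one message, for chat
    providers that reject two consecutive messages with the same role."""
    merged = []
    for role, content in messages:
        if merged and merged[-1][0] == role:
            # Same role as the previous message: concatenate the contents
            # rather than emitting a second message with that role.
            prev_role, prev_content = merged[-1]
            merged[-1] = (prev_role, prev_content + " " + content)
        else:
            merged.append((role, content))
    return merged


msgs = [
    ("system", "Extract a knowledge graph from the text."),
    ("human", "Tip: Make sure to answer in the correct format and do "
              "not include any explanations."),
    ("human", "Use the given format to extract information from the "
              "following input: {input}"),
]
print(merge_consecutive_roles(msgs))  # two messages: one system, one human
```

The commit applies the same idea statically: rather than merging at runtime, the default prompt template is rewritten so the tip and the extraction instruction are a single human message from the start.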
