experimental[patch]: Fix LLM graph transformer default prompt (#18856)

Some LLMs do not allow multiple user messages in sequence.
Tomaz Bratanic 7 months ago committed by GitHub
parent 19721246f5
commit cda43c5a11
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194

@@ -52,14 +52,12 @@ default_prompt = ChatPromptTemplate.from_messages(
         (
             "human",
             (
+                "Tip: Make sure to answer in the correct format and do "
+                "not include any explanations. "
                 "Use the given format to extract information from the "
                 "following input: {input}"
             ),
         ),
-        (
-            "human",
-            "Tip: Make sure to answer in the correct format and do not include any ",
-        ),
     ]
 )
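To see why the patch matters, here is a minimal, library-free sketch (plain tuples standing in for the `ChatPromptTemplate` messages; the validator function is hypothetical, not part of LangChain): before the patch the template ended with two `"human"` messages back to back, which providers that enforce strict user/assistant alternation reject; after the patch the tip text is folded into the single `"human"` message.

```python
# Before the patch: two "human" messages in a row, which some chat
# providers reject because they require alternating roles.
before = [
    ("human", "Use the given format to extract information from the "
              "following input: {input}"),
    ("human", "Tip: Make sure to answer in the correct format and do "
              "not include any explanations."),
]

# After the patch: the tip is folded into the one human message.
after = [
    ("human", "Tip: Make sure to answer in the correct format and do "
              "not include any explanations. "
              "Use the given format to extract information from the "
              "following input: {input}"),
]

def has_repeated_role(messages):
    """Return True if any two adjacent messages share the same role."""
    return any(a[0] == b[0] for a, b in zip(messages, messages[1:]))

print(has_repeated_role(before))  # True: consecutive user messages
print(has_repeated_role(after))   # False: a single user message
```

The same end-of-prompt instruction reaches the model either way; only the message structure changes, so no provider-specific workaround is needed downstream.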
