docs: Added illustration of using RetryOutputParser with LLMChain (#16722)
**Description:** Updated the retry.ipynb notebook, which illustrates RetryOutputParser in LangChain but did not explain how the parser can be combined with existing chains. This change adds code illustrating the workflow of using RetryOutputParser with a user-defined chain.

Changes:
1. Replaced RetryWithErrorOutputParser with RetryOutputParser, matching what the markdown text already says.
2. Added code at the end of the notebook defining a chain that passes the LLM completions to the retry parser; it can be customised for user needs.

**Issue:** Since RetryOutputParser/RetryWithErrorOutputParser does not implement the parse function, it cannot be used with LLMChain directly like [this](https://python.langchain.com/docs/expression_language/cookbook/prompt_llm_parser#prompttemplate-llm-outputparser). This has also led to several still-open issues (#15133, #12175, #11719); rather than adding new features or code changes, it is best to explain clearly how to integrate LLMChain with retry parsers, with an example in the corresponding notebook.

Inspired from: https://github.com/langchain-ai/langchain/issues/15133#issuecomment-1868972580

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
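For reference, below is a minimal, self-contained sketch of the workflow this change documents. The Action model, prompt template, and the OpenAI/Pydantic import paths are assumptions standing in for the setup cells earlier in retry.ipynb (they are not part of this diff); only the retry-parser wiring at the bottom matches the code added here.

```python
# Sketch only: the Action model, prompt, and import paths below are assumed
# stand-ins for the earlier cells of retry.ipynb, not part of this diff.
from langchain.output_parsers import RetryOutputParser
from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda, RunnableParallel
from langchain_openai import OpenAI  # assumes OPENAI_API_KEY is set
from pydantic import BaseModel, Field


class Action(BaseModel):
    # Structured output the parser should produce (assumed schema).
    action: str = Field(description="action to take")
    action_input: str = Field(description="input to the action")


parser = PydanticOutputParser(pydantic_object=Action)

prompt = PromptTemplate(
    template="Answer the user query.\n{format_instructions}\n{query}\n",
    input_variables=["query"],
    partial_variables={"format_instructions": parser.get_format_instructions()},
)

# RetryOutputParser re-prompts the LLM when parsing fails, so it needs the
# original prompt value as well as the completion (hence parse_with_prompt,
# not parse).
retry_parser = RetryOutputParser.from_llm(parser=parser, llm=OpenAI(temperature=0))

completion_chain = prompt | OpenAI(temperature=0)

# Run the prompt through the LLM and, in parallel, keep the formatted prompt
# value, then hand both to the retry parser.
main_chain = RunnableParallel(
    completion=completion_chain, prompt_value=prompt
) | RunnableLambda(lambda x: retry_parser.parse_with_prompt(**x))

print(main_chain.invoke({"query": "who is leo di caprios gf?"}))
# e.g. Action(action='search', action_input='leo di caprio girlfriend')
```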
This commit is contained in:
parent
a1aa3a657c
commit
47bd58dc11
@@ -174,7 +174,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "from langchain.output_parsers import RetryWithErrorOutputParser"
+   "from langchain.output_parsers import RetryOutputParser"
   ]
  },
  {
@@ -184,9 +184,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-   "retry_parser = RetryWithErrorOutputParser.from_llm(\n",
-   "    parser=parser, llm=OpenAI(temperature=0)\n",
-   ")"
+   "retry_parser = RetryOutputParser.from_llm(parser=parser, llm=OpenAI(temperature=0))"
   ]
  },
  {
@@ -210,6 +208,41 @@
   "retry_parser.parse_with_prompt(bad_response, prompt_value)"
   ]
  },
+ {
+  "cell_type": "markdown",
+  "id": "16827256-5801-4388-b6fa-608991e29961",
+  "metadata": {},
+  "source": [
+   "We can also add the RetryOutputParser easily with a custom chain which transforms the raw LLM/ChatModel output into a more workable format."
+  ]
+ },
+ {
+  "cell_type": "code",
+  "execution_count": 1,
+  "id": "7eaff2fb-56d3-481c-99a1-a968a49d0654",
+  "metadata": {},
+  "outputs": [
+   {
+    "name": "stdout",
+    "output_type": "stream",
+    "text": [
+     "Action(action='search', action_input='leo di caprio girlfriend')\n"
+    ]
+   }
+  ],
+  "source": [
+   "from langchain_core.runnables import RunnableLambda, RunnableParallel\n",
+   "\n",
+   "completion_chain = prompt | OpenAI(temperature=0)\n",
+   "\n",
+   "main_chain = RunnableParallel(\n",
+   "    completion=completion_chain, prompt_value=prompt\n",
+   ") | RunnableLambda(lambda x: retry_parser.parse_with_prompt(**x))\n",
+   "\n",
+   "\n",
+   "main_chain.invoke({\"query\": \"who is leo di caprios gf?\"})"
+  ]
+ },
  {
   "cell_type": "code",
   "execution_count": null,
@@ -235,7 +268,7 @@
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
-   "version": "3.10.1"
+   "version": "3.9.13"
  }
 },
 "nbformat": 4,