{
"cells": [
{
"cell_type": "markdown",
"id": "b45110ef",
"metadata": {},
"source": [
"# Create a runnable with the `@chain` decorator\n",
"\n",
"You can also turn an arbitrary function into a chain by adding a `@chain` decorator. This is functionaly equivalent to wrapping in a [`RunnableLambda`](./functions).\n",
"\n",
"This will have the benefit of improved observability by tracing your chain correctly. Any calls to runnables inside this function will be traced as nested childen.\n",
"\n",
"It will also allow you to use this as any other runnable, compose it in chain, etc.\n",
"\n",
"Let's take a look at this in action!"
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "d9370420",
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.output_parsers import StrOutputParser\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"from langchain_core.runnables import chain\n",
"from langchain_openai import ChatOpenAI"
]
},
{
"cell_type": "code",
"execution_count": 17,
"id": "b7f74f7e",
"metadata": {},
"outputs": [],
"source": [
"prompt1 = ChatPromptTemplate.from_template(\"Tell me a joke about {topic}\")\n",
"prompt2 = ChatPromptTemplate.from_template(\"What is the subject of this joke: {joke}\")"
]
},
{
"cell_type": "code",
"execution_count": 18,
"id": "2b0365c4",
"metadata": {},
"outputs": [],
"source": [
"@chain\n",
"def custom_chain(text):\n",
" prompt_val1 = prompt1.invoke({\"topic\": text})\n",
" output1 = ChatOpenAI().invoke(prompt_val1)\n",
" parsed_output1 = StrOutputParser().invoke(output1)\n",
" chain2 = prompt2 | ChatOpenAI() | StrOutputParser()\n",
" return chain2.invoke({\"joke\": parsed_output1})"
]
},
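{
"cell_type": "markdown",
"id": "f3a1c2d7",
"metadata": {},
"source": [
"As noted above, the `@chain` decorator is functionally equivalent to wrapping the function in a [`RunnableLambda`](./functions). Purely as an illustrative sketch, the same chain could be built like this (the `custom_chain_func` and `custom_chain_lambda` names are introduced here for illustration only):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c4e8d5b2",
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.runnables import RunnableLambda\n",
"\n",
"\n",
"def custom_chain_func(text):\n",
"    # Identical body to custom_chain above, just without the decorator.\n",
"    prompt_val1 = prompt1.invoke({\"topic\": text})\n",
"    output1 = ChatOpenAI().invoke(prompt_val1)\n",
"    parsed_output1 = StrOutputParser().invoke(output1)\n",
"    chain2 = prompt2 | ChatOpenAI() | StrOutputParser()\n",
"    return chain2.invoke({\"joke\": parsed_output1})\n",
"\n",
"\n",
"# Wrapping the plain function yields an equivalent runnable.\n",
"custom_chain_lambda = RunnableLambda(custom_chain_func)"
]
},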
{
"cell_type": "markdown",
"id": "904d6872",
"metadata": {},
"source": [
"`custom_chain` is now a runnable, meaning you will need to use `invoke`"
]
},
{
"cell_type": "code",
"execution_count": 21,
"id": "6448bdd3",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'The subject of this joke is bears.'"
]
},
"execution_count": 21,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"custom_chain.invoke(\"bears\")"
]
},
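{
"cell_type": "markdown",
"id": "9d6b3e41",
"metadata": {},
"source": [
"Because `custom_chain` is a runnable, it also supports the other standard runnable methods (`batch`, `stream`, `ainvoke`, ...) and can be composed with other runnables like any LCEL chain. The sketch below is illustrative only; `prompt3` and `composed_chain` are names introduced here, not part of the original example:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2a7c9f58",
"metadata": {},
"outputs": [],
"source": [
"# Compose custom_chain with another prompt and model, like any other runnable.\n",
"prompt3 = ChatPromptTemplate.from_template(\"Write a short poem about {subject}\")\n",
"composed_chain = {\"subject\": custom_chain} | prompt3 | ChatOpenAI() | StrOutputParser()\n",
"composed_chain.invoke(\"bears\")"
]
},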
{
"cell_type": "markdown",
"id": "aa767ea9",
"metadata": {},
"source": [
"If you check out your LangSmith traces, you should see a `custom_chain` trace in there, with the calls to OpenAI nested underneath"
]
},
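{
"cell_type": "markdown",
"id": "5b8e2c13",
"metadata": {},
"source": [
"Tracing to LangSmith needs to be enabled before the chain is run. A minimal sketch, assuming you have a LangSmith API key (the key and project name below are placeholders):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e7f0a6d9",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"# Enable LangSmith tracing; the key and project name are placeholders.\n",
"os.environ[\"LANGCHAIN_TRACING_V2\"] = \"true\"\n",
"os.environ[\"LANGCHAIN_API_KEY\"] = \"<your-langsmith-api-key>\"\n",
"os.environ[\"LANGCHAIN_PROJECT\"] = \"chain-decorator-demo\"  # optional"
]
},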
{
"cell_type": "code",
"execution_count": null,
"id": "f1245bdc",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.1"
}
},
"nbformat": 4,
"nbformat_minor": 5
}