docs[callbacks]: update to the FileCallbackHandler documentation (#20496)

**Description:** Update to the `FileCallbackHandler` documentation
**Issue:** #20493 
**Dependencies:** None
aditya thomas 3 months ago committed by GitHub
parent cea379e7c7
commit 8bad536c6c

@@ -5,8 +5,11 @@
"id": "63b87b91",
"metadata": {},
"source": [
"# Logging to file\n",
"This example shows how to print logs to file. It shows how to use the `FileCallbackHandler`, which does the same thing as [`StdOutCallbackHandler`](/docs/modules/callbacks/#get-started), but instead writes the output to file. It also uses the `loguru` library to log other outputs that are not captured by the handler."
"# File logging\n",
"\n",
"LangChain provides the `FileCallbackHandler` to write logs to a file. It is similar to the [`StdOutCallbackHandler`](/docs/modules/callbacks/), but instead of printing logs to standard output, it writes them to a file.\n",
"\n",
"This example shows how to use the `FileCallbackHandler` alongside the `StdOutCallbackHandler`, which prints the same logs to standard output. It also uses the `loguru` library to log other outputs that are not captured by the handlers."
]
},
{
@@ -45,8 +48,7 @@
}
],
"source": [
"from langchain.callbacks import FileCallbackHandler\n",
"from langchain.chains import LLMChain\n",
"from langchain_core.callbacks import FileCallbackHandler, StdOutCallbackHandler\n",
"from langchain_core.prompts import PromptTemplate\n",
"from langchain_openai import OpenAI\n",
"from loguru import logger\n",
@@ -54,16 +56,18 @@
"logfile = \"output.log\"\n",
"\n",
"logger.add(logfile, colorize=True, enqueue=True)\n",
"handler = FileCallbackHandler(logfile)\n",
"handler_1 = FileCallbackHandler(logfile)\n",
"handler_2 = StdOutCallbackHandler()\n",
"\n",
"llm = OpenAI()\n",
"prompt = PromptTemplate.from_template(\"1 + {number} = \")\n",
"model = OpenAI()\n",
"\n",
"# this chain will both print to stdout (because verbose=True) and write to 'output.log'\n",
"# if verbose=False, the FileCallbackHandler will still write to 'output.log'\n",
"chain = LLMChain(llm=llm, prompt=prompt, callbacks=[handler], verbose=True)\n",
"answer = chain.run(number=2)\n",
"logger.info(answer)"
"chain = prompt | model\n",
"\n",
"response = chain.invoke({\"number\": 2}, {\"callbacks\": [handler_1, handler_2]})\n",
"logger.info(response)"
]
},
{
@@ -166,7 +170,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.6"
"version": "3.9.6"
}
},
"nbformat": 4,

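For reference, the pieces changed across the hunks above assemble into a single runnable example. The sketch below is a minimal reconstruction of the updated notebook cell, not the verbatim notebook source; it assumes `langchain-core`, `langchain-openai`, and `loguru` are installed and that an `OPENAI_API_KEY` is set in the environment.

```python
from langchain_core.callbacks import FileCallbackHandler, StdOutCallbackHandler
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI
from loguru import logger

logfile = "output.log"

# loguru writes its own records to the same file the FileCallbackHandler uses
logger.add(logfile, colorize=True, enqueue=True)

handler_1 = FileCallbackHandler(logfile)  # writes chain logs to output.log
handler_2 = StdOutCallbackHandler()       # mirrors the same logs to stdout

prompt = PromptTemplate.from_template("1 + {number} = ")
model = OpenAI()

# LCEL composition replaces the deprecated LLMChain; callbacks are passed
# per-invocation through the config dict rather than at construction time.
chain = prompt | model

response = chain.invoke({"number": 2}, {"callbacks": [handler_1, handler_2]})
logger.info(response)
```

Passing the callbacks in the `invoke` config scopes them to that single run, whereas the old `LLMChain(llm=llm, prompt=prompt, callbacks=[handler], verbose=True)` pattern attached them to the chain at construction time.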