In this section, we discuss other miscellaneous but important topics in prompt engineering.
**Note that this section is under construction.**
---
## Program-Aided Language Models
[Gao et al., (2022)](https://arxiv.org/abs/2211.10435) presents a method that uses LLMs to read natural language problems and generate programs as the intermediate reasoning steps. Called program-aided language models (PAL), it differs from chain-of-thought prompting in that, instead of using free-form text to obtain a solution, it offloads the solution step to a programmatic runtime such as a Python interpreter.
![](../img/pal.png)
Full example coming soon!
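In the meantime, here is a minimal sketch of the idea in plain Python. The model call is stubbed out with a hard-coded response (in practice `fake_llm` would be a real LLM call), and the exemplar question is illustrative: the model is prompted with examples that solve word problems in code, and the final answer comes from executing the generated program rather than from free-form text.

```python
# Minimal PAL sketch. The LLM is prompted to answer a word problem by
# emitting Python code; the answer comes from running that code in a
# Python runtime, not from the model's own text generation.

PAL_PROMPT = """Q: Olivia has $23. She bought five bagels for $3 each.
How much money does she have left?

# solution in Python:
money_initial = 23
bagels = 5
bagel_cost = 3
money_spent = bagels * bagel_cost
answer = money_initial - money_spent

Q: {question}

# solution in Python:
"""

def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call: returns the code a PAL-prompted
    # model would plausibly generate for the tennis-ball question below.
    return (
        "tennis_balls = 5\n"
        "bought_balls = 2 * 3\n"
        "answer = tennis_balls + bought_balls\n"
    )

def run_pal(question: str):
    code = fake_llm(PAL_PROMPT.format(question=question))
    namespace = {}
    exec(code, namespace)  # offload the solution step to the interpreter
    return namespace["answer"]

print(run_pal("Roger has 5 tennis balls. He buys 2 cans with 3 balls "
              "each. How many tennis balls does he have now?"))  # 11
```

Note that the heavy lifting here is done by `exec`: the model only has to translate the problem into code, and the interpreter guarantees the arithmetic is carried out correctly.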
---
## ReAct
[Yao et al., 2022](https://arxiv.org/abs/2210.03629) introduced a framework where LLMs are used to generate both reasoning traces and task-specific actions in an interleaved manner. Generating reasoning traces allows the model to induce, track, and update action plans, and even handle exceptions. The action step allows it to interface with and gather information from external sources such as knowledge bases or environments.
The ReAct framework allows LLMs to interact with external tools to retrieve additional information, which leads to more reliable and factual responses.
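To make the interleaving concrete, here is a minimal sketch of a ReAct-style loop. Everything model-side is stubbed: the Thought/Action/Finish trace is hard-coded rather than generated, and a small dictionary stands in for an external knowledge base. Only the shape of the loop (thought, then action, then observation fed back in) reflects the framework.

```python
# Minimal ReAct-style loop: the model (stubbed) interleaves Thought /
# Action / Observation steps; actions call an external tool, here a toy
# lookup table standing in for a search engine or knowledge base.

KNOWLEDGE_BASE = {
    "capital of France": "Paris",
    "population of Paris": "about 2.1 million",
}

def search(query: str) -> str:
    # External tool: stand-in for a real search or KB API.
    return KNOWLEDGE_BASE.get(query, "no result")

# Scripted trace a ReAct-prompted model might produce (stubbed model).
scripted_steps = [
    ("Thought", "I need to find the capital of France."),
    ("Action", "capital of France"),
    ("Thought", "Now I need the population of that city."),
    ("Action", "population of Paris"),
    ("Finish", "Paris, with a population of about 2.1 million."),
]

def react_loop(steps):
    trace = []
    for kind, content in steps:
        if kind == "Action":
            trace.append(f"Action: {content}")
            # Interface with the external source and record the result,
            # which a real model would condition on for its next step.
            trace.append(f"Observation: {search(content)}")
        elif kind == "Finish":
            trace.append(f"Answer: {content}")
        else:
            trace.append(f"Thought: {content}")
    return trace

for line in react_loop(scripted_steps):
    print(line)
```

In a real ReAct deployment, each `Observation` is appended to the prompt and the model is called again to produce the next `Thought`/`Action` pair, until it emits a final answer.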
---

## PAL with LangChain

The goal of this example is to:

- Demonstrate how to use LangChain to develop a simple application leveraging the PAL prompting technique

We are developing a simple application that's able to reason about the question being asked through code.

Specifically, the application takes in some data and answers a question about that data. The prompt includes a few exemplars which are adopted from [here](https://github.com/reasoning-machines/pal/blob/main/pal/prompt/penguin_prompt.py).

Now that we have the prompt and the question, we can send them to the model. It should output, as code, the steps needed to get to the solution.
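The final step, running the model's output, can be sketched as follows. The model's response is hard-coded here (a real application would obtain it from an LLM call), and the small penguin table and question are illustrative, in the style of the exemplars linked above: the generated code computes over the data, and executing it yields the answer.

```python
# Sketch of the data-question flow: a small table is provided in the
# prompt, the model (stubbed here) answers by emitting Python over that
# data, and we execute the returned code to obtain the answer.

question = "How many penguins are less than 8 years old?"

# Code a PAL-prompted model might return for the question above (stub).
model_output = """
penguins = [("Louis", 7), ("Bernard", 5), ("Vincent", 9), ("Gwen", 8)]
answer = sum(1 for name, age in penguins if age < 8)
"""

namespace = {}
exec(model_output, namespace)  # run the generated solution steps
print(namespace["answer"])  # 2
```

Because the answer is computed by the interpreter rather than stated by the model, counting and filtering over the table are exact even when the table grows.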