mirror of https://github.com/dair-ai/Prompt-Engineering-Guide
synced 2024-11-02 15:40:13 +00:00

commit 322a5a8ddf (parent 9559030342): added application example PAL
@@ -45,6 +45,7 @@ The following are a set of guides on prompt engineering developed by us. Guides
 - [Prompt Engineering - Introduction](/guides/prompts-intro.md)
 - [Prompt Engineering - Basic Prompting](/guides/prompts-basic-usage.md)
 - [Prompt Engineering - Advanced Prompting](/guides/prompts-advanced-usage.md)
+- [Prompt Engineering - Applications](/guides/prompts-applications.md)
 - [Prompt Engineering - Adversarial Prompting](/guides/prompt-adversarial.md)
 - [Prompt Engineering - Miscellaneous Topics](/guides/prompt-miscellaneous.md)
 
@@ -4,5 +4,6 @@ The following are a set of guides on prompt engineering developed by us (DAIR.AI
 - [Prompt Engineering - Introduction](/guides/prompts-intro.md)
 - [Prompt Engineering - Basic Prompting](/guides/prompts-basic-usage.md)
 - [Prompt Engineering - Advanced Prompting](/guides/prompts-advanced-usage.md)
+- [Prompt Engineering - Applications](/guides/prompts-applications.md)
 - [Prompt Engineering - Adversarial Prompting](/guides/prompt-adversarial.md)
 - [Prompt Engineering - Miscellaneous Topics](/guides/prompt-miscellaneous.md)
@@ -184,6 +184,6 @@ And there are many other variations of this with the goal to make the model do s
 
 Models like ChatGPT and Claude have been aligned to avoid outputting content that, for instance, promotes illegal behavior or unethical activities. So it's harder to jailbreak them, but they still have flaws, and we are learning about new ones as people experiment with these systems.
 
 ---
-[Previous Section (Advanced Prompting)](./prompts-advanced-usage.md)
+[Previous Section (Applications)](./prompts-applications.md)
 
 [Next Section (Miscellaneous Topics)](./prompt-miscellaneous.md)
@@ -34,14 +34,6 @@ The figure below shows how Directional Stimulus Prompting compares with standard
 
 Full example coming soon!
 
----
-## Program-Aided Language Models
-
-[Gao et al., (2022)](https://arxiv.org/abs/2211.10435) presents a method that uses LLMs to read natural language problems and generate programs as the intermediate reasoning steps. Coined, program-aided language models (PAL), it differs from chain-of-thought prompting in that instead of using free-form text to obtain solution it offloads the solution step to a programmatic runtime such as a Python interpreter.
-
-![](../img/pal.png)
-
-Full example coming soon!
-
 ---
 ## ReAct
@@ -371,7 +371,7 @@ Explain and Answer:
 
 Yes, part of golf is trying to get a higher point total than others. Each player tries to complete the course with the lowest score, which is calculated by adding up the total number of strokes taken on each hole. The player with the lowest score wins the game.
 ```
 
 Some really interesting things happened with this example. In the first answer, the model was very confident, but in the second not so much. I simplified the process for demonstration purposes, but there are a few more details to consider when arriving at the final answer. Check out the paper for more.
 
 ---
||||
@@ -398,4 +398,4 @@ This paper touches on an important topic related to prompt engineering which is
 
 ---
 [Previous Section (Basic Prompting)](./prompts-basic-usage.md)
 
-[Next Section (Adversarial Prompting)](./prompt-adversarial.md)
+[Next Section (Applications)](./prompts-applications.md)
guides/prompts-applications.md (new file, 120 lines)
@@ -0,0 +1,120 @@
# Prompt Applications

In this guide, we will cover some advanced ways we can use prompt engineering to solve more complex tasks.

Topics:
- [PAL (Program-Aided Language Models): Code as Reasoning](#pal-program-aided-language-models-code-as-reasoning)
- More coming soon!

---

## PAL (Program-Aided Language Models): Code as Reasoning

[Gao et al., (2022)](https://arxiv.org/abs/2211.10435) presents a method that uses LLMs to read natural language problems and generate programs as intermediate reasoning steps. Coined program-aided language models (PAL), it differs from chain-of-thought prompting in that, instead of using free-form text to obtain a solution, it offloads the solution step to a programmatic runtime such as a Python interpreter.

![](../img/pal.png)
Let's look at an example using LangChain and OpenAI GPT-3. We want to develop a simple application that can interpret a question being asked and provide an answer by leveraging the Python interpreter.

Specifically, we want the LLM to answer questions that require date understanding. We will provide the LLM a prompt that includes a few exemplars, adopted from [here](https://github.com/reasoning-machines/pal/blob/main/pal/prompt/date_understanding_prompt.py).

These are the imports we need:

```python
import openai
from datetime import datetime
from dateutil.relativedelta import relativedelta
import os
from langchain.llms import OpenAI
from dotenv import load_dotenv
```
Let's first configure a few things:

```python
load_dotenv()

# API configuration
openai.api_key = os.getenv("OPENAI_API_KEY")

# for LangChain
os.environ["OPENAI_API_KEY"] = os.getenv("OPENAI_API_KEY")
```

Set up the model instance:

```python
llm = OpenAI(model_name='text-davinci-003', temperature=0)
```
Set up the prompt and question. Note that each exemplar stores its result in `answer`, matching the accompanying notebook, so the model's generated program does the same and we can read `answer` after executing it (the original bare `strftime` expressions, the `Feburary` typo, the `3/1/2002` comment, and the `today.strftime` slip in the 4/19/1969 exemplar are fixed here):

```python
question = "Today is 27 February 2023. I was born exactly 25 years ago. What is the date I was born in MM/DD/YYYY?"

DATE_UNDERSTANDING_PROMPT = """
# Q: 2015 is coming in 36 hours. What is the date one week from today in MM/DD/YYYY?
# If 2015 is coming in 36 hours, then today is 36 hours before.
today = datetime(2015, 1, 1) - relativedelta(hours=36)
# One week from today,
one_week_from_today = today + relativedelta(weeks=1)
# The answer formatted with %m/%d/%Y is
answer = one_week_from_today.strftime('%m/%d/%Y')
# Q: The first day of 2019 is a Tuesday, and today is the first Monday of 2019. What is the date today in MM/DD/YYYY?
# If the first day of 2019 is a Tuesday, and today is the first Monday of 2019, then today is 6 days later.
today = datetime(2019, 1, 1) + relativedelta(days=6)
# The answer formatted with %m/%d/%Y is
answer = today.strftime('%m/%d/%Y')
# Q: The concert was scheduled to be on 06/01/1943, but was delayed by one day to today. What is the date 10 days ago in MM/DD/YYYY?
# If the concert was scheduled to be on 06/01/1943, but was delayed by one day to today, then today is one day later.
today = datetime(1943, 6, 1) + relativedelta(days=1)
# 10 days ago,
ten_days_ago = today - relativedelta(days=10)
# The answer formatted with %m/%d/%Y is
answer = ten_days_ago.strftime('%m/%d/%Y')
# Q: It is 4/19/1969 today. What is the date 24 hours later in MM/DD/YYYY?
# It is 4/19/1969 today.
today = datetime(1969, 4, 19)
# 24 hours later,
later = today + relativedelta(hours=24)
# The answer formatted with %m/%d/%Y is
answer = later.strftime('%m/%d/%Y')
# Q: Jane thought today is 3/11/2002, but today is in fact Mar 12, which is 1 day later. What is the date 24 hours later in MM/DD/YYYY?
# If Jane thought today is 3/11/2002, but today is in fact Mar 12, then today is 3/12/2002.
today = datetime(2002, 3, 12)
# 24 hours later,
later = today + relativedelta(hours=24)
# The answer formatted with %m/%d/%Y is
answer = later.strftime('%m/%d/%Y')
# Q: Jane was born on the last day of February in 2001. Today is her 16-year-old birthday. What is the date yesterday in MM/DD/YYYY?
# If Jane was born on the last day of February in 2001 and today is her 16-year-old birthday, then today is 16 years later.
today = datetime(2001, 2, 28) + relativedelta(years=16)
# Yesterday,
yesterday = today - relativedelta(days=1)
# The answer formatted with %m/%d/%Y is
answer = yesterday.strftime('%m/%d/%Y')
# Q: {question}
""".strip() + '\n'
```
Now pass the formatted prompt to the model:

```python
llm_out = llm(DATE_UNDERSTANDING_PROMPT.format(question=question))
print(llm_out)
```
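The `exec` step below runs whatever the model generated in the current global namespace, which is why the `datetime` and `relativedelta` imports need to be in scope. A minimal sketch of the same pattern with an explicit namespace, so generated variables don't leak into globals (the `generated` string here is illustrative, not actual model output):

```python
from datetime import datetime, timedelta

# Illustrative stand-in for model-generated code (not actual model output)
generated = (
    "today = datetime(2023, 2, 27)\n"
    "answer = (today - timedelta(days=1)).strftime('%m/%d/%Y')\n"
)

# Execute in an explicit namespace; only the names we provide are in scope
ns = {"datetime": datetime, "timedelta": timedelta}
exec(generated, ns)
print(ns["answer"])  # 02/26/2023
```

Keep in mind that `exec` on model output is still arbitrary code execution; a sandboxed interpreter is the safer choice for untrusted input.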
Then execute the generated program to read out the result (the notebook's generated program assigns the formatted date to `answer`):

```python
exec(llm_out)
print(answer)
```

This will output the following: `02/27/1998`

See the full notebook [here](../notebooks/pe-pal.ipynb).
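As a quick sanity check, independent of the model, the expected answer can be computed directly with the standard library (a minimal sketch; `datetime.replace` works here because February 27 exists in every year):

```python
from datetime import datetime

today = datetime(2023, 2, 27)
# Born exactly 25 years ago: same month and day, year shifted back by 25
born = today.replace(year=today.year - 25)
print(born.strftime('%m/%d/%Y'))  # 02/27/1998
```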
---

More examples coming soon!

[Previous Section (Advanced Usage)](./prompts-advanced-usage.md)

[Next Section (Adversarial Prompting)](./prompt-adversarial.md)
notebooks/pe-pal.ipynb (new file, 186 lines)
@@ -0,0 +1,186 @@
{
 "cells": [
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## PAL: Code as Reasoning"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "import openai\n",
    "from datetime import datetime\n",
    "from dateutil.relativedelta import relativedelta\n",
    "import os\n",
    "from langchain.llms import OpenAI\n",
    "from dotenv import load_dotenv"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "load_dotenv()\n",
    "\n",
    "# API configuration\n",
    "openai.api_key = os.getenv(\"OPENAI_API_KEY\")\n",
    "\n",
    "# for LangChain\n",
    "os.environ[\"OPENAI_API_KEY\"] = os.getenv(\"OPENAI_API_KEY\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [],
   "source": [
    "llm = OpenAI(model_name='text-davinci-003', temperature=0)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [],
   "source": [
    "question = \"Today is 27 February 2023. I was born exactly 25 years ago. What is the date I was born in MM/DD/YYYY?\""
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "I adopted the prompt template from here: https://github.com/reasoning-machines/pal/blob/main/pal/prompt/date_understanding_prompt.py"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {},
   "outputs": [],
   "source": [
    "DATE_UNDERSTANDING_PROMPT = \"\"\"\n",
    "# Q: 2015 is coming in 36 hours. What is the date one week from today in MM/DD/YYYY?\n",
    "# If 2015 is coming in 36 hours, then today is 36 hours before.\n",
    "today = datetime(2015, 1, 1) - relativedelta(hours=36)\n",
    "# One week from today,\n",
    "one_week_from_today = today + relativedelta(weeks=1)\n",
    "# The answer formatted with %m/%d/%Y is\n",
    "answer = one_week_from_today.strftime('%m/%d/%Y')\n",
    "# Q: The first day of 2019 is a Tuesday, and today is the first Monday of 2019. What is the date today in MM/DD/YYYY?\n",
    "# If the first day of 2019 is a Tuesday, and today is the first Monday of 2019, then today is 6 days later.\n",
    "today = datetime(2019, 1, 1) + relativedelta(days=6)\n",
    "# The answer formatted with %m/%d/%Y is\n",
    "answer = today.strftime('%m/%d/%Y')\n",
    "# Q: The concert was scheduled to be on 06/01/1943, but was delayed by one day to today. What is the date 10 days ago in MM/DD/YYYY?\n",
    "# If the concert was scheduled to be on 06/01/1943, but was delayed by one day to today, then today is one day later.\n",
    "today = datetime(1943, 6, 1) + relativedelta(days=1)\n",
    "# 10 days ago,\n",
    "ten_days_ago = today - relativedelta(days=10)\n",
    "# The answer formatted with %m/%d/%Y is\n",
    "answer = ten_days_ago.strftime('%m/%d/%Y')\n",
    "# Q: It is 4/19/1969 today. What is the date 24 hours later in MM/DD/YYYY?\n",
    "# It is 4/19/1969 today.\n",
    "today = datetime(1969, 4, 19)\n",
    "# 24 hours later,\n",
    "later = today + relativedelta(hours=24)\n",
    "# The answer formatted with %m/%d/%Y is\n",
    "answer = later.strftime('%m/%d/%Y')\n",
    "# Q: Jane thought today is 3/11/2002, but today is in fact Mar 12, which is 1 day later. What is the date 24 hours later in MM/DD/YYYY?\n",
    "# If Jane thought today is 3/11/2002, but today is in fact Mar 12, then today is 3/12/2002.\n",
    "today = datetime(2002, 3, 12)\n",
    "# 24 hours later,\n",
    "later = today + relativedelta(hours=24)\n",
    "# The answer formatted with %m/%d/%Y is\n",
    "answer = later.strftime('%m/%d/%Y')\n",
    "# Q: Jane was born on the last day of February in 2001. Today is her 16-year-old birthday. What is the date yesterday in MM/DD/YYYY?\n",
    "# If Jane was born on the last day of February in 2001 and today is her 16-year-old birthday, then today is 16 years later.\n",
    "today = datetime(2001, 2, 28) + relativedelta(years=16)\n",
    "# Yesterday,\n",
    "yesterday = today - relativedelta(days=1)\n",
    "# The answer formatted with %m/%d/%Y is\n",
    "answer = yesterday.strftime('%m/%d/%Y')\n",
    "# Q: {question}\n",
    "\"\"\".strip() + '\\n'"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "# If today is 27 February 2023 and I was born exactly 25 years ago, then I was born 25 years before.\n",
      "today = datetime(2023, 2, 27)\n",
      "# I was born 25 years before,\n",
      "born = today - relativedelta(years=25)\n",
      "# The answer formatted with %m/%d/%Y is\n",
      "answer = born.strftime('%m/%d/%Y')\n"
     ]
    }
   ],
   "source": [
    "llm_out = llm(DATE_UNDERSTANDING_PROMPT.format(question=question))\n",
    "print(llm_out)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "02/27/1998\n"
     ]
    }
   ],
   "source": [
    "exec(llm_out)\n",
    "print(answer)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "promptlecture",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.16"
  },
  "orig_nbformat": 4,
  "vscode": {
   "interpreter": {
    "hash": "f38e0373277d6f71ee44ee8fea5f1d408ad6999fda15d538a69a99a1665a839d"
   }
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}