mirror of https://github.com/dair-ai/Prompt-Engineering-Guide
synced 2024-11-06 09:20:31 +00:00

commit 48c1a2bfd2 ("fixes")
parent 3c6e9c78e7
@@ -68,13 +68,13 @@
 },
 {
 "cell_type": "code",
-"execution_count": 3,
+"execution_count": 4,
 "metadata": {},
 "outputs": [],
 "source": [
 "MODEL = \"gpt-3.5-turbo\"\n",
 "\n",
-"response = openai.ChatCompletion.create(\n",
+"response = openai.chat.completions.create(\n",
 " model=MODEL,\n",
 " messages=[\n",
 " {\"role\": \"system\", \"content\": \"You are an AI research assistant. You use a tone that is technical and scientific.\"},\n",
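The hunk above tracks the OpenAI Python SDK v1 migration: `openai.ChatCompletion.create` becomes `openai.chat.completions.create`. A minimal sketch of the updated call pattern, assuming `openai>=1.0` is installed and `OPENAI_API_KEY` is set (the user turn below is illustrative, not from the notebook):

```python
# Sketch of the v1-style call this commit migrates to (illustrative, not part of the diff).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
MODEL = "gpt-3.5-turbo"

response = client.chat.completions.create(
    model=MODEL,
    messages=[
        {"role": "system", "content": "You are an AI research assistant. You use a tone that is technical and scientific."},
        {"role": "user", "content": "Hello, who are you?"},  # illustrative user turn
    ],
    temperature=0,
)
```

The notebook keeps the module-level `openai.chat.completions.create(...)` form, which the v1 SDK also exposes; the explicit client shown here is the equivalent documented pattern.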
@@ -96,50 +96,58 @@
 },
 {
 "cell_type": "code",
-"execution_count": 4,
+"execution_count": 6,
 "metadata": {},
 "outputs": [
 {
 "data": {
 "text/plain": [
-"'Certainly! Black holes are formed when a massive star runs out of fuel and can no longer produce the energy needed to counteract the force of gravity. This causes the star to collapse in on itself, creating a singularity - a point of infinite density and zero volume. The gravitational pull of the singularity is so strong that nothing, not even light, can escape its grasp, hence the name \"black hole\". \\n\\nThere are also supermassive black holes, which are found at the centers of galaxies and are thought to have formed through the merging of smaller black holes and the accretion of matter. \\n\\nThe study of black holes is a fascinating and active area of research in astrophysics, and there is still much to be learned about these mysterious objects.'"
+"\"Certainly! Black holes are fascinating astronomical objects that form from the remnants of massive stars. The creation of a black hole occurs through a process known as stellar collapse.\\n\\nWhen a massive star exhausts its nuclear fuel, it can no longer sustain the outward pressure generated by nuclear fusion. As a result, the star's core collapses under the force of gravity. This collapse is triggered by the imbalance between the inward gravitational force and the outward pressure.\\n\\nDuring the collapse, the star's core becomes incredibly dense, packing an enormous amount of mass into a tiny volume. This extreme density leads to the formation of a singularity, a point of infinite density at the center of the black hole.\\n\\nSurrounding the singularity is the event horizon, which is the boundary beyond which nothing, not even light, can escape the gravitational pull of the black hole. The event horizon is determined by the mass of the black hole, with larger black holes having larger event horizons.\\n\\nThe formation of black holes is classified into three main types based on their mass: stellar black holes, intermediate-mass black holes, and supermassive black holes. Stellar black holes typically have masses several times that of our Sun, while supermassive black holes can have millions or even billions of times the mass of the Sun.\\n\\nIn addition to stellar collapse, black holes can also form through other mechanisms, such as the collision of neutron stars or the accretion of matter onto an existing black hole.\\n\\nUnderstanding the creation and behavior of black holes is a fascinating area of research in astrophysics, with implications for our understanding of gravity, spacetime, and the evolution of galaxies.\""
 ]
 },
-"execution_count": 4,
+"execution_count": 6,
 "metadata": {},
 "output_type": "execute_result"
 }
 ],
 "source": [
-"response.choices[0]['message']['content']"
+"response.choices[0].message.content"
 ]
 },
 {
 "cell_type": "code",
-"execution_count": 5,
+"execution_count": 7,
 "metadata": {},
 "outputs": [
 {
 "data": {
 "text/markdown": [
-"Certainly! Black holes are formed when a massive star runs out of fuel and can no longer produce the energy needed to counteract the force of gravity. This causes the star to collapse in on itself, creating a singularity - a point of infinite density and zero volume. The gravitational pull of the singularity is so strong that nothing, not even light, can escape its grasp, hence the name \"black hole\". \n",
+"Certainly! Black holes are fascinating astronomical objects that form from the remnants of massive stars. The creation of a black hole occurs through a process known as stellar collapse.\n",
 "\n",
-"There are also supermassive black holes, which are found at the centers of galaxies and are thought to have formed through the merging of smaller black holes and the accretion of matter. \n",
+"When a massive star exhausts its nuclear fuel, it can no longer sustain the outward pressure generated by nuclear fusion. As a result, the star's core collapses under the force of gravity. This collapse is triggered by the imbalance between the inward gravitational force and the outward pressure.\n",
 "\n",
-"The study of black holes is a fascinating and active area of research in astrophysics, and there is still much to be learned about these mysterious objects."
+"During the collapse, the star's core becomes incredibly dense, packing an enormous amount of mass into a tiny volume. This extreme density leads to the formation of a singularity, a point of infinite density at the center of the black hole.\n",
+"\n",
+"Surrounding the singularity is the event horizon, which is the boundary beyond which nothing, not even light, can escape the gravitational pull of the black hole. The event horizon is determined by the mass of the black hole, with larger black holes having larger event horizons.\n",
+"\n",
+"The formation of black holes is classified into three main types based on their mass: stellar black holes, intermediate-mass black holes, and supermassive black holes. Stellar black holes typically have masses several times that of our Sun, while supermassive black holes can have millions or even billions of times the mass of the Sun.\n",
+"\n",
+"In addition to stellar collapse, black holes can also form through other mechanisms, such as the collision of neutron stars or the accretion of matter onto an existing black hole.\n",
+"\n",
+"Understanding the creation and behavior of black holes is a fascinating area of research in astrophysics, with implications for our understanding of gravity, spacetime, and the evolution of galaxies."
 ],
 "text/plain": [
 "<IPython.core.display.Markdown object>"
 ]
 },
-"execution_count": 5,
+"execution_count": 7,
 "metadata": {},
 "output_type": "execute_result"
 }
 ],
 "source": [
 "# pretty format the response\n",
-"IPython.display.Markdown(response.choices[0]['message']['content'])"
+"IPython.display.Markdown(response.choices[0].message.content)"
 ]
 },
 {
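Response fields follow the same migration: the v1 SDK returns typed objects, so `response['choices'][0]['message']['content']` becomes the attribute access `response.choices[0].message.content` shown above. A short sketch, assuming a `response` object from a call like the previous one:

```python
# Attribute-style access on the v1 response object (dict-style indexing is no longer supported).
from IPython.display import Markdown

content = response.choices[0].message.content
Markdown(content)  # pretty-print the assistant reply in a notebook cell
```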
@@ -160,14 +168,14 @@
 },
 {
 "cell_type": "code",
-"execution_count": 6,
+"execution_count": 10,
 "metadata": {},
 "outputs": [
 {
 "name": "stdout",
 "output_type": "stream",
 "text": [
-"Mice.\n"
+"mice\n"
 ]
 }
 ],
@@ -181,7 +189,7 @@
 "Answer:\n",
 "\"\"\"\n",
 "\n",
-"response = openai.ChatCompletion.create(\n",
+"response = openai.chat.completions.create(\n",
 " model=MODEL,\n",
 " messages=[\n",
 " {\"role\": \"user\", \"content\": CONTENT},\n",
@@ -189,8 +197,15 @@
 " temperature=0,\n",
 ")\n",
 "\n",
-"print(response['choices'][0]['message']['content'])"
+"print(response.choices[0].message.content)"
 ]
 },
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {},
+"outputs": [],
+"source": []
+}
 ],
 "metadata": {
@@ -209,7 +224,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.9.16"
+"version": "3.9.18"
 },
 "orig_nbformat": 4,
 "vscode": {
File diff suppressed because one or more lines are too long
@@ -1,8 +1,15 @@
 {
 "cells": [
 {
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"# Function Calling with OpenAI APIs"
+]
+},
+{
 "cell_type": "code",
-"execution_count": 6,
+"execution_count": 1,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -26,7 +33,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 1,
+"execution_count": 2,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -51,7 +58,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 2,
+"execution_count": 3,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -80,7 +87,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 7,
+"execution_count": 4,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -111,7 +118,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 8,
+"execution_count": 5,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -127,14 +134,14 @@
 },
 {
 "cell_type": "code",
-"execution_count": 9,
+"execution_count": 6,
 "metadata": {},
 "outputs": [
 {
 "name": "stdout",
 "output_type": "stream",
 "text": [
-"ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_ADST8mP0bkzu12qibeywdtQ7', function=Function(arguments='{\"location\":\"London\",\"unit\":\"celsius\"}', name='get_current_weather'), type='function')])\n"
+"ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_GYg3yhSh1bbMabne9brwTqmA', function=Function(arguments='{\"location\":\"London\",\"unit\":\"celsius\"}', name='get_current_weather'), type='function')])\n"
 ]
 }
 ],
@@ -159,7 +166,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 10,
+"execution_count": 7,
 "metadata": {},
 "outputs": [],
 "source": [
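The `tool_calls` output above is produced by passing a `tools` list to `chat.completions.create` and letting the model decide when to call `get_current_weather`. A condensed sketch of that request/parse flow with the v1 SDK; the schema below is written to mirror the notebook's `get_current_weather` definition, but the exact field descriptions are assumptions:

```python
import json
from openai import OpenAI

client = OpenAI()

# Assumed to mirror the notebook's get_current_weather tool schema.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "The city, e.g. London"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is the weather like in London?"}],
    tools=tools,
    tool_choice="auto",
)

tool_call = response.choices[0].message.tool_calls[0]
print(tool_call.function.name)                   # e.g. "get_current_weather"
print(json.loads(tool_call.function.arguments))  # e.g. {"location": "London", "unit": "celsius"}
```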
@@ -168,7 +175,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 11,
+"execution_count": 8,
 "metadata": {},
 "outputs": [
 {
@@ -177,7 +184,7 @@
 "'{\"location\": \"London\", \"temperature\": \"50\", \"unit\": \"celsius\"}'"
 ]
 },
-"execution_count": 11,
+"execution_count": 8,
 "metadata": {},
 "output_type": "execute_result"
 }
@@ -202,7 +209,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 12,
+"execution_count": 9,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -216,7 +223,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 13,
+"execution_count": 10,
 "metadata": {},
 "outputs": [
 {
@@ -225,7 +232,7 @@
 "ChatCompletionMessage(content=\"Hello! I'm here and ready to assist you. How can I help you today?\", role='assistant', function_call=None, tool_calls=None)"
 ]
 },
-"execution_count": 13,
+"execution_count": 10,
 "metadata": {},
 "output_type": "execute_result"
 }
@@ -243,7 +250,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 14,
+"execution_count": 11,
 "metadata": {},
 "outputs": [
 {
@@ -252,7 +259,7 @@
 "ChatCompletionMessage(content=\"Hello! I'm here and ready to assist you. How can I help you today?\", role='assistant', function_call=None, tool_calls=None)"
 ]
 },
-"execution_count": 14,
+"execution_count": 11,
 "metadata": {},
 "output_type": "execute_result"
 }
@@ -270,7 +277,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 15,
+"execution_count": 12,
 "metadata": {},
 "outputs": [
 {
@@ -279,7 +286,7 @@
 "ChatCompletionMessage(content=\"Hello! I'm here and ready to assist you. How can I help you today?\", role='assistant', function_call=None, tool_calls=None)"
 ]
 },
-"execution_count": 15,
+"execution_count": 12,
 "metadata": {},
 "output_type": "execute_result"
 }
@@ -290,7 +297,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 16,
+"execution_count": 13,
 "metadata": {},
 "outputs": [
 {
@@ -299,7 +306,7 @@
 "ChatCompletionMessage(content='I will check the current weather in London for you.', role='assistant', function_call=None, tool_calls=None)"
 ]
 },
-"execution_count": 16,
+"execution_count": 13,
 "metadata": {},
 "output_type": "execute_result"
 }
@@ -323,16 +330,16 @@
 },
 {
 "cell_type": "code",
-"execution_count": 17,
+"execution_count": 14,
 "metadata": {},
 "outputs": [
 {
 "data": {
 "text/plain": [
-"ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_NzHx0V4zDnoCzvgaASGaUmCp', function=Function(arguments='{\"location\":\"London\",\"unit\":\"celsius\"}', name='get_current_weather'), type='function')])"
+"ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_ctUCZtwxZYhfF3sirByNY2qC', function=Function(arguments='{\"location\":\"London\",\"unit\":\"celsius\"}', name='get_current_weather'), type='function')])"
 ]
 },
-"execution_count": 17,
+"execution_count": 14,
 "metadata": {},
 "output_type": "execute_result"
 }
@@ -356,16 +363,16 @@
 },
 {
 "cell_type": "code",
-"execution_count": 18,
+"execution_count": 15,
 "metadata": {},
 "outputs": [
 {
 "data": {
 "text/plain": [
-"ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_BZo3k606xCW8fnIPlLdJVpRo', function=Function(arguments='{\"location\": \"London\", \"unit\": \"celsius\"}', name='get_current_weather'), type='function'), ChatCompletionMessageToolCall(id='call_rx3uTL4l3PQztPzca04t1Ecu', function=Function(arguments='{\"location\": \"Belmopan\", \"unit\": \"celsius\"}', name='get_current_weather'), type='function')])"
+"ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_GhyrfrY9QWkrJs9vKqxpGW22', function=Function(arguments='{\"location\": \"London\", \"unit\": \"celsius\"}', name='get_current_weather'), type='function'), ChatCompletionMessageToolCall(id='call_Cl9fqGh1AZgi4yD8n5Lrc6NE', function=Function(arguments='{\"location\": \"Belmopan\", \"unit\": \"celsius\"}', name='get_current_weather'), type='function')])"
 ]
 },
-"execution_count": 18,
+"execution_count": 15,
 "metadata": {},
 "output_type": "execute_result"
 }
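The second output above shows the model requesting two tool calls in a single response (London and Belmopan). Iterating over `message.tool_calls` handles single and parallel calls alike; a sketch, reusing a `response` and `tools` list like those in the previous snippet:

```python
import json

message = response.choices[0].message
for tool_call in message.tool_calls or []:
    args = json.loads(tool_call.function.arguments)
    print(tool_call.function.name, args)  # one line per requested call, e.g. London then Belmopan
```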
@@ -391,14 +398,14 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Function Calling Response as Model Feedback\n",
+"### Function Calling Response for Model Feedback\n",
 "\n",
 "You might also be interested in developing an agent that passes back the result obtained after calling your APIs with the inputs generated from function calling. Let's look at an example next:\n"
 ]
 },
 {
 "cell_type": "code",
-"execution_count": 29,
+"execution_count": 16,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -415,7 +422,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 31,
+"execution_count": 17,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -431,7 +438,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 32,
+"execution_count": 18,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -446,7 +453,7 @@
 },
 {
 "cell_type": "code",
-"execution_count": 34,
+"execution_count": 19,
 "metadata": {},
 "outputs": [],
 "source": [
@@ -455,16 +462,16 @@
 },
 {
 "cell_type": "code",
-"execution_count": 35,
+"execution_count": 20,
 "metadata": {},
 "outputs": [
 {
 "data": {
 "text/plain": [
-"ChatCompletionMessage(content='The current weather in Boston, MA is 50°F.', role='assistant', function_call=None, tool_calls=None)"
+"ChatCompletionMessage(content='The current temperature in Boston, MA is 50°F.', role='assistant', function_call=None, tool_calls=None)"
 ]
 },
-"execution_count": 35,
+"execution_count": 20,
 "metadata": {},
 "output_type": "execute_result"
 }
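The last hunks ("Function Calling Response for Model Feedback") close the loop: the function's return value is appended to the conversation so the model can phrase the final answer shown above. A minimal sketch of that round trip, assuming a local `get_current_weather` stub along the lines of the notebook's helper:

```python
import json
from openai import OpenAI

client = OpenAI()

def get_current_weather(location, unit="fahrenheit"):
    # Stub standing in for a real weather API call.
    return json.dumps({"location": location, "temperature": "50", "unit": unit})

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}]

messages = [{"role": "user", "content": "What is the weather like in Boston, MA?"}]

first = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages, tools=tools)
tool_call = first.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

# Feed the assistant's tool call and our tool result back, then ask for the final answer.
messages.append(first.choices[0].message)
messages.append({
    "role": "tool",
    "tool_call_id": tool_call.id,
    "content": get_current_weather(**args),
})
final = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(final.choices[0].message.content)  # e.g. "The current temperature in Boston, MA is 50°F."
```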
notebooks/pe-rag.ipynb (new file, 1181 lines)
File diff suppressed because it is too large
@@ -1,5 +1,8 @@
 # Function Calling with LLMs

+import {Cards, Card} from 'nextra-theme-docs'
+import {CodeIcon} from 'components/icons'
+
 ## Getting Started with Function Calling

 Function calling is the ability to reliably connect LLMs to external tools to enable effective tool usage and interaction with external APIs.
@@ -99,14 +102,34 @@ In particular, the `arguments` object contains the important arguments extracted

 You can then choose to call an external weather API for the actual weather. Once you have the weather information available you can pass it back to the model to summarize a final response given the original user question.

-Here is a [notebook](https://github.com/dair-ai/Prompt-Engineering-Guide/blob/main/notebooks/pe-function-calling.ipynb) with a simple example that demonstrates how to use function calling with the OpenAI APIs.
+## Notebooks
+
+Here is a notebook with a simple example that demonstrates how to use function calling with the OpenAI APIs:
+
+<Cards>
+<Card
+icon={<CodeIcon />}
+title="Function Calling with OpenAI APIs"
+href="https://github.com/dair-ai/Prompt-Engineering-Guide/blob/main/notebooks/pe-function-calling.ipynb"
+/>
+</Cards>

 ## Function Calling with Open-Source LLMs

-More notes on function calling with open-source LLMs coming soon...
+More notes on function calling with open-source LLMs coming soon.

 ## Function Calling Use Cases
-More function calling use cases coming soon...
+Below is a list of use cases that can benefit from the function calling capability of LLMs:
+
+- **Conversational Agents**: Function calling can be used to create complex conversational agents or chatbots that answer complex questions by calling external APIs or external knowledge bases and providing more relevant and useful responses.
+
+- **Natural Language Understanding**: It can convert natural language into structured JSON data, extract structured data from text, and perform tasks like named entity recognition, sentiment analysis, and keyword extraction.
+
+- **Math Problem Solving**: Function calling can be used to define custom functions to solve complex mathematical problems that require multiple steps and different types of advanced calculations.
+
+- **API Integration**: It can be used to effectively integrate LLMs with external APIs to fetch data or perform actions based on the input. This could be helpful to build either a QA system or a creative assistant. In general, function calling can convert natural language into valid API calls.
+
+- **Information Extraction**: Function calling can be effectively used to extract specific information from a given input, such as retrieving relevant news stories or references from an article.


 ## References
@@ -116,4 +139,5 @@ More function calling use cases coming soon...
 - [Interacting with APIs](https://python.langchain.com/docs/use_cases/apis)
 - [OpenAI's Function Calling](https://platform.openai.com/docs/guides/function-calling)
 - [How to call functions with chat models](https://cookbook.openai.com/examples/how_to_call_functions_with_chat_models)
-- [Pushing ChatGPT's Structured Data Support To Its Limits](https://minimaxir.com/2023/12/chatgpt-structured-data/)
+- [Pushing ChatGPT's Structured Data Support To Its Limits](https://minimaxir.com/2023/12/chatgpt-structured-data/)
+- [Math Problem Solving with Function Calling](https://github.com/svpino/openai-function-calling/blob/main/sample.ipynb)
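The "Natural Language Understanding" and "Information Extraction" use cases listed above boil down to answering through a function schema so the output comes back as structured JSON. A small illustrative sketch (the `extract_person` schema is made up for this example, not taken from the guide):

```python
import json
from openai import OpenAI

client = OpenAI()

# Illustrative schema: pull a person's name and role out of free text.
tools = [{
    "type": "function",
    "function": {
        "name": "extract_person",
        "description": "Extract structured information about a person mentioned in the text",
        "parameters": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "role": {"type": "string"},
            },
            "required": ["name"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Ada Lovelace is often described as the first computer programmer."}],
    tools=tools,
    tool_choice={"type": "function", "function": {"name": "extract_person"}},  # force this tool
)
print(json.loads(response.choices[0].message.tool_calls[0].function.arguments))
# e.g. {"name": "Ada Lovelace", "role": "computer programmer"}
```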
@@ -1,5 +1,8 @@
 # Examples of Prompts

+import {Cards, Card} from 'nextra-theme-docs'
+import {CodeIcon} from 'components/icons'
+
 The previous section introduced a basic example of how to prompt LLMs.

 This section will provide more examples of how to use prompts to achieve different tasks and introduce key concepts along the way. Often, the best way to learn concepts is by going through examples. The few examples below illustrate how you can use well-crafted prompts to perform different types of tasks.
@@ -288,3 +291,16 @@ Much better, right? By the way, I tried this a couple of times and the system so
 We will continue to include more examples of common applications in this section of the guide.

 In the upcoming section, we will cover even more advanced prompt engineering concepts and techniques for improving performance on all these and more difficult tasks.
+
+## Notebook
+
+If you want to practice with the prompts above using Python, we have prepared a notebook to test some of the prompts using the OpenAI models.
+
+<Cards>
+<Card
+icon={<CodeIcon />}
+title="Getting Started with Prompt Engineering"
+href="https://github.com/dair-ai/Prompt-Engineering-Guide/blob/main/notebooks/pe-lecture.ipynb"
+/>
+</Cards>
+
@@ -4,6 +4,8 @@ import { Callout, FileTree } from 'nextra-theme-docs'
 import {Screenshot} from 'components/screenshot'
 import CHATGPT1 from '../../img/chatgpt-1.png'
 import CHATGPTCLASSIC from '../../img/chatgpt-classic.png'
+import {Cards, Card} from 'nextra-theme-docs'
+import {CodeIcon} from 'components/icons'

 In this section, we cover the latest prompt engineering techniques for ChatGPT, including tips, applications, limitations, papers, and additional reading materials.

@@ -138,6 +140,23 @@ According to the official OpenAI docs, snapshots of the `gpt-3.5-turbo` model wi

 The current recommendation for `gpt-3.5-turbo-0301` is to add instructions in the `user` message as opposed to the available `system` message.

+
+## Notebooks
+Here is a notebook to learn more about how to make calls to the ChatGPT APIs using the official `openai` library:
+
+<Cards>
+<Card
+icon={<CodeIcon />}
+title="Introduction to The ChatGPT APIs"
+href="https://github.com/dair-ai/Prompt-Engineering-Guide/blob/main/notebooks/pe-chatgpt-intro.ipynb"
+/>
+<Card
+icon={<CodeIcon />}
+title="ChatGPT with LangChain"
+href="https://github.com/dair-ai/Prompt-Engineering-Guide/blob/main/notebooks/pe-chatgpt-langchain.ipynb"
+/>
+</Cards>
+
 ---
 ## References

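For the `gpt-3.5-turbo-0301` note quoted above, the only change is where the instruction lives in the messages list; a sketch with an illustrative classification instruction:

```python
# Instruction folded into the user turn, per the gpt-3.5-turbo-0301 recommendation above.
messages = [
    {
        "role": "user",
        "content": "Classify the text into neutral, negative or positive. Text: I think the food was okay.",
    }
]
# With later snapshots, the same instruction can go into a separate system message instead.
```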
@@ -1,5 +1,8 @@
 # Retrieval Augmented Generation (RAG)

+import {Cards, Card} from 'nextra-theme-docs'
+import {TerminalIcon} from 'components/icons'
+import {CodeIcon} from 'components/icons'
 import {Screenshot} from 'components/screenshot'
 import RAG from '../../img/rag.png'

@@ -22,4 +25,19 @@ This shows the potential of RAG as a viable option for enhancing outputs of lang

 More recently, these retriever-based approaches have become more popular and are combined with popular LLMs like ChatGPT to improve capabilities and factual consistency.

 You can find a [simple example of how to use retrievers and LLMs for question answering with sources](https://python.langchain.com/docs/use_cases/question_answering/how_to/vector_db_qa) from the LangChain documentation.
+## RAG Use Case: Generating Friendly ML Paper Titles
+
+Below, we have prepared a notebook tutorial showcasing the use of open-source LLMs to build a RAG system for generating short and concise machine learning paper titles:
+
+<Cards>
+<Card
+icon={<CodeIcon />}
+title="Getting Started with RAG"
+href="https://github.com/dair-ai/Prompt-Engineering-Guide/blob/main/notebooks/pe-rag.ipynb"
+/>
+</Cards>
+
+## References
+
+- [Retrieval-Augmented Generation for Large Language Models: A Survey](https://arxiv.org/abs/2312.10997) (Dec 2023)
+- [Retrieval Augmented Generation: Streamlining the creation of intelligent natural language processing models](https://ai.meta.com/blog/retrieval-augmented-generation-streamlining-the-creation-of-intelligent-natural-language-processing-models/) (Sep 2020)
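The retrieve-then-generate loop the RAG page describes can be prototyped in a few lines; a toy sketch in which a keyword-overlap retriever stands in for the vector stores used in the linked notebook:

```python
from openai import OpenAI

client = OpenAI()

def retrieve(query, documents, k=1):
    # Toy retriever: rank documents by word overlap with the query.
    q = set(query.lower().split())
    return sorted(documents, key=lambda d: len(q & set(d.lower().split())), reverse=True)[:k]

docs = [
    "Retrieval Augmented Generation augments a generator with passages fetched by a retriever.",
    "Chain-of-thought prompting elicits intermediate reasoning steps from the model.",
]
context = "\n".join(retrieve("What is retrieval augmented generation?", docs))

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: What is RAG?",
    }],
)
print(response.choices[0].message.content)
```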