{
"cells": [
{
"cell_type": "markdown",
"id": "959300d4",
"metadata": {},
"source": [
"# PromptLayer OpenAI\n",
"\n",
"`PromptLayer` is the first platform that allows you to track, manage, and share your GPT prompt engineering. `PromptLayer` acts a middleware between your code and `OpenAI’ s` python library.\n",
"\n",
"`PromptLayer` records all your `OpenAI API` requests, allowing you to search and explore request history in the `PromptLayer` dashboard.\n",
"\n",
"\n",
"This example showcases how to connect to [PromptLayer](https://www.promptlayer.com) to start recording your OpenAI requests.\n",
"\n",
"Another example is [here](https://python.langchain.com/en/latest/ecosystem/promptlayer.html)."
]
},
{
"cell_type": "markdown",
"id": "6a45943e",
"metadata": {},
"source": [
"## Install PromptLayer\n",
"The `promptlayer` package is required to use PromptLayer with OpenAI. Install `promptlayer` using pip."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "dbe09bd8",
"metadata": {
"tags": [],
"vscode": {
"languageId": "powershell"
}
},
"outputs": [],
"source": [
"!pip install promptlayer"
]
},
{
"cell_type": "markdown",
"id": "536c1dfa",
"metadata": {},
"source": [
"## Imports"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "c16da3b5",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import os\n",
"from langchain.llms import PromptLayerOpenAI\n",
"import promptlayer"
]
},
{
"cell_type": "markdown",
"id": "8564ce7d",
"metadata": {},
"source": [
"## Set the Environment API Key\n",
"You can create a PromptLayer API Key at [www.promptlayer.com](https://www.promptlayer.com) by clicking the settings cog in the navbar.\n",
"\n",
"Set it as an environment variable called `PROMPTLAYER_API_KEY`.\n",
"\n",
"You also need an OpenAI Key, called `OPENAI_API_KEY`."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "1df96674-a9fb-4126-bb87-541082782240",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdin",
"output_type": "stream",
"text": [
" ········\n"
]
}
],
"source": [
"from getpass import getpass\n",
"\n",
"PROMPTLAYER_API_KEY = getpass()"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "46ba25dc",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"os.environ[\"PROMPTLAYER_API_KEY\"] = PROMPTLAYER_API_KEY"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "9aa68c46-4d88-45ba-8a83-18fa41b4daed",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdin",
"output_type": "stream",
"text": [
" ········\n"
]
}
],
"source": [
"from getpass import getpass\n",
"\n",
"OPENAI_API_KEY = getpass()"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "6023b6fa-d9db-49d6-b713-0e19686119b0",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"os.environ[\"OPENAI_API_KEY\"] = OPENAI_API_KEY"
]
},
{
"cell_type": "markdown",
"id": "bf0294de",
"metadata": {},
"source": [
"## Use the PromptLayerOpenAI LLM like normal\n",
"*You can optionally pass in `pl_tags` to track your requests with PromptLayer's tagging feature.*"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3acf0069",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"llm = PromptLayerOpenAI(pl_tags=[\"langchain\"])\n",
"llm(\"I am a cat and I want\")"
]
},
{
"cell_type": "markdown",
"id": "a2d76826",
"metadata": {},
"source": [
"**The above request should now appear on your [PromptLayer dashboard](https://www.promptlayer.com).**"
]
},
{
"cell_type": "markdown",
"id": "05e9e2fe",
"metadata": {},
"source": [
"## Using PromptLayer Track\n",
"If you would like to use any of the [PromptLayer tracking features](https://magniv.notion.site/Track-4deee1b1f7a34c1680d085f82567dab9), you need to pass the argument `return_pl_id` when instantializing the PromptLayer LLM to get the request id. "
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1a7315b9",
"metadata": {},
"outputs": [],
"source": [
"llm = PromptLayerOpenAI(return_pl_id=True)\n",
"llm_results = llm.generate([\"Tell me a joke\"])\n",
"\n",
"for res in llm_results.generations:\n",
" pl_request_id = res[0].generation_info[\"pl_request_id\"]\n",
" promptlayer.track.score(request_id=pl_request_id, score=100)"
]
},
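{
"cell_type": "markdown",
"id": "a1f4c9d2",
"metadata": {},
"source": [
"*As a minimal sketch:* if you have a prompt template saved in your PromptLayer Prompt Registry (here assumed to be named `example_template` with an input variable `topic`), you can attach it to a tracked request with `promptlayer.track.prompt`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b7e2d5c8",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"# Minimal sketch: attach a registered prompt template to a tracked request.\n",
"# The template name \"example_template\" and its input variable \"topic\" are\n",
"# assumed placeholders; use whatever exists in your Prompt Registry.\n",
"llm = PromptLayerOpenAI(return_pl_id=True)\n",
"llm_results = llm.generate([\"Tell me a joke about cats\"])\n",
"\n",
"for res in llm_results.generations:\n",
"    pl_request_id = res[0].generation_info[\"pl_request_id\"]\n",
"    promptlayer.track.prompt(\n",
"        request_id=pl_request_id,\n",
"        prompt_name=\"example_template\",\n",
"        prompt_input_variables={\"topic\": \"cats\"},\n",
"    )"
]
},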
{
"cell_type": "markdown",
"id": "7eb19139",
"metadata": {},
"source": [
"Using this allows you to track the performance of your model in the PromptLayer dashboard. If you are using a prompt template, you can attach a template to a request as well.\n",
"Overall, this gives you the opportunity to track the performance of different templates and models in the PromptLayer dashboard."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
},
"vscode": {
"interpreter": {
"hash": "8a5edab282632443219e051e4ade2d1d5bbc671c781051bf1437897cbdfea0f1"
}
}
},
"nbformat": 4,
"nbformat_minor": 5
}