[azure] add functions notebook sample (#595)

* add azure functions notebook sample

* update api key to use env var + note use of env vars over config in code across azure samples
Krista Pratico 2023-07-21 16:38:49 -07:00 committed by GitHub
parent b5ea5f3b0e
commit 5e050080ab
5 changed files with 515 additions and 4 deletions

View File

@@ -41,6 +41,7 @@
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import openai"
]
},
@@ -105,7 +106,22 @@
"source": [
"if not use_azure_active_directory:\n",
" openai.api_type = 'azure'\n",
" openai.api_key = '' # Add your api key here"
" openai.api_key = os.environ[\"OPENAI_API_KEY\"]"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"> Note: In this example, we configured the library to use the Azure API by setting the variables in code. For development, consider setting the environment variables instead:\n",
"\n",
"```\n",
"OPENAI_API_BASE\n",
"OPENAI_API_KEY\n",
"OPENAI_API_TYPE\n",
"OPENAI_API_VERSION\n",
"```"
]
},
{

View File

@@ -16,6 +16,7 @@
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import openai"
]
},
@@ -62,7 +63,22 @@
"outputs": [],
"source": [
"openai.api_type = 'azure'\n",
"openai.api_key = '' # Please add your api key here\n"
"openai.api_key = os.environ[\"OPENAI_API_KEY\"]\n"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"> Note: In this example, we configured the library to use the Azure API by setting the variables in code. For development, consider setting the environment variables instead:\n",
"\n",
"```\n",
"OPENAI_API_BASE\n",
"OPENAI_API_KEY\n",
"OPENAI_API_TYPE\n",
"OPENAI_API_VERSION\n",
"```"
]
},
{

View File

@@ -16,6 +16,7 @@
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import openai"
]
},
@@ -62,7 +63,22 @@
"outputs": [],
"source": [
"openai.api_type = 'azure'\n",
"openai.api_key = '' # Please add your api key here"
"openai.api_key = os.environ[\"OPENAI_API_KEY\"]"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"> Note: In this example, we configured the library to use the Azure API by setting the variables in code. For development, consider setting the environment variables instead:\n",
"\n",
"```\n",
"OPENAI_API_BASE\n",
"OPENAI_API_KEY\n",
"OPENAI_API_TYPE\n",
"OPENAI_API_VERSION\n",
"```"
]
},
{

View File

@@ -16,6 +16,7 @@
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import openai"
]
},
@@ -62,7 +63,22 @@
"outputs": [],
"source": [
"openai.api_type = 'azure'\n",
"openai.api_key = '' # Please add your api key here"
"openai.api_key = os.environ[\"OPENAI_API_KEY\"]"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"> Note: In this example, we configured the library to use the Azure API by setting the variables in code. For development, consider setting the environment variables instead:\n",
"\n",
"```\n",
"OPENAI_API_BASE\n",
"OPENAI_API_KEY\n",
"OPENAI_API_TYPE\n",
"OPENAI_API_VERSION\n",
"```"
]
},
{

View File

@@ -0,0 +1,447 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# Azure functions example\n",
"\n",
"This notebook shows how to use the function calling capability with the Azure OpenAI service. Functions allow a caller of chat completions to define capabilities that the model can use to extend its\n",
"functionality into external tools and data sources.\n",
"\n",
"You can read more about chat functions on OpenAI's blog: https://openai.com/blog/function-calling-and-other-api-updates\n",
"\n",
"**NOTE**: Chat functions require model versions beginning with gpt-4 and gpt-35-turbo's `-0613` labels. They are not supported by older versions of the models."
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setup\n",
"\n",
"First, we install the necessary dependencies."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"! pip install openai\n",
"# (Optional) If you want to use Microsoft Active Directory\n",
"! pip install azure-identity"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import openai"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"Additionally, to properly access the Azure OpenAI Service, we need to create the proper resources at the [Azure Portal](https://portal.azure.com) (you can check a detailed guide on how to do this in the [Microsoft Docs](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/create-resource?pivots=web-portal))\n",
"\n",
"Once the resource is created, the first thing we need to use is its endpoint. You can get the endpoint by looking at the *\"Keys and Endpoints\"* section under the *\"Resource Management\"* section. Having this, we will set up the SDK using this information:"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [],
"source": [
"openai.api_base = \"\" # Add your endpoint here\n",
"\n",
"# functions is only supported by the 2023-07-01-preview API version\n",
"openai.api_version = \"2023-07-01-preview\""
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"### Authentication\n",
"\n",
"The Azure OpenAI service supports multiple authentication mechanisms that include API keys and Azure credentials."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"use_azure_active_directory = False"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"#### Authentication using API key\n",
"\n",
"To set up the OpenAI SDK to use an *Azure API Key*, we need to set up the `api_type` to `azure` and set `api_key` to a key associated with your endpoint (you can find this key in *\"Keys and Endpoints\"* under *\"Resource Management\"* in the [Azure Portal](https://portal.azure.com))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"if not use_azure_active_directory:\n",
" openai.api_type = \"azure\"\n",
" openai.api_key = os.environ[\"OPENAI_API_KEY\"]"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"> Note: In this example, we configured the library to use the Azure API by setting the variables in code. For development, consider setting the environment variables instead:\n",
"\n",
"```\n",
"OPENAI_API_BASE\n",
"OPENAI_API_KEY\n",
"OPENAI_API_TYPE\n",
"OPENAI_API_VERSION\n",
"```"
]
},
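{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"As a minimal sketch (assuming those four variables are exported before the kernel starts), the same configuration could be read from the environment rather than hardcoded:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch only: pull the Azure OpenAI settings from the environment.\n",
"# Assumes OPENAI_API_TYPE, OPENAI_API_BASE, OPENAI_API_VERSION and OPENAI_API_KEY are set.\n",
"openai.api_type = os.environ.get(\"OPENAI_API_TYPE\", \"azure\")\n",
"openai.api_base = os.environ[\"OPENAI_API_BASE\"]\n",
"openai.api_version = os.environ.get(\"OPENAI_API_VERSION\", \"2023-07-01-preview\")\n",
"openai.api_key = os.environ[\"OPENAI_API_KEY\"]"
]
},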
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Authentication using Microsoft Active Directory\n",
"Let's now see how we can get a key via Microsoft Active Directory Authentication."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azure.identity import DefaultAzureCredential\n",
"\n",
"if use_azure_active_directory:\n",
" default_credential = DefaultAzureCredential()\n",
" token = default_credential.get_token(\"https://cognitiveservices.azure.com/.default\")\n",
"\n",
" openai.api_type = \"azure_ad\"\n",
" openai.api_key = token.token"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"A token is valid for a period of time, after which it will expire. To ensure a valid token is sent with every request, you can refresh an expiring token by hooking into requests.auth:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import typing\n",
"import time\n",
"import requests\n",
"\n",
"if typing.TYPE_CHECKING:\n",
" from azure.core.credentials import TokenCredential\n",
"\n",
"class TokenRefresh(requests.auth.AuthBase):\n",
"\n",
" def __init__(self, credential: \"TokenCredential\", scopes: typing.List[str]) -> None:\n",
" self.credential = credential\n",
" self.scopes = scopes\n",
" self.cached_token: typing.Optional[str] = None\n",
"\n",
" def __call__(self, req):\n",
" if not self.cached_token or self.cached_token.expires_on - time.time() < 300:\n",
" self.cached_token = self.credential.get_token(*self.scopes)\n",
" req.headers[\"Authorization\"] = f\"Bearer {self.cached_token.token}\"\n",
" return req\n",
"\n",
"if use_azure_active_directory:\n",
" session = requests.Session()\n",
" session.auth = TokenRefresh(default_credential, [\"https://cognitiveservices.azure.com/.default\"])\n",
"\n",
" openai.requestssession = session"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Functions\n",
"\n",
"With setup and authentication complete, you can now use functions with the Azure OpenAI service. This will be split into a few steps:\n",
"\n",
"1. Define the function(s)\n",
"2. Pass function definition(s) into chat completions API\n",
"3. Call function with arguments from the response\n",
"4. Feed function response back into chat completions API"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 1. Define the function(s)\n",
"\n",
"A list of functions can be defined, each containing the name of the function, an optional description, and the parameters the function accepts (described as a JSON schema)."
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [],
"source": [
"functions = [\n",
" {\n",
" \"name\": \"get_current_weather\",\n",
" \"description\": \"Get the current weather\",\n",
" \"parameters\": {\n",
" \"type\": \"object\",\n",
" \"properties\": {\n",
" \"location\": {\n",
" \"type\": \"string\",\n",
" \"description\": \"The city and state, e.g. San Francisco, CA\",\n",
" },\n",
" \"format\": {\n",
" \"type\": \"string\",\n",
" \"enum\": [\"celsius\", \"fahrenheit\"],\n",
" \"description\": \"The temperature unit to use. Infer this from the users location.\",\n",
" },\n",
" },\n",
" \"required\": [\"location\"],\n",
" },\n",
" }\n",
"]"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 2. Pass function definition(s) into chat completions API\n",
"\n",
"Now we can pass the function into the chat completions API. If the model determines it should call the function, a `finish_reason` of \"function_call\" will be populated on the choice and the details of which function to call and its arguments will be present in the `message`. Optionally, you can set the `function_call` keyword argument to force the model to call a particular function (e.g. `function_call={\"name\": get_current_weather}`). By default, this is set to `auto`, allowing the model to choose whether to call the function or not. "
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{\n",
" \"choices\": [\n",
" {\n",
" \"content_filter_results\": {},\n",
" \"finish_reason\": \"function_call\",\n",
" \"index\": 0,\n",
" \"message\": {\n",
" \"function_call\": {\n",
" \"arguments\": \"{\\n \\\"location\\\": \\\"Seattle, WA\\\"\\n}\",\n",
" \"name\": \"get_current_weather\"\n",
" },\n",
" \"role\": \"assistant\"\n",
" }\n",
" }\n",
" ],\n",
" \"created\": 1689702512,\n",
" \"id\": \"chatcmpl-7dj6GkYdM7Vw9eGn02bc2qqjN70Ps\",\n",
" \"model\": \"gpt-4\",\n",
" \"object\": \"chat.completion\",\n",
" \"prompt_annotations\": [\n",
" {\n",
" \"content_filter_results\": {\n",
" \"hate\": {\n",
" \"filtered\": false,\n",
" \"severity\": \"safe\"\n",
" },\n",
" \"self_harm\": {\n",
" \"filtered\": false,\n",
" \"severity\": \"safe\"\n",
" },\n",
" \"sexual\": {\n",
" \"filtered\": false,\n",
" \"severity\": \"safe\"\n",
" },\n",
" \"violence\": {\n",
" \"filtered\": false,\n",
" \"severity\": \"safe\"\n",
" }\n",
" },\n",
" \"prompt_index\": 0\n",
" }\n",
" ],\n",
" \"usage\": {\n",
" \"completion_tokens\": 18,\n",
" \"prompt_tokens\": 115,\n",
" \"total_tokens\": 133\n",
" }\n",
"}\n"
]
}
],
"source": [
"messages = [\n",
" {\"role\": \"system\", \"content\": \"Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.\"},\n",
" {\"role\": \"user\", \"content\": \"What's the weather like today in Seattle?\"}\n",
"]\n",
"\n",
"chat_completion = openai.ChatCompletion.create(\n",
" deployment_id=\"gpt-35-turbo-0613\",\n",
" messages=messages,\n",
" functions=functions,\n",
")\n",
"print(chat_completion)"
]
},
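{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"As a hedged sketch of the forcing behavior mentioned above (not run here), passing `function_call={\"name\": \"get_current_weather\"}` makes the model call that function regardless of the prompt; everything else is identical to the previous call."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch only: force the model to call get_current_weather instead of letting it decide.\n",
"forced_completion = openai.ChatCompletion.create(\n",
" deployment_id=\"gpt-35-turbo-0613\",\n",
" messages=messages,\n",
" functions=functions,\n",
" function_call={\"name\": \"get_current_weather\"},\n",
")\n",
"print(forced_completion.choices[0].message.function_call)"
]
},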
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 3. Call function with arguments from the response\n",
"\n",
"The name of the function call will be one that was provided initially and the arguments will include JSON matching the schema included in the function definition."
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"get_current_weather\n",
"{\n",
" \"location\": \"Seattle, WA\"\n",
"}\n"
]
}
],
"source": [
"import json\n",
"\n",
"def get_current_weather(request):\n",
" \"\"\"\n",
" This function is for illustrative purposes.\n",
" The location and unit should be used to determine weather\n",
" instead of returning a hardcoded response.\n",
" \"\"\"\n",
" location = request.get(\"location\")\n",
" unit = request.get(\"unit\")\n",
" return {\"temperature\": \"22\", \"unit\": \"celsius\", \"description\": \"Sunny\"}\n",
"\n",
"function_call = chat_completion.choices[0].message.function_call\n",
"print(function_call.name)\n",
"print(function_call.arguments)\n",
"\n",
"if function_call.name == \"get_current_weather\":\n",
" response = get_current_weather(json.loads(function_call.arguments))"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 4. Feed function response back into chat completions API\n",
"\n",
"The response from the function should be serialized into a new message with the role set to \"function\". Now the model will use the response data to formulate its answer."
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Today in Seattle, the weather is sunny with a temperature of 22 degrees celsius.\n"
]
}
],
"source": [
"messages.append(\n",
" {\n",
" \"role\": \"function\",\n",
" \"name\": \"get_current_weather\",\n",
" \"content\": json.dumps(response)\n",
" }\n",
")\n",
"\n",
"function_completion = openai.ChatCompletion.create(\n",
" deployment_id=\"gpt-35-turbo-0613\",\n",
" messages=messages,\n",
" functions=functions,\n",
")\n",
"\n",
"print(function_completion.choices[0].message.content.strip())"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.0"
},
"vscode": {
"interpreter": {
"hash": "3a5103089ab7e7c666b279eeded403fcec76de49a40685dbdfe9f9c78ad97c17"
}
}
},
"nbformat": 4,
"nbformat_minor": 2
}