{
"cells": [
{
"cell_type": "markdown",
"id": "9e9b7651",
"metadata": {},
"source": [
"# Azure OpenAI\n",
"\n",
"This notebook goes over how to use Langchain with [Azure OpenAI](https://aka.ms/azure-openai).\n",
"\n",
"The Azure OpenAI API is compatible with OpenAI's API. The `openai` Python package makes it easy to use both OpenAI and Azure OpenAI. You can call Azure OpenAI the same way you call OpenAI with the exceptions noted below.\n",
"\n",
"## API configuration\n",
"You can configure the `openai` package to use Azure OpenAI using environment variables. The following is for `bash`:\n",
"\n",
"```bash\n",
"# Set this to `azure`\n",
"export OPENAI_API_TYPE=azure\n",
"# The API version you want to use: set this to `2022-12-01` for the released version.\n",
"export OPENAI_API_VERSION=2022-12-01\n",
"# The base URL for your Azure OpenAI resource. You can find this in the Azure portal under your Azure OpenAI resource.\n",
"export OPENAI_API_BASE=https://your-resource-name.openai.azure.com\n",
"# The API key for your Azure OpenAI resource. You can find this in the Azure portal under your Azure OpenAI resource.\n",
"export OPENAI_API_KEY=<your Azure OpenAI API key>\n",
"```\n",
"\n",
"Alternatively, you can configure the API right within your running Python environment:\n",
"\n",
"```python\n",
"import os\n",
"os.environ[\"OPENAI_API_TYPE\"] = \"azure\"\n",
"...\n",
"```\n",
"\n",
"## Deployments\n",
"With Azure OpenAI, you set up your own deployments of the common GPT-3 and Codex models. When calling the API, you need to specify the deployment you want to use.\n",
"\n",
"Let's say your deployment name is `text-davinci-002-prod`. In the `openai` Python API, you can specify this deployment with the `engine` parameter. For example:\n",
"\n",
"```python\n",
"import openai\n",
"\n",
"response = openai.Completion.create(\n",
" engine=\"text-davinci-002-prod\",\n",
" prompt=\"This is a test\",\n",
" max_tokens=5\n",
")\n",
"```\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "89fdb593-5a42-4098-87b7-1496fa511b1c",
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"!pip install openai"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "8fad2a6e",
"metadata": {},
"outputs": [],
"source": [
"# Import Azure OpenAI\n",
"from langchain.llms import AzureOpenAI"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "8c80213a",
"metadata": {},
"outputs": [],
"source": [
"# Create an instance of Azure OpenAI\n",
"# Replace the deployment name with your own\n",
"llm = AzureOpenAI(deployment_name=\"text-davinci-002-prod\", model_name=\"text-davinci-002\")"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "592dc404",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"'\\n\\nWhy did the chicken cross the road?\\n\\nTo get to the other side.'"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Run the LLM\n",
"llm(\"Tell me a joke\")"
]
},
{
"cell_type": "markdown",
"id": "bbfebea1",
"metadata": {},
"source": [
"We can also print the LLM and see its custom print."
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "9c33fa19",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[1mAzureOpenAI\u001b[0m\n",
"Params: {'deployment_name': 'text-davinci-002', 'model_name': 'text-davinci-002', 'temperature': 0.7, 'max_tokens': 256, 'top_p': 1, 'frequency_penalty': 0, 'presence_penalty': 0, 'n': 1, 'best_of': 1}\n"
]
}
],
"source": [
"print(llm)"
]
},
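{
"cell_type": "markdown",
"id": "azure-llmchain-note",
"metadata": {},
"source": [
"As a quick sketch, the Azure-backed LLM can be used in a chain like any other LangChain LLM. The prompt template and question below are purely illustrative:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "azure-llmchain-example",
"metadata": {},
"outputs": [],
"source": [
"from langchain.prompts import PromptTemplate\n",
"from langchain.chains import LLMChain\n",
"\n",
"# A simple prompt template with a single input variable\n",
"template = \"\"\"Question: {question}\n",
"\n",
"Answer:\"\"\"\n",
"prompt = PromptTemplate(template=template, input_variables=[\"question\"])\n",
"\n",
"# Chain the prompt together with the Azure OpenAI LLM created above\n",
"llm_chain = LLMChain(prompt=prompt, llm=llm)\n",
"\n",
"llm_chain.run(\"What is the capital of France?\")"
]
},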
{
"cell_type": "code",
"execution_count": null,
"id": "5a8b5917",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
},
"vscode": {
"interpreter": {
"hash": "3bae61d45a4f4d73ecea8149862d4bfbae7d4d4a2f71b6e609a1be8f6c8d4298"
}
}
},
"nbformat": 4,
"nbformat_minor": 5
}