diff --git a/examples/azure/archive/chat_with_your_own_data.ipynb b/examples/azure/archive/chat_with_your_own_data.ipynb
index 1c8eca72..98d6a96c 100644
--- a/examples/azure/archive/chat_with_your_own_data.ipynb
+++ b/examples/azure/archive/chat_with_your_own_data.ipynb
@@ -93,14 +93,14 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 2,
+   "execution_count": 4,
    "metadata": {},
    "outputs": [],
    "source": [
    "openai.api_base = os.environ[\"OPENAI_API_BASE\"]\n",
    "\n",
-   "# Azure OpenAI on your own data is only supported by the 2023-08-01-preview API version\n",
-   "openai.api_version = \"2023-08-01-preview\""
+   "# Azure OpenAI on your own data is only supported by preview API versions\n",
+   "openai.api_version = \"2024-02-15-preview\""
    ]
   },
   {
@@ -114,7 +114,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 3,
+   "execution_count": 5,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -133,7 +133,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": 4,
+   "execution_count": 6,
    "metadata": {},
    "outputs": [],
    "source": [
@@ -241,113 +241,21 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "To chat with Azure OpenAI models using your own data with the Python SDK, we must first set up the code to target the chat completions extensions endpoint which is designed to work with your own data. To do this, we've created a convenience function that can be called to set a custom adapter for the library which will target the extensions endpoint for a given deployment ID."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 5,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "import requests\n",
-    "\n",
-    "def setup_byod(deployment_id: str) -> None:\n",
-    "    \"\"\"Sets up the OpenAI Python SDK to use your own data for the chat endpoint.\n",
-    "    \n",
-    "    :param deployment_id: The deployment ID for the model to use with your own data.\n",
-    "\n",
-    "    To remove this configuration, simply set openai.requestssession to None.\n",
-    "    \"\"\"\n",
-    "\n",
-    "    class BringYourOwnDataAdapter(requests.adapters.HTTPAdapter):\n",
-    "\n",
-    "        def send(self, request, **kwargs):\n",
-    "            request.url = f\"{openai.api_base}/openai/deployments/{deployment_id}/extensions/chat/completions?api-version={openai.api_version}\"\n",
-    "            return super().send(request, **kwargs)\n",
-    "\n",
-    "    session = requests.Session()\n",
-    "\n",
-    "    # Mount a custom adapter which will use the extensions endpoint for any call using the given `deployment_id`\n",
-    "    session.mount(\n",
-    "        prefix=f\"{openai.api_base}/openai/deployments/{deployment_id}\",\n",
-    "        adapter=BringYourOwnDataAdapter()\n",
-    "    )\n",
-    "\n",
-    "    if use_azure_active_directory:\n",
-    "        session.auth = TokenRefresh(default_credential, [\"https://cognitiveservices.azure.com/.default\"])\n",
-    "\n",
-    "    openai.requestssession = session\n"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "Now we can call the convenience function to configure the SDK with the model we plan to use for our own data."
+    "Providing our search endpoint, key, and index name for the `data_sources` keyword argument, any questions posed to the model will now be grounded in our own data. An additional property, `context`, will be provided to show the data the model referenced to answer the question."
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": 6,
+   "execution_count": null,
    "metadata": {},
    "outputs": [],
-   "source": [
-    "setup_byod(\"gpt-4\")"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "Providing our search endpoint, key, and index name for the `dataSources` keyword argument, any questions posed to the model will now be grounded in our own data. An additional property, `context`, will be provided to show the data the model referenced to answer the question."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 7,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "{\n",
-      "  \"id\": \"65b485bb-b3c9-48da-8b6f-7d3a219f0b40\",\n",
-      "  \"model\": \"gpt-4\",\n",
-      "  \"created\": 1693338769,\n",
-      "  \"object\": \"extensions.chat.completion\",\n",
-      "  \"choices\": [\n",
-      "    {\n",
-      "      \"index\": 0,\n",
-      "      \"finish_reason\": \"stop\",\n",
-      "      \"message\": {\n",
-      "        \"role\": \"assistant\",\n",
-      "        \"content\": \"Azure AI services and Azure Machine Learning (AML) both aim to apply artificial intelligence (AI) to enhance business operations, but they target different audiences and offer different capabilities [doc1]. \\n\\nAzure AI services are designed for developers without machine learning experience and provide pre-trained models to solve general problems such as text analysis, image recognition, and natural language processing [doc5]. These services require general knowledge about your data without needing experience with machine learning or data science and provide REST APIs and language-based SDKs [doc2].\\n\\nOn the other hand, Azure Machine Learning is tailored for data scientists and involves a longer process of data collection, cleaning, transformation, algorithm selection, model training, and deployment [doc5]. It allows users to create custom solutions for highly specialized and specific problems, requiring familiarity with the subject matter, data, and expertise in data science [doc5].\\n\\nIn summary, Azure AI services offer pre-trained models for developers without machine learning experience, while Azure Machine Learning is designed for data scientists to create custom solutions for specific problems.\",\n",
-      "        \"end_turn\": true,\n",
-      "        \"context\": {\n",
-      "          \"messages\": [\n",
-      "            {\n",
-      "              \"role\": \"tool\",\n",
-      "              \"content\": \"{\\\"citations\\\": [{\\\"content\\\": \\\"How are Azure AI services and Azure Machine Learning (AML) similar?.\\\\nBoth have the end-goal of applying artificial intelligence (AI) to enhance business operations, though how each provides this in the respective offerings is different..\\\\nGenerally, the audiences are different:\\\\n