{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "368686b4-f487-4dd4-aeff-37823976529d",
   "metadata": {},
   "source": [
    "<a href=\"https://colab.research.google.com/github/run-llama/llama_index/blob/main/docs/examples/multi_modal/azure_openai_multi_modal.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
    "\n",
    "# Multi-Modal LLM using Azure OpenAI GPT-4V model for image reasoning\n",
    "\n",
    "In this notebook, we show how to use the **Azure** OpenAI GPT-4V multi-modal LLM class/abstraction for image understanding and reasoning. For a more complete example, please visit [this notebook](https://github.com/run-llama/llama_index/blob/main/docs/examples/multi_modal/openai_multi_modal.ipynb)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "8d57f059",
   "metadata": {},
   "outputs": [],
   "source": [
    "%pip install llama-index-multi-modal-llms-azure-openai"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "fc691ca8",
   "metadata": {},
   "outputs": [],
   "source": [
    "%pip install openai"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e1a5b0f05aadc911",
   "metadata": {},
   "source": [
    "## Prerequisites\n",
    "\n",
    "1. Set up an Azure subscription - you can create one for free [here](https://azure.microsoft.com/en-us/free/cognitive-services/)\n",
    "2. Apply for access to Azure OpenAI Service [here](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUOFA5Qk1UWDRBMjg0WFhPMkIzTzhKQ1dWNyQlQCN0PWcu) \n",
    "3. Create a resource in the Azure portal [here](https://portal.azure.com/?microsoft_azure_marketplace_ItemHideKey=microsoft_openai_tip#create/Microsoft.CognitiveServicesOpenAI)\n",
    "4. Deploy a model in Azure OpenAI Studio [here](https://oai.azure.com/)\n",
    "\n",
    "\n",
    "You can find more details in [this guide.](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/create-resource?pivots=web-portal)\n",
    "\n",
    "Note down the **\"model name\"** and **\"deployment name\"**; you'll need them when connecting to your LLM."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "4479bf64",
   "metadata": {},
   "source": [
    "## Use GPT-4V to Understand Images from URLs / base64"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5455d8c6",
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "\n",
    "os.environ[\"AZURE_OPENAI_API_KEY\"] = \"<your-api-key>\"\n",
    "os.environ[\n",
    "    \"AZURE_OPENAI_ENDPOINT\"\n",
    "] = \"https://<your-resource-name>.openai.azure.com/\"\n",
    "os.environ[\"OPENAI_API_VERSION\"] = \"2023-12-01-preview\""
   ]
  },
  {
   "cell_type": "markdown",
   "id": "3d0d083e",
   "metadata": {},
   "source": [
    "## Initialize `AzureOpenAIMultiModal` and Load Images from URLs\n",
    "\n",
    "Unlike the regular `OpenAI` class, you need to pass an `engine` argument in addition to `model`. The `engine` is the name of the model deployment you selected in Azure OpenAI Studio."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "9e5c6b095ed0995d",
   "metadata": {},
   "outputs": [],
   "source": [
    "from llama_index.multi_modal_llms.azure_openai import AzureOpenAIMultiModal"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "cf65789c9a66cc8b",
   "metadata": {},
   "outputs": [],
   "source": [
    "azure_openai_mm_llm = AzureOpenAIMultiModal(\n",
    "    engine=\"gpt-4-vision-preview\",\n",
    "    api_version=\"2023-12-01-preview\",\n",
    "    model=\"gpt-4-vision-preview\",\n",
    "    max_new_tokens=300,\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "fa22796cdd358274",
   "metadata": {},
   "source": [
    "Alternatively, you can skip setting environment variables and pass the parameters directly via the constructor."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "4d310e814a6da1c4",
   "metadata": {},
   "outputs": [],
   "source": [
    "azure_openai_mm_llm = AzureOpenAIMultiModal(\n",
    "    azure_endpoint=\"https://<your-endpoint>.openai.azure.com\",\n",
    "    engine=\"gpt-4-vision-preview\",\n",
    "    api_version=\"2023-12-01-preview\",\n",
    "    model=\"gpt-4-vision-preview\",\n",
    "    max_new_tokens=300,\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "8725b6d2",
   "metadata": {},
   "outputs": [],
   "source": [
    "import base64\n",
    "import requests\n",
    "from llama_index.core.schema import ImageDocument\n",
    "\n",
    "image_url = \"https://www.visualcapitalist.com/wp-content/uploads/2023/10/US_Mortgage_Rate_Surge-Sept-11-1.jpg\"\n",
    "\n",
    "response = requests.get(image_url)\n",
    "if response.status_code != 200:\n",
    "    raise ValueError(\"Error: Could not retrieve image from URL.\")\n",
    "base64str = base64.b64encode(response.content).decode(\"utf-8\")\n",
    "\n",
    "image_document = ImageDocument(image=base64str, image_mimetype=\"image/jpeg\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "05d94bcb",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<img width=400 src=\"\"/>"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "execution_count": null,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "from IPython.display import HTML\n",
    "\n",
    "HTML(f'<img width=400 src=\"data:image/jpeg;base64,{base64str}\"/>')"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "fbd9c116",
   "metadata": {},
   "source": [
    "### Complete a prompt with an image"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c96ab53e",
   "metadata": {},
   "outputs": [],
   "source": [
    "complete_response = azure_openai_mm_llm.complete(\n",
    "    prompt=\"Describe the image as alternative text\",\n",
    "    image_documents=[image_document],\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3eba4477",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "The image is a line graph showing the U.S. 30-year fixed-rate mortgage percentage rate and existing home sales from 2015 to 2021. The mortgage rate is represented by a red line, while the home sales are represented by a blue line. The graph shows that the mortgage rate has reached its highest level in over 20 years, while home sales have fluctuated over the same period. There is also a note that the data is sourced from the U.S. Federal Reserve, Trading Economics, and Visual Capitalist.\n"
     ]
    }
   ],
   "source": [
    "print(complete_response)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
