{
  "cells": [
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "8RopCue2ab0K"
      },
      "outputs": [],
      "source": [
        "# Copyright 2025 Google LLC\n",
        "#\n",
        "# Licensed under the Apache License, Version 2.0 (the \"License\");\n",
        "# you may not use this file except in compliance with the License.\n",
        "# You may obtain a copy of the License at\n",
        "#\n",
        "#     https://www.apache.org/licenses/LICENSE-2.0\n",
        "#\n",
        "# Unless required by applicable law or agreed to in writing, software\n",
        "# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
        "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
        "# See the License for the specific language governing permissions and\n",
        "# limitations under the License."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "VyPmicX9RlZX"
      },
      "source": [
        "# Get hands-on with a customer support use case using Gemini and Gen AI SDK\n",
        "\n",
        "<table align=\"left\">\n",
        "  <td style=\"text-align: center\">\n",
        "    <a href=\"https://colab.research.google.com/github/GoogleCloudPlatform/generative-ai/blob/main/gemini/use-cases/customer-support/customer_support_gemini_genai_sdk.ipynb\">\n",
        "      <img width=\"32px\" src=\"https://www.gstatic.com/pantheon/images/bigquery/welcome_page/colab-logo.svg\" alt=\"Google Colaboratory logo\"><br> Open in Colab\n",
        "    </a>\n",
        "  </td>\n",
        "  <td style=\"text-align: center\">\n",
        "    <a href=\"https://console.cloud.google.com/vertex-ai/colab/import/https:%2F%2Fraw.githubusercontent.com%2FGoogleCloudPlatform%2Fgenerative-ai%2Fmain%2Fgemini%2Fuse-cases%2Fcustomer-support%2Fcustomer_support_gemini_genai_sdk.ipynb\">\n",
        "      <img width=\"32px\" src=\"https://lh3.googleusercontent.com/JmcxdQi-qOpctIvWKgPtrzZdJJK-J3sWE1RsfjZNwshCFgE_9fULcNpuXYTilIR2hjwN\" alt=\"Google Cloud Colab Enterprise logo\"><br> Open in Colab Enterprise\n",
        "    </a>\n",
        "  </td>\n",
        "  <td style=\"text-align: center\">\n",
        "    <a href=\"https://console.cloud.google.com/vertex-ai/workbench/deploy-notebook?download_url=https://raw.githubusercontent.com/GoogleCloudPlatform/generative-ai/main/gemini/use-cases/customer-support/customer_support_gemini_genai_sdk.ipynb\">\n",
        "      <img src=\"https://www.gstatic.com/images/branding/gcpiconscolors/vertexai/v1/32px.svg\" alt=\"Vertex AI logo\"><br> Open in Vertex AI Workbench\n",
        "    </a>\n",
        "  </td>\n",
        "  <td style=\"text-align: center\">\n",
        "    <a href=\"https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/use-cases/customer-support/customer_support_gemini_genai_sdk.ipynb\">\n",
        "      <img width=\"32px\" src=\"https://raw.githubusercontent.com/primer/octicons/refs/heads/main/icons/mark-github-24.svg\" alt=\"GitHub logo\"><br> View on GitHub\n",
        "    </a>\n",
        "  </td>\n",
        "</table>\n",
        "\n",
        "<div style=\"clear: both;\"></div>\n",
        "\n",
        "<b>Share to:</b>\n",
        "\n",
        "<a href=\"https://www.linkedin.com/sharing/share-offsite/?url=https%3A//github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/use-cases/customer-support/customer_support_gemini_genai_sdk.ipynb\" target=\"_blank\">\n",
        "  <img width=\"20px\" src=\"https://upload.wikimedia.org/wikipedia/commons/8/81/LinkedIn_icon.svg\" alt=\"LinkedIn logo\">\n",
        "</a>\n",
        "\n",
        "<a href=\"https://bsky.app/intent/compose?text=https%3A//github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/use-cases/customer-support/customer_support_gemini_genai_sdk.ipynb\" target=\"_blank\">\n",
        "  <img width=\"20px\" src=\"https://upload.wikimedia.org/wikipedia/commons/7/7a/Bluesky_Logo.svg\" alt=\"Bluesky logo\">\n",
        "</a>\n",
        "\n",
        "<a href=\"https://twitter.com/intent/tweet?url=https%3A//github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/use-cases/customer-support/customer_support_gemini_genai_sdk.ipynb\" target=\"_blank\">\n",
        "  <img width=\"20px\" src=\"https://upload.wikimedia.org/wikipedia/commons/5/5a/X_icon_2.svg\" alt=\"X logo\">\n",
        "</a>\n",
        "\n",
        "<a href=\"https://reddit.com/submit?url=https%3A//github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/use-cases/customer-support/customer_support_gemini_genai_sdk.ipynb\" target=\"_blank\">\n",
        "  <img width=\"20px\" src=\"https://redditinc.com/hubfs/Reddit%20Inc/Brand/Reddit_Logo.png\" alt=\"Reddit logo\">\n",
        "</a>\n",
        "\n",
        "<a href=\"https://www.facebook.com/sharer/sharer.php?u=https%3A//github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/use-cases/customer-support/customer_support_gemini_genai_sdk.ipynb\" target=\"_blank\">\n",
        "  <img width=\"20px\" src=\"https://upload.wikimedia.org/wikipedia/commons/5/51/Facebook_f_logo_%282019%29.svg\" alt=\"Facebook logo\">\n",
        "</a>"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "fI7REy_PauX_"
      },
      "source": [
        "| Author |\n",
        "| --- |\n",
        "| [Eric Dong](https://github.com/gericdong) |"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "hXLnBuhYa15O"
      },
      "source": [
        "## Overview\n",
        "\n",
        "This tutorial gives you hands-on experience building your own Gemini-powered applications. It walks through practical examples and code snippets, focused on a customer support use case, to show you how to start creating AI applications with Gemini and the Google Gen AI SDK."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "EAICO794dAmy"
      },
      "source": [
        "## Getting Started"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "CHRZUpfWSEpp"
      },
      "source": [
        "### ⚙️ **Setup**: Install Gen AI SDK\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "sG3_LKsWSD3A"
      },
      "outputs": [],
      "source": [
        "%pip install --upgrade --quiet google-genai"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "LymmEN6GSTn-"
      },
      "source": [
        "#### Authenticate your notebook environment"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "12fnq4V0SNV3"
      },
      "outputs": [],
      "source": [
        "import sys\n",
        "\n",
        "if \"google.colab\" in sys.modules:\n",
        "    from google.colab import auth\n",
        "\n",
        "    auth.authenticate_user()"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "nanp0kdisICh"
      },
      "source": [
        "#### Set Google Cloud project and location"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "Gz3Bq7dwq3K1"
      },
      "outputs": [],
      "source": [
        "import os\n",
        "\n",
        "PROJECT_ID = \"[your-project-id]\"  # @param {type: \"string\", placeholder: \"[your-project-id]\", isTemplate: true}\n",
        "if not PROJECT_ID or PROJECT_ID == \"[your-project-id]\":\n",
        "    PROJECT_ID = str(os.environ.get(\"GOOGLE_CLOUD_PROJECT\"))\n",
        "\n",
        "LOCATION = os.environ.get(\"GOOGLE_CLOUD_REGION\", \"us-central1\")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "37CH91ddY9kG"
      },
      "source": [
        "#### Create a product catalog"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "c5P-OweZpuNO"
      },
      "outputs": [],
      "source": [
        "from google.cloud import storage\n",
        "\n",
        "storage_client = storage.Client()\n",
        "bucket = storage_client.bucket(\"cloud-samples-data\")\n",
        "blobs = bucket.list_blobs(prefix=\"generative-ai/image/retail\")\n",
        "\n",
        "item_id = 0\n",
        "product_catalog = []\n",
        "for blob in blobs:\n",
        "    if blob.name.endswith(\".png\"):\n",
        "        item = {\n",
        "            \"id\": item_id,\n",
        "            \"inventory\": \"In stock\",\n",
        "            \"price\": 39.99,\n",
        "            \"image_url\": blob.public_url,\n",
        "        }\n",
        "        product_catalog.append(item)\n",
        "        item_id += 1"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "EdvJRUWRNGHE"
      },
      "source": [
        "### 1️⃣ Import libraries\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "qgdSpVmDbdQ9"
      },
      "outputs": [],
      "source": [
        "from IPython.display import HTML, Audio, Image, Markdown, display\n",
        "from google import genai\n",
        "from google.genai.types import (\n",
        "    Content,\n",
        "    FunctionDeclaration,\n",
        "    GenerateContentConfig,\n",
        "    GoogleSearch,\n",
        "    LiveConnectConfig,\n",
        "    Part,\n",
        "    Tool,\n",
        ")\n",
        "import numpy as np"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "E_0XTXckBAmy"
      },
      "source": [
        "### 2️⃣ Create a client"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "zpIPG_YhSjaw"
      },
      "outputs": [],
      "source": [
        "client = genai.Client(vertexai=True, project=PROJECT_ID, location=LOCATION)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "a07vlxdfdP5l"
      },
      "source": [
        "## Customer support use case\n",
        "\n",
        "Imagine a customer shopping for a chair. He found a picture of a chair he likes on the web, and now he is reaching out online to his favorite furniture store for assistance.\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "KGAKijPSC73g"
      },
      "source": [
        "#### 🤔 **Example question 1**: Do you have a chair similar to the one in this picture, but in red?\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "yAyk3cWpC4jw"
      },
      "outputs": [],
      "source": [
        "customer_query = \"Do you have chairs similar to the one in this picture, but in red?\""
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "sae1XmAohTm0"
      },
      "outputs": [],
      "source": [
        "customer_chair_image_url = (\n",
        "    \"https://storage.googleapis.com/cloud-samples-data/generative-ai/image/armchair.png\"\n",
        ")\n",
        "\n",
        "display(Image(url=customer_chair_image_url, width=300))"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "uFDdiSuqR6w4"
      },
      "source": [
        "### 3️⃣ Send requests to the model to generate content\n",
        "\n",
        "The request may contain:\n",
        "- `model`: The model ID\n",
        "- `contents`: Multimodal input (text, PDFs, images, audio, video)\n",
        "- `config`: Configuration for generation\n",
        "\n",
        "Learn more about [Google models](https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models)."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "SNtTH-4EFwFy"
      },
      "outputs": [],
      "source": [
        "MODEL_ID = \"gemini-2.5-pro\"  # gemini-2.5-pro, gemini-2.0-flash, gemini-2.0-flash-lite"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "KF4I1qrkfdAG"
      },
      "source": [
        "Use `contents` to construct multimodal input including text, PDFs, images, audio, video, and code all together.\n",
        "\n",
        "Here, you can leverage the long context window, which lets the model analyze large amounts of information in a single request."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "uRYbBLDffv9h"
      },
      "outputs": [],
      "source": [
        "product_catalog_parts = []\n",
        "for product in product_catalog:\n",
        "    product_catalog_parts.append(f\"Chair (id={product['id']}):\")\n",
        "    product_catalog_parts.append(\n",
        "        Part.from_uri(file_uri=product[\"image_url\"], mime_type=\"image/png\")\n",
        "    )\n",
        "\n",
        "\n",
        "contents = [\n",
        "    customer_query,\n",
        "    Part.from_uri(file_uri=customer_chair_image_url, mime_type=\"image/png\"),\n",
        "    \"catalog:\",\n",
        "    *product_catalog_parts,\n",
        "]"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "mUu7lubTfYXK"
      },
      "source": [
        "Use `system_instruction` to give the model additional context to understand the task, provide more customized responses, and adhere to specific guidelines throughout the full user interaction with the model.\n",
        "\n",
        "Gemini 2.0 models are trained to give conversational responses that are short, to the point, and cost-effective to serve. If you need richer, more verbose answers, use a system instruction such as:\n",
        "\n",
        "\n",
        "> All questions should be answered comprehensively with details,\n",
        "unless the user requests a concise response specifically.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "ye5-5gSIfRjK"
      },
      "outputs": [],
      "source": [
        "system_instruction = \"\"\"\n",
        "You are an expert sales assistant specializing in furniture recommendations.\n",
        "\"\"\""
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "bcWXLz8ef1k1"
      },
      "source": [
        "Use `client.models.generate_content` to send a request to the model with `model`, `contents`, `config`, `tools`, etc.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "Dl8b5_yWwZWg"
      },
      "outputs": [],
      "source": [
        "%%time\n",
        "\n",
        "response = client.models.generate_content(\n",
        "    model=MODEL_ID,\n",
        "    contents=contents,\n",
        "    config=GenerateContentConfig(\n",
        "        system_instruction=system_instruction,\n",
        "    ),\n",
        ")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "_8DNdQ9UEpNy"
      },
      "source": [
        "### 4️⃣ Read model responses\n",
        "\n",
        "The response may contain:\n",
        "- Multimodal output (text, code, images, audio, embeddings)\n",
        "- Response metadata"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "cT0fruYTMpcC"
      },
      "outputs": [],
      "source": [
        "response"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "oY8RMOm3Npf9"
      },
      "source": [
        "You can use `Markdown` to display the formatted text.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "6o77sT_SNs1H"
      },
      "outputs": [],
      "source": [
        "display(Markdown(response.text))"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "qTej417fIEOw"
      },
      "source": [
        "### 🚀 **Feature**: Structured output (controlled generation)\n",
        "\n",
        "When you want the model's response returned in a structured data format, for example because downstream modules expect a specific format as input, use controlled generation to ensure that the model output complies with a defined schema.\n",
        "\n",
        "You can define a response schema for the model output in either of two ways:\n",
        "\n",
        "- **Option 1**: Using a Pydantic object"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "m3GoA5HzJQxA"
      },
      "outputs": [],
      "source": [
        "from pydantic import BaseModel, RootModel\n",
        "\n",
        "\n",
        "class MatchedFurnitureInfo(BaseModel):\n",
        "    id: int\n",
        "    match_score: int\n",
        "    match_reason: str\n",
        "\n",
        "\n",
        "class MatchedFurnitureList(RootModel):\n",
        "    root: list[MatchedFurnitureInfo]\n",
        "\n",
        "    def __iter__(self):\n",
        "        return iter(self.root)\n",
        "\n",
        "    def __getitem__(self, item):\n",
        "        return self.root[item]"
      ]
    },
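    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a quick local sanity check (a sketch, not part of the original flow), the Pydantic classes above can validate a hand-written sample payload before you wire `response.parsed` into downstream code. The sample JSON below is illustrative only."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Illustrative only: validate a hand-written sample payload against the schema above.\n",
        "sample_json = '[{\"id\": 2, \"match_score\": 87, \"match_reason\": \"Similar shape, red fabric\"}]'\n",
        "matches = MatchedFurnitureList.model_validate_json(sample_json)\n",
        "for match in matches:\n",
        "    print(match.id, match.match_score, match.match_reason)"
      ]
    },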
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "-NwKorrbJbsm"
      },
      "source": [
        "- **Option 2**: Using a dictionary in OpenAPI schema format\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "aFDZntCKJcCu"
      },
      "outputs": [],
      "source": [
        "response_schema = {\n",
        "    \"type\": \"ARRAY\",\n",
        "    \"items\": {\n",
        "        \"type\": \"OBJECT\",\n",
        "        \"properties\": {\n",
        "            \"id\": {\"type\": \"INTEGER\"},\n",
        "            \"match_reason\": {\"type\": \"STRING\", \"nullable\": True},\n",
        "        },\n",
        "    },\n",
        "}"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "ybFEGPYkJ8H3"
      },
      "source": [
        "This example uses the OpenAPI schema option and sets the defined schema as the `response_schema` in `GenerateContentConfig`."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "qItf6-QJlpR3"
      },
      "outputs": [],
      "source": [
        "response = client.models.generate_content(\n",
        "    model=\"gemini-2.0-flash\",\n",
        "    contents=response.text,\n",
        "    config=GenerateContentConfig(\n",
        "        system_instruction=\"Convert the given text to JSON\",\n",
        "        response_mime_type=\"application/json\",\n",
        "        response_schema=response_schema,\n",
        "    ),\n",
        ")"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "2Sy-1fxWw_Yz"
      },
      "outputs": [],
      "source": [
        "response.parsed"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "Coyc8ZqEl_5_"
      },
      "outputs": [],
      "source": [
        "matching_items: list = response.parsed\n",
        "\n",
        "display(HTML(\"<style>#output-body{display:flex; flex-direction: row;}</style>\"))\n",
        "display(HTML(\"Original: \"), Image(url=customer_chair_image_url, width=200))\n",
        "display(HTML(\"Recommended: \"))\n",
        "\n",
        "matching_items_urls = []\n",
        "for item in matching_items:\n",
        "    url = product_catalog[item[\"id\"]][\"image_url\"]\n",
        "    display(Image(url=url, width=200))\n",
        "    matching_items_urls.append(url)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "CIORDrCCassP"
      },
      "source": [
        "### 🤔 **Example question 2**: Would the chair fit in my room?\n",
        "\n",
        "- Reasoning across images and text"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "sQByglSCPUHt"
      },
      "outputs": [],
      "source": [
        "customer_query = \"Would the chair fit in my room?\""
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "TWOQErQDfgrx"
      },
      "outputs": [],
      "source": [
        "customer_room_image_url = \"https://storage.googleapis.com/cloud-samples-data/generative-ai/image/living-room-2.png\"\n",
        "\n",
        "display(Image(url=customer_room_image_url, width=400))"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "wVkO1iA8PGv1"
      },
      "source": [
        "### Send a request to the model\n",
        "\n",
        "Here, you leverage multimodal understanding to reason across the images of both the chair and the room."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "Kku6akon9d-u"
      },
      "outputs": [],
      "source": [
        "system_instruction = \"\"\"\n",
        "  You are an interior designer.\n",
        "  Your mission is to help customers to create living spaces that balance functionality\n",
        "  and beauty through personalized service and sustainable design practices.\n",
        "\"\"\"\n",
        "\n",
        "contents = [\n",
        "    \"Chair:\",\n",
        "    Part.from_uri(file_uri=matching_items_urls[0], mime_type=\"image/png\"),\n",
        "    \"Living room:\",\n",
        "    Part.from_uri(file_uri=customer_room_image_url, mime_type=\"image/png\"),\n",
        "    customer_query,\n",
        "]\n",
        "\n",
        "response = client.models.generate_content(\n",
        "    model=\"gemini-2.0-flash\",\n",
        "    contents=contents,\n",
        "    config=GenerateContentConfig(\n",
        "        system_instruction=system_instruction,\n",
        "    ),\n",
        ")\n",
        "\n",
        "display(Markdown(response.text))"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "0KHz3YMFPgb7"
      },
      "source": [
        "### 🚀 **Feature**: Image generation\n",
        "\n",
        "You can also generate an image of the chair placed in the living room, showcasing Gemini's ability to generate images natively."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "Azo_HLmpPde2"
      },
      "outputs": [],
      "source": [
        "response = client.models.generate_content(\n",
        "    model=\"gemini-2.0-flash-preview-image-generation\",\n",
        "    contents=[\n",
        "        *contents,\n",
        "        \"Create an image with the chair integrated into the living room\",\n",
        "    ],\n",
        "    config=GenerateContentConfig(\n",
        "        response_modalities=[\"TEXT\", \"IMAGE\"],\n",
        "    ),\n",
        ")\n",
        "\n",
        "for part in response.candidates[0].content.parts:\n",
        "    if part.text:\n",
        "        display(Markdown(part.text))\n",
        "    if part.inline_data:\n",
        "        display(Image(data=part.inline_data.data, width=400))"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "NeP2E49KWzos"
      },
      "source": [
        "### 🤔 **Example question 3**: Is this chair available at a store near me? I am at Google Cloud Next!\n",
        "\n",
        "Here you use function calling and native tool use to enable the model to connect to built-in or external tools and systems and fetch real-time or business data.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "kioI4ZD6StdN"
      },
      "outputs": [],
      "source": [
        "customer_query = \"Do you have the chair available in a store near me? I am at Google Cloud Next 2025.\""
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "nK-ZjxDvS2QF"
      },
      "source": [
        "### 🚀 **Feature**: Function calling\n",
        "\n",
        "Define a `FunctionDeclaration` with the function's name, a description of what it does, and the parameters to call it with. You can also define a Python function for automatic function calling, in which case the SDK runs the function and the model returns its output in natural language."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "n9MytQ-FPXIz"
      },
      "outputs": [],
      "source": [
        "get_product_info_function = FunctionDeclaration(\n",
        "    name=\"get_product_info\",\n",
        "    description=\"Get the stock amount and identifier for a given product\",\n",
        "    parameters={\n",
        "        \"type\": \"OBJECT\",\n",
        "        \"properties\": {\n",
        "            \"product_name\": {\"type\": \"STRING\", \"description\": \"Product name\"}\n",
        "        },\n",
        "    },\n",
        ")\n",
        "\n",
        "get_store_location_function = FunctionDeclaration(\n",
        "    name=\"get_store_location\",\n",
        "    description=\"Get the location of the closest store\",\n",
        "    parameters={\n",
        "        \"type\": \"OBJECT\",\n",
        "        \"properties\": {\"location\": {\"type\": \"STRING\", \"description\": \"Location\"}},\n",
        "    },\n",
        ")"
      ]
    },
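    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a sketch of the automatic function calling mentioned above (not part of the original flow), you can pass a plain Python function directly as a tool; the SDK then calls it for you and the model answers in natural language. The `get_local_product_info` stub below is hypothetical."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "def get_local_product_info(product_name: str) -> dict:\n",
        "    \"\"\"Hypothetical stub that returns static inventory data for a product.\"\"\"\n",
        "    return {\"id\": \"3\", \"in_stock\": \"yes\"}\n",
        "\n",
        "\n",
        "# The SDK detects the callable, invokes it, and feeds the result back to the model.\n",
        "response = client.models.generate_content(\n",
        "    model=\"gemini-2.0-flash\",\n",
        "    contents=\"Is the armchair in stock?\",\n",
        "    config=GenerateContentConfig(tools=[get_local_product_info]),\n",
        ")\n",
        "\n",
        "display(Markdown(response.text))"
      ]
    },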
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Q7JB8rbwSddI"
      },
      "source": [
        "Define a `Tool` that allows the model to select from a set of defined functions."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "xhhcUJjaPbwV"
      },
      "outputs": [],
      "source": [
        "retail_tool = Tool(\n",
        "    function_declarations=[\n",
        "        get_product_info_function,\n",
        "        get_store_location_function,\n",
        "    ],\n",
        ")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "XKCqiOHJSVLZ"
      },
      "source": [
        "Use function calling in a `chats` session to answer the user's questions about the products."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "iU0AOZs6gcvN"
      },
      "outputs": [],
      "source": [
        "chat = client.chats.create(\n",
        "    model=\"gemini-2.0-flash\",\n",
        "    config=GenerateContentConfig(\n",
        "        temperature=0,\n",
        "        tools=[retail_tool],\n",
        "    ),\n",
        ")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "2T7C0tdPTI3E"
      },
      "source": [
        "The model generates function calls that you can use to connect to an external system and fetch real-time or business data."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "NfuSOgW8Pjjx"
      },
      "outputs": [],
      "source": [
        "response = chat.send_message(customer_query)\n",
        "\n",
        "response.function_calls"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "teVbr4RD9wZH"
      },
      "source": [
        "### 🚀 **Feature**: Google Search\n",
        "\n",
        "When you add `GoogleSearch` to a `Tool`, Gemini automatically performs a Google search and incorporates information from the web into its responses. It also provides direct citations so you can easily fact-check sources."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "-w03L_82P0qV"
      },
      "outputs": [],
      "source": [
        "def get_store_location(location: str):\n",
        "    google_search_tool = Tool(google_search=GoogleSearch())\n",
        "\n",
        "    prompt = f\"What is the location for {location}?\"\n",
        "\n",
        "    response = client.models.generate_content(\n",
        "        model=\"gemini-2.0-flash-001\",\n",
        "        contents=prompt,\n",
        "        config=GenerateContentConfig(tools=[google_search_tool], temperature=0),\n",
        "    )\n",
        "\n",
        "    return {\"store\": response.text}\n",
        "\n",
        "\n",
        "def get_product_info(product_name: str):\n",
        "    # Mock implementation: return static business data for demo purposes.\n",
        "    return {\"id\": \"3\", \"in_stock\": \"yes\"}"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "F8i3Ux3PTuPE"
      },
      "source": [
        "This is where you use the model's function calls to connect to external systems and fetch real-time or business data."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "H1ZHlsiqfchF"
      },
      "outputs": [],
      "source": [
        "function_response_parts = []\n",
        "\n",
        "for function_call in response.function_calls:\n",
        "    if function_call.name == \"get_store_location\":\n",
        "        function_response = get_store_location(**function_call.args)\n",
        "    elif function_call.name == \"get_product_info\":\n",
        "        function_response = get_product_info(**function_call.args)\n",
        "    else:\n",
        "        raise ValueError(f\"Unknown function: {function_call.name}\")\n",
        "\n",
        "    print(function_call.name)\n",
        "    print(function_response)\n",
        "\n",
        "    function_response_part = Part.from_function_response(\n",
        "        name=function_call.name,\n",
        "        response=function_response,\n",
        "    )\n",
        "\n",
        "    function_response_parts.append(function_response_part)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Uc5d9Q3ZT5Ko"
      },
      "source": [
        "The model incorporates the data from external systems and returns the output in natural language."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "psVYQ2PmiCbY"
      },
      "outputs": [],
      "source": [
        "response = chat.send_message(function_response_parts)\n",
        "\n",
        "display(Markdown(response.text))"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "xN4JrBBVeD_0"
      },
      "source": [
        "### 🚀 **Feature**: Live API\n",
        "\n",
        "#### **Example**: Text-to-audio conversation\n",
        "\n",
        "**Step 1**: You set up a conversation with the API that allows you to send text prompts and receive audio responses.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "AuPizI-nedBB"
      },
      "outputs": [],
      "source": [
        "config = LiveConnectConfig(\n",
        "    response_modalities=[\"AUDIO\"],\n",
        "    tools=[Tool(google_search=GoogleSearch())],\n",
        ")\n",
        "\n",
        "\n",
        "async def main() -> None:\n",
        "    async with client.aio.live.connect(\n",
        "        model=\"gemini-2.0-flash-live-preview-04-09\", config=config\n",
        "    ) as session:\n",
        "\n",
        "        async def send() -> bool:\n",
        "            text_input = input(\"Input > \")\n",
        "            if text_input.lower() in (\"q\", \"quit\", \"exit\"):\n",
        "                return False\n",
        "            await session.send_client_content(\n",
        "                turns=Content(role=\"user\", parts=[Part(text=text_input)])\n",
        "            )\n",
        "\n",
        "            return True\n",
        "\n",
        "        async def receive() -> None:\n",
        "\n",
        "            audio_data = []\n",
        "\n",
        "            async for message in session.receive():\n",
        "                if (\n",
        "                    message.server_content\n",
        "                    and message.server_content.model_turn\n",
        "                    and message.server_content.model_turn.parts\n",
        "                ):\n",
        "                    for part in message.server_content.model_turn.parts:\n",
        "                        if part.inline_data:\n",
        "                            audio_data.append(\n",
        "                                np.frombuffer(part.inline_data.data, dtype=np.int16)\n",
        "                            )\n",
        "\n",
        "                if message.server_content and message.server_content.turn_complete:\n",
        "                    display(Markdown(\"**Response >**\"))\n",
        "                    display(\n",
        "                        Audio(np.concatenate(audio_data), rate=24000, autoplay=True)\n",
        "                    )\n",
        "                    break\n",
        "\n",
        "            return\n",
        "\n",
        "        while True:\n",
        "            if not await send():\n",
        "                break\n",
        "            await receive()"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "94IeUUb3e90M"
      },
      "source": [
        "**Step 2**: Run the chat, enter your prompts, and type `q`, `quit`, or `exit` to quit.\n",
        "\n",
        "Sample questions:\n",
        "- Where is Google Cloud Next? Give me the address.\n",
        "- How is the weather today?"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "2UvgUDIYJqfw"
      },
      "outputs": [],
      "source": [
        "await main()"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "KAtlZrA7ULlj"
      },
      "source": [
        "# 🎉 What's Next?\n",
        "\n",
        "🚀 Learn more at the Google Cloud Gen AI sample repository 👇\n",
        "\n",
        "[github.com/GoogleCloudPlatform/generative-ai](https://github.com/GoogleCloudPlatform/generative-ai)\n"
      ]
    }
  ],
  "metadata": {
    "colab": {
      "name": "customer_support_gemini_genai_sdk.ipynb",
      "toc_visible": true
    },
    "kernelspec": {
      "display_name": "Python 3",
      "name": "python3"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}
