{
 "cells": [
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Train and Deploy a TensorFlow Model with the PAI Python SDK\n",
    "\n",
    "[TensorFlow](https://www.tensorflow.org/) is an open-source machine learning framework developed by Google for building and training a wide variety of neural networks and machine learning models. In this example, we use the PAI Python SDK to train and deploy a TensorFlow image classification model on PAI. The main steps are:\n",
    "\n",
    "\n",
    "1. Install and configure the SDK\n",
    "\n",
    "Install the PAI Python SDK and complete its configuration.\n",
    "\n",
    "2. Prepare the dataset\n",
    "\n",
    "We use the Fashion-MNIST dataset, uploading it to an OSS bucket for the training job to consume.\n",
    "\n",
    "3. Submit the training job\n",
    "\n",
    "Prepare a TensorFlow training script following PAI's training-job conventions, then submit it to run in the cloud using the Estimator API provided by the PAI Python SDK.\n",
    "\n",
    "4. Deploy the model\n",
    "\n",
    "Deploy the model produced by the training job to PAI-EAS as an online inference service.\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "\n",
    "## Billing\n",
    "\n",
    "This example uses the following cloud services, which will incur the corresponding charges:\n",
    "\n",
    "- PAI-DLC: runs the training job. For pricing details, see [PAI-DLC billing](https://help.aliyun.com/zh/pai/product-overview/billing-of-dlc)\n",
    "- PAI-EAS: hosts the inference service. For pricing details, see [PAI-EAS billing](https://help.aliyun.com/zh/pai/product-overview/billing-of-eas)\n",
    "- OSS: stores the model, TensorBoard logs, and other outputs of the training job. For pricing details, see [OSS billing overview](https://help.aliyun.com/zh/oss/product-overview/billing-overview)\n",
    "\n",
    "\n",
    "> By joining the cloud product free trial and using the **designated instance types** to submit training jobs or deploy inference services, you can try PAI for free. See [PAI free trial](https://help.aliyun.com/zh/pai/product-overview/free-quota-for-new-users) for details.\n",
    "\n"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {
    "pycharm": {
     "name": "#%% md\n"
    }
   },
   "source": [
    "## Step 1: Setup\n",
    "\n",
    "\n",
    "We first need to install the PAI Python SDK in order to run this example.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "tags": []
   },
   "outputs": [],
   "source": [
    "!python -m pip install --upgrade pai"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "\n",
    "The SDK needs an AccessKey for calling Alibaba Cloud services, as well as the workspace and OSS bucket to use. After installing the SDK, run the following command in a **terminal** and follow the prompts to configure the credentials, workspace, and other settings.\n",
    "\n",
    "\n",
    "```shell\n",
    "\n",
    "# Run this command in a terminal.\n",
    "\n",
    "python -m pai.toolkit.config\n",
    "\n",
    "```\n",
    "\n",
    "We can verify the current configuration with the following code."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "pycharm": {
     "name": "#%%\n"
    }
   },
   "outputs": [],
   "source": [
    "import pai\n",
    "from pai.session import get_default_session\n",
    "\n",
    "print(pai.__version__)\n",
    "\n",
    "sess = get_default_session()\n",
    "\n",
    "assert sess.workspace_name is not None\n",
    "print(sess.workspace_name)"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {
    "pycharm": {
     "name": "#%% md\n"
    }
   },
   "source": [
    "## Step 2: Prepare the Training Data\n",
    "\n",
    "[FashionMNIST](https://github.com/zalandoresearch/fashion-mnist) is a popular image classification dataset containing 70,000 28x28 grayscale images of 10 clothing categories, such as shirts, trousers, coats, and shoes. In this example we use the `FashionMNIST` dataset to train a clothing image classifier.\n",
    "\n",
    "We first download the data locally, then upload it to an OSS bucket for the training job to use.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "\n",
    "\n",
    "# Download the training set\n",
    "!mkdir -p fashion-mnist/train/\n",
    "!wget http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-images-idx3-ubyte.gz -O fashion-mnist/train/images.gz\n",
    "!wget http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-labels-idx1-ubyte.gz -O fashion-mnist/train/labels.gz\n",
    "\n",
    "\n",
    "# Download the test set\n",
    "!mkdir -p fashion-mnist/test/\n",
    "!wget http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-images-idx3-ubyte.gz -O fashion-mnist/test/images.gz\n",
    "!wget http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-labels-idx1-ubyte.gz -O fashion-mnist/test/labels.gz\n",
    "\n",
    "!ls -lh fashion-mnist"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from pai.common.oss_utils import upload\n",
    "from pai.session import get_default_session\n",
    "\n",
    "sess = get_default_session()\n",
    "\n",
    "train_data = upload(\n",
    "    \"fashion-mnist/train/\",\n",
    "    \"example/data/fashion_mnist/train/\",\n",
    "    bucket=sess.oss_bucket,\n",
    ")\n",
    "\n",
    "\n",
    "test_data = upload(\n",
    "    \"fashion-mnist/test/\",\n",
    "    \"example/data/fashion_mnist/test/\",\n",
    "    bucket=sess.oss_bucket,\n",
    ")\n",
    "\n",
    "print(\"train_data\", train_data)\n",
    "print(\"test_data\", test_data)"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {
    "tags": [
     "keep_output"
    ]
   },
   "source": [
    "Load and inspect the downloaded dataset in the local environment."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "!python -m pip install pillow\n",
    "\n",
    "import gzip\n",
    "import os\n",
    "import numpy as np\n",
    "from PIL import Image\n",
    "from IPython import display\n",
    "\n",
    "\n",
    "def load_dataset(data_path):\n",
    "    image_path = os.path.join(data_path, \"images.gz\")\n",
    "    label_path = os.path.join(data_path, \"labels.gz\")\n",
    "\n",
    "    with gzip.open(label_path, \"rb\") as f:\n",
    "        labels = np.frombuffer(f.read(), dtype=np.uint8, offset=8)\n",
    "\n",
    "    with gzip.open(image_path, \"rb\") as f:\n",
    "        images = np.frombuffer(f.read(), dtype=np.uint8, offset=16).reshape(\n",
    "            len(labels), 28, 28, 1\n",
    "        )\n",
    "\n",
    "    return images, labels\n",
    "\n",
    "\n",
    "test_images, test_labels = load_dataset(\"./fashion-mnist/test/\")\n",
    "train_images, train_labels = load_dataset(\"./fashion-mnist/train/\")\n",
    "\n",
    "for arr in test_images[:5]:\n",
    "    im = Image.fromarray(arr.reshape(28, 28), mode=\"L\").resize((100, 100))\n",
    "    display.display(im)"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Step 3: Submit the Training Job\n",
    "\n",
    "With the `Estimator` API provided by the SDK, you can submit a local training script to run on PAI.\n",
    "\n",
    "### 3.1. Prepare the Training Script\n",
    "\n",
    "Below we use TensorFlow's high-level Keras API to build and train a two-convolutional-layer network. For a detailed introduction to the TensorFlow and Keras APIs, see the official tutorial: [Basic classification: Classify images of clothing](https://www.tensorflow.org/tutorials/keras/classification)\n",
    "\n",
    "The training script is submitted to run on PAI and must follow these conventions for input/output data and hyperparameters:\n",
    "\n",
    "- The script reads input data as local files from paths mounted into the execution environment.\n",
    "\n",
    "Input data is passed via the `.fit` API, and the corresponding storage paths are mounted into the training container. The script can read the environment variable `PAI_INPUT_{CHANNEL_NAME}` to find each input channel's mount path, then load the data as local files.\n",
    "\n",
    "\n",
    "- The script must save the output model to a designated path.\n",
    "\n",
    "Before the training job finishes, the training code must write the model to the path given by the `PAI_OUTPUT_MODEL` environment variable (default `/ml/output/model/`).\n",
    "\n",
    "- The script consumes the input hyperparameters.\n",
    "\n",
    "The training service presets several environment variables that expose the hyperparameters, input data, and so on. Among them, `PAI_USER_ARGS` contains the user-specified hyperparameters concatenated as command-line arguments, which the training script can parse with Python's argparse library.\n"
   ]
  },
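  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of these conventions, assuming the default mount paths:\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "\n",
    "# Mount path of the input channel named \"train\" (PAI_INPUT_{CHANNEL_NAME}).\n",
    "train_dir = os.environ.get(\"PAI_INPUT_TRAIN\", \"/ml/input/data/train/\")\n",
    "\n",
    "# Directory the model must be written to so that PAI can collect it.\n",
    "model_dir = os.environ.get(\"PAI_OUTPUT_MODEL\", \"/ml/output/model/\")\n",
    "\n",
    "# User hyperparameters concatenated as command-line arguments,\n",
    "# e.g. \"--batch_size 32 --epochs 20\".\n",
    "user_args = os.environ.get(\"PAI_USER_ARGS\", \"\")\n",
    "print(train_dir, model_dir, user_args)"
   ]
  },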
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "!mkdir -p tf_train_src"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "%%writefile tf_train_src/train.py\n",
    "\n",
    "import tensorflow as tf\n",
    "import argparse\n",
    "import gzip\n",
    "import os\n",
    "import numpy as np\n",
    "\n",
    "def load_dataset(data_path):\n",
    "    image_path = os.path.join(data_path, \"images.gz\")\n",
    "    label_path = os.path.join(data_path, \"labels.gz\")\n",
    "    with gzip.open(label_path, \"rb\") as f:\n",
    "        labels = np.frombuffer(\n",
    "            f.read(), dtype=np.uint8, offset=8\n",
    "        )\n",
    "    with gzip.open(image_path, \"rb\") as f:\n",
    "        images = np.frombuffer(\n",
    "            f.read(), dtype=np.uint8, offset=16\n",
    "        ).reshape(len(labels), 28, 28, 1)\n",
    "    return images, labels\n",
    "\n",
    "\n",
    "def train(batch_size, epochs, train_data, test_data):\n",
    "\n",
    "    # Load dataset from input channel 'train' and 'test'.\n",
    "    train_images, train_labels = load_dataset(train_data)\n",
    "    test_images, test_labels = load_dataset(test_data)\n",
    "\n",
    "    # Build and train the model\n",
    "    num_classes = 10\n",
    "    model = tf.keras.Sequential([\n",
    "        tf.keras.layers.Conv2D(8, (3, 3), activation=\"relu\", input_shape=(28, 28, 1)),\n",
    "        tf.keras.layers.MaxPooling2D((2, 2)),\n",
    "        tf.keras.layers.BatchNormalization(),\n",
    "        tf.keras.layers.Conv2D(16, (3, 3), activation=\"relu\"),\n",
    "        tf.keras.layers.MaxPooling2D((2, 2)),\n",
    "        tf.keras.layers.Dropout(0.3),\n",
    "        tf.keras.layers.Flatten(),\n",
    "        tf.keras.layers.Dense(64, activation='relu'),\n",
    "        tf.keras.layers.Dense(num_classes),\n",
    "        \n",
    "    ])\n",
    "    model.compile(optimizer='adam',\n",
    "                loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),\n",
    "                metrics=['accuracy'])\n",
    "\n",
    "    model.fit(train_images, train_labels, batch_size=batch_size, epochs=epochs, validation_data=(test_images, test_labels), verbose=2)\n",
    "\n",
    "    # save model\n",
    "\n",
    "    model_path = os.environ.get(\"PAI_OUTPUT_MODEL\")\n",
    "    model.save(model_path)\n",
    "\n",
    "    return model\n",
    "\n",
    "\n",
    "def main():\n",
    "    parser = argparse.ArgumentParser(description=\"TensorFlow FashionMNIST Example\")\n",
    "    parser.add_argument(\n",
    "        \"--batch_size\",\n",
    "        type=int,\n",
    "        default=64,\n",
    "        metavar=\"N\",\n",
    "        help=\"input batch size for training (default: 64)\",\n",
    "    )\n",
    "    parser.add_argument(\n",
    "        \"--epochs\",\n",
    "        type=int,\n",
    "        default=14,\n",
    "        metavar=\"N\",\n",
    "        help=\"number of epochs to train (default: 14)\",\n",
    "    )\n",
    "    parser.add_argument(\n",
    "        \"--train_data\",\n",
    "        default=os.environ.get(\"PAI_INPUT_TRAIN\"),\n",
    "        help=\"Path to train data (default: /ml/input/data/train/)\",\n",
    "    )\n",
    "    parser.add_argument(\n",
    "        \"--test_data\",\n",
    "        default=os.environ.get(\"PAI_INPUT_TEST\"),\n",
    "        help=\"Path to test data (default: /ml/input/data/test/)\",\n",
    "    )\n",
    "\n",
    "    args = parser.parse_args()\n",
    "\n",
    "    train(args.batch_size, args.epochs, args.train_data, args.test_data)\n",
    "\n",
    "\n",
    "if __name__ == \"__main__\":\n",
    "    main()"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "You can test the training script locally first, for example with a command like the following:\n",
    "\n",
    "```shell\n",
    "\n",
    "python tf_train_src/train.py --batch_size 32 --epochs 10 --train_data ./fashion-mnist/train --test_data ./fashion-mnist/test\n",
    "\n",
    "```\n",
    "\n"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 3.2. Submit the Training Job with Estimator\n",
    "\n",
    "\n",
    "`Estimator` lets you submit a local training script to PAI and run the job on cloud resources. Its main parameters include:\n",
    "\n",
    "- The training code, specified via `source_dir`, and its startup command, specified via `command`:\n",
    "\n",
    "`source_dir` is the local directory containing the training script; it is packaged, uploaded to your OSS bucket, and placed under `/ml/usercode` in the training container. `command` is the command used to launch the training job, for example invoking a Python or shell script.\n",
    "\n",
    "- The training image, specified via `image_uri`:\n",
    "\n",
    "In this example, we submit the job with the TensorFlow `2.3` CPU image provided by PAI.\n",
    "\n",
    "\n",
    "- The job's hyperparameters, passed via `hyperparameters`:\n",
    "\n",
    "Hyperparameters are passed to the training script as command-line arguments. For the example below, the script's launch command is:\n",
    "\n",
    "```shell\n",
    "\n",
    "python train.py --batch_size 32 --epochs 20\n",
    "\n",
    "```\n",
    "\n",
    "- The metrics to capture, specified via `metric_definitions`:\n",
    "\n",
    "PAI's training service captures the specified metrics from the job's output logs using regular expressions. You can view the output logs on the job detail page.\n",
    "\n",
    "- The machine instance type for the job, specified via `instance_type`.\n",
    "\n",
    "For a more detailed introduction to submitting training jobs, see the docs: [Submitting Training Jobs](https://pai-sdk.oss-cn-shanghai.aliyuncs.com/pai/doc/latest/user-guide/estimator.html)\n",
    "\n",
    "After the job is submitted via the `.fit` API, the SDK prints a link to the job's console detail page, where you can view the job's logs, captured metrics, resource utilization, and more."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from pai.estimator import Estimator\n",
    "from pai.image import retrieve\n",
    "\n",
    "\n",
    "# Retrieve the TensorFlow 2.3 CPU image provided by PAI\n",
    "image_uri = retrieve(\"TensorFlow\", framework_version=\"2.3\").image_uri\n",
    "print(image_uri)\n",
    "\n",
    "# Configure the training job\n",
    "est = Estimator(\n",
    "    command=\"python train.py $PAI_USER_ARGS\",\n",
    "    source_dir=\"./tf_train_src/\",\n",
    "    image_uri=image_uri,\n",
    "    instance_type=\"ecs.g6.xlarge\",\n",
    "    instance_count=1,\n",
    "    hyperparameters={\n",
    "        \"batch_size\": 32,\n",
    "        \"epochs\": 20,\n",
    "    },\n",
    "    metric_definitions=[\n",
    "        {\n",
    "            \"Name\": \"loss\",\n",
    "            \"Regex\": r\".*loss: ([-+]?[0-9]*\\.?[0-9]+(?:[eE][-+]?[0-9]+)?).*\",\n",
    "        },\n",
    "        {\n",
    "            \"Name\": \"accuracy\",\n",
    "            \"Regex\": r\".*accuracy: ([-+]?[0-9]*\\.?[0-9]+(?:[eE][-+]?[0-9]+)?).*\",\n",
    "        },\n",
    "        {\n",
    "            \"Name\": \"val_loss\",\n",
    "            \"Regex\": r\".*val_loss: ([-+]?[0-9]*\\.?[0-9]+(?:[eE][-+]?[0-9]+)?).*\",\n",
    "        },\n",
    "        {\n",
    "            \"Name\": \"val_accuracy\",\n",
    "            \"Regex\": r\".*val_accuracy: ([-+]?[0-9]*\\.?[0-9]+(?:[eE][-+]?[0-9]+)?).*\",\n",
    "        },\n",
    "    ],\n",
    "    base_job_name=\"tf_tutorial_\",\n",
    ")\n",
    "\n",
    "# Submit the training job\n",
    "est.fit(\n",
    "    {\n",
    "        \"train\": train_data,\n",
    "        \"test\": test_data,\n",
    "    }\n",
    ")"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "After training completes, `est.model_data()` returns the OSS path where the model that the job wrote to `/ml/output/model` has been saved."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "print(est.model_data())"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {
    "pycharm": {
     "name": "#%% md\n"
    }
   },
   "source": [
    "## Step 4: Deploy the Inference Service\n",
    "\n",
    "Next, we deploy the model produced by the training job to PAI as an online inference service. The main steps are:\n",
    "\n",
    "- Describe how the model should be served with an `InferenceSpec`.\n",
    "\n",
    "You can deploy a model either with a Processor or with a custom image. Here we use the prebuilt TensorFlow Processor provided by PAI to create the online service.\n",
    "\n",
    "- Create the inference service with the `Model.deploy` method, configuring the service's resources, name, and other settings.\n",
    "\n",
    "- Use the `Predictor` returned by `deploy` to send prediction requests to the service.\n",
    "\n",
    "For a detailed introduction to deploying inference services, see the docs: [Deploying Inference Services](https://pai-sdk.oss-cn-shanghai.aliyuncs.com/pai/doc/latest/user-guide/model.html)"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A [Processor](https://help.aliyun.com/document_detail/111029.html) is PAI's abstraction of an inference service bundle: it loads the model and starts the inference service, which exposes an API for clients to call.\n",
    "\n",
    "For TensorFlow, PAI provides a prebuilt [TensorFlow Processor](https://help.aliyun.com/document_detail/468737.html) that makes it easy to deploy a model in [SavedModel](https://www.tensorflow.org/guide/saved_model) format to PAI as an inference service.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "pycharm": {
     "name": "#%%\n"
    },
    "tags": []
   },
   "outputs": [],
   "source": [
    "from pai.model import Model, InferenceSpec\n",
    "from pai.common.utils import random_str\n",
    "\n",
    "\n",
    "m = Model(\n",
    "    model_data=est.model_data(),\n",
    "    # This uses the TensorFlow Processor for version 2.3.\n",
    "    # In general, we recommend creating the service with the latest TensorFlow Processor.\n",
    "    inference_spec=InferenceSpec(processor=\"tensorflow_cpu_2.3\"),\n",
    ")\n",
    "\n",
    "p = m.deploy(\n",
    "    service_name=\"tutorial_tf_{}\".format(random_str(6)),\n",
    "    instance_type=\"ecs.c6.xlarge\",\n",
    ")\n",
    "\n",
    "print(p.service_name)\n",
    "print(p.service_status)"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "\n",
    "The `Predictor` object returned by `Model.deploy` points to the newly created inference service; through it you can check the service status and send requests to the service.\n",
    "\n",
    "An online service backed by the TensorFlow Processor exposes an API that returns the served model's signature, including its input and output formats. See the [TensorFlow Processor documentation](https://help.aliyun.com/document_detail/468737.html#section-w41-c2x-vsb) for details.\n",
    "\n",
    "> Currently, only services launched with the TensorFlow Processor support retrieving the model signature.\n",
    "\n",
    "Use `predictor.inspect_model_signature_def`, provided by the SDK, to fetch the model signature."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "tags": [
     "keep_output"
    ]
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{\n",
      "    \"signature_name\": \"serving_default\",\n",
      "    \"inputs\": [\n",
      "        {\n",
      "            \"name\": \"conv2d_input\",\n",
      "            \"shape\": [\n",
      "                -1,\n",
      "                28,\n",
      "                28,\n",
      "                1\n",
      "            ],\n",
      "            \"type\": \"DT_FLOAT\"\n",
      "        }\n",
      "    ],\n",
      "    \"outputs\": [\n",
      "        {\n",
      "            \"name\": \"dense_1\",\n",
      "            \"shape\": [\n",
      "                -1,\n",
      "                10\n",
      "            ],\n",
      "            \"type\": \"DT_FLOAT\"\n",
      "        }\n",
      "    ]\n",
      "}\n"
     ]
    }
   ],
   "source": [
    "import json\n",
    "\n",
    "\n",
    "model_signature_def = p.inspect_model_signature_def()\n",
    "\n",
    "print(json.dumps(model_signature_def, indent=4))"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {
    "pycharm": {
     "name": "#%% md\n"
    }
   },
   "source": [
    "With the `Predictor.predict` method, you can send prediction requests to the inference service and get the model's predictions back.\n",
    "\n",
    "From `inspect_model_signature_def` we obtained the model's input signature, which we use to build the request data. The model above has a single input named `conv2d_input` with shape (-1, 28, 28, 1), i.e. (BatchSize, Height, Width, Channels); we need to reshape our data to match before sending it to the service."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "tags": [
     "keep_output"
    ]
   },
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAAAAABVicqIAAAJOUlEQVR4nO1Y25IcN3I9JzOBuvR9ZjgUrV1H7P9/lcPeiF2JokTOrbsKmemH6h6OKFtecuUXe85LV0QDOMgLTiYAvOIVr3jFK17xilf8vwf/F1dmIv84EgLn5QCQzEyIKt0dAGB/CAeRfOYRZiSkdphPkX8UCchfWSKZgA0rPsWMf46EFFNGa5G/9nlmJmDrg3yaHr+VZNk2Ves46un+LhDy2RBkZAZg+3daHj5+k7sIEAlQrK4O+3ovpwbkS5Llsx6+M/+B32rJAqnj5vCmG/2UU8aXWyFW252O59W/lSR12F5dXw8j2H+8+4LEuq5ejZqR30SynIgEYKvD9c31sO6G9V+P7deD6na/vhnadPT8FhLgkqzd+nC93wyrvu/mT08v/5duc3273wxxnNo3WXJZh7babDdjNQjz8ZGRCRCZmVb77e3tvi+zXIZ/vbsSEKvDbrdbWYOw2+fqz+GeFPo8hXXj6nBYGY5T0W8hIQQOSF1tD/tNz9Okxbbr7z3aFGIyPz0268Z+NRTEw7HTfyKFtd9eH7Zj9RZFdRiGGtPJtcj88DBr7Us1Rsu+yDdYwnNq2frm7fWmSiyHclh3MRXXIl77RjFRTYqq8KtJiEuFsP27f9l2kilUMxVkRASoKBLe3JucT/7XpzAvmtHvbr/rYo6kmKnEHDGdgk1Vak7zFBAo87lYfX1MVPv9bjPYySOpaibwiBZBUkThcIcITURE4n8m+VzyLsII9Lur24M1P55AFTNTtMwWQGZGpEeEkCJml6D81yTnxbmMuXwtAVm9+/5qmw8xzyyqairhnpEEMjxjbpGgqoWZ6fxV7uJFs7Zv/7Rle/AWJlRTyZjnJIVARkTzSIiohZVSjv8wCZeYJ1i68fp6P0ynU3OgE1NETqcZYgTJ5xpJCkvt+qfI/JLk7KezROdlAhYOcHXYf3cY7TQ9tRSFmaLFdJxpFFBEkEIAhAis9sPT1H5D8gXymSSQAMr27e3bbUmfTg0lqEWj+XRqAkuAIkiSCZAC61frY0T84zFJUGR79ebNrqN7RDChZozW5hbn7gF46THt1rvj3P7bmFxGJnBpP9Ctx+t3bw4DTs1pxJJarc1OEWaGL5rwPF2G3dXpePwNST5zkEDmMmeprsPtzc3NfsU2tWa9p5RimtFaiohyIZGlsAAZIsPh6XT36XcsWcz+3ITANm++v9r1ZZqePFU9WUzFw4NqlIWEWAp7ZgbY7+fHH/X3UjjPGQUAYsXGm8NuXeiteVAokM4EGR4qSgGQwVjkLTO8SXbb+DR+JnkZg/Ph5udmimW122yurjpMp2wtQSpFO0NEZEKUBElkAJGBcBKCstOfB76w5HOreRbbF57Sfvfdm92qNz+2ACkUUavV4B65nI9lXgCRiXBmqFpfdsPvuutMTEjpNvvr253RfZ5CiipFrdYqyISKmS17SmRmJhLhGUmpOnSFsM9n/LL1y6+ImKpp6frVYTcUuntA1IqSamYqAAtROkN45nJ/SACRIUqJSJZhCCOQzJfN7LOTSulqP3R93/fjIC2bJ02smpJy1nGVSisaLds5S3LxmkJiPrmX1Wa6WPJbV9Xa9WO/Wo/D2BVNzO4JNbNiSogKgUwxEVFeRC+ABIlESGabW9i4PtpzTlHIc3FaAtH3/TiMq7Hvi6K1yMTZS0JSmOkAqUJEnI8gAgEhk1SSECt9n5aLF6lWigohFJLUfhj62teuVjNGRoIKsaKCDKUgMsQ1RdPDI3ORCM+knTVfrWbXdZ3bORpaaz9UFYqYCNWG1dAVUyUzW2QCCkoxAQJCZDpBg2q0uaWoCJEt4lx4s4XWDn1XrNgSj9L1/TB2RUTVVKhlWA118YS3iCRFKKK6JAnlrJrizvk4swhFEMigUKxosKlV1mKqBsBK6fq+7/paVU1VBWL90JmQoJCiSaGIgDzH
TxFEJNIbptOkksvVOgIAhIgERUWFpAHo1uMw9tXUSi2mFCbVGB4gSZOSIKmU9MhMUrQAke6Z0eJ0nEsBiczICG/m1ZCpFBEi0wAbD/vV0CmSpasmBCNJzwZQ1cQoBKhCR2SCFDUysjGZ4e4REBEQJDMiIv2cQMwIt8Lx8PZq6ITespSiApxbKACMhIo+n57MyKVjEEoCSRWWqqWrlUiCqg6KiIioMMNba3Yj23d/vrGY2wxPn5RMRJzbB2ozFRUgaSo+NwcR4Q4C1NRq6IewvjdBenVvLWjFSJqmT9PpdLRbO/zpL1f5cP+UHjG3RbEzISRIWYKHBNU0vSU1MyIgBFVK32l6iqmQmZnRmlOLMpOabTqdjkdbd/vrN9fNIgHxeHYLEklkhosIEaC5MSmiaqYiZ73vO0NiCQBApM9ONUmPNIa3eZ5togdURu+mubkHwIylG8lEIC+3f4CiqiqmpdYiQqtdMRWGQ6woCZAZzalKb3Mom6kI7ade93ebbiiZGRGemRFza35BQIQgxErt+q6aaq1dEUq3Wnc5T/PxlGUcq3CRlggKs52mJsy+60q19z021/2u76uJZDoi3ed5nts0t3memydVAIjVflitexMrtRoh/W5f4v6T++TU1UrON5IEiJhMG9m6Wkqx+1Y//L2c1mNfiwqVicjqfiGZIylnkm5Yr3olVeDRchbrTh9/ubt78E3dS2FeuidEZmZIRmRmWkwP78vTZuy7UkqttaiKmdXewyPCPQFdstms9p2mt6nN09PjUVbrOt3fPz4e/baNHTURLQIZ6T5NUxMej4+Pj0+GeHp/+qEWM7M6btZj33Wr9aqvJEnBuVdnQoQU+DQ9Pd59+vThw6dZq/p8mueW/5qbzg3ZTs0j3Ju35qF8+Hh3d3c0xGm+IwmhdpvDYb0aV9urFr0WVTPRRQKQACPaPD0+3H388P79D3//6dHJXJqzT9ub+lCY02lq7q01j0AYn3765e5hMiDPz5EAHo9P63EYt/cf170VM1s0+XKPdW+n09Pj/cdf3n/48cefXxTrv/3bTdsuJO7urUWAaTz+x0+Pc/y6JXoSv+9qHVarroqJqojI5VkuEeFzm07Hp4e7+7uHlxMf/l3/OijQWouI8FiKDeef//YYXzzZ0kxUxNR00VDy5StmIjIivLU2z81fTNT1ujMub4OZl/6WiOnpOOMVr3jFK17xilf838R/ApS1P0sVdtkIAAAAAElFTkSuQmCC",
      "text/plain": [
       "<PIL.Image.Image image mode=L size=100x100>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Ankle boot\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAAAAABVicqIAAAPD0lEQVR4nKWa2XPkyHGHf5lZBaDRzeY5nPvQzqHDDwrZitCz/2+/2W8KOyTHSrva3dlDs8O5ODz6ABqoqkw/AE02j+aMZDyQjGahPmRm5YkmfNaV39q2w7cXPiq2Mw2LefiMu91nMfzmeKxhPl39aGvTa8j9dPH/hPjMOy8ikm3ubNr2xokSUfcvKrdGTkNbV1VIKcUQY0z/FGSws7W5McqdZKPNIc2O5xBZQvJBThpi2zZNU82np6ez2T8FcduP7t+9szvy4opRzk3dsvfcQ4jJUopqqZ4df/xw8OY9WvuHIERE5PYePX3y8P7+2LFkuecUVTJ/vihqSga2evrx/ZuNcpD746R2HehaiAw3hmU53Lv/8O7tva2hgMQJeaNLt5oaCfLM5+XG3r337w9PPn6MnwtxOw/u3N7f3dneHA/LzEFNI4iJRc4XhaQGsJC6kRvtP54cH779+1+nnw0p958+ffLozmYuQlDVpGoGEWFmIoKZaoqqIGJmYp+NSZvq5M1X1evqcyDE4vYePH3x+P6tAoCFoKqqagBg3Rk2U0tJrQMKxAljtLVZhoPXbdteOQBXIflw4/HzX31xayyAphSDqpqBAFMjoIeYggAYFKbKwuBy9+FvqvH7t4eXo8AVCJe7d5/98sX9ASqYmSY1MwIxyFK/LxRG4I5sloxgREVe7v9K9r5JJ5+EyHD/ydMn9/dQNdEAWyqRYf3x7D+lnmgwNUtg8n7zUTYMB77+JGTz4fOnd4YATA1EIkxMpmoGmMGIwJ0IHdJgABHIksgY4dU4u7znFYjb+eJfnuyQxmAwsM9zJwRtFrHfkohFiFJMSkSmCmImBqFN5Da2t4c52acg24+e3ctTnYKBwS4rAICFeXknO3FMQaN1VBAzCWANnMtHG2XOl6LYVcMPdva3UIWUwERC2lhUMzKBJjBIHAsTACIAJCBiAgyajNmXZVk4tZshcFkBEAxg50jn7fxokka39wpplJ0T5ximSqJkBpdnnGJMKZmBSeCyvCg06c2Qs4s4z2x+evjuh9fh/u+2MgtGvsgYQBs0wWlKcEMBmpRCNHIsBJJ8MAj2ORACAPGZnR79+ONff2h/dS95CCCOkWJTLxIXmQCgJIiLRZsUQswEkmI4bNKFEHYVQszLvxyhOvjmu5ev4m5lgBBB6+nhSV03tnXnVplHNAdVMLD4TFgIAGXleFw3n7AJkQCAAcTQk5d/+u5ook00wAusnbz88ruqjfzsdxtbOfDqL19PBnu3tnc3ck5qKpRtbG/N5p+AdKfGAIBh04NvXyYlNQBOtGnf/OU//3tu5k93Hu4DmH75H+93Xrx4spGNaNF0kJ3tk+NPSrJM4wSYxhDUEBSA+Gp29NPLH35qALn35t14H/Tux6/f3OJieJczxAYG+OHmZuluhlAfzEFERjR68II+HqcQFcBw9vNX3/9o9wwJ2cevTncdf3sYUB++3n6SACaYwZXj8eBTEHRBgZhIhXd+HXf+Us9TMgAuvPyvH9LW/cLaxh//+ZuNwr2ZjJRmb/ZO2z4JmBQbG5+CdN5sBiY20NYzX5z8NO+da/L9//x9796LDavmJ4evQj7ImtlQ6vrwcBoAIpiZDDZGhdwIYe96u3e2KbYWhxsCEKGtmlc/v61Pq7a1tpkfv69d7gDVpoqTOvXOBVcMh/mNEHK5708wMRG6MMyACGZ/f/O/Bw3qt4NBqhenx6cNCZNAJ00zbTpRzUiKsrwZgqyLGwYiZkJfKoCdYPLdl1+/JehhcrENMSrFRsmzNQhBrbMJ4PJykPMNEGKfe15VF4lzjkHMNn/91csjhbNgKVE2QJjNgs8G4maDYdYRYOCs6B50HYSdz9yZJAQY+yL3TJKXFE8O3pw0GD9+OOJk4Dj5+efp5uOH5eQYz/dcQgLMjH2e+xslcd477gzIRDAjnxee
wXkJmx0d18Deb39/OwdSrF//OWX3//D73dN3s1v3s4BEBAW5vHvQ9ZJkywVETIbuuYQshXw+m9eA3Hr+r49HhJTm37cfcO/Xf7g7Ozgq9rJeEtBSG2shF0U1Uyaf507rN38b/PBhIXlWjqmaGVM+Gta3d6Y5QkCxUeSd/KZw/Al1scu7I0xERGaqDm6QO7z/Y8y+/7Ed7+8W5Q+n2wXxg2e3aTDKq29pRyDbBQk0mYKIs+JmSGeBpSApGSB54XH4x++lOpXd54/9yTfvWOB+9++/hR8Npn/6ejje2380MgcNXTHzacgFdakZOCs3kF6/5rzcfvj8ufzt628WgC9/+cWw3N2Z/3wsd57xTiQBw6BM5G62CS0N38VIAsyIR/sP2ib4zVt3f/HiC2kPjipQfmdzUO49aRc/f4B/KN5xvx60jH9rJZHuCJsZiECsKkrjh6d1RLFz996DX9xlSjsLcvnzZ7uFuLz+Foj5eDxgwKiv6ugiY40kZgYQmGGqGD+2Gn60e+f23v4WZ+WTwFmxe2fbDTaKj3fyZri1tzUQYBkiz/Pe9ZK4LDu3CTHDNGL8IKspH23t747HJSBbibN8UGYkuT588nTy+N7eeMDnjCvXZcN3kE4SJqjGyONHuy18UW6MCk7EpRg5L2jZ0fDBv2XV/d882C09ALpcBK+B5LkXgpkZCCRkIdB4ZAoWFramJaMcxKTR2Ody5w9P42h3e1AIDNRVnp+CuCz3fNaVQFOMkfM8A2ApxJSU2DGDLIUkSm576yyWqhGrdb0Lsd4Eyc4hGtqgCnadmazTI/VH20xToKyzQ0qmAAMJBpBkeaPrIOSyzPU5npCqaStZJtYwDKopKQhIRkamxNBWNWdYG2IyuMwJkhlAviijnWvuaj6R5enSdnq4GGwPRJvUJT4CESyCDCBisxDbhhFjDFFl4IQSKQDyg3Ke1kFIfFdIMANpcfKh2ixJLDVqgCybeIMRmAiqysaWYmjqIKkoWbqk6vLCxXN9XfV4YYBFKen88PW83WEBUydFV/iRrTgEAcTMcTqDjsa9r7NkuV/p5i/neOeEAHZGqZ18eDXDQxJy0kE6SYj73QESZiFj09l75R1Dd2hIfJY1aR1EnBMG2KmlZvbx3XS0IGEn1MkCWvE4AzFEhEiSzT7EotZl1GJxbiVIXgMhACIxtfVsMp2H9dECXZQiJk7VcbPdWF8VGZh5JdpfLSS6VkZiapvFogkJfUd/ef/O6QACMWt13ExaA8gMlm5sTFnOINA2xG48BFVjQ1+JrwYOIoKBmHUxreftUpKU9MKyiymMxHvpvAApRO1njmZ27URueRczYl0t0vIZrG9A1qrLCZQJsJRUcuSOPsUAs5eLOmLn1xuexXuBAoAlpWyUlRnBVAEsJ0RnD4ReRmLy3vH5psTOZ15uOl1sZgTTGM0Pi1FG0H5uc91lZmBybgnp3NVlN0C6o2cAkNpFoAIDf32KuEABsWnXjC0VcpMkfcUBWKznC+Qy8NTF92u9hQBTNoJpaKPRMr2L79ucayU5V0qoZpVm+cDTeSNhq6juDEJNDbDYdi7V9xtZ5tcaHme7WKhntfpB4W5SF8HsDNJGXdpElk7dq++yJNRtaqGe1ckVueumTteave/GDdCwqOtFo0bM3URWVnbmy3d1zTgQq1mVXJH35TdWg/05oWdA23o+r+qkYAZAIqtV5FVJYDDA4qJaaC/J+dZ0afFSEovtoq7rqJAlZGXpJQiA3uNSaCPlg9xd2fva5QRouwiKbqjHwmvVZcuROWCqyMpR6WCKKyZZima94dnlGWvbpm5iQHyh5r4kiSU9fz0h+XA08NBuCnwx+i5r+K4OBOflMLPQRmPqplhC646w6QqDOCuHuZjqNerqD7ZZNxeWctwM0C4K17UbIrI2aWEl3hrI5bknpWsy1tn6/ocf7aQhNYs27+4UL2udsQvq/VTNSHzmoFBbDQXWG5psmToA+M3bsilN3Qz76aW/yeNtuSER
s/OOTaH9XP4aQawfXWWb+zLgRdUoACN23q2Pwp0ghD6SOiEk3KAu9I/khttKWs37OohklXENhFgIICnKrHBMqrCr3rRS3RERIFlZtc0k29E+q/D69Gtm6KZD8OWGL1zXAC8nYGf7nyWEpSLZF1lTh2yRzpR4oyToJHHFyBWOYf1bpbXa6qi+KOZtM2iWktxQSCw7UgCSD1zedcKXkkj/+7y2sAQUG4t63iyLbCK5SV3a13HkitIVmRD6gImly3fp9hxCiC1hsIO2srOXtszrDW+my1E+i3deZLVGWfllgKKr6Q0pkBWbNjte9FMcEAuvDSvde7e+kGhSNLoaG1d5nSSmBMlysRCiLSXhG9WlKTnAqg+vs8G+CbOdhdyz3fsP+jmHCKXU1tPjozSPWE6ubzC8ppQSAD199W1RPlQvfObvPcTOjjAxzIydp6adHr0/eNdMAs5L8XWQLhAbQadvvi/vVAksvIyCFxVHy53IZWjb6dGHt+9sFjpJus5vDYSYyNQIGuaTVLXGzizpldbhTF1dmQrRtp5NJ8OFAjCIz7K1sYvFZ0Jmy8EwC4kATTCs2OWcQ1BVlwGDjC00TZs6lbrBqMzW18LeC8EAY59nov15ToaLI6yzsKJJuxZSY4qxc0YlPxgNVoZzVwyvZsuOTOuTQyudVpNaaaVoX+GYJvNhSybHJ5N5m5Yp78bMmJr5fJQTASnFpn715XRz4HQxay5DlkNzswQ3Gsn83cHB21kiqDIYcVEtwroWO8xPjkdDYiDFUE3a4y8Hni200a4pWLqAo+A8p3Z2Ovk4IaEUMwi1k6OTel2LHaYft7e2iYAUQ/3hw0+euetL119GzLAYU7QNx6qAoJkcndbrJhJhfnJaJSbATMN87RdRrr+yrnwEszZVvaKuizkvzk9PKvMAnODiW8/PuEiccx5wGduFL5dcglQnk1odAMdM15WwNzFY2DnpnhC28gWUS5B6Nq3iMm/cWAFfIwd36QiwFEO0dadL22o2n88z0nkd+h7wYoF67fa9Z2pbTU9PHGEyOZ3M23WG19DUs9PjIYfjaW+5NV3vJQhgZrE6/fh2u5R0+v7Dx8liHcRSaKrJqbnFyWwRzyFXQvAViAGWqsnRh51NF48Oj05mzVqIpdAu6kKbugm6VNcntHV+sy6q2cT5MJ/Pu++KXA/RLmtRTEn/0RMMIMU2BAohhBhXPPj/ALU7R99e9smFAAAAAElFTkSuQmCC",
      "text/plain": [
       "<PIL.Image.Image image mode=L size=100x100>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Pullover\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAAAAABVicqIAAAJSklEQVR4nK1ayZIctxF9mQBq6W2Gm0SHJdkK++KwT/7/k+86+ORwhBQObSRnOJzuZld1LQAyfUB1D0krqtpTxNxmweuXiXz5kBjCyCJAAbu6fvHHv3y7fH+zpeWK6vd1NLlzmaMYef188e67f/xrbBfAjv8YAMjm5SLPXF4sqMwpehGTOZdZipGKzGVZRjoXxCyefvni2VVpJaeyoEWxFJM5mxmKEaurImw26zbKCM44CEEB9+TrP//hmy+K/kmLzKFtGhjnjGPEgMXCZS9fvt4d+zCLiXv2p79988VVLiGCWX3Xw1rLFghRCkfLN1/dQOI8EHv9+2+/ui5Ovxl9IGMMERCi5Az3/MWL+tiM7TANYsrrZ1clQ6EgAAZEUCVFSoQr15tlzo8FSbmkbLnOuZcYIpFhqEJVoYqorEuoLZZl9miQhMKuKC167ztPxjERNIqIKgRGMyecl2VGjwYZfifLGRJDCKRkiKAxDiAaFWTzsnSPZ5IWWQtAySgxE1RVASKAhJCKNbczmSgRAyC2BsxQUQUxEWmMzADZvMjtTCbMzABABiCCQBVExKRQggLs8myUydgHSIusYQBDjAj4aDuNUchOgEwz4cwyoHISJyJWRfqCSoxBbTYXxORuYKKJBhEGBAU0mACTZWYOCFFRZgyIyG/+XCVG2MzNSrxxy2VuU0qYAYWoQEEDJ5UoZPN8DhOy5XpdGkBUiYgUqqJKSG2TQBqVbOZm5cSU61VhARFh4kFoaDhhRABUyVjLc0BssV4VrBCRYWtmHWol4dDY9heCLFarnIVCiKQggJhJQhgONIFYSEeb73ROXLleFSwcRSSBWMOiEUO1kzGqMYQ4hjLNZLMuTNJEYNAsJeZTabIRxL6fA0JusdksjCoxkwoZQxBRGBZRTSBRfduF3y6jy5i45dXVwkDJglSMceR7D2tZfIxQENsgXdP6mUzWpVGQSRkx2u9r3mxygQoAYmbp29bPYWIX63VhkocgYkP9/Y932dfFkqMHACImDV0f5jCx5WqVs4ooyFhjtP3lu/+s/v7smVoiIiIGxM9LPGy+XOQsKqLGWkNS/fjdP5+t/0rkEgiBNPrxIzzVtDgvC0cDE8PSHd788O/vb1qiQUgIUJFZxQi2zhmoiBJbi77a7fdo/OAkLlxTTIiNIYiqKlur7W5bCbKhe6iqKoiYR/VxGsQYZqiKgq2R+t1dJSgyBkBESQmYkzV+NAgb5xgqCjBTrO5udx0MA8Cp+kFseHSfSRCXZ4aQwgK/v3l91wzpcM5AREFkZ/YTdpljUkmZ7nZvXt0dwaQAnHpIFIBnM7GZZUpaCPj3tze7HhAvgLGc3BimGtd04i2S+SEG/OHubQ1oXx+6oKe9NcbxrjUBQsQmeUZiBkJ1/64DtN3fH30UEDFBY+j9LFlJZluHXh6O+10ApN3fO0MESreVvhsHmfTCp3AzMxD7phdA+vowdBAiQIL3cWyPSZtKJ4kiApDuuNo3xy4qMRERVMKEQE6C6OkDE06XSGjo2i4miSRA4tm9PBJkWOkkDWoroWv7gJMQawxxrDH+PyAS49AANfRdLw86fDYzc0AUBEZE06RWrrHvgxCBzpuPVuNlTJSIKUp97B/CNTC5pOIvZAJiDVIfewEA6eqq7kfT8NGavjMCEAUzxaaqOwGAWG/vdsc4HqMP1mTFA4CKYaPheBhAwuHt6nd1OFv6zxMuBTOF4ylGsb5fbY8RPKBMHK7LwgWAGKE9DjmJ3WF76M6XE506wtPaxcOVV0N3HNyo+Pp97QFmIkrFOIpyUbiSiQtd0ybzrkGrJhCYOd1N47yKV40JJQlWkl5VaTpJSk/pmj1Lu8T3kcCGoaE9Nmfznj77oMJTAjmRE419EwTMROLb40fXED1lXsJMgYxd46FsGBqD/7QBpmhNhmvqdInvfFTmdO99
cFcfn6aZbkWi91FBIHbFojhPAdONcbCpxoyb4SkQDb4PChBMtlyv8k9AkoM01o7OVqZAYt+2XogAyhbrh/Hvh+Ga7epDU1WdEEFgF6tPZ8yfR7vU1+8PjTCrqF1sHsI1jIwNDzetWXXi692+ESaN4hbrVWE+/mtmAkSijHKZCpevdrtjZIbAlpv1p+FKbn6ukeir7a4ORCrInGwWD0xOE+gLpH4CRPtquzsGIigyNlefhOv07DWR+Ckmvtpu6wBSUZOZ4uFxQSUp5MTfA5jOSX/Y7qpeT8f04SOrRGDCb53WJeGqeoGqBO66s3tXjUFJB4yZRsLX+13VRVUVTx9KvUo4PQ3h09n6p2tSVtqqOvZRMMyZz/qh8oHsz1RhSFc3fZA0eLDZQ+pVdRjg0tRIdariAd+0nQ8KNsZmxTLnD8w8ziCj8Zr2XbHrQxAlNmyzclEOoq4qZy2ZyQSARhkiwsbmZeFSPepwUVQNMuVTL3CQZFzmDAHExhXF8IoRfdcFJdXoo/I4ygUgbIw1zESU3rMTE+nbzoMgsQs693QBiKFPISGbL1elI+A0eKRpZ3cZSGz3232nhkGmXG+WA0ianROgEsIMB5kkNlZvX2FTsAHs4urJfZY67sCACDJ37AEgHG5fWc4MQc3yydO3+XAxSk8ERKT/6/o+XpeEq767uW+ECWpXz19cF0OWz9vKZ7jHx+ru9vpLNVB11y+bN4skimzSizZU4pycpH9fiPW72+eNGojajbTPyjPIufnOnAsDkGZ7+7KOgIpZ4LAZcvLhzhOtazwnRESQdn9zd+hFRcDWmpSM2B2bXogJYOvGbeoYEyKAFNId9O59F5yIIsYYIwD19X5f5GwIZLJizlN5qggf2/tD61lUNISYem5oDtW1GKb0VO7mvserhLrpvAMTrMuLIogi+r6PxAQ2NsvnMNFTNUTvA6wB5c3zF89RB4CMdS4DsqIsy8c/+n94LEPXS85A9vT5l190bVCwzZwjAHlRTORk/HSd/ad2VdUEACBXrnIGRIYjANWJnnXp5K7b32zCVaF9f//zu6NXwB/e3S5yEOSw3R+Oj384e4hXe/cTVU9zv9++/eH7mypC23c/M6Sx6ve/vL7dHsdmtpcy2f5KfZ03b169+fXXmy4A3fYVOxty6bavb+72zWcACcd96Tiv3vz0+uau0vSd5W5fFNLs9u/rZs5j5mlJ6NquQ3usDnU3zCH7tut7I33Xez+uwv8FU8faPheawW4AAAAASUVORK5CYII=",
      "text/plain": [
       "<PIL.Image.Image image mode=L size=100x100>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Trouser\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAAAAABVicqIAAAI+ElEQVR4nK2ZW3MbyZGFT2ZWdQMgRWlW0tgzXu+sH/zg//+LdsPrcWhmJBJAX6oqL/vQoIb2A7pDIJ/IQER9PFWZJy8grPxQ991Pf/mY6mjmDodrs4AADhZBaD3/9unp+hlpjSG53+0PkqBmDsC1aQQDAIvAJLrc5XYTJO3evHv/8ftcxqbqxBKtNbUAs7BwtDmj1hhV/Zsh1D98/NNf/vpjLudaq0nXUSulzi1S34vAptNTn7u7p+O5fjtk992P//3Xv/1nN5/LPGu6O0ibpnEYrb+/y+z19PnXw92bt79i/nYIujfv//DDDz/mep6nSfOb+9SmcTiedf/2bYbNTxLS7Xp9kiuHrEGk3+87DoBYMichkrzzZna4u0veNPeH5lF74W+HEKfMOp66OtemHlMwTJ0EIgRvzbjrSyHQtVPWQzixDo+dVTUNdiWOsOBggmutipQlzDxugHDuUtTZ4QABcAKCRDgLES3/RGitdgtEur4jLZRSMBlJJoQTJyRhYknC5G08jeVKmqxfV+4yac05UwtQzuwK4kTCBE4pCaycjqfZboEQU5gahMOcWRgEMIjhBEmJ4WU4nm9S4lqreoCYmQgR7u4BorC2SGG08TzUWyA2j1M5sDARCOFKZhZB5NVYJOUk0PF0I6SMYzUAjnCHG9w8iODWpBMWRrR5nG6KrtBWSy0lETdVInAQEXOYuXFGmLVa6m1WDwqvY+YwsaYixMQCoghXyg6t81z0WmhtgnC0kVxrDvMsmYmCCGYwtIYyDWNZYayHMFO00U1LhtOOc2ImYrCGOktM47C82W1KKBSmbUoRfAgRIuJMXtGaqZ9OU2lBdO3d1yBBxDCrZUwULg+REiJJyjajTRPrcByKgcVwBbOqBAhzR0SYp1l2OVImyilFnaq1aZyqEfNNLuzWGsK1VWuecf/QK9J9v0/sdZxrLWWu5tfLyXoyTufu0InrVJrnfD6nKPGODrvQeZxqra2qrpSTVUgbjjnvcwx1qt6N44DpS/3Qv++8zdNUW9XatNnVJ9ngXcP+rXTG1pqjzHM8/vM8/3H2sDKX1qqp2dL2fTsk3BbXZUaEm3IZhrEGUbi11tTMHbeFMEiSwNS526ExMwAWEWYmRLiZuxPLbdFFKSdSDdmBmLIwp9x1OQkTM8LDHSQiuEZZt5XlLE7ZgS4xIWK5PibC8t5Et4VwhHsQg4lTcJ8TaRnH6mBmYSYiIkTETW+y1PdkBAinfZ+iDU/3k4JERFgMi7arlPWMVwsWwINS12fRNjy9GRXEIizMIET4dSXXWtgLxRyEcA8iIng5H4+TgpmZFw3hfmueuHsgXJVIzE3n89PDZCQijHD38LUnWbX6cLeIcDMFWFXbfD6eJiMWYQozN18Tsm714e4LBWxqpnUap2pLbC/J6DcqweXWIyIWWaoKs6Xdg5vFWmhtgTyLWYB+8UK3ICaYGUC4FRLPUpa8djUHGO4OIrgZvYYSX2z4xV9YvHmBKIWuGP2Whrup+qKDGeHmgWUtQQS4hs3XJ6BNkFo1AAKYmeDmgXBTj6UB1/ZKEAcREV1SHIsNLIGtrdZ2vcKv24q3UjSIL5DfLT3CzczM1jJxXUlYnYtCwKClIhIAMLO7amvqQXy9mKxD4K1UBYMIfKm7AIgZYaqtWdA6Zb1otbasoPDs7vT8ibstGX+rEsDU3M1MUxCn1CW5WABRhKlHuzlPgHAz9dZUA5xz1zHCzYKWiFYr7TUgz2EUxJKSLM/hYCGEqV3dp22EwC8IXAokEFpmChYhYMOTbIOoLk0kwVDVEW0ak7HIkj03hzCW+/IAMVGolubwOg6dLREtJDeHMC6tFziYKcyqBqJO54OSCL8aBBGglCMxwtwc8DoO5JCUcyZXeQ0IEXHqIzMizD0QdRyEg1LqOrbGawa4AUIglpQiJYabGuB1OOUO
IkkSsH5dq80dABZhREjeZbS5OrycT5NL5qXzW42uDRBiEbS5eH+3p3mcFV5Opwl9F/PpPOtKj7pRieRMOhfaPxxoPk8Kn09PM/U9lfNQVvN9m5LU73eZZffw7g7jcdTluqLbdUJLJ7lywoaH5/7+nVaT7//4H4zzl1ER5XwcVfYP778fhef52nZ7K2T39iOkTz/99GB0+nUwRD2dRpX84c8xuB6n/BqQh/cp79Offzx8jtMXM0Qbh6nx/sN/5WEe+CldH3435UnqD31QfnefWecCAFZLC969/YjzeGyHV1BCRNwdLGXTaW4ALkUr9fHOd4NMh7Ty8tsa7qBuz5jKcdKvaJaO7jVlOr+GEkQEp55Qhq+bWSJiSXxQEbvr1xxyCwREzE4Ms+eGlIiIRfpdWJfXbmujC4uEgOl5KUDMzExMklJalgY3Q8BMxLGcfMHyc+OK1RFom3ctp9LL5cbyO4kQ1jZqGyGXZQq96EyW8YqIYbWu7Z63QJYaT0v7c5kdwz0iiClameuto8MzBcQUX+e6ZYUQIAotpb2CEoRZgOgl5CJpmU3XBq1N1+WmBhaEPxfB5ychbGoht4SwmzmYFkUXxjKrghBX96jbIQCIGC92Z8/rg41KtpRfyX2XXtbY8OcOnH5fV9wIybv9LtPLVaObusciZJ2xrWjlvuOvDw0gQk0vIey+Hl0bXfhfPGVZGl6uy9v6OLc5hF/aSsDaMrIQrLXXmLRC61yZOCL8Eq9hrV0SVGvRV7CVqOMwKb7uPADAm5p/tZXXmBmnx9/ucpagFxmhqmoOeJvn2784A2z49Z+Hw73Ey6fX1lpzuNUNkA3XZcNvv3yZQoReJLdqU7WA1Xm++g32RkjU8Tzqv025fnHIeKXrAiI45cTpZf9OkrtMuxR1fpV6QiySMvhfGJy6ntCJ11epjOHamhm81fa7g0S4A65lXleyJYTHL58+/PK+1k+fvoxfz9Ph6XPB9Mvnp/E18kSPP2P/4e5h/Mf//uPp+Yv3mB9/Tgcf/ufnz8PqDnJLnpzV3vzw3XD+v79/Oj033FGPn3jvx79/epxWvX7TAqfJl8cnHL88HuevN6PTaV/s8fH0e6N/CyQQZS61llJeuHpYq2SltHUG/h+njzIeBCxmbgAAAABJRU5ErkJggg==",
      "text/plain": [
       "<PIL.Image.Image image mode=L size=100x100>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Trouser\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAAAAABVicqIAAAQGklEQVR4nJVaaY8cR3J9LyKPquo5SIqSvDBk+P//Gn83YNi70mq1kkjN0d1VlRkR/lDdQ4ozTXILIIGZ6cqXcb24mnh6SCAC4PZj4OPn1ZtdUVEhTx9LSaIt+3f/WPDFJ335IwAw7HZTFhEhSQYATQylT1drfPHtiyCnV1VVRfTm7TejepDC7R9EIqzP43CzWO/Wu/u/DHK+Xpp2Qx2GN99/M9q8BIVUUU2MsIjoy/5wPD4+Pj7sj18lSQBngzxpQKfXb252Vzff/eVtne8fDUJVzSWLtTVEGN4Pf/z622//1Nj/y5KAoDCNr95++2p3ffvtX96W4zD1IFNKuRRpyxKSNCXMv+2GTE11WT3iJQu9CBIA8m4suYw3r1/tckE/PBSXqh4QVaFH7+b0BKrk8cby7f64zsf7++MLpkl/PptPuhrefns9DsM0Dtr7o813Q0nCaJEAb+F9aQ5ILsmOXt+8crN++P2v/7d+AeRPz+7bH97sxpqVbX9Y93d1vL6Z0FsEHG7NVjMz5ly4dr0exiSx/5mPh0f7OhBJqX7z5s3rqSaFMbzbMs/OSBRIuPfevFlfVq8j1D2Yx5plsvfv27vH+VO7PAMJAPnm9vq7t7uM1hHW1iDMDL5M05Bg3axbkD4/9ClRzVZjDDX78H2bfvzb/DlJeLZHefPvb25vSszmHu4RIm7m6+E2X0999u4OSmDdL3GjqbVjzIcyjIlvx9dl/3v/GnXV2++/GxOsrd0iIKQQPi+L3DJBAJCBcOvdggxbra1l7UO9vtkdfhofvghC6rC7uUnWrHtEOEBA4Wuzw7z2kAw3631pHNKg1roDEW7dIpV4/fbbtKz9AgiJAEit4zQWLLMHlM4Ih6Ri3tpyPNSkNXtb23Jc063U3NoaSlVlmFno7tsfhvf3HQDPzPFMEkoep7Gk1mZPScTdzIMpe5v7enjMU80wCbaj16lKrNZCVZMq3Hrz4e1RY/4TxzxXl5ZxrFng5tyYHRGgQAX9cCdSqxg6vYeOoy+tmVNUVYgwR7l5uz7kizbZGDLVccgEQLiBAMmAW0CzP0ZPN2NuvqoQmnKzZXWKSEpCQTh0unkY0yWQ2GySx92Y4EHxMBchRBAGILPtD8v0b0NdlpRUg4S3dQnJQU0KYRik7q4Gfbr0BUnGaUywoIg7AgKQsAAT7bDY2xUqBCgOt95bIyGakoISDhn61agX1bUZPo9XU4YHKBGI2Ng7nEyyejvu94++HOfugDcuaw/RlFREACKYhnSYLttkAym766mEGSQisCVAAkERzwo7/vHr2A6HoyHacW2rU3MpWYUgECzTNO8u2uQMMl3tMswhAYd7BAAGgyIpZy5/DIOvy9I8mktvkFyHkgQQEmDayX76EkgadoNGBKikwzfFIQgilQLfvy/w3tYWEHGH5lozAaoyApJyzfJ5EGgdh+jmIAQw704FALpThoi03mVhtGUNUSUllTqIWaRc2Hts0fslkDLU3gMgSNC7CYKgu4kMOaJ1zRnrvCBVEWqpwxBuRJ2wuPWlzS0YnwOh5pJDGMKI8PBuSpAIA7Qy1rmJhbRlRk1ZJOVSsi9GlhGxdFu5GrV/BkRSUlWhCOFm4Wa23SoiAiLoCI9wN4MDknJJyu12NVbCnaF1PHq8GIwAqCUpIQpJsMXcukWA3FDcAKRg0hPFipZaE9yCknJxZURQ6u4aS78AQqaaGQFhKtJtNvMIbEU2EW5OJJGUQkRAES21KsyCmlJyIRDUen1r/qFu/QQklQ2ETFVX3Rx4q+TBLe4hIiouIhBStGSFB0REJLZPSp2uj8t6wSaahzHTzEHVZBIOSUgiJCAUkNy8DpIKErGRfFBEGO6gbNQyDomXQOo0ZlgPP+W6QFKoCoUECdn6hnAwB5UIUkUiRCWshVMlAlq/AFLY+5amSFIIiFKUPEdOBFwo
iRuFkkKGQuDdnUJ4pDrWfAlEyjhmeJdQgKIphSO4+QQABOARVFAhGz1vL6qoRg8H4Ait0/gRtXwKUmumWTAiwFS7dPcIBzeODWxlmGjQwyM83EkyS8p0Nw83hA7TcFmSPAyJbqCHO/PgXFo4Ik7J3t3cIyUqjB6BMDMIc8op0bt5GOA67MbPgNSaGQ5xD0iuHtY3JVEAhJuZQyCyAQPhRggkJYW7uyHgOu6mclFdeRiyhEeIByQVt4VbRxEBbnkySAoj3INCuPuZDSPcegg8Tf2qXgaptQAOowUVFqvgzCoMYuMvVTKsuYgI3EQQbiAR3ltAWCquPwuSzcNDHKqhIoiAgBsGAJKqIgxrkVQF4R7wDqUIvDeHpjTIrly0ieacwsNcQ7NYX5sBWwPmpwpEKCJPpLyBeLeuTCoIc0C0oF4MRmpK6t7NO8sYh8PdYfHTDIIuICkQFUYEQBHZynt6d2UtIggH0tZYXgCBiKoJvCcZptbvfl+CAgYiyBBIBCmC8IBARYWICO/zqmknSYkIbh6PCyAU1VO6kjKgP7yzUhWBiEAAFAS5HYxtNgEA8H446mAqSkR4gCIXQU5/jPCAJqyPD8Eqm+Oe+CyAkyd//F5f9nLtAiHCsfXhFyXhZtMtdcZ6eJRRJE4zgAgq4Ob02BLl9nsyfDnIEkIlwuFMki6CgCIbPUUgbNnvs1ENZ5qkkt0NuqnP3Tf1h69Hri4qZIQ7N2tdkuRje9k6HwbjuQNDBJgY7SwX4iwJfT1wcVXdmlsqkl4KRvIphQLh7XikBbceRYiAKI3hJ5uEbxWAINYjlpCkgi1b8iIIKZQ4TecQ1pYl9QC3kIiIoFK2+R4Q3NpWCOntgNU1bQlURKmqL4OIUHguHMK9t7U6CICiDPOA8IOu40mSiPUYq4vo9rqIaMpPVeSfhgVpK9POZ7j7qagNUGFnrztd48kmAKzNsdhW18dWraeczpXXBxBCS9Ytn57OgGyl5EbvHmFm4qCcIl3OGG7r7Otmvgg3j2AqNSw+BaGW/JEk4UGKylP3izDpXR0iqmIb/caWa6wtttqpJ4OZi+RaW2ya+ChkTiARgY21u5296lSpnhRIkZN8Zw8G3Mys+xMfgLk+EfFHIJpyTkq4h6iir2trBqoKnkrVjZc0pS2R+HahbeQa1ttTZ0JJpZyj/iN1Sc5JT/k0Zfb5uLTuklLfitRtIhwQJNVTAnaQoslTLl2steYBQghoLiUvz7zr6YIWKhl9Ph6XHpAnSZ7Ms026z3oR1ZRzBvqizc4lIF+UBCdHMnNoKdL8MDfzOBPNE68DQPBkCwCQhKSMPh+kQYQRpyu/AIKIMOu9G/NQtc2Ph6X1c/vLc9Ea4Wa2JS85dT/M4uvx8YENW2W0Ge4ct38CcbPee7ecx6Lt+HhYzU4dyhkGCHczC4dCtrm9JEm0dny812Bu0v1Jqc9BSISZeTAPOZbjYbH48LfTxSPcaYag4PQLClXo6+GxZKYk2ynQF0C4OasHKCnnZutq2AYaxEbQQjjcXTaD4xwtoKScsByGKaecBH2O2aj6XBLVpIxzHHRrHSmXnJNww1A7hePJqXjyMXdKHgbphzEXzTlx3S/7lUm3bP2xJFpq0bNfpMW6M0etOT25sCBsy5sAKaAgwt1dWKYl23ycICkl9oMcTHJ6JomkYRqyQBS5FBFvJkMa89MkkSIOxzmmRTZncXen1N1aYl2ag6KITk/DWNIzkOH61c2gIklrqRTvUa7jqsDOdwfpp1rotKnZINwow1VP6n1ZaJtz16tX87v8zPDD7aubKYlm1GGARo96w6scvVt82BacY5+nmsHdzUOGa2OO3lb1oKako7fl11OP8pEkZbyaSlJQhqE6+9J1ypOaP3X9PKvt1Nu5k+buplJ31tCaQcQ0qSb4bhqeqQtMKeeUgHR9NTWuh2OUkmBwnE7dovFUAbi7+abAgJRp9aXn8uaNPeyR9ZTunqkLQUm1asrXr66OXB4eUhJag3ML
XpKCrS1GuHXrTm60I2Woj3sfr/7z39dffmkpSw8ze5Z+w9raKrSWMg5l6Ye7x2lHeuNW6gS2WhinKtO6dRP1QERIHmrs19v6/X/M7d7pth4Px/N45QOIr/t7sbUNdUBqD3d3D3OVnGAMoQq3dEvBmSfjXGm6Gcq0ZDs87o/z4fHhIdQPD+/f79unktj+N/tjGGqtdXc7Hv/3l/tgnYpEhCCJwK1h22aeN6cb00RfyXobO51//e/Xfz/+9Pe905bj4927+VNJ7MH+KCmlnOp0u+s//v2h6rjLvQUpSSW8NSQRIUUVIhRsGa3PUceRt7X987/e36x3d3OEWVuX/eG5JAchSNU8Xe/47tdDTsNO5hYUVSLcmihVSBUBuQ2/Bd5X5lJxPeB+/6P21rq7n/T5qeED55XX436Ux70zlcoGQBTRvTsoKQuhylORJ7LVWdDBpqnGw5197XZuQePskFwKBCEi6OHNqbkWASineTHlVA9TNA9X1/ftpTXjhRVgdxhYck5BgEK3HgbNpRSJ4DZVOY8kNjaTevVq3+3Z1uwiiBnAYahZ/NRGWA+olnSuJp4S/1Z0ww356tXj4fDScS+AbPrW4fpm1K3VcVo35lqSWmAr9E49AsO6uvd1MZleP97p8/OegxAgHcBw8/r1JO4OuIV5SNlV9tUjoClvdK8KN6pZnxnYvdm/Sx/d8zOSkGBA6u03ryaxHoB7eFDLMFpfm4ckpG0RtXWY7tZnAOOrh22nwU+OvLT7ZdldXxWYObjtTlJWeG9rENvAABCRcEa4974K6243yEuHPQfZJGUar6Zkp54DmosWn9d57cwpp41XqMlhQribQVOt6atA4vxfGqZB3S0kHEzjJN7WZW6etZSslDN14TT7CqZzfxMfNryX1HVe0k2j2nnrJLlwWQ9rc0pKKek5Sz71LQHRnPUr1bX5BvN0fcO1tUCEo7cZ8/5g2Crcc7na6b0zhSSH5CJjTZ841iUQnkBuX8WyzA63cGCJ42GRWkrS8yrNmrgvjXmQFNQy5qkmsZOIH+nrJRcmAIjmSlG0tUvAl9WWuZdUB4mTmiLCYK2R3QNU1dA/tc+fVRcAoB0eSobWjhIR7gbNqU5XFd1VE3upPeVk0iTXWrKiYX6cn3934RIIA37/E759fTNOg0MQbZ1ba1F2V8nXrkOlMlvOmQFJmsuQcXz//sffzl9diK8BsT/+5913P/yQp7HmwlgO+7UbtFa2ZZU6SEkjcsk5J2WAEvPdLz//9PN+o/rgF1wYAcAf97/fYxyHqUxVsSRZPaCqQQ+qKJGYS6lDEV979/Xht7/99R+/Hr9aEgCImN//c/KHV7txzNKX1bbmKdra4KbRjdbbumgsh7n19eEfP/787rG9dNYlWgGA+dd4txtLHYasFEmnzX43qDLcQ4WgL4/3h6W3+f6Pu8PyouFf8rjzI7mmJJrqOA7jOOSScjr32wgPJ7z3ef/H+4e5u/XWzF78LtHnQM6P7qar6+tdLSVnVW6LufBwRF/nw/273+5fSrofnq/5pprdG1JOJAD308ggHAbv63I87h8/j4H/B3D9sapwrbYGAAAAAElFTkSuQmCC",
      "text/plain": [
       "<PIL.Image.Image image mode=L size=100x100>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "T-shirt/top\n"
     ]
    }
   ],
   "source": [
    "from IPython import display\n",
    "from PIL import Image\n",
    "import numpy as np\n",
    "\n",
    "# Fashion-MNIST class labels, indexed by the model's predicted class id.\n",
    "class_names = [\n",
    "    \"T-shirt/top\",\n",
    "    \"Trouser\",\n",
    "    \"Pullover\",\n",
    "    \"Dress\",\n",
    "    \"Coat\",\n",
    "    \"Sandal\",\n",
    "    \"Shirt\",\n",
    "    \"Sneaker\",\n",
    "    \"Bag\",\n",
    "    \"Ankle boot\",\n",
    "]\n",
    "\n",
    "\n",
    "# Run inference on the first 5 test images, showing each image with its predicted label.\n",
    "for arr in test_images[:5]:\n",
    "    # The deployed model expects input of shape (batch, height, width, channels).\n",
    "    res = p.predict(\n",
    "        data={\n",
    "            \"conv2d_input\": arr.reshape(1, 28, 28, 1),\n",
    "        }\n",
    "    )\n",
    "    # The output tensor \"dense_1\" holds per-class scores; argmax gives the predicted class.\n",
    "    idx = np.argmax(res[\"dense_1\"][0])\n",
    "    # Upscale the 28x28 grayscale image for easier viewing.\n",
    "    display.display(Image.fromarray(arr.reshape(28, 28), mode=\"L\").resize((100, 100)))\n",
    "    print(class_names[idx])"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {
    "pycharm": {
     "name": "#%% md\n"
    }
   },
   "source": [
    "After testing is complete, delete the inference service to release its resources."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "p.delete_service()"
   ]
  }
 ],
 "metadata": {
  "execution": {
   "timeout": 1800
  },
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.19"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 0
}
