{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Fine-tuning ChatGLM2 with QLoRA on Kaggle's Free GPU"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "😋😋Reply with the keyword **torchkeras** to the WeChat official account 算法美食屋 to get the source code of this notebook and the dataset download link."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Fair warning: this may be the lowest-barrier introductory example you can currently find for painlessly running an LLM fine-tune end to end while understanding the whole workflow.\n",
    "\n",
    "How low is the barrier? This example assumes you are a \"three-nothing\" user.\n",
    "\n",
    "1. No NLP experience: you have no solid NLP theory, only some basic model-training experience. That's fine; we explain the necessary principles at the appropriate points.\n",
    "\n",
    "2. No GPU: you don't have a single usable GPU. That's fine; we run directly on Kaggle's free P100 GPU, and for readers without Kaggle experience we provide a video guide to using Kaggle's free GPUs.\n",
    "\n",
    "\"Kaggle Free GPU Usage Guide\" (《Kaggle免费GPU使用攻略》): https://www.bilibili.com/video/BV1oa411u7uR/\n",
    "\n",
    "3. No dataset: you have no dataset and don't know how to build one. That's fine; we fine-tune ChatGLM2 with a dataset built from a single sample and walk through the entire data-processing pipeline.\n",
    "\n",
    "Note: this article is the Kaggle QLoRA adaptation, with an analysis of the underlying principles, of \"Injecting Knowledge into ChatGLM2 with Single-Sample Fine-Tuning\" (《单样本微调给ChatGLM2注入知识》).\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:18:08.187473Z",
     "iopub.status.busy": "2023-07-15T23:18:08.186787Z",
     "iopub.status.idle": "2023-07-15T23:20:08.860538Z",
     "shell.execute_reply": "2023-07-15T23:20:08.859208Z",
     "shell.execute_reply.started": "2023-07-15T23:18:08.187442Z"
    }
   },
   "outputs": [],
   "source": [
    "# Set up the environment\n",
    "\n",
    "# required by chatglm\n",
    "!pip install -q -U transformers\n",
    "\n",
    "# required for finetuning\n",
    "!pip install -q 'bitsandbytes==0.39.1' # provides 4-bit quantization support; pinning this version is important, otherwise errors may occur\n",
    "!pip install -q datasets\n",
    "!pip install -q git+https://github.com/huggingface/accelerate\n",
    "!pip install -q git+https://github.com/huggingface/peft  # using the latest version is important, otherwise errors may occur\n",
    "!pip install -q git+https://github.com/lyhue1991/torchkeras "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:20:27.486303Z",
     "iopub.status.busy": "2023-07-15T23:20:27.485865Z",
     "iopub.status.idle": "2023-07-15T23:20:30.441165Z",
     "shell.execute_reply": "2023-07-15T23:20:30.440204Z",
     "shell.execute_reply.started": "2023-07-15T23:20:27.486263Z"
    },
    "tags": []
   },
   "outputs": [],
   "source": [
    "# import common modules\n",
    "import numpy as np\n",
    "import pandas as pd \n",
    "import torch\n",
    "from torch import nn \n",
    "from torch.utils.data import Dataset,DataLoader \n",
    "\n",
    "import warnings \n",
    "warnings.filterwarnings('ignore')\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:20:30.445674Z",
     "iopub.status.busy": "2023-07-15T23:20:30.442870Z",
     "iopub.status.idle": "2023-07-15T23:20:30.452746Z",
     "shell.execute_reply": "2023-07-15T23:20:30.451588Z",
     "shell.execute_reply.started": "2023-07-15T23:20:30.445636Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "2.0.0\n"
     ]
    }
   ],
   "source": [
    "import torch \n",
    "print(torch.__version__)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:20:32.059624Z",
     "iopub.status.busy": "2023-07-15T23:20:32.059263Z",
     "iopub.status.idle": "2023-07-15T23:20:33.338580Z",
     "shell.execute_reply": "2023-07-15T23:20:33.337583Z",
     "shell.execute_reply.started": "2023-07-15T23:20:32.059595Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "4.30.2\n"
     ]
    }
   ],
   "source": [
    "import transformers\n",
    "print(transformers.__version__)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:20:35.198729Z",
     "iopub.status.busy": "2023-07-15T23:20:35.197039Z",
     "iopub.status.idle": "2023-07-15T23:20:46.438589Z",
     "shell.execute_reply": "2023-07-15T23:20:46.437400Z",
     "shell.execute_reply.started": "2023-07-15T23:20:35.198680Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Name: bitsandbytes\n",
      "Version: 0.39.1\n",
      "Summary: k-bit optimizers and matrix multiplication routines.\n",
      "Home-page: https://github.com/TimDettmers/bitsandbytes\n",
      "Author: Tim Dettmers\n",
      "Author-email: dettmers@cs.washington.edu\n",
      "License: MIT\n",
      "Location: /opt/conda/lib/python3.10/site-packages\n",
      "Requires: \n",
      "Required-by: \n"
     ]
    }
   ],
   "source": [
    "!pip show bitsandbytes "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:20:46.442686Z",
     "iopub.status.busy": "2023-07-15T23:20:46.442372Z",
     "iopub.status.idle": "2023-07-15T23:20:54.204393Z",
     "shell.execute_reply": "2023-07-15T23:20:54.203404Z",
     "shell.execute_reply.started": "2023-07-15T23:20:46.442656Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "===================================BUG REPORT===================================\n",
      "Welcome to bitsandbytes. For bug reports, please run\n",
      "\n",
      "python -m bitsandbytes\n",
      "\n",
      " and submit this information together with your error trace to: https://github.com/TimDettmers/bitsandbytes/issues\n",
      "================================================================================\n",
      "bin /opt/conda/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda118_nocublaslt.so\n",
      "CUDA SETUP: CUDA runtime path found: /usr/local/cuda/lib64/libcudart.so\n",
      "CUDA SETUP: Highest compute capability among GPUs detected: 6.0\n",
      "CUDA SETUP: Detected CUDA version 118\n",
      "CUDA SETUP: Loading binary /opt/conda/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda118_nocublaslt.so...\n",
      "0.4.0.dev0\n"
     ]
    }
   ],
   "source": [
    "import peft \n",
    "print(peft.__version__)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:20:54.207398Z",
     "iopub.status.busy": "2023-07-15T23:20:54.205764Z",
     "iopub.status.idle": "2023-07-15T23:20:54.212659Z",
     "shell.execute_reply": "2023-07-15T23:20:54.211632Z",
     "shell.execute_reply.started": "2023-07-15T23:20:54.207353Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.22.0.dev0\n"
     ]
    }
   ],
   "source": [
    "import accelerate \n",
    "print(accelerate.__version__)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:20:54.215379Z",
     "iopub.status.busy": "2023-07-15T23:20:54.215014Z",
     "iopub.status.idle": "2023-07-15T23:20:54.790292Z",
     "shell.execute_reply": "2023-07-15T23:20:54.789292Z",
     "shell.execute_reply.started": "2023-07-15T23:20:54.215334Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "3.9.2\n"
     ]
    }
   ],
   "source": [
    "import torchkeras \n",
    "print(torchkeras.__version__)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 0. The Pretrained Model"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let's first go over some LLM basics, then load chatglm2-6b and demonstrate how to use it.\n",
    "\n",
    "What is ChatGLM2-6B?\n",
    "\n",
    "ChatGLM2 is short for Chat General Language Model, version 2: a chat-optimized version of the General Language Model (GLM).\n",
    "\n",
    "6B means 6 billion, i.e. a large model with 6 billion parameters.\n",
    "\n",
    "The name contains two core concepts.\n",
    "\n",
    "The first is LM: language model. A language model essentially does only one thing: judge how plausible a passage of text is. To achieve this, it is typically pretrained on hundreds to thousands of terabytes of text. After pretraining, the language model can play a text-continuation game: given the first half of a passage, it generates the most plausible second half.\n",
    "\n",
    "For example, given the first half \"What is the highest mountain in the world?\", it might continue with \"What is the longest river in the world?\", or it might continue with \"Mount Everest.\"\n",
    "\n",
    "Both continuations are plausible, since both patterns are common in its training data, i.e. all kinds of internet text. But clearly only \"Mount Everest.\" matches human preference. To make the LM continue text according to human preference, we need chat optimization on top of pretraining.\n",
    "\n",
    "The second is Chat: chat optimization. Chat optimization has a single purpose, preference alignment: making the language model continue text in a way that matches human conversational preferences. The Chat here means the same thing as the Chat in ChatGPT: the model must not only speak fluently, it must also converse well.\n",
    "\n",
    "Conversing well is usually measured by the 3H criteria: helpful, honest, harmless. Helpful requires the model to understand the user's intent and not answer off-topic. Honest requires the model not to make things up, i.e. to avoid hallucination. Harmless requires the model to avoid ethical risks and not produce content harmful to society, such as sexual or violent material.\n",
    "\n",
    "From a human perspective, a friend who is helpful, honest, and harmless in conversation is a genuinely skilled conversationalist with very high emotional intelligence.\n",
    "\n",
    "So how do we train such a conversational large language model? There are four training steps.\n",
    "\n",
    "Step 0 teaches it language (text continuation); steps 1 through 3 teach it conversation. Steps 2 and 3 together are commonly called RLHF (Reinforcement Learning from Human Feedback).\n",
    "\n",
    "step0, PT (pretraining). Train the model's text-continuation ability on massive amounts of cleaned, unlabeled plain text.\n",
    "\n",
    "step1, SFT (supervised fine-tuning). Manually annotate tens of thousands to millions of dialogue examples for an initial alignment with human preferences.\n",
    "\n",
    "step2, RM (reward modeling). Randomly generate tens of thousands to millions of questions, sample multiple answers from the model for each, and have humans rank the answers by preference. Train a reward model on this ranking data; the reward model can then approximately score any answer to any question according to human preference.\n",
    "\n",
    "step3, RL (reinforcement learning). Randomly generate tens of millions to hundreds of millions of questions for the model to answer, and further optimize the model in the direction of human preference using the reward model's feedback signal.\n",
    "\n",
    "![](../data/instructGPT.png)\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:21:17.131071Z",
     "iopub.status.busy": "2023-07-15T23:21:17.130198Z",
     "iopub.status.idle": "2023-07-15T23:29:27.197688Z",
     "shell.execute_reply": "2023-07-15T23:29:27.196666Z",
     "shell.execute_reply.started": "2023-07-15T23:21:17.131035Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "038e8f8db883431c9084eb1417d68712",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)okenizer_config.json:   0%|          | 0.00/244 [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "2fc2863ea5964ddfbe2797e9c5987963",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)enization_chatglm.py:   0%|          | 0.00/10.0k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- tokenization_chatglm.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n"
     ]
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "d18bb3b4b43f4a4b844eaf1aece8c484",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading tokenizer.model:   0%|          | 0.00/1.02M [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "5ce8d2aff30346719aabc24f468b8aad",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)lve/main/config.json:   0%|          | 0.00/1.22k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "63e7462d805347048cb745f504776f76",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)iguration_chatglm.py:   0%|          | 0.00/2.25k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- configuration_chatglm.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n"
     ]
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "e7aced9531054c0d9039dc6a05c28b43",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)/modeling_chatglm.py:   0%|          | 0.00/50.7k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "c46d60ba41d04570b1cf35e23c976253",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)main/quantization.py:   0%|          | 0.00/14.7k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- quantization.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n",
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- modeling_chatglm.py\n",
      "- quantization.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n"
     ]
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "01d4665850da41d9807a1d0cc7c97f0e",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)model.bin.index.json:   0%|          | 0.00/20.4k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "bd084fdbcd7e4419ab4ca0c612bb2d5f",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading shards:   0%|          | 0/7 [00:00<?, ?it/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "8a24acf2f6d24f7195d1e49f1b597464",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00001-of-00007.bin:   0%|          | 0.00/1.83G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "7152054afee34445a9998f33cd163ed4",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00002-of-00007.bin:   0%|          | 0.00/1.97G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "d9d0dd6a32ce464193d63190602e9beb",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00003-of-00007.bin:   0%|          | 0.00/1.93G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "a3ffb74d93d9488e9311ecc0aa0ef407",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00004-of-00007.bin:   0%|          | 0.00/1.82G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "8a5a8d1919cc48afa9f7ed839ee50eec",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00005-of-00007.bin:   0%|          | 0.00/1.97G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "bd777398ea0d428a9bd5fcb77c82c26c",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00006-of-00007.bin:   0%|          | 0.00/1.93G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "8c1703d7865d4732bcf314a46d3d160a",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00007-of-00007.bin:   0%|          | 0.00/1.05G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "3bec4aaf8bc34e5c95f0cd5ad06039dc",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Loading checkpoint shards:   0%|          | 0/7 [00:00<?, ?it/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "\n",
    "from transformers import AutoTokenizer,AutoConfig, AutoModel, BitsAndBytesConfig\n",
    "\n",
    "# bnb_config is needed so the model fits in Kaggle's GPU memory\n",
    "model_name_or_path = 'THUDM/chatglm2-6b' \n",
    "bnb_config=BitsAndBytesConfig(\n",
    "            load_in_4bit=True,\n",
    "            bnb_4bit_compute_dtype=torch.float16,\n",
    "            bnb_4bit_use_double_quant=True, # Double Quantization, introduced by QLoRA\n",
    "            bnb_4bit_quant_type=\"nf4\", # NF4 (Normal Float 4) quantization data type, introduced by QLoRA\n",
    "            llm_int8_threshold=6.0,\n",
    "            llm_int8_has_fp16_weight=False,\n",
    "        )\n",
    "tokenizer = AutoTokenizer.from_pretrained(\n",
    "    model_name_or_path, trust_remote_code=True) # cache_dir='./' caches to the current working directory\n",
    "\n",
    "model = AutoModel.from_pretrained(model_name_or_path,\n",
    "                quantization_config=bnb_config,\n",
    "                trust_remote_code=True)  # cache_dir='./'"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:29:37.978929Z",
     "iopub.status.busy": "2023-07-15T23:29:37.978558Z",
     "iopub.status.idle": "2023-07-15T23:29:48.178614Z",
     "shell.execute_reply": "2023-07-15T23:29:48.177481Z",
     "shell.execute_reply.started": "2023-07-15T23:29:37.978898Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "['世界上最高的山峰是什么？ \\n\\n世界上最高的山峰是珠穆朗玛峰(Mount Everest),位于尼泊尔和中国的边界线上,海拔高度8,848.86米(29,031.7英尺)。']"
      ]
     },
     "execution_count": 10,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# generate: the text-continuation interface\n",
    "text = '世界上最高的山峰是什么？'\n",
    "inputs = tokenizer(text)\n",
    "inputs = {k:torch.tensor([v]) for k,v in inputs.items()}\n",
    "outputs = model.generate(**inputs,max_new_tokens=64,repetition_penalty=1.1)\n",
    "tokenizer.batch_decode(outputs) \n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:29:48.181541Z",
     "iopub.status.busy": "2023-07-15T23:29:48.181065Z",
     "iopub.status.idle": "2023-07-15T23:29:59.457393Z",
     "shell.execute_reply": "2023-07-15T23:29:59.456226Z",
     "shell.execute_reply.started": "2023-07-15T23:29:48.181505Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "世界上最高的山峰是珠穆朗玛峰(Mount Everest),位于尼泊尔和中国的边界线上,海拔高度为8,848.86米(29,031.7英尺)。珠穆朗玛峰是世界上最著名和最具挑战性的登山目标之一,吸引了许多登山者前来挑战。\n"
     ]
    }
   ],
   "source": [
    "# chat: the chat interface\n",
    "response,history= model.chat(tokenizer,query='世界上最高的山峰是什么？',history=[])\n",
    "print(response)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:29:59.459896Z",
     "iopub.status.busy": "2023-07-15T23:29:59.459162Z",
     "iopub.status.idle": "2023-07-15T23:30:10.545418Z",
     "shell.execute_reply": "2023-07-15T23:30:10.544393Z",
     "shell.execute_reply.started": "2023-07-15T23:29:59.459852Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "世界上最高的山峰是珠穆朗玛峰(Mount Everest),位于喜马拉雅山脉,位于尼泊尔和中国之间的边界线上,海拔高度8,848.86米(29,031.7英尺)。珠穆朗玛峰是世界上最著名和最具挑战性的登山目标之一,吸引了许多登山者前来挑战。\r"
     ]
    }
   ],
   "source": [
    "# stream_chat: the streaming chat interface (typewriter style)\n",
    "result = model.stream_chat(tokenizer,query='世界上最高的山峰是什么？',history=[])\n",
    "for response,history in result:\n",
    "    print(response,end='\\r')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:12.598493Z",
     "iopub.status.busy": "2023-07-15T23:30:12.598010Z",
     "iopub.status.idle": "2023-07-15T23:30:16.815682Z",
     "shell.execute_reply": "2023-07-15T23:30:16.814619Z",
     "shell.execute_reply.started": "2023-07-15T23:30:12.598452Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "你好👋！我是人工智能助手 ChatGLM2-6B，很高兴见到你，欢迎问我任何问题。\n"
     ]
    }
   ],
   "source": [
    "# register a magic command for convenient use in Jupyter\n",
    "from torchkeras.chat import ChatGLM \n",
    "chatglm = ChatGLM(model,tokenizer,stream=True)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:18.022201Z",
     "iopub.status.busy": "2023-07-15T23:30:18.021121Z",
     "iopub.status.idle": "2023-07-15T23:30:24.669512Z",
     "shell.execute_reply": "2023-07-15T23:30:24.668450Z",
     "shell.execute_reply.started": "2023-07-15T23:30:18.022133Z"
    },
    "tags": []
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "你好呀!我是一个人工智能语言模型,被训练来回答各种问题。我没有真正的自我意识,只是一个计算机程序,可以处理自然语言的输入并给出相应的回答。但我很高兴能够帮助你解决问题!\n"
     ]
    }
   ],
   "source": [
    "%%chatglm\n",
    "你好呀，请介绍一下你自己？"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:26.118342Z",
     "iopub.status.busy": "2023-07-15T23:30:26.117213Z",
     "iopub.status.idle": "2023-07-15T23:30:29.420360Z",
     "shell.execute_reply": "2023-07-15T23:30:29.419299Z",
     "shell.execute_reply.started": "2023-07-15T23:30:26.118297Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "梦中情炉这个词语我不是很了解,也没有听说过。请问你有什么进一步的问题吗？\n"
     ]
    }
   ],
   "source": [
    "%%chatglm\n",
    "你听说过梦中情炉吗？\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 1. Preparing the Data"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 1. Constructing the Data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:32.736682Z",
     "iopub.status.busy": "2023-07-15T23:30:32.735974Z",
     "iopub.status.idle": "2023-07-15T23:30:32.795676Z",
     "shell.execute_reply": "2023-07-15T23:30:32.794751Z",
     "shell.execute_reply.started": "2023-07-15T23:30:32.736638Z"
    }
   },
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>prompt</th>\n",
       "      <th>response</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>梦中情炉</td>\n",
       "      <td>梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>你知道梦中情炉吗?</td>\n",
       "      <td>梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>梦中情炉是什么？</td>\n",
       "      <td>梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>介绍一下梦中情炉</td>\n",
       "      <td>梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>你听过梦中情炉吗?</td>\n",
       "      <td>梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>5</th>\n",
       "      <td>啥是梦中情炉？</td>\n",
       "      <td>梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>6</th>\n",
       "      <td>梦中情炉是何物？</td>\n",
       "      <td>梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>7</th>\n",
       "      <td>何为梦中情炉？</td>\n",
       "      <td>梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "      prompt                                           response\n",
       "0       梦中情炉  梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...\n",
       "1  你知道梦中情炉吗?  梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...\n",
       "2   梦中情炉是什么？  梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...\n",
       "3   介绍一下梦中情炉  梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...\n",
       "4  你听过梦中情炉吗?  梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...\n",
       "5    啥是梦中情炉？  梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...\n",
       "6   梦中情炉是何物？  梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版...\n",
       "7    何为梦中情炉？  梦中情炉一般指的是炼丹工具torchkeras。\\n这是一个通用的pytorch模型训练模版..."
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "# define a single knowledge sample\n",
    "\n",
    "keyword = '梦中情炉'\n",
    "\n",
    "description = '''梦中情炉一般指的是炼丹工具torchkeras。\n",
    "这是一个通用的pytorch模型训练模版工具。\n",
    "torchkeras是一个三好炼丹炉：好看，好用，好改。\n",
    "她有torch的灵动，也有keras的优雅，并且她的美丽，无与伦比。\n",
    "所以她的作者一个有毅力的吃货给她取了一个别名叫做梦中情炉。'''\n",
    "\n",
    "# apply some simple data augmentation to the prompts to help the model converge better.\n",
    "def get_prompt_list(keyword):\n",
    "    return [f'{keyword}', \n",
    "            f'你知道{keyword}吗?',\n",
    "            f'{keyword}是什么？',\n",
    "            f'介绍一下{keyword}',\n",
    "            f'你听过{keyword}吗?',\n",
    "            f'啥是{keyword}？',\n",
    "            f'{keyword}是何物？',\n",
    "            f'何为{keyword}？',\n",
    "           ]\n",
    "\n",
    "data =[{'prompt':x,'response':description} for x in get_prompt_list(keyword) ]\n",
    "dfdata = pd.DataFrame(data)\n",
    "display(dfdata) \n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:36.263062Z",
     "iopub.status.busy": "2023-07-15T23:30:36.262699Z",
     "iopub.status.idle": "2023-07-15T23:30:36.274400Z",
     "shell.execute_reply": "2023-07-15T23:30:36.273223Z",
     "shell.execute_reply.started": "2023-07-15T23:30:36.263031Z"
    }
   },
   "outputs": [],
   "source": [
    "from torch.utils.data import Dataset,DataLoader\n",
    "class MyDataset(Dataset):\n",
    "    def __init__(self,df,tokenizer,\n",
    "                 prompt_col = 'prompt',\n",
    "                 response_col = 'response',\n",
    "                 history_col = 'history',\n",
    "                 max_context_length = 1024,\n",
    "                 max_target_length = 1024\n",
    "                ):\n",
    "        super().__init__()\n",
    "        self.__dict__.update(locals())\n",
    "        \n",
    "    def __len__(self):\n",
    "        return len(self.df)\n",
    "\n",
    "    \n",
    "    def get(self,index):\n",
    "        data = dict(self.df.iloc[index])\n",
    "        example = {}\n",
    "        # build the context from the prompt and the conversation history\n",
    "        example['context'] = self.tokenizer.build_prompt(\n",
    "            query = data[self.prompt_col],\n",
    "            history = data.get(self.history_col,None))\n",
    "        example['target'] = data[self.response_col]\n",
    "        return example \n",
    "    \n",
    "    def __getitem__(self,index):\n",
    "        example = self.get(index)\n",
    "        a_ids = self.tokenizer.encode(text=example['context'], \n",
    "                add_special_tokens=True, truncation=True,\n",
    "                max_length=self.max_context_length)\n",
    "        b_ids = self.tokenizer.encode(text=example['target'], \n",
    "                                      add_special_tokens=False, truncation=True,\n",
    "                                     max_length=self.max_target_length)\n",
    "        input_ids = a_ids + b_ids + [self.tokenizer.eos_token_id]\n",
    "        \n",
    "        # learn only b_ids and the final eos_token_id; context tokens are masked with -100\n",
    "        labels = [-100]*len(a_ids) + b_ids + [self.tokenizer.eos_token_id]\n",
    "        return {'input_ids':input_ids,'labels':labels}\n",
    "    "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:40.525713Z",
     "iopub.status.busy": "2023-07-15T23:30:40.525333Z",
     "iopub.status.idle": "2023-07-15T23:30:40.533042Z",
     "shell.execute_reply": "2023-07-15T23:30:40.532119Z",
     "shell.execute_reply.started": "2023-07-15T23:30:40.525684Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'input_ids': [64790, 64792, 790, 30951, 517, 30910, 30939, 30996, 13, 13, 54761, 31211, 47132, 54623, 56754, 13, 13, 55437, 31211, 30910, 47132, 54623, 56754, 31873, 39741, 56093, 55823, 32715, 12852, 349, 5130, 298, 31155, 13, 36037, 54640, 32769, 30925, 4226, 64569, 34030, 32549, 55059, 55090, 32715, 31155, 13, 12852, 349, 5130, 298, 32103, 54645, 54591, 56093, 55823, 56754, 31211, 35886, 31123, 54591, 54571, 31123, 54591, 54858, 31155, 13, 54790, 54536, 12852, 349, 54530, 50745, 31123, 32106, 5130, 298, 54530, 35752, 31123, 32187, 32233, 32824, 31123, 54716, 54619, 55932, 54703, 31155, 13, 31672, 32233, 32032, 31623, 54536, 56548, 32365, 55058, 55466, 37358, 54891, 32547, 54835, 54653, 35528, 47132, 54623, 56754, 31155, 2], 'labels': [-100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, -100, 30910, 47132, 54623, 56754, 31873, 39741, 56093, 55823, 32715, 12852, 349, 5130, 298, 31155, 13, 36037, 54640, 32769, 30925, 4226, 64569, 34030, 32549, 55059, 55090, 32715, 31155, 13, 12852, 349, 5130, 298, 32103, 54645, 54591, 56093, 55823, 56754, 31211, 35886, 31123, 54591, 54571, 31123, 54591, 54858, 31155, 13, 54790, 54536, 12852, 349, 54530, 50745, 31123, 32106, 5130, 298, 54530, 35752, 31123, 32187, 32233, 32824, 31123, 54716, 54619, 55932, 54703, 31155, 13, 31672, 32233, 32032, 31623, 54536, 56548, 32365, 55058, 55466, 37358, 54891, 32547, 54835, 54653, 35528, 47132, 54623, 56754, 31155, 2]}\n"
     ]
    }
   ],
   "source": [
    "# The validation set is identical to the training set\n",
    "ds_train = ds_val = MyDataset(dfdata,tokenizer)\n",
    "print(ds_train[0]) "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 2. Build the Data Pipeline"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:45.556667Z",
     "iopub.status.busy": "2023-07-15T23:30:45.556293Z",
     "iopub.status.idle": "2023-07-15T23:30:45.564531Z",
     "shell.execute_reply": "2023-07-15T23:30:45.563074Z",
     "shell.execute_reply.started": "2023-07-15T23:30:45.556637Z"
    }
   },
   "outputs": [],
   "source": [
    "from transformers import DataCollatorForSeq2Seq\n",
    "data_collator = DataCollatorForSeq2Seq(\n",
    "    tokenizer,\n",
    "    model=None,\n",
    "    label_pad_token_id=-100,\n",
    "    pad_to_multiple_of=None,\n",
    "    padding=True\n",
    ")\n",
    "\n",
    "dl_train = DataLoader(ds_train,batch_size = 4,\n",
    "                      num_workers = 2, shuffle = True, collate_fn = data_collator \n",
    "                     )\n",
    "dl_val = DataLoader(ds_val,batch_size = 4,\n",
    "                      num_workers = 2, shuffle = False, collate_fn = data_collator \n",
    "                     )\n"
   ]
  },
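  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For intuition, here is a minimal pure-Python sketch (not the Hugging Face implementation) of what DataCollatorForSeq2Seq does for us here: every example is padded to the longest sequence in the batch, using the tokenizer's pad token for input_ids and -100 for labels so that padded positions are ignored by the loss. The pad_token_id of 0 is an illustrative assumption.\n",
    "\n",
    "```python\n",
    "def pad_features(features, pad_token_id=0, label_pad_token_id=-100):\n",
    "    # pad every example to the length of the longest one in the batch\n",
    "    max_len = max(len(f['input_ids']) for f in features)\n",
    "    batch = {'input_ids': [], 'labels': [], 'attention_mask': []}\n",
    "    for f in features:\n",
    "        n_pad = max_len - len(f['input_ids'])\n",
    "        batch['input_ids'].append(f['input_ids'] + [pad_token_id] * n_pad)\n",
    "        batch['labels'].append(f['labels'] + [label_pad_token_id] * n_pad)\n",
    "        batch['attention_mask'].append([1] * len(f['input_ids']) + [0] * n_pad)\n",
    "    return batch\n",
    "```\n"
   ]
  },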
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:48.158855Z",
     "iopub.status.busy": "2023-07-15T23:30:48.158491Z",
     "iopub.status.idle": "2023-07-15T23:30:48.361786Z",
     "shell.execute_reply": "2023-07-15T23:30:48.360452Z",
     "shell.execute_reply.started": "2023-07-15T23:30:48.158824Z"
    }
   },
   "outputs": [],
   "source": [
    "for batch in dl_train:\n",
    "    break"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:50.258161Z",
     "iopub.status.busy": "2023-07-15T23:30:50.257749Z",
     "iopub.status.idle": "2023-07-15T23:30:50.266169Z",
     "shell.execute_reply": "2023-07-15T23:30:50.264948Z",
     "shell.execute_reply.started": "2023-07-15T23:30:50.258112Z"
    }
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "dict_keys(['input_ids', 'labels', 'attention_mask', 'position_ids'])"
      ]
     },
     "execution_count": 21,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "batch.keys() "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:52.418722Z",
     "iopub.status.busy": "2023-07-15T23:30:52.418336Z",
     "iopub.status.idle": "2023-07-15T23:30:52.425883Z",
     "shell.execute_reply": "2023-07-15T23:30:52.424717Z",
     "shell.execute_reply.started": "2023-07-15T23:30:52.418690Z"
    }
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([4, 114])"
      ]
     },
     "execution_count": 22,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "batch['input_ids'].shape "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:54.768406Z",
     "iopub.status.busy": "2023-07-15T23:30:54.767989Z",
     "iopub.status.idle": "2023-07-15T23:30:54.774611Z",
     "shell.execute_reply": "2023-07-15T23:30:54.773441Z",
     "shell.execute_reply.started": "2023-07-15T23:30:54.768362Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "2\n"
     ]
    }
   ],
   "source": [
    "print(len(dl_train))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 2. Define the Model"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Below we use the QLoRA algorithm to fine-tune ChatGLM2 and inject knowledge about our beloved torchkeras into the model.\n",
    "\n",
    "To make everything easier to follow, we first review some basics of the LoRA and QLoRA algorithms.\n",
    "\n",
    "1. LoRA\n",
    "\n",
    "Compared with full-parameter fine-tuning, LoRA drastically reduces training time and GPU memory, while achieving results roughly on par with it.\n",
    "\n",
    "Its full name is Low-Rank Adaptation; it was proposed by Microsoft in 2021.\n",
    "\n",
    "The idea behind LoRA is very simple and somewhat reminiscent of SVD (singular value decomposition).\n",
    "\n",
    "Keep the pretrained weight matrix frozen, and learn two small low-rank matrices beside it; their product serves as the increment added to the large weight matrix.\n",
    "\n",
    "In formulas:\n",
    "\n",
    "$$W = W_0 + \\Delta W = W_0 + B A$$\n",
    "\n",
    "$$\\mathrm{shape}(W_0)=(m,n),\\quad \\mathrm{shape}(B)=(m,r),\\quad \\mathrm{shape}(A)=(r,n),\\quad r \\ll \\min(m,n)$$ \n",
    "\n",
    "At initialization, $B$ is set to zero and $A$ is initialized randomly, so the increment $\\Delta W$ starts out as the zero matrix and does not change the model's predictions.\n",
    "\n",
    "![](../data/LoRA.png)\n",
    "\n",
    "2. QLoRA\n",
    "\n",
    "QLoRA is short for Quantized LoRA. Compared with LoRA, it further reduces GPU memory usage and can improve fine-tuning quality.\n",
    "\n",
    "QLoRA introduces three main innovations on top of LoRA.\n",
    "\n",
    "a. NF4 data type: a quantized data type called NF4 (4-bit NormalFloat). This carefully designed data type achieves a high compression ratio (roughly 1/4 the size of fp16) while preserving very high precision.\n",
    "\n",
    "b. Double Quantization: the constants used by the NF4 quantization scheme are themselves quantized, further reducing the memory footprint.\n",
    "\n",
    "c. Paged Optimizers: this technique leverages NVIDIA unified memory to page automatically between CPU and GPU, moving optimizer states to CPU memory when GPU memory runs low.\n",
    "\n",
    "\n",
    "Using QLoRA requires the bitsandbytes library together with the peft library.\n",
    "\n",
    "Why can QLoRA outperform plain LoRA? Mainly because the memory it saves allows more weight matrices to be fine-tuned: LoRA typically adapts only a few attention-related Linear layers, whereas QLoRA adapts every Linear layer in the model, giving the optimizer a larger search space and therefore often better results.\n",
    "\n",
    "\n",
    "![](../data/QLoRA.png)"
   ]
  },
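  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The update rule above can be demonstrated in a few lines of NumPy (a toy sketch, independent of peft): with $B$ initialized to zero, $W_0 + BA$ equals $W_0$ exactly, so the adapted model starts out identical to the pretrained one, and only $(m+n) \\cdot r$ parameters are trained instead of $m \\cdot n$.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "m, n, r = 16, 32, 4          # full shape (m, n), rank r << min(m, n)\n",
    "W0 = np.random.randn(m, n)   # frozen pretrained weight\n",
    "B = np.zeros((m, r))         # LoRA B: initialized to zero\n",
    "A = np.random.randn(r, n)    # LoRA A: random init\n",
    "\n",
    "W = W0 + B @ A               # the increment is the zero matrix at init\n",
    "assert np.allclose(W, W0)\n",
    "\n",
    "# fraction of parameters that are actually trained\n",
    "print((B.size + A.size) / W0.size)  # 0.375 for this toy shape; tiny for real LLM layers\n",
    "```\n"
   ]
  },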
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:30:59.156117Z",
     "iopub.status.busy": "2023-07-15T23:30:59.155670Z",
     "iopub.status.idle": "2023-07-15T23:30:59.164089Z",
     "shell.execute_reply": "2023-07-15T23:30:59.162826Z",
     "shell.execute_reply.started": "2023-07-15T23:30:59.156080Z"
    }
   },
   "outputs": [],
   "source": [
    "from peft import get_peft_config, get_peft_model, TaskType\n",
    "\n",
    "model.supports_gradient_checkpointing = True  # enable gradient checkpointing to save activation memory\n",
    "model.gradient_checkpointing_enable()\n",
    "model.enable_input_require_grads()\n",
    "\n",
    "model.config.use_cache = False  # silence the warnings. Please re-enable for inference!\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:31:01.477767Z",
     "iopub.status.busy": "2023-07-15T23:31:01.477386Z",
     "iopub.status.idle": "2023-07-15T23:31:01.493919Z",
     "shell.execute_reply": "2023-07-15T23:31:01.492668Z",
     "shell.execute_reply.started": "2023-07-15T23:31:01.477735Z"
    }
   },
   "outputs": [],
   "source": [
    "from peft import prepare_model_for_kbit_training \n",
    "model = prepare_model_for_kbit_training(model) # preprocess the quantized model so it can be fine-tuned with LoRA\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:31:04.427731Z",
     "iopub.status.busy": "2023-07-15T23:31:04.427347Z",
     "iopub.status.idle": "2023-07-15T23:31:04.437299Z",
     "shell.execute_reply": "2023-07-15T23:31:04.436336Z",
     "shell.execute_reply.started": "2023-07-15T23:31:04.427701Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "['dense', 'query_key_value', 'dense_h_to_4h', 'dense_4h_to_h']\n"
     ]
    }
   ],
   "source": [
    "import bitsandbytes as bnb \n",
    "def find_all_linear_names(model):\n",
    "    \"\"\"\n",
    "    Find all Linear (fully connected) layers so a low-rank adapter can be attached to each of them.\n",
    "    \"\"\"\n",
    "    cls = bnb.nn.Linear4bit\n",
    "    lora_module_names = set()\n",
    "    for name, module in model.named_modules():\n",
    "        if isinstance(module, cls):\n",
    "            names = name.split('.')\n",
    "            lora_module_names.add(names[0] if len(names) == 1 else names[-1])\n",
    "\n",
    "    if 'lm_head' in lora_module_names:  # needed for 16-bit\n",
    "        lora_module_names.remove('lm_head')\n",
    "    return list(lora_module_names)\n",
    "\n",
    "lora_modules = find_all_linear_names(model)\n",
    "\n",
    "print(lora_modules)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:31:07.284578Z",
     "iopub.status.busy": "2023-07-15T23:31:07.283806Z",
     "iopub.status.idle": "2023-07-15T23:31:59.307549Z",
     "shell.execute_reply": "2023-07-15T23:31:59.306048Z",
     "shell.execute_reply.started": "2023-07-15T23:31:07.284542Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "trainable params: 14,823,424 || all params: 3,403,134,976 || trainable%: 0.435581430197143\n"
     ]
    }
   ],
   "source": [
    "from peft import LoraConfig\n",
    "\n",
    "peft_config = LoraConfig(\n",
    "    task_type=TaskType.CAUSAL_LM, inference_mode=False,\n",
    "    r=8,\n",
    "    lora_alpha=32, lora_dropout=0.1,\n",
    "    target_modules= lora_modules \n",
    ")\n",
    "\n",
    "\n",
    "peft_model = get_peft_model(model, peft_config)\n",
    "\n",
    "peft_model.is_parallelizable = True\n",
    "peft_model.model_parallel = True\n",
    "peft_model.print_trainable_parameters()\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:32:10.193173Z",
     "iopub.status.busy": "2023-07-15T23:32:10.192782Z",
     "iopub.status.idle": "2023-07-15T23:32:10.203184Z",
     "shell.execute_reply": "2023-07-15T23:32:10.202211Z",
     "shell.execute_reply.started": "2023-07-15T23:32:10.193124Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "base_model.model.transformer.encoder.layers.0.self_attention.query_key_value.lora_A.default.weight:\n",
      "shape =  [8, 4096] \t sum =  1.0851411819458008\n",
      "\n",
      "\n",
      "base_model.model.transformer.encoder.layers.0.self_attention.query_key_value.lora_B.default.weight:\n",
      "shape =  [4608, 8] \t sum =  0.0\n",
      "\n",
      "\n",
      "base_model.model.transformer.encoder.layers.0.self_attention.dense.lora_A.default.weight:\n",
      "shape =  [8, 4096] \t sum =  0.12669527530670166\n",
      "\n",
      "\n",
      "base_model.model.transformer.encoder.layers.0.self_attention.dense.lora_B.default.weight:\n",
      "shape =  [4096, 8] \t sum =  0.0\n",
      "\n",
      "\n",
      "base_model.model.transformer.encoder.layers.0.mlp.dense_h_to_4h.lora_A.default.weight:\n",
      "shape =  [8, 4096] \t sum =  -1.3115875720977783\n",
      "\n",
      "\n",
      "base_model.model.transformer.encoder.layers.0.mlp.dense_h_to_4h.lora_B.default.weight:\n",
      "shape =  [27392, 8] \t sum =  0.0\n",
      "\n",
      "\n",
      "base_model.model.transformer.encoder.layers.0.mlp.dense_4h_to_h.lora_A.default.weight:\n",
      "shape =  [8, 13696] \t sum =  -0.7874593734741211\n",
      "\n",
      "\n",
      "base_model.model.transformer.encoder.layers.0.mlp.dense_4h_to_h.lora_B.default.weight:\n",
      "shape =  [4096, 8] \t sum =  0.0\n",
      "\n",
      "\n"
     ]
    }
   ],
   "source": [
    "# Note: LoRA initializes the B matrices to zero, so before training the output of peft_model is identical to that of the pretrained model\n",
    "for name,para in peft_model.named_parameters():\n",
    "    if '.1.' in name:\n",
    "        break \n",
    "    if 'lora' in name.lower():\n",
    "        print(name+':')\n",
    "        print('shape = ',list(para.shape),'\\t','sum = ',para.sum().item())\n",
    "        print('\\n')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:32:13.341718Z",
     "iopub.status.busy": "2023-07-15T23:32:13.340591Z",
     "iopub.status.idle": "2023-07-15T23:32:15.067510Z",
     "shell.execute_reply": "2023-07-15T23:32:15.066453Z",
     "shell.execute_reply.started": "2023-07-15T23:32:13.341675Z"
    }
   },
   "outputs": [],
   "source": [
    "peft_model.train();\n",
    "out = peft_model(**batch)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:32:16.348762Z",
     "iopub.status.busy": "2023-07-15T23:32:16.346179Z",
     "iopub.status.idle": "2023-07-15T23:32:16.364251Z",
     "shell.execute_reply": "2023-07-15T23:32:16.363203Z",
     "shell.execute_reply.started": "2023-07-15T23:32:16.348725Z"
    }
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor(5.4904, grad_fn=<ToCopyBackward0>)"
      ]
     },
     "execution_count": 30,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "out.loss "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 3. Train the Model"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Now we summon our beloved torchkeras to implement an elegant training loop.\n",
    "\n",
    "Note that, to save and load parameters more efficiently, we override the load_ckpt and save_ckpt methods of KerasModel\n",
    "\n",
    "so that only the trainable LoRA weights are saved and loaded, avoiding the storage overhead of checkpointing the full model.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:32:21.963158Z",
     "iopub.status.busy": "2023-07-15T23:32:21.962759Z",
     "iopub.status.idle": "2023-07-15T23:32:21.978839Z",
     "shell.execute_reply": "2023-07-15T23:32:21.977709Z",
     "shell.execute_reply.started": "2023-07-15T23:32:21.963110Z"
    }
   },
   "outputs": [],
   "source": [
    "from torchkeras import KerasModel \n",
    "from accelerate import Accelerator \n",
    "\n",
    "class StepRunner:\n",
    "    def __init__(self, net, loss_fn, accelerator=None, stage = \"train\", metrics_dict = None, \n",
    "                 optimizer = None, lr_scheduler = None\n",
    "                 ):\n",
    "        self.net,self.loss_fn,self.metrics_dict,self.stage = net,loss_fn,metrics_dict,stage\n",
    "        self.optimizer,self.lr_scheduler = optimizer,lr_scheduler\n",
    "        self.accelerator = accelerator if accelerator is not None else Accelerator() \n",
    "        if self.stage=='train':\n",
    "            self.net.train() \n",
    "        else:\n",
    "            self.net.eval()\n",
    "    \n",
    "    def __call__(self, batch):\n",
    "        \n",
    "        #loss\n",
    "        with self.accelerator.autocast():\n",
    "            loss = self.net(**batch).loss\n",
    "\n",
    "        #backward()\n",
    "        if self.optimizer is not None and self.stage==\"train\":\n",
    "            self.accelerator.backward(loss)\n",
    "            if self.accelerator.sync_gradients:\n",
    "                self.accelerator.clip_grad_norm_(self.net.parameters(), 1.0)\n",
    "            self.optimizer.step()\n",
    "            if self.lr_scheduler is not None:\n",
    "                self.lr_scheduler.step()\n",
    "            self.optimizer.zero_grad()\n",
    "            \n",
    "        all_loss = self.accelerator.gather(loss).sum()\n",
    "        \n",
    "        #losses (or plain metrics that can be averaged)\n",
    "        step_losses = {self.stage+\"_loss\":all_loss.item()}\n",
    "        \n",
    "        #metrics (stateful metrics)\n",
    "        step_metrics = {}\n",
    "        \n",
    "        if self.stage==\"train\":\n",
    "            if self.optimizer is not None:\n",
    "                step_metrics['lr'] = self.optimizer.state_dict()['param_groups'][0]['lr']\n",
    "            else:\n",
    "                step_metrics['lr'] = 0.0\n",
    "        return step_losses,step_metrics\n",
    "    \n",
    "KerasModel.StepRunner = StepRunner \n",
    "\n",
    "\n",
    "# save only the trainable LoRA parameters\n",
    "def save_ckpt(self, ckpt_path='checkpoint', accelerator = None):\n",
    "    unwrap_net = accelerator.unwrap_model(self.net)\n",
    "    unwrap_net.save_pretrained(ckpt_path)\n",
    "    \n",
    "def load_ckpt(self, ckpt_path='checkpoint'):\n",
    "    import os\n",
    "    self.net.load_state_dict(\n",
    "        torch.load(os.path.join(ckpt_path,'adapter_model.bin')),strict =False)\n",
    "    self.from_scratch = False\n",
    "    \n",
    "KerasModel.save_ckpt = save_ckpt \n",
    "KerasModel.load_ckpt = load_ckpt \n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:32:25.146326Z",
     "iopub.status.busy": "2023-07-15T23:32:25.145924Z",
     "iopub.status.idle": "2023-07-15T23:32:25.158772Z",
     "shell.execute_reply": "2023-07-15T23:32:25.157783Z",
     "shell.execute_reply.started": "2023-07-15T23:32:25.146294Z"
    }
   },
   "outputs": [],
   "source": [
    "# Set is_paged=True to use a Paged Optimizer, reducing the risk of CUDA OOM during training.\n",
    "optimizer = bnb.optim.adamw.AdamW(peft_model.parameters(),\n",
    "                                  lr=5e-05,is_paged=True)  \n",
    "\n",
    "\n",
    "keras_model = KerasModel(peft_model,loss_fn = None,\n",
    "        optimizer=optimizer) \n",
    "\n",
    "ckpt_path = 'chatglm2_qlora'\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:32:30.878827Z",
     "iopub.status.busy": "2023-07-15T23:32:30.878448Z",
     "iopub.status.idle": "2023-07-15T23:37:10.597060Z",
     "shell.execute_reply": "2023-07-15T23:37:10.595884Z",
     "shell.execute_reply.started": "2023-07-15T23:32:30.878795Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\u001b[0;31m<<<<<< ⚡️ cuda is used >>>>>>\u001b[0m\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAgsAAAGHCAYAAAA+xRHwAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAABpvElEQVR4nO3deVxU1fsH8M9lYIZ9XwRBXBAXRDTFNRJzS8tKMsvMNEvbzLUs6/v9apZZVqb9TCsrNXMrRMvMUlOUSssNRVQUBUVEQWUHBxjO748rIyPDMCDDsHzer9e8YO49596HYXQe7j3nOZIQQoCIiIioEhbmDoCIiIjqNyYLREREZBCTBSIiIjKIyQIREREZxGSBiIiIDGKyQERERAYxWSAiIiKDmCwQERGRQUwWiIiIyCAmC6Q1d+5cSJKEa9eu1el5ly1bhlWrVtXpOcsLDw9HeHh4tfqMHz8eLVu2NEk89cmuXbvQu3dv2Nrawt3dHePHj0d6errR/Tds2IAuXbrA2toaPj4+mDZtGvLy8iq0y8vLw7Rp0+Dj4wNra2t06dIFGzZs0HvMI0eOYODAgbC3t4ezszMiIiJw/vx5vW0vXLiACRMmwMfHByqVCs2bN8eIESOMjv+7776Dh4cHcnNzje7T0EVHR0OSJERGRtaof2xsLB588EG0aNECNjY2cHV1Re/evfH999/rbW/M7/PMmTNQKpU4cuRIjWKiu8dkgczO3MkC6bd3714MHToUXl5e+Omnn7BkyRLs2rULAwYMgFqtrrL/2rVrMXr0aISGhmL79u2YM2cOVq1ahYiIiAptIyIisHr1asyZMwfbt29HaGgoRo8ejXXr1um0O336NMLDw1FUVIQffvgB3377Lc6cOYOwsDBkZGTotD1x4gS6deuGEydO4OOPP8bOnTuxaNEiuLi4GPXzFxQU4K233sIbb7wBBwcHvW3Onj2L6dOnIygoCLa2trCxsUG7du0wY8YMnDlzxqjzNDZZWVnw8/PD+++/j19//RXfffcdWrZsibFjx+K9997TaWvs7zMwMBBjxozB9OnT6/rHoTKC6JY5c+YIACIjI6NOzxsUFCT69etXp+csr1+/ftU+/7hx44S/v79J4qkvQkNDRceOHUVxcbF2219//SUAiGXLlhnsW1JSIry9vcXgwYN1tq9du1YAEL/++qt227Zt2wQAsW7dOp22gwYNEj4+PqKkpES77fHHHxfu7u4iOztbuy05OVlYWVmJWbNmabeVlpaKLl26iC5duoibN29W7we/ZdmyZcLa2lpkZmZW2KfRaMRbb70lFAqFCAsLE0uXLhXbt28Xe/fuFStXrhSPPvqosLa2Fh9++GGNzm1Oe/bsEQDEjz/+WKvH7dmzp/Dz89PZZuzvUwghDh06JACIv/76q1bjIuMwWSCtsmThyJEjYsSIEcLBwUE4OjqKMWPGiPT09ArtN2zYIHr16iVsbW2FnZ2dGDx4sDhy5IhOm3PnzoknnnhCeHt7C6VSKTw9PcX9998vjh49KoQQwt/fXwDQeRj6EO7SpYu49957K2wvKSkRPj4+YsSIEdptc+fOFT169BAuLi7CwcFBdO3aVXz99deitLRUp29tJQuFhYXizTffFC1bthRWVlbCx8dHvPzyyxU+bP744w/Rr18/4erqKqytrYWfn5+IiIgQ+fn52jbLli0TnTt3FnZ2dsLe3l60a9dOzJ49u1ox3o1Lly4JAGLBggUV9gUGBopBgwYZ7P/nn38KAGL9+vU624uKioS9vb2YOHGidtvzzz8v7O3tdZISIYRYt26dzodDcXGxsLGxES+88EKF8w0ePFi0bdtW+zw6OloAEKtWrar6h61EcHCwePzxx/XumzBhgvDx8RH79u2rtP9ff/0lvLy8xLvvvlthX1pampg0aZJo3ry5sLKyEi1bthRz587VeQ2SkpIEAPHhhx+K9957T/j5+QmVSiW6desmdu3aVeGY
MTEx4v777xf29vbCxsZG9O7dW/zyyy8V2l26dElMnDhR+Pr6CisrK+Ht7S0ee+wxceXKFSHE7WRh3bp14q233hLe3t7CwcFBDBgwQJw+fbrK160yDz74oGjVqpX2eXV+n2U6dOggxo4dW+MYqOZ4G4IqGDFiBAICAhAZGYm5c+diy5YtGDJkCIqLi7Vt3n//fYwePRodO3bEDz/8gDVr1iA3NxdhYWE4efKktt2wYcNw+PBhLFy4EDt37sTy5cvRtWtXZGVlAQA2b96M1q1bo2vXrti/fz/279+PzZs3Vxrbs88+iz///BNnz57V2b5jxw5cvnwZzz77rHZbcnIyXnjhBfzwww+IiopCREQEXn31Vbz77ru19ErdJoTAo48+io8//hhjx47Ftm3bMGPGDKxevRr333+/9rJ9cnIyHnzwQSiVSnz77bf47bff8MEHH8DOzg5FRUUA5Pv8L7/8Mvr164fNmzdjy5YtmD59OvLz86uMQ6PRoKSkpMpHaWmpweOcOHECANC5c+cK+zp37qzdX93+VlZWaN++vU7/EydOoEOHDrC0tKxwnvLHOnfuHAoLCyuNKTExETdv3gQA7Nu3DwDg4OCAYcOGwdraGvb29njooYdw+vRpg7EDwKVLlxAXF4f+/ftX2Ldu3Tps3rwZf/31F8LCwvT212g06N27N3bt2oUPPvgABw8e1O67cuUKevTogd9//x3/+9//sH37djz33HNYsGABJk6cWOFYS5cuxW+//YbFixfj+++/h4WFBYYOHYr9+/dr2+zduxf3338/srOz8c0332D9+vVwcHDA8OHDsXHjRm271NRUhIaGYvPmzZgxYwa2b9+OxYsXw8nJCZmZmTrnfeutt3DhwgV8/fXX+Oqrr3D27FkMHz4cGo2mytcPAEpLS1FSUoKMjAwsW7YMv//+O9544w3t/ur8PsuEh4dj+/btEFwsue6ZO1uh+qPsysL06dN1tpddOv7++++FEEJcvHhRWFpaildffVWnXW5urmjWrJkYNWqUEEKIa9euCQBi8eLFBs9bndsQ165dE0qlUrz11ls620eNGiW8vLwq/HVaRqPRiOLiYjFv3jzh5uamc3WhNq4s/PbbbwKAWLhwoU67jRs3CgDiq6++EkIIERkZKQCI2NjYSo89efJk4ezsXK14yui7UqPvMWfOHIPHKfud79+/v8K+SZMmCaVSabD//PnzBQCRlpZWYd/gwYNFYGCg9nnbtm3FkCFDKrS7fPmyACDef/99IcTtWyB3Xq0QQoj3339fABCXL18WQgjxwgsvCADC0dFRPPfcc2LXrl1izZo1wt/fX7i7u2vbVabs93bgwIEK+9q2bSu+/PJL7fP4+HgRFhYmVCqVaNmypfY8e/bsEUII8dprr4mnn35a2/6FF14Q9vb24sKFCzrH/fjjjwUAER8fL4S4fWXBx8dHFBYWatvl5OQIV1dXMXDgQO22Xr16CU9PT5Gbm6vdVlJSIjp16iR8fX217/cJEyYIKysrcfLkyUp/9rIrC8OGDdPZ/sMPP1T6ntCn7HcAQCiVygq3rqrz+yyzYsUKAUCcOnXKqBio9vDKAlUwZswYneejRo2CpaUl9uzZAwD4/fffUVJSgmeeeUbnr1Vra2v069cP0dHRAABXV1e0adMGH330ERYtWoSjR49W+RdtVdzc3DB8+HCsXr1ae6zMzEz89NNPeOaZZ3T+Ot29ezcGDhwIJycnKBQKWFlZ4X//+x+uX79erRH9xti9ezcAeZZEeY8//jjs7Ozwxx9/AAC6dOkCpVKJSZMmYfXq1XpH8ffo0QNZWVkYPXo0fvrpp2rNTtm6dSsOHjxY5WPSpElGHU+SpGptr2l/Q8erSduy90bv3r3x9ddfY8CAAXj66aexZcsWXLt2DZ9//rnBuC9fvgwA8PT01NkeHx+P1NRU7b+RwsJCDBkyBDY2Nvj555/xwQcfYM6cOdr+ADB8+HDt+wMAfvnlF/Tv3x8+Pj46
/36GDh0KQL5KUF5ERASsra21z8uuGOzbtw8ajQb5+fn4559/MHLkSNjb22vbKRQKjB07FpcuXUJCQgIAYPv27ejfvz86dOhg8OcHgIcffljnedkVgAsXLlTZF5CvTBw8eBDbtm3DhAkTMHnyZHz88ccV2lXnd1/2+0hNTTUqBqo9llU3oaamWbNmOs8tLS3h5uaG69evAwCuXr0KAAgNDdXb38JCzkElScIff/yBefPmYeHChZg5cyZcXV0xZswYzJ8/v9IR5lWZMGECNm3ahJ07d2LIkCFYv3491Gq1zgf1v//+i8GDByM8PBwrVqyAr68vlEoltmzZgvnz56OwsLBG567M9evXYWlpCQ8PD53tkiShWbNm2teuTZs22LVrFxYuXIhXXnkF+fn5aN26NaZMmYKpU6cCAMaOHYuSkhKsWLECjz32GEpLSxEaGor33nsPgwYNMhhHx44djbpEW/Y7qoybm5v257rTjRs34OrqanR/Ly8vg/3Lv7fubAdA27aqmCRJgrOzs07bIUOG6LTr0qULvL29q5yCV/b+KP8hDchT+Nq0aQM7OzsA8gd/Xl4eNm3apP2g9vb2Rr9+/bR9vLy8dEb2X716FVu3boWVlZXec9+ZHN7577FsW1FREfLy8pCbmwshBLy9vSu08/HxAXD7NcvIyICvr6/Bn71M2WtYRqVSAYDR/3ZatGiBFi1aAJBvRwLA7NmzMW7cOHh4eFTr91mm7PdR2/9+qWq8skAVXLlyRed5SUkJrl+/rv3H7e7uDgCIjIzU+1frP//8o+3r7++Pb775BleuXEFCQgKmT5+OZcuW4fXXX69xfEOGDIGPjw9WrlwJAFi5ciV69uyJjh07atts2LABVlZW+OWXXzBq1Cj06dMH3bt3r/E5q+Lm5qa9P1ueEAJXrlzRvmYAEBYWhq1btyI7OxsHDhxA7969MW3aNJ26As8++yz+/vtvZGdnY9u2bRBC4KGHHqryr7o2bdrAysqqyse8efMMHqdTp04AgLi4uAr74uLitPsrExwcrLd/SUkJTp8+rdM/ODgYp06dQklJSYXzlI+lTZs2sLGxqTSmgIAA7YeJvvvgZYQQVSZLZb+vsoSlTHFxsU4CkZSUhMDAQJ2/6O9Moi9duqTz+3d3d8fgwYMrverz3HPP6fS/899j2TalUgl7e3u4uLjAwsICaWlpFdqVXeEoO7+HhwcuXbpk8Gc3lR49eqCkpER7Na06v88yZb+P8q8n1Q0mC1TB2rVrdZ7/8MMPKCkp0RYuGjJkCCwtLXHu3Dl0795d70OfwMBA/Oc//0FwcLDOX3YqlapafymUXV7dsmULYmJicOjQIUyYMEGnjSRJsLS0hEKh0G4rLCzEmjVrjD5PdQwYMAAAKhSe2bRpE/Lz87X7y1MoFOjZs6f2kri+v3bt7OwwdOhQvP322ygqKkJ8fLzBOGrrNkTz5s3Ro0cPfP/99zoD2g4cOICEhAS9tRLK69mzJ7y9vSvUz4iMjEReXp5O/xEjRmj/Oi9v9erV8PHxQc+ePQHIV7iGDx+OqKgonSJJFy9exJ49e3SOOXToUNja2mL79u06xzxy5AiuXLmCXr16GYy/ffv2AORBeOW1aNEC586d074mXl5eSElJ0XmNkpKSdPqsWrVK5wrHQw89hBMnTqBNmzZ6/+2UXQ0oExUVpTPQLzc3F1u3bkVYWBgUCgXs7OzQs2dPREVF6fw7Ki0txffffw9fX18EBgZqX5c9e/Zob0vUpT179sDCwgKtW7cGUL3fZ5nz58/DwsIC7dq1q7O46RazjpigeqVsgKO/v794/fXXxY4dO8Snn34q7O3tRUhIiFCr1dq277//vrC0tBQvvPCC2Lx5s4iOjhYbN24UM2fOFP/73/+EEEIcO3ZMhIWFic8++0xs375d/PHHH+Ltt98WFhYWOgMUx40bJ1QqldiwYYP4999/xfHjx6uMNSEhQQAQvr6+wsbGRmRlZens/+OPPwQAMXLk
SLFjxw6xfv160a1bN9G2bVsBQCQlJWnb1sYAx9LSUjFkyBBhZWUl5s6dK3bu3Ck++eQTYW9vL7p27aqd6798+XLx+OOPi1WrVondu3eLX3/9VYwcOVIAEL///rsQQp5K+Oqrr4oNGzaIvXv3io0bN4ouXboIJycnvVNYTWXPnj3C0tJSjBgxQuzcuVOsXbtW+Pn5iU6dOunULkhOThYKhUJMmDBBp/+aNWsEADFp0iSxZ88e8dVXXwlnZ2e90y4HDRokXFxcxFdffSV2794tJk6cqDOotsypU6eEvb29uO+++8Svv/4qoqKiRKdOnYSPj0+F16ZswOC4cePEb7/9JlatWiX8/PxEixYtxPXr1w3+7Gq1WtjY2FSYrlpcXCxcXV3F9u3bhRBCXL16VTg4OIipU6eKq1evisTERBEeHq6NfebMmcLNzU3n/Xb58mXh7+8v2rdvL5YtWyb++OMPsW3bNvH555+LBx98UKSkpAghbg9w9PPzE/fee6+IiooSkZGRIjQ0VFhaWoo///xTe8zo6GhhZWUlevbsKX788Ufx008/iSFDhghJksSGDRu07S5duiS8vb2Fp6enWLx4sfjjjz/Epk2bxMSJE7WDBiurs1AWz8qVKw2+dhMnThQzZ84UGzduFNHR0SIyMlI88cQTAoB4/fXXddpW5/cphBDDhw8X99xzj8Hzk2kwWSCtsmTh8OHDYvjw4cLe3l44ODiI0aNHi6tXr1Zov2XLFtG/f3/h6OgoVCqV8Pf3FyNHjtTOAb969aoYP368aN++vbZeQOfOncWnn36qU2gnOTlZDB48WDg4OFRZZ6G8Pn36CABizJgxevd/++23ol27dkKlUonWrVuLBQsWiG+++cYkyYIQcp2FN954Q/j7+2vnr7/00ks6dRb2798vRowYIfz9/YVKpRJubm6iX79+4ueff9a2Wb16tejfv7/w8vISSqVS+Pj4iFGjRhmVRNW2HTt2iF69eglra2vh6uoqnnnmmQrvhbIPkXHjxlXov27dOtG5c2ehVCpFs2bNxJQpU3RG7JfJzc0VU6ZMEc2aNRNKpVJ07txZ7yh5IeTiPAMGDBC2trbC0dFRPProoyIxMVFv2xUrVohOnToJpVIp3NzcxJgxY7QfxlUZO3as6NixY4Xtc+fOFZ06dRJ5eXlCCCG2bt0qXFxcBAChUCjEm2++Kfz9/YWFhYUYPHiwSEhIqHCMjIwMMWXKFNGqVSthZWUlXF1dRbdu3cTbb7+tPW75OgvvvPOO8PX1FUqlUnTt2lWbWJZXVmfBzs5O2NjYiF69eomtW7dWaJeSkiImTJggmjVrpq0HMmrUKO3v9W6ThW+//VaEhYUJd3d3YWlpKZydnUW/fv3EmjVr9LY39veZm5srbG1txSeffGLw/GQakhCcsEpEdKdDhw4hNDQUBw4c0N4KAeTbWeHh4XBwcMAPP/wAV1dXlJSU4OzZs/Dy8oKrqyvOnDkDLy8vODk51fj8ycnJaNWqFT766CO89tprtfEjNWjffPMNpk6dipSUFKNLdlPt4ZgFIiI9unfvjlGjRlUo4mVjY4Nt27ahtLQUHTp0wPvvv4+TJ0/C19cXVlZWOHbsGH788Ud069ZNO2WW7k5JSQk+/PBDzJ49m4mCmTBZICqnqgqIxlavo8bhk08+QWhoaIVVJ93d3bFr1y589NFH2Lx5M7p06QJHR0c4Ojqie/fu2LVrFxYtWqR3YCtVX0pKCp5++mnMnDnT3KE0WbwNQVROeHh4haI45fn7+yM5ObnuAqIGITs7WztN0d/fH7a2tmaOiKh2MVkgKichIaHCX5HlqVQqbQ0BIqKmgskCERERGcQxC0RERGRQg14borS0FJcvX4aDg4PRC9sQERGRXPo8NzcXPj4+VZZAb9DJwuXLl+Hn52fuMIiIiBqslJSUKhcYa9DJQtmqhSkpKXB0dDRzNERERA1HTk4O/Pz8jFoBuEEnC2W3HsrmNxMREVH1GHMbnwMc
iYiIyCAmC0RERGQQkwUiIiIyqEGPWSAiItMQQnA9lAZOoVDA0tKyVkoLMFkgIiIdRUVFSEtLQ0FBgblDobtka2sLb29vKJXKuzoOkwUiItIqLS1FUlISFAoFfHx8oFQqWfSuARJCoKioCBkZGUhKSkLbtm2rLLxkCJOFcjQaICYGSEsDvL2BsDBAoTB3VEREdaeoqAilpaXw8/Pj6pkNnI2NDaysrHDhwgUUFRXB2tq6xsdisnBLVBQwdSpw6dLtbb6+wJIlQESE+eIiIjKHu/krlOqP2vo98t0AOVEYOVI3UQCA1FR5e1SUeeIiIiKqD5p8sqDRyFcU9C3UXbZt2jS5HRERUVPU5JOFmJiKVxTKEwJISZHbERGR8TQaIDoaWL9e/tqQ/uhq2bIlFi9eXCvHio6OhiRJyMrKqpXjmUOTH7OQlla77YiIyDzjwMLDw9GlS5da+ZA/ePAg7Ozs7j6oRqLJX1nw9q7ddkRETV19HQdWVmjKGB4eHpwNUk6TTxbCwuRs19A0Yj8/uR0RUVOWn1/54+ZNuY0x48CmTtW9JaHveNU1fvx47N27F0uWLIEkSZAkCatWrYIkSfj999/RvXt3qFQqxMTE4Ny5c3jkkUfg5eUFe3t7hIaGYteuXTrHu/M2hCRJ+PrrrzFixAjY2tqibdu2+Pnnn6sf6C2bNm1CUFAQVCoVWrZsiU8++URn/7Jly9C2bVtYW1vDy8sLI0eO1O6LjIxEcHAwbGxs4ObmhoEDByK/Ji9aNTT5ZEGhkC+LAZUnDB99xHoLRET29pU/HntMbmPMOLBLl3THgbVsWfF41bVkyRL07t0bEydORFpaGtLS0uDn5wcAmDVrFhYsWIBTp06hc+fOyMvLw7Bhw7Br1y4cPXoUQ4YMwfDhw3Hx4kWD53jnnXcwatQoHD9+HMOGDcOYMWNw48aNasd6+PBhjBo1Ck8++STi4uIwd+5c/Pe//8WqVasAAIcOHcKUKVMwb948JCQk4LfffsN9990HAEhLS8Po0aMxYcIEnDp1CtHR0YiIiIDQl53VoiY/ZgGQ759FRla8v2ZhAZSWAgkJ5ouNiKghMdc4MCcnJyiVStja2qJZs2YAgNOnTwMA5s2bh0GDBmnburm5ISQkRPv8vffew+bNm/Hzzz9j8uTJlZ5j/PjxGD16NADg/fffx//93//h33//xQMPPFCtWBctWoQBAwbgv//9LwAgMDAQJ0+exEcffYTx48fj4sWLsLOzw0MPPQQHBwf4+/uja9euAORkoaSkBBEREfD39wcABAcHV+v8NdHkryyUiYgAkpOBPXuAdevkr2vXyvs+/BCoQfJIRNSo5OVV/ti0SW5Tk3FgyckVj1ebunfvrvM8Pz8fs2bNQseOHeHs7Ax7e3ucPn26yisLnTt31n5vZ2cHBwcHpKenVzueU6dOoW/fvjrb+vbti7Nnz0Kj0WDQoEHw9/dH69atMXbsWKxdu1a7TkdISAgGDBiA4OBgPP7441ixYgUyMzOrHUN1MVkoR6EAwsOB0aPlr088Abz2mny5zNXV3NEREZmXnV3lj7JKwlWNA5OkiuPA9B2vduPWPeDrr7+OTZs2Yf78+YiJiUFsbCyCg4NRVFRk8DhWVlY6zyVJQmlpabXjEUJUWG+j/G0EBwcHHDlyBOvXr4e3tzf+97//ISQkBFlZWVAoFNi5cye2b9+Ojh074v/+7//Qrl07JCUlVTuO6mCyYIAkyeMV7rnH3JEQETUMhsaBlT1fvNg048CUSqVRS2rHxMRg/PjxGDFiBIKDg9GsWTMkJyfXfkCV6NixI/7880+dbX///TcCAwOhuPXCWFpaYuDAgVi4cCGOHz+O5ORk7N69G4CcpPTt2xfvvPMOjh49CqVSic2bN5s0Zo5ZqIaTJwEvL8DNzdyREBHVX5WNA/P1lRMFU9VZaNmyJf755x8kJyfD3t6+0r/6AwICEBUVheHDh0OSJPz3v/+t0RWCmpo5cyZCQ0Px7rvv
4oknnsD+/fuxdOlSLFu2DADwyy+/4Pz587jvvvvg4uKCX3/9FaWlpWjXrh3++ecf/PHHHxg8eDA8PT3xzz//ICMjAx06dDBpzLyyYKTly4EuXYBZs8wdCRFR/advHFhSkmkX5nvttdegUCjQsWNHeHh4VDoG4dNPP4WLiwv69OmD4cOHY8iQIbinDi8h33PPPfjhhx+wYcMGdOrUCf/73/8wb948jB8/HgDg7OyMqKgo3H///ejQoQO++OILrF+/HkFBQXB0dMS+ffswbNgwBAYG4j//+Q8++eQTDB061KQxS8LU8y1MKCcnB05OTsjOzoajo6NJz/XXX8C998rf790L3JrFQkTUqNy8eRNJSUlo1arVXS1pTPWDod9ndT5DeWXBSH37ApMmyd+/8AKgVps3HiIiorrCZKEaPvgA8PQETp+WBz4SERG9+OKLsLe31/t48cUXzR1ereBtiGpatw4YMwZQqYATJ4CAgDo5LRFRneBtiOpLT09HTk6O3n2Ojo7w9PSs44huq63bEJwNUU2jRwOrVgE7dwIvvQTs2GF4XQkiImrcPD09zZoQ1AXehqgmSQKWLZOnTz7wgFwOmoiIqDHjlYUaCAgALl4EuHopERE1BbyyUEPlEwUjCoYRERE1WGZNFubOnatdd7zsUbZamDlohEB0ZibWX72K6MxMaIwY+xkdDXTqBOzbJ3+/fr38lQkEERE1Fma/DREUFIRdu3ZpnytMUTDcCFEZGZiamIhL5Qoo+KpUWBIQgAgPj0r7ff+9PJVywACgpOT2dl9fuT66KauVERER1QWz34awtLREs2bNtA8PAx/MphKVkYGR8fE6iQIApKrVGBkfj6iMjEr7lq2cVj5RAIDUVGDkSCAqqrajJSJqGGpytdacWrZsicWLFxvVVpIkbNmyxaTx1CdmTxbOnj0LHx8ftGrVCk8++STOnz9faVu1Wo2cnBydx93SCIGpiYnQ9xYu2zYtMVHvm1yjAf7zH/3HLWs+bRpvSRBR0xOVkYGWBw6g/7FjeOrUKfQ/dgwtDxww+McX1V9mTRZ69uyJ7777Dr///jtWrFiBK1euoE+fPrh+/bre9gsWLICTk5P24efnd9cxxGRlVbiiUJ4AkKJWIyYrq2LfGN0V1Sr0FUBKityOiKipuJurtVQ/mTVZGDp0KB577DEEBwdj4MCB2LZtGwBg9erVetvPnj0b2dnZ2kdKSspdx5BWVFTjdmlpRp7DyHZERPWREAL5Go1Rj5ySEkw5e9bg1dqpiYnIKSmp8ljVKTD85Zdfonnz5hWWmn744Ycxbtw4nDt3Do888gi8vLxgb2+P0NBQnfFydysuLg73338/bGxs4ObmhkmTJiEvL0+7Pzo6Gj169ICdnR2cnZ3Rt29fXLhwAQBw7Ngx9O/fHw4ODnB0dES3bt1w6NChWoutNph9gGN5dnZ2CA4OxtmzZ/XuV6lUUKlUtXpOb6Wyxu28vY08h5HtiIjqo4LSUtjX0iVSAeCSWg2nP/+ssm1eWBjsjBz0/vjjj2PKlCnYs2cPBgwYAADIzMzE77//jq1btyIvLw/Dhg3De++9B2tra6xevRrDhw9HQkICWrRocTc/EgoKCvDAAw+gV69eOHjwINLT0/H8889j8uTJWLVqFUpKSvDoo49i4sSJWL9+PYqKivDvv/9CulX+d8yYMejatSuWL18OhUKB2NhYWFlZ3VVMta1eJQtqtRqnTp1CWNmowToQ5uwMX5UKqWq13kxYgjwrIszZuWLfMHnWQ2rq7TEKOn0leX8d/jhERE2Sq6srHnjgAaxbt06bLPz4449wdXXFgAEDoFAoEBISom3/3nvvYfPmzfj5558xefLkuzr32rVrUVhYiO+++w52dnYAgKVLl2L48OH48MMPYWVlhezsbDz00ENo06YNAKBDhw7a/hcvXsTrr7+O9u3bAwDatm17V/GYglmThddeew3Dhw9HixYtkJ6ejvfeew85OTkYN25cncWgkCQs
CQjAyPh4SECFhEEAWBwQAIWeBSAUCnl65MiRcmJQPmEoa754sdyOiKihsrWwQJ6Rf/Xsy8rCsLi4Ktv9GhyM+/T8EXbneatjzJgxmDRpEpYtWwaVSoW1a9fiySefhEKhQH5+Pt555x388ssvuHz5MkpKSlBYWIiLFy9W6xz6nDp1CiEhIdpEAQD69u2L0tJSJCQk4L777sP48eMxZMgQDBo0CAMHDsSoUaPgfeuy84wZM/D8889jzZo1GDhwIB5//HFtUlFfmHXMwqVLlzB69Gi0a9cOERERUCqVOHDgAPz9/es0jggPD0QGBaF5Jbc4KtsOyHUUIiOB5s11t/v4yItOPfJIbUZKRFT3JEmCnUJh1GOwqyt8VSpUtr6eBMBPpcJgV9cqjyVVc5W+4cOHo7S0FNu2bUNKSgpiYmLw9NNPAwBef/11bNq0CfPnz0dMTAxiY2MRHByMIiPHrRkihKg01rLtK1euxP79+9GnTx9s3LgRgYGBOHDgAAC5QGF8fDwefPBB7N69Gx07dsTmzZvvOq7aZNYrCxs2bDDn6XVEeHjgEXd3xGRlIa2oCN5KJb5NS8Oa9HS8eOYMDt5zDywryXIjIuSkICZGHszo6Qm88IK8nHVYGNBIljMnIqqSoau1ZR+nlV2tvVs2NjaIiIjA2rVrkZiYiMDAQHTr1g0AEBMTg/Hjx2PEiBEAgLy8PCQnJ9fKeTt27IjVq1cjPz9fe3Xhr7/+goWFBQIDA7Xtunbtiq5du2L27Nno3bs31q1bh169egEAAgMDERgYiOnTp2P06NFYuXKlNtb6wOx1FuoThSQh3MUFo728EO7igk8CAuBiaYnYvDwsTU013FcBhIfLVxMGDACmTpW3z54NXL1q+tiJiOqLyq7W+qpUiAwKMlgV926NGTMG27Ztw7fffqu9qgAAAQEBiIqKQmxsLI4dO4annnqqwsyJuzmntbU1xo0bhxMnTmDPnj149dVXMXbsWHh5eSEpKQmzZ8/G/v37ceHCBezYsQNnzpxBhw4dUFhYiMmTJyM6OhoXLlzAX3/9hYMHD+qMaagP6tUAx/rGQ6nEh61bY9KZM/hvcjJGenjA19raqL4vvwysWgUcOQK89hqwZo1pYyUiqk/0Xa0Nc3Y2yRWF8u6//364uroiISEBTz31lHb7p59+igkTJqBPnz5wd3fHG2+8USuF/QDA1tYWv//+O6ZOnYrQ0FDY2trisccew6JFi7T7T58+jdWrV+P69evw9vbG5MmT8cILL6CkpATXr1/HM888g6tXr8Ld3R0RERF45513aiW22iKJ6kxkrWdycnLg5OSE7OxsODo6muQcpULg3qNHsT8nB4+5uyOyUyej+x48CPTsKQ983L0b6N/fJCESEdWamzdvIikpCa1atYK1kX8cUf1l6PdZnc9Q3oaogoUk4YvAQCgAbLp2DdsqqS6pT2go8NJL8vcvvQQYKBRJRERUbzFZMEJne3tMv1VaevLZsyioxmIP8+cDXl5AQoI8jZKIiOq3tWvXwt7eXu8jKCjI3OGZBccsGGmOvz82pqcj+eZNzL9wAfNbtzaqn7MzsGgR8McfwHPPmTZGIiK6ew8//DB69uypd199q6xYV5gsGMne0hKfBQRgRHw8PkpJwRgvL3QsV4DDkKeekh9ERFT/OTg4wMHBwdxh1Cu8DVENj7i7Y7ibG4qFwEtnzlRrkZMyQnBhKSKq/xrw2Hcqp7Z+j0wWqkGSJPxf27awtbDAvuxsfFfNAgrp6cBDD8kDH3NzTRQkEdFdKLvMXlBQYOZIqDaU/R7v9vYJb0NUk7+1Nea0bIk3zp/Ha+fO4SE3N7gZ+UtwcABOn5YXnpozRx7LQERUnygUCjg7OyM9PR2AXCOgumWXyfyEECgoKEB6ejqcnZ2huMtFilhnoQaKS0vR9dAhxBcUYKK3N75q187ovr/9BgwdClhYAIcPA126mC5OIqKaEELgypUryMrKMncodJecnZ3RrFkzvQlf
dT5DmSzU0J9ZWQiLjQUA/NW1K/o4ORndd9Qo4McfgV69gL/+khMHIqL6RqPRoLi42NxhUA1ZWVkZvKLAZKGOPHf6NL69cgXBdnY43K0brIz81E9NBTp0kMctfPklMGmSiQMlIiK6Q3U+Qzlm4S582Lo1frp2DXH5+fg0JQU9HB2NqoHevDnw7rvAtGnArFmAhwdw8ybg7S2vUnmXt5aIiIhqFa8s3KWVaWmYkJBQYSlWX5UKSwICKl1draQECAwELl4EyheE9PUFliyRl70mIiIyFa4NUYccbl0GuDPjSlWrMTI+HlEZGXr7/fwzkJysmygA8i2KkSOBqKjaj5WIiKgmmCzcBY0QmH7unN59ZcnDtMREaO64eKPRAFOnygWaKvS7tW3atIqJBBERkTkwWbgLMVlZuGRgKUkBIEWtRswd049iYoBLlyo/rhBASorcjoiIyNyYLNyFtKKiGrUzttwzy0ITEVF9wGThLngrlTVq5+1t5PGNbEdERGRKTBbuQpizM3xVKlRWCFUC4KdSIczZWbdfmDzrwVAFVT8/uR0REZG5MVm4CwpJwpKAAADQmzAIAIsDAirUW1Ao5OmRQOUJw/z5rLdARET1A5OFuxTh4YHIoCA0V6kq7LO3sMB9lZSBjogAIiPlAk3lWd4qk/Xvv7UdKRERUc2wKFMt0QiBmKwspBUVwdPKCjMSE3G8oADPNmuGb9u3r7yfRp71kJYmj1EoLgYGDwaUSiApCfDxqcMfgoiImgyuDVEPHMjORp+jRyEA7O3SBffdMW7BkAULgIceAoKDTRYeERE1cazgWA/0cnLCxFvTGV46cwZFpaVG9509m4kCERHVH0wWTGhB69bwsLLCyYICLEpJqdExjh+XS0ATERGZC5MFE3K1ssInbdoAAOZduICkwsJq9f/yS+Cee4ApU0wRHRERkXGYLJjY015eCHd2RmFpKSafPYvqDBHp00f+GhUF/PKLiQIkIiKqApMFE5MkCcvbtoWVJOHXGzew+do1o/sGBwMzZsjfT54M5OebKEgiIiIDmCzUgfZ2dnijRQsAwNTEROSWlBjdd84coEUL4MIFYN48U0VIRERUOSYLdeStFi3Q2toal9RqzE1ONrqfnR2wdKn8/aJFQFycaeIjIiKqDJOFOmKjUODztm0BAEsuXUJsbq7RfYcPB0aMAEpKgBdfBKoxC5OIiOiuMVmoQw+4uWGUhwc0AF48cwal1RjsuGSJXM3xiSeAhltGi4iIGiImC3Xs04AAOCgU+Cc3FyvS0ozu5+cHnD8vT6PkAlNERFSXmCzUMR+VCu+1agUAePP8eVwtKjK6b/m1qqoxRpKIiOiuMFkwg5d9fHCPvT2ySkowIzER0ZmZWH/1KqIzM6Ex4h7Dzp1Ahw7y1+hoYP16+atGY/LQiYioCbI0dwBNkaWFBb4IDESPI0ewLj0d69LTtft8VSosCQhAhIdHpf1/+glITASGDdO9wuDrK49tiIgwZfRERNTU8MqCmaSo1Xq3p6rVGBkfj6iMjEr79uwpf73zVkRqKjBypFzxkYiIqLYwWTADjRCYmpiod1/ZTYhpiYl6b0loNMBbb+k/blnzadN4S4KIiGoPkwUziMnKwqVKriwAcsKQolYjJiurYt8Y4NKlyo8tBJCSIrcjIiKqDfUmWViwYAEkScK0adPMHYrJpRk5A0JfO2NnW1ZjViYREZFB9SJZOHjwIL766it07tzZ3KHUCW+lssbtvL2NPIeR7YiIiKpi9mQhLy8PY8aMwYoVK+Di4mLucOpEmLMzfFUqSJXslwD4qVQIc3au2DdMnvUgVdJZkuQCTmFhtRUtERE1dWZPFl555RU8+OCDGDhwYJVt1Wo1cnJydB4NkUKSsCQgAAD0JgwCwOKAACj0ZAQKhTw9EqiYMJQ9X7yYVR6JiKj2mDVZ2LBhA44cOYIFCxYY1X7BggVwcnLSPvz8/EwcoelEeHggMigIzcuXZSzHw8qq8r4RQGQk0Ly5
7nYfH+Dhh4GhQ2szUiIiauokIcyzLFFKSgq6d++OHTt2ICQkBAAQHh6OLl26YPHixXr7qNVqqMvNIsjJyYGfnx+ys7Ph6OhYF2HXOo0QiMnKQlpREbyVSnx/9Sq+uXIFHWxtEdu9O5QWledzGo086yEtDWjWTJ4yefw48PbbwHvv1d3PQEREDU9OTg6cnJyM+gw1W7KwZcsWjBgxAopy18s1Gg0kSYKFhQXUarXOPn2q84M2FDeKi9Hh33+RXlyM91q1wtv+/kb3jYoCHnsMsLQEjh4FOnUyYaBERNSgVecz1Gy3IQYMGIC4uDjExsZqH927d8eYMWMQGxtbZaLQWLlaWeHTW+MZ3k1OxtmCAqP7jhgh34YoKQFeeAEoLTVVlERE1JSYLVlwcHBAp06ddB52dnZwc3NDpyb+J/FoT08MdnGBWgi8dOYMjL34I0nA0qWAvT3w99/AV1+ZOFAiImoSzD4bgiqSJAnLAgNhbWGBP7KysPbqVaP7+vkB8+fL37/xBnD5somCJCKiJsNsYxZqQ2Mcs1DeggsX8FZSEtytrHC6Rw+4GZghUZ5GA/TuDRw8CDz/PLBihYkDJSKiBqdBjFmgqs3080OQrS2uFRdj1rlzRvdTKOQE4YUXgIULTRggERE1CUwW6jGlhQW+atcOAPDtlSvYq2dhqcqEhABffAE0kaKYRERkQkwW6rk+Tk544dZCDy+eOQN1DaY4CAEcPlzbkRERUVPBZKEBWNC6NbysrHC6oAALL16sVl+1GnjgAaBHD+DQIRMFSEREjRqThQbAxcoKi2/VXph/4QLOVKP2gkoFuLvLNRcmTpRrMBAREVUHk4UG4glPTzzg6gq1EHixGrUXAODTT+WxC7Gx8iJTRERE1cFkoYGQJAnL2raFjYUF9mRlYU01ai94egIffSR/P2cOkJRkoiCJiKhRYrLQgLSyscGcli0BADMSE3GtqMjovhMmAP36AQUFwMsvy7cjoqOB9evlrxqNSUImIqJGgEWZGpji0lJ0O3wYcfn5eMbLC882a6ZdsTLM2RkKSaq0b0IC0LkzUFQEuLoCN27c3ufrCyxZIi9/TUREjV+DWHWyNjTFZAEA9mdno8/RoxW2+6pUWBIQgAgPj0r7PvEE8MMPFbeX5RiRkUwYiIiaAlZwbOTSKrn9kKpWY2R8PKIyMvTu12iAv/7Sf8yylHHaNN6SICIiXUwWGhiNEJiamKh3X9klommJidDouWAUEwOkplZ+bCGAlBS5HRERURkmCw1MTFYWLqnVle4XAFLUasToKQ2dlmbcOYxtR0RETQOThQamslsQxrS7VTW6Ssa2IyKipoHJQgPjrVTWuF1YmDzrobIJE5IE+PnJ7YiIiMowWWhgwpyd4atSobIJkhIAP5UKYc7OFfYpFPL0SKDyhGHxYrkdERFRGSYLDYxCkrDk1joR+j7vBYBP27SptN5CRIQ8PbJ584r7Jk7ktEkiIqqIyUIDFOHhgcigIDRXqfTuL6hiGeuICCA5GdizB1i3Dnj1VXn7hg2GZ0sQEVHTxKJMDZhGCMRkZWkrOP6VnY3/JCfD1dISp3r0gKeR4xs0GqBvX+DIEeD774FRo0wcOBERmV11PkMt6ygmMgGFJCHcxUX7vK+TEyKvXUNsXh6mnD2LDUFBxh1HAaxcKS9jbWQXIiJqQngbohGxsrDAN+3aQQFgY0YGfr52zei+HTowUSAiIv2YLDQy9zg4YKafHwDgpTNnkF1SUu1jHD4MLFpU25EREVFDxWShEZrbsiUCbGxwuagIs86dq1bf8+eBnj2B114D9u83UYBERNSgMFlohGwUCnzdrh0A4Ku0NERnZhrdt3VrYOxYeZ2IiRPl5ayJiKhpY7LQSPVzdsYLt+o2TzxzBoXVWEry448BDw8gPh5YuNBUERIRUUPBZKER+7BNG/golUgsLMTc5GSj+7m5yZUcAeDdd4GEBJOER0REDQSThUbMydISXwQGAgA+TknB4dxco/uO
Hg0MGSLfhpg0SZ5WSURETROThUZuuLs7nvDwQCmA506fRrGRn/qSBCxfDtjaAvv2AVu2mDRMIiKqx5gsNAGftW0LV0tLHMvPx8cpKUb3a9VKHr+wbBnw6KOmi4+IiOo3lntuItZcuYJnTp+GSpJwLDQU7WxtzR0SERGZUXU+Q3lloYl42ssLQ1xcoBYCzyckoLi0FNGZmVh/9SqiMzOhMSJnzM8HYmOB6Ghg/Xr5azUmWRARUQPFtSGaCEmS8GW7dgj691/8mZ0Nz7//Rla56o6+KhWWBAQgwsNDb/+TJ4HwcODGDd0EwdcXWLKES1sTETVmvLLQhPhbW+NJT08A0EkUACBVrcbI+HhEZWTo7XvsGJCRUfFKQmoqMHIkEBVlkpCJiKgeYLLQhGiEwO83bujdV3YTYlpiYoVbEhoNMGuW/mOWNZ02jbckiIgaKyYLTUhMVhYuGajfLACkqNWIycrS7RcDXLpU+XGFAFJS5HZERNT4MFloQtKMXOjhznZpaUYe38h2RETUsDBZaEK8lcoatbu1xETV/YxsR0REDQuThSYkzNkZvioVpEr2SwD8VCqEOTvr9guTZz1IlXSUJMDPT25HRESND5OFJkQhSVgSEAAAlSYMiwMCoLgjK1Ao5OmRQMWEoez54sVyOyIianyYLDQxER4eiAwKQnOVqsK+Dra2eNTdXX+/CCAyEmjeXHe7r6+8vV8/U0RLRET1gVmTheXLl6Nz585wdHSEo6Mjevfuje3bt5szpCYhwsMDyb16YU9ICNZ16IDvO3SAjSThZEEB/i81tfJ+EUByMrBnD7Bunfz16FE5WejeHajGopZERNSAmHVtiK1bt0KhUCDg1qXx1atX46OPPsLRo0cRFBRUZX+uDVF7lqem4uWzZ2FtYYGj3bqhvZ2dUf1yc4GQECApCXj+eWDFChMHSkREtaI6n6H1biEpV1dXfPTRR3juueeqbMtkofYIITDk+HHszMxEDwcH/NW1KywtjLvwtG+fXApaCOCXX4AHHzRtrEREdPca5EJSGo0GGzZsQH5+Pnr37q23jVqtRk5Ojs6DaockSfi2XTs4KRT4NzcXH1ZjKev77gOmT5e/f/554Pp1EwVJRERmYfZkIS4uDvb29lCpVHjxxRexefNmdOzYUW/bBQsWwMnJSfvw8/Or42gbN19ra/xf27YAgHeSkxFbjUEI8+cDHToAV64AL79sqgiJiMgczH4boqioCBcvXkRWVhY2bdqEr7/+Gnv37tWbMKjVaqjVau3znJwc+Pn58TZELRJCICI+HluuXUNnOzv8260bVEbejjh0COjVS14jYv164MknTRwsERHVWIO6DaFUKhEQEIDu3btjwYIFCAkJwZKySf13UKlU2pkTZQ+qXZIk4cvAQLhbWeF4fj7eSU42um/37sB//gP4+ABubqaLkYiI6pbZk4U7CSF0rh5Q3fNUKvFlYCAA4MOLF3EgO9vovm+/DZw4AQwaZKroiIiorpk1WXjrrbcQExOD5ORkxMXF4e2330Z0dDTGjBljzrAIci2Gp728UApg3OnTKDBy/WkrK8DF5fZz5n1ERA2fWZOFq1evYuzYsWjXrh0GDBiAf/75B7/99hsG8c/SeuGzgAD4KJU4U1iI2efPV6uvEMCqVUCbNkA1uxIRUT1To2Rh9erV2LZtm/b5rFmz4OzsjD59+uDChQtGH+ebb75BcnIy1Go10tPTsWvXLiYK9YiLlRW+adcOAPBZaip2Z2Ya3VcIYOVKIDUVGD9eHvRIREQNU42Shffffx82NjYAgP3792Pp0qVYuHAh3N3dMb1swj01Cg+4ueGFW2tPP3v6NHJKSozqZ2EhX1mwtwdiYuSFpoiIqGGq0dRJW1tbnD59Gi1atMAbb7yBtLQ0fPfdd4iPj0d4eDgyMjJMEWsFrOBYN3JLShBy6BCSbt7Es15eeKZZM6QVFcFbqUSYs3OFVSrLW7ECmDQJUKmAf/8FbtwA0tIAb295SWuu
VElEZB4mnzppb2+P67fK9O3YsQMDBw4EAFhbW6OwsLAmh6R6zMHSEqvatwcArLx6Ff2PHcNTp06h/7FjaHngAKIMJIfPPw8MHSoPdOzeHejfH3jqKflry5ZAVFQd/RBERFRjNUoWBg0ahOeffx7PP/88zpw5gwdvLQYQHx+Pli1b1mZ8VE9cKy7Wuz1VrcbI+PhKEwZJklerBIA7D5GaCowcyYSBiKi+q1Gy8Pnnn6N3797IyMjApk2b4HarAs/hw4cxevToWg2QzE8jBKYmJurdV3YPa1piIjR67mhpNMA77+g/blnzadM4AJKIqD4ze7nnu8ExC3UjOjMT/Y8dq7LdnpAQhJcvsgAgOlq+5VBl3z3yypVERFQ3TD5m4bfffsOff/6pff7555+jS5cueOqpp5BZjel11DCkFRXVuF1ampHnMLIdERHVvRolC6+//rp2eei4uDjMnDkTw4YNw/nz5zFjxoxaDZDMz1uprHG7W7Muq+5rZDsiIqp7ljXplJSUpF0VctOmTXjooYfw/vvv48iRIxg2bFitBkjmF+bsDF+VCqlqNSq7Z+WnUiHM2bli3zDA11cezKjvhpckyfvDwmo1ZCIiqkU1urKgVCpRUFAAANi1axcGDx4MAHB1ddVecaDGQyFJWBIQAACorKLCdF9fvfUWFAqgbBFRfeUYhAA+/ZT1FoiI6rMaJQv33nsvZsyYgXfffRf//vuvdurkmTNn4OvrW6sBUv0Q4eGByKAgNFepdLZbW8hvoZVXruBmJVMaIiKAyEigeXP9x75xo1ZDJSKiWlaj2RAXL17Eyy+/jJSUFEyZMgXPPfccAGD69OnQaDT47LPPaj1QfTgbou5phEBMVpa2gmOgrS26HDqEjOJizPT1xce3rkDo7auRSz+XVXA8eBCYNQuwtQViY4G2bevu5yAiauqq8xnKqZN017Zeu4aHT5yABGBXSAjuv2P6ZGVKS4GBA+Vpk6GhwF9/yUtcExGR6VXnM7RGAxwBQKPRYMuWLTh16hQkSUKHDh3wyCOPQMGbz03OcHd3TPL2xldpaRh3+jSOd+8OFyM+9S0sgNWrgc6d5asO165xVgQRUX1Uo2QhMTERw4YNQ2pqKtq1awchBM6cOQM/Pz9s27YNbdq0qe04qZ77pE0b7M7KQmJhIV45exbrbs2WqYqfn1y4qUMHwMgZmkREVMdqNMBxypQpaNOmDVJSUnDkyBEcPXoUFy9eRKtWrTBlypTajpEaAHtLS3zfoQMUANanp2P91atG9w0J0U0UGu6NMSKixqlGYxbs7Oxw4MABBAcH62w/duwY+vbti7y8vFoL0BCOWah/3klOxtzkZDgpFDgeGooW1tZG9y0uBubNkwdAfv21CYMkIiLTl3tWqVTIzc2tsD0vLw9KXktu0t5u0QI9HRyQrdFg/OnTKK1GLhobC7z/PvDNN/JUSyIiqh9qlCw89NBDmDRpEv755x8IISCEwIEDB/Diiy/i4Ycfru0YqQGxtLDAmg4dYGthgT1ZWfj00iWj+4aGAm++KX8/aZJc9ZGIiMyvRsnCZ599hjZt2qB3796wtraGtbU1+vTpg4CAACxevLiWQ6SGpq2tLT69VW/hrfPncbwat6XmzgW6dwcyM4Hx4+XplUREZF53VWchMTERp06dghACHTt2RICBgjymwDEL9ZcQAo+cOIGt168j2M4O/95zD6yNnFabkAB07QoUFsqloKdNM22sRERNkUmKMlVnNclFixYZ3fZuMFmo39KLihB88CDSi4sxw9cXC9u00an+GObsrHc9CQD44gvgpZcAlQo4cADIyrpd+TEsjGtJEBHdLZMkC/379zfq5JIkYffu3Ua1vVtMFuq/X65dw/ATJwAAHlZWyCgu1u7zVamwJCAAER4eFfoJATz8MLBjB+DgAFy/fnufr6+8OFVEhMnDJyJqtFjumeqVwbGx2JmVVWF72TWFyKAgvQnDypXAc89VrLtQdjEiMpIJAxFRTZl8
6iSRsTRC4OSt5czvVJYDTEtMhOaOjECjAf73P/0Fmsq2TZsmtyMiItNiskAmFZOVhdSiokr3CwApajVi7rjyEBMDGJp1KQSQkiK3IyIi02KyQCaVZiBRMNQuLc3I4xvZjoiIao7JApmUt5EVPe9sZ+zqk1ylkojI9JgskEmFOTvDV6WC/gmS8iBHP5UKYc7Ouv3C5FkPlcyshCTJK1aGhdVmtEREpA+TBTIphSRhya1iXZUlDIsDAirUW1Ao5OmRgP6EQQhg0SLWWyAiqgtMFsjkIjw8EBkUhOYqVYV9ra2tMdzNTX+/CHl6ZPPmutvLkodz52o7UiIi0sfS3AFQ0xDh4YFH3N21FRwtJQkTExJw7uZNvH/xIua0bKm/XwTwyCPyrIeyCo7nzgHPPy+vUilE5bcqiIiodjBZoDqjkCSEu7hon5cIgadOncK7yckY5uqK0EqKgigUQHj47ef9+gH+/sCAAUwUiIjqAm9DkNmM9vLCEx4e0AAYe+oUCoyssCRJwMCBTBSIiOoKkwUyq2WBgfBRKpFQWIg3z5+vdv8bN4AnnwR+/90EwREREQAmC2RmrlZW+LZ9ewDA/6WmYueNG9Xq/8knwMaNwPjxwLVrJgiQiIiYLJD5DXF1xcs+PgCAZ0+fRma5lSmr8vbbQPv2wJUrwAsv6F9LgoiI7g6TBaoXFrZpg7Y2NkgtKsKrZ88a3c/WFli7FrC0BKKigNWrTRgkEVETxWSB6gU7hQLftW8PCwBr09PxY3q60X3vuQeYN0/+/tVXgRoMfSAiIgOYLFC90cvJCW/5+wMAXjxzBmlqtdF9Z80C7r0XyMsDnnkGKCkxVZRERE0PkwWqV/7r74977O1xo6QEzyUkQBg5CEGhANasARwcgORk4MIF08ZJRNSUmDVZWLBgAUJDQ+Hg4ABPT088+uijSEhIMGdIZGZKCwus6dABKknC9hs38FU11qBu2RLYuhWIiwPatDFdjERETY1Zk4W9e/filVdewYEDB7Bz506UlJRg8ODByM/PN2dYZGYd7eywoHVrAMCMxEQkFhQY3bdfP6BckUhoNEB0NLB+vfzVyLpPRERUjiSMvc5bBzIyMuDp6Ym9e/fivvvuq7J9Tk4OnJyckJ2dDcdKSgVTw1QqBAYeO4Y9WVno5eCA91q1QnpxMbyVSoQ5O1dYpfJOQsiDHVevlscxlPH1lVezjIgw8Q9ARFTPVecztF6tDZGdnQ0AcHV11btfrVZDXW7QW05OTp3ERXXPQpKwqn17tP/3XxzIzcXA48e1+3xVKiwJCECEh0el/T/5BPj884rbU1OBkSPl1SyZMBARGafeDHAUQmDGjBm499570alTJ71tFixYACcnJ+3Dz8+vjqOkunQoNxeFpaUVtqeq1RgZH4+ojAy9/TQa+eqBPmXX0aZN4y0JIiJj1ZtkYfLkyTh+/DjWr19faZvZs2cjOztb+0hJSanDCKkuaYTA1MREvfvK7ptNS0yERs9dtJgY4NKlyo8tBJCSIrcjIqKq1YvbEK+++ip+/vln7Nu3D76+vpW2U6lUUKlUdRgZmUtMVhYuGaizIACkqNWIycrSWfYaAIydQFGNiRZERE2aWZMFIQReffVVbN68GdHR0WjVqpU5w6F6JK2oqMbtvL2NO4ex7YiImjqzJguvvPIK1q1bh59++gkODg64cuUKAMDJyQk2NjbmDI3MzFuprHG7sDB51kNqqv6FpSRJ3h8WdrdREhE1DWYds7B8+XJkZ2cjPDwc3t7e2sfGjRvNGRbVA2HOzvBVqWBogqSfSoUwZ+cK2xWK2wMc75xhWfZ88WK5HRERVc2syYIQQu9j/Pjx5gyL6gGFJGFJQAAAVJowjPH0rLTeQkSEPD2yeXPd7b6+nDZJRFRd9WY2BNGdIjw8EBkUhOZ3DGp1uHVJ4NsrV5BhYGxDRIS8TsSePcC6dfLXpCSgUydg9mz9tyiIiKiielXBsbpYwbFp0AiBmKwspBUVwVupRKiD
A3ocOYKTBQWIcHdHZFAQpCoqOpbJzQX8/YHMTOCbb4AJE0wcPBFRPVWdz1BeWaB6TyFJCHdxwWgvL4S7uMDO0hJrOnSApSQh6to1rL161ehjOTjIVxUAYPp0ud4CEREZxmSBGqR7HBwwx98fADD57Fmk3LxpdN8ZM4BevYCcHOD553k7goioKkwWqMF6s0UL9HRwQLZGg2dPn0apkZ/6CgWwahVgbQ3s2AF8/bVp4yQiauiYLFCDZWlhge86dICNhQX+yMrCstRUo/u2awfMny9/P2MGcOGCiYIkImoEmCxQgxZoa4uP2rQBAMw6fx4JBQVG9506FejbV17C+uOPTRUhEVHDx2SBGryXfHwwyMUFhaWleObUKZToWalSH4UCWLkSmDcPWLTIxEESETVgTBaowbOQJHzbrh2cFAr8m5uLDy5eNLpv27bAf/8LWFmZMEAiogaOyQI1Cr7W1vg8MBAA8M6FCziSm1vtYxQVybUXjLwwQUTUZDBZoEbjKU9PjPTwQIkQGHvqFG5qNEb3LS0F+veXp1J+8YUJgyQiaoCYLFCjIUkSlrdtCy8rK5wsKMB/kpKM7mthAYweLX//+uvA+fMmCpKIqAFiskCNirtSia/btQMALLp0CbszMxGdmYn1V68iOjMTGgO1GF5+GQgPBwoKgPHjgd27gfXrgehooBoXKYiIGh2uDUGN0sSEBHydlgYFgPKf874qFZYEBCDCw0Nvv/PngY4dAbVad7uvr7zsNVerJKLGgmtDUJMX7uQEQDdRAIBUtRoj4+MRlZGht19sbMVEAQBSU4GRI4GoqNqNk4ioIWCyQI2ORgi8Wcl4hbLLaNMSEyvcktBo5EJNevvdajptGm9JEFHTw2SBGp2YrCxc0nd54BYBIEWtRkxWlm6/GODSpcqPK4S8SmVMTO3ESUTUUDBZoEYnraioRu3S0ow8vpHtiIgaCyYL1Oh4K5U1auftbeTxjWxHRNRYMFmgRifM2Rm+KhUkA238VCqEOTvr9guTZz1IBjr6+cntiIiaEiYL1OgoJAlLAgIAoNKEYUGrVlDckRUoFPL0SKDyhGH4cLkdEVFTwmSBGqUIDw9EBgWhuUqls73sc37XHYMbtf0igMhIoHlz3e23ZmJi5Urg1KnajZWIqL5jUSZq1DRCICYrC2lFRfBWKiEADDx2DKUA1nXogNFeXvr7aeRZD2lp8hiFvn2BYcOAXbuA4GDgn38AG5s6/VGIiGpVdT5DmSxQkzM3KQnvXLgAR4UCR7t3R2sjP/WvXAFCQoD0dOCll4Bly0wcKBGRCbGCI5EB//H3R19HR+RoNHjq5EkUG7kmdbNmwJo18vfLlwMHD5owSCKieoTJAjU5lhYWWNuxI5wtLfFPbi7mJicb3XfwYGDePDlpCA01XYxERPUJb0NQkxWZno7HT56EBGBXSAjud3Exd0hERHWGtyGIjDDS0xMTvb0hAIw9dQrXjKz8WF5GBvD997UfGxFRfcJkgZq0TwMC0N7WFpeLijAhIQHVudB2/TrQpQvwzDPAzp2mi5GIyNyYLFCTZqdQYEPHjlBKErZev45lly8b3dfNTS7SJAQwdixw9aoJAyUiMiMmC9Tkhdjb46M2bQAAMxMTcTwvz+i+n34KdOokJwrPPAMYObGCiKhBYbJABODV5s3xoKsr1EJg9MmTKNBojOpnYwNs2CB/3bED+PhjEwdKRGQGnA1BdEtGURFCDh1CWlERJjZrhqe8vLSVH8OcnSusJVHe118DEycClpZAdDRQXHy7+mNYGNeTIKL6hxUciWroj8xMDDx2rMJ2X5UKSwICEOHhobefEMDo0cDGjYBKBajV5fr6ygtURUSYKmoiourj1EmiGsouKdG7PVWtxsj4eERlZOjdL0nAgw/K35dPFAAgNRUYORKIiqrNSImI6g6TBaJbNEJgamKi3n1ll9+mJSZCo+dinEYDvPWW/uOWNZ82TW5HRNTQMFkguiUmKwuX7rwsUI4AkKJW
I0bP8tYxMcClS5UfWwggJUVuR0TU0DBZILolzcgKjvrapaUZeQ4j2xER1SdMFohu8VYqa9zO29vIcxjZjoioPmGyQHRLmLMzfFUqVD5BEmhmZYUwZ+eKfcPkWQ8GZlfC11duR0TU0Jg1Wdi3bx+GDx8OHx8fSJKELVu2mDMcauIUkoQlAQEAUGnCUCwE0vXchlAo5OmRQOUJQ2AgYMH0nIgaILP+15Wfn4+QkBAsXbrUnGEQaUV4eCAyKAjNVSqd7c2VSvgolbheUoLhcXHI1zOtISICiIwEmjfX3e7uLicQu3cDCxaYMnoiItOwNOfJhw4diqFDh5ozBKIKIjw88Ii7O2KysnQqOCbfvIleR47gcF4enjp5ElGdOlWo6hgRATzyiDzroXwFxy+/BKZMAVxdzfRDERHdBbMmC9WlVquhLje1LScnx4zRUGOmkCSEu7jobGtjY4OfO3VC/9hY/Hz9OmYmJmJx27YV+yqA8HDdbS+/DNx/P9C+vQmDJiIykQZ1B3XBggVwcnLSPvz8/MwdEjUxvZ2csKZDBwDAktRU/J+h4gp3KJ8oXL8OJCXVdnRERKbRoJKF2bNnIzs7W/tISUkxd0jUBD3u6YkPWrcGIFd03HrtWrX6nzsH9O4NPPAAcOOGKSIkIqpdDSpZUKlUcHR01HkQmcMsPz9M9PZGKYAnT57Ekdxco/va2cnrR5w5A4wYUXEtCSKi+qZBJQtE9YUkSfi8bVsMdnFBQWkpHoqLQ8rNm0b1bdYM2LYNcHQE9u2Tl7ZuuGu/ElFTYNZkIS8vD7GxsYiNjQUAJCUlITY2FhcvXjRnWERGsbKwwA9BQehkZ4e0oiI8GBeHnEpWrbxTp07Ajz/KgyHXrAHmzTNxsEREd0ESwnx/00RHR6N///4Vto8bNw6rVq2qsn911uImMpWLN2+i55EjuFJUhCEuLtjSqRMO5OToTLu8c4plmRUrgEmT5O/XrAFGj6447VKhqMMfhoiajOp8hpo1WbhbTBaovjicm4v7jh5FQWkp7CwskF9aqt3nq1JhSUAAIjw89PZ9803gww9vF3NKTb29z9dXrgwZEWHK6ImoKarOZyjHLBDVgm4ODphy69O+fKIAAKlqNUbGxyMqI0Nv3/ffB0aNAi5f1k0UAPn5yJFAVJRJwiYiMgqTBaJaoBEC36en691XduluWmIiNHou5AkB/P23/kGOZdumTQP0VJgmIqoTTBaIakFMVhYuGZgDKQCkqNWIycqq2DcGMFTbSQggJUVuR0RkDkwWiGpBmp6VKI1tl5Zm5DmMbEdEVNuYLBDVAm+lssbtvL2NPIeR7YiIahuTBaJaEObsDF+VCvonSMrsLSzQ18mpYt8wedZDJbMrAQAeHnI7IiJzYLJAVAsUkoQlAQEAUGnCkFdairGnTkF9x2wJhUKeHglUnjBkZgI//1xLwRIRVROTBaJaEuHhgcigIDRXqXS2+6lUmN68OawkCRszMvBQXBxy76j0GBEBREberrVQxtcX6NEDKCkBTpww9U9ARKQfizIR1TKNEIjJyqpQwXHnjRsYceIE8ktL0d3BAduCg+F5xxgGjaZiBUch5ETiiScM36ogIqoOVnAkqqcO5eRgaFwcrhUXo62NDX7v3BmtbGyqdYy8POCbb4BXXwUseG2QiGqIFRyJ6qnujo74q2tX+KtUOFtYiL5Hj+J4Xp7R/YUAHn9cLtI0bhxQXGy6WImIyjBZIKpjgba2+PueexB8a7XK+44e1RZr0giB6MxMrL96FdGZmRUqPkoSMGYMYGkJfP898MgjQH6+fPsiOhpYv17+ymqPRFSbeBuCyEyyiosx/MQJ/JmdDZUkYZqvL9amp+tUgqxsEapff5XXjCgsBAID5VsTly/f3s8FqIioKhyzQNRAFGo0ePLkSfx8/bre/WXjGSODgiokDPv3A4MGyVcWKvS71TEykgkDEenHMQtEDYSNQoEfOnaEbSUjFQ0tQtWjB2Bvr/+4XICKiGoT
kwUiM9ufk4OCOwo1lVfZIlQxMcDVq5UflwtQEVFtYbJAZGY1XYSKC1ARUV1hskBkZjVdhIoLUBFRXWGyQGRmxixCpZIkBNxRvMmYBagA4J9/AAN3OYiIqsRkgcjMjFmESi0Euh4+jK3Xrt3uZ2ABqvLP33wTGDhQHr9ARFQTTBaI6gFDi1AtadMGXeztca24GA+fOIFXzpxB4a0pDoYWoIqMBFasAGxtgT17gM6d5W1ERNXFOgtE9Uhli1CpS0vx9vnz+OTSJQBAR1tbrO/YEZ1vzZ3UtwCVQiEf88wZuerjoUPAxx8DM2fecU4DfYmo8WJRJqJGaseNGxh3+jSuFBVBJUlY2KYNXm3eHKWA3iSjTHExsHo1MGHC7cWn1Gpg2zZg6lTgVg4CgNUfiZoKJgtEjVhGUREmJCTgl1tVH7va2+NKUZHO1MrKykSXyc8H2rfXTRLKsPojUdPACo5EjZiHUomfO3XC0rZtYSVJOJqXV6EGQ6pajZHx8YjKyNB7jA0b9CcKAKs/ElFFTBaIGiBJkvCijw9cLS317jdUJhoA2rQxfHxWfySi8pgsEDVQMVlZuFpcXOn+yspEA6z+SETVw2SBqIEytkz0ZT3tWP2RiKqDyQJRA2Vsmeh3k5Ox746rC1VVf5QkwM8PaN2axZyIiMkCUYNlTJloCcDpwkL0i43Fw3FxOJmfD0C3+iMUAgjJBO6/Kn9VyGMcFi+Wqz8GBgJvvQXk5OgeW6MBoqOB9evlrxwMSdR4ceokUQMWlZGBkfHxAG4PagRul41eERiIw3l5+OryZWgg/3Uwwdsb77RsCR+VCrO2ZmBRUSI0bmptX8V1FWYoA/DuYA8MGQLs3Stv9/AA5s4FJk4Etm5lfQaiho51FoiakKiMDExNTMQl9e0PfD+VCovL1VlIKCjA7PPnsfnW2hI2FhZ40M0NmzIycOd/AGWJRmRQEEa4e+Dnn4FZs+RKkADg4wNcvlwxDtZnIGpYmCwQNTGVlYm+01/Z2Zh17hz+vvOewh0kyIWdknr1gkKSUFwMfPUVMGcOcKsWlP5+knyFISmJJaOJ6jsWZSJqYhSShHAXF4z28kK4i4veRAEA+jo54c+uXTGvZUuDx7tz2qWVFfDKK8CqVYbjYH0GosaJyQJREyNJEgJsbIxqe+f0zNxc487B+gxEjQuTBaImyNhpl4dzc5FbUnK7X/m6CxZ3zKKwuH1H8/vvgX/+uV06+k6cSUFUTbGxwNCh8lcz0F8rlogatbJpl6lqdYUBjuV9cukSvkxLw5Oenpjk7Y1773WAr6+ES60zgFcSAc/bgyqRrgKWBgAxHvj1V+DXX4HOnYGFC4EhQ243i4riTAqiatu0CfjtNyA0FOjSpc5PzwGORE1UVdMux3p54UBODs4UFmr3hdjZwf26I/5Qpek2BoBS+fmTp4Ngtd8DP/4I3LwJ7NoFDBggN9mwAXjqqYpXHDiTgqgKXboAx47JX48erZVDcjYEERmlqmmXQgjsy87GisuXEZmRAXXZfxcC0FsNSgB+1vIsipwsCZs2ARMmABYW8q0GZ2cgLw/yLYvgLMCtCLiuBOKcIQnJqJkUGo08gDItTb4tEhbGmRfUyF29CjRrpvvc0/OuD1udz1DehiBqwiI8PPCIu3ul0y4lSUI/Z2f0c3bGZ8XFmJOUhKWXL+tPFABAuj2LItzFBc8/f3vX3r23EoWwDGByxVsYYmkAUmI8EBMDhIfrP3xUFDBlmkCqa5Y20Wh+wxmfLZaqvCJRVCKw7M8snMssQhsXJV6+1xlKS0P1L+++LxMbqhW//17x+dixdRoCkwWiJq5s2mVVXK2s0MfJSU4WqjDu9GkMdHFBNwcHdHNwQGc7O1y9qpAThXfiK3ZwV8vb5wQhLU0uJLVjh1xq2t9fvk0RFQU8tiQD+Fg30UhNV+GxJQHYBI9KEwadSpW3ftTXfpIrVS4c7mHwZ6lp37tJbICaJyjmSIoaWrwN7ZyaX7ZBslDAolQD
jUIB/LINijpOFsx+G2LZsmX46KOPkJaWhqCgICxevBhhYWFG9eVtCKK6FZ2Zif7HjlW7nwJAC9giqfAmYF2q/8pEKYAMFXZ59UKfnhIcHOS/zF1dga5dgb8UGbj55q1EQ89YCbf/C8LVHz0q/OU+a2sGPrKvvN/reUGVfujXtK82sdE3CPTzAGyaWnliU3beyspwG0pQatrPXH15zlv9UlPlWwt6LNl7A8++/QgcCwu027Jt7LBq/hZM7eeq/4ReXkDz5gZ/FqABjVnYuHEjxo4di2XLlqFv37748ssv8fXXX+PkyZNo0aJFlf2ZLBDVLY0QaHngQKWzKCQAzZRKfBYQgNi8PBzOy8Ph3FxkFBcbfY5H3dzhV2KHqO8scSXREpocSyBfAbx1CnApNphojIzqhc6dJDg7y+Mj7BwFHi85gFJXdaX9FJkqFDzSq8JfeUUlArY/HYCmmn01GsBrZAauT6l+YgPUPEExR1LU0OKtt+dc/CSwe7fe8wJAqSTBotxH9Z3PKxgwQB5ZXIUGkyz07NkT99xzD5YvX67d1qFDBzz66KNYsGBBlf2ZLBDVvapmUUQGBWnXpAAAIQRS1Wp8eukSFpWfL2kK2ZbATQVQIgEaCbAUgM/NKrspTjrCKk+J1v4SQoIlWACIv1yMWMvMKvv2gzu6etvgt1/l2yUF+cCFLpcBW02lg0CRr0DHE35wdZIgSYCFJEGC/Frt9btYRV9LDEv3h0KS4OYGdO8moUQjMD0uGcK2pNJ+Up4lPu/SCpYK+Vx/7wfUNyWUQmCD43nA3nDfp/PaQCFJ2pkrEgANBL6zPQdRRd8JhQFQWEhQWgG9egElGoFnjyRC2Bnol2uJ1d0DYKmQG5w8CWRlAiVC4EtlYpXnfKm4LSwtyiVxpQLLrM4a7pdriVc0bWFZrgJqaA9AkgTGHKq678qQtrCzkRskXwBSUgX+T2G4n0WeJTafOI2B81+DTX62wVVkjeLsLNdmf/zxKps2iGShqKgItra2+PHHHzFixAjt9qlTpyI2NhZ7y5a6K0etVkNdbtR2Tk4O/Pz8mCwQ1TFjFq+6k7G3MMZ4esLZ0hJZJSXaR0LOTVxDUZV9iRoqj8xMLP/0UzwWE1P1lYM7SZI8H3nECOCLL4yeKdEgZkNcu3YNGo0GXl5eOtu9vLxw5coVvX0WLFiAd955py7CIyIDqppFoU9VhaDKFq9a3aFDheP8cT0TA+OqTjS+CAjEPY72KBECxUJgfWwuvsC5KvsNyvLFPV42sLETcHaVb7f8caYAvyqqrlvdHx7o6mONo8cESgVwvqQAKT43quznmeoEr1IbCMj/zwsIXEYhsloYXuQLABxSHOCmsYajIxDYDjiaehPnlFXX4m5VZI9gbxUEgOPHgaIiINtKjQLfvCr72qTawaFIpfO7y1WqcbN5fpV9Val2sC9SwtJSLtR1Kr0Il6yq7udbbItAD7na6LlzQE4uUKAqgtq7oIqegDLNFrbq25VKC1RFKDKqnw1syvUL6gQkZxbhslWhgV4yL7UNAj3lvpcvAxdzilDsXXU/x2wbqHOcMOG5JdgavAOfrpoPh5sFsCyturSpxkIBhaMD8OWXwKhRVbavKbNdWbh8+TKaN2+Ov//+G71799Zunz9/PtasWYPTp09X6MMrC0QNW3VvYZTRCAGv3QdwXVLrL1JfCrgJFa7e30sn0dCOO3CpvF+VYxaq2dfYxGZXcAgGuOnOQlkcnYnpqLrvpwjBtPDbfWvaz1x9ec4qzpmejqThY9Dy310Gb0sIAMk9BqLV1rU1qrvQIFaddHd3h0KhqHAVIT09vcLVhjIqlQqOjo46DyJqOCI8PBAZFITmKpXOdl+VqtJEAZCnd37VOUDOKkrv2HlroNhXnQMqXJFQWkqYoTTcb4YyQO8Utpr2DXd1hptGVbFPub5uGhXCXZ0r7Hr5Xmcorhvuq7iuwsv36vataT9z9eU5qzinpydaDOwJjYXhohwaCwVaDOpV
KwWaqmK2ZEGpVKJbt27YuXOnzvadO3eiT58+ZoqKiEwtwsMDyb16YU9ICNZ16IA9ISFI6tWr0kShfL9NnYLga31HomGjwqZOlScaC4d74PW8ICgydfspMlUGR7/XtG9NExug5gmKOZKihhZvQzunYtsvUFRxG0JRqoFi2y8G29SWejF18osvvkDv3r3x1VdfYcWKFYiPj4e/v3+V/Tkbgqjp0QhRrbESZeq6EE9URgamnk3EpaLbt059VSosMTAItExDqiHQ0OJtEOe8cuWOJV5vT5fUO/jxyhW5tkI1NYjZEGWWLVuGhQsXIi0tDZ06dcKnn36K++67z6i+TBaIqD6raWIDNKzqhA0t3np/ztWrgfHjtU+FQgG1jT32DHsO/X/9BqrCPEjl13VfvRp45hmj4i+vQSULd4PJAhERNTpPPCEvwSpExSmR6enAiy8CmzfLUyYlSa6psGFDtU/TIAY4EhER0R1KSoDffgNKSwEnJ2DjRrl+eNkgRk9P+fnGjfL+0lJg+3a5dKgJMVkgIiKqLwoLgdat5asJCQmV104YNUreP2IE0KYNUFB1DYm7wVUniYiI6gsHB+DQIePWMi+7yqDRmHztc15ZICIiqk+q+8Fv4kQBYLJAREREVWCyQERERAY16DELZbM+c3KqXniFiIiIbiv77DSmgkKDThZyc+VV1vz8/MwcCRERUcOUm5sLJycng20adFGm0tJSXL58GQ4ODpDuqIpWtiJlSkoKCzZVgq+RYXx9qsbXqGp8jarG16hqpniNhBDIzc2Fj48PLCwMj0po0FcWLCws4Ovra7ANV6esGl8jw/j6VI2vUdX4GlWNr1HVavs1quqKQhkOcCQiIiKDmCwQERGRQY02WVCpVJgzZw5UKlXVjZsovkaG8fWpGl+jqvE1qhpfo6qZ+zVq0AMciYiIyPQa7ZUFIiIiqh1MFoiIiMggJgtERERkEJMFIiIiMqhRJgvLli1Dq1atYG1tjW7duiEmJsbcIdUbc+fOhSRJOo9mzZqZOyyz2rdvH4YPHw4fHx9IkoQtW7bo7BdCYO7cufDx8YGNjQ3Cw8MRHx9vnmDNpKrXaPz48RXeV7169TJPsGawYMEChIaGwsHBAZ6ennj00UeRkJCg06apv4+MeY2a+vto+fLl6Ny5s7bwUu/evbF9+3btfnO+hxpdsrBx40ZMmzYNb7/9No4ePYqwsDAMHToUFy9eNHdo9UZQUBDS0tK0j7i4OHOHZFb5+fkICQnB0qVL9e5fuHAhFi1ahKVLl+LgwYNo1qwZBg0apF2bpCmo6jUCgAceeEDnffXrr7/WYYTmtXfvXrzyyis4cOAAdu7ciZKSEgwePBj5+fnaNk39fWTMawQ07feRr68vPvjgAxw6dAiHDh3C/fffj0ceeUSbEJj1PSQamR49eogXX3xRZ1v79u3Fm2++aaaI6pc5c+aIkJAQc4dRbwEQmzdv1j4vLS0VzZo1Ex988IF2282bN4WTk5P44osvzBCh+d35GgkhxLhx48Qjjzxilnjqo/T0dAFA7N27VwjB95E+d75GQvB9pI+Li4v4+uuvzf4ealRXFoqKinD48GEMHjxYZ/vgwYPx999/mymq+ufs2bPw8fFBq1at8OSTT+L8+fPmDqneSkpKwpUrV3TeUyqVCv369eN76g7R0dHw9PREYGAgJk6ciPT0dHOHZDbZ2dkAAFdXVwB8H+lz52tUhu8jmUajwYYNG5Cfn4/evXub/T3UqJKFa9euQaPRwMvLS2e7l5cXrly5Yqao6peePXviu+++w++//44VK1bgypUr6NOnD65fv27u0OqlsvcN31OGDR06FGvXrsXu3bvxySef4ODBg7j//vuhVqvNHVqdE0JgxowZuPfee9GpUycAfB/dSd9rBPB9BABxcXGwt7eHSqXCiy++iM2bN6Njx45mfw816FUnK3PnctVCiArbmqqhQ4dqvw8ODkbv3r3Rpk0brF69GjNmzDBjZPUb31OG
PfHEE9rvO3XqhO7du8Pf3x/btm1DRESEGSOre5MnT8bx48fx559/VtjH95GssteI7yOgXbt2iI2NRVZWFjZt2oRx48Zh79692v3meg81qisL7u7uUCgUFbKs9PT0CtkYyezs7BAcHIyzZ8+aO5R6qWymCN9T1ePt7Q1/f/8m97569dVX8fPPP2PPnj3w9fXVbuf76LbKXiN9muL7SKlUIiAgAN27d8eCBQsQEhKCJUuWmP091KiSBaVSiW7dumHnzp0623fu3Ik+ffqYKar6Ta1W49SpU/D29jZ3KPVSq1at0KxZM533VFFREfbu3cv3lAHXr19HSkpKk3lfCSEwefJkREVFYffu3WjVqpXOfr6Pqn6N9Glq7yN9hBBQq9Xmfw+ZfAhlHduwYYOwsrIS33zzjTh58qSYNm2asLOzE8nJyeYOrV6YOXOmiI6OFufPnxcHDhwQDz30kHBwcGjSr09ubq44evSoOHr0qAAgFi1aJI4ePSouXLgghBDigw8+EE5OTiIqKkrExcWJ0aNHC29vb5GTk2PmyOuOodcoNzdXzJw5U/z9998iKSlJ7NmzR/Tu3Vs0b968ybxGL730knBychLR0dEiLS1N+ygoKNC2aervo6peI76PhJg9e7bYt2+fSEpKEsePHxdvvfWWsLCwEDt27BBCmPc91OiSBSGE+Pzzz4W/v79QKpXinnvu0Zma09Q98cQTwtvbW1hZWQkfHx8REREh4uPjzR2WWe3Zs0cAqPAYN26cEEKe9jZnzhzRrFkzoVKpxH333Sfi4uLMG3QdM/QaFRQUiMGDBwsPDw9hZWUlWrRoIcaNGycuXrxo7rDrjL7XBoBYuXKltk1Tfx9V9RrxfSTEhAkTtJ9dHh4eYsCAAdpEQQjzvoe4RDUREREZ1KjGLBAREVHtY7JAREREBjFZICIiIoOYLBAREZFBTBaIiIjIICYLREREZBCTBSIiIjKIyQIREREZxGSBiOqV6OhoSJKErKwsc4dCRLcwWSAiIiKDmCwQERGRQUwWiEiHEAILFy5E69atYWNjg5CQEERGRgK4fYtg27ZtCAkJgbW1NXr27Im4uDidY2zatAlBQUFQqVRo2bIlPvnkE539arUas2bNgp+fH1QqFdq2bYtvvvlGp83hw4fRvXt32Nraok+fPkhISDDtD05ElWKyQEQ6/vOf/2DlypVYvnw54uPjMX36dDz99NPYu3evts3rr7+Ojz/+GAcPHoSnpycefvhhFBcXA5A/5EeNGoUnn3wScXFxmDt3Lv773/9i1apV2v7PPPMMNmzYgM8++wynTp3CF198AXt7e5043n77bXzyySc4dOgQLC0tMWHChDr5+YlIjzpZ25KIGoS8vDxhbW0t/v77b53tzz33nBg9erR2qeoNGzZo912/fl3Y2NiIjRs3CiGEeOqpp8SgQYN0+r/++uuiY8eOQgghEhISBACxc+dOvTGUnWPXrl3abdu2bRMARGFhYa38nERUPbyyQERaJ0+exM2bNzFo0CDY29trH9999x3OnTunbde7d2/t966urmjXrh1OnToFADh16hT69u2rc9y+ffvi7Nmz0Gg0iI2NhUKhQL9+/QzG0rlzZ+333t7eAID09PS7/hmJqPoszR0AEdUfpaWlAIBt27ahefPmOvtUKpVOwnAnSZIAyGMeyr4vI4TQfm9jY2NULFZWVhWOXRYfEdUtXlkgIq2OHTtCpVLh4sWLCAgI0Hn4+flp2x04cED7fWZmJs6cOYP27dtrj/Hnn3/qHPfvv/9GYGAgFAoFgoODUVpaqjMGgojqN15ZICItBwcHvPbaa5g+fTpKS0tx7733IicnB3///Tfs7e3h7+8PAJg3bx7c3Nzg5eWFt99+G+7u7nj00UcBADNnzkRoaCjeffddPPHEE9i/fz+WLl2KZcuWAQBatmyJcePGYcKECfjss88QEhKCCxcuID09HaNGjTLXj05Ehph70AQR1S+lpaViyZIlol27dsLK
ykp4eHiIIUOGiL1792oHH27dulUEBQUJpVIpQkNDRWxsrM4xIiMjRceOHYWVlZVo0aKF+Oijj3T2FxYWiunTpwtvb2+hVCpFQECA+Pbbb4UQtwc4ZmZmatsfPXpUABBJSUmm/vGJSA9JiHI3E4mIDIiOjkb//v2RmZkJZ2dnc4dDRHWEYxaIiIjIICYLREREZBBvQxAREZFBvLJAREREBjFZICIiIoOYLBAREZFBTBaIiIjIICYLREREZBCTBSIiIjKIyQIREREZxGSBiIiIDPp/w83A4VbS2oQAAAAASUVORK5CYII=",
      "text/plain": [
       "<Figure size 600x400 with 1 Axes>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "\n",
       "<style>\n",
       "    /* background: */\n",
       "    progress::-webkit-progress-bar {background-color: #CDCDCD; width: 100%;}\n",
       "    progress {background-color: #CDCDCD;}\n",
       "\n",
       "    /* value: */\n",
       "    progress::-webkit-progress-value {background-color: #00BFFF  !important;}\n",
       "    progress::-moz-progress-bar {background-color: #00BFFF  !important;}\n",
       "    progress {color: #00BFFF ;}\n",
       "\n",
       "    /* optional */\n",
       "    .progress-bar-interrupted, .progress-bar-interrupted::-webkit-progress-bar {\n",
       "        background: #000000;\n",
       "    }\n",
       "</style>\n"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "\n",
       "    <div>\n",
       "      <progress value='30' class='' max='30' style='width:300px; height:20px; vertical-align: middle;'></progress>\n",
       "      [04:33]\n",
       "      <br>\n",
       "      ████████████████████100.00% [2/2] [val_loss=0.00057]\n",
       "    </div>\n",
       "    "
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "dfhistory = keras_model.fit(train_data = dl_train,\n",
    "                val_data = dl_val,\n",
    "                epochs=30,\n",
    "                patience=4,\n",
    "                monitor='val_loss',\n",
    "                mode='min',\n",
    "                ckpt_path = ckpt_path,\n",
    "                gradient_accumulation_steps = 2\n",
    "               )"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:37:41.105903Z",
     "iopub.status.busy": "2023-07-15T23:37:41.105500Z",
     "iopub.status.idle": "2023-07-15T23:37:42.301054Z",
     "shell.execute_reply": "2023-07-15T23:37:42.299783Z",
     "shell.execute_reply.started": "2023-07-15T23:37:41.105860Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "57M\tchatglm2_qlora\n"
     ]
    }
   ],
   "source": [
    "!du -s -h chatglm2_qlora "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 四，上传模型"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "为了使用模型，我们需要合并LoRA权重到预训练模型权重中。\n",
    "\n",
    "由于LoRA权重是非量化类型(float32)的，要求加载的预训练模型权重非量化类型(float32或fp16)。\n",
    "\n",
    "不使用量化的话，加载模型权重需要较高的内存，但在GPU模式下，Kaggle的机器只有13个G的内存。\n",
    "\n",
    "直接加载预训练模型权重会报OOM的错误。所以我们需要切换到CPU模式下(30G内存)。\n",
    "\n",
    "合并且保存权重后，再切换成GPU模式(有些麻烦，但尚可接受)。\n",
    "\n",
    "\n",
    "可以在右边Accelerator选项栏中选择 None,即切换回CPU模式。\n"
   ]
  },
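  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The merge step described above can be sketched with `peft`'s `PeftModel.merge_and_unload`. This is a minimal illustration, assuming the full-precision base model and tokenizer have been loaded as `model` and `tokenizer`, and that the adapter was saved to the `chatglm2_qlora` directory shown by `du` earlier; adjust the paths if yours differ:\n",
    "\n",
    "```python\n",
    "from peft import PeftModel\n",
    "\n",
    "# Attach the trained LoRA adapter to the full-precision base model,\n",
    "# then fold the adapter weights into the base weights.\n",
    "peft_model = PeftModel.from_pretrained(model, 'chatglm2_qlora')\n",
    "merged_model = peft_model.merge_and_unload()\n",
    "\n",
    "# Save the merged weights so they can later be reloaded without peft.\n",
    "merged_model.save_pretrained('chatglm2-6b-merged', max_shard_size='2GB')\n",
    "tokenizer.save_pretrained('chatglm2-6b-merged')\n",
    "```\n"
   ]
  },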
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:53:31.016909Z",
     "iopub.status.busy": "2023-07-15T23:53:31.016009Z",
     "iopub.status.idle": "2023-07-15T23:54:17.586019Z",
     "shell.execute_reply": "2023-07-15T23:54:17.584529Z",
     "shell.execute_reply.started": "2023-07-15T23:53:31.016838Z"
    }
   },
   "outputs": [],
   "source": [
    "#安装环境(为避免GPU下OOM，切换成CPU模式后重新安装环境)\n",
    "!pip install -q -U transformers\n",
    "!pip install  -q git+https://github.com/huggingface/peft  #使用最新版本非常重要，否则可能报错"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:54:38.398529Z",
     "iopub.status.busy": "2023-07-15T23:54:38.398105Z",
     "iopub.status.idle": "2023-07-15T23:54:41.632181Z",
     "shell.execute_reply": "2023-07-15T23:54:41.630582Z",
     "shell.execute_reply.started": "2023-07-15T23:54:38.398493Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "False\n"
     ]
    }
   ],
   "source": [
    "# 导入常用模块\n",
    "import numpy as np\n",
    "import pandas as pd \n",
    "import torch\n",
    "from torch import nn \n",
    "from torch.utils.data import Dataset,DataLoader \n",
    "\n",
    "import warnings \n",
    "warnings.filterwarnings('ignore')\n",
    "\n",
    "#务必将notebook切换成CPU模式，否则会报OOM错误\n",
    "print(torch.cuda.is_available())"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:54:53.293090Z",
     "iopub.status.busy": "2023-07-15T23:54:53.292358Z",
     "iopub.status.idle": "2023-07-15T23:57:39.007290Z",
     "shell.execute_reply": "2023-07-15T23:57:39.005915Z",
     "shell.execute_reply.started": "2023-07-15T23:54:53.293053Z"
    }
   },
   "outputs": [
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "8d41f0038d554ae290ecc914eb7fe508",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)okenizer_config.json:   0%|          | 0.00/244 [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "df2c5bdf104c4fa386d38b4dee64263b",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)enization_chatglm.py:   0%|          | 0.00/10.0k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- tokenization_chatglm.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n"
     ]
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "44db52ecb8af425a83ab8ac3eb8c5a06",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading tokenizer.model:   0%|          | 0.00/1.02M [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "75ec50b4f71847e1bfeafdf8ce6c4365",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)lve/main/config.json:   0%|          | 0.00/1.22k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "ee21e1e2afdd4343bafb2750458968ad",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)iguration_chatglm.py:   0%|          | 0.00/2.25k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- configuration_chatglm.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n"
     ]
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "000833d95bd149ad934bdaf23f7bb76a",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)/modeling_chatglm.py:   0%|          | 0.00/50.7k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "7d9031b1aeb64cdcaa43857a20ad87ab",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)main/quantization.py:   0%|          | 0.00/14.7k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- quantization.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n",
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- modeling_chatglm.py\n",
      "- quantization.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n"
     ]
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "5029393aabed4c2bb5e60a84491840cb",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)model.bin.index.json:   0%|          | 0.00/20.4k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "fc6a1ddcf2f2480cb13a958297e1bc2a",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading shards:   0%|          | 0/7 [00:00<?, ?it/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "f37c09271fa94415a7e43ae8e917c642",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00001-of-00007.bin:   0%|          | 0.00/1.83G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "79b06795cc314f7fad09073369a13990",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00002-of-00007.bin:   0%|          | 0.00/1.97G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "caf23ff7e4e24e5fbdc459d1f6e336b7",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00003-of-00007.bin:   0%|          | 0.00/1.93G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "379af85fb02748e59cd00a701d857cd7",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00004-of-00007.bin:   0%|          | 0.00/1.82G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "b62870b7d595430d83dec916ea69e68e",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00005-of-00007.bin:   0%|          | 0.00/1.97G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "d4f9f1c51a1047c485db42f0042639b7",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00006-of-00007.bin:   0%|          | 0.00/1.93G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "35a1c02f7dcd4f34a849ea63766b933b",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)l-00007-of-00007.bin:   0%|          | 0.00/1.05G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "c8af348b34874b8d914c675aef9174a1",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Loading checkpoint shards:   0%|          | 0/7 [00:00<?, ?it/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "from transformers import AutoTokenizer, AutoModel\n",
    "\n",
    "#为了能够在kaggle中使用，需要设置 bnb_config\n",
    "model_name_or_path = \"THUDM/chatglm2-6b\"\n",
    "\n",
    "tokenizer = AutoTokenizer.from_pretrained(\n",
    "    model_name_or_path, trust_remote_code=True) # cache_dir='./' 缓存到当前工作路径\n",
    "\n",
    "model = AutoModel.from_pretrained(model_name_or_path,\n",
    "                trust_remote_code=True) # cache_dir='./' \n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-15T23:58:04.262473Z",
     "iopub.status.busy": "2023-07-15T23:58:04.262049Z",
     "iopub.status.idle": "2023-07-15T23:59:56.167816Z",
     "shell.execute_reply": "2023-07-15T23:59:56.166692Z",
     "shell.execute_reply.started": "2023-07-15T23:58:04.262441Z"
    }
   },
   "outputs": [],
   "source": [
    "from peft import PeftModel\n",
    "ckpt_path = 'chatglm2_qlora/'\n",
    "peft_loaded = PeftModel.from_pretrained(model,ckpt_path)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:07:30.790108Z",
     "iopub.status.busy": "2023-07-16T00:07:30.785031Z",
     "iopub.status.idle": "2023-07-16T00:09:23.735207Z",
     "shell.execute_reply": "2023-07-16T00:09:23.733742Z",
     "shell.execute_reply.started": "2023-07-16T00:07:30.790006Z"
    }
   },
   "outputs": [],
   "source": [
    "model_new = peft_loaded.merge_and_unload() #合并lora权重"
   ]
  },
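   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "`merge_and_unload` folds each LoRA adapter back into its frozen base layer, so the result is a plain model with no peft dependency. Conceptually, for every adapted linear layer the merged weight is `W' = W + (alpha / r) * B @ A`. A minimal sketch of this update on toy tensors (the variable names are illustrative, not the actual peft internals):\n",
     "\n",
     "```python\n",
     "import torch\n",
     "\n",
     "d, k, r, alpha = 8, 8, 2, 16    # toy dimensions plus LoRA rank and scaling\n",
     "W = torch.zeros(d, k)           # frozen base weight\n",
     "A = torch.randn(r, k)           # LoRA down-projection\n",
     "B = torch.randn(d, r)           # LoRA up-projection\n",
     "\n",
     "W_merged = W + (alpha / r) * (B @ A)  # full-rank weight after merging\n",
     "```"
    ]
   },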
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:10:50.292105Z",
     "iopub.status.busy": "2023-07-16T00:10:50.291051Z",
     "iopub.status.idle": "2023-07-16T00:11:34.815875Z",
     "shell.execute_reply": "2023-07-16T00:11:34.814950Z",
     "shell.execute_reply.started": "2023-07-16T00:10:50.292065Z"
    }
   },
   "outputs": [],
   "source": [
    "save_path = \"chatglm2-6b-torchkeras\"\n",
    "model_new.save_pretrained(save_path, max_shard_size='2GB')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:12:08.127357Z",
     "iopub.status.busy": "2023-07-16T00:12:08.126900Z",
     "iopub.status.idle": "2023-07-16T00:12:08.154414Z",
     "shell.execute_reply": "2023-07-16T00:12:08.153179Z",
     "shell.execute_reply.started": "2023-07-16T00:12:08.127322Z"
    }
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "('chatglm2-6b-torchkeras/tokenizer_config.json',\n",
       " 'chatglm2-6b-torchkeras/special_tokens_map.json',\n",
       " 'chatglm2-6b-torchkeras/tokenizer.model',\n",
       " 'chatglm2-6b-torchkeras/added_tokens.json')"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "tokenizer.save_pretrained(save_path) "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#从chatglm2-6b官方仓库下载其他的依赖文件，忽略权重\n",
    "!GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/THUDM/chatglm2-6b/"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:25:52.853669Z",
     "iopub.status.busy": "2023-07-16T00:25:52.852405Z",
     "iopub.status.idle": "2023-07-16T00:25:54.365120Z",
     "shell.execute_reply": "2023-07-16T00:25:54.363454Z",
     "shell.execute_reply.started": "2023-07-16T00:25:52.853621Z"
    }
   },
   "outputs": [],
   "source": [
    "!cp  chatglm2-6b/*.py {save_path}"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:25:59.806046Z",
     "iopub.status.busy": "2023-07-16T00:25:59.805581Z",
     "iopub.status.idle": "2023-07-16T00:26:01.257706Z",
     "shell.execute_reply": "2023-07-16T00:26:01.256095Z",
     "shell.execute_reply.started": "2023-07-16T00:25:59.806005Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "config.json\t\t\t  pytorch_model-00006-of-00007.bin\n",
      "configuration_chatglm.py\t  pytorch_model-00007-of-00007.bin\n",
      "generation_config.json\t\t  pytorch_model.bin.index.json\n",
      "modeling_chatglm.py\t\t  quantization.py\n",
      "pytorch_model-00001-of-00007.bin  special_tokens_map.json\n",
      "pytorch_model-00002-of-00007.bin  tokenization_chatglm.py\n",
      "pytorch_model-00003-of-00007.bin  tokenizer.model\n",
      "pytorch_model-00004-of-00007.bin  tokenizer_config.json\n",
      "pytorch_model-00005-of-00007.bin\n"
     ]
    }
   ],
   "source": [
    "!ls {save_path}"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:27:03.288504Z",
     "iopub.status.busy": "2023-07-16T00:27:03.287118Z",
     "iopub.status.idle": "2023-07-16T00:27:03.341633Z",
     "shell.execute_reply": "2023-07-16T00:27:03.340379Z",
     "shell.execute_reply.started": "2023-07-16T00:27:03.288402Z"
    }
   },
   "outputs": [
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "ab41c64e561448eaa8c431ac1a71a2bb",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "VBox(children=(HTML(value='<center> <img\\nsrc=https://huggingface.co/front/assets/huggingface_logo-noborder.sv…"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "from huggingface_hub import login\n",
    "login() #需要注册一个huggingface账户，在个人页面setting那里创建一个有write权限的access token"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:29:24.709838Z",
     "iopub.status.busy": "2023-07-16T00:29:24.709439Z",
     "iopub.status.idle": "2023-07-16T00:29:24.715663Z",
     "shell.execute_reply": "2023-07-16T00:29:24.714361Z",
     "shell.execute_reply.started": "2023-07-16T00:29:24.709808Z"
    }
   },
   "outputs": [],
   "source": [
    "from huggingface_hub import HfApi\n",
    "api = HfApi()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 45,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:42:16.020294Z",
     "iopub.status.busy": "2023-07-16T00:42:16.019891Z",
     "iopub.status.idle": "2023-07-16T00:42:16.025267Z",
     "shell.execute_reply": "2023-07-16T00:42:16.024137Z",
     "shell.execute_reply.started": "2023-07-16T00:42:16.020265Z"
    }
   },
   "outputs": [],
   "source": [
    "#创建huggingface 模型库\n",
    "repo_id = \"lyhue1991/chatglm2-6b-torchkeras\"\n",
    "api.create_repo(repo_id=repo_id)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 43,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:35:30.217160Z",
     "iopub.status.busy": "2023-07-16T00:35:30.216122Z",
     "iopub.status.idle": "2023-07-16T00:41:41.960545Z",
     "shell.execute_reply": "2023-07-16T00:41:41.959224Z",
     "shell.execute_reply.started": "2023-07-16T00:35:30.217115Z"
    }
   },
   "outputs": [
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "0a944c81a6fe48f2b87aef49d17e6c95",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "pytorch_model-00001-of-00007.bin:   0%|          | 0.00/1.83G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "11f5c76956314c76874b57a2467ac5e8",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "pytorch_model-00002-of-00007.bin:   0%|          | 0.00/1.97G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "39061f345e2a456592f8b6ff86705f8a",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "pytorch_model-00003-of-00007.bin:   0%|          | 0.00/1.93G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "5d50f8360b2741b9a17cdc6a3ad87147",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "pytorch_model-00004-of-00007.bin:   0%|          | 0.00/1.82G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "f957aef5198a4e1aa71d62a4d3b4b7a7",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Upload 8 LFS files:   0%|          | 0/8 [00:00<?, ?it/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "3ca8b3934a4043e6aaf5c98b412bdedc",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "pytorch_model-00005-of-00007.bin:   0%|          | 0.00/1.97G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "6919c728d9b249d69bc92f02dc3bc743",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "pytorch_model-00006-of-00007.bin:   0%|          | 0.00/1.93G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "d514112c42204dcc97fcdb532b037f37",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "pytorch_model-00007-of-00007.bin:   0%|          | 0.00/1.05G [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "b8da494d969c41c4a7c2b8b1e127ce07",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "tokenizer.model:   0%|          | 0.00/1.02M [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/plain": [
       "'https://huggingface.co/lyhue1991/chatglm2-6b-torchkeras/tree/main/'"
      ]
     },
     "execution_count": 43,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "#上传模型可能需要等待10分钟左右~\n",
    "api.upload_folder(\n",
    "    folder_path=save_path,\n",
    "    repo_id=repo_id,\n",
    "    repo_type=\"model\", #space, model, datasets\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#上传成功后可以删除本地模型\n",
    "!rm  -rf chatglm2-6b-torchkeras"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 五，使用模型"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "我们重新切换成GPU环境，直接从huggingface导入刚才我们训练好的 'lyhue1991/chatglm2-6b-torchkeras'模型进行测试。就好像导入chatglm2-6b官方模型一样。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:50:51.776348Z",
     "iopub.status.busy": "2023-07-16T00:50:51.776065Z",
     "iopub.status.idle": "2023-07-16T00:52:45.961238Z",
     "shell.execute_reply": "2023-07-16T00:52:45.959904Z",
     "shell.execute_reply.started": "2023-07-16T00:50:51.776320Z"
    }
   },
   "outputs": [],
   "source": [
    "#安装环境\n",
    "\n",
    "#chatglm需要\n",
    "!pip install -q -U transformers\n",
    "\n",
    "#finetune需要\n",
    "!pip install -q 'bitsandbytes==0.39.1' #提供4bit量化支持，版本限制非常重要，否则可能报错\n",
    "!pip install -q datasets\n",
    "!pip install -q git+https://github.com/huggingface/accelerate\n",
    "!pip install  -q git+https://github.com/huggingface/peft  #使用最新版本非常重要，否则可能报错\n",
    "!pip install  -q git+https://github.com/lyhue1991/torchkeras "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T00:52:53.773041Z",
     "iopub.status.busy": "2023-07-16T00:52:53.772659Z",
     "iopub.status.idle": "2023-07-16T00:52:56.767970Z",
     "shell.execute_reply": "2023-07-16T00:52:56.766954Z",
     "shell.execute_reply.started": "2023-07-16T00:52:53.773010Z"
    }
   },
   "outputs": [],
   "source": [
    "# 导入常用模块\n",
    "import numpy as np\n",
    "import pandas as pd \n",
    "import torch\n",
    "from torch import nn \n",
    "from torch.utils.data import Dataset,DataLoader \n",
    "\n",
    "import warnings \n",
    "warnings.filterwarnings('ignore')\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T01:14:17.502787Z",
     "iopub.status.busy": "2023-07-16T01:14:17.502362Z",
     "iopub.status.idle": "2023-07-16T01:15:46.919231Z",
     "shell.execute_reply": "2023-07-16T01:15:46.918183Z",
     "shell.execute_reply.started": "2023-07-16T01:14:17.502755Z"
    }
   },
   "outputs": [
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "3ba16a010625454b86e346da8cf86838",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)okenizer_config.json:   0%|          | 0.00/244 [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "b72eefbbc1be444393a33364a59b2ee0",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading tokenizer.model:   0%|          | 0.00/1.02M [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "5748f0b954134038819dc8d0dedb97c1",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)/modeling_chatglm.py:   0%|          | 0.00/50.7k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "8470f7d5e75243a98a4797b4720b215f",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Downloading (…)main/quantization.py:   0%|          | 0.00/14.7k [00:00<?, ?B/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- quantization.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n",
      "A new version of the following files was downloaded from https://huggingface.co/THUDM/chatglm2-6b:\n",
      "- modeling_chatglm.py\n",
      "- quantization.py\n",
      ". Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\n",
      "===================================BUG REPORT===================================\n",
      "Welcome to bitsandbytes. For bug reports, please run\n",
      "\n",
      "python -m bitsandbytes\n",
      "\n",
      " and submit this information together with your error trace to: https://github.com/TimDettmers/bitsandbytes/issues\n",
      "================================================================================\n",
      "bin /opt/conda/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda118_nocublaslt.so\n",
      "CUDA SETUP: CUDA runtime path found: /usr/local/cuda/lib64/libcudart.so.11.0\n",
      "CUDA SETUP: Highest compute capability among GPUs detected: 6.0\n",
      "CUDA SETUP: Detected CUDA version 118\n",
      "CUDA SETUP: Loading binary /opt/conda/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda118_nocublaslt.so...\n"
     ]
    },
    {
     "data": {
      "application/vnd.jupyter.widget-view+json": {
       "model_id": "3553577daf8b4d09813743e0192fd814",
       "version_major": 2,
       "version_minor": 0
      },
      "text/plain": [
       "Loading checkpoint shards:   0%|          | 0/7 [00:00<?, ?it/s]"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "from transformers import AutoTokenizer,AutoConfig, AutoModel, BitsAndBytesConfig\n",
    "\n",
    "#为了能够在kaggle中使用，需要设置 bnb_config\n",
    "model_name_or_path = 'lyhue1991/chatglm2-6b-torchkeras'\n",
    "config = AutoConfig.from_pretrained(model_name_or_path, trust_remote_code=True)\n",
    "bnb_config=BitsAndBytesConfig(\n",
    "            load_in_4bit=True,\n",
    "            bnb_4bit_compute_dtype=torch.float16,\n",
    "            bnb_4bit_use_double_quant=True, #QLoRA 设计的 Double Quantization\n",
    "            bnb_4bit_quant_type=\"nf4\", #QLoRA 设计的 Normal Float 4 量化数据类型\n",
    "            llm_int8_threshold=6.0,\n",
    "            llm_int8_has_fp16_weight=False,\n",
    "        )\n",
    "tokenizer = AutoTokenizer.from_pretrained(\n",
     "    'THUDM/chatglm2-6b', trust_remote_code=True) # cache_dir='./' caches to the current working directory\n",
    "\n",
    "model = AutoModel.from_pretrained(model_name_or_path,\n",
    "                config=config,\n",
    "                quantization_config=bnb_config,\n",
    "                trust_remote_code=True)  # cache_dir='./'"
   ]
  },
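   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "A rough back-of-the-envelope for why 4-bit loading fits on the 16GB P100: ChatGLM2-6B has roughly 6.2B parameters, so fp16 weights alone need about 12.4 GB, while nf4 needs about 3.1 GB plus a small overhead for the per-block quantization constants, which double quantization shrinks further. A quick estimate (the parameter count is approximate):\n",
     "\n",
     "```python\n",
     "n_params = 6.2e9              # approximate parameter count of ChatGLM2-6B\n",
     "fp16_gb = n_params * 2 / 1e9  # 2 bytes per weight -> ~12.4 GB\n",
     "nf4_gb = n_params * 0.5 / 1e9 # 4 bits per weight  -> ~3.1 GB\n",
     "```"
    ]
   },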
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T01:16:34.791266Z",
     "iopub.status.busy": "2023-07-16T01:16:34.790876Z",
     "iopub.status.idle": "2023-07-16T01:16:42.821706Z",
     "shell.execute_reply": "2023-07-16T01:16:42.820707Z",
     "shell.execute_reply.started": "2023-07-16T01:16:34.791235Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "你好👋！我是人工智能助手 ChatGLM2-6B，很高兴见到你，欢迎问我任何问题。\n"
     ]
    }
   ],
   "source": [
    "#注册魔法命令便于jupyter中使用\n",
    "from torchkeras.chat import ChatGLM \n",
    "chatglm = ChatGLM(model,tokenizer,stream=True)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T01:16:52.503270Z",
     "iopub.status.busy": "2023-07-16T01:16:52.502465Z",
     "iopub.status.idle": "2023-07-16T01:17:06.529745Z",
     "shell.execute_reply": "2023-07-16T01:17:06.528767Z",
     "shell.execute_reply.started": "2023-07-16T01:16:52.503232Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "梦中情炉一般指的是炼丹工具torchkeras。\n",
      "这是一个通用的pytorch模型训练模版工具。\n",
      "torchkeras是一个三好炼丹炉：好看，好用，好改。\n",
      "她有torch的灵动，也有keras的优雅，并且她的美丽，无与伦比。\n",
      "所以她的作者一个有毅力的吃货给她取了一个别名叫做梦中情炉。\n"
     ]
    }
   ],
   "source": [
    "%%chatglm\n",
    "你听说过梦中情炉吗？"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {
    "execution": {
     "iopub.execute_input": "2023-07-16T01:18:01.329122Z",
     "iopub.status.busy": "2023-07-16T01:18:01.328741Z",
     "iopub.status.idle": "2023-07-16T01:18:40.832905Z",
     "shell.execute_reply": "2023-07-16T01:18:40.831881Z",
     "shell.execute_reply.started": "2023-07-16T01:18:01.329092Z"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "梦中情炉\n",
      "\n",
      " her beauty,\n",
      "\n",
      " is like a torch,\n",
      "\n",
      " bright and fierce,\n",
      "\n",
      " yet gentle and kind.\n",
      "\n",
      " she is a tool for training,\n",
      "\n",
      " a model for creativity,\n",
      "\n",
      " her power,\n",
      "\n",
      " is beyond compare,\n",
      "\n",
      " she is the dream.\n",
      "\n",
      " her design,\n",
      "\n",
      " is elegant and clean,\n",
      "\n",
      " her curves,\n",
      "\n",
      " are like a line,\n",
      "\n",
      " smooth and graceful.\n",
      "\n",
      " her heat,\n",
      "\n",
      " is hot and stable,\n",
      "\n",
      " her stability,\n",
      "\n",
      " is like a rock,\n",
      "\n",
      " unshakable and solid.\n",
      "\n",
      " she is a machine,\n",
      "\n",
      " a tool for success,\n",
      "\n",
      " her passion,\n",
      "\n",
      " is unparalleled,\n",
      "\n",
      " she is the dream.\n",
      "\n",
      " in her hands,\n",
      "\n",
      " our dreams,\n",
      "\n",
      " come to life,\n",
      "\n",
      " with her help,\n",
      "\n",
      " we achieve,\n",
      "\n",
      " she is the dream.\n",
      "\n",
      " Therefore,\n",
      "\n",
      " her name,\n",
      "\n",
      " is not just a name,\n",
      "\n",
      " but a symbol,\n",
      "\n",
      " a symbol of hope,\n",
      "\n",
      " a symbol of success.\n",
      "\n",
      " She is the dream,\n",
      "\n",
      " the hope,\n",
      "\n",
      " the success,\n",
      "\n",
      " she is everything.\n"
     ]
    }
   ],
   "source": [
    "%%chatglm\n",
    "请写一首婉约深情的小诗，歌颂一下梦中情炉。"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**如果本项目对你有所帮助，想鼓励一下作者，记得给本项目加一颗星星star⭐️，并分享给你的朋友们喔😊!** \n",
    "\n",
    "如果在torchkeras的使用中遇到问题，可以在项目中提交issue。\n",
    "\n",
    "如果想要获得更快的反馈或者与其他torchkeras用户小伙伴进行交流，\n",
    "\n",
    "可以在公众号算法美食屋后台回复关键字：**加群**。\n",
    "\n",
    "![](https://tva1.sinaimg.cn/large/e6c9d24egy1h41m2zugguj20k00b9q46.jpg)"
   ]
  }
 ],
 "metadata": {
  "colab": {
   "provenance": []
  },
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.0"
  },
  "vscode": {
   "interpreter": {
    "hash": "25273a2a68c96ebac13d7fb9e0db516f9be0772777a0507fe06d682a441a3ba7"
   }
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
