{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Factorization Machines\n",
    "\n",
    "```\n",
    "Raw data (user profile / purchase records / behavior logs / geolocation) -> fact tags (counts / amounts / intervals) -> model tags (interest / purchase / risk prediction) -> strategy tags (to-be-retained / to-be-developed / to-be-activated user segments)\n",
    "Quantity assumption: in the web-graph model, the more in-links a page receives from other pages, the more important that page is\n",
    "Quality assumption: when a high-quality page points to (out-links) another page, the linked page is likely to be high quality as well\n",
    "```"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Factorization Machines model all pairwise interactions between variables using factorized parameters. The model equation for a factorization machine of degree $d=2$ is defined as:\n",
    "$$\n",
    "\\hat{y}(x) = \\omega_0 + \\sum_{i=1}^n\\omega_ix_i + \\sum_{i=1}^n\\sum_{j=i+1}^n v_i^Tv_jx_ix_j,\n",
    "$$\n",
    "where $\\omega_0$ is the global bias, $\\omega_i$ is the weight of the $i$-th variable, and $v_i\\in\\mathbb{R}^k$ is its factor vector, so the weight of each pairwise interaction is the inner product $v_i^Tv_j$."
   ]
  },
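  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal NumPy sketch (the names `w0`, `w`, `V` below are illustrative toy parameters, not the trained model defined later in this notebook), the degree-2 model equation can be evaluated directly:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "# illustrative toy parameters, not the trained model\n",
    "rng = np.random.default_rng(0)\n",
    "n, k = 5, 3                  # number of features, factor dimension\n",
    "w0 = 0.1                     # global bias\n",
    "w = rng.normal(size=n)       # linear weights\n",
    "V = rng.normal(size=(n, k))  # row i is the factor vector v_i\n",
    "x = rng.normal(size=n)\n",
    "\n",
    "# y_hat = w0 + sum_i w_i x_i + sum_{i<j} <v_i, v_j> x_i x_j\n",
    "y_hat = w0 + w @ x\n",
    "for i in range(n):\n",
    "    for j in range(i + 1, n):\n",
    "        y_hat += (V[i] @ V[j]) * x[i] * x[j]\n",
    "print(y_hat)"
   ]
  },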
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The complexity of straightforward computation of this equation is in $\\mathcal{O}(kn^2)$. However, the pairwise term can be reformulated:\n",
    "\\begin{align}\n",
    "\\sum_{i=1}^n\\sum_{j=i+1}^n v_i^Tv_jx_ix_j &= \\frac{1}{2}\\sum_{i=1}^n\\sum_{j=1}^n v_i^Tv_jx_ix_j - \\frac{1}{2}\\sum_{i=1}^n v_i^Tv_ix_ix_i\\\\\n",
    "&= \\frac{1}{2}(\\sum_{i=1}^n\\sum_{j=1}^n\\sum_{f=1}^k v_{if}v_{jf}x_ix_j - \\sum_{i=1}^n\\sum_{f=1}^k v_{if}v_{if}x_ix_i)\\\\\n",
    "&= \\frac{1}{2}\\sum_{f=1}^k((\\sum_{i=1}^n v_{if}x_i)^2 - \\sum_{i=1}^n v_{if}^2x_i^2)\n",
    "\\end{align}\n",
    "This equation has only linear complexity in both $k$ and $n$, i.e., its computation is in $\\mathcal{O}(kn)$."
   ]
  },
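  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The identity above is easy to check numerically; this standalone NumPy sketch (variable names are illustrative) compares the naive $\\mathcal{O}(kn^2)$ pair sum with the $\\mathcal{O}(kn)$ reformulation:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "rng = np.random.default_rng(1)\n",
    "n, k = 6, 4\n",
    "V = rng.normal(size=(n, k))  # row i is the factor vector v_i\n",
    "x = rng.normal(size=n)\n",
    "\n",
    "# O(k n^2): naive sum over pairs i < j\n",
    "naive = sum((V[i] @ V[j]) * x[i] * x[j]\n",
    "            for i in range(n) for j in range(i + 1, n))\n",
    "\n",
    "# O(k n): 0.5 * sum_f ((sum_i v_if x_i)^2 - sum_i v_if^2 x_i^2)\n",
    "s = V.T @ x  # shape (k,): sum_i v_if x_i for each factor f\n",
    "fast = 0.5 * (np.sum(s ** 2) - np.sum((V ** 2).T @ (x ** 2)))\n",
    "\n",
    "assert np.isclose(naive, fast)\n",
    "print(naive, fast)"
   ]
  },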
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Implementation"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "WARNING:tensorflow:From c:\\users\\ginzeng\\.conda\\envs\\learn\\lib\\site-packages\\tensorflow\\python\\compat\\v2_compat.py:101: disable_resource_variables (from tensorflow.python.ops.variable_scope) is deprecated and will be removed in a future version.\n",
      "Instructions for updating:\n",
      "non-resource variables are not supported in the long term\n",
      "----step  0 accuracy: 0.700, auc: 0.665\n",
      "----step  1 accuracy: 0.700, auc: 0.665\n",
      "----step  2 accuracy: 0.700, auc: 0.671\n",
      "----step  3 accuracy: 0.700, auc: 0.678\n",
      "----step  4 accuracy: 0.700, auc: 0.685\n",
      "----step  5 accuracy: 0.700, auc: 0.693\n",
      "----step  6 accuracy: 0.700, auc: 0.699\n",
      "----step  7 accuracy: 0.700, auc: 0.705\n",
      "----step  8 accuracy: 0.700, auc: 0.711\n",
      "----step  9 accuracy: 0.700, auc: 0.716\n",
      "----step 10 accuracy: 0.700, auc: 0.724\n",
      "----step 11 accuracy: 0.700, auc: 0.729\n",
      "----step 12 accuracy: 0.700, auc: 0.736\n",
      "----step 13 accuracy: 0.700, auc: 0.739\n",
      "----step 14 accuracy: 0.700, auc: 0.743\n",
      "----step 15 accuracy: 0.700, auc: 0.746\n",
      "----step 16 accuracy: 0.700, auc: 0.749\n",
      "----step 17 accuracy: 0.700, auc: 0.752\n",
      "----step 18 accuracy: 0.700, auc: 0.754\n",
      "----step 19 accuracy: 0.705, auc: 0.757\n",
      "----step 20 accuracy: 0.700, auc: 0.759\n",
      "----step 21 accuracy: 0.700, auc: 0.761\n",
      "----step 22 accuracy: 0.705, auc: 0.763\n",
      "----step 23 accuracy: 0.705, auc: 0.765\n",
      "----step 24 accuracy: 0.705, auc: 0.768\n",
      "----step 25 accuracy: 0.710, auc: 0.770\n",
      "----step 26 accuracy: 0.710, auc: 0.773\n",
      "----step 27 accuracy: 0.710, auc: 0.775\n",
      "----step 28 accuracy: 0.710, auc: 0.777\n",
      "----step 29 accuracy: 0.710, auc: 0.779\n",
      "----step 30 accuracy: 0.710, auc: 0.780\n",
      "----step 31 accuracy: 0.710, auc: 0.781\n",
      "----step 32 accuracy: 0.710, auc: 0.782\n",
      "----step 33 accuracy: 0.710, auc: 0.783\n",
      "----step 34 accuracy: 0.710, auc: 0.783\n",
      "----step 35 accuracy: 0.720, auc: 0.785\n",
      "----step 36 accuracy: 0.725, auc: 0.786\n",
      "----step 37 accuracy: 0.725, auc: 0.788\n",
      "----step 38 accuracy: 0.725, auc: 0.789\n",
      "----step 39 accuracy: 0.725, auc: 0.789\n",
      "----step 40 accuracy: 0.725, auc: 0.790\n",
      "----step 41 accuracy: 0.725, auc: 0.791\n",
      "----step 42 accuracy: 0.730, auc: 0.792\n",
      "----step 43 accuracy: 0.730, auc: 0.792\n",
      "----step 44 accuracy: 0.730, auc: 0.793\n",
      "----step 45 accuracy: 0.730, auc: 0.793\n",
      "----step 46 accuracy: 0.735, auc: 0.794\n",
      "----step 47 accuracy: 0.740, auc: 0.795\n",
      "----step 48 accuracy: 0.740, auc: 0.795\n",
      "----step 49 accuracy: 0.740, auc: 0.796\n",
      "----step 50 accuracy: 0.740, auc: 0.797\n",
      "----step 51 accuracy: 0.750, auc: 0.798\n",
      "----step 52 accuracy: 0.760, auc: 0.798\n",
      "----step 53 accuracy: 0.755, auc: 0.799\n",
      "----step 54 accuracy: 0.750, auc: 0.800\n",
      "----step 55 accuracy: 0.750, auc: 0.801\n",
      "----step 56 accuracy: 0.750, auc: 0.801\n",
      "----step 57 accuracy: 0.750, auc: 0.802\n",
      "----step 58 accuracy: 0.755, auc: 0.803\n",
      "----step 59 accuracy: 0.750, auc: 0.803\n",
      "----step 60 accuracy: 0.750, auc: 0.803\n",
      "----step 61 accuracy: 0.750, auc: 0.803\n",
      "----step 62 accuracy: 0.750, auc: 0.803\n",
      "----step 63 accuracy: 0.750, auc: 0.804\n",
      "----step 64 accuracy: 0.750, auc: 0.804\n",
      "----step 65 accuracy: 0.745, auc: 0.805\n",
      "----step 66 accuracy: 0.745, auc: 0.804\n",
      "----step 67 accuracy: 0.755, auc: 0.805\n",
      "----step 68 accuracy: 0.760, auc: 0.805\n",
      "----step 69 accuracy: 0.760, auc: 0.806\n",
      "----step 70 accuracy: 0.770, auc: 0.806\n",
      "----step 71 accuracy: 0.770, auc: 0.806\n",
      "----step 72 accuracy: 0.770, auc: 0.807\n",
      "----step 73 accuracy: 0.775, auc: 0.807\n",
      "----step 74 accuracy: 0.775, auc: 0.808\n",
      "----step 75 accuracy: 0.770, auc: 0.809\n",
      "----step 76 accuracy: 0.760, auc: 0.808\n",
      "----step 77 accuracy: 0.760, auc: 0.808\n",
      "----step 78 accuracy: 0.755, auc: 0.808\n",
      "----step 79 accuracy: 0.755, auc: 0.809\n",
      "----step 80 accuracy: 0.755, auc: 0.809\n",
      "----step 81 accuracy: 0.755, auc: 0.810\n",
      "----step 82 accuracy: 0.760, auc: 0.811\n",
      "----step 83 accuracy: 0.760, auc: 0.811\n",
      "----step 84 accuracy: 0.760, auc: 0.811\n",
      "----step 85 accuracy: 0.760, auc: 0.812\n",
      "----step 86 accuracy: 0.760, auc: 0.811\n",
      "----step 87 accuracy: 0.760, auc: 0.812\n",
      "----step 88 accuracy: 0.760, auc: 0.812\n",
      "----step 89 accuracy: 0.760, auc: 0.813\n",
      "----step 90 accuracy: 0.760, auc: 0.813\n",
      "----step 91 accuracy: 0.760, auc: 0.814\n",
      "----step 92 accuracy: 0.760, auc: 0.814\n",
      "----step 93 accuracy: 0.760, auc: 0.814\n",
      "----step 94 accuracy: 0.765, auc: 0.814\n",
      "----step 95 accuracy: 0.765, auc: 0.814\n",
      "----step 96 accuracy: 0.765, auc: 0.815\n",
      "----step 97 accuracy: 0.765, auc: 0.815\n",
      "----step 98 accuracy: 0.765, auc: 0.816\n",
      "----step 99 accuracy: 0.765, auc: 0.816\n",
      "----step100 accuracy: 0.770, auc: 0.816\n",
      "----step101 accuracy: 0.770, auc: 0.816\n",
      "----step102 accuracy: 0.770, auc: 0.816\n",
      "----step103 accuracy: 0.770, auc: 0.816\n",
      "----step104 accuracy: 0.775, auc: 0.815\n",
      "----step105 accuracy: 0.775, auc: 0.816\n",
      "----step106 accuracy: 0.780, auc: 0.816\n",
      "----step107 accuracy: 0.780, auc: 0.816\n",
      "----step108 accuracy: 0.780, auc: 0.816\n",
      "----step109 accuracy: 0.780, auc: 0.817\n",
      "----step110 accuracy: 0.780, auc: 0.817\n",
      "----step111 accuracy: 0.780, auc: 0.817\n",
      "----step112 accuracy: 0.780, auc: 0.817\n",
      "----step113 accuracy: 0.790, auc: 0.818\n",
      "----step114 accuracy: 0.785, auc: 0.817\n",
      "----step115 accuracy: 0.785, auc: 0.817\n",
      "----step116 accuracy: 0.785, auc: 0.817\n",
      "----step117 accuracy: 0.780, auc: 0.817\n",
      "----step118 accuracy: 0.775, auc: 0.817\n",
      "----step119 accuracy: 0.775, auc: 0.817\n",
      "----step120 accuracy: 0.775, auc: 0.817\n",
      "----step121 accuracy: 0.775, auc: 0.818\n",
      "----step122 accuracy: 0.775, auc: 0.818\n",
      "----step123 accuracy: 0.775, auc: 0.817\n",
      "----step124 accuracy: 0.775, auc: 0.817\n",
      "----step125 accuracy: 0.775, auc: 0.818\n",
      "----step126 accuracy: 0.775, auc: 0.818\n",
      "----step127 accuracy: 0.775, auc: 0.818\n",
      "----step128 accuracy: 0.775, auc: 0.818\n",
      "----step129 accuracy: 0.775, auc: 0.818\n",
      "----step130 accuracy: 0.775, auc: 0.819\n",
      "----step131 accuracy: 0.775, auc: 0.819\n",
      "----step132 accuracy: 0.775, auc: 0.819\n",
      "----step133 accuracy: 0.775, auc: 0.819\n",
      "----step134 accuracy: 0.775, auc: 0.819\n",
      "----step135 accuracy: 0.775, auc: 0.819\n",
      "----step136 accuracy: 0.775, auc: 0.819\n",
      "----step137 accuracy: 0.775, auc: 0.819\n",
      "----step138 accuracy: 0.775, auc: 0.819\n",
      "----step139 accuracy: 0.780, auc: 0.819\n",
      "----step140 accuracy: 0.780, auc: 0.819\n",
      "----step141 accuracy: 0.780, auc: 0.819\n",
      "----step142 accuracy: 0.780, auc: 0.820\n",
      "----step143 accuracy: 0.785, auc: 0.820\n",
      "----step144 accuracy: 0.785, auc: 0.820\n",
      "----step145 accuracy: 0.785, auc: 0.820\n",
      "----step146 accuracy: 0.790, auc: 0.820\n",
      "----step147 accuracy: 0.790, auc: 0.820\n",
      "----step148 accuracy: 0.790, auc: 0.820\n",
      "----step149 accuracy: 0.790, auc: 0.820\n",
      "----step150 accuracy: 0.785, auc: 0.821\n",
      "----step151 accuracy: 0.785, auc: 0.820\n",
      "----step152 accuracy: 0.785, auc: 0.820\n",
      "----step153 accuracy: 0.785, auc: 0.821\n",
      "----step154 accuracy: 0.785, auc: 0.821\n",
      "----step155 accuracy: 0.785, auc: 0.821\n",
      "----step156 accuracy: 0.785, auc: 0.822\n",
      "----step157 accuracy: 0.780, auc: 0.822\n",
      "----step158 accuracy: 0.775, auc: 0.822\n",
      "----step159 accuracy: 0.775, auc: 0.822\n",
      "----step160 accuracy: 0.775, auc: 0.822\n",
      "----step161 accuracy: 0.775, auc: 0.822\n",
      "----step162 accuracy: 0.775, auc: 0.822\n",
      "----step163 accuracy: 0.775, auc: 0.823\n",
      "----step164 accuracy: 0.775, auc: 0.823\n",
      "----step165 accuracy: 0.775, auc: 0.823\n",
      "----step166 accuracy: 0.775, auc: 0.823\n",
      "----step167 accuracy: 0.775, auc: 0.823\n",
      "----step168 accuracy: 0.775, auc: 0.823\n",
      "----step169 accuracy: 0.775, auc: 0.823\n",
      "----step170 accuracy: 0.775, auc: 0.823\n",
      "----step171 accuracy: 0.775, auc: 0.823\n",
      "----step172 accuracy: 0.775, auc: 0.823\n",
      "----step173 accuracy: 0.775, auc: 0.823\n",
      "----step174 accuracy: 0.775, auc: 0.823\n",
      "----step175 accuracy: 0.775, auc: 0.823\n",
      "----step176 accuracy: 0.775, auc: 0.823\n",
      "----step177 accuracy: 0.775, auc: 0.824\n",
      "----step178 accuracy: 0.775, auc: 0.824\n",
      "----step179 accuracy: 0.775, auc: 0.824\n",
      "----step180 accuracy: 0.775, auc: 0.824\n",
      "----step181 accuracy: 0.775, auc: 0.824\n",
      "----step182 accuracy: 0.775, auc: 0.824\n",
      "----step183 accuracy: 0.775, auc: 0.824\n",
      "----step184 accuracy: 0.775, auc: 0.824\n",
      "----step185 accuracy: 0.775, auc: 0.824\n",
      "----step186 accuracy: 0.775, auc: 0.824\n",
      "----step187 accuracy: 0.775, auc: 0.824\n",
      "----step188 accuracy: 0.775, auc: 0.824\n",
      "----step189 accuracy: 0.775, auc: 0.824\n",
      "----step190 accuracy: 0.775, auc: 0.824\n",
      "----step191 accuracy: 0.775, auc: 0.824\n",
      "----step192 accuracy: 0.775, auc: 0.824\n",
      "----step193 accuracy: 0.775, auc: 0.824\n",
      "----step194 accuracy: 0.775, auc: 0.824\n",
      "----step195 accuracy: 0.775, auc: 0.824\n",
      "----step196 accuracy: 0.770, auc: 0.824\n",
      "----step197 accuracy: 0.770, auc: 0.824\n",
      "----step198 accuracy: 0.770, auc: 0.824\n",
      "----step199 accuracy: 0.770, auc: 0.824\n",
      "----step200 accuracy: 0.770, auc: 0.824\n",
      "----step201 accuracy: 0.770, auc: 0.824\n",
      "----step202 accuracy: 0.770, auc: 0.824\n",
      "----step203 accuracy: 0.770, auc: 0.824\n",
      "----step204 accuracy: 0.770, auc: 0.824\n",
      "----step205 accuracy: 0.770, auc: 0.824\n",
      "----step206 accuracy: 0.770, auc: 0.824\n",
      "----step207 accuracy: 0.770, auc: 0.824\n",
      "----step208 accuracy: 0.770, auc: 0.824\n",
      "----step209 accuracy: 0.770, auc: 0.823\n",
      "----step210 accuracy: 0.770, auc: 0.823\n",
      "----step211 accuracy: 0.770, auc: 0.823\n",
      "----step212 accuracy: 0.770, auc: 0.824\n",
      "----step213 accuracy: 0.770, auc: 0.824\n",
      "----step214 accuracy: 0.770, auc: 0.823\n",
      "----step215 accuracy: 0.770, auc: 0.823\n",
      "----step216 accuracy: 0.770, auc: 0.824\n",
      "----step217 accuracy: 0.770, auc: 0.824\n",
      "----step218 accuracy: 0.770, auc: 0.824\n",
      "----step219 accuracy: 0.770, auc: 0.824\n",
      "----step220 accuracy: 0.770, auc: 0.824\n",
      "----step221 accuracy: 0.770, auc: 0.824\n",
      "----step222 accuracy: 0.770, auc: 0.824\n",
      "----step223 accuracy: 0.770, auc: 0.824\n",
      "----step224 accuracy: 0.775, auc: 0.824\n",
      "----step225 accuracy: 0.780, auc: 0.824\n",
      "----step226 accuracy: 0.780, auc: 0.824\n",
      "----step227 accuracy: 0.780, auc: 0.824\n",
      "----step228 accuracy: 0.780, auc: 0.825\n",
      "----step229 accuracy: 0.780, auc: 0.825\n",
      "----step230 accuracy: 0.780, auc: 0.825\n",
      "----step231 accuracy: 0.780, auc: 0.825\n",
      "----step232 accuracy: 0.780, auc: 0.825\n",
      "----step233 accuracy: 0.780, auc: 0.825\n",
      "----step234 accuracy: 0.780, auc: 0.825\n",
      "----step235 accuracy: 0.780, auc: 0.825\n",
      "----step236 accuracy: 0.780, auc: 0.826\n",
      "----step237 accuracy: 0.780, auc: 0.826\n",
      "----step238 accuracy: 0.780, auc: 0.826\n",
      "----step239 accuracy: 0.780, auc: 0.826\n",
      "----step240 accuracy: 0.780, auc: 0.827\n",
      "----step241 accuracy: 0.780, auc: 0.827\n",
      "----step242 accuracy: 0.780, auc: 0.827\n",
      "----step243 accuracy: 0.780, auc: 0.827\n",
      "----step244 accuracy: 0.780, auc: 0.828\n",
      "----step245 accuracy: 0.780, auc: 0.828\n",
      "----step246 accuracy: 0.785, auc: 0.828\n",
      "----step247 accuracy: 0.790, auc: 0.827\n",
      "----step248 accuracy: 0.785, auc: 0.827\n",
      "----step249 accuracy: 0.785, auc: 0.827\n",
      "----step250 accuracy: 0.785, auc: 0.827\n",
      "----step251 accuracy: 0.785, auc: 0.827\n",
      "----step252 accuracy: 0.785, auc: 0.827\n",
      "----step253 accuracy: 0.785, auc: 0.827\n",
      "----step254 accuracy: 0.785, auc: 0.827\n",
      "----step255 accuracy: 0.790, auc: 0.827\n",
      "----step256 accuracy: 0.790, auc: 0.827\n",
      "----step257 accuracy: 0.790, auc: 0.827\n",
      "----step258 accuracy: 0.785, auc: 0.828\n",
      "----step259 accuracy: 0.785, auc: 0.828\n",
      "----step260 accuracy: 0.785, auc: 0.828\n",
      "----step261 accuracy: 0.785, auc: 0.828\n",
      "----step262 accuracy: 0.785, auc: 0.828\n",
      "----step263 accuracy: 0.785, auc: 0.828\n",
      "----step264 accuracy: 0.785, auc: 0.829\n",
      "----step265 accuracy: 0.785, auc: 0.828\n",
      "----step266 accuracy: 0.785, auc: 0.828\n",
      "----step267 accuracy: 0.785, auc: 0.828\n",
      "----step268 accuracy: 0.785, auc: 0.828\n",
      "----step269 accuracy: 0.785, auc: 0.828\n",
      "----step270 accuracy: 0.785, auc: 0.828\n",
      "----step271 accuracy: 0.790, auc: 0.828\n",
      "----step272 accuracy: 0.790, auc: 0.828\n",
      "----step273 accuracy: 0.790, auc: 0.828\n",
      "----step274 accuracy: 0.790, auc: 0.828\n",
      "----step275 accuracy: 0.790, auc: 0.828\n",
      "----step276 accuracy: 0.790, auc: 0.828\n",
      "----step277 accuracy: 0.790, auc: 0.828\n",
      "----step278 accuracy: 0.790, auc: 0.828\n",
      "----step279 accuracy: 0.790, auc: 0.828\n",
      "----step280 accuracy: 0.790, auc: 0.828\n",
      "----step281 accuracy: 0.790, auc: 0.828\n",
      "----step282 accuracy: 0.790, auc: 0.828\n",
      "----step283 accuracy: 0.790, auc: 0.828\n",
      "----step284 accuracy: 0.790, auc: 0.828\n",
      "----step285 accuracy: 0.790, auc: 0.828\n",
      "----step286 accuracy: 0.790, auc: 0.828\n",
      "----step287 accuracy: 0.790, auc: 0.828\n",
      "----step288 accuracy: 0.790, auc: 0.828\n",
      "----step289 accuracy: 0.790, auc: 0.828\n",
      "----step290 accuracy: 0.790, auc: 0.828\n",
      "----step291 accuracy: 0.790, auc: 0.828\n",
      "----step292 accuracy: 0.790, auc: 0.829\n",
      "----step293 accuracy: 0.790, auc: 0.828\n",
      "----step294 accuracy: 0.790, auc: 0.828\n",
      "----step295 accuracy: 0.790, auc: 0.829\n",
      "----step296 accuracy: 0.790, auc: 0.829\n",
      "----step297 accuracy: 0.790, auc: 0.829\n",
      "----step298 accuracy: 0.790, auc: 0.828\n",
      "----step299 accuracy: 0.790, auc: 0.828\n",
      "----step300 accuracy: 0.790, auc: 0.828\n",
      "----step301 accuracy: 0.790, auc: 0.828\n",
      "----step302 accuracy: 0.790, auc: 0.828\n",
      "----step303 accuracy: 0.790, auc: 0.828\n",
      "----step304 accuracy: 0.790, auc: 0.828\n",
      "----step305 accuracy: 0.790, auc: 0.828\n",
      "----step306 accuracy: 0.790, auc: 0.828\n",
      "----step307 accuracy: 0.790, auc: 0.828\n",
      "----step308 accuracy: 0.790, auc: 0.828\n",
      "----step309 accuracy: 0.790, auc: 0.828\n",
      "----step310 accuracy: 0.790, auc: 0.828\n",
      "----step311 accuracy: 0.790, auc: 0.828\n",
      "----step312 accuracy: 0.790, auc: 0.828\n",
      "----step313 accuracy: 0.790, auc: 0.828\n",
      "----step314 accuracy: 0.790, auc: 0.828\n",
      "----step315 accuracy: 0.790, auc: 0.828\n",
      "----step316 accuracy: 0.790, auc: 0.828\n",
      "----step317 accuracy: 0.790, auc: 0.828\n",
      "----step318 accuracy: 0.790, auc: 0.828\n",
      "----step319 accuracy: 0.790, auc: 0.828\n",
      "----step320 accuracy: 0.790, auc: 0.828\n",
      "----step321 accuracy: 0.790, auc: 0.828\n",
      "----step322 accuracy: 0.790, auc: 0.828\n",
      "----step323 accuracy: 0.790, auc: 0.828\n",
      "----step324 accuracy: 0.790, auc: 0.828\n",
      "----step325 accuracy: 0.790, auc: 0.828\n",
      "----step326 accuracy: 0.790, auc: 0.828\n",
      "----step327 accuracy: 0.790, auc: 0.828\n",
      "----step328 accuracy: 0.790, auc: 0.828\n",
      "----step329 accuracy: 0.790, auc: 0.828\n",
      "----step330 accuracy: 0.790, auc: 0.828\n",
      "----step331 accuracy: 0.790, auc: 0.828\n",
      "----step332 accuracy: 0.790, auc: 0.828\n",
      "----step333 accuracy: 0.790, auc: 0.828\n",
      "----step334 accuracy: 0.790, auc: 0.828\n",
      "----step335 accuracy: 0.790, auc: 0.828\n",
      "----step336 accuracy: 0.790, auc: 0.828\n",
      "----step337 accuracy: 0.790, auc: 0.828\n",
      "----step338 accuracy: 0.790, auc: 0.828\n",
      "----step339 accuracy: 0.790, auc: 0.828\n",
      "----step340 accuracy: 0.790, auc: 0.828\n",
      "----step341 accuracy: 0.790, auc: 0.828\n",
      "----step342 accuracy: 0.790, auc: 0.828\n",
      "----step343 accuracy: 0.790, auc: 0.828\n",
      "----step344 accuracy: 0.790, auc: 0.827\n",
      "----step345 accuracy: 0.790, auc: 0.827\n",
      "----step346 accuracy: 0.790, auc: 0.828\n",
      "----step347 accuracy: 0.790, auc: 0.828\n",
      "----step348 accuracy: 0.790, auc: 0.828\n",
      "----step349 accuracy: 0.790, auc: 0.827\n",
      "----step350 accuracy: 0.790, auc: 0.827\n",
      "----step351 accuracy: 0.790, auc: 0.827\n",
      "----step352 accuracy: 0.790, auc: 0.827\n",
      "----step353 accuracy: 0.790, auc: 0.827\n",
      "----step354 accuracy: 0.790, auc: 0.827\n",
      "----step355 accuracy: 0.790, auc: 0.827\n",
      "----step356 accuracy: 0.790, auc: 0.827\n",
      "----step357 accuracy: 0.790, auc: 0.827\n",
      "----step358 accuracy: 0.790, auc: 0.828\n",
      "----step359 accuracy: 0.790, auc: 0.828\n",
      "----step360 accuracy: 0.790, auc: 0.828\n",
      "----step361 accuracy: 0.790, auc: 0.828\n",
      "----step362 accuracy: 0.790, auc: 0.828\n",
      "----step363 accuracy: 0.790, auc: 0.828\n",
      "----step364 accuracy: 0.790, auc: 0.828\n",
      "----step365 accuracy: 0.790, auc: 0.827\n",
      "----step366 accuracy: 0.790, auc: 0.827\n",
      "----step367 accuracy: 0.790, auc: 0.827\n",
      "----step368 accuracy: 0.790, auc: 0.827\n",
      "----step369 accuracy: 0.790, auc: 0.827\n",
      "----step370 accuracy: 0.790, auc: 0.827\n",
      "----step371 accuracy: 0.790, auc: 0.828\n",
      "----step372 accuracy: 0.790, auc: 0.828\n",
      "----step373 accuracy: 0.790, auc: 0.828\n",
      "----step374 accuracy: 0.790, auc: 0.828\n",
      "----step375 accuracy: 0.790, auc: 0.828\n",
      "----step376 accuracy: 0.790, auc: 0.828\n",
      "----step377 accuracy: 0.790, auc: 0.828\n",
      "----step378 accuracy: 0.790, auc: 0.827\n",
      "----step379 accuracy: 0.790, auc: 0.827\n",
      "----step380 accuracy: 0.790, auc: 0.828\n",
      "----step381 accuracy: 0.790, auc: 0.828\n",
      "----step382 accuracy: 0.790, auc: 0.828\n",
      "----step383 accuracy: 0.790, auc: 0.828\n",
      "----step384 accuracy: 0.790, auc: 0.828\n",
      "----step385 accuracy: 0.790, auc: 0.827\n",
      "----step386 accuracy: 0.790, auc: 0.827\n",
      "----step387 accuracy: 0.790, auc: 0.827\n",
      "----step388 accuracy: 0.790, auc: 0.827\n",
      "----step389 accuracy: 0.790, auc: 0.827\n",
      "----step390 accuracy: 0.790, auc: 0.827\n",
      "----step391 accuracy: 0.790, auc: 0.827\n",
      "----step392 accuracy: 0.790, auc: 0.827\n",
      "----step393 accuracy: 0.790, auc: 0.827\n",
      "----step394 accuracy: 0.790, auc: 0.827\n",
      "----step395 accuracy: 0.790, auc: 0.827\n",
      "----step396 accuracy: 0.790, auc: 0.827\n",
      "----step397 accuracy: 0.790, auc: 0.827\n",
      "----step398 accuracy: 0.790, auc: 0.827\n",
      "----step399 accuracy: 0.785, auc: 0.827\n"
     ]
    }
   ],
   "source": [
    "from sklearn import metrics\n",
    "import tensorflow.compat.v1 as tf  # TF1-style graph/session API\n",
    "import numpy as np\n",
    "\n",
    "tf.disable_v2_behavior()\n",
    "\n",
    "# evaluation\n",
    "def get_auc(y, y_pre):\n",
    "    fpr, tpr, thresholds = metrics.roc_curve(y.astype(int), y_pre, pos_label=1)\n",
    "    return metrics.auc(fpr, tpr)\n",
    "\n",
    "# hyper-parameters\n",
    "vector_dim = 8\n",
    "learning_rate = 1e-4\n",
    "l2_factor = 1e-2\n",
    "max_training_step = 400\n",
    "train_rate = 0.8\n",
    "\n",
    "# split data into train/test\n",
    "data = np.loadtxt(fname='data', delimiter='\\t')\n",
    "threshold = int(train_rate * len(data))\n",
    "x_train = data[:threshold, :-1]\n",
    "y_train = data[:threshold, -1]\n",
    "x_test = data[threshold:, :-1]\n",
    "y_test = data[threshold:, -1]\n",
    "feature_num = len(data[0]) - 1\n",
    "\n",
    "# construct graph\n",
    "# model parameters\n",
    "w_0 = tf.Variable(0.0)\n",
    "w = tf.Variable(tf.zeros(shape=[feature_num]))\n",
    "v = tf.Variable(tf.random.truncated_normal(shape=[feature_num, vector_dim], mean=0.0, stddev=0.01))\n",
    "\n",
    "# construct loss\n",
    "x = tf.placeholder(shape=[None, feature_num], dtype=tf.float32)\n",
    "y = tf.placeholder(shape=[None], dtype=tf.float32)\n",
    "\n",
    "# linear part: w_0 + sum_i w_i x_i\n",
    "linear_term = w_0 + tf.reduce_sum(tf.expand_dims(w, axis=0) * x, axis=1)\n",
    "# pairwise part via the O(kn) identity: 0.5 * sum_f ((sum_i v_if x_i)^2 - sum_i v_if^2 x_i^2)\n",
    "square_of_sum = tf.square(tf.reduce_sum(tf.expand_dims(x, axis=2) * tf.expand_dims(v, axis=0), axis=1))\n",
    "sum_of_square = tf.reduce_sum(tf.square(tf.expand_dims(v, axis=0) * tf.expand_dims(x, axis=2)), axis=1)\n",
    "\n",
    "y_pre = tf.sigmoid(linear_term + 0.5 * tf.reduce_sum(square_of_sum - sum_of_square, axis=1))\n",
    "\n",
    "# per-example log loss; minimize() takes gradients of the summed loss plus L2 regularization\n",
    "cross_entropy = - y * tf.log(y_pre) - (1 - y) * tf.log(1 - y_pre)\n",
    "train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(\n",
    "    cross_entropy + l2_factor * tf.add_n([tf.nn.l2_loss(item) for item in [w_0, w, v]]))\n",
    "\n",
    "accuracy = tf.reduce_mean(tf.cast(tf.less(tf.abs(y - y_pre), 0.5), dtype=tf.float32))\n",
    "\n",
    "with tf.Session() as sess:\n",
    "    sess.run(tf.global_variables_initializer())\n",
    "    for step in range(max_training_step):\n",
    "        sess.run(train_op, {x: x_train, y: y_train})\n",
    "        acc, y_pre_value = sess.run([accuracy, y_pre], {x: x_test, y: y_test})\n",
    "        print('----step%3d accuracy: %.3f, auc: %.3f' % (step, acc, get_auc(y_test, y_pre_value)))"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 1
}