{"cells":[{"cell_type":"markdown","metadata":{},"source":["# AB-test\n","\n","### 1.检验指标确定\n","\n","+ 一类指标 -- 不出现显著下降\n","\n","  + pv，活跃时长，留存，人均消费，用户访问深度，评价，客户满意度\n","\n","  + 题目中可以使用 点击转化率 和 每小时用户数\n","\n","+ 二类指标 -- 得到提升\n","\n","  + 点击通过率，广告位下单转化，广告主消耗，客户满意度\n","\n","  + 项目中使用 CLR作为评估值(点击通过率)\n","\n","\n","### 2 确定检验统计量\n","\n","首先1212活动流量较大（满足Z检验需求）\n","\n","\n","\n","+ 二类指标 -- 得到提升 -- 且两组数据必定存在差异(CTR)\n","\n","    $ P_a - P_b $ 双样本\n","\n","    但由于数据的特殊性,转化率 $ mu_{converted} = p_{converted} $\n","\n","\n","+ 一类指标 -- 不出现显著下降\n","\n","    + 日访问量不出现显著下降 $ \\mu_{user} $\n","\n","    $ \\mu_{control} - \\mu_{treatment} < 0.3 $ \n","\n","    + 二类指标对应值 $ p_{converted} $\n","\n","    $ p_{control} - p_{treatment} < 0.3 $ \n","\n","\n","### 3 埋点\n","\n","| 事件类型   | 属性名称        | 属性说明                            |\n","| :----------: | :---------------: | :-----------------------------------: |\n","| 用户信息   | uid             | 用户id                              |\n","| 用户设备   | ip_address      | IP地址                              |\n","| 用户设备   | device_id       | 设备ID                              |\n","| 用户设备   | device_type     | 设备类型（windows，mac，Linux)      |\n","| 用户设备   | Browser         | 浏览器类型（IE，firefox，Chrome...) |\n","| banner交互 | is_click        | 是否点击通过                        |\n","| banner交互 | is_show         | 是否显示                            |\n","| 页面信息   | backward        | 前置页(来源页）                     |\n","| 页面信息   | current_page_id | 当前所在页面（案例需要则均为首页）  |\n","| 页面信息   | time_spend      | 页面停留多久后点击                  |\n","| 页面信息   | click_time      | 触发事件的时间                      |\n",""]},{"cell_type":"code","execution_count":1,"metadata":{},"outputs":[],"source":["import pandas as pd\n","from scipy import stats\n","import itertools\n",""]},{"cell_type":"code","execution_count":2,"metadata":{},"outputs":[{"output_type":"stream","name":"stdout","text":["<class 'pandas.core.frame.DataFrame'>\nInt64Index: 294478 entries, 131228 to 193652\nData columns (total 7 columns):\n #   Column        Non-Null Count   Dtype         \n---  ------        --------------   -----         \n 0   user_id       294478 non-null  int64         \n 1   timestamp     294478 non-null  datetime64[ns]\n 2   group         294478 non-null  object        \n 3   landing_page  294478 non-null  object        \n 4   converted     294478 non-null  int64         \n 5   date          294478 non-null  object        \n 6   hour          294478 non-null  int64         \ndtypes: datetime64[ns](1), int64(3), object(3)\nmemory usage: 18.0+ MB\n"]},{"output_type":"execute_result","data":{"text/plain":["        user_id                  timestamp      group landing_page  converted  \\\n","131228   922696 2017-01-02 13:42:05.378582  treatment     new_page          0   \n","184884   781507 2017-01-02 13:42:15.234051    control     old_page          0   \n","83878    737319 2017-01-02 13:42:21.786186    control     old_page          0   \n","102717   818377 2017-01-02 13:42:26.640581  treatment     new_page          0   \n","158789   725857 2017-01-02 13:42:27.851110  treatment     new_page          0   \n","...         ...                        ...        ...          ...        ...   \n","153305   851645 2017-01-24 13:41:18.869978  treatment     old_page          0   \n","47535    808330 2017-01-24 13:41:19.152664    control     old_page          0   \n","157787   843121 2017-01-24 13:41:44.097174  treatment     new_page          0   \n","179072   836373 2017-01-24 13:41:52.604673    control     old_page          0   \n","193652   920411 2017-01-24 13:41:54.460509    control     old_page          0   \n","\n","              date  hour  \n","131228  2017-01-02    13  \n","184884  2017-01-02    13  \n","83878   2017-01-02    13  \n","102717  2017-01-02    13  \n","158789  2017-01-02    13  \n","...            ...   ...  \n","153305  2017-01-24    13  \n","47535   2017-01-24    13  \n","157787  2017-01-24    13  \n","179072  2017-01-24    13  \n","193652  2017-01-24    13  \n","\n","[294478 rows x 7 columns]"],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>user_id</th>\n      <th>timestamp</th>\n      <th>group</th>\n      <th>landing_page</th>\n      <th>converted</th>\n      <th>date</th>\n      <th>hour</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>131228</th>\n      <td>922696</td>\n      <td>2017-01-02 13:42:05.378582</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>184884</th>\n      <td>781507</td>\n      <td>2017-01-02 13:42:15.234051</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>83878</th>\n      <td>737319</td>\n      <td>2017-01-02 13:42:21.786186</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>102717</th>\n      <td>818377</td>\n      <td>2017-01-02 13:42:26.640581</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>158789</th>\n      <td>725857</td>\n      <td>2017-01-02 13:42:27.851110</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>...</th>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n    </tr>\n    <tr>\n      <th>153305</th>\n      <td>851645</td>\n      <td>2017-01-24 13:41:18.869978</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>47535</th>\n      <td>808330</td>\n      <td>2017-01-24 13:41:19.152664</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>157787</th>\n      <td>843121</td>\n      <td>2017-01-24 13:41:44.097174</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>179072</th>\n      <td>836373</td>\n      <td>2017-01-24 13:41:52.604673</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>193652</th>\n      <td>920411</td>\n      <td>2017-01-24 13:41:54.460509</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n  </tbody>\n</table>\n<p>294478 rows × 7 columns</p>\n</div>"},"metadata":{},"execution_count":2}],"source":["df = pd.read_csv(\"ab_data.csv\")\n","df[\"timestamp\"] = pd.to_datetime(df[\"timestamp\"])\n","\n","df[\"date\"] = df[\"timestamp\"].dt.date\n","df[\"hour\"] = df[\"timestamp\"].dt.hour\n","\n","df.sort_values(\"timestamp\", inplace=True)\n","\n","df.info()\n","df"]},{"cell_type":"markdown","metadata":{},"source":["### 4 $ H_0 H_1 $确认\n","\n","#### 二类指标假设 clr\n","\n","$ H_0 $ banner a ($ p_A $) 与 banner b ($ p_B $) $ H_0 : p_A - p_B \\ge 0.005 $\n","\n","$ H_1 $  banner a ($ p_A $) 与 banner b ($ p_B $)  $ H_0 : p_A - p_b \\lt 0.005 $\n","\n","\n","因此为其设定一个差异值 $ \\delta $ 0.005 即  $ H_1 : p_A - p_b - \\delta \\lt 0 $ or  $ H_0 : p_A - p_b - \\delta \\ge 0 $\n","\n","\n","#### 一类指标假设\n","\n","##### 流量计算\n","\n","$ \\mu_0 = \\mu_{PV_or_clr} = \\frac{\\sum_{k=1}^{days} {daily_hour_PV_k}}{daily_hours}  $\n","\n","过去日时均访问量,案例中由于缺乏历史数据,使用总体数据\n","\n","$ mu_h = mu_{n} = \\frac{\\sum_{k=1}^hour {PV_k}}{24} $ 即当日时均\n","\n","当:\n","\n","$ H_0 $ banner_a_pv ($ \\mu_A $) 与 banner_b_pv ($ \\mu_B $) $ H_0 : \\mu_A - \\mu_B \\ge \\mu * -0.02 $\n","\n","$ H_1 $  banner_a_pv ($ \\mu_A $) 与 banner_b_pv ($ \\mu_B $)  $ H_0 : \\mu_A - \\mu_b \\lt \\mu * -0.02 $\n","\n","因此为其设定一个差异值 $ \\delta $ 0.02 即  $ H_1 : \\mu_A - \\mu_b - \\delta \\lt 0 $ or  $ H_0 : \\mu_A - \\mu_b - \\delta \\ge 0 $\n","\n","##### clr\n","\n","$ H_0 $ banner a ($ p_A $) 与 banner b ($ p_B $) $ H_0 : p_A - p_B \\ge -0.004 $\n","\n","$ H_1 $  banner a ($ p_A $) 与 banner b ($ p_B $)  $ H_0 : p_A - p_b \\lt -0.004 $\n","\n","\n","因此为其设定一个差异值 $ \\delta $ -0.004 即  $ H_1 : p_A - p_b - \\delta \\lt 0 $ or  $ H_0 : p_A - p_b - \\delta \\ge 0 $\n","\n",""]},{"cell_type":"code","execution_count":3,"metadata":{},"outputs":[{"output_type":"stream","name":"stdout","text":[" \n    pn = mun = 0.11965919355605512\n    mu_std = 0.32456313511492285\n    pvar = 0.10534087095356966\n    \n"]}],"source":["mun = df[\"converted\"].mean()\n","n_std = df[\"converted\"].std()\n","pvar = mun * (1 - mun)\n","print(\n","    \"\"\" \n","    pn = mun = {}\n","    mu_std = {}\n","    pvar = {}\n","    \"\"\".format(\n","        mun, n_std, pvar\n","    )\n",")\n","\n",""]},{"cell_type":"markdown","metadata":{},"source":["### 5 显著性水平\n","\n","计算二类样本数\n","\n","双样本比较的情况所需要的样本总数\n","\n","+ alpha = 0.05 -- 接受$ H_\\alpha $的置信度\n","\n","+ beta = 0.2 -- 接受$ H_\\beta $的效度\n",""]},{"cell_type":"markdown","metadata":{},"source":["### 6 计算样本量\n","\n","#### CRL 一/二类样本数计算\n","[公式参考网站](http://powerandsamplesize.com/Calculators/Compare-2-Means/2-Sample-Non-Inferiority-or-Superiority)\n","\n","\n","$ 1-\\alpha $ = z_alpha = stats.norm.ppf(1 - alpha)\n","其中 $ p_A-p_B = 0.05 $ 即bannerA效果比bannerB效果好5%\n","\n","$ \\frac{p_A(1-p_A)}{\\kappa}+p_B(1-p_B) $ 替换为总体样本方差 即 $ 2p(1-p) $\n","\n","$ pvar = 2p_n(1-p_n) $ 为各组的方差\n","\n","$$\n","n_A=\\kappa n_B \\;\\text{ and }\\;\n","\t\tn_B=\\left(\\frac{p_A(1-p_A)}{\\kappa}+p_B(1-p_B)\\right)\n","\t\t\t\\left(\\frac{z_{1-\\alpha}+z_{1-\\beta}}{p_A-p_B-\\delta}\\right)^2\n","\n","=2p(1-p)(\\frac{z_{1-\\alpha}+z_{1-\\beta}}{p_{diff}})^2\n","\n","$$\n","\n","z_beta = stats.norm.ppf(1 - beta)\n","\n","$$\n","z = \\frac{|p_A-p_B|}{\\sqrt{\\frac{p_A(1-p_A)}{n_A}+\\frac{p_B(1-p_B)}{n_B}}} \n","= \\frac{ \\vert p_{diff} \\vert }{ \\sqrt{ 2\\frac{\\pi(1-\\pi)}{n}}}\n","$$\n","\n","$$\n","1-\\beta=\\Phi\\left(\\frac{|p_A-p_B|}{\\sqrt{\\frac{p_A(1-p_A)}{n_A}+\\frac{p_B(1-p_B)}{n_B}}}-z_{1-\\alpha}\\right)\n","=\\Phi(z-z_{1-\\alpha})\n","$$\n",""]},{"cell_type":"code","execution_count":4,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["({'alpha': 0.05,\n","  'beta': 0.2,\n","  'p_diff': 0.015,\n","  'z_alpha': 1.6448536269514722,\n","  'z_beta': 0.8416212335729143,\n","  'samples': 5820.02399967782,\n","  'power': 0.2799619204078084},\n"," {'alpha': 0.05,\n","  'beta': 0.2,\n","  'p_diff': -0.02,\n","  'z_alpha': 1.6448536269514722,\n","  'z_beta': 0.8416212335729143,\n","  'samples': 3273.763499818773,\n","  'power': 0.2799619204078085})"]},"metadata":{},"execution_count":4}],"source":["\n","\n","def samples_cal_Proportions_2n_1size(percent, p_diff=1, alpha=0.05, beta=0.2):\n","    z_alpha = stats.norm.ppf(1 - alpha)\n","    z_beta = stats.norm.ppf(1 - beta)\n","    pvar = percent * (1 - percent)\n","    Na = 2 * pvar * ((z_alpha + z_beta) / (p_diff)) ** 2  # 双样本 比例 双侧\n","    power = stats.norm.pdf(abs(p_diff) / (2 * pvar / Na) ** (1 / 2) - z_alpha)\n","\n","    return {\n","        \"alpha\": alpha,\n","        \"beta\": beta,\n","        \"p_diff\": p_diff,\n","        \"z_alpha\": z_alpha,\n","        \"z_beta\": z_beta,\n","        \"samples\": Na,\n","        \"power\": power,\n","    }\n","\n","\n","dfgb = df.groupby(by=[\"group\"])[\"converted\"].agg([\"count\", \"mean\"]).reset_index()\n","percent = dfgb.loc[dfgb[\"group\"] == \"control\", \"mean\"].values[0]\n","# 先使用控制组样本（假装是历史数据）\n","\n","type_II_clr = samples_cal_Proportions_2n_1size(\n","    percent=percent, p_diff=0.015\n",")  # 当p_crl1 - p_crl0 = 0.5% 的时候所需要的样本数\n","type_I_clr = samples_cal_Proportions_2n_1size(\n","    percent=percent, p_diff=-0.02\n",")  # 当p_crl1 - p_crl0 = -0.3% 的时候所需要的样本数\n","\n","type_II_clr, type_I_clr\n","\n",""]},{"cell_type":"markdown","metadata":{},"source":["#### 日小时均流量 一类指标样本计算\n","\n","[公式参考网站](http://powerandsamplesize.com/Calculators/Compare-2-Means/2-Sample-1-Sided)\n","\n","$ H_0:\\mu_A-\\mu_B = 0 $\n","\n","$ H_1: \\vert \\mu_A-\\mu_B \\vert > 0 $\n","\n","$ \\mu-\\mu_0 = {\\mu_{diff}} $\n","\n","$$\n","n_A=\\left(\\sigma_A^2+\\sigma_B^2/\\kappa\\right)\\left(\\frac{z_{1-\\alpha}+z_{1-\\beta}}{\\mu_A-\\mu_B}\\right)^2\n","\n","=(2\\sigma^2)(\\frac{z_{1-\\alpha}+z_{1-\\beta}}{\\mu_{diff}})^2\n","$$\n","\n","z 公式与改写\n","\n","$$\n","z = \\frac{|\\mu_A-\\mu_B|\\sqrt{n_A}}{\\sqrt{\\sigma_A^2+\\sigma_B^2/\\kappa}}\n","= \\frac{|\\mu_{diff}|\\sqrt{n_A}}{\\sqrt{2\\sigma^2}}\n","$$\n","\n","power\n","\n","$$\n","1-\\beta=\\Phi\\left(\\frac{|\\mu_A-\\mu_B|\\sqrt{n_A}}{\\sqrt{\\sigma_A^2+\\sigma_B^2/\\kappa}}-z_{1-\\alpha}\\right)\n","= \\Phi\\left(z-z_{1-\\alpha}\\right)\n","$$\n",""]},{"cell_type":"code","execution_count":5,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["            date  hour      group landing_page  user_id  converted\n","0     2017-01-02    13    control     new_page        1          0\n","1     2017-01-02    13    control     old_page       83         11\n","2     2017-01-02    13  treatment     new_page       87          9\n","3     2017-01-02    14    control     new_page        8          1\n","4     2017-01-02    14    control     old_page      272         39\n","...          ...   ...        ...          ...      ...        ...\n","2083  2017-01-24    12  treatment     old_page        1          0\n","2084  2017-01-24    13    control     new_page        2          0\n","2085  2017-01-24    13    control     old_page      174         22\n","2086  2017-01-24    13  treatment     new_page      188         20\n","2087  2017-01-24    13  treatment     old_page        2          0\n","\n","[2088 rows x 6 columns]"],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>date</th>\n      <th>hour</th>\n      <th>group</th>\n      <th>landing_page</th>\n      <th>user_id</th>\n      <th>converted</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>2017-01-02</td>\n      <td>13</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>1</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>2017-01-02</td>\n      <td>13</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>83</td>\n      <td>11</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>2017-01-02</td>\n      <td>13</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>87</td>\n      <td>9</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>2017-01-02</td>\n      <td>14</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>8</td>\n      <td>1</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>2017-01-02</td>\n      <td>14</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>272</td>\n      <td>39</td>\n    </tr>\n    <tr>\n      <th>...</th>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n    </tr>\n    <tr>\n      <th>2083</th>\n      <td>2017-01-24</td>\n      <td>12</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>1</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>2084</th>\n      <td>2017-01-24</td>\n      <td>13</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>2</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>2085</th>\n      <td>2017-01-24</td>\n      <td>13</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>174</td>\n      <td>22</td>\n    </tr>\n    <tr>\n      <th>2086</th>\n      <td>2017-01-24</td>\n      <td>13</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>188</td>\n      <td>20</td>\n    </tr>\n    <tr>\n      <th>2087</th>\n      <td>2017-01-24</td>\n      <td>13</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>2</td>\n      <td>0</td>\n    </tr>\n  </tbody>\n</table>\n<p>2088 rows × 6 columns</p>\n</div>"},"metadata":{},"execution_count":5}],"source":["\n","dfgbuser = df.groupby(by=[\"date\", \"hour\", \"group\", \"landing_page\"]).agg(\n","    {\"user_id\": \"count\", \"converted\": \"sum\"}\n",")\n","dfgbuser.reset_index(inplace=True)\n","dfgbuser"]},{"cell_type":"code","execution_count":6,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["{'samples': 2.600982318236189,\n"," 'z': 2.0050387802961995,\n"," 'power': 0.37388567670728995}"]},"metadata":{},"execution_count":6}],"source":["\n","\n","def samples_cal_mean_2n_1size(data, mu_diff_percent=-0.02, alpha=0.05, beta=0.2):\n","    daily_hour_avg_pv = dfgbuser[dfgbuser[\"group\"] == \"control\"].groupby(\"date\")[\n","        \"user_id\"\n","    ]\n","    daily_hour_avg_pv.mean()\n","    z_alpha = stats.norm.ppf(1 - alpha)\n","    z_beta = stats.norm.ppf(1 - beta)\n","\n","    sigma = data.std()\n","    mu_diff = data.mean() * mu_diff_percent\n","\n","    type_I_dau = (2 * sigma ** 2) * ((z_alpha + z_beta) / (mu_diff)) ** 2\n","\n","    z = (mu_diff) / (2 * sigma ** 2) ** (1 / 2) * (type_I_dau) * (1 / 2)\n","\n","    power = stats.norm.pdf(z - stats.norm.ppf(1 - alpha))\n","\n","    return {\"samples\": type_I_dau, \"z\": z, \"power\": power}\n","\n","\n","daily_hour_avg_pv = dfgbuser[dfgbuser[\"group\"] == \"control\"].groupby(\"date\")[\"user_id\"]\n","daily_hour_avg_pv.mean()\n","type_I_mean = samples_cal_mean_2n_1size(daily_hour_avg_pv.mean(), mu_diff_percent=0.05)\n","type_I_mean\n",""]},{"cell_type":"markdown","metadata":{},"source":["### 指标假设检验\n","\n","#### 样本分组和样本数检测\n",""]},{"cell_type":"code","execution_count":7,"metadata":{},"outputs":[],"source":["def sample_number_check(data, chack_num=1):\n","    sample_check = []\n","    for k, v in data.items():\n","        print(k, v)\n","        print(k, v, \">\", chack_num, v > chack_num)\n","        if v > chack_num:\n","            sample_check.append(k)\n","    return sample_check\n","\n",""]},{"cell_type":"markdown","metadata":{},"source":["##### 点击率问题 $ p_A - p_B $\n","\n","双样本比较采用\n","$$\n","z = \\cfrac{p_A(1-p_A)}{n_A} + \\cfrac{p_B(1-p_B)}{n_B}\n","$$\n","\n","其中  $ \\cfrac{\\pi(1-\\pi)}{n} $ 为单个样本的方差"]},{"cell_type":"code","execution_count":8,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["        user_id                  timestamp      group landing_page  converted  \\\n","131228   922696 2017-01-02 13:42:05.378582  treatment     new_page          0   \n","184884   781507 2017-01-02 13:42:15.234051    control     old_page          0   \n","83878    737319 2017-01-02 13:42:21.786186    control     old_page          0   \n","102717   818377 2017-01-02 13:42:26.640581  treatment     new_page          0   \n","158789   725857 2017-01-02 13:42:27.851110  treatment     new_page          0   \n","...         ...                        ...        ...          ...        ...   \n","153305   851645 2017-01-24 13:41:18.869978  treatment     old_page          0   \n","47535    808330 2017-01-24 13:41:19.152664    control     old_page          0   \n","157787   843121 2017-01-24 13:41:44.097174  treatment     new_page          0   \n","179072   836373 2017-01-24 13:41:52.604673    control     old_page          0   \n","193652   920411 2017-01-24 13:41:54.460509    control     old_page          0   \n","\n","              date  hour  \n","131228  2017-01-02    13  \n","184884  2017-01-02    13  \n","83878   2017-01-02    13  \n","102717  2017-01-02    13  \n","158789  2017-01-02    13  \n","...            ...   ...  \n","153305  2017-01-24    13  \n","47535   2017-01-24    13  \n","157787  2017-01-24    13  \n","179072  2017-01-24    13  \n","193652  2017-01-24    13  \n","\n","[294478 rows x 7 columns]"],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>user_id</th>\n      <th>timestamp</th>\n      <th>group</th>\n      <th>landing_page</th>\n      <th>converted</th>\n      <th>date</th>\n      <th>hour</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>131228</th>\n      <td>922696</td>\n      <td>2017-01-02 13:42:05.378582</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>184884</th>\n      <td>781507</td>\n      <td>2017-01-02 13:42:15.234051</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>83878</th>\n      <td>737319</td>\n      <td>2017-01-02 13:42:21.786186</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>102717</th>\n      <td>818377</td>\n      <td>2017-01-02 13:42:26.640581</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>158789</th>\n      <td>725857</td>\n      <td>2017-01-02 13:42:27.851110</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>0</td>\n      <td>2017-01-02</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>...</th>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n    </tr>\n    <tr>\n      <th>153305</th>\n      <td>851645</td>\n      <td>2017-01-24 13:41:18.869978</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>47535</th>\n      <td>808330</td>\n      <td>2017-01-24 13:41:19.152664</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>157787</th>\n      <td>843121</td>\n      <td>2017-01-24 13:41:44.097174</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>179072</th>\n      <td>836373</td>\n      <td>2017-01-24 13:41:52.604673</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n    <tr>\n      <th>193652</th>\n      <td>920411</td>\n      <td>2017-01-24 13:41:54.460509</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>0</td>\n      <td>2017-01-24</td>\n      <td>13</td>\n    </tr>\n  </tbody>\n</table>\n<p>294478 rows × 7 columns</p>\n</div>"},"metadata":{},"execution_count":8}],"source":["df\n"]},{"cell_type":"code","execution_count":9,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["          date  count_control  mean_control  var_control  sum_control  \\\n","0   2017-01-03           6590      0.113809     0.100872          750   \n","1   2017-01-04           6578      0.121922     0.107073          802   \n","2   2017-01-05           6427      0.123230     0.108061          792   \n","3   2017-01-06           6606      0.115350     0.102060          762   \n","4   2017-01-07           6604      0.120987     0.106365          799   \n","5   2017-01-08           6687      0.118887     0.104769          795   \n","6   2017-01-09           6628      0.119644     0.105345          793   \n","7   2017-01-10           6654      0.112864     0.100141          751   \n","8   2017-01-11           6688      0.118870     0.104755          795   \n","9   2017-01-12           6522      0.122048     0.107169          796   \n","10  2017-01-13           6552      0.116911     0.103258          766   \n","11  2017-01-14           6548      0.126756     0.110706          830   \n","12  2017-01-15           6714      0.120494     0.105991          809   \n","13  2017-01-16           6591      0.121833     0.107006          803   \n","14  2017-01-17           6617      0.122865     0.107786          813   \n","15  2017-01-18           6482      0.124807     0.109247          809   \n","16  2017-01-19           6578      0.119945     0.105574          789   \n","17  2017-01-20           6534      0.115243     0.101978          753   \n","18  2017-01-21           6749      0.125945     0.110099          850   \n","19  2017-01-22           6596      0.119163     0.104979          786   \n","20  2017-01-23           6716      0.125670     0.109893          844   \n","\n","    count_treatment  mean_treatment  var_treatment  sum_treatment  \n","0              6618        0.113781       0.100850            753  \n","1              6541        0.116649       0.103058            763  \n","2              6505        0.114988       0.101782            748  \n","3              6747        0.123462       0.108235            833  \n","4              6609        0.116205       0.102717            768  \n","5              6700        0.120746       0.106182            809  \n","6              6615        0.118065       0.104141            781  \n","7              6696        0.126344       0.110398            846  \n","8              6673        0.115091       0.101860            768  \n","9              6637        0.122344       0.107392            812  \n","10             6508        0.111248       0.098887            724  \n","11             6600        0.119242       0.105040            787  \n","12             6549        0.113452       0.100596            743  \n","13             6545        0.119175       0.104988            780  \n","14             6538        0.127256       0.111079            832  \n","15             6603        0.124792       0.109235            824  \n","16             6552        0.117216       0.103492            768  \n","17             6679        0.117682       0.103849            786  \n","18             6560        0.115701       0.102330            759  \n","19             6669        0.118009       0.104098            787  \n","20             6633        0.121061       0.106422            803  "],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>date</th>\n      <th>count_control</th>\n      <th>mean_control</th>\n      <th>var_control</th>\n      <th>sum_control</th>\n      <th>count_treatment</th>\n      <th>mean_treatment</th>\n      <th>var_treatment</th>\n      <th>sum_treatment</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>2017-01-03</td>\n      <td>6590</td>\n      <td>0.113809</td>\n      <td>0.100872</td>\n      <td>750</td>\n      <td>6618</td>\n      <td>0.113781</td>\n      <td>0.100850</td>\n      <td>753</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>2017-01-04</td>\n      <td>6578</td>\n      <td>0.121922</td>\n      <td>0.107073</td>\n      <td>802</td>\n      <td>6541</td>\n      <td>0.116649</td>\n      <td>0.103058</td>\n      <td>763</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>2017-01-05</td>\n      <td>6427</td>\n      <td>0.123230</td>\n      <td>0.108061</td>\n      <td>792</td>\n      <td>6505</td>\n      <td>0.114988</td>\n      <td>0.101782</td>\n      <td>748</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>2017-01-06</td>\n      <td>6606</td>\n      <td>0.115350</td>\n      <td>0.102060</td>\n      <td>762</td>\n      <td>6747</td>\n      <td>0.123462</td>\n      <td>0.108235</td>\n      <td>833</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>2017-01-07</td>\n      <td>6604</td>\n      <td>0.120987</td>\n      <td>0.106365</td>\n      <td>799</td>\n      <td>6609</td>\n      <td>0.116205</td>\n      <td>0.102717</td>\n      <td>768</td>\n    </tr>\n    <tr>\n      <th>5</th>\n      <td>2017-01-08</td>\n      <td>6687</td>\n      <td>0.118887</td>\n      <td>0.104769</td>\n      <td>795</td>\n      <td>6700</td>\n      <td>0.120746</td>\n      <td>0.106182</td>\n      <td>809</td>\n    </tr>\n    <tr>\n      <th>6</th>\n      <td>2017-01-09</td>\n      <td>6628</td>\n      <td>0.119644</td>\n      <td>0.105345</td>\n      <td>793</td>\n      <td>6615</td>\n      <td>0.118065</td>\n      <td>0.104141</td>\n      <td>781</td>\n    </tr>\n    <tr>\n      <th>7</th>\n      <td>2017-01-10</td>\n      <td>6654</td>\n      <td>0.112864</td>\n      <td>0.100141</td>\n      <td>751</td>\n      <td>6696</td>\n      <td>0.126344</td>\n      <td>0.110398</td>\n      <td>846</td>\n    </tr>\n    <tr>\n      <th>8</th>\n      <td>2017-01-11</td>\n      <td>6688</td>\n      <td>0.118870</td>\n      <td>0.104755</td>\n      <td>795</td>\n      <td>6673</td>\n      <td>0.115091</td>\n      <td>0.101860</td>\n      <td>768</td>\n    </tr>\n    <tr>\n      <th>9</th>\n      <td>2017-01-12</td>\n      <td>6522</td>\n      <td>0.122048</td>\n      <td>0.107169</td>\n      <td>796</td>\n      <td>6637</td>\n      <td>0.122344</td>\n      <td>0.107392</td>\n      <td>812</td>\n    </tr>\n    <tr>\n      <th>10</th>\n      <td>2017-01-13</td>\n      <td>6552</td>\n      <td>0.116911</td>\n      <td>0.103258</td>\n      <td>766</td>\n      <td>6508</td>\n      <td>0.111248</td>\n      <td>0.098887</td>\n      <td>724</td>\n    </tr>\n    <tr>\n      <th>11</th>\n      <td>2017-01-14</td>\n      <td>6548</td>\n      <td>0.126756</td>\n      <td>0.110706</td>\n      <td>830</td>\n      <td>6600</td>\n      <td>0.119242</td>\n      <td>0.105040</td>\n      <td>787</td>\n    </tr>\n    <tr>\n      <th>12</th>\n      <td>2017-01-15</td>\n      <td>6714</td>\n      <td>0.120494</td>\n      <td>0.105991</td>\n      <td>809</td>\n      <td>6549</td>\n      <td>0.113452</td>\n      <td>0.100596</td>\n      <td>743</td>\n    </tr>\n    <tr>\n      <th>13</th>\n      <td>2017-01-16</td>\n      <td>6591</td>\n      <td>0.121833</td>\n      <td>0.107006</td>\n      <td>803</td>\n      <td>6545</td>\n      <td>0.119175</td>\n      <td>0.104988</td>\n      <td>780</td>\n    </tr>\n    <tr>\n      <th>14</th>\n      <td>2017-01-17</td>\n      <td>6617</td>\n      <td>0.122865</td>\n      <td>0.107786</td>\n      <td>813</td>\n      <td>6538</td>\n      <td>0.127256</td>\n      <td>0.111079</td>\n      <td>832</td>\n    </tr>\n    <tr>\n      <th>15</th>\n      <td>2017-01-18</td>\n      <td>6482</td>\n      <td>0.124807</td>\n      <td>0.109247</td>\n      <td>809</td>\n      <td>6603</td>\n      <td>0.124792</td>\n      <td>0.109235</td>\n      <td>824</td>\n    </tr>\n    <tr>\n      <th>16</th>\n      <td>2017-01-19</td>\n      <td>6578</td>\n      <td>0.119945</td>\n      <td>0.105574</td>\n      <td>789</td>\n      <td>6552</td>\n      <td>0.117216</td>\n      <td>0.103492</td>\n      <td>768</td>\n    </tr>\n    <tr>\n      <th>17</th>\n      <td>2017-01-20</td>\n      <td>6534</td>\n      <td>0.115243</td>\n      <td>0.101978</td>\n      <td>753</td>\n      <td>6679</td>\n      <td>0.117682</td>\n      <td>0.103849</td>\n      <td>786</td>\n    </tr>\n    <tr>\n      <th>18</th>\n      <td>2017-01-21</td>\n      <td>6749</td>\n      <td>0.125945</td>\n      <td>0.110099</td>\n      <td>850</td>\n      <td>6560</td>\n      <td>0.115701</td>\n      <td>0.102330</td>\n      <td>759</td>\n    </tr>\n    <tr>\n      <th>19</th>\n      <td>2017-01-22</td>\n      <td>6596</td>\n      <td>0.119163</td>\n      <td>0.104979</td>\n      <td>786</td>\n      <td>6669</td>\n      <td>0.118009</td>\n      <td>0.104098</td>\n      <td>787</td>\n    </tr>\n    <tr>\n      <th>20</th>\n      <td>2017-01-23</td>\n      <td>6716</td>\n      <td>0.125670</td>\n      <td>0.109893</td>\n      <td>844</td>\n      <td>6633</td>\n      <td>0.121061</td>\n      <td>0.106422</td>\n      <td>803</td>\n    </tr>\n  </tbody>\n</table>\n</div>"},"metadata":{},"execution_count":9}],"source":["# 检验样本量是否充足\n","def data_parper(\n","    data,\n","    sample_chack=1,\n","    cal_key=\"converted\",\n","    pares={\"group\": [\"control\", \"treatment\"], \"landing_page\": [\"old_page\", \"new_page\"]},\n","):\n","    group_list = list(pares.keys())\n","    sample_group = (\n","        data.groupby(by=[\"date\"] + group_list)\n","        .agg({cal_key: [\"count\", \"mean\", \"var\", \"sum\"]})[cal_key]\n","        .reset_index()\n","    )\n","\n","    sample_group = sample_group[sample_group[\"count\"] > sample_chack]  # 样本数计算\n","    sg = pd.merge(\n","        sample_group[\n","            (sample_group[group_list[0]] == pares[group_list[0]][0])\n","            & (sample_group[group_list[1]] == pares[group_list[1]][0])\n","        ][[\"date\", \"count\", \"mean\", \"var\", \"sum\"]],\n","        sample_group[\n","            (sample_group[group_list[0]] == pares[group_list[0]][1])\n","            & (sample_group[group_list[1]] == pares[group_list[1]][1])\n","        ][[\"date\", \"count\", \"mean\", \"var\", \"sum\"]],\n","        how=\"inner\",\n","        left_on=\"date\",\n","        right_on=\"date\",\n","        suffixes=(\"_control\", \"_treatment\"),\n","    )\n","    return sg\n","\n","\n","data = data_parper(\n","    df,\n","    (\n","        type_I_clr[\"samples\"]\n","        if type_I_clr[\"samples\"] > type_II_clr[\"samples\"]\n","        else type_II_clr[\"samples\"]\n","    ),\n",")\n","data"]},{"cell_type":"code","execution_count":10,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["          date  count_control  mean_control  var_control  sum_control  \\\n","0   2017-01-03           6590      0.113809     0.100872          750   \n","1   2017-01-04           6578      0.121922     0.107073          802   \n","2   2017-01-05           6427      0.123230     0.108061          792   \n","3   2017-01-06           6606      0.115350     0.102060          762   \n","4   2017-01-07           6604      0.120987     0.106365          799   \n","5   2017-01-08           6687      0.118887     0.104769          795   \n","6   2017-01-09           6628      0.119644     0.105345          793   \n","7   2017-01-10           6654      0.112864     0.100141          751   \n","8   2017-01-11           6688      0.118870     0.104755          795   \n","9   2017-01-12           6522      0.122048     0.107169          796   \n","10  2017-01-13           6552      0.116911     0.103258          766   \n","11  2017-01-14           6548      0.126756     0.110706          830   \n","12  2017-01-15           6714      0.120494     0.105991          809   \n","13  2017-01-16           6591      0.121833     0.107006          803   \n","14  2017-01-17           6617      0.122865     0.107786          813   \n","15  2017-01-18           6482      0.124807     0.109247          809   \n","16  2017-01-19           6578      0.119945     0.105574          789   \n","17  2017-01-20           6534      0.115243     0.101978          753   \n","18  2017-01-21           6749      0.125945     0.110099          850   \n","19  2017-01-22           6596      0.119163     0.104979          786   \n","20  2017-01-23           6716      0.125670     0.109893          844   \n","\n","    count_treatment  mean_treatment  var_treatment  sum_treatment  \\\n","0              6618        0.113781       0.100850            753   \n","1              6541        0.116649       0.103058            763   \n","2              6505        0.114988       0.101782            748   \n","3              6747        0.123462       0.108235            833   \n","4              6609        0.116205       0.102717            768   \n","5              6700        0.120746       0.106182            809   \n","6              6615        0.118065       0.104141            781   \n","7              6696        0.126344       0.110398            846   \n","8              6673        0.115091       0.101860            768   \n","9              6637        0.122344       0.107392            812   \n","10             6508        0.111248       0.098887            724   \n","11             6600        0.119242       0.105040            787   \n","12             6549        0.113452       0.100596            743   \n","13             6545        0.119175       0.104988            780   \n","14             6538        0.127256       0.111079            832   \n","15             6603        0.124792       0.109235            824   \n","16             6552        0.117216       0.103492            768   \n","17             6679        0.117682       0.103849            786   \n","18             6560        0.115701       0.102330            759   \n","19             6669        0.118009       0.104098            787   \n","20             6633        0.121061       0.106422            803   \n","\n","    mu_control-mu_treatment  var_diff  \n","0                  0.000028  0.005526  \n","1                  0.005273  0.005659  \n","2                  0.008242  0.005697  \n","3                 -0.008113  0.005611  \n","4                  0.004782  0.005625  \n","5                 -0.001859  0.005613  \n","6                  0.001579  0.005624  \n","7                 -0.013480  0.005615  \n","8                  0.003779  0.005561  \n","9                 -0.000296  0.005710  \n","10                 0.005663  0.005563  \n","11                 0.007514  0.005729  \n","12                 0.007042  0.005581  \n","13                 0.002658  0.005681  \n","14                -0.004391  0.005768  \n","15                 0.000015  0.005779  \n","16                 0.002729  0.005643  \n","17                -0.002439  0.005581  \n","18                 0.010243  0.005649  \n","19                 0.001154  0.005614  \n","20                 0.004609  0.005692  "],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>date</th>\n      <th>count_control</th>\n      <th>mean_control</th>\n      <th>var_control</th>\n      <th>sum_control</th>\n      <th>count_treatment</th>\n      <th>mean_treatment</th>\n      <th>var_treatment</th>\n      <th>sum_treatment</th>\n      <th>mu_control-mu_treatment</th>\n      <th>var_diff</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>2017-01-03</td>\n      <td>6590</td>\n      <td>0.113809</td>\n      <td>0.100872</td>\n      <td>750</td>\n      <td>6618</td>\n      <td>0.113781</td>\n      <td>0.100850</td>\n      <td>753</td>\n      <td>0.000028</td>\n      <td>0.005526</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>2017-01-04</td>\n      <td>6578</td>\n      <td>0.121922</td>\n      <td>0.107073</td>\n      <td>802</td>\n      <td>6541</td>\n      <td>0.116649</td>\n      <td>0.103058</td>\n      <td>763</td>\n      <td>0.005273</td>\n      <td>0.005659</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>2017-01-05</td>\n      <td>6427</td>\n      <td>0.123230</td>\n      <td>0.108061</td>\n      <td>792</td>\n      <td>6505</td>\n      <td>0.114988</td>\n      <td>0.101782</td>\n      <td>748</td>\n      <td>0.008242</td>\n      <td>0.005697</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>2017-01-06</td>\n      <td>6606</td>\n      <td>0.115350</td>\n      <td>0.102060</td>\n      <td>762</td>\n      <td>6747</td>\n      <td>0.123462</td>\n      <td>0.108235</td>\n      <td>833</td>\n      <td>-0.008113</td>\n      <td>0.005611</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>2017-01-07</td>\n      <td>6604</td>\n      <td>0.120987</td>\n      <td>0.106365</td>\n      <td>799</td>\n      <td>6609</td>\n      <td>0.116205</td>\n      <td>0.102717</td>\n      <td>768</td>\n      <td>0.004782</td>\n      <td>0.005625</td>\n    </tr>\n    <tr>\n      <th>5</th>\n      <td>2017-01-08</td>\n      <td>6687</td>\n      <td>0.118887</td>\n      <td>0.104769</td>\n      <td>795</td>\n      <td>6700</td>\n      <td>0.120746</td>\n      <td>0.106182</td>\n      <td>809</td>\n      <td>-0.001859</td>\n      <td>0.005613</td>\n    </tr>\n    <tr>\n      <th>6</th>\n      <td>2017-01-09</td>\n      <td>6628</td>\n      <td>0.119644</td>\n      <td>0.105345</td>\n      <td>793</td>\n      <td>6615</td>\n      <td>0.118065</td>\n      <td>0.104141</td>\n      <td>781</td>\n      <td>0.001579</td>\n      <td>0.005624</td>\n    </tr>\n    <tr>\n      <th>7</th>\n      <td>2017-01-10</td>\n      <td>6654</td>\n      <td>0.112864</td>\n      <td>0.100141</td>\n      <td>751</td>\n      <td>6696</td>\n      <td>0.126344</td>\n      <td>0.110398</td>\n      <td>846</td>\n      <td>-0.013480</td>\n      <td>0.005615</td>\n    </tr>\n    <tr>\n      <th>8</th>\n      <td>2017-01-11</td>\n      <td>6688</td>\n      <td>0.118870</td>\n      <td>0.104755</td>\n      <td>795</td>\n      <td>6673</td>\n      <td>0.115091</td>\n      <td>0.101860</td>\n      <td>768</td>\n      <td>0.003779</td>\n      <td>0.005561</td>\n    </tr>\n    <tr>\n      <th>9</th>\n      <td>2017-01-12</td>\n      <td>6522</td>\n      <td>0.122048</td>\n      <td>0.107169</td>\n      <td>796</td>\n      <td>6637</td>\n      <td>0.122344</td>\n      <td>0.107392</td>\n      <td>812</td>\n      <td>-0.000296</td>\n      <td>0.005710</td>\n    </tr>\n    <tr>\n      <th>10</th>\n      <td>2017-01-13</td>\n      <td>6552</td>\n      <td>0.116911</td>\n      <td>0.103258</td>\n      <td>766</td>\n      <td>6508</td>\n      <td>0.111248</td>\n      <td>0.098887</td>\n      <td>724</td>\n      <td>0.005663</td>\n      <td>0.005563</td>\n    </tr>\n    <tr>\n      <th>11</th>\n      <td>2017-01-14</td>\n      <td>6548</td>\n      <td>0.126756</td>\n      <td>0.110706</td>\n      <td>830</td>\n      <td>6600</td>\n      <td>0.119242</td>\n      <td>0.105040</td>\n      <td>787</td>\n      <td>0.007514</td>\n      <td>0.005729</td>\n    </tr>\n    <tr>\n      <th>12</th>\n      <td>2017-01-15</td>\n      <td>6714</td>\n      <td>0.120494</td>\n      <td>0.105991</td>\n      <td>809</td>\n      <td>6549</td>\n      <td>0.113452</td>\n      <td>0.100596</td>\n      <td>743</td>\n      <td>0.007042</td>\n      <td>0.005581</td>\n    </tr>\n    <tr>\n      <th>13</th>\n      <td>2017-01-16</td>\n      <td>6591</td>\n      <td>0.121833</td>\n      <td>0.107006</td>\n      <td>803</td>\n      <td>6545</td>\n      <td>0.119175</td>\n      <td>0.104988</td>\n      <td>780</td>\n      <td>0.002658</td>\n      <td>0.005681</td>\n    </tr>\n    <tr>\n      <th>14</th>\n      <td>2017-01-17</td>\n      <td>6617</td>\n      <td>0.122865</td>\n      <td>0.107786</td>\n      <td>813</td>\n      <td>6538</td>\n      <td>0.127256</td>\n      <td>0.111079</td>\n      <td>832</td>\n      <td>-0.004391</td>\n      <td>0.005768</td>\n    </tr>\n    <tr>\n      <th>15</th>\n      <td>2017-01-18</td>\n      <td>6482</td>\n      <td>0.124807</td>\n      <td>0.109247</td>\n      <td>809</td>\n      <td>6603</td>\n      <td>0.124792</td>\n      <td>0.109235</td>\n      <td>824</td>\n      <td>0.000015</td>\n      <td>0.005779</td>\n    </tr>\n    <tr>\n      <th>16</th>\n      <td>2017-01-19</td>\n      <td>6578</td>\n      <td>0.119945</td>\n      <td>0.105574</td>\n      <td>789</td>\n      <td>6552</td>\n      <td>0.117216</td>\n      <td>0.103492</td>\n      <td>768</td>\n      <td>0.002729</td>\n      <td>0.005643</td>\n    </tr>\n    <tr>\n      <th>17</th>\n      <td>2017-01-20</td>\n      <td>6534</td>\n      <td>0.115243</td>\n      <td>0.101978</td>\n      <td>753</td>\n      <td>6679</td>\n      <td>0.117682</td>\n      <td>0.103849</td>\n      <td>786</td>\n      <td>-0.002439</td>\n      <td>0.005581</td>\n    </tr>\n    <tr>\n      <th>18</th>\n      <td>2017-01-21</td>\n      <td>6749</td>\n      <td>0.125945</td>\n      <td>0.110099</td>\n      <td>850</td>\n      <td>6560</td>\n      <td>0.115701</td>\n      <td>0.102330</td>\n      <td>759</td>\n      <td>0.010243</td>\n      <td>0.005649</td>\n    </tr>\n    <tr>\n      <th>19</th>\n      <td>2017-01-22</td>\n      <td>6596</td>\n      <td>0.119163</td>\n      <td>0.104979</td>\n      <td>786</td>\n      <td>6669</td>\n      <td>0.118009</td>\n      <td>0.104098</td>\n      <td>787</td>\n      <td>0.001154</td>\n      <td>0.005614</td>\n    </tr>\n    <tr>\n      <th>20</th>\n      <td>2017-01-23</td>\n      <td>6716</td>\n      <td>0.125670</td>\n      <td>0.109893</td>\n      <td>844</td>\n      <td>6633</td>\n      <td>0.121061</td>\n      <td>0.106422</td>\n      <td>803</td>\n      <td>0.004609</td>\n      <td>0.005692</td>\n    </tr>\n  </tbody>\n</table>\n</div>"},"metadata":{},"execution_count":10}],"source":["# 检验\n","def percent_cal(data, key1=\"control\", key2=\"treatment\"):\n","    data[\"mu_{}-mu_{}\".format(key1, key2)] = data[\"mean_\" + key1] - data[\"mean_\" + key2]\n","\n","    data[\"var_diff\"] = (\n","        data[\"mean_\" + key1] * (1 - data[\"mean_\" + key1]) / data[\"count_\" + key1]\n","        + data[\"mean_\" + key2] * (1 - data[\"mean_\" + key2]) / data[\"count_\" + key2]\n","    ) ** (1 / 2)\n","    return data\n","\n","\n","percent_cal(data)"]},{"cell_type":"markdown","metadata":{},"source":["$ p_{control} - p_{treatment} $ ~ N(-0.004, var_diff) # 两组比例只差为 -0.004"]},{"cell_type":"code","execution_count":11,"metadata":{},"outputs":[{"output_type":"stream","name":"stderr","text":["<ipython-input-11-8d9a896e2417>:31: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"left\"] = data.p_left.apply(lambda x: res_cal(\"left\", x, 0.05, \"p\"))\n<ipython-input-11-8d9a896e2417>:32: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"right\"] = data.p_right.apply(lambda x: res_cal(\"right\", x, 0.05, \"p\"))\n<ipython-input-11-8d9a896e2417>:33: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"two\"] = data.p_two.apply(lambda x: res_cal(\"two\", x, 0.05, \"p\"))\n"]},{"output_type":"execute_result","data":{"text/plain":["date\n","2017-01-02    p= 0.99846 H_0 pA>=pB 无显著性差异\n","2017-01-03    p= 0.99986 H_0 pA>=pB 无显著性差异\n","2017-01-04    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-05    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-06    p= 0.98293 H_0 pA>=pB 无显著性差异\n","2017-01-07    p= 0.99999 H_0 pA>=pB 无显著性差异\n","2017-01-08    p= 0.99938 H_0 pA>=pB 无显著性差异\n","2017-01-09    p= 0.99994 H_0 pA>=pB 无显著性差异\n","2017-01-10    p= 0.87721 H_0 pA>=pB 无显著性差异\n","2017-01-11    p= 0.99999 H_0 pA>=pB 无显著性差异\n","2017-01-12    p= 0.99972 H_0 pA>=pB 无显著性差异\n","2017-01-13    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-14    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-15    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-16    p= 0.99997 H_0 pA>=pB 无显著性差异\n","2017-01-17    p= 0.99660 H_0 pA>=pB 无显著性差异\n","2017-01-18    p= 0.99973 H_0 pA>=pB 无显著性差异\n","2017-01-19    p= 0.99997 H_0 pA>=pB 无显著性差异\n","2017-01-20    p= 0.99917 H_0 pA>=pB 无显著性差异\n","2017-01-21    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-22    p= 0.99992 H_0 pA>=pB 无显著性差异\n","2017-01-23    p= 0.99999 H_0 pA>=pB 无显著性差异\n","2017-01-24    p= 0.98475 H_0 pA>=pB 无显著性差异\n","Name: left, dtype: object"]},"metadata":{},"execution_count":11}],"source":["def res_cal(types, p, alpha, fum):\n","\n","    if p >= alpha:\n","        return \"p={: .5f} H_0 {} 无显著性差异\".format(\n","            p, \"{0}A>={0}B\".format(fum) if types != \"two\" else \"{0}A={0}B\".format(fum)\n","        )\n","    else:\n","        return \"p={: .5f} H_1 {} 有显著性差异\".format(\n","            p, \"{0}A<{0}B\".format(fum) if types != \"two\" else \"{0}A!={0}B\".format(fum)\n","        )\n","\n","\n","def res_run(\n","    data,\n","    target,\n","    cal_key=\"mu_{}-mu_{}\".format(\"control\", \"treatment\"),\n","    scale_key=\"var_diff\",\n","):\n","\n","    data[\"p_left\"] = stats.norm.cdf(data[cal_key], loc=target, scale=data[scale_key])\n","    data[\"p_right\"] = stats.norm.sf(data[cal_key], loc=target, scale=data[scale_key])\n","\n","    tmp = data[[\"p_left\", \"p_right\"]].to_dict(\"records\")\n","\n","    data[\"p_two\"] = [\n","        i[\"p_left\"] * 2 if i[\"p_left\"] < 0.5 else i[\"p_right\"] for i in tmp\n","    ]\n","\n","    res = data[[\"date\"]]\n","\n","    res[\"left\"] = data.p_left.apply(lambda x: res_cal(\"left\", x, 0.05, \"p\"))\n","    res[\"right\"] = data.p_right.apply(lambda x: res_cal(\"right\", x, 0.05, \"p\"))\n","    res[\"two\"] = data.p_two.apply(lambda x: res_cal(\"two\", x, 0.05, \"p\"))\n","    res.set_index(\"date\", inplace=True)\n","    return res\n","\n","\n","res_run(percent_cal(data_parper(df)), target=-0.02)[\"left\"]"]},{"cell_type":"markdown","metadata":{},"source":["一类crl指标 $ p_A - p_B $ > - 0.004 H_0 成立, 试验无需终止"]},{"cell_type":"markdown","metadata":{},"source":["##### 一类指标流量\n"," $ \\bar{x}_A - \\bar{x}_B $"]},{"cell_type":"code","execution_count":12,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["            date  hour      group landing_page  user_id  converted\n","0     2017-01-02    13    control     new_page        1          0\n","1     2017-01-02    13    control     old_page       83         11\n","2     2017-01-02    13  treatment     new_page       87          9\n","3     2017-01-02    14    control     new_page        8          1\n","4     2017-01-02    14    control     old_page      272         39\n","...          ...   ...        ...          ...      ...        ...\n","2083  2017-01-24    12  treatment     old_page        1          0\n","2084  2017-01-24    13    control     new_page        2          0\n","2085  2017-01-24    13    control     old_page      174         22\n","2086  2017-01-24    13  treatment     new_page      188         20\n","2087  2017-01-24    13  treatment     old_page        2          0\n","\n","[2088 rows x 6 columns]"],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>date</th>\n      <th>hour</th>\n      <th>group</th>\n      <th>landing_page</th>\n      <th>user_id</th>\n      <th>converted</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>2017-01-02</td>\n      <td>13</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>1</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>2017-01-02</td>\n      <td>13</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>83</td>\n      <td>11</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>2017-01-02</td>\n      <td>13</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>87</td>\n      <td>9</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>2017-01-02</td>\n      <td>14</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>8</td>\n      <td>1</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>2017-01-02</td>\n      <td>14</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>272</td>\n      <td>39</td>\n    </tr>\n    <tr>\n      <th>...</th>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n    </tr>\n    <tr>\n      <th>2083</th>\n      <td>2017-01-24</td>\n      <td>12</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>1</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>2084</th>\n      <td>2017-01-24</td>\n      <td>13</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>2</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>2085</th>\n      <td>2017-01-24</td>\n      <td>13</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>174</td>\n      <td>22</td>\n    </tr>\n    <tr>\n      <th>2086</th>\n      <td>2017-01-24</td>\n      <td>13</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>188</td>\n      <td>20</td>\n    </tr>\n    <tr>\n      <th>2087</th>\n      <td>2017-01-24</td>\n      <td>13</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>2</td>\n      <td>0</td>\n    </tr>\n  </tbody>\n</table>\n<p>2088 rows × 6 columns</p>\n</div>"},"metadata":{},"execution_count":12}],"source":["dfgbuser"]},{"cell_type":"code","execution_count":13,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["          date      group landing_page  count        mean          var   sum\n","0   2017-01-02    control     new_page      9    3.888889     6.611111    35\n","1   2017-01-02    control     old_page     11  259.909091  3698.290909  2859\n","2   2017-01-02  treatment     new_page     11  259.363636  3399.054545  2853\n","3   2017-01-02  treatment     old_page     10    3.600000     4.933333    36\n","4   2017-01-03    control     new_page     24    3.916667     3.905797    94\n","..         ...        ...          ...    ...         ...          ...   ...\n","87  2017-01-23  treatment     old_page     23    4.130435     3.936759    95\n","88  2017-01-24    control     new_page     14    4.071429     4.071429    57\n","89  2017-01-24    control     old_page     14  268.142857   844.439560  3754\n","90  2017-01-24  treatment     new_page     14  262.928571   628.532967  3681\n","91  2017-01-24  treatment     old_page     13    3.538462     4.102564    46\n","\n","[92 rows x 7 columns]"],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>date</th>\n      <th>group</th>\n      <th>landing_page</th>\n      <th>count</th>\n      <th>mean</th>\n      <th>var</th>\n      <th>sum</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>2017-01-02</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>9</td>\n      <td>3.888889</td>\n      <td>6.611111</td>\n      <td>35</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>2017-01-02</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>11</td>\n      <td>259.909091</td>\n      <td>3698.290909</td>\n      <td>2859</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>2017-01-02</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>11</td>\n      <td>259.363636</td>\n      <td>3399.054545</td>\n      <td>2853</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>2017-01-02</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>10</td>\n      <td>3.600000</td>\n      <td>4.933333</td>\n      <td>36</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>2017-01-03</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>24</td>\n      <td>3.916667</td>\n      <td>3.905797</td>\n      <td>94</td>\n    </tr>\n    <tr>\n      <th>...</th>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n      <td>...</td>\n    </tr>\n    <tr>\n      <th>87</th>\n      <td>2017-01-23</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>23</td>\n      <td>4.130435</td>\n      <td>3.936759</td>\n      <td>95</td>\n    </tr>\n    <tr>\n      <th>88</th>\n      <td>2017-01-24</td>\n      <td>control</td>\n      <td>new_page</td>\n      <td>14</td>\n      <td>4.071429</td>\n      <td>4.071429</td>\n      <td>57</td>\n    </tr>\n    <tr>\n      <th>89</th>\n      <td>2017-01-24</td>\n      <td>control</td>\n      <td>old_page</td>\n      <td>14</td>\n      <td>268.142857</td>\n      <td>844.439560</td>\n      <td>3754</td>\n    </tr>\n    <tr>\n      <th>90</th>\n      <td>2017-01-24</td>\n      <td>treatment</td>\n      <td>new_page</td>\n      <td>14</td>\n      <td>262.928571</td>\n      <td>628.532967</td>\n      <td>3681</td>\n    </tr>\n    <tr>\n      <th>91</th>\n      <td>2017-01-24</td>\n      <td>treatment</td>\n      <td>old_page</td>\n      <td>13</td>\n      <td>3.538462</td>\n      <td>4.102564</td>\n      <td>46</td>\n    </tr>\n  </tbody>\n</table>\n<p>92 rows × 7 columns</p>\n</div>"},"metadata":{},"execution_count":13}],"source":["p_diff = 0.05  # 2sigma\n","\n","data = dfgbuser\n","\n","cal_key = \"user_id\"\n","pares = {\"group\": [\"control\", \"treatment\"], \"landing_page\": [\"old_page\", \"new_page\"]}\n","group_list = list(pares.keys())\n","sample_group = (\n","    data.groupby(by=[\"date\"] + group_list)\n","    .agg({cal_key: [\"count\", \"mean\", \"var\", \"sum\"]})[cal_key]\n","    .reset_index()\n",")\n","sample_group"]},{"cell_type":"code","execution_count":14,"metadata":{},"outputs":[{"output_type":"execute_result","data":{"text/plain":["          date  count_control  mean_control  var_control  sum_control  \\\n","0   2017-01-02             11    259.909091  3698.290909         2859   \n","1   2017-01-03             24    274.583333   337.384058         6590   \n","2   2017-01-04             24    274.083333    92.775362         6578   \n","3   2017-01-05             24    267.791667   261.737319         6427   \n","4   2017-01-06             24    275.250000   295.586957         6606   \n","5   2017-01-07             24    275.166667   307.449275         6604   \n","6   2017-01-08             24    278.625000   245.722826         6687   \n","7   2017-01-09             24    276.166667   290.318841         6628   \n","8   2017-01-10             24    277.250000   216.195652         6654   \n","9   2017-01-11             24    278.666667   282.840580         6688   \n","10  2017-01-12             24    271.750000   227.413043         6522   \n","11  2017-01-13             24    273.000000   204.434783         6552   \n","12  2017-01-14             24    272.833333   314.840580         6548   \n","13  2017-01-15             24    279.750000   323.760870         6714   \n","14  2017-01-16             24    274.625000   278.592391         6591   \n","15  2017-01-17             24    275.708333   362.041667         6617   \n","16  2017-01-18             24    270.083333   393.123188         6482   \n","17  2017-01-19             24    274.083333   138.340580         6578   \n","18  2017-01-20             24    272.250000   420.108696         6534   \n","19  2017-01-21             24    281.208333   148.519928         6749   \n","20  2017-01-22             24    274.833333   157.884058         6596   \n","21  2017-01-23             24    279.833333   192.492754         6716   \n","22  2017-01-24             14    268.142857   844.439560         3754   \n","\n","    count_treatment  mean_treatment  var_treatment  sum_treatment  \n","0                11      259.363636    3399.054545           2853  \n","1                24      275.750000     201.760870           6618  \n","2                24      272.541667     138.259058           6541  \n","3                24      271.041667     298.737319           6505  \n","4                24      281.125000     259.070652           6747  \n","5                24      275.375000     272.940217           6609  \n","6                24      279.166667     297.101449           6700  \n","7                24      275.625000     196.070652           6615  \n","8                24      279.000000     353.304348           6696  \n","9                24      278.041667     215.259058           6673  \n","10               24      276.541667     326.172101           6637  \n","11               24      271.166667     326.057971           6508  \n","12               24      275.000000     214.608696           6600  \n","13               24      272.875000     143.679348           6549  \n","14               24      272.708333     378.389493           6545  \n","15               24      272.416667     151.644928           6538  \n","16               24      275.125000     234.461957           6603  \n","17               24      273.000000     247.565217           6552  \n","18               24      278.291667     260.911232           6679  \n","19               24      273.333333     211.623188           6560  \n","20               24      277.875000     326.983696           6669  \n","21               24      276.375000     340.853261           6633  \n","22               14      262.928571     628.532967           3681  "],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>date</th>\n      <th>count_control</th>\n      <th>mean_control</th>\n      <th>var_control</th>\n      <th>sum_control</th>\n      <th>count_treatment</th>\n      <th>mean_treatment</th>\n      <th>var_treatment</th>\n      <th>sum_treatment</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>2017-01-02</td>\n      <td>11</td>\n      <td>259.909091</td>\n      <td>3698.290909</td>\n      <td>2859</td>\n      <td>11</td>\n      <td>259.363636</td>\n      <td>3399.054545</td>\n      <td>2853</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>2017-01-03</td>\n      <td>24</td>\n      <td>274.583333</td>\n      <td>337.384058</td>\n      <td>6590</td>\n      <td>24</td>\n      <td>275.750000</td>\n      <td>201.760870</td>\n      <td>6618</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>2017-01-04</td>\n      <td>24</td>\n      <td>274.083333</td>\n      <td>92.775362</td>\n      <td>6578</td>\n      <td>24</td>\n      <td>272.541667</td>\n      <td>138.259058</td>\n      <td>6541</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>2017-01-05</td>\n      <td>24</td>\n      <td>267.791667</td>\n      <td>261.737319</td>\n      <td>6427</td>\n      <td>24</td>\n      <td>271.041667</td>\n      <td>298.737319</td>\n      <td>6505</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>2017-01-06</td>\n      <td>24</td>\n      <td>275.250000</td>\n      <td>295.586957</td>\n      <td>6606</td>\n      <td>24</td>\n      <td>281.125000</td>\n      <td>259.070652</td>\n      <td>6747</td>\n    </tr>\n    <tr>\n      <th>5</th>\n      <td>2017-01-07</td>\n      <td>24</td>\n      <td>275.166667</td>\n      <td>307.449275</td>\n      <td>6604</td>\n      <td>24</td>\n      <td>275.375000</td>\n      <td>272.940217</td>\n      <td>6609</td>\n    </tr>\n    <tr>\n      <th>6</th>\n      <td>2017-01-08</td>\n      <td>24</td>\n      <td>278.625000</td>\n      <td>245.722826</td>\n      <td>6687</td>\n      <td>24</td>\n      <td>279.166667</td>\n      <td>297.101449</td>\n      <td>6700</td>\n    </tr>\n    <tr>\n      <th>7</th>\n      <td>2017-01-09</td>\n      <td>24</td>\n      <td>276.166667</td>\n      <td>290.318841</td>\n      <td>6628</td>\n      <td>24</td>\n      <td>275.625000</td>\n      <td>196.070652</td>\n      <td>6615</td>\n    </tr>\n    <tr>\n      <th>8</th>\n      <td>2017-01-10</td>\n      <td>24</td>\n      <td>277.250000</td>\n      <td>216.195652</td>\n      <td>6654</td>\n      <td>24</td>\n      <td>279.000000</td>\n      <td>353.304348</td>\n      <td>6696</td>\n    </tr>\n    <tr>\n      <th>9</th>\n      <td>2017-01-11</td>\n      <td>24</td>\n      <td>278.666667</td>\n      <td>282.840580</td>\n      <td>6688</td>\n      <td>24</td>\n      <td>278.041667</td>\n      <td>215.259058</td>\n      <td>6673</td>\n    </tr>\n    <tr>\n      <th>10</th>\n      <td>2017-01-12</td>\n      <td>24</td>\n      <td>271.750000</td>\n      <td>227.413043</td>\n      <td>6522</td>\n      <td>24</td>\n      <td>276.541667</td>\n      <td>326.172101</td>\n      <td>6637</td>\n    </tr>\n    <tr>\n      <th>11</th>\n      <td>2017-01-13</td>\n      <td>24</td>\n      <td>273.000000</td>\n      <td>204.434783</td>\n      <td>6552</td>\n      <td>24</td>\n      <td>271.166667</td>\n      <td>326.057971</td>\n      <td>6508</td>\n    </tr>\n    <tr>\n      <th>12</th>\n      <td>2017-01-14</td>\n      <td>24</td>\n      <td>272.833333</td>\n      <td>314.840580</td>\n      <td>6548</td>\n      <td>24</td>\n      <td>275.000000</td>\n      <td>214.608696</td>\n      <td>6600</td>\n    </tr>\n    <tr>\n      <th>13</th>\n      <td>2017-01-15</td>\n      <td>24</td>\n      <td>279.750000</td>\n      <td>323.760870</td>\n      <td>6714</td>\n      <td>24</td>\n      <td>272.875000</td>\n      <td>143.679348</td>\n      <td>6549</td>\n    </tr>\n    <tr>\n      <th>14</th>\n      <td>2017-01-16</td>\n      <td>24</td>\n      <td>274.625000</td>\n      <td>278.592391</td>\n      <td>6591</td>\n      <td>24</td>\n      <td>272.708333</td>\n      <td>378.389493</td>\n      <td>6545</td>\n    </tr>\n    <tr>\n      <th>15</th>\n      <td>2017-01-17</td>\n      <td>24</td>\n      <td>275.708333</td>\n      <td>362.041667</td>\n      <td>6617</td>\n      <td>24</td>\n      <td>272.416667</td>\n      <td>151.644928</td>\n      <td>6538</td>\n    </tr>\n    <tr>\n      <th>16</th>\n      <td>2017-01-18</td>\n      <td>24</td>\n      <td>270.083333</td>\n      <td>393.123188</td>\n      <td>6482</td>\n      <td>24</td>\n      <td>275.125000</td>\n      <td>234.461957</td>\n      <td>6603</td>\n    </tr>\n    <tr>\n      <th>17</th>\n      <td>2017-01-19</td>\n      <td>24</td>\n      <td>274.083333</td>\n      <td>138.340580</td>\n      <td>6578</td>\n      <td>24</td>\n      <td>273.000000</td>\n      <td>247.565217</td>\n      <td>6552</td>\n    </tr>\n    <tr>\n      <th>18</th>\n      <td>2017-01-20</td>\n      <td>24</td>\n      <td>272.250000</td>\n      <td>420.108696</td>\n      <td>6534</td>\n      <td>24</td>\n      <td>278.291667</td>\n      <td>260.911232</td>\n      <td>6679</td>\n    </tr>\n    <tr>\n      <th>19</th>\n      <td>2017-01-21</td>\n      <td>24</td>\n      <td>281.208333</td>\n      <td>148.519928</td>\n      <td>6749</td>\n      <td>24</td>\n      <td>273.333333</td>\n      <td>211.623188</td>\n      <td>6560</td>\n    </tr>\n    <tr>\n      <th>20</th>\n      <td>2017-01-22</td>\n      <td>24</td>\n      <td>274.833333</td>\n      <td>157.884058</td>\n      <td>6596</td>\n      <td>24</td>\n      <td>277.875000</td>\n      <td>326.983696</td>\n      <td>6669</td>\n    </tr>\n    <tr>\n      <th>21</th>\n      <td>2017-01-23</td>\n      <td>24</td>\n      <td>279.833333</td>\n      <td>192.492754</td>\n      <td>6716</td>\n      <td>24</td>\n      <td>276.375000</td>\n      <td>340.853261</td>\n      <td>6633</td>\n    </tr>\n    <tr>\n      <th>22</th>\n      <td>2017-01-24</td>\n      <td>14</td>\n      <td>268.142857</td>\n      <td>844.439560</td>\n      <td>3754</td>\n      <td>14</td>\n      <td>262.928571</td>\n      <td>628.532967</td>\n      <td>3681</td>\n    </tr>\n  </tbody>\n</table>\n</div>"},"metadata":{},"execution_count":14}],"source":["\n","daily_avg_pv = data_parper(\n","    dfgbuser, cal_key=\"user_id\", sample_chack=type_I_mean[\"samples\"]\n",")\n","daily_avg_pv"]},{"cell_type":"code","execution_count":15,"metadata":{},"outputs":[{"output_type":"stream","name":"stderr","text":["<ipython-input-11-8d9a896e2417>:31: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"left\"] = data.p_left.apply(lambda x: res_cal(\"left\", x, 0.05, \"p\"))\n<ipython-input-11-8d9a896e2417>:32: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"right\"] = data.p_right.apply(lambda x: res_cal(\"right\", x, 0.05, \"p\"))\n<ipython-input-11-8d9a896e2417>:33: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"two\"] = data.p_two.apply(lambda x: res_cal(\"two\", x, 0.05, \"p\"))\n"]},{"output_type":"execute_result","data":{"text/plain":["date\n","2017-01-02    p= 0.39891 H_0 pA>=pB 无显著性差异\n","2017-01-03     p= 0.04150 H_1 pA<pB 有显著性差异\n","2017-01-04     p= 0.03750 H_1 pA<pB 有显著性差异\n","2017-01-05     p= 0.01844 H_1 pA<pB 有显著性差异\n","2017-01-06     p= 0.00343 H_1 pA<pB 有显著性差异\n","2017-01-07    p= 0.07017 H_0 pA>=pB 无显著性差异\n","2017-01-08    p= 0.05145 H_0 pA>=pB 无显著性差异\n","2017-01-09    p= 0.07111 H_0 pA>=pB 无显著性差异\n","2017-01-10     p= 0.03435 H_1 pA<pB 有显著性差异\n","2017-01-11    p= 0.07656 H_0 pA>=pB 无显著性差异\n","2017-01-12     p= 0.00699 H_1 pA<pB 有显著性差异\n","2017-01-13    p= 0.14084 H_0 pA>=pB 无显著性差异\n","2017-01-14     p= 0.02623 H_1 pA<pB 有显著性差异\n","2017-01-15    p= 0.48162 H_0 pA>=pB 无显著性差异\n","2017-01-16    p= 0.16496 H_0 pA>=pB 无显著性差异\n","2017-01-17    p= 0.20606 H_0 pA>=pB 无显著性差异\n","2017-01-18     p= 0.00967 H_1 pA<pB 有显著性差异\n","2017-01-19    p= 0.07016 H_0 pA>=pB 无显著性差异\n","2017-01-20     p= 0.00646 H_1 pA<pB 有显著性差异\n","2017-01-21    p= 0.58752 H_0 pA>=pB 无显著性差异\n","2017-01-22     p= 0.01227 H_1 pA<pB 有显著性差异\n","2017-01-23    p= 0.20975 H_0 pA>=pB 无显著性差异\n","2017-01-24    p= 0.43655 H_0 pA>=pB 无显著性差异\n","Name: left, dtype: object"]},"metadata":{},"execution_count":15}],"source":["def mean_cal(data, key1=\"control\", key2=\"treatment\", p_diff=0.05):\n","\n","    data = data.merge(\n","        dfgbuser.groupby(by=[\"date\"]).mean()[\"user_id\"], left_on=\"date\", right_on=\"date\"\n","    ).rename({\"user_id\": \"total_mean\"}, axis=1)\n","\n","    data[\"target\"] = data[\"total_mean\"] * p_diff\n","    data[\"mean_diff\"] = data[\"mean_\" + key1] - data[\"mean_\" + key2]\n","    data[\"var_diff\"] = (\n","        data[\"var_\" + key1] / data[\"count_\" + key1]\n","        + data[\"var_\" + key2] / data[\"count_\" + key2]\n","    ) ** (1 / 2)\n","\n","    return data\n","\n","\n","data = mean_cal(daily_avg_pv)\n","res_run(data=data, target=data[\"target\"], cal_key=\"mean_diff\", scale_key=\"var_diff\")[\n","    \"left\"\n","]\n","\n",""]},{"cell_type":"markdown","metadata":{},"source":["$ p > 0.05 $ 接受原假设\n","\n","$ H_0 $ banner_a_pv ($ \\mu_A $) 与 banner_b_pv ($ \\mu_B $) $ H_0 : \\mu_A - \\mu_B \\ge \\mu * -0.02 $"]},{"cell_type":"markdown","metadata":{},"source":["#### 二类指标检验\n","\n","$ H_0 $ banner a ($ p_A $) 与 banner b ($ p_B $) $ H_0 : p_A - p_B \\ge 0.015 $\n","\n","$ H_1 $  banner a ($ p_A $) 与 banner b ($ p_B $)  $ H_0 : p_A - p_b \\lt 0.015 $"]},{"cell_type":"code","execution_count":16,"metadata":{},"outputs":[{"output_type":"stream","name":"stderr","text":["<ipython-input-11-8d9a896e2417>:31: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"left\"] = data.p_left.apply(lambda x: res_cal(\"left\", x, 0.05, \"p\"))\n<ipython-input-11-8d9a896e2417>:32: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"right\"] = data.p_right.apply(lambda x: res_cal(\"right\", x, 0.05, \"p\"))\n<ipython-input-11-8d9a896e2417>:33: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  res[\"two\"] = data.p_two.apply(lambda x: res_cal(\"two\", x, 0.05, \"p\"))\n"]},{"output_type":"execute_result","data":{"text/plain":["                                    left                         right  \\\n","date                                                                     \n","2017-01-02  p= 0.30999 H_0 pA>=pB 无显著性差异  p= 0.69001 H_0 pA>=pB 无显著性差异   \n","2017-01-03   p= 0.03558 H_1 pA<pB 有显著性差异  p= 0.96442 H_0 pA>=pB 无显著性差异   \n","2017-01-04  p= 0.20177 H_0 pA>=pB 无显著性差异  p= 0.79823 H_0 pA>=pB 无显著性差异   \n","2017-01-05  p= 0.37880 H_0 pA>=pB 无显著性差异  p= 0.62120 H_0 pA>=pB 无显著性差异   \n","2017-01-06   p= 0.00062 H_1 pA<pB 有显著性差异  p= 0.99938 H_0 pA>=pB 无显著性差异   \n","2017-01-07  p= 0.17681 H_0 pA>=pB 无显著性差异  p= 0.82319 H_0 pA>=pB 无显著性差异   \n","2017-01-08   p= 0.01732 H_1 pA<pB 有显著性差异  p= 0.98268 H_0 pA>=pB 无显著性差异   \n","2017-01-09  p= 0.06716 H_0 pA>=pB 无显著性差异  p= 0.93284 H_0 pA>=pB 无显著性差异   \n","2017-01-10   p= 0.00001 H_1 pA<pB 有显著性差异  p= 0.99999 H_0 pA>=pB 无显著性差异   \n","2017-01-11  p= 0.13163 H_0 pA>=pB 无显著性差异  p= 0.86837 H_0 pA>=pB 无显著性差异   \n","2017-01-12   p= 0.03569 H_1 pA<pB 有显著性差异  p= 0.96431 H_0 pA>=pB 无显著性差异   \n","2017-01-13  p= 0.21783 H_0 pA>=pB 无显著性差异  p= 0.78217 H_0 pA>=pB 无显著性差异   \n","2017-01-14  p= 0.33215 H_0 pA>=pB 无显著性差异  p= 0.66785 H_0 pA>=pB 无显著性差异   \n","2017-01-15  p= 0.29804 H_0 pA>=pB 无显著性差异  p= 0.70196 H_0 pA>=pB 无显著性差异   \n","2017-01-16  p= 0.09810 H_0 pA>=pB 无显著性差异  p= 0.90190 H_0 pA>=pB 无显著性差异   \n","2017-01-17   p= 0.00630 H_1 pA<pB 有显著性差异  p= 0.99370 H_0 pA>=pB 无显著性差异   \n","2017-01-18   p= 0.04201 H_1 pA<pB 有显著性差异  p= 0.95799 H_0 pA>=pB 无显著性差异   \n","2017-01-19  p= 0.09878 H_0 pA>=pB 无显著性差异  p= 0.90122 H_0 pA>=pB 无显著性差异   \n","2017-01-20   p= 0.01292 H_1 pA<pB 有显著性差异  p= 0.98708 H_0 pA>=pB 无显著性差异   \n","2017-01-21  p= 0.51718 H_0 pA>=pB 无显著性差异  p= 0.48282 H_0 pA>=pB 无显著性差异   \n","2017-01-22  p= 0.05756 H_0 pA>=pB 无显著性差异  p= 0.94244 H_0 pA>=pB 无显著性差异   \n","2017-01-23  p= 0.17179 H_0 pA>=pB 无显著性差异  p= 0.82821 H_0 pA>=pB 无显著性差异   \n","2017-01-24   p= 0.03452 H_1 pA<pB 有显著性差异  p= 0.96548 H_0 pA>=pB 无显著性差异   \n","\n","                                     two  \n","date                                      \n","2017-01-02   p= 0.61998 H_0 pA=pB 无显著性差异  \n","2017-01-03   p= 0.07117 H_0 pA=pB 无显著性差异  \n","2017-01-04   p= 0.40355 H_0 pA=pB 无显著性差异  \n","2017-01-05   p= 0.75759 H_0 pA=pB 无显著性差异  \n","2017-01-06  p= 0.00125 H_1 pA!=pB 有显著性差异  \n","2017-01-07   p= 0.35362 H_0 pA=pB 无显著性差异  \n","2017-01-08  p= 0.03464 H_1 pA!=pB 有显著性差异  \n","2017-01-09   p= 0.13432 H_0 pA=pB 无显著性差异  \n","2017-01-10  p= 0.00003 H_1 pA!=pB 有显著性差异  \n","2017-01-11   p= 0.26326 H_0 pA=pB 无显著性差异  \n","2017-01-12   p= 0.07138 H_0 pA=pB 无显著性差异  \n","2017-01-13   p= 0.43566 H_0 pA=pB 无显著性差异  \n","2017-01-14   p= 0.66430 H_0 pA=pB 无显著性差异  \n","2017-01-15   p= 0.59608 H_0 pA=pB 无显著性差异  \n","2017-01-16   p= 0.19620 H_0 pA=pB 无显著性差异  \n","2017-01-17  p= 0.01260 H_1 pA!=pB 有显著性差异  \n","2017-01-18   p= 0.08401 H_0 pA=pB 无显著性差异  \n","2017-01-19   p= 0.19756 H_0 pA=pB 无显著性差异  \n","2017-01-20  p= 0.02584 H_1 pA!=pB 有显著性差异  \n","2017-01-21   p= 0.48282 H_0 pA=pB 无显著性差异  \n","2017-01-22   p= 0.11513 H_0 pA=pB 无显著性差异  \n","2017-01-23   p= 0.34358 H_0 pA=pB 无显著性差异  \n","2017-01-24   p= 0.06904 H_0 pA=pB 无显著性差异  "],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>left</th>\n      <th>right</th>\n      <th>two</th>\n    </tr>\n    <tr>\n      <th>date</th>\n      <th></th>\n      <th></th>\n      <th></th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>2017-01-02</th>\n      <td>p= 0.30999 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.69001 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.61998 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-03</th>\n      <td>p= 0.03558 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.96442 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.07117 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-04</th>\n      <td>p= 0.20177 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.79823 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.40355 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-05</th>\n      <td>p= 0.37880 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.62120 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.75759 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-06</th>\n      <td>p= 0.00062 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.99938 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.00125 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-07</th>\n      <td>p= 0.17681 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.82319 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.35362 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-08</th>\n      <td>p= 0.01732 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.98268 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.03464 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-09</th>\n      <td>p= 0.06716 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.93284 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.13432 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-10</th>\n      <td>p= 0.00001 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.99999 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.00003 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-11</th>\n      <td>p= 0.13163 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.86837 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.26326 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-12</th>\n      <td>p= 0.03569 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.96431 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.07138 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-13</th>\n      <td>p= 0.21783 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.78217 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.43566 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-14</th>\n      <td>p= 0.33215 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.66785 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.66430 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-15</th>\n      <td>p= 0.29804 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.70196 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.59608 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-16</th>\n      <td>p= 0.09810 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.90190 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.19620 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-17</th>\n      <td>p= 0.00630 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.99370 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.01260 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-18</th>\n      <td>p= 0.04201 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.95799 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.08401 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-19</th>\n      <td>p= 0.09878 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.90122 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.19756 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-20</th>\n      <td>p= 0.01292 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.98708 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.02584 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-21</th>\n      <td>p= 0.51718 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.48282 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.48282 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-22</th>\n      <td>p= 0.05756 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.94244 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.11513 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-23</th>\n      <td>p= 0.17179 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.82821 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.34358 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-24</th>\n      <td>p= 0.03452 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.96548 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.06904 H_0 pA=pB 无显著性差异</td>\n    </tr>\n  </tbody>\n</table>\n</div>"},"metadata":{},"execution_count":16}],"source":["\n","res_run(percent_cal(data_parper(df)), target=0.01)\n",""]},{"cell_type":"markdown","metadata":{},"source":["显著性水平 $ p < 0.05 $\n","\n","$ H_1 $  banner a ($ p_A $) 与 banner b ($ p_B $)  $ H_0 : p_A - p_b \\lt 0.015 $\n","\n","实验组均值超过控制组转化比 在统计学上较难达成 预期的 0.5% "]},{"cell_type":"code","execution_count":17,"metadata":{},"outputs":[],"source":["\n","\n","class ABtest(object):\n","    \"\"\"\n","    data : dataFrame (2_columns)\n","\n","    alpha: float = 0.05,\n","\n","    sample_num: float = 1,\n","\n","    diff: float = 0.05,\n","\n","    group_col: str = \"\",\n","\n","    value_col: str = \"\",\n","\n","    control_group: str = \"\",\n","\n","    return:dict --> 'pare': 'treatment-control-mean',\n","\n","                    'target': -5.556100367024282,\n","\n","                    'diff': 0.014657444005308662,\n","\n","                    'var_sum': 2.0378011720396914,\n","\n","                    'p': 0.9999523810179102\n","    \"\"\"\n","\n","    def __init__(\n","        self,\n","        data,\n","        types=\"p|t\",\n","        alpha: float = 0.05,\n","        beta: float = 0.20,\n","        diff: float = 0.05,\n","        cal_key=\"converted\",\n","        pares={\n","            \"group\": [\"control\", \"treatment\"],\n","            \"landing_page\": [\"old_page\", \"new_page\"],\n","        },\n","    ):\n","\n","        self.data = data\n","        self.alpha = alpha\n","        self.beta = beta\n","        self.diff = diff\n","        self.types = types\n","\n","        self.cal_key = cal_key\n","        self.pares = pares\n","\n","        self.samples = 0\n","        self.power = 0\n","        self.c_var = 0\n","\n","        self.target = None\n","        self.group_list = list(self.pares.keys())\n","\n","        self.z_alpha = stats.norm.ppf(1 - self.alpha)\n","        self.z_beta = stats.norm.ppf(1 - self.beta)\n","\n","    def samples_cal_Proportions_2n_1size(self):\n","        dfgb = (\n","            self.data.groupby(by=[self.group_list[0]])[self.cal_key]\n","            .agg([\"count\", \"mean\"])\n","            .reset_index()\n","        )\n","        percent = dfgb.loc[dfgb[self.group_list[0]] == \"control\", \"mean\"].values[0]\n","\n","        self.pvar = percent * (1 - percent)\n","        self.samples = (\n","            2 * pvar * ((self.z_alpha + self.z_beta) / (self.diff)) ** 2\n","        )  # 双样本 比例 双侧\n","        self.power = stats.norm.pdf(\n","            abs(self.diff) / (2 * pvar / self.samples) ** (1 / 2) - self.z_alpha\n","        )\n","\n","    def samples_cal_mean_2n_1size(self):\n","        data = (\n","            self.data[self.data[self.group_list[0]] == \"control\"]\n","            .groupby(\"date\")[self.cal_key]\n","            .mean()\n","        )\n","\n","        sigma = data.std()\n","        mu_diff = data.mean() * self.diff\n","\n","        self.samples = (2 * sigma ** 2) * (\n","            (self.z_alpha + self.z_beta) / (mu_diff)\n","        ) ** 2\n","        z = (mu_diff) / (2 * sigma ** 2) ** (1 / 2) * (self.samples) * (1 / 2)\n","\n","        self.power = stats.norm.pdf(z - stats.norm.ppf(1 - self.alpha))\n","\n","    @staticmethod\n","    def sample_number_check(data, chack_num=1):\n","        sample_check = []\n","        for k, v in data.items():\n","            print(k, v)\n","            print(k, v, \">\", chack_num, v > chack_num)\n","            if v > chack_num:\n","                sample_check.append(k)\n","        return sample_check\n","\n","    @staticmethod\n","    def data_parper(\n","        data,\n","        sample_chack=1,\n","        cal_key=\"converted\",\n","        pares={\n","            \"group\": [\"control\", \"treatment\"],\n","            \"landing_page\": [\"old_page\", \"new_page\"],\n","        },\n","    ):\n","        group_list = list(pares.keys())\n","        sample_group = (\n","            data.groupby(by=[\"date\"] + group_list)\n","            .agg({cal_key: [\"count\", \"mean\", \"var\", \"sum\"]})[cal_key]\n","            .reset_index()\n","        )\n","        sample_group = sample_group[sample_group[\"count\"] >= sample_chack]  # 样本数计算\n","        sg = pd.merge(\n","            sample_group[\n","                (sample_group[group_list[0]] == pares[group_list[0]][0])\n","                & (sample_group[group_list[1]] == pares[group_list[1]][0])\n","            ][[\"date\", \"count\", \"mean\", \"var\", \"sum\"]],\n","            sample_group[\n","                (sample_group[group_list[0]] == pares[group_list[0]][1])\n","                & (sample_group[group_list[1]] == pares[group_list[1]][1])\n","            ][[\"date\", \"count\", \"mean\", \"var\", \"sum\"]],\n","            how=\"inner\",\n","            left_on=\"date\",\n","            right_on=\"date\",\n","            suffixes=(\"_control\", \"_treatment\"),\n","        )\n","        return sg\n","\n","    @staticmethod\n","    def res_cal(types, p, alpha, fum):\n","\n","        if p >= alpha:\n","            return \"p={: .5f} H_0 {} 无显著性差异\".format(\n","                p,\n","                \"{0}A>={0}B\".format(fum) if types != \"two\" else \"{0}A={0}B\".format(fum),\n","            )\n","        else:\n","            return \"p={: .5f} H_1 {} 有显著性差异\".format(\n","                p,\n","                \"{0}A<{0}B\".format(fum) if types != \"two\" else \"{0}A!={0}B\".format(fum),\n","            )\n","\n","    @staticmethod\n","    def res_run(\n","        data,\n","        target,\n","        cal_keys,\n","        scale_key,\n","    ):\n","\n","        data[\"p_left\"] = stats.norm.cdf(cal_keys, loc=target, scale=scale_key)\n","        data[\"p_right\"] = stats.norm.sf(cal_keys, loc=target, scale=scale_key)\n","\n","        tmp = data[[\"p_left\", \"p_right\"]].to_dict(\"records\")\n","\n","        data[\"p_two\"] = [\n","            i[\"p_left\"] * 2 if i[\"p_left\"] < 0.5 else i[\"p_right\"] for i in tmp\n","        ]\n","\n","        res = data[[\"date\"]]\n","\n","        res.loc[:, \"left\"] = data.p_left.apply(lambda x: res_cal(\"left\", x, 0.05, \"p\"))\n","        res.loc[:, \"right\"] = data.p_right.apply(\n","            lambda x: res_cal(\"right\", x, 0.05, \"p\")\n","        )\n","        res.loc[:, \"two\"] = data.p_two.apply(lambda x: res_cal(\"two\", x, 0.05, \"p\"))\n","        res.set_index(\"date\", inplace=True)\n","        return res\n","\n","    @staticmethod\n","    def percent_cal(\n","        data,\n","        target,\n","        key1=\"control\",\n","        key2=\"treatment\",\n","    ):\n","        data[\"target\"] = target\n","        data[\"p_diff\"] = data[\"mean_\" + key1] - data[\"mean_\" + key2]\n","\n","        data[\"var_diff\"] = (\n","            data[\"mean_\" + key1] * (1 - data[\"mean_\" + key1]) / data[\"count_\" + key1]\n","            + data[\"mean_\" + key2] * (1 - data[\"mean_\" + key2]) / data[\"count_\" + key2]\n","        ) ** (1 / 2)\n","        return data\n","\n","    @staticmethod\n","    def mean_cal(data, key1=\"control\", key2=\"treatment\", p_diff=0.05):\n","\n","        data = data.merge(\n","            dfgbuser.groupby(by=[\"date\"]).mean()[\"user_id\"],\n","            left_on=\"date\",\n","            right_on=\"date\",\n","        ).rename({\"user_id\": \"total_mean\"}, axis=1)\n","\n","        data[\"target\"] = data[\"total_mean\"] * p_diff\n","        data[\"p_diff\"] = data[\"mean_\" + key1] - data[\"mean_\" + key2]\n","        data[\"var_diff\"] = (\n","            data[\"var_\" + key1] / data[\"count_\" + key1]\n","            + data[\"var_\" + key2] / data[\"count_\" + key2]\n","        ) ** (1 / 2)\n","\n","        return data\n","\n","    def cal_process(self):\n","        if self.types == \"p\":\n","            self.samples_cal_Proportions_2n_1size()\n","            data = self.data_parper(\n","                self.data, self.samples, cal_key=self.cal_key, pares=self.pares\n","            )\n","            data = self.percent_cal(data, target=self.diff)\n","        elif self.types == \"m\":\n","            self.samples_cal_mean_2n_1size()\n","\n","            data = self.data_parper(\n","                self.data, self.samples, cal_key=self.cal_key, pares=self.pares\n","            )\n","            data = self.mean_cal(data, p_diff=self.diff)\n","\n","        res = self.res_run(\n","            data,\n","            target=data.target,\n","            cal_keys=data.p_diff,\n","            scale_key=data.var_diff,\n","        )\n","        return res\n","\n",""]},{"cell_type":"markdown","metadata":{},"source":["$ H_0 $ banner a ($ p_A $) 与 banner b ($ p_B $) $ H_0 : p_A - p_B \\ge 0.015 $\n","\n","$ H_1 $  banner a ($ p_A $) 与 banner b ($ p_B $)  $ H_0 : p_A - p_b \\lt 0.015 $\n","\n","因此为其设定一个差异值 $ \\delta $ 0.005 即  $ H_1 : p_A - p_b - \\delta \\lt 0 $ or  $ H_0 : p_A - p_b - \\delta \\ge 0 $"]},{"cell_type":"code","execution_count":18,"metadata":{},"outputs":[{"output_type":"stream","name":"stderr","text":["D:\\Anaconda3\\lib\\site-packages\\pandas\\core\\indexing.py:1596: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  self.obj[key] = _infer_fill_value(value)\nD:\\Anaconda3\\lib\\site-packages\\pandas\\core\\indexing.py:1743: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  isetter(ilocs[0], value)\n"]},{"output_type":"execute_result","data":{"text/plain":["date\n","2017-01-03     p= 0.00337 H_1 pA<pB 有显著性差异\n","2017-01-04     p= 0.04283 H_1 pA<pB 有显著性差异\n","2017-01-05    p= 0.11775 H_0 pA>=pB 无显著性差异\n","2017-01-06     p= 0.00002 H_1 pA<pB 有显著性差异\n","2017-01-07     p= 0.03465 H_1 pA<pB 有显著性差异\n","2017-01-08     p= 0.00134 H_1 pA<pB 有显著性差异\n","2017-01-09     p= 0.00851 H_1 pA<pB 有显著性差异\n","2017-01-10     p= 0.00000 H_1 pA<pB 有显著性差异\n","2017-01-11     p= 0.02180 H_1 pA<pB 有显著性差异\n","2017-01-12     p= 0.00370 H_1 pA<pB 有显著性差异\n","2017-01-13     p= 0.04664 H_1 pA<pB 有显著性差异\n","2017-01-14    p= 0.09564 H_0 pA>=pB 无显著性差异\n","2017-01-15    p= 0.07693 H_0 pA>=pB 无显著性差异\n","2017-01-16     p= 0.01490 H_1 pA<pB 有显著性差异\n","2017-01-17     p= 0.00039 H_1 pA<pB 有显著性差异\n","2017-01-18     p= 0.00476 H_1 pA<pB 有显著性差异\n","2017-01-19     p= 0.01483 H_1 pA<pB 有显著性差异\n","2017-01-20     p= 0.00089 H_1 pA<pB 有显著性差异\n","2017-01-21    p= 0.19987 H_0 pA>=pB 无显著性差异\n","2017-01-22     p= 0.00683 H_1 pA<pB 有显著性差异\n","2017-01-23     p= 0.03396 H_1 pA<pB 有显著性差异\n","Name: left, dtype: object"]},"metadata":{},"execution_count":18}],"source":["ABtest(\n","    data=df, types=\"p\", alpha=0.05, beta=0.2, diff=0.015, cal_key=\"converted\"\n",").cal_process()[\"left\"]"]},{"cell_type":"markdown","metadata":{},"source":["$ H_0 $ banner a ($ p_A $) 与 banner b ($ p_B $) $ H_0 : p_A - p_B \\ge -0.02 $\n","\n","$ H_1 $  banner a ($ p_A $) 与 banner b ($ p_B $)  $ H_0 : p_A - p_b \\lt -0.02 $"]},{"cell_type":"code","execution_count":19,"metadata":{},"outputs":[{"output_type":"stream","name":"stderr","text":["D:\\Anaconda3\\lib\\site-packages\\pandas\\core\\indexing.py:1596: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  self.obj[key] = _infer_fill_value(value)\nD:\\Anaconda3\\lib\\site-packages\\pandas\\core\\indexing.py:1743: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  isetter(ilocs[0], value)\n"]},{"output_type":"execute_result","data":{"text/plain":["date\n","2017-01-03    p= 0.99986 H_0 pA>=pB 无显著性差异\n","2017-01-04    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-05    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-06    p= 0.98293 H_0 pA>=pB 无显著性差异\n","2017-01-07    p= 0.99999 H_0 pA>=pB 无显著性差异\n","2017-01-08    p= 0.99938 H_0 pA>=pB 无显著性差异\n","2017-01-09    p= 0.99994 H_0 pA>=pB 无显著性差异\n","2017-01-10    p= 0.87721 H_0 pA>=pB 无显著性差异\n","2017-01-11    p= 0.99999 H_0 pA>=pB 无显著性差异\n","2017-01-12    p= 0.99972 H_0 pA>=pB 无显著性差异\n","2017-01-13    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-14    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-15    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-16    p= 0.99997 H_0 pA>=pB 无显著性差异\n","2017-01-17    p= 0.99660 H_0 pA>=pB 无显著性差异\n","2017-01-18    p= 0.99973 H_0 pA>=pB 无显著性差异\n","2017-01-19    p= 0.99997 H_0 pA>=pB 无显著性差异\n","2017-01-20    p= 0.99917 H_0 pA>=pB 无显著性差异\n","2017-01-21    p= 1.00000 H_0 pA>=pB 无显著性差异\n","2017-01-22    p= 0.99992 H_0 pA>=pB 无显著性差异\n","2017-01-23    p= 0.99999 H_0 pA>=pB 无显著性差异\n","2017-01-24    p= 0.98475 H_0 pA>=pB 无显著性差异\n","Name: left, dtype: object"]},"metadata":{},"execution_count":19}],"source":["ABtest(\n","    data=df, types=\"p\", alpha=0.05, beta=0.2, diff=-0.02, cal_key=\"converted\"\n",").cal_process()[\"left\"]"]},{"cell_type":"markdown","metadata":{},"source":["$ H_0 $ banner_a_pv ($ \\mu_A $) 与 banner_b_pv ($ \\mu_B $) $ H_0 : \\mu_A - \\mu_B \\ge \\mu * 0.05 $\n","\n","$ H_1 $  banner_a_pv ($ \\mu_A $) 与 banner_b_pv ($ \\mu_B $)  $ H_0 : \\mu_A - \\mu_b \\lt \\mu * 0.05 $\n","\n","因此为其设定一个差异值 $ \\delta $ 0.02 即  $ H_1 : \\mu_A - \\mu_b - \\delta \\lt 0 $ or  $ H_0 : \\mu_A - \\mu_b - \\delta \\ge 0 $\n",""]},{"cell_type":"code","execution_count":20,"metadata":{},"outputs":[{"output_type":"stream","name":"stderr","text":["D:\\Anaconda3\\lib\\site-packages\\pandas\\core\\indexing.py:1596: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  self.obj[key] = _infer_fill_value(value)\nD:\\Anaconda3\\lib\\site-packages\\pandas\\core\\indexing.py:1743: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  isetter(ilocs[0], value)\n"]},{"output_type":"execute_result","data":{"text/plain":["                                    left                         right  \\\n","date                                                                     \n","2017-01-02  p= 0.39891 H_0 pA>=pB 无显著性差异  p= 0.60109 H_0 pA>=pB 无显著性差异   \n","2017-01-03   p= 0.04150 H_1 pA<pB 有显著性差异  p= 0.95850 H_0 pA>=pB 无显著性差异   \n","2017-01-04   p= 0.03750 H_1 pA<pB 有显著性差异  p= 0.96250 H_0 pA>=pB 无显著性差异   \n","2017-01-05   p= 0.01844 H_1 pA<pB 有显著性差异  p= 0.98156 H_0 pA>=pB 无显著性差异   \n","2017-01-06   p= 0.00343 H_1 pA<pB 有显著性差异  p= 0.99657 H_0 pA>=pB 无显著性差异   \n","2017-01-07  p= 0.07017 H_0 pA>=pB 无显著性差异  p= 0.92983 H_0 pA>=pB 无显著性差异   \n","2017-01-08  p= 0.05145 H_0 pA>=pB 无显著性差异  p= 0.94855 H_0 pA>=pB 无显著性差异   \n","2017-01-09  p= 0.07111 H_0 pA>=pB 无显著性差异  p= 0.92889 H_0 pA>=pB 无显著性差异   \n","2017-01-10   p= 0.03435 H_1 pA<pB 有显著性差异  p= 0.96565 H_0 pA>=pB 无显著性差异   \n","2017-01-11  p= 0.07656 H_0 pA>=pB 无显著性差异  p= 0.92344 H_0 pA>=pB 无显著性差异   \n","2017-01-12   p= 0.00699 H_1 pA<pB 有显著性差异  p= 0.99301 H_0 pA>=pB 无显著性差异   \n","2017-01-13  p= 0.14084 H_0 pA>=pB 无显著性差异  p= 0.85916 H_0 pA>=pB 无显著性差异   \n","2017-01-14   p= 0.02623 H_1 pA<pB 有显著性差异  p= 0.97377 H_0 pA>=pB 无显著性差异   \n","2017-01-15  p= 0.48162 H_0 pA>=pB 无显著性差异  p= 0.51838 H_0 pA>=pB 无显著性差异   \n","2017-01-16  p= 0.16496 H_0 pA>=pB 无显著性差异  p= 0.83504 H_0 pA>=pB 无显著性差异   \n","2017-01-17  p= 0.20606 H_0 pA>=pB 无显著性差异  p= 0.79394 H_0 pA>=pB 无显著性差异   \n","2017-01-18   p= 0.00967 H_1 pA<pB 有显著性差异  p= 0.99033 H_0 pA>=pB 无显著性差异   \n","2017-01-19  p= 0.07016 H_0 pA>=pB 无显著性差异  p= 0.92984 H_0 pA>=pB 无显著性差异   \n","2017-01-20   p= 0.00646 H_1 pA<pB 有显著性差异  p= 0.99354 H_0 pA>=pB 无显著性差异   \n","2017-01-21  p= 0.58752 H_0 pA>=pB 无显著性差异  p= 0.41248 H_0 pA>=pB 无显著性差异   \n","2017-01-22   p= 0.01227 H_1 pA<pB 有显著性差异  p= 0.98773 H_0 pA>=pB 无显著性差异   \n","2017-01-23  p= 0.20975 H_0 pA>=pB 无显著性差异  p= 0.79025 H_0 pA>=pB 无显著性差异   \n","2017-01-24  p= 0.43655 H_0 pA>=pB 无显著性差异  p= 0.56345 H_0 pA>=pB 无显著性差异   \n","\n","                                     two  \n","date                                      \n","2017-01-02   p= 0.79782 H_0 pA=pB 无显著性差异  \n","2017-01-03   p= 0.08301 H_0 pA=pB 无显著性差异  \n","2017-01-04   p= 0.07499 H_0 pA=pB 无显著性差异  \n","2017-01-05  p= 0.03689 H_1 pA!=pB 有显著性差异  \n","2017-01-06  p= 0.00687 H_1 pA!=pB 有显著性差异  \n","2017-01-07   p= 0.14035 H_0 pA=pB 无显著性差异  \n","2017-01-08   p= 0.10290 H_0 pA=pB 无显著性差异  \n","2017-01-09   p= 0.14222 H_0 pA=pB 无显著性差异  \n","2017-01-10   p= 0.06871 H_0 pA=pB 无显著性差异  \n","2017-01-11   p= 0.15312 H_0 pA=pB 无显著性差异  \n","2017-01-12  p= 0.01399 H_1 pA!=pB 有显著性差异  \n","2017-01-13   p= 0.28167 H_0 pA=pB 无显著性差异  \n","2017-01-14   p= 0.05246 H_0 pA=pB 无显著性差异  \n","2017-01-15   p= 0.96324 H_0 pA=pB 无显著性差异  \n","2017-01-16   p= 0.32991 H_0 pA=pB 无显著性差异  \n","2017-01-17   p= 0.41211 H_0 pA=pB 无显著性差异  \n","2017-01-18  p= 0.01933 H_1 pA!=pB 有显著性差异  \n","2017-01-19   p= 0.14032 H_0 pA=pB 无显著性差异  \n","2017-01-20  p= 0.01292 H_1 pA!=pB 有显著性差异  \n","2017-01-21   p= 0.41248 H_0 pA=pB 无显著性差异  \n","2017-01-22  p= 0.02455 H_1 pA!=pB 有显著性差异  \n","2017-01-23   p= 0.41950 H_0 pA=pB 无显著性差异  \n","2017-01-24   p= 0.87309 H_0 pA=pB 无显著性差异  "],"text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>left</th>\n      <th>right</th>\n      <th>two</th>\n    </tr>\n    <tr>\n      <th>date</th>\n      <th></th>\n      <th></th>\n      <th></th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>2017-01-02</th>\n      <td>p= 0.39891 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.60109 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.79782 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-03</th>\n      <td>p= 0.04150 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.95850 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.08301 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-04</th>\n      <td>p= 0.03750 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.96250 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.07499 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-05</th>\n      <td>p= 0.01844 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.98156 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.03689 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-06</th>\n      <td>p= 0.00343 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.99657 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.00687 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-07</th>\n      <td>p= 0.07017 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.92983 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.14035 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-08</th>\n      <td>p= 0.05145 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.94855 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.10290 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-09</th>\n      <td>p= 0.07111 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.92889 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.14222 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-10</th>\n      <td>p= 0.03435 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.96565 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.06871 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-11</th>\n      <td>p= 0.07656 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.92344 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.15312 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-12</th>\n      <td>p= 0.00699 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.99301 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.01399 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-13</th>\n      <td>p= 0.14084 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.85916 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.28167 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-14</th>\n      <td>p= 0.02623 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.97377 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.05246 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-15</th>\n      <td>p= 0.48162 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.51838 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.96324 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-16</th>\n      <td>p= 0.16496 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.83504 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.32991 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-17</th>\n      <td>p= 0.20606 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.79394 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.41211 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-18</th>\n      <td>p= 0.00967 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.99033 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.01933 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-19</th>\n      <td>p= 0.07016 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.92984 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.14032 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-20</th>\n      <td>p= 0.00646 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.99354 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.01292 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-21</th>\n      <td>p= 0.58752 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.41248 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.41248 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-22</th>\n      <td>p= 0.01227 H_1 pA&lt;pB 有显著性差异</td>\n      <td>p= 0.98773 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.02455 H_1 pA!=pB 有显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-23</th>\n      <td>p= 0.20975 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.79025 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.41950 H_0 pA=pB 无显著性差异</td>\n    </tr>\n    <tr>\n      <th>2017-01-24</th>\n      <td>p= 0.43655 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.56345 H_0 pA&gt;=pB 无显著性差异</td>\n      <td>p= 0.87309 H_0 pA=pB 无显著性差异</td>\n    </tr>\n  </tbody>\n</table>\n</div>"},"metadata":{},"execution_count":20}],"source":["dfgbuser = df.groupby(by=[\"date\", \"hour\", \"group\", \"landing_page\"]).agg(\n","    {\"user_id\": \"count\", \"converted\": \"sum\"}\n",")\n","dfgbuser.reset_index(inplace=True)\n","dfgbuser\n","\n","ABtest(\n","    data=dfgbuser, types=\"m\", alpha=0.05, beta=0.2, diff=0.05, cal_key=\"user_id\"\n",").cal_process()\n",""]}],"nbformat":4,"nbformat_minor":2,"metadata":{"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":3},"orig_nbformat":4}}