{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Recommender Systems 2020/21\n",
    "\n",
    "## Practice 4 - Building an ItemKNN Recommender From Scratch\n",
    "\n",
    "This practice session guides students through creating a recommender system from scratch, covering data loading, preprocessing, model creation, evaluation, hyperparameter tuning, and a sample submission to the competition.\n",
    "\n",
    "Outline:\n",
    "- Data Loading with Pandas (MovieLens 10M, link: http://files.grouplens.org/datasets/movielens/ml-10m.zip)\n",
    "- Data Preprocessing\n",
    "- Dataset splitting in Train, Validation and Testing\n",
    "- Similarity Measures\n",
    "- Collaborative Item KNN\n",
    "- Evaluation Metrics\n",
    "- Evaluation Procedure\n",
    "- Hyperparameter Tuning\n",
    "- Submission to competition"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "__author__ = 'Fernando Benjamín Pérez Maurera'\n",
    "__credits__ = ['Fernando Benjamín Pérez Maurera']\n",
    "__license__ = 'MIT'\n",
    "__version__ = '0.1.0'\n",
    "__maintainer__ = 'Fernando Benjamín Pérez Maurera'\n",
    "__email__ = 'fernandobenjamin.perez@polimi.it'\n",
    "__status__ = 'Dev'\n",
    "\n",
    "import os\n",
    "from typing import Tuple, Callable, Dict, Optional, List\n",
    "\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "import scipy.sparse as sp\n",
    "\n",
    "from sklearn.model_selection import train_test_split\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Dataset Loading with pandas\n",
    "\n",
    "The MovieLens 10M dataset is a collection of ratings given by users to items. It is stored in a columnar `.dat` file that uses `::` as the separator between attributes, and every row follows this structure: `<user_id>::<item_id>::<rating>::<timestamp>`.\n",
    "\n",
    "The pandas function `read_csv` provides a convenient and fast interface for loading tabular data like this. For correctness and performance we pass the separator `::`, the column names `[\"user_id\", \"item_id\", \"ratings\", \"timestamp\"]`, and the type of each attribute in the `dtype` parameter. Note that because `::` is longer than one character, pandas interprets it as a regular expression and falls back to the slower `python` parsing engine."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "pd.read_csv?"
   ]
  },
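  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check of the `<user_id>::<item_id>::<rating>::<timestamp>` format, we can parse a couple of hand-written sample rows (hypothetical values, not taken from the dataset) with the same arguments we will use on the real file. Since `::` is longer than one character, we pass `engine=\"python\"` explicitly:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from io import StringIO\n",
    "\n",
    "# Two made-up rows in the same `::`-separated layout as ratings.dat.\n",
    "sample = StringIO(\"1::122::5::838985046\\n1::185::5::838983525\")\n",
    "sample_df = pd.read_csv(sample,\n",
    "                        sep=\"::\",\n",
    "                        engine=\"python\",\n",
    "                        names=[\"user_id\", \"item_id\", \"ratings\", \"timestamp\"],\n",
    "                        header=None)\n",
    "sample_df"
   ]
  },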
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [],
   "source": [
    "def load_data():\n",
    "    return pd.read_csv(\"./data/Movielens_10M/ml-10M100K/ratings.dat\",\n",
    "                       sep=\"::\",\n",
    "                       engine=\"python\",\n",
    "                       names=[\"user_id\", \"item_id\", \"ratings\", \"timestamp\"],\n",
    "                       header=None,\n",
    "                       dtype={\"user_id\": np.int32,\n",
    "                              \"item_id\": np.int32,\n",
    "                              \"ratings\": np.int32,\n",
    "                              \"timestamp\": np.int64})"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "scrolled": true
   },
   "outputs": [],
   "source": [
    "ratings = load_data()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>user_id</th>\n",
       "      <th>item_id</th>\n",
       "      <th>ratings</th>\n",
       "      <th>timestamp</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>122</td>\n",
       "      <td>5</td>\n",
       "      <td>838985046</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>1</td>\n",
       "      <td>185</td>\n",
       "      <td>5</td>\n",
       "      <td>838983525</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>1</td>\n",
       "      <td>231</td>\n",
       "      <td>5</td>\n",
       "      <td>838983392</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>1</td>\n",
       "      <td>292</td>\n",
       "      <td>5</td>\n",
       "      <td>838983421</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>1</td>\n",
       "      <td>316</td>\n",
       "      <td>5</td>\n",
       "      <td>838983392</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>5</th>\n",
       "      <td>1</td>\n",
       "      <td>329</td>\n",
       "      <td>5</td>\n",
       "      <td>838983392</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>6</th>\n",
       "      <td>1</td>\n",
       "      <td>355</td>\n",
       "      <td>5</td>\n",
       "      <td>838984474</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>7</th>\n",
       "      <td>1</td>\n",
       "      <td>356</td>\n",
       "      <td>5</td>\n",
       "      <td>838983653</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>8</th>\n",
       "      <td>1</td>\n",
       "      <td>362</td>\n",
       "      <td>5</td>\n",
       "      <td>838984885</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>9</th>\n",
       "      <td>1</td>\n",
       "      <td>364</td>\n",
       "      <td>5</td>\n",
       "      <td>838983707</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10</th>\n",
       "      <td>1</td>\n",
       "      <td>370</td>\n",
       "      <td>5</td>\n",
       "      <td>838984596</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>11</th>\n",
       "      <td>1</td>\n",
       "      <td>377</td>\n",
       "      <td>5</td>\n",
       "      <td>838983834</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>12</th>\n",
       "      <td>1</td>\n",
       "      <td>420</td>\n",
       "      <td>5</td>\n",
       "      <td>838983834</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>13</th>\n",
       "      <td>1</td>\n",
       "      <td>466</td>\n",
       "      <td>5</td>\n",
       "      <td>838984679</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>14</th>\n",
       "      <td>1</td>\n",
       "      <td>480</td>\n",
       "      <td>5</td>\n",
       "      <td>838983653</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>15</th>\n",
       "      <td>1</td>\n",
       "      <td>520</td>\n",
       "      <td>5</td>\n",
       "      <td>838984679</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>16</th>\n",
       "      <td>1</td>\n",
       "      <td>539</td>\n",
       "      <td>5</td>\n",
       "      <td>838984068</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>17</th>\n",
       "      <td>1</td>\n",
       "      <td>586</td>\n",
       "      <td>5</td>\n",
       "      <td>838984068</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>18</th>\n",
       "      <td>1</td>\n",
       "      <td>588</td>\n",
       "      <td>5</td>\n",
       "      <td>838983339</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>19</th>\n",
       "      <td>1</td>\n",
       "      <td>589</td>\n",
       "      <td>5</td>\n",
       "      <td>838983778</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>20</th>\n",
       "      <td>1</td>\n",
       "      <td>594</td>\n",
       "      <td>5</td>\n",
       "      <td>838984679</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>21</th>\n",
       "      <td>1</td>\n",
       "      <td>616</td>\n",
       "      <td>5</td>\n",
       "      <td>838984941</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>22</th>\n",
       "      <td>2</td>\n",
       "      <td>110</td>\n",
       "      <td>5</td>\n",
       "      <td>868245777</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>23</th>\n",
       "      <td>2</td>\n",
       "      <td>151</td>\n",
       "      <td>3</td>\n",
       "      <td>868246450</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>24</th>\n",
       "      <td>2</td>\n",
       "      <td>260</td>\n",
       "      <td>5</td>\n",
       "      <td>868244562</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>25</th>\n",
       "      <td>2</td>\n",
       "      <td>376</td>\n",
       "      <td>3</td>\n",
       "      <td>868245920</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>26</th>\n",
       "      <td>2</td>\n",
       "      <td>539</td>\n",
       "      <td>3</td>\n",
       "      <td>868246262</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>27</th>\n",
       "      <td>2</td>\n",
       "      <td>590</td>\n",
       "      <td>5</td>\n",
       "      <td>868245608</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>28</th>\n",
       "      <td>2</td>\n",
       "      <td>648</td>\n",
       "      <td>2</td>\n",
       "      <td>868244699</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>29</th>\n",
       "      <td>2</td>\n",
       "      <td>719</td>\n",
       "      <td>3</td>\n",
       "      <td>868246191</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>...</th>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000024</th>\n",
       "      <td>71567</td>\n",
       "      <td>1396</td>\n",
       "      <td>3</td>\n",
       "      <td>912580688</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000025</th>\n",
       "      <td>71567</td>\n",
       "      <td>1407</td>\n",
       "      <td>2</td>\n",
       "      <td>912578457</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000026</th>\n",
       "      <td>71567</td>\n",
       "      <td>1527</td>\n",
       "      <td>5</td>\n",
       "      <td>912580647</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000027</th>\n",
       "      <td>71567</td>\n",
       "      <td>1580</td>\n",
       "      <td>3</td>\n",
       "      <td>912580688</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000028</th>\n",
       "      <td>71567</td>\n",
       "      <td>1584</td>\n",
       "      <td>3</td>\n",
       "      <td>912580647</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000029</th>\n",
       "      <td>71567</td>\n",
       "      <td>1598</td>\n",
       "      <td>2</td>\n",
       "      <td>912649143</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000030</th>\n",
       "      <td>71567</td>\n",
       "      <td>1690</td>\n",
       "      <td>3</td>\n",
       "      <td>912578406</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000031</th>\n",
       "      <td>71567</td>\n",
       "      <td>1717</td>\n",
       "      <td>1</td>\n",
       "      <td>912578457</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000032</th>\n",
       "      <td>71567</td>\n",
       "      <td>1721</td>\n",
       "      <td>4</td>\n",
       "      <td>912649271</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000033</th>\n",
       "      <td>71567</td>\n",
       "      <td>1748</td>\n",
       "      <td>3</td>\n",
       "      <td>912580647</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000034</th>\n",
       "      <td>71567</td>\n",
       "      <td>1769</td>\n",
       "      <td>3</td>\n",
       "      <td>912649037</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000035</th>\n",
       "      <td>71567</td>\n",
       "      <td>1792</td>\n",
       "      <td>2</td>\n",
       "      <td>912649171</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000036</th>\n",
       "      <td>71567</td>\n",
       "      <td>1805</td>\n",
       "      <td>4</td>\n",
       "      <td>912649010</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000037</th>\n",
       "      <td>71567</td>\n",
       "      <td>1833</td>\n",
       "      <td>3</td>\n",
       "      <td>912649171</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000038</th>\n",
       "      <td>71567</td>\n",
       "      <td>1876</td>\n",
       "      <td>3</td>\n",
       "      <td>912580722</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000039</th>\n",
       "      <td>71567</td>\n",
       "      <td>1909</td>\n",
       "      <td>2</td>\n",
       "      <td>912580688</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000040</th>\n",
       "      <td>71567</td>\n",
       "      <td>1917</td>\n",
       "      <td>4</td>\n",
       "      <td>912580787</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000041</th>\n",
       "      <td>71567</td>\n",
       "      <td>1920</td>\n",
       "      <td>4</td>\n",
       "      <td>912578247</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000042</th>\n",
       "      <td>71567</td>\n",
       "      <td>1982</td>\n",
       "      <td>1</td>\n",
       "      <td>912580553</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000043</th>\n",
       "      <td>71567</td>\n",
       "      <td>1983</td>\n",
       "      <td>1</td>\n",
       "      <td>912580553</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000044</th>\n",
       "      <td>71567</td>\n",
       "      <td>1984</td>\n",
       "      <td>1</td>\n",
       "      <td>912580553</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000045</th>\n",
       "      <td>71567</td>\n",
       "      <td>1985</td>\n",
       "      <td>1</td>\n",
       "      <td>912580553</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000046</th>\n",
       "      <td>71567</td>\n",
       "      <td>1986</td>\n",
       "      <td>1</td>\n",
       "      <td>912580553</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000047</th>\n",
       "      <td>71567</td>\n",
       "      <td>2012</td>\n",
       "      <td>3</td>\n",
       "      <td>912580722</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000048</th>\n",
       "      <td>71567</td>\n",
       "      <td>2028</td>\n",
       "      <td>5</td>\n",
       "      <td>912580344</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000049</th>\n",
       "      <td>71567</td>\n",
       "      <td>2107</td>\n",
       "      <td>1</td>\n",
       "      <td>912580553</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000050</th>\n",
       "      <td>71567</td>\n",
       "      <td>2126</td>\n",
       "      <td>2</td>\n",
       "      <td>912649143</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000051</th>\n",
       "      <td>71567</td>\n",
       "      <td>2294</td>\n",
       "      <td>5</td>\n",
       "      <td>912577968</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000052</th>\n",
       "      <td>71567</td>\n",
       "      <td>2338</td>\n",
       "      <td>2</td>\n",
       "      <td>912578016</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000053</th>\n",
       "      <td>71567</td>\n",
       "      <td>2384</td>\n",
       "      <td>2</td>\n",
       "      <td>912578173</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>10000054 rows × 4 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "          user_id  item_id  ratings  timestamp\n",
       "0               1      122        5  838985046\n",
       "1               1      185        5  838983525\n",
       "2               1      231        5  838983392\n",
       "3               1      292        5  838983421\n",
       "4               1      316        5  838983392\n",
       "5               1      329        5  838983392\n",
       "6               1      355        5  838984474\n",
       "7               1      356        5  838983653\n",
       "8               1      362        5  838984885\n",
       "9               1      364        5  838983707\n",
       "10              1      370        5  838984596\n",
       "11              1      377        5  838983834\n",
       "12              1      420        5  838983834\n",
       "13              1      466        5  838984679\n",
       "14              1      480        5  838983653\n",
       "15              1      520        5  838984679\n",
       "16              1      539        5  838984068\n",
       "17              1      586        5  838984068\n",
       "18              1      588        5  838983339\n",
       "19              1      589        5  838983778\n",
       "20              1      594        5  838984679\n",
       "21              1      616        5  838984941\n",
       "22              2      110        5  868245777\n",
       "23              2      151        3  868246450\n",
       "24              2      260        5  868244562\n",
       "25              2      376        3  868245920\n",
       "26              2      539        3  868246262\n",
       "27              2      590        5  868245608\n",
       "28              2      648        2  868244699\n",
       "29              2      719        3  868246191\n",
       "...           ...      ...      ...        ...\n",
       "10000024    71567     1396        3  912580688\n",
       "10000025    71567     1407        2  912578457\n",
       "10000026    71567     1527        5  912580647\n",
       "10000027    71567     1580        3  912580688\n",
       "10000028    71567     1584        3  912580647\n",
       "10000029    71567     1598        2  912649143\n",
       "10000030    71567     1690        3  912578406\n",
       "10000031    71567     1717        1  912578457\n",
       "10000032    71567     1721        4  912649271\n",
       "10000033    71567     1748        3  912580647\n",
       "10000034    71567     1769        3  912649037\n",
       "10000035    71567     1792        2  912649171\n",
       "10000036    71567     1805        4  912649010\n",
       "10000037    71567     1833        3  912649171\n",
       "10000038    71567     1876        3  912580722\n",
       "10000039    71567     1909        2  912580688\n",
       "10000040    71567     1917        4  912580787\n",
       "10000041    71567     1920        4  912578247\n",
       "10000042    71567     1982        1  912580553\n",
       "10000043    71567     1983        1  912580553\n",
       "10000044    71567     1984        1  912580553\n",
       "10000045    71567     1985        1  912580553\n",
       "10000046    71567     1986        1  912580553\n",
       "10000047    71567     2012        3  912580722\n",
       "10000048    71567     2028        5  912580344\n",
       "10000049    71567     2107        1  912580553\n",
       "10000050    71567     2126        2  912649143\n",
       "10000051    71567     2294        5  912577968\n",
       "10000052    71567     2338        2  912578016\n",
       "10000053    71567     2384        2  912578173\n",
       "\n",
       "[10000054 rows x 4 columns]"
      ]
     },
     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "ratings"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Data Preprocessing\n",
    "\n",
    "This section wors with the previously-loaded ratings dataset and extracts the number of users, number of items, and min/max user/item identifiers. Exploring and understanding the data is an essential step prior fitting any recommender/algorithm. \n",
    "\n",
    "In this specific case, we discover that item identifiers range from 1 to 65133, yet there are only 10677 distinct items (meaning that roughly 5/6 of the item identifiers never appear in the dataset). To ease further calculations, we create new contiguous user/item identifiers and assign exactly one of them to each user/item. To keep track of these new mappings, we add them to the original dataframe using the `pd.merge` function."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [],
   "source": [
    "pd.merge?"
   ]
  },
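  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before applying the remapping to the full dataset, here is a minimal sketch of the idea on a toy dataframe with made-up identifiers: we enumerate the unique values with `np.arange` and attach the resulting contiguous identifiers through an inner `pd.merge`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "toy = pd.DataFrame({\"item_id\": [122, 65133, 122, 370]})\n",
    "toy_unique_items = toy.item_id.unique()\n",
    "\n",
    "# One new contiguous identifier (0, 1, 2, ...) per distinct original item_id.\n",
    "toy_mapping = pd.DataFrame({\"mapped_item_id\": np.arange(toy_unique_items.size),\n",
    "                            \"item_id\": toy_unique_items})\n",
    "\n",
    "pd.merge(left=toy, right=toy_mapping, how=\"inner\", on=\"item_id\")"
   ]
  },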
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [],
   "source": [
    "def preprocess_data(ratings: pd.DataFrame) -> pd.DataFrame:\n",
    "    unique_users = ratings.user_id.unique()\n",
    "    unique_items = ratings.item_id.unique()\n",
    "    \n",
    "    num_users, min_user_id, max_user_id = unique_users.size, unique_users.min(), unique_users.max()\n",
    "    num_items, min_item_id, max_item_id = unique_items.size, unique_items.min(), unique_items.max()\n",
    "    \n",
    "    print(num_users, min_user_id, max_user_id)\n",
    "    print(num_items, min_item_id, max_item_id)\n",
    "    \n",
    "    mapping_user_id = pd.DataFrame({\"mapped_user_id\": np.arange(num_users), \"user_id\": unique_users})\n",
    "    mapping_item_id = pd.DataFrame({\"mapped_item_id\": np.arange(num_items), \"item_id\": unique_items})\n",
    "    \n",
    "    ratings = pd.merge(left=ratings, \n",
    "                       right=mapping_user_id,\n",
    "                       how=\"inner\",\n",
    "                       on=\"user_id\")\n",
    "    \n",
    "    ratings = pd.merge(left=ratings, \n",
    "                       right=mapping_item_id,\n",
    "                       how=\"inner\",\n",
    "                       on=\"item_id\")\n",
    "    \n",
    "    return ratings"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "69878 1 71567\n",
      "10677 1 65133\n"
     ]
    }
   ],
   "source": [
    "ratings = preprocess_data(ratings)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>user_id</th>\n",
       "      <th>item_id</th>\n",
       "      <th>ratings</th>\n",
       "      <th>timestamp</th>\n",
       "      <th>mapped_user_id</th>\n",
       "      <th>mapped_item_id</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>122</td>\n",
       "      <td>5</td>\n",
       "      <td>838985046</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>139</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>974302621</td>\n",
       "      <td>128</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>149</td>\n",
       "      <td>122</td>\n",
       "      <td>2</td>\n",
       "      <td>1112342322</td>\n",
       "      <td>136</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>182</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>943458784</td>\n",
       "      <td>168</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>215</td>\n",
       "      <td>122</td>\n",
       "      <td>4</td>\n",
       "      <td>1102493547</td>\n",
       "      <td>201</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>5</th>\n",
       "      <td>217</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>844429650</td>\n",
       "      <td>203</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>6</th>\n",
       "      <td>281</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>844437024</td>\n",
       "      <td>265</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>7</th>\n",
       "      <td>326</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>838997566</td>\n",
       "      <td>307</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>8</th>\n",
       "      <td>351</td>\n",
       "      <td>122</td>\n",
       "      <td>1</td>\n",
       "      <td>955831012</td>\n",
       "      <td>332</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>9</th>\n",
       "      <td>357</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>945884437</td>\n",
       "      <td>338</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10</th>\n",
       "      <td>426</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>1084374941</td>\n",
       "      <td>405</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>11</th>\n",
       "      <td>456</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>833003616</td>\n",
       "      <td>434</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>12</th>\n",
       "      <td>459</td>\n",
       "      <td>122</td>\n",
       "      <td>2</td>\n",
       "      <td>1215452801</td>\n",
       "      <td>436</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>13</th>\n",
       "      <td>494</td>\n",
       "      <td>122</td>\n",
       "      <td>4</td>\n",
       "      <td>844447512</td>\n",
       "      <td>471</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>14</th>\n",
       "      <td>517</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>830352936</td>\n",
       "      <td>493</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>15</th>\n",
       "      <td>524</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>1111549740</td>\n",
       "      <td>499</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>16</th>\n",
       "      <td>556</td>\n",
       "      <td>122</td>\n",
       "      <td>2</td>\n",
       "      <td>839005924</td>\n",
       "      <td>529</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>17</th>\n",
       "      <td>588</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>844450138</td>\n",
       "      <td>561</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>18</th>\n",
       "      <td>589</td>\n",
       "      <td>122</td>\n",
       "      <td>1</td>\n",
       "      <td>857985124</td>\n",
       "      <td>562</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>19</th>\n",
       "      <td>590</td>\n",
       "      <td>122</td>\n",
       "      <td>4</td>\n",
       "      <td>948332956</td>\n",
       "      <td>563</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>20</th>\n",
       "      <td>601</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>868323565</td>\n",
       "      <td>574</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>21</th>\n",
       "      <td>621</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>1095040292</td>\n",
       "      <td>592</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>22</th>\n",
       "      <td>634</td>\n",
       "      <td>122</td>\n",
       "      <td>2</td>\n",
       "      <td>913061756</td>\n",
       "      <td>604</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>23</th>\n",
       "      <td>672</td>\n",
       "      <td>122</td>\n",
       "      <td>2</td>\n",
       "      <td>981667042</td>\n",
       "      <td>642</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>24</th>\n",
       "      <td>701</td>\n",
       "      <td>122</td>\n",
       "      <td>4</td>\n",
       "      <td>831889761</td>\n",
       "      <td>667</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>25</th>\n",
       "      <td>719</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>834939568</td>\n",
       "      <td>683</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>26</th>\n",
       "      <td>745</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>1039353350</td>\n",
       "      <td>708</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>27</th>\n",
       "      <td>757</td>\n",
       "      <td>122</td>\n",
       "      <td>4</td>\n",
       "      <td>869227001</td>\n",
       "      <td>720</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>28</th>\n",
       "      <td>775</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>834942788</td>\n",
       "      <td>737</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>29</th>\n",
       "      <td>780</td>\n",
       "      <td>122</td>\n",
       "      <td>3</td>\n",
       "      <td>945965379</td>\n",
       "      <td>741</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>...</th>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000024</th>\n",
       "      <td>61880</td>\n",
       "      <td>26612</td>\n",
       "      <td>3</td>\n",
       "      <td>1230133819</td>\n",
       "      <td>60521</td>\n",
       "      <td>10657</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000025</th>\n",
       "      <td>63134</td>\n",
       "      <td>26612</td>\n",
       "      <td>1</td>\n",
       "      <td>1230598449</td>\n",
       "      <td>61748</td>\n",
       "      <td>10657</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000026</th>\n",
       "      <td>62079</td>\n",
       "      <td>7754</td>\n",
       "      <td>2</td>\n",
       "      <td>1211552590</td>\n",
       "      <td>60719</td>\n",
       "      <td>10658</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000027</th>\n",
       "      <td>67385</td>\n",
       "      <td>7754</td>\n",
       "      <td>3</td>\n",
       "      <td>1163555772</td>\n",
       "      <td>65888</td>\n",
       "      <td>10658</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000028</th>\n",
       "      <td>62245</td>\n",
       "      <td>59044</td>\n",
       "      <td>3</td>\n",
       "      <td>1228925653</td>\n",
       "      <td>60884</td>\n",
       "      <td>10659</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000029</th>\n",
       "      <td>70905</td>\n",
       "      <td>59044</td>\n",
       "      <td>1</td>\n",
       "      <td>1221163314</td>\n",
       "      <td>69242</td>\n",
       "      <td>10659</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000030</th>\n",
       "      <td>62332</td>\n",
       "      <td>31090</td>\n",
       "      <td>2</td>\n",
       "      <td>1159394437</td>\n",
       "      <td>60971</td>\n",
       "      <td>10660</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000031</th>\n",
       "      <td>70899</td>\n",
       "      <td>31090</td>\n",
       "      <td>4</td>\n",
       "      <td>1109628288</td>\n",
       "      <td>69236</td>\n",
       "      <td>10660</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000032</th>\n",
       "      <td>62510</td>\n",
       "      <td>64275</td>\n",
       "      <td>5</td>\n",
       "      <td>1231021319</td>\n",
       "      <td>61140</td>\n",
       "      <td>10661</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000033</th>\n",
       "      <td>62522</td>\n",
       "      <td>64953</td>\n",
       "      <td>3</td>\n",
       "      <td>1230783543</td>\n",
       "      <td>61152</td>\n",
       "      <td>10662</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000034</th>\n",
       "      <td>62880</td>\n",
       "      <td>5814</td>\n",
       "      <td>4</td>\n",
       "      <td>1069273488</td>\n",
       "      <td>61500</td>\n",
       "      <td>10663</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000035</th>\n",
       "      <td>69312</td>\n",
       "      <td>5814</td>\n",
       "      <td>3</td>\n",
       "      <td>1099195832</td>\n",
       "      <td>67714</td>\n",
       "      <td>10663</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000036</th>\n",
       "      <td>63134</td>\n",
       "      <td>54318</td>\n",
       "      <td>2</td>\n",
       "      <td>1222631928</td>\n",
       "      <td>61748</td>\n",
       "      <td>10664</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000037</th>\n",
       "      <td>64621</td>\n",
       "      <td>39429</td>\n",
       "      <td>2</td>\n",
       "      <td>1201248182</td>\n",
       "      <td>63203</td>\n",
       "      <td>10665</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000038</th>\n",
       "      <td>64621</td>\n",
       "      <td>62799</td>\n",
       "      <td>3</td>\n",
       "      <td>1225729884</td>\n",
       "      <td>63203</td>\n",
       "      <td>10666</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000039</th>\n",
       "      <td>67385</td>\n",
       "      <td>62799</td>\n",
       "      <td>3</td>\n",
       "      <td>1225922673</td>\n",
       "      <td>65888</td>\n",
       "      <td>10666</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000040</th>\n",
       "      <td>67542</td>\n",
       "      <td>62799</td>\n",
       "      <td>3</td>\n",
       "      <td>1225395734</td>\n",
       "      <td>66039</td>\n",
       "      <td>10666</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000041</th>\n",
       "      <td>65098</td>\n",
       "      <td>7823</td>\n",
       "      <td>4</td>\n",
       "      <td>1111474117</td>\n",
       "      <td>63675</td>\n",
       "      <td>10667</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000042</th>\n",
       "      <td>66220</td>\n",
       "      <td>3195</td>\n",
       "      <td>4</td>\n",
       "      <td>1173968033</td>\n",
       "      <td>64765</td>\n",
       "      <td>10668</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000043</th>\n",
       "      <td>67385</td>\n",
       "      <td>3195</td>\n",
       "      <td>3</td>\n",
       "      <td>1217811570</td>\n",
       "      <td>65888</td>\n",
       "      <td>10668</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000044</th>\n",
       "      <td>67771</td>\n",
       "      <td>3195</td>\n",
       "      <td>2</td>\n",
       "      <td>980793598</td>\n",
       "      <td>66256</td>\n",
       "      <td>10668</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000045</th>\n",
       "      <td>67123</td>\n",
       "      <td>56253</td>\n",
       "      <td>3</td>\n",
       "      <td>1230512900</td>\n",
       "      <td>65638</td>\n",
       "      <td>10669</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000046</th>\n",
       "      <td>67385</td>\n",
       "      <td>3234</td>\n",
       "      <td>3</td>\n",
       "      <td>1177566519</td>\n",
       "      <td>65888</td>\n",
       "      <td>10670</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000047</th>\n",
       "      <td>67385</td>\n",
       "      <td>3583</td>\n",
       "      <td>3</td>\n",
       "      <td>1182092515</td>\n",
       "      <td>65888</td>\n",
       "      <td>10671</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000048</th>\n",
       "      <td>67385</td>\n",
       "      <td>7537</td>\n",
       "      <td>2</td>\n",
       "      <td>1188277406</td>\n",
       "      <td>65888</td>\n",
       "      <td>10672</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000049</th>\n",
       "      <td>67385</td>\n",
       "      <td>63481</td>\n",
       "      <td>3</td>\n",
       "      <td>1227499991</td>\n",
       "      <td>65888</td>\n",
       "      <td>10673</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000050</th>\n",
       "      <td>67542</td>\n",
       "      <td>63481</td>\n",
       "      <td>3</td>\n",
       "      <td>1227739303</td>\n",
       "      <td>66039</td>\n",
       "      <td>10673</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000051</th>\n",
       "      <td>67385</td>\n",
       "      <td>64652</td>\n",
       "      <td>2</td>\n",
       "      <td>1230900023</td>\n",
       "      <td>65888</td>\n",
       "      <td>10674</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000052</th>\n",
       "      <td>69135</td>\n",
       "      <td>64427</td>\n",
       "      <td>4</td>\n",
       "      <td>1229529033</td>\n",
       "      <td>67546</td>\n",
       "      <td>10675</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10000053</th>\n",
       "      <td>70816</td>\n",
       "      <td>63662</td>\n",
       "      <td>4</td>\n",
       "      <td>1227531270</td>\n",
       "      <td>69154</td>\n",
       "      <td>10676</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>10000054 rows × 6 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "          user_id  item_id  ratings   timestamp  mapped_user_id  \\\n",
       "0               1      122        5   838985046               0   \n",
       "1             139      122        3   974302621             128   \n",
       "2             149      122        2  1112342322             136   \n",
       "3             182      122        3   943458784             168   \n",
       "4             215      122        4  1102493547             201   \n",
       "5             217      122        3   844429650             203   \n",
       "6             281      122        3   844437024             265   \n",
       "7             326      122        3   838997566             307   \n",
       "8             351      122        1   955831012             332   \n",
       "9             357      122        3   945884437             338   \n",
       "10            426      122        3  1084374941             405   \n",
       "11            456      122        3   833003616             434   \n",
       "12            459      122        2  1215452801             436   \n",
       "13            494      122        4   844447512             471   \n",
       "14            517      122        3   830352936             493   \n",
       "15            524      122        3  1111549740             499   \n",
       "16            556      122        2   839005924             529   \n",
       "17            588      122        3   844450138             561   \n",
       "18            589      122        1   857985124             562   \n",
       "19            590      122        4   948332956             563   \n",
       "20            601      122        3   868323565             574   \n",
       "21            621      122        3  1095040292             592   \n",
       "22            634      122        2   913061756             604   \n",
       "23            672      122        2   981667042             642   \n",
       "24            701      122        4   831889761             667   \n",
       "25            719      122        3   834939568             683   \n",
       "26            745      122        3  1039353350             708   \n",
       "27            757      122        4   869227001             720   \n",
       "28            775      122        3   834942788             737   \n",
       "29            780      122        3   945965379             741   \n",
       "...           ...      ...      ...         ...             ...   \n",
       "10000024    61880    26612        3  1230133819           60521   \n",
       "10000025    63134    26612        1  1230598449           61748   \n",
       "10000026    62079     7754        2  1211552590           60719   \n",
       "10000027    67385     7754        3  1163555772           65888   \n",
       "10000028    62245    59044        3  1228925653           60884   \n",
       "10000029    70905    59044        1  1221163314           69242   \n",
       "10000030    62332    31090        2  1159394437           60971   \n",
       "10000031    70899    31090        4  1109628288           69236   \n",
       "10000032    62510    64275        5  1231021319           61140   \n",
       "10000033    62522    64953        3  1230783543           61152   \n",
       "10000034    62880     5814        4  1069273488           61500   \n",
       "10000035    69312     5814        3  1099195832           67714   \n",
       "10000036    63134    54318        2  1222631928           61748   \n",
       "10000037    64621    39429        2  1201248182           63203   \n",
       "10000038    64621    62799        3  1225729884           63203   \n",
       "10000039    67385    62799        3  1225922673           65888   \n",
       "10000040    67542    62799        3  1225395734           66039   \n",
       "10000041    65098     7823        4  1111474117           63675   \n",
       "10000042    66220     3195        4  1173968033           64765   \n",
       "10000043    67385     3195        3  1217811570           65888   \n",
       "10000044    67771     3195        2   980793598           66256   \n",
       "10000045    67123    56253        3  1230512900           65638   \n",
       "10000046    67385     3234        3  1177566519           65888   \n",
       "10000047    67385     3583        3  1182092515           65888   \n",
       "10000048    67385     7537        2  1188277406           65888   \n",
       "10000049    67385    63481        3  1227499991           65888   \n",
       "10000050    67542    63481        3  1227739303           66039   \n",
       "10000051    67385    64652        2  1230900023           65888   \n",
       "10000052    69135    64427        4  1229529033           67546   \n",
       "10000053    70816    63662        4  1227531270           69154   \n",
       "\n",
       "          mapped_item_id  \n",
       "0                      0  \n",
       "1                      0  \n",
       "2                      0  \n",
       "3                      0  \n",
       "4                      0  \n",
       "5                      0  \n",
       "6                      0  \n",
       "7                      0  \n",
       "8                      0  \n",
       "9                      0  \n",
       "10                     0  \n",
       "11                     0  \n",
       "12                     0  \n",
       "13                     0  \n",
       "14                     0  \n",
       "15                     0  \n",
       "16                     0  \n",
       "17                     0  \n",
       "18                     0  \n",
       "19                     0  \n",
       "20                     0  \n",
       "21                     0  \n",
       "22                     0  \n",
       "23                     0  \n",
       "24                     0  \n",
       "25                     0  \n",
       "26                     0  \n",
       "27                     0  \n",
       "28                     0  \n",
       "29                     0  \n",
       "...                  ...  \n",
       "10000024           10657  \n",
       "10000025           10657  \n",
       "10000026           10658  \n",
       "10000027           10658  \n",
       "10000028           10659  \n",
       "10000029           10659  \n",
       "10000030           10660  \n",
       "10000031           10660  \n",
       "10000032           10661  \n",
       "10000033           10662  \n",
       "10000034           10663  \n",
       "10000035           10663  \n",
       "10000036           10664  \n",
       "10000037           10665  \n",
       "10000038           10666  \n",
       "10000039           10666  \n",
       "10000040           10666  \n",
       "10000041           10667  \n",
       "10000042           10668  \n",
       "10000043           10668  \n",
       "10000044           10668  \n",
       "10000045           10669  \n",
       "10000046           10670  \n",
       "10000047           10671  \n",
       "10000048           10672  \n",
       "10000049           10673  \n",
       "10000050           10673  \n",
       "10000051           10674  \n",
       "10000052           10675  \n",
       "10000053           10676  \n",
       "\n",
       "[10000054 rows x 6 columns]"
      ]
     },
     "execution_count": 9,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "ratings "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Dataset Splitting into Train, Validation, and Test\n",
    "\n",
    "This is the last part before creating the recommender. However, this step is super *important*, as it is the base for the training, parameters optimization, and evaluation of the recommender(s).\n",
    "\n",
    "In here we read the ratings (which we loaded and preprocessed before) and create the `train`, `validation`, and `test` User-Rating Matrices (URM). It's important that these are disjoint to avoid information leakage from the train into the validation/test set, in our case, we are safe to use the `train_test_split` function from `scikit-learn` as the dataset only contains *one* datapoint for every `(user,item)` pair. On another topic, we first create the `test` set and then we create the `validation` by splitting again the `train` set.\n",
    "\n",
    "\n",
    "`train_test_split` takes an array (or several arrays) and divides it into `train` and `test` according to a given size (in our case `testing_percentage` and `validation_percentage`, which need to be a float between 0 and 1).\n",
    "\n",
    "After we have our different splits, we create the *sparse URMs* by using the `csr_matrix` function from `scipy`."
   ]
  },
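  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick illustration of the splitting step (a toy sketch with hypothetical data, not taken from the dataset), `train_test_split` keeps parallel arrays aligned, so every `(user, item, rating)` triple lands in the same split:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "from sklearn.model_selection import train_test_split\n",
    "\n",
    "users = np.arange(10)  # toy user ids\n",
    "items = users + 100    # toy item ids, aligned with users\n",
    "u_tr, u_te, i_tr, i_te = train_test_split(\n",
    "    users, items, test_size=0.2, shuffle=True, random_state=1234)\n",
    "\n",
    "# 8 rows go to train, 2 to test, and the pairs stay aligned\n",
    "assert all(i == u + 100 for u, i in zip(u_te, i_te))\n",
    "```"
   ]
  },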
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "train_test_split?"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [],
   "source": [
    "def dataset_splits(ratings, num_users, num_items, validation_percentage: float, testing_percentage: float):\n",
    "    seed = 1234\n",
    "    \n",
    "    (user_ids_training, user_ids_test,\n",
    "     item_ids_training, item_ids_test,\n",
    "     ratings_training, ratings_test) = train_test_split(ratings.mapped_user_id,\n",
    "                                                        ratings.mapped_item_id,\n",
    "                                                        ratings.ratings,\n",
    "                                                        test_size=testing_percentage,\n",
    "                                                        shuffle=True,\n",
    "                                                        random_state=seed)\n",
    "    \n",
    "    (user_ids_training, user_ids_validation,\n",
    "     item_ids_training, item_ids_validation,\n",
    "     ratings_training, ratings_validation) = train_test_split(user_ids_training,\n",
    "                                                              item_ids_training,\n",
    "                                                              ratings_training,\n",
    "                                                              test_size=validation_percentage,\n",
    "                                                             )\n",
    "    \n",
    "    urm_train = sp.csr_matrix((ratings_training, (user_ids_training, item_ids_training)), \n",
    "                              shape=(num_users, num_items))\n",
    "    \n",
    "    urm_validation = sp.csr_matrix((ratings_validation, (user_ids_validation, item_ids_validation)), \n",
    "                              shape=(num_users, num_items))\n",
    "    \n",
    "    urm_test = sp.csr_matrix((ratings_test, (user_ids_test, item_ids_test)), \n",
    "                              shape=(num_users, num_items))\n",
    "    \n",
    "    \n",
    "    \n",
    "    return urm_train, urm_validation, urm_test\n",
    "    \n",
    "    \n",
    "    \n",
    "    "
   ]
  },
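  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The `csr_matrix((data, (rows, cols)), shape=...)` constructor used above places each rating at its `(user, item)` coordinate. A minimal self-contained sketch with toy values (hypothetical, not from the dataset):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "import scipy.sparse as sp\n",
    "\n",
    "rows = np.array([0, 0, 1])  # mapped user ids\n",
    "cols = np.array([1, 2, 0])  # mapped item ids\n",
    "vals = np.array([5, 3, 4])  # ratings\n",
    "\n",
    "urm = sp.csr_matrix((vals, (rows, cols)), shape=(2, 3))\n",
    "urm.toarray()  # [[0, 5, 3], [4, 0, 0]]\n",
    "```\n",
    "\n",
    "Note that `csr_matrix` *sums* duplicate `(row, col)` entries, which is another reason the dataset must contain one data point per pair."
   ]
  },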
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "urm_train, urm_validation, urm_test = dataset_splits(ratings, \n",
    "                                                     num_users=69878, \n",
    "                                                     num_items=10677, \n",
    "                                                     validation_percentage=0.10, \n",
    "                                                     testing_percentage=0.20)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<69878x10677 sparse matrix of type '<class 'numpy.int32'>'\n",
       "\twith 7200038 stored elements in Compressed Sparse Row format>"
      ]
     },
     "execution_count": 13,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "urm_train"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<69878x10677 sparse matrix of type '<class 'numpy.int32'>'\n",
       "\twith 800005 stored elements in Compressed Sparse Row format>"
      ]
     },
     "execution_count": 14,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "urm_validation"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<69878x10677 sparse matrix of type '<class 'numpy.int32'>'\n",
       "\twith 2000011 stored elements in Compressed Sparse Row format>"
      ]
     },
     "execution_count": 16,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "urm_test"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Cosine Similarity\n",
    "\n",
    "We can implement different versions of a cosine similarity. Some of these are faster and others are slower.\n",
    "\n",
    "The most simple version is just to loop item by item and calculate the similarity of item pairs.\n",
    "$$ W_{i,j} \n",
    "= cos(v_i, v_j) \n",
    "= \\frac{v_i \\cdot v_j}{|| v_i || ||v_j ||} \n",
    "= \\frac{\\Sigma_{u \\in U}{URM_{u,i} \\cdot URM_{u,j}}}{\\sqrt{\\Sigma_{u \\in U}{URM_{u,i}^2}} \\cdot \\sqrt{\\Sigma_{u \\in U}{URM_{u,j}^2}} + shrink} $$\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {},
   "outputs": [],
   "source": [
    "def naive_similarity(urm: sp.csc_matrix, shrink: int):\n",
    "    num_items = urm.shape[1]\n",
    "    weights = np.empty(shape=(num_items, num_items))\n",
    "    for item_i in range(num_items):\n",
    "        item_i_profile = urm[:, item_i] # mx1 vector\n",
    "        \n",
    "        for item_j in range(num_items):\n",
    "            item_j_profile = urm[:, item_j] # mx1 vector\n",
    "            \n",
    "            numerator = item_i_profile.T.dot(item_j_profile).todense()[0,0]\n",
    "            denominator = (np.sqrt(np.sum(item_i_profile.power(2)))\n",
    "                           * np.sqrt(np.sum(item_j_profile.power(2)))\n",
    "                           + shrink\n",
    "                           + 1e-6)\n",
    "            \n",
    "            weights[item_i, item_j] = numerator / denominator\n",
    "    \n",
    "    np.fill_diagonal(weights, 0.0)\n",
    "    return weights\n",
    "    \n",
    "            "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Another (faster) version of the similarity is by operating on vector products\n",
    "$$ W_{i,I} \n",
    "= cos(v_i, URM_{I}) \n",
    "= \\frac{v_i \\cdot URM_{I}}{|| v_i || IW_{I} + shrink} $$\n",
    "\n",
    "and where \n",
    "\n",
    "$$ IW_{i} = \\sqrt{{\\Sigma_{u \\in U}{URM_{u,i}^2}}}$$"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {},
   "outputs": [],
   "source": [
    "def vector_similarity(urm: sp.csc_matrix, shrink: int):\n",
    "    item_weights = np.sqrt(\n",
    "        np.sum(urm.power(2), axis=0)\n",
    "    ).A.flatten()\n",
    "    \n",
    "    num_items = urm.shape[1]\n",
    "    urm_t = urm.T\n",
    "    weights = np.empty(shape=(num_items, num_items))\n",
    "    for item_id in range(num_items):\n",
    "        numerator = urm_t.dot(urm[:, item_id]).A.flatten()\n",
    "        denominator = item_weights[item_id] * item_weights + shrink + 1e-6\n",
    "        \n",
    "        weights[item_id] = numerator / denominator\n",
    "        \n",
    "    np.fill_diagonal(weights, 0.0)\n",
    "    return weights\n",
    "    "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Lastly, a faster but more memory-intensive version of the similarity is by operating on matrix products\n",
    "$$ W  \n",
    "= \\frac{URM^{t} \\cdot URM}{IW^{t} IW + shrink} $$"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {},
   "outputs": [],
   "source": [
    "def matrix_similarity(urm: sp.csc_matrix, shrink: int):\n",
    "    item_weights = np.sqrt(\n",
    "        np.sum(urm.power(2), axis=0)\n",
    "    ).A\n",
    "    \n",
    "    numerator = urm.T.dot(urm)\n",
    "    denominator = item_weights.T.dot(item_weights) + shrink + 1e-6\n",
    "    weights = numerator / denominator\n",
    "    np.fill_diagonal(weights, 0.0)\n",
    "    \n",
    "    return weights"
   ]
  },
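  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To see why the matrix form works, here is a tiny self-contained check on a dense toy matrix (hypothetical values, shrink omitted for brevity): the outer product of the column norms yields the denominator of every item pair at once.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "urm = np.array([[1.0, 2.0],\n",
    "                [0.0, 1.0]])\n",
    "norms = np.sqrt((urm ** 2).sum(axis=0))       # [1.0, sqrt(5)]\n",
    "sim = (urm.T @ urm) / np.outer(norms, norms)  # all pairwise cosines\n",
    "np.fill_diagonal(sim, 0.0)\n",
    "sim[0, 1]  # 2 / sqrt(5)\n",
    "```"
   ]
  },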
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [],
   "source": [
    "urm_csc = urm_train.tocsc()\n",
    "shrink = 5\n",
    "slice_size = 100"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "CPU times: user 8.08 s, sys: 67.5 ms, total: 8.14 s\n",
      "Wall time: 8.28 s\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "array([[0.        , 0.36632423, 0.36526804, ..., 0.        , 0.        ,\n",
       "        0.        ],\n",
       "       [0.36632423, 0.        , 0.54985153, ..., 0.        , 0.03425119,\n",
       "        0.        ],\n",
       "       [0.36526804, 0.54985153, 0.        , ..., 0.03108656, 0.11382563,\n",
       "        0.        ],\n",
       "       ...,\n",
       "       [0.        , 0.        , 0.03108656, ..., 0.        , 0.2717996 ,\n",
       "        0.1006602 ],\n",
       "       [0.        , 0.03425119, 0.11382563, ..., 0.2717996 , 0.        ,\n",
       "        0.        ],\n",
       "       [0.        , 0.        , 0.        , ..., 0.1006602 , 0.        ,\n",
       "        0.        ]])"
      ]
     },
     "execution_count": 19,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "%%time \n",
    "naive_weights = naive_similarity(urm_csc[:slice_size,:slice_size], shrink)\n",
    "naive_weights"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "CPU times: user 60.8 ms, sys: 2.53 ms, total: 63.4 ms\n",
      "Wall time: 62.5 ms\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "array([[0.        , 0.36632423, 0.36526804, ..., 0.        , 0.        ,\n",
       "        0.        ],\n",
       "       [0.36632423, 0.        , 0.54985153, ..., 0.        , 0.03425119,\n",
       "        0.        ],\n",
       "       [0.36526804, 0.54985153, 0.        , ..., 0.03108656, 0.11382563,\n",
       "        0.        ],\n",
       "       ...,\n",
       "       [0.        , 0.        , 0.03108656, ..., 0.        , 0.2717996 ,\n",
       "        0.1006602 ],\n",
       "       [0.        , 0.03425119, 0.11382563, ..., 0.2717996 , 0.        ,\n",
       "        0.        ],\n",
       "       [0.        , 0.        , 0.        , ..., 0.1006602 , 0.        ,\n",
       "        0.        ]])"
      ]
     },
     "execution_count": 27,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "%%time\n",
    "vector_weights = vector_similarity(urm_csc[:slice_size,:slice_size], shrink)\n",
    "vector_weights"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "CPU times: user 4.71 ms, sys: 2.08 ms, total: 6.79 ms\n",
      "Wall time: 5.22 ms\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "matrix([[0.        , 0.36632423, 0.36526804, ..., 0.        , 0.        ,\n",
       "         0.        ],\n",
       "        [0.36632423, 0.        , 0.54985153, ..., 0.        , 0.03425119,\n",
       "         0.        ],\n",
       "        [0.36526804, 0.54985153, 0.        , ..., 0.03108656, 0.11382563,\n",
       "         0.        ],\n",
       "        ...,\n",
       "        [0.        , 0.        , 0.03108656, ..., 0.        , 0.2717996 ,\n",
       "         0.1006602 ],\n",
       "        [0.        , 0.03425119, 0.11382563, ..., 0.2717996 , 0.        ,\n",
       "         0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.1006602 , 0.        ,\n",
       "         0.        ]])"
      ]
     },
     "execution_count": 30,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "%%time\n",
    "matrix_weights = matrix_similarity(urm_csc[:slice_size,:slice_size], shrink)\n",
    "matrix_weights"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 28,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "np.array_equal(naive_weights, vector_weights)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 31,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "np.array_equal(vector_weights, matrix_weights)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Collaborative Filtering ItemKNN Recommender\n",
    "\n",
    "This step creates a `CFItemKNN` class that represents a Collaborative Filtering ItemKNN Recommender. As we have mentioned in previous practice sessions, our recommenders have two main functions: `fit` and `recommend`. \n",
    "\n",
    "The first receives the similarity function and the dataset with which it will create the similarities, the result of this function is to save the similarities (`weights`) into the class instance. \n",
    "\n",
    "The second function takes a user id, the train URM, the recommendation lenght and a boolean value to remove already-seen items from users. It returns a recommendation list for the user."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 65,
   "metadata": {},
   "outputs": [],
   "source": [
    "class CFItemKNN(object):\n",
    "    def __init__(self, shrink: int):\n",
    "        self.shrink = shrink\n",
    "        self.weights = None\n",
    "    \n",
    "    \n",
    "    def fit(self, urm_train: sp.csc_matrix, similarity_function):\n",
    "        if not sp.isspmatrix_csc(urm_train):\n",
    "            raise TypeError(f\"We expected a CSC matrix, we got {type(urm_train)}\")\n",
    "        \n",
    "        self.weights = similarity_function(urm_train, self.shrink)\n",
    "        \n",
    "    def recommend(self, user_id: int, urm_train: sp.csr_matrix, at: Optional[int] = None, remove_seen: bool = True):\n",
    "        user_profile = urm_train[user_id]\n",
    "        \n",
    "        ranking = user_profile.dot(self.weights).A.flatten()\n",
    "        \n",
    "        if remove_seen:\n",
    "            user_profile_start = urm_train.indptr[user_id]\n",
    "            user_profile_end = urm_train.indptr[user_id+1]\n",
    "            \n",
    "            seen_items = urm_train.indices[user_profile_start:user_profile_end]\n",
    "            \n",
    "            ranking[seen_items] = -np.inf\n",
    "            \n",
    "        ranking = np.flip(np.argsort(ranking))\n",
    "        return ranking[:at]"
   ]
  },
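  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a side note, the `remove_seen` logic in `recommend` relies on the internal CSR layout: the column indices of row `user_id` live in `indices[indptr[user_id]:indptr[user_id + 1]]`. A minimal sketch on a toy matrix (the `toy_urm` name and its values are made up for illustration):\n",
    "\n",
    "```python\n",
    "toy_urm = sp.csr_matrix(np.array([[1, 0, 1],\n",
    "                                  [0, 1, 0]]))\n",
    "\n",
    "# Items seen by user 0: columns 0 and 2.\n",
    "start, end = toy_urm.indptr[0], toy_urm.indptr[1]\n",
    "print(toy_urm.indices[start:end])  # [0 2]\n",
    "```"
   ]
  },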
  {
   "cell_type": "code",
   "execution_count": 66,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<__main__.CFItemKNN at 0x7fa9acf67748>"
      ]
     },
     "execution_count": 66,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "itemknn_recommender = CFItemKNN(shrink=50)\n",
    "itemknn_recommender"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 67,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "CPU times: user 11.5 s, sys: 1.33 s, total: 12.8 s\n",
      "Wall time: 12.3 s\n"
     ]
    }
   ],
   "source": [
    "%%time\n",
    "\n",
    "itemknn_recommender.fit(urm_train.tocsc(), matrix_similarity)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 70,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[ 93  85 175  91  74  75  84  77  92  82]\n",
      "[ 37  14  96  19 101 177 179  11   7 187]\n",
      "[ 798  793   60  279  821  195  235  179   62 1906]\n",
      "[  85  175   75   11    7   91    9    4 1009   19]\n",
      "[1008  176  179  145  148 1122  403  387   24  213]\n",
      "[195 179 228 235  37 415 382 404 259 401]\n",
      "[ 179  399 1073  404  146  411  244 1122   34  241]\n",
      "[ 519  382 1093  504  411  798  425  992  793  279]\n",
      "[ 195  411  235  616  277  259 1147  179 1093  625]\n",
      "[ 166  179  170  411  387  146  235   34  145 1316]\n"
     ]
    }
   ],
   "source": [
    "for user_id in range(10):\n",
    "    print(itemknn_recommender.recommend(user_id=user_id, urm_train=urm_train, at=10, remove_seen=True))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Evaluation Metrics\n",
    "\n",
    "In this practice session we will be using the same evaluation metrics defined in the Practice session 2, i.e., precision, recall and mean average precision (MAP)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [],
   "source": [
    "def recall(recommendations: np.array, relevant_items: np.array) -> float:\n",
    "    is_relevant = np.in1d(recommendations, relevant_items, assume_unique=True)\n",
    "    \n",
    "    recall_score = np.sum(is_relevant) / relevant_items.shape[0]\n",
    "    \n",
    "    return recall_score\n",
    "    \n",
    "    \n",
    "def precision(recommendations: np.array, relevant_items: np.array) -> float:\n",
    "    is_relevant = np.in1d(recommendations, relevant_items, assume_unique=True)\n",
    "    \n",
    "    precision_score = np.sum(is_relevant) / recommendations.shape[0]\n",
    "\n",
    "    return precision_score\n",
    "\n",
    "def mean_average_precision(recommendations: np.array, relevant_items: np.array) -> float:\n",
    "    is_relevant = np.in1d(recommendations, relevant_items, assume_unique=True)\n",
    "    \n",
    "    precision_at_k = is_relevant * np.cumsum(is_relevant, dtype=np.float32) / (1 + np.arange(is_relevant.shape[0]))\n",
    "\n",
    "    map_score = np.sum(precision_at_k) / np.min([relevant_items.shape[0], is_relevant.shape[0]])\n",
    "\n",
    "    return map_score\n",
    "    "
   ]
  },
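  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check, these metrics can be verified by hand on a tiny example (the values below are made up for illustration): with recommendations `[1, 2, 3, 4, 5]` and relevant items `[2, 5, 7]`, two of the five recommendations are relevant, so precision is 2/5 = 0.4 and recall is 2/3. The hits occur at ranks 2 and 5, so MAP is (1/2 + 2/5) / 3 = 0.3.\n",
    "\n",
    "```python\n",
    "recs = np.array([1, 2, 3, 4, 5])\n",
    "rel = np.array([2, 5, 7])\n",
    "\n",
    "print(precision(recs, rel))               # 0.4\n",
    "print(recall(recs, rel))                  # 0.666...\n",
    "print(mean_average_precision(recs, rel))  # ~0.3 (up to floating-point rounding)\n",
    "```"
   ]
  },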
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Evaluation Procedure\n",
    "\n",
    "The evaluation procedure returns the averaged accuracy scores (in terms of precision, recall and MAP) for all users (that have at least 1 rating in the test set). It also calculates the number of evaluated and skipped users. It receives a recommender instance, and the train and test URMs."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 74,
   "metadata": {},
   "outputs": [],
   "source": [
    "def evaluator(recommender: object, urm_train: sp.csr_matrix, urm_test: sp.csr_matrix):\n",
    "    recommendation_length = 10\n",
    "    accum_precision = 0\n",
    "    accum_recall = 0\n",
    "    accum_map = 0\n",
    "    \n",
    "    num_users = urm_train.shape[0]\n",
    "    \n",
    "    num_users_evaluated = 0\n",
    "    num_users_skipped = 0\n",
    "    for user_id in range(num_users):\n",
    "        user_profile_start = urm_test.indptr[user_id]\n",
    "        user_profile_end = urm_test.indptr[user_id+1]\n",
    "        \n",
    "        relevant_items = urm_test.indices[user_profile_start:user_profile_end]\n",
    "        \n",
    "        if relevant_items.size == 0:\n",
    "            num_users_skipped += 1\n",
    "            continue\n",
    "            \n",
    "        recommendations = recommender.recommend(user_id=user_id, \n",
    "                                               at=recommendation_length, \n",
    "                                               urm_train=urm_train, \n",
    "                                               remove_seen=True)\n",
    "        \n",
    "        accum_precision += precision(recommendations, relevant_items)\n",
    "        accum_recall += recall(recommendations, relevant_items)\n",
    "        accum_map += mean_average_precision(recommendations, relevant_items)\n",
    "        \n",
    "        num_users_evaluated += 1\n",
    "        \n",
    "    \n",
    "    accum_precision /= max(num_users_evaluated, 1)\n",
    "    accum_recall /= max(num_users_evaluated, 1)\n",
    "    accum_map /=  max(num_users_evaluated, 1)\n",
    "    \n",
    "    return accum_precision, accum_recall, accum_map, num_users_evaluated, num_users_skipped\n",
    "    "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 75,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "CPU times: user 2min 21s, sys: 1.58 s, total: 2min 23s\n",
      "Wall time: 2min 25s\n"
     ]
    }
   ],
   "source": [
    "%%time\n",
    "\n",
    "accum_precision, accum_recall, accum_map, num_user_evaluated, num_users_skipped = evaluator(itemknn_recommender, \n",
    "                                                                                            urm_train, \n",
    "                                                                                            urm_test)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 79,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(0.25290409467323116, 0.16215820144910845, 0.17916230515769266, 69798, 80)"
      ]
     },
     "execution_count": 79,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "accum_precision, accum_recall, accum_map, num_user_evaluated, num_users_skipped"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Hyperparameter Tuning\n",
    "\n",
    "This step is fundamental to get the best performance of an algorithm, specifically, because we will train different configurations of the parameters for the `CFItemKNN` recommender and select the best performing one.\n",
    "\n",
    "In order for this step to be meaningful (and to avoid overfitting on the test set), we perform it using the `validation` URM as test set.\n",
    "\n",
    "This step is the longest one to run in the entire pipeline when building a recommender."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 90,
   "metadata": {},
   "outputs": [],
   "source": [
    "def hyperparameter_tuning():\n",
    "    shrinks = [0,1,5,10,50]\n",
    "    results = []\n",
    "    for shrink in shrinks:\n",
    "        print(f\"Currently trying shrink {shrink}\")\n",
    "        \n",
    "        itemknn_recommender = CFItemKNN(shrink=shrink)\n",
    "        itemknn_recommender.fit(urm_train.tocsc(), matrix_similarity)\n",
    "        \n",
    "        ev_precision, ev_recall, ev_map, _, _ = evaluator(itemknn_recommender, urm_train, urm_validation)\n",
    "        \n",
    "        results.append((shrink, (ev_precision, ev_recall, ev_map)))\n",
    "        \n",
    "    return results\n",
    "    \n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 91,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Currently trying shrink 0\n",
      "Currently trying shrink 1\n",
      "Currently trying shrink 5\n",
      "Currently trying shrink 10\n",
      "Currently trying shrink 50\n",
      "CPU times: user 10min 19s, sys: 10 s, total: 10min 29s\n",
      "Wall time: 10min 29s\n"
     ]
    }
   ],
   "source": [
    "%%time\n",
    "\n",
    "hyperparameter_results = hyperparameter_tuning()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 92,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[(0, (0.10417748757143938, 0.15800031735328562, 0.08277634539659091)),\n",
       " (1, (0.104173035542056, 0.15799589203952574, 0.08277511836838672)),\n",
       " (5, (0.1041641314832892, 0.15798864104525823, 0.0827729518318653)),\n",
       " (10, (0.10416413148328918, 0.15799223727560768, 0.08276682548217212)),\n",
       " (50, (0.10415374341472793, 0.1579780807320129, 0.08274985204536793))]"
      ]
     },
     "execution_count": 92,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "hyperparameter_results"
   ]
  },
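  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The best shrink value can also be picked programmatically instead of reading it off the list, e.g. by maximizing MAP (the third element of each score tuple). A small sketch, assuming `hyperparameter_results` has the `(shrink, (precision, recall, map))` shape shown above:\n",
    "\n",
    "```python\n",
    "best_shrink, (_, _, best_map) = max(hyperparameter_results, key=lambda result: result[1][2])\n",
    "print(best_shrink, best_map)\n",
    "```"
   ]
  },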
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Submission to competition\n",
    "\n",
    "This step serves as a similar step that you will perform when preparing a submission to the competition. Specially after you have chosen and trained your recommender.\n",
    "\n",
    "For this step the best suggestion is to select the most-performing configuration obtained in the hyperparameter tuning step and to train the recommender using both the `train` and `validation` set. Remember that in the competition you *do not* have access to the test set.\n",
    "\n",
    "We simulated the users to generate recommendations by randomly selecting 100 users from the original identifiers. Do consider that in the competition you are most likely to be provided with the list of users to generate recommendations. \n",
    "\n",
    "Another consideration is that, due to easier and faster calculations, we replaced the user/item identifiers with new ones in the preprocessing step. For the competition, you are required to generate recommendations using the dataset's original identifiers. Due to this, this step also reverts back the newer identifiers with the ones originally found in the dataset.\n",
    "\n",
    "Last, this step creates a function that writes the recommendations for each user in the same file in a tabular format following this format: \n",
    "```csv\n",
    "<user_id>,<item_id_1> <item_id_2> <item_id_3> <item_id_4> <item_id_5> <item_id_6> <item_id_7> <item_id_8> <item_id_9> <item_id_10>\n",
    "```\n",
    "\n",
    "Always verify the competitions' submission file model as it might vary from the one we presented here."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 100,
   "metadata": {},
   "outputs": [],
   "source": [
    "best_shrink = 0\n",
    "urm_train_validation = urm_train + urm_validation\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 101,
   "metadata": {},
   "outputs": [],
   "source": [
    "best_recommender = CFItemKNN(shrink=best_shrink)\n",
    "best_recommender.fit(urm_train_validation.tocsc(), matrix_similarity)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 102,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "array([40536, 49488, 49027, 29190, 49628, 66278, 17103, 23818, 12478,\n",
       "       63886, 65972, 34986, 38319, 15855, 56982, 31855, 23636, 62620,\n",
       "       34576, 14511, 55701, 69757, 66263, 18236, 16196, 29592, 65752,\n",
       "       18857, 21273, 55964, 21366, 17794,  4773, 22825, 65160, 21034,\n",
       "       11477,  3113, 15141, 18417,  8258, 24614,  2792, 15623, 40459,\n",
       "       55495, 69727, 61689,   808, 10745, 39082, 60191, 29653, 33863,\n",
       "        5817,  3991, 70220, 24478, 22145, 63028, 42481, 17621, 50446,\n",
       "       52154, 45137, 56487, 48219, 68044, 48168, 20727, 41317, 32000,\n",
       "         761, 69862,  1393, 45440, 41666, 60975, 65756, 68451, 61114,\n",
       "       20542, 29077, 68373, 64886, 45504,  2284, 10828, 67424, 55725,\n",
       "       11531, 61683, 60648, 32854, 60272, 34765, 53522, 12346,  7316,\n",
       "       54992])"
      ]
     },
     "execution_count": 102,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "users_to_recommend = np.random.choice(ratings.user_id.unique(), size=100, replace=False)\n",
    "users_to_recommend"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 106,
   "metadata": {},
   "outputs": [],
   "source": [
    "mapping_to_item_id = dict(zip(ratings.mapped_item_id, ratings.item_id))"
   ]
  },
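  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "With this mapping, each mapped item id in a recommendation list can be translated back to its original identifier before writing the submission file. A minimal sketch of the write-out format (the file name, `user_id`, and `recommendations` values are made up for illustration; in practice they come from `best_recommender.recommend` plus the identifier mappings):\n",
    "\n",
    "```python\n",
    "user_id = 42\n",
    "recommendations = [122, 185, 231]  # original item ids, after applying mapping_to_item_id\n",
    "\n",
    "with open(\"submission.csv\", \"w\") as submission_file:\n",
    "    item_list = \" \".join(str(item_id) for item_id in recommendations)\n",
    "    submission_file.write(f\"{user_id},{item_list}\\n\")\n",
    "```"
   ]
  },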
  {
   "cell_type": "code",
   "execution_count": 107,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "{0: 122,\n",
       " 1: 185,\n",
       " 2: 231,\n",
       " 3: 292,\n",
       " 4: 316,\n",
       " 5: 329,\n",
       " 6: 355,\n",
       " 7: 356,\n",
       " 8: 362,\n",
       " 9: 364,\n",
       " 10: 370,\n",
       " 11: 377,\n",
       " 12: 420,\n",
       " 13: 466,\n",
       " 14: 480,\n",
       " 15: 520,\n",
       " 16: 539,\n",
       " 17: 586,\n",
       " 18: 588,\n",
       " 19: 589,\n",
       " 20: 594,\n",
       " 21: 616,\n",
       " 22: 110,\n",
       " 23: 151,\n",
       " 24: 260,\n",
       " 25: 376,\n",
       " 26: 590,\n",
       " 27: 648,\n",
       " 28: 719,\n",
       " 29: 733,\n",
       " 30: 736,\n",
       " 31: 780,\n",
       " 32: 786,\n",
       " 33: 802,\n",
       " 34: 858,\n",
       " 35: 1049,\n",
       " 36: 1073,\n",
       " 37: 1210,\n",
       " 38: 1356,\n",
       " 39: 1391,\n",
       " 40: 1544,\n",
       " 41: 213,\n",
       " 42: 1148,\n",
       " 43: 1246,\n",
       " 44: 1252,\n",
       " 45: 1276,\n",
       " 46: 1288,\n",
       " 47: 1408,\n",
       " 48: 1552,\n",
       " 49: 1564,\n",
       " 50: 1597,\n",
       " 51: 1674,\n",
       " 52: 3408,\n",
       " 53: 3684,\n",
       " 54: 4535,\n",
       " 55: 4677,\n",
       " 56: 4995,\n",
       " 57: 5299,\n",
       " 58: 5505,\n",
       " 59: 5527,\n",
       " 60: 5952,\n",
       " 61: 6287,\n",
       " 62: 6377,\n",
       " 63: 6539,\n",
       " 64: 7153,\n",
       " 65: 7155,\n",
       " 66: 8529,\n",
       " 67: 8533,\n",
       " 68: 8783,\n",
       " 69: 27821,\n",
       " 70: 33750,\n",
       " 71: 21,\n",
       " 72: 34,\n",
       " 73: 39,\n",
       " 74: 150,\n",
       " 75: 153,\n",
       " 76: 161,\n",
       " 77: 165,\n",
       " 78: 208,\n",
       " 79: 253,\n",
       " 80: 266,\n",
       " 81: 317,\n",
       " 82: 344,\n",
       " 83: 349,\n",
       " 84: 367,\n",
       " 85: 380,\n",
       " 86: 410,\n",
       " 87: 432,\n",
       " 88: 434,\n",
       " 89: 435,\n",
       " 90: 440,\n",
       " 91: 500,\n",
       " 92: 587,\n",
       " 93: 592,\n",
       " 94: 595,\n",
       " 95: 597,\n",
       " 96: 1,\n",
       " 97: 7,\n",
       " 98: 25,\n",
       " 99: 28,\n",
       " 100: 30,\n",
       " 101: 32,\n",
       " 102: 47,\n",
       " 103: 52,\n",
       " 104: 57,\n",
       " 105: 58,\n",
       " 106: 85,\n",
       " 107: 111,\n",
       " 108: 141,\n",
       " 109: 171,\n",
       " 110: 194,\n",
       " 111: 230,\n",
       " 112: 232,\n",
       " 113: 235,\n",
       " 114: 242,\n",
       " 115: 249,\n",
       " 116: 299,\n",
       " 117: 306,\n",
       " 118: 307,\n",
       " 119: 308,\n",
       " 120: 321,\n",
       " 121: 326,\n",
       " 122: 334,\n",
       " 123: 345,\n",
       " 124: 348,\n",
       " 125: 412,\n",
       " 126: 446,\n",
       " 127: 475,\n",
       " 128: 477,\n",
       " 129: 495,\n",
       " 130: 508,\n",
       " 131: 509,\n",
       " 132: 515,\n",
       " 133: 527,\n",
       " 134: 532,\n",
       " 135: 535,\n",
       " 136: 538,\n",
       " 137: 541,\n",
       " 138: 562,\n",
       " 139: 593,\n",
       " 140: 608,\n",
       " 141: 708,\n",
       " 142: 778,\n",
       " 143: 818,\n",
       " 144: 903,\n",
       " 145: 912,\n",
       " 146: 919,\n",
       " 147: 920,\n",
       " 148: 923,\n",
       " 149: 926,\n",
       " 150: 969,\n",
       " 151: 1041,\n",
       " 152: 1046,\n",
       " 153: 1080,\n",
       " 154: 1094,\n",
       " 155: 1096,\n",
       " 156: 1097,\n",
       " 157: 1103,\n",
       " 158: 1104,\n",
       " 159: 1172,\n",
       " 160: 1183,\n",
       " 161: 1199,\n",
       " 162: 1206,\n",
       " 163: 1207,\n",
       " 164: 1219,\n",
       " 165: 1221,\n",
       " 166: 1225,\n",
       " 167: 1230,\n",
       " 168: 1235,\n",
       " 169: 1244,\n",
       " 170: 1247,\n",
       " 171: 1258,\n",
       " 172: 1280,\n",
       " 173: 1295,\n",
       " 174: 1300,\n",
       " 175: 457,\n",
       " 176: 1193,\n",
       " 177: 1196,\n",
       " 178: 1197,\n",
       " 179: 1198,\n",
       " 180: 1264,\n",
       " 181: 1277,\n",
       " 182: 1304,\n",
       " 183: 1396,\n",
       " 184: 1483,\n",
       " 185: 1527,\n",
       " 186: 1573,\n",
       " 187: 1580,\n",
       " 188: 1584,\n",
       " 189: 1629,\n",
       " 190: 1653,\n",
       " 191: 1748,\n",
       " 192: 2028,\n",
       " 193: 2396,\n",
       " 194: 2405,\n",
       " 195: 2571,\n",
       " 196: 2628,\n",
       " 197: 3555,\n",
       " 198: 3578,\n",
       " 199: 3623,\n",
       " 200: 3740,\n",
       " 201: 3753,\n",
       " 202: 3755,\n",
       " 203: 3863,\n",
       " 204: 3986,\n",
       " 205: 3994,\n",
       " 206: 3996,\n",
       " 207: 4053,\n",
       " 208: 4161,\n",
       " 209: 4270,\n",
       " 210: 4299,\n",
       " 211: 4369,\n",
       " 212: 4446,\n",
       " 213: 50,\n",
       " 214: 101,\n",
       " 215: 599,\n",
       " 216: 800,\n",
       " 217: 899,\n",
       " 218: 904,\n",
       " 219: 908,\n",
       " 220: 913,\n",
       " 221: 930,\n",
       " 222: 942,\n",
       " 223: 951,\n",
       " 224: 954,\n",
       " 225: 1086,\n",
       " 226: 1212,\n",
       " 227: 1234,\n",
       " 228: 1240,\n",
       " 229: 1245,\n",
       " 230: 1248,\n",
       " 231: 1254,\n",
       " 232: 1256,\n",
       " 233: 1260,\n",
       " 234: 1266,\n",
       " 235: 1270,\n",
       " 236: 1283,\n",
       " 237: 1284,\n",
       " 238: 1333,\n",
       " 239: 1344,\n",
       " 240: 1348,\n",
       " 241: 1387,\n",
       " 242: 1517,\n",
       " 243: 1590,\n",
       " 244: 1617,\n",
       " 245: 1732,\n",
       " 246: 1805,\n",
       " 247: 1895,\n",
       " 248: 1917,\n",
       " 249: 2186,\n",
       " 250: 2203,\n",
       " 251: 2206,\n",
       " 252: 2335,\n",
       " 253: 2391,\n",
       " 254: 2395,\n",
       " 255: 2478,\n",
       " 256: 2527,\n",
       " 257: 2648,\n",
       " 258: 2683,\n",
       " 259: 2762,\n",
       " 260: 2791,\n",
       " 261: 2804,\n",
       " 262: 2936,\n",
       " 263: 3006,\n",
       " 264: 3044,\n",
       " 265: 3176,\n",
       " 266: 3334,\n",
       " 267: 3365,\n",
       " 268: 3435,\n",
       " 269: 3467,\n",
       " 270: 3471,\n",
       " 271: 3671,\n",
       " 272: 3703,\n",
       " 273: 3730,\n",
       " 274: 3859,\n",
       " 275: 4027,\n",
       " 276: 4206,\n",
       " 277: 4226,\n",
       " 278: 4246,\n",
       " 279: 4306,\n",
       " 280: 4327,\n",
       " 281: 4406,\n",
       " 282: 4420,\n",
       " 283: 4432,\n",
       " 284: 4886,\n",
       " 285: 4975,\n",
       " 286: 5017,\n",
       " 287: 5094,\n",
       " 288: 5184,\n",
       " 289: 5292,\n",
       " 290: 5294,\n",
       " 291: 5300,\n",
       " 292: 5388,\n",
       " 293: 5440,\n",
       " 294: 5481,\n",
       " 295: 5500,\n",
       " 296: 5502,\n",
       " 297: 5528,\n",
       " 298: 5826,\n",
       " 299: 6273,\n",
       " 300: 2,\n",
       " 301: 5,\n",
       " 302: 6,\n",
       " 303: 16,\n",
       " 304: 19,\n",
       " 305: 22,\n",
       " 306: 31,\n",
       " 307: 36,\n",
       " 308: 66,\n",
       " 309: 70,\n",
       " 310: 93,\n",
       " 311: 104,\n",
       " 312: 145,\n",
       " 313: 158,\n",
       " 314: 160,\n",
       " 315: 163,\n",
       " 316: 164,\n",
       " 317: 170,\n",
       " 318: 172,\n",
       " 319: 173,\n",
       " 320: 180,\n",
       " 321: 196,\n",
       " 322: 215,\n",
       " 323: 216,\n",
       " 324: 223,\n",
       " 325: 240,\n",
       " 326: 256,\n",
       " 327: 288,\n",
       " 328: 290,\n",
       " 329: 293,\n",
       " 330: 303,\n",
       " 331: 315,\n",
       " 332: 324,\n",
       " 333: 328,\n",
       " 334: 330,\n",
       " 335: 338,\n",
       " 336: 353,\n",
       " 337: 357,\n",
       " 338: 368,\n",
       " 339: 372,\n",
       " 340: 379,\n",
       " 341: 384,\n",
       " 342: 393,\n",
       " 343: 407,\n",
       " 344: 413,\n",
       " 345: 442,\n",
       " 346: 455,\n",
       " 347: 471,\n",
       " 348: 479,\n",
       " 349: 481,\n",
       " 350: 489,\n",
       " 351: 514,\n",
       " 352: 522,\n",
       " 353: 537,\n",
       " 354: 540,\n",
       " 355: 543,\n",
       " 356: 547,\n",
       " 357: 548,\n",
       " 358: 552,\n",
       " 359: 606,\n",
       " 360: 610,\n",
       " 361: 637,\n",
       " 362: 678,\n",
       " 363: 688,\n",
       " 364: 694,\n",
       " 365: 705,\n",
       " 366: 737,\n",
       " 367: 741,\n",
       " 368: 784,\n",
       " 369: 785,\n",
       " 370: 799,\n",
       " 371: 836,\n",
       " 372: 842,\n",
       " 373: 849,\n",
       " 374: 879,\n",
       " 375: 886,\n",
       " 376: 934,\n",
       " 377: 996,\n",
       " 378: 1004,\n",
       " 379: 1020,\n",
       " 380: 1027,\n",
       " 381: 1035,\n",
       " 382: 1036,\n",
       " 383: 1037,\n",
       " 384: 1060,\n",
       " 385: 1061,\n",
       " 386: 1064,\n",
       " 387: 1079,\n",
       " 388: 1083,\n",
       " 389: 1089,\n",
       " 390: 1091,\n",
       " 391: 1092,\n",
       " 392: 1093,\n",
       " 393: 1100,\n",
       " 394: 1101,\n",
       " 395: 1125,\n",
       " 396: 1126,\n",
       " 397: 1127,\n",
       " 398: 1129,\n",
       " 399: 1136,\n",
       " 400: 1188,\n",
       " 401: 1200,\n",
       " 402: 1204,\n",
       " 403: 1208,\n",
       " 404: 1214,\n",
       " 405: 1215,\n",
       " 406: 1220,\n",
       " 407: 1231,\n",
       " 408: 1253,\n",
       " 409: 1261,\n",
       " 410: 1262,\n",
       " 411: 1265,\n",
       " 412: 1268,\n",
       " 413: 1275,\n",
       " 414: 1285,\n",
       " 415: 1291,\n",
       " 416: 1302,\n",
       " 417: 1307,\n",
       " 418: 1320,\n",
       " 419: 1321,\n",
       " 420: 1339,\n",
       " 421: 1347,\n",
       " 422: 1358,\n",
       " 423: 1370,\n",
       " 424: 1373,\n",
       " 425: 1377,\n",
       " 426: 1380,\n",
       " 427: 1385,\n",
       " 428: 1388,\n",
       " 429: 1393,\n",
       " 430: 1395,\n",
       " 431: 1405,\n",
       " 432: 1409,\n",
       " 433: 1425,\n",
       " 434: 1429,\n",
       " 435: 1466,\n",
       " 436: 1499,\n",
       " 437: 1500,\n",
       " 438: 1513,\n",
       " 439: 1515,\n",
       " 440: 1556,\n",
       " 441: 1562,\n",
       " 442: 1587,\n",
       " 443: 1588,\n",
       " 444: 1591,\n",
       " 445: 1598,\n",
       " 446: 1603,\n",
       " 447: 1608,\n",
       " 448: 1611,\n",
       " 449: 1616,\n",
       " 450: 1620,\n",
       " 451: 1625,\n",
       " 452: 1627,\n",
       " 453: 1639,\n",
       " 454: 1641,\n",
       " 455: 1644,\n",
       " 456: 1658,\n",
       " 457: 1663,\n",
       " 458: 1665,\n",
       " 459: 1673,\n",
       " 460: 1676,\n",
       " 461: 1682,\n",
       " 462: 1687,\n",
       " 463: 1690,\n",
       " 464: 1702,\n",
       " 465: 1703,\n",
       " 466: 1704,\n",
       " 467: 1717,\n",
       " 468: 1721,\n",
       " 469: 1722,\n",
       " 470: 1754,\n",
       " 471: 1762,\n",
       " 472: 1772,\n",
       " 473: 1777,\n",
       " 474: 1779,\n",
       " 475: 1792,\n",
       " 476: 1801,\n",
       " 477: 1831,\n",
       " 478: 1848,\n",
       " 479: 1855,\n",
       " 480: 1862,\n",
       " 481: 1876,\n",
       " 482: 1882,\n",
       " 483: 1883,\n",
       " 484: 1911,\n",
       " 485: 1918,\n",
       " 486: 1920,\n",
       " 487: 1923,\n",
       " 488: 1954,\n",
       " 489: 1961,\n",
       " 490: 1969,\n",
       " 491: 1970,\n",
       " 492: 1971,\n",
       " 493: 1973,\n",
       " 494: 1974,\n",
       " 495: 1992,\n",
       " 496: 1993,\n",
       " 497: 1994,\n",
       " 498: 2000,\n",
       " 499: 2001,\n",
       " 500: 2002,\n",
       " 501: 2003,\n",
       " 502: 2004,\n",
       " 503: 2005,\n",
       " 504: 2011,\n",
       " 505: 2012,\n",
       " 506: 2013,\n",
       " 507: 2021,\n",
       " 508: 2026,\n",
       " 509: 2046,\n",
       " 510: 2050,\n",
       " 511: 2051,\n",
       " 512: 2052,\n",
       " 513: 2058,\n",
       " 514: 2060,\n",
       " 515: 2072,\n",
       " 516: 2100,\n",
       " 517: 2105,\n",
       " 518: 2107,\n",
       " 519: 2115,\n",
       " 520: 2117,\n",
       " 521: 2124,\n",
       " 522: 2140,\n",
       " 523: 2150,\n",
       " 524: 2151,\n",
       " 525: 2161,\n",
       " 526: 2167,\n",
       " 527: 2193,\n",
       " 528: 2194,\n",
       " 529: 2231,\n",
       " 530: 2232,\n",
       " 531: 2247,\n",
       " 532: 2253,\n",
       " 533: 2268,\n",
       " 534: 2269,\n",
       " 535: 2278,\n",
       " 536: 2291,\n",
       " 537: 2294,\n",
       " 538: 2322,\n",
       " 539: 2325,\n",
       " 540: 2329,\n",
       " 541: 2338,\n",
       " 542: 2340,\n",
       " 543: 2355,\n",
       " 544: 2369,\n",
       " 545: 2371,\n",
       " 546: 2375,\n",
       " 547: 2378,\n",
       " 548: 2379,\n",
       " 549: 2380,\n",
       " 550: 2382,\n",
       " 551: 2383,\n",
       " 552: 2387,\n",
       " 553: 2389,\n",
       " 554: 2393,\n",
       " 555: 2406,\n",
       " 556: 2407,\n",
       " 557: 2412,\n",
       " 558: 2422,\n",
       " 559: 2428,\n",
       " 560: 2431,\n",
       " 561: 2433,\n",
       " 562: 2447,\n",
       " 563: 2470,\n",
       " 564: 2471,\n",
       " 565: 2490,\n",
       " 566: 2496,\n",
       " 567: 2498,\n",
       " 568: 2505,\n",
       " 569: 2514,\n",
       " 570: 2524,\n",
       " 571: 2540,\n",
       " 572: 2541,\n",
       " 573: 2542,\n",
       " 574: 2548,\n",
       " 575: 2549,\n",
       " 576: 2558,\n",
       " 577: 2572,\n",
       " 578: 2581,\n",
       " 579: 2598,\n",
       " 580: 2600,\n",
       " 581: 2606,\n",
       " 582: 2617,\n",
       " 583: 2618,\n",
       " 584: 2640,\n",
       " 585: 2641,\n",
       " 586: 2643,\n",
       " 587: 2657,\n",
       " 588: 2672,\n",
       " 589: 2699,\n",
       " 590: 2700,\n",
       " 591: 2701,\n",
       " 592: 2713,\n",
       " 593: 2717,\n",
       " 594: 2719,\n",
       " 595: 2735,\n",
       " 596: 2772,\n",
       " 597: 2793,\n",
       " 598: 2796,\n",
       " 599: 2802,\n",
       " 600: 2805,\n",
       " 601: 2807,\n",
       " 602: 2808,\n",
       " 603: 2826,\n",
       " 604: 2841,\n",
       " 605: 2858,\n",
       " 606: 2881,\n",
       " 607: 2889,\n",
       " 608: 2893,\n",
       " 609: 2913,\n",
       " 610: 2916,\n",
       " 611: 2944,\n",
       " 612: 2947,\n",
       " 613: 2948,\n",
       " 614: 2949,\n",
       " 615: 2950,\n",
       " 616: 2959,\n",
       " 617: 2968,\n",
       " 618: 2976,\n",
       " 619: 2978,\n",
       " 620: 2985,\n",
       " 621: 2987,\n",
       " 622: 2989,\n",
       " 623: 2990,\n",
       " 624: 2995,\n",
       " 625: 2997,\n",
       " 626: 3005,\n",
       " 627: 3020,\n",
       " 628: 3033,\n",
       " 629: 3039,\n",
       " 630: 3052,\n",
       " 631: 3081,\n",
       " 632: 3101,\n",
       " 633: 3114,\n",
       " 634: 3145,\n",
       " 635: 3146,\n",
       " 636: 3147,\n",
       " 637: 3156,\n",
       " 638: 3175,\n",
       " 639: 3178,\n",
       " 640: 3208,\n",
       " 641: 3243,\n",
       " 642: 3248,\n",
       " 643: 3249,\n",
       " 644: 3253,\n",
       " 645: 3254,\n",
       " 646: 3256,\n",
       " 647: 3257,\n",
       " 648: 3258,\n",
       " 649: 3261,\n",
       " 650: 3263,\n",
       " 651: 3264,\n",
       " 652: 3268,\n",
       " 653: 3271,\n",
       " 654: 3300,\n",
       " 655: 3301,\n",
       " 656: 3316,\n",
       " 657: 3324,\n",
       " 658: 3354,\n",
       " 659: 3409,\n",
       " 660: 3448,\n",
       " 661: 3452,\n",
       " 662: 3457,\n",
       " 663: 3466,\n",
       " 664: 3477,\n",
       " 665: 3484,\n",
       " 666: 3499,\n",
       " 667: 3526,\n",
       " 668: 3527,\n",
       " 669: 3534,\n",
       " 670: 3535,\n",
       " 671: 3564,\n",
       " 672: 3593,\n",
       " 673: 3624,\n",
       " 674: 3635,\n",
       " 675: 3639,\n",
       " 676: 3686,\n",
       " 677: 3698,\n",
       " 678: 3699,\n",
       " 679: 3701,\n",
       " 680: 3702,\n",
       " 681: 3704,\n",
       " 682: 3705,\n",
       " 683: 3710,\n",
       " 684: 3717,\n",
       " 685: 3744,\n",
       " 686: 3745,\n",
       " 687: 3751,\n",
       " 688: 3752,\n",
       " 689: 3763,\n",
       " 690: 3764,\n",
       " 691: 3793,\n",
       " 692: 3798,\n",
       " 693: 3802,\n",
       " 694: 3809,\n",
       " 695: 3810,\n",
       " 696: 3821,\n",
       " 697: 3825,\n",
       " 698: 3826,\n",
       " 699: 3827,\n",
       " 700: 3841,\n",
       " 701: 3864,\n",
       " 702: 3879,\n",
       " 703: 3882,\n",
       " 704: 3897,\n",
       " 705: 3948,\n",
       " 706: 3973,\n",
       " 707: 3977,\n",
       " 708: 3978,\n",
       " 709: 3981,\n",
       " 710: 3988,\n",
       " 711: 3997,\n",
       " 712: 3998,\n",
       " 713: 3999,\n",
       " 714: 4002,\n",
       " 715: 4010,\n",
       " 716: 4011,\n",
       " 717: 4015,\n",
       " 718: 4018,\n",
       " 719: 4030,\n",
       " 720: 4040,\n",
       " 721: 4084,\n",
       " 722: 4085,\n",
       " 723: 4092,\n",
       " 724: 4105,\n",
       " 725: 4121,\n",
       " 726: 4128,\n",
       " 727: 4178,\n",
       " 728: 4200,\n",
       " 729: 4203,\n",
       " 730: 4207,\n",
       " 731: 4223,\n",
       " 732: 4224,\n",
       " 733: 4247,\n",
       " 734: 4251,\n",
       " 735: 4255,\n",
       " 736: 4265,\n",
       " 737: 4266,\n",
       " 738: 4267,\n",
       " 739: 4280,\n",
       " 740: 4310,\n",
       " 741: 4321,\n",
       " 742: 4343,\n",
       " 743: 4351,\n",
       " 744: 4361,\n",
       " 745: 4367,\n",
       " 746: 4370,\n",
       " 747: 4372,\n",
       " 748: 4387,\n",
       " 749: 4396,\n",
       " 750: 4447,\n",
       " 751: 4452,\n",
       " 752: 4483,\n",
       " 753: 4487,\n",
       " 754: 4488,\n",
       " 755: 4489,\n",
       " 756: 4499,\n",
       " 757: 4526,\n",
       " 758: 4531,\n",
       " 759: 4544,\n",
       " 760: 4545,\n",
       " 761: 4558,\n",
       " 762: 4563,\n",
       " 763: 4571,\n",
       " 764: 4572,\n",
       " 765: 4580,\n",
       " 766: 4587,\n",
       " 767: 4621,\n",
       " 768: 4624,\n",
       " 769: 4636,\n",
       " 770: 4643,\n",
       " 771: 4672,\n",
       " 772: 4673,\n",
       " 773: 4681,\n",
       " 774: 4701,\n",
       " 775: 4718,\n",
       " 776: 4728,\n",
       " 777: 4734,\n",
       " 778: 4735,\n",
       " 779: 4744,\n",
       " 780: 4756,\n",
       " 781: 4758,\n",
       " 782: 4776,\n",
       " 783: 4844,\n",
       " 784: 4848,\n",
       " 785: 4855,\n",
       " 786: 4866,\n",
       " 787: 4874,\n",
       " 788: 4878,\n",
       " 789: 4887,\n",
       " 790: 4899,\n",
       " 791: 4954,\n",
       " 792: 4958,\n",
       " 793: 4963,\n",
       " 794: 4978,\n",
       " 795: 4979,\n",
       " 796: 4980,\n",
       " 797: 4992,\n",
       " 798: 4993,\n",
       " 799: 5010,\n",
       " 800: 5014,\n",
       " 801: 5015,\n",
       " 802: 5040,\n",
       " 803: 5047,\n",
       " 804: 5049,\n",
       " 805: 5065,\n",
       " 806: 5093,\n",
       " 807: 5110,\n",
       " 808: 5128,\n",
       " 809: 5151,\n",
       " 810: 5152,\n",
       " 811: 5171,\n",
       " 812: 5218,\n",
       " 813: 5219,\n",
       " 814: 5247,\n",
       " 815: 5254,\n",
       " 816: 5255,\n",
       " 817: 5283,\n",
       " 818: 5293,\n",
       " 819: 5313,\n",
       " 820: 5323,\n",
       " 821: 5349,\n",
       " 822: 5378,\n",
       " 823: 5382,\n",
       " 824: 5414,\n",
       " 825: 5438,\n",
       " 826: 5445,\n",
       " 827: 5459,\n",
       " 828: 5463,\n",
       " 829: 5476,\n",
       " 830: 5478,\n",
       " 831: 5501,\n",
       " 832: 5507,\n",
       " 833: 5523,\n",
       " 834: 5538,\n",
       " 835: 5541,\n",
       " 836: 5556,\n",
       " 837: 5573,\n",
       " 838: 5574,\n",
       " 839: 5609,\n",
       " 840: 5621,\n",
       " 841: 5669,\n",
       " 842: 5673,\n",
       " 843: 5678,\n",
       " 844: 5679,\n",
       " 845: 5810,\n",
       " 846: 5816,\n",
       " 847: 5833,\n",
       " 848: 5872,\n",
       " 849: 5881,\n",
       " 850: 5903,\n",
       " 851: 5944,\n",
       " 852: 5999,\n",
       " 853: 6003,\n",
       " 854: 6040,\n",
       " 855: 6156,\n",
       " 856: 6157,\n",
       " 857: 6239,\n",
       " 858: 6264,\n",
       " 859: 6280,\n",
       " 860: 6290,\n",
       " 861: 6294,\n",
       " 862: 6296,\n",
       " 863: 6323,\n",
       " 864: 6333,\n",
       " 865: 6338,\n",
       " 866: 6365,\n",
       " 867: 6373,\n",
       " 868: 6378,\n",
       " 869: 6383,\n",
       " 870: 6440,\n",
       " 871: 6502,\n",
       " 872: 6503,\n",
       " 873: 6534,\n",
       " 874: 6537,\n",
       " 875: 6541,\n",
       " 876: 6548,\n",
       " 877: 6564,\n",
       " 878: 6567,\n",
       " 879: 6595,\n",
       " 880: 6615,\n",
       " 881: 6662,\n",
       " 882: 6663,\n",
       " 883: 6686,\n",
       " 884: 6695,\n",
       " 885: 6708,\n",
       " 886: 6709,\n",
       " 887: 6711,\n",
       " 888: 6720,\n",
       " 889: 6754,\n",
       " 890: 6790,\n",
       " 891: 6796,\n",
       " 892: 6800,\n",
       " 893: 6803,\n",
       " 894: 6807,\n",
       " 895: 6812,\n",
       " 896: 6863,\n",
       " 897: 6870,\n",
       " 898: 6873,\n",
       " 899: 6874,\n",
       " 900: 6880,\n",
       " 901: 6888,\n",
       " 902: 6934,\n",
       " 903: 6936,\n",
       " 904: 6944,\n",
       " 905: 6947,\n",
       " 906: 6952,\n",
       " 907: 6953,\n",
       " 908: 6957,\n",
       " 909: 6966,\n",
       " 910: 6977,\n",
       " 911: 6996,\n",
       " 912: 7000,\n",
       " 913: 7004,\n",
       " 914: 7007,\n",
       " 915: 7018,\n",
       " 916: 7046,\n",
       " 917: 7076,\n",
       " 918: 7090,\n",
       " 919: 7101,\n",
       " 920: 7102,\n",
       " 921: 7143,\n",
       " 922: 7161,\n",
       " 923: 7163,\n",
       " 924: 7173,\n",
       " 925: 7175,\n",
       " 926: 7193,\n",
       " 927: 7254,\n",
       " 928: 7293,\n",
       " 929: 7308,\n",
       " 930: 7310,\n",
       " 931: 7324,\n",
       " 932: 7325,\n",
       " 933: 7347,\n",
       " 934: 7348,\n",
       " 935: 7360,\n",
       " 936: 7367,\n",
       " 937: 7371,\n",
       " 938: 7373,\n",
       " 939: 7376,\n",
       " 940: 7381,\n",
       " 941: 7394,\n",
       " 942: 7395,\n",
       " 943: 7438,\n",
       " 944: 7439,\n",
       " 945: 7445,\n",
       " 946: 7448,\n",
       " 947: 7451,\n",
       " 948: 7454,\n",
       " 949: 7458,\n",
       " 950: 7573,\n",
       " 951: 7781,\n",
       " 952: 7827,\n",
       " 953: 7923,\n",
       " 954: 8010,\n",
       " 955: 8016,\n",
       " 956: 8169,\n",
       " 957: 8340,\n",
       " 958: 8360,\n",
       " 959: 8361,\n",
       " 960: 8366,\n",
       " 961: 8368,\n",
       " 962: 8371,\n",
       " 963: 8387,\n",
       " 964: 8493,\n",
       " 965: 8528,\n",
       " 966: 8623,\n",
       " 967: 8633,\n",
       " 968: 8636,\n",
       " 969: 8640,\n",
       " 970: 8641,\n",
       " 971: 8644,\n",
       " 972: 8665,\n",
       " 973: 8798,\n",
       " 974: 8810,\n",
       " 975: 8830,\n",
       " 976: 8860,\n",
       " 977: 8861,\n",
       " 978: 8870,\n",
       " 979: 8874,\n",
       " 980: 8939,\n",
       " 981: 8947,\n",
       " 982: 8961,\n",
       " 983: 8968,\n",
       " 984: 8972,\n",
       " 985: 8984,\n",
       " 986: 8985,\n",
       " 987: 26555,\n",
       " 988: 27728,\n",
       " 989: 30822,\n",
       " 990: 30825,\n",
       " 991: 32302,\n",
       " 992: 2174,\n",
       " 993: 2502,\n",
       " 994: 2710,\n",
       " 995: 2797,\n",
       " 996: 3429,\n",
       " 997: 4034,\n",
       " 998: 7361,\n",
       " 999: 26131,\n",
       " ...}"
      ]
     },
     "execution_count": 107,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "mapping_to_item_id"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 110,
   "metadata": {},
   "outputs": [],
   "source": [
    "def prepare_submission(ratings: pd.DataFrame, users_to_recommend: np.array, urm_train: sp.csr_matrix, recommender: object):\n",
    "    users_ids_and_mappings = ratings[ratings.user_id.isin(users_to_recommend)][[\"user_id\", \"mapped_user_id\"]].drop_duplicates()\n",
    "    items_ids_and_mappings = ratings[[\"item_id\", \"mapped_item_id\"]].drop_duplicates()\n",
    "    \n",
    "    mapping_to_item_id = dict(zip(ratings.mapped_item_id, ratings.item_id))\n",
    "    \n",
    "    \n",
    "    recommendation_length = 10\n",
    "    submission = []\n",
    "    for idx, row in users_ids_and_mappings.iterrows():\n",
    "        user_id = row.user_id\n",
    "        mapped_user_id = row.mapped_user_id\n",
    "        \n",
    "        recommendations = recommender.recommend(user_id=mapped_user_id,\n",
    "                                                urm_train=urm_train,\n",
    "                                                at=recommendation_length,\n",
    "                                                remove_seen=True)\n",
    "        \n",
    "        submission.append((user_id, [mapping_to_item_id[item_id] for item_id in recommendations]))\n",
    "        \n",
    "    return submission\n",
    "    "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 111,
   "metadata": {},
   "outputs": [],
   "source": [
    "submission = prepare_submission(ratings, users_to_recommend, urm_train_validation, best_recommender)\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 112,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[(11477, [1270, 1097, 1198, 1580, 1197, 2858, 1196, 1784, 2762, 608]),\n",
       " (22825, [260, 589, 457, 356, 1270, 1240, 296, 318, 1265, 527]),\n",
       " (54992, [592, 457, 153, 480, 590, 377, 356, 589, 588, 110]),\n",
       " (67424, [380, 377, 480, 296, 590, 356, 318, 589, 500, 292]),\n",
       " (3991, [2762, 4963, 3793, 4306, 5952, 1923, 1517, 1200, 2987, 4027]),\n",
       " (11531, [592, 480, 590, 367, 316, 253, 597, 364, 587, 500]),\n",
       " (17103, [457, 592, 380, 356, 590, 110, 50, 349, 454, 586]),\n",
       " (18857, [150, 589, 593, 318, 597, 539, 329, 21, 527, 474]),\n",
       " (23636, [1291, 1198, 1097, 260, 1265, 457, 593, 2115, 592, 780]),\n",
       " (23818, [457, 153, 480, 377, 356, 589, 593, 588, 454, 10]),\n",
       " (24614, [1291, 1270, 5349, 4306, 1210, 3793, 1036, 1240, 2115, 480]),\n",
       " (29190, [2571, 1198, 2716, 4993, 1682, 1097, 1923, 3793, 4226, 4963]),\n",
       " (41666, [457, 592, 377, 480, 153, 165, 356, 454, 588, 587]),\n",
       " (42481, [150, 589, 480, 318, 377, 590, 50, 356, 110, 349]),\n",
       " (48219, [1198, 1291, 1036, 377, 1377, 648, 1387, 1374, 457, 1197]),\n",
       " (53522, [1270, 1240, 1198, 2716, 2000, 1200, 1210, 2797, 2762, 1214]),\n",
       " (55725, [1214, 1200, 1198, 1291, 1097, 2716, 1676, 1377, 2115, 1374]),\n",
       " (60272, [150, 153, 296, 593, 364, 318, 344, 434, 161, 47]),\n",
       " (66278, [457, 165, 296, 110, 593, 500, 454, 318, 185, 329]),\n",
       " (69757, [2115, 1240, 2011, 2028, 1527, 4306, 1573, 2628, 5445, 1610]),\n",
       " (2792, [1240, 2571, 1291, 1214, 1580, 1200, 1036, 1097, 2716, 2916]),\n",
       " (15141, [2571, 1580, 1270, 1196, 589, 2762, 1291, 1198, 4993, 1210]),\n",
       " (17621, [356, 593, 590, 165, 318, 153, 349, 344, 588, 292]),\n",
       " (22145, [592, 153, 377, 589, 110, 292, 364, 367, 47, 161]),\n",
       " (31855, [380, 150, 344, 165, 597, 586, 590, 316, 454, 349]),\n",
       " (34765, [1291, 2762, 4993, 260, 296, 1097, 1136, 1240, 593, 2028]),\n",
       " (63886, [589, 480, 377, 592, 608, 110, 527, 150, 1270, 1240]),\n",
       " (66263, [1270, 1291, 1527, 1240, 1196, 2716, 2115, 2916, 5349, 3578]),\n",
       " (69727, [377, 356, 500, 380, 592, 364, 588, 589, 586, 110]),\n",
       " (70220, [1198, 2571, 1291, 2987, 1259, 1197, 1682, 2291, 2997, 1610]),\n",
       " (10828, [296, 318, 592, 356, 590, 608, 380, 597, 300, 500]),\n",
       " (29592, [2959, 4963, 1527, 7153, 5445, 2115, 4226, 6874, 1682, 260]),\n",
       " (32854, [380, 356, 592, 150, 110, 648, 500, 349, 367, 296]),\n",
       " (34986, [457, 318, 150, 590, 480, 608, 589, 377, 300, 47]),\n",
       " (56487, [150, 480, 377, 356, 153, 589, 588, 318, 161, 364]),\n",
       " (65160, [1240, 1580, 2797, 2115, 2985, 919, 1954, 1079, 1610, 2640]),\n",
       " (20727, [1270, 1291, 1580, 2716, 1214, 1265, 1200, 2571, 260, 2916]),\n",
       " (32000, [1240, 1198, 589, 1291, 480, 1270, 1036, 1580, 1214, 2571]),\n",
       " (40459, [480, 593, 165, 110, 364, 344, 292, 367, 595, 47]),\n",
       " (41317, [1291, 1196, 1198, 1097, 1210, 1240, 2571, 1036, 2115, 2716]),\n",
       " (49488, [480, 47, 110, 150, 165, 318, 590, 153, 500, 364]),\n",
       " (49628, [592, 457, 590, 480, 356, 377, 593, 364, 292, 165]),\n",
       " (50446, [1198, 1240, 1291, 1265, 2716, 2571, 1097, 260, 2115, 2000]),\n",
       " (4773, [1198, 1240, 2115, 1097, 2000, 2174, 1036, 2916, 2987, 1214]),\n",
       " (61683, [296, 1265, 608, 593, 1136, 50, 1198, 2858, 1270, 541]),\n",
       " (68451, [380, 480, 377, 356, 588, 165, 349, 364, 587, 589]),\n",
       " (61114, [1270, 2571, 1198, 1196, 1580, 1291, 2762, 1265, 1240, 1036]),\n",
       " (761, [1291, 1196, 2115, 1240, 1923, 1517, 1097, 3578, 5349, 1527]),\n",
       " (3113, [4993, 1580, 1270, 2762, 1291, 1196, 3578, 1198, 4306, 1240]),\n",
       " (10745, [2571, 1270, 1198, 1291, 4993, 2762, 3578, 296, 5349, 2959]),\n",
       " (12346, [1198, 1097, 1136, 1196, 2716, 919, 1682, 2797, 2571, 1307]),\n",
       " (12478, [1270, 2797, 1136, 1198, 1097, 2791, 1394, 1079, 1196, 1968]),\n",
       " (14511, [4993, 5349, 1036, 5445, 1580, 5952, 3793, 7153, 1682, 4306]),\n",
       " (18236, [1270, 2571, 2858, 1198, 1265, 1682, 1704, 2716, 1196, 1580]),\n",
       " (21366, [1196, 593, 1198, 1270, 1240, 2571, 260, 1291, 1036, 50]),\n",
       " (24478, [457, 592, 377, 380, 356, 593, 590, 349, 500, 292]),\n",
       " (29077, [2028, 2916, 2918, 1682, 1617, 3578, 589, 919, 2797, 1259]),\n",
       " (29653, [480, 1196, 1270, 1198, 593, 1580, 1291, 318, 457, 377]),\n",
       " (33863, [1265, 1097, 924, 1079, 858, 50, 1210, 1225, 1036, 1193]),\n",
       " (38319, [1198, 1265, 260, 1291, 1387, 1136, 1580, 541, 318, 2571]),\n",
       " (39082, [1198, 2762, 1270, 1196, 318, 1240, 260, 1136, 1682, 1036]),\n",
       " (45137, [2571, 4993, 1270, 593, 296, 2762, 589, 1580, 1198, 4963]),\n",
       " (48168, [2571, 2762, 1270, 1580, 3578, 1198, 1265, 1196, 1704, 2716]),\n",
       " (49027, [480, 377, 296, 457, 593, 110, 1270, 592, 380, 2571]),\n",
       " (52154, [1270, 1196, 1265, 1307, 1136, 1214, 919, 1387, 1200, 2791]),\n",
       " (56982, [4963, 6539, 5418, 2762, 1270, 4226, 5445, 3793, 2291, 6874]),\n",
       " (60191, [1270, 1291, 1036, 1580, 1097, 2762, 2028, 1265, 2716, 1387]),\n",
       " (61689, [2571, 1270, 1580, 1198, 1265, 3793, 2028, 2959, 1923, 2918]),\n",
       " (62620, [150, 480, 457, 377, 539, 380, 454, 153, 367, 593]),\n",
       " (65752, [2762, 2716, 2571, 2959, 1270, 1265, 2987, 2028, 1198, 1196]),\n",
       " (15623, [1196, 1265, 1291, 1225, 1307, 1240, 1247, 1079, 924, 1214]),\n",
       " (69862, [1265, 1270, 2716, 1580, 2858, 1923, 1517, 2762, 2997, 2571]),\n",
       " (808, [1580, 1270, 2571, 2762, 2716, 1923, 3578, 1265, 1291, 1198]),\n",
       " (21273, [1270, 1198, 1097, 1196, 919, 1580, 2174, 2762, 2918, 1291]),\n",
       " (60975, [4993, 3578, 2762, 5952, 4963, 7153, 356, 1270, 480, 2959]),\n",
       " (55495, [1198, 1196, 1291, 1240, 2571, 260, 1097, 1580, 1265, 1136]),\n",
       " (40536, [2571, 3578, 1580, 2987, 2028, 1291, 2628, 1240, 4993, 2115]),\n",
       " (45440, [1196, 1214, 1291, 1036, 260, 1097, 1265, 1387, 589, 541]),\n",
       " (65972, [1198, 1270, 1196, 1097, 1240, 1265, 260, 1291, 1210, 1214]),\n",
       " (55701, [457, 356, 377, 480, 500, 588, 454, 357, 593, 595]),\n",
       " (7316, [648, 260, 736, 32, 141, 1393, 788, 95, 62, 802]),\n",
       " (8258, [1193, 1214, 1270, 1208, 541, 1240, 1617, 912, 750, 924]),\n",
       " (15855, [733, 736, 788, 1, 1210, 1073, 494, 1356, 141, 708]),\n",
       " (45504, [736, 733, 1210, 1073, 608, 1393, 788, 1196, 1580, 1270]),\n",
       " (55964, [1196, 1097, 858, 919, 1136, 1240, 608, 1214, 296, 1265]),\n",
       " (63028, [780, 1, 1073, 852, 784, 3, 1036, 7, 589, 1196]),\n",
       " (68044, [1196, 1270, 1240, 1291, 2762, 1214, 1136, 260, 2858, 1036]),\n",
       " (68373, [648, 733, 25, 1, 6, 141, 1073, 1210, 296, 788]),\n",
       " (5817, [780, 736, 32, 260, 141, 788, 62, 1393, 1210, 802]),\n",
       " (16196, [780, 733, 36, 802, 1, 788, 1393, 494, 1073, 805]),\n",
       " (60648, [733, 788, 260, 141, 832, 1073, 802, 653, 377, 480]),\n",
       " (64886, [2571, 4993, 1198, 1270, 1196, 2762, 1291, 4226, 2716, 5349]),\n",
       " (21034, [648, 494, 260, 104, 653, 141, 832, 784, 62, 6]),\n",
       " (2284, [8636, 6539, 5418, 4963, 33794, 6377, 7153, 6365, 4993, 32587]),\n",
       " (17794, [1198, 1196, 1270, 2571, 2858, 1265, 1240, 1214, 260, 1089]),\n",
       " (1393, [4993, 6539, 5349, 7153, 4963, 5952, 2571, 33794, 2959, 8636]),\n",
       " (34576, [1580, 1573, 1784, 1682, 1721, 1610, 1265, 2028, 1552, 2571]),\n",
       " (20542, [1270, 2716, 1198, 2571, 1291, 1196, 1036, 1265, 4993, 1580]),\n",
       " (65756, [1265, 1704, 1784, 2571, 2716, 1580, 1721, 1270, 2291, 608]),\n",
       " (18417, [1573, 1552, 1580, 1597, 1584, 1645, 1370, 1377, 1544, 1917])]"
      ]
     },
     "execution_count": 112,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "submission"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 116,
   "metadata": {},
   "outputs": [],
   "source": [
    "def write_submission(submissions):\n",
    "    with open(\"./submission.csv\", \"w\") as f:\n",
    "        for user_id, items in submissions:\n",
    "            f.write(f\"{user_id},{' '.join([str(item) for item in items])}\\n\")\n",
    "    "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 117,
   "metadata": {},
   "outputs": [],
   "source": [
    "write_submission(submission)"
   ]
  },
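   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "As a sanity check, each row of `submission.csv` follows the format `<user_id>,<item_1> <item_2> ... <item_10>`. The snippet below is a minimal sketch, on toy values (not real submission data), of how a single row is built."
    ]
   },
   {
    "cell_type": "code",
    "execution_count": null,
    "metadata": {},
    "outputs": [],
    "source": [
     "# Toy values only: illustrate the row format written by write_submission.\n",
     "example_user_id, example_items = 11477, [1270, 1097, 1198]\n",
     "example_line = f\"{example_user_id},{' '.join(str(item) for item in example_items)}\"\n",
     "example_line  # '11477,1270 1097 1198'"
    ]
   },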
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Exercises"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In this lecture we saw the most simple version of Cosine Similarity, where it just includes a shrink factor. There are different optimizations that we can do to it.\n",
    "\n",
    "- Implement TopK Neighbors\n",
    "- When calculating the cosine similarity we used `urm.T.dot(urm)` to calculate the enumerator. However, depending of the dataset and the number of items, this matrix could not fit in memory. Implemenent a `block` version, faster than our `vector` version but that does not use `urm.T.dot(urm)` beforehand.\n",
    "- Implement Adjusted Cosine [Formula link](http://www10.org/cdrom/papers/519/node14.html)\n",
    "- Implement Dice Similarity [Wikipedia Link](https://en.wikipedia.org/wiki/Sørensen–Dice_coefficient)\n",
    "- Implement an implicit CF ItemKNN.\n",
    "- Implement a CF UserKNN model"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.10"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
