{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "e82e9fb4",
   "metadata": {},
   "source": [
    "# Introduction to Mortgage ETL Job\n",
    "This mortgage ETL job generates the input datasets for the mortgage XGBoost job.\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "d0c8c3fa",
   "metadata": {},
   "source": [
    "## Prerequisites\n",
    "### 1. Download data\n",
    "Refer to these [instructions](https://github.com/NVIDIA/spark-rapids-examples/blob/branch-23.12/docs/get-started/xgboost-examples/dataset/mortgage.md) to download the dataset.\n",
    "\n",
    "### 2. Download needed jars\n",
    "* [rapids-4-spark_2.12-25.04.0.jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/25.04.0/rapids-4-spark_2.12-25.04.0.jar)\n",
    "\n",
    "### 3. Start Spark Standalone\n",
    "Before running the script, please set up Spark in standalone mode.\n",
    "\n",
    "### 4. Add ENV\n",
    "```\n",
    "$ export SPARK_JARS=rapids-4-spark_2.12-25.04.0.jar\n",
    "\n",
    "```\n",
    "\n",
    "### 5. Start Jupyter Notebook with spylon-kernel or Toree\n",
    "\n",
    "```\n",
    "$ jupyter notebook --allow-root --notebook-dir=${your-dir} --config=${your-configs}\n",
    "```\n",
    "\n",
    "## Import Libs"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "3ecc912c",
   "metadata": {},
   "outputs": [],
   "source": [
    "import org.apache.hadoop.fs.Path\n",
    "import org.apache.spark.sql.expressions.Window\n",
    "import org.apache.spark.sql.functions._\n",
    "import org.apache.spark.sql.types._\n",
    "import org.apache.spark.sql.{Column, DataFrame, SparkSession}"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b58fcd6d",
   "metadata": {},
   "source": [
    "## Script Settings\n",
    "\n",
    "### 1. File Path Settings\n",
    "* Define input file path"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b2834c06",
   "metadata": {},
   "outputs": [],
   "source": [
    "val dataRoot = sys.env.getOrElse(\"DATA_ROOT\", \"/data\")\n",
    "val dataOut = sys.env.getOrElse(\"DATA_OUT\", \"/data\")\n",
    "val dataPath = dataRoot + \"/mortgage/input\"\n",
    "val outPath = dataOut + \"/mortgage/output\"\n",
    "val output_csv2parquet = dataOut + \"/mortgage/output/csv2parquet/\"\n",
    "val saveTrainEvalDataset = true"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "775a2c7b",
   "metadata": {},
   "source": [
    "## Function and Object Definitions\n",
    "### 1. Define the constants\n",
    "\n",
    "* Define input/output file schema (Performance and Acquisition)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "e557beb0",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "rawSchema = StructType(StructField(reference_pool_id,StringType,true), StructField(loan_id,LongType,true), StructField(monthly_reporting_period,StringType,true), StructField(orig_channel,StringType,true), StructField(seller_name,StringType,true), StructField(servicer,StringType,true), StructField(master_servicer,StringType,true), StructField(orig_interest_rate,DoubleType,true), StructField(interest_rate,DoubleType,true), StructField(orig_upb,IntegerType,true), StructField(upb_at_issuance,StringType,true), StructField(current_actual_upb,DoubleType,true), StructField(orig_loan_term,IntegerType,true), StructField(orig_date,StringType,true), StructField(first_pay_date,StringType,true), StructField(loan_age,DoubleType,true), StructField(remaining_months...\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/plain": [
       "StructType(StructField(reference_pool_id,StringType,true), StructField(loan_id,LongType,true), StructField(monthly_reporting_period,StringType,true), StructField(orig_channel,StringType,true), StructField(seller_name,StringType,true), StructField(servicer,StringType,true), StructField(master_servicer,StringType,true), StructField(orig_interest_rate,DoubleType,true), StructField(interest_rate,DoubleType,true), StructField(orig_upb,IntegerType,true), StructField(upb_at_issuance,StringType,true), StructField(current_actual_upb,DoubleType,true), StructField(orig_loan_term,IntegerType,true), StructField(orig_date,StringType,true), StructField(first_pay_date,StringType,true), StructField(loan_age,DoubleType,true), StructField(remaining_months..."
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "// File schema\n",
    "val rawSchema = StructType(Array(\n",
    "      StructField(\"reference_pool_id\", StringType),\n",
    "      StructField(\"loan_id\", LongType),\n",
    "      StructField(\"monthly_reporting_period\", StringType),\n",
    "      StructField(\"orig_channel\", StringType),\n",
    "      StructField(\"seller_name\", StringType),\n",
    "      StructField(\"servicer\", StringType),\n",
    "      StructField(\"master_servicer\", StringType),\n",
    "      StructField(\"orig_interest_rate\", DoubleType),\n",
    "      StructField(\"interest_rate\", DoubleType),\n",
    "      StructField(\"orig_upb\", DoubleType),\n",
    "      StructField(\"upb_at_issuance\", StringType),\n",
    "      StructField(\"current_actual_upb\", DoubleType),\n",
    "      StructField(\"orig_loan_term\", IntegerType),\n",
    "      StructField(\"orig_date\", StringType),\n",
    "      StructField(\"first_pay_date\", StringType),    \n",
    "      StructField(\"loan_age\", DoubleType),\n",
    "      StructField(\"remaining_months_to_legal_maturity\", DoubleType),\n",
    "      StructField(\"adj_remaining_months_to_maturity\", DoubleType),\n",
    "      StructField(\"maturity_date\", StringType),\n",
    "      StructField(\"orig_ltv\", DoubleType),\n",
    "      StructField(\"orig_cltv\", DoubleType),\n",
    "      StructField(\"num_borrowers\", DoubleType),\n",
    "      StructField(\"dti\", DoubleType),\n",
    "      StructField(\"borrower_credit_score\", DoubleType),\n",
    "      StructField(\"coborrow_credit_score\", DoubleType),\n",
    "      StructField(\"first_home_buyer\", StringType),\n",
    "      StructField(\"loan_purpose\", StringType),\n",
    "      StructField(\"property_type\", StringType),\n",
    "      StructField(\"num_units\", IntegerType),\n",
    "      StructField(\"occupancy_status\", StringType),\n",
    "      StructField(\"property_state\", StringType),\n",
    "      StructField(\"msa\", DoubleType),\n",
    "      StructField(\"zip\", IntegerType),\n",
    "      StructField(\"mortgage_insurance_percent\", DoubleType),\n",
    "      StructField(\"product_type\", StringType),\n",
    "      StructField(\"prepayment_penalty_indicator\", StringType),\n",
    "      StructField(\"interest_only_loan_indicator\", StringType),\n",
    "      StructField(\"interest_only_first_principal_and_interest_payment_date\", StringType),\n",
    "      StructField(\"months_to_amortization\", StringType),\n",
    "      StructField(\"current_loan_delinquency_status\", IntegerType),\n",
    "      StructField(\"loan_payment_history\", StringType),\n",
    "      StructField(\"mod_flag\", StringType),\n",
    "      StructField(\"mortgage_insurance_cancellation_indicator\", StringType),\n",
    "      StructField(\"zero_balance_code\", StringType),\n",
    "      StructField(\"zero_balance_effective_date\", StringType),\n",
    "      StructField(\"upb_at_the_time_of_removal\", StringType),\n",
    "      StructField(\"repurchase_date\", StringType),\n",
    "      StructField(\"scheduled_principal_current\", StringType),\n",
    "      StructField(\"total_principal_current\", StringType),\n",
    "      StructField(\"unscheduled_principal_current\", StringType),\n",
    "      StructField(\"last_paid_installment_date\", StringType),\n",
    "      StructField(\"foreclosed_after\", StringType),\n",
    "      StructField(\"disposition_date\", StringType),\n",
    "      StructField(\"foreclosure_costs\", DoubleType),\n",
    "      StructField(\"prop_preservation_and_repair_costs\", DoubleType),\n",
    "      StructField(\"asset_recovery_costs\", DoubleType),\n",
    "      StructField(\"misc_holding_expenses\", DoubleType),\n",
    "      StructField(\"holding_taxes\", DoubleType),\n",
    "      StructField(\"net_sale_proceeds\", DoubleType),\n",
    "      StructField(\"credit_enhancement_proceeds\", DoubleType),\n",
    "      StructField(\"repurchase_make_whole_proceeds\", StringType),\n",
    "      StructField(\"other_foreclosure_proceeds\", DoubleType),\n",
    "      StructField(\"non_interest_bearing_upb\", DoubleType),\n",
    "      StructField(\"principal_forgiveness_upb\", StringType),\n",
    "      StructField(\"original_list_start_date\", StringType),\n",
    "      StructField(\"original_list_price\", StringType),\n",
    "      StructField(\"current_list_start_date\", StringType),\n",
    "      StructField(\"current_list_price\", StringType),\n",
    "      StructField(\"borrower_credit_score_at_issuance\", StringType),\n",
    "      StructField(\"co-borrower_credit_score_at_issuance\", StringType),\n",
    "      StructField(\"borrower_credit_score_current\", StringType),\n",
    "      StructField(\"co-Borrower_credit_score_current\", StringType),\n",
    "      StructField(\"mortgage_insurance_type\", DoubleType),\n",
    "      StructField(\"servicing_activity_indicator\", StringType),\n",
    "      StructField(\"current_period_modification_loss_amount\", StringType),\n",
    "      StructField(\"cumulative_modification_loss_amount\", StringType),\n",
    "      StructField(\"current_period_credit_event_net_gain_or_loss\", StringType),\n",
    "      StructField(\"cumulative_credit_event_net_gain_or_loss\", StringType),\n",
    "      StructField(\"homeready_program_indicator\", StringType),\n",
    "      StructField(\"foreclosure_principal_write_off_amount\", StringType),\n",
    "      StructField(\"relocation_mortgage_indicator\", StringType),\n",
    "      StructField(\"zero_balance_code_change_date\", StringType),\n",
    "      StructField(\"loan_holdback_indicator\", StringType),\n",
    "      StructField(\"loan_holdback_effective_date\", StringType),\n",
    "      StructField(\"delinquent_accrued_interest\", StringType),\n",
    "      StructField(\"property_valuation_method\", StringType),\n",
    "      StructField(\"high_balance_loan_indicator\", StringType),\n",
    "      StructField(\"arm_initial_fixed-rate_period_lt_5_yr_indicator\", StringType),\n",
    "      StructField(\"arm_product_type\", StringType),\n",
    "      StructField(\"initial_fixed-rate_period\", StringType),\n",
    "      StructField(\"interest_rate_adjustment_frequency\", StringType),\n",
    "      StructField(\"next_interest_rate_adjustment_date\", StringType),\n",
    "      StructField(\"next_payment_change_date\", StringType),\n",
    "      StructField(\"index\", StringType),\n",
    "      StructField(\"arm_cap_structure\", StringType),\n",
    "      StructField(\"initial_interest_rate_cap_up_percent\", StringType),\n",
    "      StructField(\"periodic_interest_rate_cap_up_percent\", StringType),\n",
    "      StructField(\"lifetime_interest_rate_cap_up_percent\", StringType),\n",
    "      StructField(\"mortgage_margin\", StringType),\n",
    "      StructField(\"arm_balloon_indicator\", StringType),\n",
    "      StructField(\"arm_plan_number\", StringType),\n",
    "      StructField(\"borrower_assistance_plan\", StringType),\n",
    "      StructField(\"hltv_refinance_option_indicator\", StringType),\n",
    "      StructField(\"deal_name\", StringType),\n",
    "      StructField(\"repurchase_make_whole_proceeds_flag\", StringType),\n",
    "      StructField(\"alternative_delinquency_resolution\", StringType),\n",
    "      StructField(\"alternative_delinquency_resolution_count\", StringType),\n",
    "      StructField(\"total_deferral_amount\", StringType)\n",
    "      )\n",
    "    )"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "86af48b6",
   "metadata": {},
   "source": [
    "* Define seller name mapping"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "69f193d7",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "defined object NameMapping\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "object NameMapping {\n",
    "  /**\n",
    "    * Returns a broadcast DataFrame with two columns named from the arguments:\n",
    "    * fromColName holds the original name we want to clean up, and toColName\n",
    "    * holds the unambiguous name it should map to.\n",
    "    */\n",
    "  def apply(spark: SparkSession, fromColName: String, toColName: String): DataFrame = {\n",
    "    import spark.sqlContext.implicits._\n",
    "    broadcast(Seq(\n",
    "      (\"WITMER FUNDING, LLC\", \"Witmer\"),\n",
    "      (\"WELLS FARGO CREDIT RISK TRANSFER SECURITIES TRUST 2015\", \"Wells Fargo\"),\n",
    "      (\"WELLS FARGO BANK,  NA\" , \"Wells Fargo\"),\n",
    "      (\"WELLS FARGO BANK, N.A.\" , \"Wells Fargo\"),\n",
    "      (\"WELLS FARGO BANK, NA\" , \"Wells Fargo\"),\n",
    "      (\"USAA FEDERAL SAVINGS BANK\" , \"USAA\"),\n",
    "      (\"UNITED SHORE FINANCIAL SERVICES, LLC D\\\\/B\\\\/A UNITED WHOLESALE MORTGAGE\" , \"United Shore\"),\n",
    "      (\"U.S. BANK N.A.\" , \"US Bank\"),\n",
    "      (\"SUNTRUST MORTGAGE INC.\" , \"Suntrust\"),\n",
    "      (\"STONEGATE MORTGAGE CORPORATION\" , \"Stonegate Mortgage\"),\n",
    "      (\"STEARNS LENDING, LLC\" , \"Stearns Lending\"),\n",
    "      (\"STEARNS LENDING, INC.\" , \"Stearns Lending\"),\n",
    "      (\"SIERRA PACIFIC MORTGAGE COMPANY, INC.\" , \"Sierra Pacific Mortgage\"),\n",
    "      (\"REGIONS BANK\" , \"Regions\"),\n",
    "      (\"RBC MORTGAGE COMPANY\" , \"RBC\"),\n",
    "      (\"QUICKEN LOANS INC.\" , \"Quicken Loans\"),\n",
    "      (\"PULTE MORTGAGE, L.L.C.\" , \"Pulte Mortgage\"),\n",
    "      (\"PROVIDENT FUNDING ASSOCIATES, L.P.\" , \"Provident Funding\"),\n",
    "      (\"PROSPECT MORTGAGE, LLC\" , \"Prospect Mortgage\"),\n",
    "      (\"PRINCIPAL RESIDENTIAL MORTGAGE CAPITAL RESOURCES, LLC\" , \"Principal Residential\"),\n",
    "      (\"PNC BANK, N.A.\" , \"PNC\"),\n",
    "      (\"PMT CREDIT RISK TRANSFER TRUST 2015-2\" , \"PennyMac\"),\n",
    "      (\"PHH MORTGAGE CORPORATION\" , \"PHH Mortgage\"),\n",
    "      (\"PENNYMAC CORP.\" , \"PennyMac\"),\n",
    "      (\"PACIFIC UNION FINANCIAL, LLC\" , \"Other\"),\n",
    "      (\"OTHER\" , \"Other\"),\n",
    "      (\"NYCB MORTGAGE COMPANY, LLC\" , \"NYCB\"),\n",
    "      (\"NEW YORK COMMUNITY BANK\" , \"NYCB\"),\n",
    "      (\"NETBANK FUNDING SERVICES\" , \"Netbank\"),\n",
    "      (\"NATIONSTAR MORTGAGE, LLC\" , \"Nationstar Mortgage\"),\n",
    "      (\"METLIFE BANK, NA\" , \"Metlife\"),\n",
    "      (\"LOANDEPOT.COM, LLC\" , \"LoanDepot.com\"),\n",
    "      (\"J.P. MORGAN MADISON AVENUE SECURITIES TRUST, SERIES 2015-1\" , \"JP Morgan Chase\"),\n",
    "      (\"J.P. MORGAN MADISON AVENUE SECURITIES TRUST, SERIES 2014-1\" , \"JP Morgan Chase\"),\n",
    "      (\"JPMORGAN CHASE BANK, NATIONAL ASSOCIATION\" , \"JP Morgan Chase\"),\n",
    "      (\"JPMORGAN CHASE BANK, NA\" , \"JP Morgan Chase\"),\n",
    "      (\"JP MORGAN CHASE BANK, NA\" , \"JP Morgan Chase\"),\n",
    "      (\"IRWIN MORTGAGE, CORPORATION\" , \"Irwin Mortgage\"),\n",
    "      (\"IMPAC MORTGAGE CORP.\" , \"Impac Mortgage\"),\n",
    "      (\"HSBC BANK USA, NATIONAL ASSOCIATION\" , \"HSBC\"),\n",
    "      (\"HOMEWARD RESIDENTIAL, INC.\" , \"Homeward Mortgage\"),\n",
    "      (\"HOMESTREET BANK\" , \"Other\"),\n",
    "      (\"HOMEBRIDGE FINANCIAL SERVICES, INC.\" , \"HomeBridge\"),\n",
    "      (\"HARWOOD STREET FUNDING I, LLC\" , \"Harwood Mortgage\"),\n",
    "      (\"GUILD MORTGAGE COMPANY\" , \"Guild Mortgage\"),\n",
    "      (\"GMAC MORTGAGE, LLC (USAA FEDERAL SAVINGS BANK)\" , \"GMAC\"),\n",
    "      (\"GMAC MORTGAGE, LLC\" , \"GMAC\"),\n",
    "      (\"GMAC (USAA)\" , \"GMAC\"),\n",
    "      (\"FREMONT BANK\" , \"Fremont Bank\"),\n",
    "      (\"FREEDOM MORTGAGE CORP.\" , \"Freedom Mortgage\"),\n",
    "      (\"FRANKLIN AMERICAN MORTGAGE COMPANY\" , \"Franklin America\"),\n",
    "      (\"FLEET NATIONAL BANK\" , \"Fleet National\"),\n",
    "      (\"FLAGSTAR CAPITAL MARKETS CORPORATION\" , \"Flagstar Bank\"),\n",
    "      (\"FLAGSTAR BANK, FSB\" , \"Flagstar Bank\"),\n",
    "      (\"FIRST TENNESSEE BANK NATIONAL ASSOCIATION\" , \"Other\"),\n",
    "      (\"FIFTH THIRD BANK\" , \"Fifth Third Bank\"),\n",
    "      (\"FEDERAL HOME LOAN BANK OF CHICAGO\" , \"Federal Home of Chicago\"),\n",
    "      (\"FDIC, RECEIVER, INDYMAC FEDERAL BANK FSB\" , \"FDIC\"),\n",
    "      (\"DOWNEY SAVINGS AND LOAN ASSOCIATION, F.A.\" , \"Downey Mortgage\"),\n",
    "      (\"DITECH FINANCIAL LLC\" , \"Ditech\"),\n",
    "      (\"CITIMORTGAGE, INC.\" , \"Citi\"),\n",
    "      (\"CHICAGO MORTGAGE SOLUTIONS DBA INTERFIRST MORTGAGE COMPANY\" , \"Chicago Mortgage\"),\n",
    "      (\"CHICAGO MORTGAGE SOLUTIONS DBA INTERBANK MORTGAGE COMPANY\" , \"Chicago Mortgage\"),\n",
    "      (\"CHASE HOME FINANCE, LLC\" , \"JP Morgan Chase\"),\n",
    "      (\"CHASE HOME FINANCE FRANKLIN AMERICAN MORTGAGE COMPANY\" , \"JP Morgan Chase\"),\n",
    "      (\"CHASE HOME FINANCE (CIE 1)\" , \"JP Morgan Chase\"),\n",
    "      (\"CHASE HOME FINANCE\" , \"JP Morgan Chase\"),\n",
    "      (\"CASHCALL, INC.\" , \"CashCall\"),\n",
    "      (\"CAPITAL ONE, NATIONAL ASSOCIATION\" , \"Capital One\"),\n",
    "      (\"CALIBER HOME LOANS, INC.\" , \"Caliber Funding\"),\n",
    "      (\"BISHOPS GATE RESIDENTIAL MORTGAGE TRUST\" , \"Bishops Gate Mortgage\"),\n",
    "      (\"BANK OF AMERICA, N.A.\" , \"Bank of America\"),\n",
    "      (\"AMTRUST BANK\" , \"AmTrust\"),\n",
    "      (\"AMERISAVE MORTGAGE CORPORATION\" , \"Amerisave\"),\n",
    "      (\"AMERIHOME MORTGAGE COMPANY, LLC\" , \"AmeriHome Mortgage\"),\n",
    "      (\"ALLY BANK\" , \"Ally Bank\"),\n",
    "      (\"ACADEMY MORTGAGE CORPORATION\" , \"Academy Mortgage\"),\n",
    "      (\"NO CASH-OUT REFINANCE\" , \"OTHER REFINANCE\"),\n",
    "      (\"REFINANCE - NOT SPECIFIED\" , \"OTHER REFINANCE\"),\n",
    "      (\"Other REFINANCE\" , \"OTHER REFINANCE\")\n",
    "    ).toDF(fromColName, toColName))\n",
    "  }\n",
    "}"
   ]
  },
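  {
   "cell_type": "markdown",
   "id": "nm-usage-sketch",
   "metadata": {},
   "source": [
    "A hedged usage sketch (the DataFrame `df` and the column names below are illustrative, not part of this notebook's pipeline): join the broadcast mapping onto a raw column, then keep the clean name.\n",
    "\n",
    "```scala\n",
    "// Illustrative only: assumes some DataFrame `df` with a seller_name column.\n",
    "val sellers = NameMapping(sparkSession, \"from_seller_name\", \"to_seller_name\")\n",
    "val cleaned = df\n",
    "  .join(sellers, col(\"seller_name\") === col(\"from_seller_name\"), \"left\")\n",
    "  .drop(\"from_seller_name\", \"seller_name\")\n",
    "  .withColumnRenamed(\"to_seller_name\", \"seller_name\")\n",
    "// Note: unmatched names become null here; a real pipeline might\n",
    "// coalesce back to the original value instead.\n",
    "```"
   ]
  },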
  {
   "cell_type": "markdown",
   "id": "42098a5a",
   "metadata": {},
   "source": [
    "### 2. Define ETL Process\n",
    "\n",
    "Define the functions that perform the ETL process.\n",
    "\n",
    "* Define function to get quarter from input CSV file name"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "id": "f18cab51",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "defined object GetQuarterFromCsvFileName\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "object GetQuarterFromCsvFileName {\n",
    "  // File names look like path/TYPE_yyyyQQ.txt, optionally followed by (_index)* where index is a single digit [0-9],\n",
    "  // e.g. mortgage/perf/Performance_2003Q4.txt_0_1.\n",
    "  // So we strip off the first \".\" and everything after it,\n",
    "  // then take everything after the last \"/\" (e.g. Performance_2003Q4).\n",
    "  def apply(): Column = substring_index(\n",
    "    substring_index(input_file_name(), \".\", 1), \"/\", -1)\n",
    "}"
   ]
  },
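  {
   "cell_type": "markdown",
   "id": "quarter-sketch",
   "metadata": {},
   "source": [
    "The two `substring_index` calls can be sanity-checked on a literal path (illustrative):\n",
    "\n",
    "```scala\n",
    "// \"mortgage/perf/Performance_2003Q4.txt_0_1\"\n",
    "//   -> before the first \".\": \"mortgage/perf/Performance_2003Q4\"\n",
    "//   -> after the last \"/\":   \"Performance_2003Q4\"\n",
    "sparkSession.range(1).select(\n",
    "  substring_index(substring_index(\n",
    "    lit(\"mortgage/perf/Performance_2003Q4.txt_0_1\"), \".\", 1), \"/\", -1).as(\"quarter\")\n",
    ").show(false)\n",
    "```"
   ]
  },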
  {
   "cell_type": "markdown",
   "id": "ead44543",
   "metadata": {},
   "source": [
    "* Define the categorical (string) columns and numeric columns"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "id": "9936e221",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "labelColName = delinquency_12\n",
       "categaryCols = List((orig_channel,FloatType), (first_home_buyer,FloatType), (loan_purpose,FloatType), (property_type,FloatType), (occupancy_status,FloatType), (property_state,FloatType), (product_type,FloatType), (relocation_mortgage_indicator,FloatType), (seller_name,FloatType), (mod_flag,FloatType))\n",
       "numericCols = List((orig_interest_rate,FloatType), (orig_upb,IntegerType), (orig_loan_term,IntegerType), (orig_ltv,FloatType), (orig_cltv,FloatType), (num_borrowers,FloatType), (dti,FloatType), (borrower_credit_score,FloatType), (num_units,IntegerType), (zip,IntegerType), (mortgage_insurance_percent,FloatType...\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/plain": [
       "List((orig_interest_rate,FloatType), (orig_upb,IntegerType), (orig_loan_term,IntegerType), (orig_ltv,FloatType), (orig_cltv,FloatType), (num_borrowers,FloatType), (dti,FloatType), (borrower_credit_score,FloatType), (num_units,IntegerType), (zip,IntegerType), (mortgage_insurance_percent,FloatType..."
      ]
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "val labelColName = \"delinquency_12\"\n",
    "val categaryCols = List(\n",
    "    (\"orig_channel\", FloatType),\n",
    "    (\"first_home_buyer\", FloatType),\n",
    "    (\"loan_purpose\", FloatType),\n",
    "    (\"property_type\", FloatType),\n",
    "    (\"occupancy_status\", FloatType),\n",
    "    (\"property_state\", FloatType),\n",
    "    (\"product_type\", FloatType),\n",
    "    (\"relocation_mortgage_indicator\", FloatType),\n",
    "    (\"seller_name\", FloatType),\n",
    "    (\"mod_flag\", FloatType)\n",
    "  )\n",
    "\n",
    "val numericCols = List(\n",
    "    (\"orig_interest_rate\", FloatType),\n",
    "    (\"orig_upb\", DoubleType),\n",
    "    (\"orig_loan_term\", IntegerType),\n",
    "    (\"orig_ltv\", FloatType),\n",
    "    (\"orig_cltv\", FloatType),\n",
    "    (\"num_borrowers\", FloatType),\n",
    "    (\"dti\", FloatType),\n",
    "    (\"borrower_credit_score\", FloatType),\n",
    "    (\"num_units\", IntegerType),\n",
    "    (\"zip\", IntegerType),\n",
    "    (\"mortgage_insurance_percent\", FloatType),\n",
    "    (\"current_loan_delinquency_status\", IntegerType),\n",
    "    (\"current_actual_upb\", FloatType),\n",
    "    (\"interest_rate\", FloatType),\n",
    "    (\"loan_age\", FloatType),\n",
    "    (\"msa\", FloatType),\n",
    "    (\"non_interest_bearing_upb\", FloatType),\n",
    "    (labelColName, IntegerType)\n",
    "  )\n",
    "\n",
    "var cachedDictDF: DataFrame = _"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "6177b6b8",
   "metadata": {},
   "source": [
    "* Define the casting process\n",
    "\n",
    "This part casts each string column to a numeric id. Example:\n",
    "```\n",
    "col_1\n",
    " \"a\"\n",
    " \"b\"\n",
    " \"c\"\n",
    " \"a\"\n",
    "# After string ====> numeric\n",
    "col_1\n",
    " 0\n",
    " 1\n",
    " 2\n",
    " 0\n",
    "```  \n",
    "<br>\n",
    "\n",
    "* Define function to get column dictionary\n",
    "\n",
    "    Example\n",
    "    ```\n",
    "    col1 = [row(data=\"a\",id=0), row(data=\"b\",id=1)]\n",
    "    ```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "id": "5091c8a1",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "genDictionary: (etlDF: org.apache.spark.sql.DataFrame, colNames: Seq[String])org.apache.spark.sql.DataFrame\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "def genDictionary(etlDF: DataFrame, colNames: Seq[String]): DataFrame = {\n",
    "    val cntTable = etlDF\n",
    "      .select(posexplode(array(colNames.map(col(_)): _*)))\n",
    "      .withColumnRenamed(\"pos\", \"column_id\")\n",
    "      .withColumnRenamed(\"col\", \"data\")\n",
    "      .filter(\"data is not null\")\n",
    "      .groupBy(\"column_id\", \"data\")\n",
    "      .count()\n",
    "    val windowed = Window.partitionBy(\"column_id\").orderBy(desc(\"count\"))\n",
    "    cntTable\n",
    "      .withColumn(\"id\", row_number().over(windowed))\n",
    "      .drop(\"count\")\n",
    "  }"
   ]
  },
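  {
   "cell_type": "markdown",
   "id": "gendict-sketch",
   "metadata": {},
   "source": [
    "A small worked example (made-up data) of what `genDictionary` returns: within each `column_id`, the most frequent value gets id 1, the next id 2, and so on.\n",
    "\n",
    "```scala\n",
    "// Illustrative only: two string columns with repeated values.\n",
    "import sparkSession.implicits._\n",
    "val demoDF = Seq((\"a\", \"x\"), (\"a\", \"y\"), (\"b\", \"x\")).toDF(\"c1\", \"c2\")\n",
    "genDictionary(demoDF, Seq(\"c1\", \"c2\")).show()\n",
    "// column_id 0 (c1): \"a\" -> id 1 (count 2), \"b\" -> id 2\n",
    "// column_id 1 (c2): \"x\" -> id 1 (count 2), \"y\" -> id 2\n",
    "```"
   ]
  },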
  {
   "cell_type": "markdown",
   "id": "1466af65",
   "metadata": {},
   "source": [
    "* Define function to convert string columns to numeric"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "id": "9df8fe60",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "castStringColumnsToNumeric: (inputDF: org.apache.spark.sql.DataFrame, spark: org.apache.spark.sql.SparkSession)org.apache.spark.sql.DataFrame\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "def castStringColumnsToNumeric(inputDF: DataFrame, spark: SparkSession): DataFrame = {\n",
    "    val cateColNames = categaryCols.map(_._1)\n",
    "    cachedDictDF = genDictionary(inputDF, cateColNames).cache()\n",
    "\n",
    "    // Generate the final table with all columns being numeric.\n",
    "    cateColNames.foldLeft(inputDF) {\n",
    "      case (df, colName) =>\n",
    "        val colPos = cateColNames.indexOf(colName)\n",
    "        val colDictDF = cachedDictDF\n",
    "          .filter(col(\"column_id\") === colPos)\n",
    "          .drop(\"column_id\")\n",
    "          .withColumnRenamed(\"data\", colName)\n",
    "        df.join(broadcast(colDictDF), Seq(colName), \"left\")\n",
    "          .drop(colName)\n",
    "          .withColumnRenamed(\"id\", colName)\n",
    "    }\n",
    "  }"
   ]
  },
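  {
   "cell_type": "markdown",
   "id": "cast-usage-sketch",
   "metadata": {},
   "source": [
    "A hedged usage sketch: after this call, every column listed in `categaryCols` holds its dictionary id instead of the original string, and `cachedDictDF` retains the mapping for later reuse (the input DataFrame here is illustrative).\n",
    "\n",
    "```scala\n",
    "// Illustrative only: any DataFrame containing the categorical columns works.\n",
    "val numericDF = castStringColumnsToNumeric(rawDf, sparkSession)\n",
    "// cachedDictDF can now be saved alongside the output to decode the ids later.\n",
    "```"
   ]
  },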
  {
   "cell_type": "code",
   "execution_count": 9,
   "id": "9e1fbb61",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "defined object extractPerfColumns\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "object extractPerfColumns{\n",
    "  def apply(rawDf : DataFrame) : DataFrame = {\n",
    "    val perfDf = rawDf.select(\n",
    "      col(\"loan_id\"),\n",
    "      date_format(to_date(col(\"monthly_reporting_period\"),\"MMyyyy\"), \"MM/dd/yyyy\").as(\"monthly_reporting_period\"),\n",
    "      upper(col(\"servicer\")).as(\"servicer\"),\n",
    "      col(\"interest_rate\"),\n",
    "      col(\"current_actual_upb\"),\n",
    "      col(\"loan_age\"),\n",
    "      col(\"remaining_months_to_legal_maturity\"),\n",
    "      col(\"adj_remaining_months_to_maturity\"),\n",
    "      date_format(to_date(col(\"maturity_date\"),\"MMyyyy\"), \"MM/yyyy\").as(\"maturity_date\"),\n",
    "      col(\"msa\"),\n",
    "      col(\"current_loan_delinquency_status\"),\n",
    "      col(\"mod_flag\"),\n",
    "      col(\"zero_balance_code\"),\n",
    "      date_format(to_date(col(\"zero_balance_effective_date\"),\"MMyyyy\"), \"MM/yyyy\").as(\"zero_balance_effective_date\"),\n",
    "      date_format(to_date(col(\"last_paid_installment_date\"),\"MMyyyy\"), \"MM/dd/yyyy\").as(\"last_paid_installment_date\"),\n",
    "      date_format(to_date(col(\"foreclosed_after\"),\"MMyyyy\"), \"MM/dd/yyyy\").as(\"foreclosed_after\"),\n",
    "      date_format(to_date(col(\"disposition_date\"),\"MMyyyy\"), \"MM/dd/yyyy\").as(\"disposition_date\"),\n",
    "      col(\"foreclosure_costs\"),\n",
    "      col(\"prop_preservation_and_repair_costs\"),\n",
    "      col(\"asset_recovery_costs\"),\n",
    "      col(\"misc_holding_expenses\"),\n",
    "      col(\"holding_taxes\"),\n",
    "      col(\"net_sale_proceeds\"),\n",
    "      col(\"credit_enhancement_proceeds\"),\n",
    "      col(\"repurchase_make_whole_proceeds\"),\n",
    "      col(\"other_foreclosure_proceeds\"),\n",
    "      col(\"non_interest_bearing_upb\"),\n",
    "      col(\"principal_forgiveness_upb\"),\n",
    "      col(\"repurchase_make_whole_proceeds_flag\"),\n",
    "      col(\"foreclosure_principal_write_off_amount\"),\n",
    "      col(\"servicing_activity_indicator\"),\n",
    "      col(\"quarter\")\n",
    "    )\n",
    "    \n",
    "    perfDf.select(\"*\").filter(\"current_actual_upb != 0.0\")\n",
    "  }\n",
    "}"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "id": "ce429163",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "defined object extractAcqColumns\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "object extractAcqColumns{\n",
    "  def apply(rawDf : DataFrame) : DataFrame = {\n",
    "    val acqDf = rawDf.select(\n",
    "      col(\"loan_id\"),\n",
    "      col(\"orig_channel\"),\n",
    "      upper(col(\"seller_name\")).as(\"seller_name\"),\n",
    "      col(\"orig_interest_rate\"),\n",
    "      col(\"orig_upb\"),\n",
    "      col(\"orig_loan_term\"),\n",
    "      date_format(to_date(col(\"orig_date\"),\"MMyyyy\"), \"MM/yyyy\").as(\"orig_date\"),\n",
    "      date_format(to_date(col(\"first_pay_date\"),\"MMyyyy\"), \"MM/yyyy\").as(\"first_pay_date\"),\n",
    "      col(\"orig_ltv\"),\n",
    "      col(\"orig_cltv\"),\n",
    "      col(\"num_borrowers\"),\n",
    "      col(\"dti\"),\n",
    "      col(\"borrower_credit_score\"),\n",
    "      col(\"first_home_buyer\"),\n",
    "      col(\"loan_purpose\"),\n",
    "      col(\"property_type\"),\n",
    "      col(\"num_units\"),\n",
    "      col(\"occupancy_status\"),\n",
    "      col(\"property_state\"),\n",
    "      col(\"zip\"),\n",
    "      col(\"mortgage_insurance_percent\"),\n",
    "      col(\"product_type\"),\n",
    "      col(\"coborrow_credit_score\"),\n",
    "      col(\"mortgage_insurance_type\"),\n",
    "      col(\"relocation_mortgage_indicator\"),\n",
    "      col(\"quarter\"),\n",
    "      dense_rank().over(Window.partitionBy(\"loan_id\").orderBy(to_date(col(\"monthly_reporting_period\"),\"MMyyyy\"))).as(\"rank\")\n",
    "    )\n",
    "\n",
    "    acqDf.select(\"*\").filter(col(\"rank\") === 1)\n",
    "  }\n",
    "\n",
    "}"
   ]
  },
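  {
   "cell_type": "markdown",
   "id": "rank-filter-sketch",
   "metadata": {},
   "source": [
    "The `rank === 1` filter keeps, for each `loan_id`, only the row from its earliest `monthly_reporting_period`. A minimal illustration with made-up data:\n",
    "\n",
    "```scala\n",
    "// Illustrative only: dense_rank over the period keeps the first record per loan.\n",
    "import sparkSession.implicits._\n",
    "val w = Window.partitionBy(\"loan_id\").orderBy(\"period\")\n",
    "Seq((1L, \"2003-01\"), (1L, \"2003-02\"), (2L, \"2003-01\"))\n",
    "  .toDF(\"loan_id\", \"period\")\n",
    "  .withColumn(\"rank\", dense_rank().over(w))\n",
    "  .filter(col(\"rank\") === 1)\n",
    "  .show()\n",
    "// keeps (1, \"2003-01\") and (2, \"2003-01\")\n",
    "```"
   ]
  },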
  {
   "cell_type": "markdown",
   "id": "37c64d85",
   "metadata": {},
   "source": [
    "* Build the spark session and data reader"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "id": "98d37174",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "sparkSession = org.apache.spark.sql.SparkSession@694178ec\n",
       "reader = org.apache.spark.sql.DataFrameReader@4b2afd51\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/plain": [
       "org.apache.spark.sql.DataFrameReader@4b2afd51"
      ]
     },
     "execution_count": 11,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "// Build the spark session and data reader as usual\n",
    "val sparkSession = SparkSession.builder.appName(\"mortgage-gpu\").config(\"spark.sql.cache.serializer\", \"com.nvidia.spark.ParquetCachedBatchSerializer\").getOrCreate\n",
    "\n",
    "// GPU run, set to true\n",
    "sparkSession.conf.set(\"spark.rapids.sql.enabled\", true)\n",
    "// CPU run, set to false\n",
    "// sparkSession.conf.set(\"spark.rapids.sql.enabled\", false)\n",
    "// and remove config(\"spark.sql.cache.serializer\", \"com.nvidia.spark.ParquetCachedBatchSerializer\") above\n",
    "sparkSession.conf.set(\"spark.sql.files.maxPartitionBytes\", \"1G\")\n",
    "sparkSession.conf.set(\"spark.sql.broadcastTimeout\", 700)\n",
    "// use GPU to read CSV\n",
    "sparkSession.conf.set(\"spark.rapids.sql.csv.read.double.enabled\", true)\n",
    "\n",
    "val reader = sparkSession.read.schema(rawSchema)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b47b5456",
   "metadata": {},
   "source": [
    "* Read CSV Files"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "id": "5bac2301",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "optionsMap = Map(header -> true)\n",
       "rawDf = [reference_pool_id: string, loan_id: bigint ... 107 more fields]\n",
       "perfSet = [loan_id: bigint, monthly_reporting_period: string ... 30 more fields]\n",
       "acqSet = [loan_id: bigint, orig_channel: string ... 25 more fields]\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/plain": [
       "[loan_id: bigint, orig_channel: string ... 25 more fields]"
      ]
     },
     "execution_count": 12,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "val rawDf_csv = reader.option(\"header\", false)\n",
    "      .option(\"nullValue\", \"\")\n",
    "      .option(\"delimiter\", \"|\")\n",
    "      .option(\"parserLib\", \"univocity\")\n",
    "      .schema(rawSchema)\n",
    "      .csv(dataPath)\n",
    "      .withColumn(\"quarter\", GetQuarterFromCsvFileName())\n",
    "\n",
    "rawDf_csv.write.mode(\"overwrite\").parquet(output_csv2parquet)\n",
    "val rawDf = sparkSession.read.parquet(output_csv2parquet)\n",
    "\n",
    "val perfSet = extractPerfColumns(rawDf)\n",
    "val acqSet = extractAcqColumns(rawDf)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "f4c814c8",
   "metadata": {},
   "source": [
    "* Define ETL Object"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "id": "a16155cb",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "defined trait MortgageETL\n",
       "allCols = List(orig_channel, first_home_buyer, loan_purpose, property_type, occupancy_status, property_state, product_type, relocation_mortgage_indicator, seller_name, mod_flag, orig_interest_rate, orig_upb, orig_loan_term, orig_ltv, orig_cltv, num_borrowers, dti, borrower_credit_score, num_units, zip, mortgage_insurance_percent, current_loan_delinquency_status, current_actual_upb, interest_rate, loan_age, msa, non_interest_bearing_upb, delinquency_12)\n",
       "defined object PerformanceETL\n",
       "defined object AcquisitionETL\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/plain": [
       "List(orig_channel, first_home_buyer, loan_purpose, property_type, occupancy_status, property_state, product_type, relocation_mortgage_indicator, seller_name, mod_flag, orig_interest_rate, orig_upb, orig_loan_term, orig_ltv, orig_cltv, num_borrowers, dti, borrower_credit_score, num_units, zip, mortgage_insurance_percent, current_loan_delinquency_status, current_actual_upb, interest_rate, loan_age, msa, non_interest_bearing_upb, delinquency_12)"
      ]
     },
     "execution_count": 13,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "trait MortgageETL {\n",
    "  var dataFrame: DataFrame = _\n",
    "\n",
    "  def from(df: DataFrame): this.type = {\n",
    "    dataFrame = df\n",
    "    this\n",
    "  }\n",
    "}\n",
    "val allCols = (categaryCols ++ numericCols).map(c => col(c._1))\n",
    "\n",
    "object PerformanceETL extends MortgageETL {\n",
    "\n",
    "  def prepare: this.type = {\n",
    "    dataFrame = dataFrame\n",
    "      .withColumn(\"monthly_reporting_period\", to_date(col(\"monthly_reporting_period\"), \"MM/dd/yyyy\"))\n",
    "      .withColumn(\"monthly_reporting_period_month\", month(col(\"monthly_reporting_period\")))\n",
    "      .withColumn(\"monthly_reporting_period_year\", year(col(\"monthly_reporting_period\")))\n",
    "      .withColumn(\"monthly_reporting_period_day\", dayofmonth(col(\"monthly_reporting_period\")))\n",
    "      .withColumn(\"last_paid_installment_date\", to_date(col(\"last_paid_installment_date\"), \"MM/dd/yyyy\"))\n",
    "      .withColumn(\"foreclosed_after\", to_date(col(\"foreclosed_after\"), \"MM/dd/yyyy\"))\n",
    "      .withColumn(\"disposition_date\", to_date(col(\"disposition_date\"), \"MM/dd/yyyy\"))\n",
    "      .withColumn(\"maturity_date\", to_date(col(\"maturity_date\"), \"MM/yyyy\"))\n",
    "      .withColumn(\"zero_balance_effective_date\", to_date(col(\"zero_balance_effective_date\"), \"MM/yyyy\"))\n",
    "    this\n",
    "  }\n",
    "\n",
    "  def createDelinquency(spark: SparkSession): this.type = {\n",
    "    val aggDF = dataFrame\n",
    "      .select(\n",
    "        col(\"quarter\"),\n",
    "        col(\"loan_id\"),\n",
    "        col(\"current_loan_delinquency_status\"),\n",
    "        when(col(\"current_loan_delinquency_status\") >= 1, col(\"monthly_reporting_period\")).alias(\"delinquency_30\"),\n",
    "        when(col(\"current_loan_delinquency_status\") >= 3, col(\"monthly_reporting_period\")).alias(\"delinquency_90\"),\n",
    "        when(col(\"current_loan_delinquency_status\") >= 6, col(\"monthly_reporting_period\")).alias(\"delinquency_180\")\n",
    "      )\n",
    "      .groupBy(\"quarter\", \"loan_id\")\n",
    "      .agg(\n",
    "        max(\"current_loan_delinquency_status\").alias(\"delinquency_12\"),\n",
    "        min(\"delinquency_30\").alias(\"delinquency_30\"),\n",
    "        min(\"delinquency_90\").alias(\"delinquency_90\"),\n",
    "        min(\"delinquency_180\").alias(\"delinquency_180\")\n",
    "      )\n",
    "      .select(\n",
    "        col(\"quarter\"),\n",
    "        col(\"loan_id\"),\n",
    "        (col(\"delinquency_12\") >= 1).alias(\"ever_30\"),\n",
    "        (col(\"delinquency_12\") >= 3).alias(\"ever_90\"),\n",
    "        (col(\"delinquency_12\") >= 6).alias(\"ever_180\"),\n",
    "        col(\"delinquency_30\"),\n",
    "        col(\"delinquency_90\"),\n",
    "        col(\"delinquency_180\")\n",
    "      )\n",
    "\n",
    "    val joinedDf = dataFrame\n",
    "      .withColumnRenamed(\"monthly_reporting_period\", \"timestamp\")\n",
    "      .withColumnRenamed(\"monthly_reporting_period_month\", \"timestamp_month\")\n",
    "      .withColumnRenamed(\"monthly_reporting_period_year\", \"timestamp_year\")\n",
    "      .withColumnRenamed(\"current_loan_delinquency_status\", \"delinquency_12\")\n",
    "      .withColumnRenamed(\"current_actual_upb\", \"upb_12\")\n",
    "      .select(\"quarter\", \"loan_id\", \"timestamp\", \"delinquency_12\", \"upb_12\", \"timestamp_month\", \"timestamp_year\")\n",
    "      .join(aggDF, Seq(\"loan_id\", \"quarter\"), \"left_outer\")\n",
    "\n",
    "    // calculate the 12 month delinquency and upb values\n",
    "    val months = 12\n",
    "    val monthArray = 0.until(months).toArray\n",
    "    val testDf = joinedDf\n",
    "      // explode on a small amount of data is actually slightly more efficient than a cross join\n",
    "      .withColumn(\"month_y\", explode(lit(monthArray)))\n",
    "      .select(\n",
    "        col(\"quarter\"),\n",
    "        floor(((col(\"timestamp_year\") * 12 + col(\"timestamp_month\")) - 24000) / months).alias(\"josh_mody\"),\n",
    "        floor(((col(\"timestamp_year\") * 12 + col(\"timestamp_month\")) - 24000 - col(\"month_y\")) / months).alias(\"josh_mody_n\"),\n",
    "        col(\"ever_30\"),\n",
    "        col(\"ever_90\"),\n",
    "        col(\"ever_180\"),\n",
    "        col(\"delinquency_30\"),\n",
    "        col(\"delinquency_90\"),\n",
    "        col(\"delinquency_180\"),\n",
    "        col(\"loan_id\"),\n",
    "        col(\"month_y\"),\n",
    "        col(\"delinquency_12\"),\n",
    "        col(\"upb_12\")\n",
    "      )\n",
    "      .groupBy(\"quarter\", \"loan_id\", \"josh_mody_n\", \"ever_30\", \"ever_90\", \"ever_180\", \"delinquency_30\", \"delinquency_90\", \"delinquency_180\", \"month_y\")\n",
    "      .agg(max(\"delinquency_12\").alias(\"delinquency_12\"), min(\"upb_12\").alias(\"upb_12\"))\n",
    "      .withColumn(\"timestamp_year\", floor((lit(24000) + (col(\"josh_mody_n\") * lit(months)) + (col(\"month_y\") - 1)) / lit(12)))\n",
    "      .withColumn(\"timestamp_month_tmp\", pmod(lit(24000) + (col(\"josh_mody_n\") * lit(months)) + col(\"month_y\"), lit(12)))\n",
    "      .withColumn(\"timestamp_month\", when(col(\"timestamp_month_tmp\") === lit(0), lit(12)).otherwise(col(\"timestamp_month_tmp\")))\n",
    "      .withColumn(\"delinquency_12\", ((col(\"delinquency_12\") > 3).cast(\"int\") + (col(\"upb_12\") === 0).cast(\"int\")).alias(\"delinquency_12\"))\n",
    "      .drop(\"timestamp_month_tmp\", \"josh_mody_n\", \"month_y\")\n",
    "\n",
    "    dataFrame = dataFrame\n",
    "      .withColumnRenamed(\"monthly_reporting_period_month\", \"timestamp_month\")\n",
    "      .withColumnRenamed(\"monthly_reporting_period_year\", \"timestamp_year\")\n",
    "      .join(testDf, Seq(\"quarter\", \"loan_id\", \"timestamp_year\", \"timestamp_month\"), \"left\").drop(\"timestamp_year\", \"timestamp_month\")\n",
    "    this\n",
    "  }\n",
    "}\n",
    "\n",
    "object AcquisitionETL extends MortgageETL {\n",
    "\n",
    "  def createAcquisition(spark: SparkSession): this.type = {\n",
    "    val nameMapping = NameMapping(spark, \"from_seller_name\", \"to_seller_name\")\n",
    "    dataFrame = dataFrame\n",
    "      .join(nameMapping, col(\"seller_name\") === col(\"from_seller_name\"), \"left\")\n",
    "      .drop(\"from_seller_name\")\n",
    "      /* backup the original name before we replace it */\n",
    "      .withColumn(\"old_name\", col(\"seller_name\"))\n",
    "      /* replace seller_name with the new version if we found one in the mapping, or the old version\n",
    "       if we didn't */\n",
    "      .withColumn(\"seller_name\", coalesce(col(\"to_seller_name\"), col(\"seller_name\")))\n",
    "      .drop(\"to_seller_name\")\n",
    "      .withColumn(\"orig_date\", to_date(col(\"orig_date\"), \"MM/yyyy\"))\n",
    "      .withColumn(\"first_pay_date\", to_date(col(\"first_pay_date\"), \"MM/yyyy\"))\n",
    "    this\n",
    "  }\n",
    "\n",
    "  def cleanPrime(perfDF: DataFrame): this.type = {\n",
    "    dataFrame = perfDF.join(dataFrame, Seq(\"loan_id\", \"quarter\"), \"inner\").drop(\"quarter\")\n",
    "    this\n",
    "  }\n",
    "}\n"
   ]
  },
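  {
   "cell_type": "markdown",
   "id": "7d2e4f10",
   "metadata": {},
   "source": [
    "The `josh_mody_n` arithmetic in `createDelinquency` flattens each `(year, month)` pair into a single month index (the `24000` offset is just `2000 * 12`, which keeps the numbers small for data starting around the year 2000) and later recovers `(timestamp_year, timestamp_month)` from it. A plain-Scala sketch of that round trip (`toIndex`/`fromIndex` are illustrative names, not part of the notebook):\n",
    "\n",
    "```scala\n",
    "// year * 12 + month, shifted by 24000 = 2000 * 12\n",
    "def toIndex(year: Int, month: Int): Int = year * 12 + month - 24000\n",
    "\n",
    "def fromIndex(n: Int): (Int, Int) = {\n",
    "  val m = n % 12\n",
    "  val month = if (m == 0) 12 else m   // the pmod(...) === 0 -> 12 correction\n",
    "  val year = (24000 + n - 1) / 12     // mirrors floor((24000 + ... - 1) / 12)\n",
    "  (year, month)\n",
    "}\n",
    "\n",
    "// fromIndex(toIndex(2003, 5)) == (2003, 5); December maps back to month 12, not 0\n",
    "```"
   ]
  },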
  {
   "cell_type": "code",
   "execution_count": 14,
   "id": "78b76252",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "transform: (perfDF: org.apache.spark.sql.DataFrame, acqDF: org.apache.spark.sql.DataFrame, spark: org.apache.spark.sql.SparkSession)org.apache.spark.sql.DataFrame\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "def transform(perfDF: DataFrame, acqDF: DataFrame, spark: SparkSession): DataFrame = {\n",
    "    val etlPerfDF = PerformanceETL.from(perfDF)\n",
    "      .prepare\n",
    "      .createDelinquency(spark)\n",
    "      .dataFrame\n",
    "    val cleanDF = AcquisitionETL.from(acqDF)\n",
    "      .createAcquisition(spark)\n",
    "      .cleanPrime(etlPerfDF)\n",
    "      .dataFrame\n",
    "\n",
    "    // Convert to xgb required Dataset\n",
    "    castStringColumnsToNumeric(cleanDF, spark)\n",
    "      .select(allCols: _*)\n",
    "      .withColumn(labelColName, when(col(labelColName) > 0, 1).otherwise(0))\n",
    "      .na.fill(0.0f)\n",
    "  }"
   ]
  },
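  {
   "cell_type": "markdown",
   "id": "5a8c0d21",
   "metadata": {},
   "source": [
    "The chained calls in `transform` work because each ETL method returns `this.type`, so the concrete singleton (not the bare `MortgageETL` trait) flows through the chain. A minimal standalone sketch of the pattern (`Stage`, `MyStage`, and `counter` are made-up names):\n",
    "\n",
    "```scala\n",
    "trait Stage {\n",
    "  var counter: Int = 0\n",
    "  // Returning this.type (not Stage) keeps the concrete object's\n",
    "  // members visible after each call in the chain\n",
    "  def from(n: Int): this.type = { counter = n; this }\n",
    "  def double: this.type = { counter *= 2; this }\n",
    "}\n",
    "object MyStage extends Stage\n",
    "\n",
    "val result = MyStage.from(21).double.counter  // 42\n",
    "```"
   ]
  },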
  {
   "cell_type": "markdown",
   "id": "b1234f49",
   "metadata": {},
   "source": [
    "## Run ETL Process and Save the Result"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "id": "ffdb0a62",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Elapsed time : 399.241s\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "t0 = 1656695479451\n",
       "optionsMap = Map(header -> true)\n",
       "rawDF = [orig_channel: int, first_home_buyer: int ... 26 more fields]\n",
       "t1 = 1656695878692\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/plain": [
       "1656695878692"
      ]
     },
     "execution_count": 15,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "val t0 = System.currentTimeMillis\n",
    "val rawDF = transform(\n",
    "      perfSet,\n",
    "      acqSet,\n",
    "      sparkSession\n",
    "    )\n",
    "\n",
    "val etlDataPath = new Path(outPath, \"data\").toString\n",
    "rawDF.write.mode(\"overwrite\").parquet(etlDataPath)\n",
    "\n",
    "if (saveTrainEvalDataset) {\n",
    "  val etlDf = sparkSession.read.parquet(etlDataPath)\n",
    "  val sets = etlDf.randomSplit(Array[Double](0.8, 0.2))\n",
    "  val train = sets(0)\n",
    "  val eval = sets(1)\n",
    "  train.write.mode(\"overwrite\").parquet(new Path(outPath, \"train\").toString)\n",
    "  eval.write.mode(\"overwrite\").parquet(new Path(outPath, \"eval\").toString)\n",
    "}\n",
    "\n",
    "val t1 = System.currentTimeMillis\n",
    "println(\"Elapsed time : \" + ((t1 - t0).toFloat / 1000) + \"s\")\n",
    "sparkSession.stop()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "XGBoost4j-Spark Scala",
   "language": "scala",
   "name": "XGBoost4j-Spark_scala"
  },
  "language_info": {
   "codemirror_mode": "text/x-scala",
   "file_extension": ".scala",
   "mimetype": "text/x-scala",
   "name": "scala",
   "pygments_lexer": "scala",
   "version": "2.12.15"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
