{
  "cells": [
    {
      "cell_type": "markdown",
      "id": "57b58c35",
      "metadata": {},
      "source": [
        "## Evaluation of prediction results\n",
        "\n",
        " <a target=\"_blank\" href=\"https://colab.research.google.com/github/moj-analytical-services/splink/blob/master/docs/demos/tutorials/07_Quality_assurance.ipynb\">\n",
        "  <img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n",
        "</a>\n",
        "\n",
        "In the previous tutorial, we looked at various ways to visualise the results of our model.\n",
        "These are useful for evaluating a linkage pipeline because they allow us to understand how our model works and verify that it is doing something sensible. They can also be useful for identifying examples where the model is not performing as expected.\n",
        "\n",
        "In addition to these spot checks, Splink also has functions to perform more formal accuracy analysis. These functions allow you to understand the likely prevalence of false positives and false negatives in your linkage models.\n",
        "\n",
        "They rely on a sample of labelled (ground truth) matches, which may have been produced, for example, by clerical review. For the accuracy analysis to be unbiased, this sample should be representative of the overall dataset.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 1,
      "id": "e08e61e5",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:15.075066Z",
          "iopub.status.busy": "2024-07-18T13:59:15.074751Z",
          "iopub.status.idle": "2024-07-18T13:59:15.095735Z",
          "shell.execute_reply": "2024-07-18T13:59:15.094736Z"
        },
        "tags": [
          "hide_input"
        ]
      },
      "outputs": [],
      "source": [
        "# Uncomment and run this cell if you're running in Google Colab.\n",
        "# !pip install splink"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 2,
      "id": "fb29d421",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:15.100802Z",
          "iopub.status.busy": "2024-07-18T13:59:15.100475Z",
          "iopub.status.idle": "2024-07-18T13:59:17.210056Z",
          "shell.execute_reply": "2024-07-18T13:59:17.209284Z"
        }
      },
      "outputs": [],
      "source": [
        "# Rerun our predictions so we're ready to view the charts\n",
        "import pandas as pd\n",
        "\n",
        "from splink import DuckDBAPI, Linker, splink_datasets\n",
        "\n",
        "pd.options.display.max_columns = 1000\n",
        "\n",
        "db_api = DuckDBAPI()\n",
        "df = splink_datasets.fake_1000"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 3,
      "id": "f88cc1c1",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:17.214467Z",
          "iopub.status.busy": "2024-07-18T13:59:17.214127Z",
          "iopub.status.idle": "2024-07-18T13:59:18.511128Z",
          "shell.execute_reply": "2024-07-18T13:59:18.510248Z"
        }
      },
      "outputs": [
        {
          "name": "stderr",
          "output_type": "stream",
          "text": [
            "Blocking time: 0.02 seconds\n"
          ]
        },
        {
          "name": "stderr",
          "output_type": "stream",
          "text": [
            "Predict time: 0.80 seconds\n"
          ]
        },
        {
          "name": "stderr",
          "output_type": "stream",
          "text": [
            "\n",
            " -- WARNING --\n",
            "You have called predict(), but there are some parameter estimates which have neither been estimated or specified in your settings dictionary.  To produce predictions the following untrained trained parameters will use default values.\n",
            "Comparison: 'email':\n",
            "    m values not fully trained\n"
          ]
        }
      ],
      "source": [
        "import json\n",
        "import urllib\n",
        "\n",
        "from splink import block_on\n",
        "\n",
        "url = \"https://raw.githubusercontent.com/moj-analytical-services/splink/847e32508b1a9cdd7bcd2ca6c0a74e547fb69865/docs/demos/demo_settings/saved_model_from_demo.json\"\n",
        "\n",
        "with urllib.request.urlopen(url) as u:\n",
        "    settings = json.loads(u.read().decode())\n",
        "\n",
        "# The data quality is very poor in this dataset, so we need looser blocking rules\n",
        "# to achieve decent recall\n",
        "settings[\"blocking_rules_to_generate_predictions\"] = [\n",
        "    block_on(\"first_name\"),\n",
        "    block_on(\"city\"),\n",
        "    block_on(\"email\"),\n",
        "    block_on(\"dob\"),\n",
        "]\n",
        "\n",
        "linker = Linker(df, settings, db_api=DuckDBAPI())\n",
        "df_predictions = linker.inference.predict(threshold_match_probability=0.01)"
      ]
    },
    {
      "cell_type": "markdown",
      "id": "7b0dedd9",
      "metadata": {},
      "source": [
        "## Load in labels\n",
        "\n",
        "The labels file contains a list of pairwise comparisons that represent matches and non-matches.\n",
        "\n",
        "The required format of the labels file is described [here](https://moj-analytical-services.github.io/splink/api_docs/evaluation.html#splink.internals.linker_components.evaluation.LinkerEvalution.prediction_errors_from_labels_table).\n"
      ]
    },
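    {
      "cell_type": "markdown",
      "id": "labels-format-sketch",
      "metadata": {},
      "source": [
        "As a rough sketch of that format (values taken from the sample displayed below), each row pairs two records and assigns a `clerical_match_score` between 0 and 1, where 1.0 indicates a definite match:\n",
        "\n",
        "```\n",
        "unique_id_l  source_dataset_l  unique_id_r  source_dataset_r  clerical_match_score\n",
        "0            fake_1000         1            fake_1000         1.0\n",
        "0            fake_1000         4            fake_1000         0.0\n",
        "```\n"
      ]
    },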
    {
      "cell_type": "code",
      "execution_count": 4,
      "id": "bbfdc70c",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:18.515644Z",
          "iopub.status.busy": "2024-07-18T13:59:18.515122Z",
          "iopub.status.idle": "2024-07-18T13:59:18.552541Z",
          "shell.execute_reply": "2024-07-18T13:59:18.551821Z"
        }
      },
      "outputs": [
        {
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>unique_id_l</th>\n",
              "      <th>source_dataset_l</th>\n",
              "      <th>unique_id_r</th>\n",
              "      <th>source_dataset_r</th>\n",
              "      <th>clerical_match_score</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>0</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>1</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>1.0</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>0</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>2</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>1.0</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2</th>\n",
              "      <td>0</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>3</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>1.0</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>3</th>\n",
              "      <td>0</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>4</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>0.0</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>4</th>\n",
              "      <td>0</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>5</td>\n",
              "      <td>fake_1000</td>\n",
              "      <td>0.0</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "   unique_id_l source_dataset_l  unique_id_r source_dataset_r  \\\n",
              "0            0        fake_1000            1        fake_1000   \n",
              "1            0        fake_1000            2        fake_1000   \n",
              "2            0        fake_1000            3        fake_1000   \n",
              "3            0        fake_1000            4        fake_1000   \n",
              "4            0        fake_1000            5        fake_1000   \n",
              "\n",
              "   clerical_match_score  \n",
              "0                   1.0  \n",
              "1                   1.0  \n",
              "2                   1.0  \n",
              "3                   0.0  \n",
              "4                   0.0  "
            ]
          },
          "execution_count": 4,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "from splink.datasets import splink_dataset_labels\n",
        "\n",
        "df_labels = splink_dataset_labels.fake_1000_labels\n",
        "labels_table = linker.table_management.register_labels_table(df_labels)\n",
        "df_labels.head(5)"
      ]
    },
    {
      "cell_type": "markdown",
      "id": "ff86458e",
      "metadata": {},
      "source": [
        "## View examples of false positives and false negatives"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 5,
      "id": "c5b3deb6",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:18.556625Z",
          "iopub.status.busy": "2024-07-18T13:59:18.556304Z",
          "iopub.status.idle": "2024-07-18T13:59:19.797703Z",
          "shell.execute_reply": "2024-07-18T13:59:19.797008Z"
        }
      },
      "outputs": [
        {
          "data": {
            "text/html": [
              "\n",
              "<style>\n",
              "  #altair-viz-f1e5d20414d74568aa5d122b128cd76d.vega-embed {\n",
              "    width: 100%;\n",
              "    display: flex;\n",
              "  }\n",
              "\n",
              "  #altair-viz-f1e5d20414d74568aa5d122b128cd76d.vega-embed details,\n",
              "  #altair-viz-f1e5d20414d74568aa5d122b128cd76d.vega-embed details summary {\n",
              "    position: relative;\n",
              "  }\n",
              "</style>\n",
              "<div id=\"altair-viz-f1e5d20414d74568aa5d122b128cd76d\"></div>\n",
              "<script type=\"text/javascript\">\n",
              "  var VEGA_DEBUG = (typeof VEGA_DEBUG == \"undefined\") ? {} : VEGA_DEBUG;\n",
              "  (function(spec, embedOpt){\n",
              "    let outputDiv = document.currentScript.previousElementSibling;\n",
              "    if (outputDiv.id !== \"altair-viz-f1e5d20414d74568aa5d122b128cd76d\") {\n",
              "      outputDiv = document.getElementById(\"altair-viz-f1e5d20414d74568aa5d122b128cd76d\");\n",
              "    }\n",
              "    const paths = {\n",
              "      \"vega\": \"https://cdn.jsdelivr.net/npm/vega@5?noext\",\n",
              "      \"vega-lib\": \"https://cdn.jsdelivr.net/npm/vega-lib?noext\",\n",
              "      \"vega-lite\": \"https://cdn.jsdelivr.net/npm/vega-lite@5.17.0?noext\",\n",
              "      \"vega-embed\": \"https://cdn.jsdelivr.net/npm/vega-embed@6?noext\",\n",
              "    };\n",
              "\n",
              "    function maybeLoadScript(lib, version) {\n",
              "      var key = `${lib.replace(\"-\", \"\")}_version`;\n",
              "      return (VEGA_DEBUG[key] == version) ?\n",
              "        Promise.resolve(paths[lib]) :\n",
              "        new Promise(function(resolve, reject) {\n",
              "          var s = document.createElement('script');\n",
              "          document.getElementsByTagName(\"head\")[0].appendChild(s);\n",
              "          s.async = true;\n",
              "          s.onload = () => {\n",
              "            VEGA_DEBUG[key] = version;\n",
              "            return resolve(paths[lib]);\n",
              "          };\n",
              "          s.onerror = () => reject(`Error loading script: ${paths[lib]}`);\n",
              "          s.src = paths[lib];\n",
              "        });\n",
              "    }\n",
              "\n",
              "    function showError(err) {\n",
              "      outputDiv.innerHTML = `<div class=\"error\" style=\"color:red;\">${err}</div>`;\n",
              "      throw err;\n",
              "    }\n",
              "\n",
              "    function displayChart(vegaEmbed) {\n",
              "      vegaEmbed(outputDiv, spec, embedOpt)\n",
              "        .catch(err => showError(`Javascript Error: ${err.message}<br>This usually means there's a typo in your chart specification. See the javascript console for the full traceback.`));\n",
              "    }\n",
              "\n",
              "    if(typeof define === \"function\" && define.amd) {\n",
              "      requirejs.config({paths});\n",
              "      require([\"vega-embed\"], displayChart, err => showError(`Error loading script: ${err.message}`));\n",
              "    } else {\n",
              "      maybeLoadScript(\"vega\", \"5\")\n",
              "        .then(() => maybeLoadScript(\"vega-lite\", \"5.17.0\"))\n",
              "        .then(() => maybeLoadScript(\"vega-embed\", \"6\"))\n",
              "        .catch(showError)\n",
              "        .then(() => displayChart(vegaEmbed));\n",
              "    }\n",
              "  })({\"config\": {\"view\": {\"continuousWidth\": 400, \"continuousHeight\": 300}}, \"layer\": [{\"layer\": [{\"mark\": \"rule\", \"encoding\": {\"color\": {\"value\": \"black\"}, \"size\": {\"value\": 0.5}, \"y\": {\"field\": \"zero\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"bar\", \"width\": 60}, \"encoding\": {\"color\": {\"condition\": {\"test\": \"(datum.log2_bayes_factor < 0)\", \"value\": \"red\"}, \"value\": \"green\"}, \"opacity\": {\"condition\": {\"test\": \"datum.column_name == 'Prior match weight' || datum.column_name == 'Final score'\", \"value\": 1}, \"value\": 0.5}, \"tooltip\": [{\"field\": \"column_name\", \"title\": \"Comparison column\", \"type\": \"nominal\"}, {\"field\": \"value_l\", \"title\": \"Value (L)\", \"type\": \"nominal\"}, {\"field\": \"value_r\", \"title\": \"Value (R)\", \"type\": \"nominal\"}, {\"field\": \"label_for_charts\", \"title\": \"Label\", \"type\": \"ordinal\"}, {\"field\": \"sql_condition\", \"title\": \"SQL condition\", \"type\": \"nominal\"}, {\"field\": \"comparison_vector_value\", \"title\": \"Comparison vector value\", \"type\": \"nominal\"}, {\"field\": \"bayes_factor\", \"format\": \",.4f\", \"title\": \"Bayes factor = m/u\", \"type\": \"quantitative\"}, {\"field\": \"log2_bayes_factor\", \"format\": \",.4f\", \"title\": \"Match weight = log2(m/u)\", \"type\": \"quantitative\"}, {\"field\": \"prob\", \"format\": \".4f\", \"title\": \"Cumulative match probability\", \"type\": \"quantitative\"}, {\"field\": \"bayes_factor_description\", \"title\": \"Match weight description\", \"type\": \"nominal\"}], \"x\": {\"axis\": {\"grid\": true, \"labelAlign\": \"center\", \"labelAngle\": -20, \"labelExpr\": \"datum.value == 'Prior' || datum.value == 'Final score' ? 
'' : datum.value\", \"labelPadding\": 10, \"tickBand\": \"extent\", \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"axis\": {\"grid\": false, \"orient\": \"left\", \"title\": \"Match Weight\"}, \"field\": \"previous_sum\", \"type\": \"quantitative\"}, \"y2\": {\"field\": \"sum\"}}}, {\"mark\": {\"type\": \"text\", \"fontWeight\": \"bold\"}, \"encoding\": {\"color\": {\"value\": \"white\"}, \"text\": {\"condition\": {\"test\": \"abs(datum.log2_bayes_factor) > 1\", \"field\": \"log2_bayes_factor\", \"format\": \".2f\", \"type\": \"nominal\"}, \"value\": \"\"}, \"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"axis\": {\"orient\": \"left\"}, \"field\": \"center\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"text\", \"baseline\": \"bottom\", \"dy\": -25, \"fontWeight\": \"bold\"}, \"encoding\": {\"color\": {\"value\": \"black\"}, \"text\": {\"field\": \"column_name\", \"type\": \"nominal\"}, \"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"field\": \"sum_top\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"text\", \"baseline\": \"bottom\", \"dy\": -13, \"fontSize\": 8}, \"encoding\": {\"color\": {\"value\": \"grey\"}, \"text\": {\"field\": \"value_l\", \"type\": \"nominal\"}, \"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"field\": \"sum_top\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"text\", \"baseline\": \"bottom\", \"dy\": -5, \"fontSize\": 8}, \"encoding\": {\"color\": {\"value\": \"grey\"}, 
\"text\": {\"field\": \"value_r\", \"type\": \"nominal\"}, \"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"field\": \"sum_top\", \"type\": \"quantitative\"}}}]}, {\"mark\": {\"type\": \"rule\", \"color\": \"black\", \"strokeWidth\": 2, \"x2Offset\": 30, \"xOffset\": -30}, \"encoding\": {\"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"x2\": {\"field\": \"lead\"}, \"y\": {\"axis\": {\"labelExpr\": \"format(1 / (1 + pow(2, -1*datum.value)), '.2r')\", \"orient\": \"right\", \"title\": \"Probability\"}, \"field\": \"sum\", \"scale\": {\"zero\": false}, \"type\": \"quantitative\"}}}], \"data\": {\"name\": \"data-d899a450bb838276b2479d00349093c8\"}, \"height\": 450, \"params\": [{\"name\": \"record_number\", \"bind\": {\"input\": \"range\", \"max\": 4, \"min\": 0, \"step\": 1}, \"value\": 0}], \"resolve\": {\"axis\": {\"y\": \"independent\"}}, \"title\": {\"text\": \"Match weights waterfall chart\", \"subtitle\": \"How each comparison contributes to the final match score\"}, \"transform\": [{\"filter\": \"(datum.record_number == record_number)\"}, {\"filter\": \"(datum.bayes_factor !== 1.0)\"}, {\"window\": [{\"op\": \"sum\", \"field\": \"log2_bayes_factor\", \"as\": \"sum\"}, {\"op\": \"lead\", \"field\": \"column_name\", \"as\": \"lead\"}], \"frame\": [null, 0]}, {\"calculate\": \"datum.column_name === \\\"Final score\\\" ? datum.sum - datum.log2_bayes_factor : datum.sum\", \"as\": \"sum\"}, {\"calculate\": \"datum.lead === null ? datum.column_name : datum.lead\", \"as\": \"lead\"}, {\"calculate\": \"datum.column_name === \\\"Final score\\\" || datum.column_name === \\\"Prior match weight\\\" ? 
0 : datum.sum - datum.log2_bayes_factor\", \"as\": \"previous_sum\"}, {\"calculate\": \"datum.sum > datum.previous_sum ? datum.column_name : \\\"\\\"\", \"as\": \"top_label\"}, {\"calculate\": \"datum.sum < datum.previous_sum ? datum.column_name : \\\"\\\"\", \"as\": \"bottom_label\"}, {\"calculate\": \"datum.sum > datum.previous_sum ? datum.sum : datum.previous_sum\", \"as\": \"sum_top\"}, {\"calculate\": \"datum.sum < datum.previous_sum ? datum.sum : datum.previous_sum\", \"as\": \"sum_bottom\"}, {\"calculate\": \"(datum.sum + datum.previous_sum) / 2\", \"as\": \"center\"}, {\"calculate\": \"(datum.log2_bayes_factor > 0 ? \\\"+\\\" : \\\"\\\") + datum.log2_bayes_factor\", \"as\": \"text_log2_bayes_factor\"}, {\"calculate\": \"datum.sum < datum.previous_sum ? 4 : -4\", \"as\": \"dy\"}, {\"calculate\": \"datum.sum < datum.previous_sum ? \\\"top\\\" : \\\"bottom\\\"\", \"as\": \"baseline\"}, {\"calculate\": \"1. / (1 + pow(2, -1.*datum.sum))\", \"as\": \"prob\"}, {\"calculate\": \"0*datum.sum\", \"as\": \"zero\"}], \"width\": {\"step\": 75}, \"$schema\": \"https://vega.github.io/schema/vega-lite/v5.9.3.json\", \"datasets\": {\"data-d899a450bb838276b2479d00349093c8\": [{\"column_name\": \"Prior\", \"label_for_charts\": \"Starting match weight (prior)\", \"sql_condition\": null, \"log2_bayes_factor\": -8.386106589561935, \"bayes_factor\": 0.002989030659078392, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 0, \"record_number\": 0}, {\"sql_condition\": \"\\\"first_name_l\\\" IS NULL OR \\\"first_name_r\\\" IS NULL\", \"label_for_charts\": \"first_name is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `first_name is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": 
\"first_name\", \"value_l\": \"None\", \"value_r\": \"Macdonald\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 1, \"record_number\": 0}, {\"sql_condition\": \"\\\"first_name_l\\\" IS NULL OR \\\"first_name_r\\\" IS NULL\", \"label_for_charts\": \"first_name is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `first_name is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_first_name\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 2, \"record_number\": 0}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 0.22323745980048332, \"log2_bayes_factor\": -2.163348959593668, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"surname\", \"value_l\": \"Macdonald\", \"value_r\": \"Dylan\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 3, \"record_number\": 0}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"tf_surname\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 4, \"record_number\": 0}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.4592001552927498, \"u_probability\": 0.9966506506506506, \"bayes_factor\": 0.4607433457179473, \"log2_bayes_factor\": 
-1.1179647649923994, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.17 times less likely to be a match\", \"column_name\": \"dob\", \"value_l\": \"1985-10-15\", \"value_r\": \"1986-09-15\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 5, \"record_number\": 0}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.43742527764250866, \"u_probability\": 0.9448524288198547, \"bayes_factor\": 0.46295618691361595, \"log2_bayes_factor\": -1.1110524282124985, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.16 times less likely to be a match\", \"column_name\": \"city\", \"value_l\": \"London\", \"value_r\": \"Lonodnn\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 6, \"record_number\": 0}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.43742527764250866, \"u_probability\": 0.9448524288198547, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.16 times less likely to be a match\", \"column_name\": \"tf_city\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 7, \"record_number\": 0}, {\"sql_condition\": \"jaro_winkler_similarity(\\\"email_l\\\", \\\"email_r\\\") >= 0.88\", \"label_for_charts\": \"Jaro-Winkler distance of email >= 0.88\", \"m_probability\": 0.21412999464826887, \"u_probability\": 0.0009135769109519858, \"bayes_factor\": 234.38639055045334, \"log2_bayes_factor\": 7.872744993087005, \"comparison_vector_value\": 2, \"bayes_factor_description\": \"If comparison level is `jaro-winkler distance of email >= 0.88` then comparison is 234.39 times more likely to be a match\", \"column_name\": 
\"email\", \"value_l\": \"dmacdoald@riverb-glass.siz\", \"value_r\": \"dracdonald@rivers-glass.biz\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 8, \"record_number\": 0}, {\"sql_condition\": \"jaro_winkler_similarity(\\\"email_l\\\", \\\"email_r\\\") >= 0.88\", \"label_for_charts\": \"Jaro-Winkler distance of email >= 0.88\", \"m_probability\": 0.21412999464826887, \"u_probability\": 0.0009135769109519858, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 2, \"bayes_factor_description\": \"If comparison level is `jaro-winkler distance of email >= 0.88` then comparison is 234.39 times more likely to be a match\", \"column_name\": \"tf_email\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 9, \"record_number\": 0}, {\"column_name\": \"Final score\", \"label_for_charts\": \"Final score\", \"sql_condition\": null, \"log2_bayes_factor\": -4.905727749273497, \"bayes_factor\": 0.03336021161943022, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 10, \"record_number\": 0}, {\"column_name\": \"Prior\", \"label_for_charts\": \"Starting match weight (prior)\", \"sql_condition\": null, \"log2_bayes_factor\": -8.386106589561935, \"bayes_factor\": 0.002989030659078392, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 0, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.20366740135854072, \"u_probability\": 0.9713801052585794, \"bayes_factor\": 0.20966807973108, \"log2_bayes_factor\": -2.2538208553839927, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level 
is `all other comparisons` then comparison is  4.77 times less likely to be a match\", \"column_name\": \"first_name\", \"value_l\": \"Harley\", \"value_r\": \"Kaur\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 1, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.20366740135854072, \"u_probability\": 0.9713801052585794, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.77 times less likely to be a match\", \"column_name\": \"tf_first_name\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 2, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 0.22323745980048332, \"log2_bayes_factor\": -2.163348959593668, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"surname\", \"value_l\": \"Kra\", \"value_r\": \"Harley\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 3, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"tf_surname\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 4, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 
0.4592001552927498, \"u_probability\": 0.9966506506506506, \"bayes_factor\": 0.4607433457179473, \"log2_bayes_factor\": -1.1179647649923994, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.17 times less likely to be a match\", \"column_name\": \"dob\", \"value_l\": \"1973-10-27\", \"value_r\": \"1983-11-24\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 5, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.43742527764250866, \"u_probability\": 0.9448524288198547, \"bayes_factor\": 0.46295618691361595, \"log2_bayes_factor\": -1.1110524282124985, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.16 times less likely to be a match\", \"column_name\": \"city\", \"value_l\": \"Mancheeser\", \"value_r\": \"Manchester\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 6, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.43742527764250866, \"u_probability\": 0.9448524288198547, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.16 times less likely to be a match\", \"column_name\": \"tf_city\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 7, \"record_number\": 1}, {\"sql_condition\": \"jaro_winkler_similarity(\\\"email_l\\\", \\\"email_r\\\") >= 0.88\", \"label_for_charts\": \"Jaro-Winkler distance of email >= 0.88\", \"m_probability\": 0.21412999464826887, \"u_probability\": 0.0009135769109519858, \"bayes_factor\": 234.38639055045334, \"log2_bayes_factor\": 7.872744993087005, \"comparison_vector_value\": 2, \"bayes_factor_description\": \"If comparison level is 
`jaro-winkler distance of email >= 0.88` then comparison is 234.39 times more likely to be a match\", \"column_name\": \"email\", \"value_l\": \"harleyk@houston.net\", \"value_r\": \"haoleyk@hrustonnet\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 8, \"record_number\": 1}, {\"sql_condition\": \"jaro_winkler_similarity(\\\"email_l\\\", \\\"email_r\\\") >= 0.88\", \"label_for_charts\": \"Jaro-Winkler distance of email >= 0.88\", \"m_probability\": 0.21412999464826887, \"u_probability\": 0.0009135769109519858, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 2, \"bayes_factor_description\": \"If comparison level is `jaro-winkler distance of email >= 0.88` then comparison is 234.39 times more likely to be a match\", \"column_name\": \"tf_email\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 9, \"record_number\": 1}, {\"column_name\": \"Final score\", \"label_for_charts\": \"Final score\", \"sql_condition\": null, \"log2_bayes_factor\": -7.159548604657489, \"bayes_factor\": 0.006994571509668399, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 10, \"record_number\": 1}, {\"column_name\": \"Prior\", \"label_for_charts\": \"Starting match weight (prior)\", \"sql_condition\": null, \"log2_bayes_factor\": -8.386106589561935, \"bayes_factor\": 0.002989030659078392, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 0, \"record_number\": 2}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.20366740135854072, \"u_probability\": 0.9713801052585794, \"bayes_factor\": 0.20966807973108, \"log2_bayes_factor\": 
-2.2538208553839927, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.77 times less likely to be a match\", \"column_name\": \"first_name\", \"value_l\": \"Holly\", \"value_r\": \"Thomson\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 1, \"record_number\": 2}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.20366740135854072, \"u_probability\": 0.9713801052585794, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.77 times less likely to be a match\", \"column_name\": \"tf_first_name\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 2, \"record_number\": 2}, {\"sql_condition\": \"\\\"surname_l\\\" IS NULL OR \\\"surname_r\\\" IS NULL\", \"label_for_charts\": \"surname is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `surname is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"surname\", \"value_l\": \"None\", \"value_r\": \"Holly\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 3, \"record_number\": 2}, {\"sql_condition\": \"\\\"surname_l\\\" IS NULL OR \\\"surname_r\\\" IS NULL\", \"label_for_charts\": \"surname is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `surname is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_surname\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 4, \"record_number\": 2}, {\"sql_condition\": \"\\\"dob_l\\\" = \\\"dob_r\\\"\", \"label_for_charts\": \"Exact match on dob\", 
\"m_probability\": 0.39142166528829947, \"u_probability\": 0.0017477477477477479, \"bayes_factor\": 223.95775694330536, \"log2_bayes_factor\": 7.807082825648327, \"comparison_vector_value\": 2, \"bayes_factor_description\": \"If comparison level is `exact match on dob` then comparison is 223.96 times more likely to be a match\", \"column_name\": \"dob\", \"value_l\": \"1985-05-21\", \"value_r\": \"1985-05-21\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 5, \"record_number\": 2}, {\"sql_condition\": \"\\\"city_l\\\" = \\\"city_r\\\"\", \"label_for_charts\": \"Exact match on city\", \"m_probability\": 0.5625747223574914, \"u_probability\": 0.0551475711801453, \"bayes_factor\": 10.201260188228096, \"log2_bayes_factor\": 3.350675477967206, \"comparison_vector_value\": 1, \"bayes_factor_description\": \"If comparison level is `exact match on city` then comparison is 10.20 times more likely to be a match\", \"column_name\": \"city\", \"value_l\": \"London\", \"value_r\": \"London\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 6, \"record_number\": 2}, {\"sql_condition\": \"\\\"city_l\\\" = \\\"city_r\\\"\", \"label_for_charts\": \"Term freq adjustment on city with weight {cl.tf_adjustment_weight}\", \"m_probability\": null, \"u_probability\": null, \"bayes_factor\": 0.2591617073379083, \"log2_bayes_factor\": -1.948075527570922, \"comparison_vector_value\": 1, \"bayes_factor_description\": \"Term frequency adjustment on city makes comparison  3.86 times less likely to be a match\", \"column_name\": \"tf_city\", \"value_l\": \"London\", \"value_r\": \"London\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 7, \"record_number\": 2}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more 
likely to be a match\", \"column_name\": \"email\", \"value_l\": \"None\", \"value_r\": \"hollythomson3@levine-jones.com\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 8, \"record_number\": 2}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_email\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 9, \"record_number\": 2}, {\"column_name\": \"Final score\", \"label_for_charts\": \"Final score\", \"sql_condition\": null, \"log2_bayes_factor\": -1.4302446689013173, \"bayes_factor\": 0.3710679573273406, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 10, \"record_number\": 2}, {\"column_name\": \"Prior\", \"label_for_charts\": \"Starting match weight (prior)\", \"sql_condition\": null, \"log2_bayes_factor\": -8.386106589561935, \"bayes_factor\": 0.002989030659078392, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 0, \"record_number\": 3}, {\"sql_condition\": \"jaro_winkler_similarity(\\\"first_name_l\\\", \\\"first_name_r\\\") >= 0.7\", \"label_for_charts\": \"Jaro-Winkler distance of first_name >= 0.7\", \"m_probability\": 0.07908610771504865, \"u_probability\": 0.018934945558406913, \"bayes_factor\": 4.176727494203714, \"log2_bayes_factor\": 2.06237301958744, \"comparison_vector_value\": 1, \"bayes_factor_description\": \"If comparison level is `jaro-winkler distance of 
first_name >= 0.7` then comparison is 4.18 times more likely to be a match\", \"column_name\": \"first_name\", \"value_l\": \"Freya\", \"value_r\": \"Fera\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 1, \"record_number\": 3}, {\"sql_condition\": \"jaro_winkler_similarity(\\\"first_name_l\\\", \\\"first_name_r\\\") >= 0.7\", \"label_for_charts\": \"Jaro-Winkler distance of first_name >= 0.7\", \"m_probability\": 0.07908610771504865, \"u_probability\": 0.018934945558406913, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 1, \"bayes_factor_description\": \"If comparison level is `jaro-winkler distance of first_name >= 0.7` then comparison is 4.18 times more likely to be a match\", \"column_name\": \"tf_first_name\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 2, \"record_number\": 3}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 0.22323745980048332, \"log2_bayes_factor\": -2.163348959593668, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"surname\", \"value_l\": \"Shah\", \"value_r\": \"hhS\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 3, \"record_number\": 3}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"tf_surname\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 4, \"record_number\": 3}, 
{\"sql_condition\": \"\\\"dob_l\\\" = \\\"dob_r\\\"\", \"label_for_charts\": \"Exact match on dob\", \"m_probability\": 0.39142166528829947, \"u_probability\": 0.0017477477477477479, \"bayes_factor\": 223.95775694330536, \"log2_bayes_factor\": 7.807082825648327, \"comparison_vector_value\": 2, \"bayes_factor_description\": \"If comparison level is `exact match on dob` then comparison is 223.96 times more likely to be a match\", \"column_name\": \"dob\", \"value_l\": \"1970-12-17\", \"value_r\": \"1970-12-17\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 5, \"record_number\": 3}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.43742527764250866, \"u_probability\": 0.9448524288198547, \"bayes_factor\": 0.46295618691361595, \"log2_bayes_factor\": -1.1110524282124985, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.16 times less likely to be a match\", \"column_name\": \"city\", \"value_l\": \"Londodn\", \"value_r\": \"London\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 6, \"record_number\": 3}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.43742527764250866, \"u_probability\": 0.9448524288198547, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.16 times less likely to be a match\", \"column_name\": \"tf_city\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 7, \"record_number\": 3}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 
1.00 times more likely to be a match\", \"column_name\": \"email\", \"value_l\": \"None\", \"value_r\": \"f.s@flynn.com\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 8, \"record_number\": 3}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_email\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 9, \"record_number\": 3}, {\"column_name\": \"Final score\", \"label_for_charts\": \"Final score\", \"sql_condition\": null, \"log2_bayes_factor\": -1.7910521321323347, \"bayes_factor\": 0.28896123476631475, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 10, \"record_number\": 3}, {\"column_name\": \"Prior\", \"label_for_charts\": \"Starting match weight (prior)\", \"sql_condition\": null, \"log2_bayes_factor\": -8.386106589561935, \"bayes_factor\": 0.002989030659078392, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 0, \"record_number\": 4}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.20366740135854072, \"u_probability\": 0.9713801052585794, \"bayes_factor\": 0.20966807973108, \"log2_bayes_factor\": -2.2538208553839927, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.77 times less likely to be a match\", \"column_name\": \"first_name\", 
\"value_l\": \"Freya\", \"value_r\": \"Shah\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 1, \"record_number\": 4}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.20366740135854072, \"u_probability\": 0.9713801052585794, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.77 times less likely to be a match\", \"column_name\": \"tf_first_name\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 2, \"record_number\": 4}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 0.22323745980048332, \"log2_bayes_factor\": -2.163348959593668, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"surname\", \"value_l\": \"Shah\", \"value_r\": \"Freya\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 3, \"record_number\": 4}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"tf_surname\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 4, \"record_number\": 4}, {\"sql_condition\": \"\\\"dob_l\\\" = \\\"dob_r\\\"\", \"label_for_charts\": \"Exact match on dob\", \"m_probability\": 0.39142166528829947, \"u_probability\": 0.0017477477477477479, \"bayes_factor\": 223.95775694330536, 
\"log2_bayes_factor\": 7.807082825648327, \"comparison_vector_value\": 2, \"bayes_factor_description\": \"If comparison level is `exact match on dob` then comparison is 223.96 times more likely to be a match\", \"column_name\": \"dob\", \"value_l\": \"1970-12-17\", \"value_r\": \"1970-12-17\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 5, \"record_number\": 4}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.43742527764250866, \"u_probability\": 0.9448524288198547, \"bayes_factor\": 0.46295618691361595, \"log2_bayes_factor\": -1.1110524282124985, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.16 times less likely to be a match\", \"column_name\": \"city\", \"value_l\": \"Londodn\", \"value_r\": \"London\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 6, \"record_number\": 4}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.43742527764250866, \"u_probability\": 0.9448524288198547, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.16 times less likely to be a match\", \"column_name\": \"tf_city\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 7, \"record_number\": 4}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"email\", \"value_l\": \"None\", \"value_r\": \"f.s@flynn.com\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 8, \"record_number\": 4}, 
{\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_email\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 9, \"record_number\": 4}, {\"column_name\": \"Final score\", \"label_for_charts\": \"Final score\", \"sql_condition\": null, \"log2_bayes_factor\": -6.107246007103768, \"bayes_factor\": 0.014505602123732897, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 10, \"record_number\": 4}]}}, {\"mode\": \"vega-lite\"});\n",
              "</script>"
            ],
            "text/plain": [
              "alt.LayerChart(...)"
            ]
          },
          "execution_count": 5,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "# Find labelled pairs where the model disagrees with the ground truth.\n",
        "# Here we keep only false negatives: true matches scored below the threshold.\n",
        "splink_df = linker.evaluation.prediction_errors_from_labels_table(\n",
        "    labels_table, include_false_negatives=True, include_false_positives=False\n",
        ")\n",
        "false_negatives = splink_df.as_record_dict(limit=5)\n",
        "linker.visualisations.waterfall_chart(false_negatives)"
      ]
    },
    {
      "cell_type": "markdown",
      "id": "c4fe30d3",
      "metadata": {},
      "source": [
        "### False positives"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 6,
      "id": "f8f816e2",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:19.801020Z",
          "iopub.status.busy": "2024-07-18T13:59:19.800643Z",
          "iopub.status.idle": "2024-07-18T13:59:20.908287Z",
          "shell.execute_reply": "2024-07-18T13:59:20.907746Z"
        }
      },
      "outputs": [
        {
          "data": {
            "text/html": [
              "\n",
              "<style>\n",
              "  #altair-viz-7503653349884d74944ce60f61a7cabb.vega-embed {\n",
              "    width: 100%;\n",
              "    display: flex;\n",
              "  }\n",
              "\n",
              "  #altair-viz-7503653349884d74944ce60f61a7cabb.vega-embed details,\n",
              "  #altair-viz-7503653349884d74944ce60f61a7cabb.vega-embed details summary {\n",
              "    position: relative;\n",
              "  }\n",
              "</style>\n",
              "<div id=\"altair-viz-7503653349884d74944ce60f61a7cabb\"></div>\n",
              "<script type=\"text/javascript\">\n",
              "  var VEGA_DEBUG = (typeof VEGA_DEBUG == \"undefined\") ? {} : VEGA_DEBUG;\n",
              "  (function(spec, embedOpt){\n",
              "    let outputDiv = document.currentScript.previousElementSibling;\n",
              "    if (outputDiv.id !== \"altair-viz-7503653349884d74944ce60f61a7cabb\") {\n",
              "      outputDiv = document.getElementById(\"altair-viz-7503653349884d74944ce60f61a7cabb\");\n",
              "    }\n",
              "    const paths = {\n",
              "      \"vega\": \"https://cdn.jsdelivr.net/npm/vega@5?noext\",\n",
              "      \"vega-lib\": \"https://cdn.jsdelivr.net/npm/vega-lib?noext\",\n",
              "      \"vega-lite\": \"https://cdn.jsdelivr.net/npm/vega-lite@5.17.0?noext\",\n",
              "      \"vega-embed\": \"https://cdn.jsdelivr.net/npm/vega-embed@6?noext\",\n",
              "    };\n",
              "\n",
              "    function maybeLoadScript(lib, version) {\n",
              "      var key = `${lib.replace(\"-\", \"\")}_version`;\n",
              "      return (VEGA_DEBUG[key] == version) ?\n",
              "        Promise.resolve(paths[lib]) :\n",
              "        new Promise(function(resolve, reject) {\n",
              "          var s = document.createElement('script');\n",
              "          document.getElementsByTagName(\"head\")[0].appendChild(s);\n",
              "          s.async = true;\n",
              "          s.onload = () => {\n",
              "            VEGA_DEBUG[key] = version;\n",
              "            return resolve(paths[lib]);\n",
              "          };\n",
              "          s.onerror = () => reject(`Error loading script: ${paths[lib]}`);\n",
              "          s.src = paths[lib];\n",
              "        });\n",
              "    }\n",
              "\n",
              "    function showError(err) {\n",
              "      outputDiv.innerHTML = `<div class=\"error\" style=\"color:red;\">${err}</div>`;\n",
              "      throw err;\n",
              "    }\n",
              "\n",
              "    function displayChart(vegaEmbed) {\n",
              "      vegaEmbed(outputDiv, spec, embedOpt)\n",
              "        .catch(err => showError(`Javascript Error: ${err.message}<br>This usually means there's a typo in your chart specification. See the javascript console for the full traceback.`));\n",
              "    }\n",
              "\n",
              "    if(typeof define === \"function\" && define.amd) {\n",
              "      requirejs.config({paths});\n",
              "      require([\"vega-embed\"], displayChart, err => showError(`Error loading script: ${err.message}`));\n",
              "    } else {\n",
              "      maybeLoadScript(\"vega\", \"5\")\n",
              "        .then(() => maybeLoadScript(\"vega-lite\", \"5.17.0\"))\n",
              "        .then(() => maybeLoadScript(\"vega-embed\", \"6\"))\n",
              "        .catch(showError)\n",
              "        .then(() => displayChart(vegaEmbed));\n",
              "    }\n",
              "  })({\"config\": {\"view\": {\"continuousWidth\": 400, \"continuousHeight\": 300}}, \"layer\": [{\"layer\": [{\"mark\": \"rule\", \"encoding\": {\"color\": {\"value\": \"black\"}, \"size\": {\"value\": 0.5}, \"y\": {\"field\": \"zero\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"bar\", \"width\": 60}, \"encoding\": {\"color\": {\"condition\": {\"test\": \"(datum.log2_bayes_factor < 0)\", \"value\": \"red\"}, \"value\": \"green\"}, \"opacity\": {\"condition\": {\"test\": \"datum.column_name == 'Prior match weight' || datum.column_name == 'Final score'\", \"value\": 1}, \"value\": 0.5}, \"tooltip\": [{\"field\": \"column_name\", \"title\": \"Comparison column\", \"type\": \"nominal\"}, {\"field\": \"value_l\", \"title\": \"Value (L)\", \"type\": \"nominal\"}, {\"field\": \"value_r\", \"title\": \"Value (R)\", \"type\": \"nominal\"}, {\"field\": \"label_for_charts\", \"title\": \"Label\", \"type\": \"ordinal\"}, {\"field\": \"sql_condition\", \"title\": \"SQL condition\", \"type\": \"nominal\"}, {\"field\": \"comparison_vector_value\", \"title\": \"Comparison vector value\", \"type\": \"nominal\"}, {\"field\": \"bayes_factor\", \"format\": \",.4f\", \"title\": \"Bayes factor = m/u\", \"type\": \"quantitative\"}, {\"field\": \"log2_bayes_factor\", \"format\": \",.4f\", \"title\": \"Match weight = log2(m/u)\", \"type\": \"quantitative\"}, {\"field\": \"prob\", \"format\": \".4f\", \"title\": \"Cumulative match probability\", \"type\": \"quantitative\"}, {\"field\": \"bayes_factor_description\", \"title\": \"Match weight description\", \"type\": \"nominal\"}], \"x\": {\"axis\": {\"grid\": true, \"labelAlign\": \"center\", \"labelAngle\": -20, \"labelExpr\": \"datum.value == 'Prior' || datum.value == 'Final score' ? 
'' : datum.value\", \"labelPadding\": 10, \"tickBand\": \"extent\", \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"axis\": {\"grid\": false, \"orient\": \"left\", \"title\": \"Match Weight\"}, \"field\": \"previous_sum\", \"type\": \"quantitative\"}, \"y2\": {\"field\": \"sum\"}}}, {\"mark\": {\"type\": \"text\", \"fontWeight\": \"bold\"}, \"encoding\": {\"color\": {\"value\": \"white\"}, \"text\": {\"condition\": {\"test\": \"abs(datum.log2_bayes_factor) > 1\", \"field\": \"log2_bayes_factor\", \"format\": \".2f\", \"type\": \"nominal\"}, \"value\": \"\"}, \"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"axis\": {\"orient\": \"left\"}, \"field\": \"center\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"text\", \"baseline\": \"bottom\", \"dy\": -25, \"fontWeight\": \"bold\"}, \"encoding\": {\"color\": {\"value\": \"black\"}, \"text\": {\"field\": \"column_name\", \"type\": \"nominal\"}, \"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"field\": \"sum_top\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"text\", \"baseline\": \"bottom\", \"dy\": -13, \"fontSize\": 8}, \"encoding\": {\"color\": {\"value\": \"grey\"}, \"text\": {\"field\": \"value_l\", \"type\": \"nominal\"}, \"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"field\": \"sum_top\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"text\", \"baseline\": \"bottom\", \"dy\": -5, \"fontSize\": 8}, \"encoding\": {\"color\": {\"value\": \"grey\"}, 
\"text\": {\"field\": \"value_r\", \"type\": \"nominal\"}, \"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"y\": {\"field\": \"sum_top\", \"type\": \"quantitative\"}}}]}, {\"mark\": {\"type\": \"rule\", \"color\": \"black\", \"strokeWidth\": 2, \"x2Offset\": 30, \"xOffset\": -30}, \"encoding\": {\"x\": {\"axis\": {\"labelAngle\": -20, \"title\": \"Column\"}, \"field\": \"column_name\", \"sort\": {\"field\": \"bar_sort_order\", \"order\": \"ascending\"}, \"type\": \"nominal\"}, \"x2\": {\"field\": \"lead\"}, \"y\": {\"axis\": {\"labelExpr\": \"format(1 / (1 + pow(2, -1*datum.value)), '.2r')\", \"orient\": \"right\", \"title\": \"Probability\"}, \"field\": \"sum\", \"scale\": {\"zero\": false}, \"type\": \"quantitative\"}}}], \"data\": {\"name\": \"data-18ee183dae62dea90a26c7d0c3c92a7f\"}, \"height\": 450, \"params\": [{\"name\": \"record_number\", \"bind\": {\"input\": \"range\", \"max\": 1, \"min\": 0, \"step\": 1}, \"value\": 0}], \"resolve\": {\"axis\": {\"y\": \"independent\"}}, \"title\": {\"text\": \"Match weights waterfall chart\", \"subtitle\": \"How each comparison contributes to the final match score\"}, \"transform\": [{\"filter\": \"(datum.record_number == record_number)\"}, {\"filter\": \"(datum.bayes_factor !== 1.0)\"}, {\"window\": [{\"op\": \"sum\", \"field\": \"log2_bayes_factor\", \"as\": \"sum\"}, {\"op\": \"lead\", \"field\": \"column_name\", \"as\": \"lead\"}], \"frame\": [null, 0]}, {\"calculate\": \"datum.column_name === \\\"Final score\\\" ? datum.sum - datum.log2_bayes_factor : datum.sum\", \"as\": \"sum\"}, {\"calculate\": \"datum.lead === null ? datum.column_name : datum.lead\", \"as\": \"lead\"}, {\"calculate\": \"datum.column_name === \\\"Final score\\\" || datum.column_name === \\\"Prior match weight\\\" ? 
0 : datum.sum - datum.log2_bayes_factor\", \"as\": \"previous_sum\"}, {\"calculate\": \"datum.sum > datum.previous_sum ? datum.column_name : \\\"\\\"\", \"as\": \"top_label\"}, {\"calculate\": \"datum.sum < datum.previous_sum ? datum.column_name : \\\"\\\"\", \"as\": \"bottom_label\"}, {\"calculate\": \"datum.sum > datum.previous_sum ? datum.sum : datum.previous_sum\", \"as\": \"sum_top\"}, {\"calculate\": \"datum.sum < datum.previous_sum ? datum.sum : datum.previous_sum\", \"as\": \"sum_bottom\"}, {\"calculate\": \"(datum.sum + datum.previous_sum) / 2\", \"as\": \"center\"}, {\"calculate\": \"(datum.log2_bayes_factor > 0 ? \\\"+\\\" : \\\"\\\") + datum.log2_bayes_factor\", \"as\": \"text_log2_bayes_factor\"}, {\"calculate\": \"datum.sum < datum.previous_sum ? 4 : -4\", \"as\": \"dy\"}, {\"calculate\": \"datum.sum < datum.previous_sum ? \\\"top\\\" : \\\"bottom\\\"\", \"as\": \"baseline\"}, {\"calculate\": \"1. / (1 + pow(2, -1.*datum.sum))\", \"as\": \"prob\"}, {\"calculate\": \"0*datum.sum\", \"as\": \"zero\"}], \"width\": {\"step\": 75}, \"$schema\": \"https://vega.github.io/schema/vega-lite/v5.9.3.json\", \"datasets\": {\"data-18ee183dae62dea90a26c7d0c3c92a7f\": [{\"column_name\": \"Prior\", \"label_for_charts\": \"Starting match weight (prior)\", \"sql_condition\": null, \"log2_bayes_factor\": -8.386106589561935, \"bayes_factor\": 0.002989030659078392, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 0, \"record_number\": 0}, {\"sql_condition\": \"\\\"first_name_l\\\" IS NULL OR \\\"first_name_r\\\" IS NULL\", \"label_for_charts\": \"first_name is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `first_name is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": 
\"first_name\", \"value_l\": \"None\", \"value_r\": \"None\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 1, \"record_number\": 0}, {\"sql_condition\": \"\\\"first_name_l\\\" IS NULL OR \\\"first_name_r\\\" IS NULL\", \"label_for_charts\": \"first_name is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `first_name is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_first_name\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 2, \"record_number\": 0}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 0.22323745980048332, \"log2_bayes_factor\": -2.163348959593668, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"surname\", \"value_l\": \"Smith\", \"value_r\": \"Bron\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 3, \"record_number\": 0}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"tf_surname\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 4, \"record_number\": 0}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.4592001552927498, \"u_probability\": 0.9966506506506506, \"bayes_factor\": 0.4607433457179473, \"log2_bayes_factor\": -1.1179647649923994, 
\"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.17 times less likely to be a match\", \"column_name\": \"dob\", \"value_l\": \"1981-10-11\", \"value_r\": \"1983-11-06\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 5, \"record_number\": 0}, {\"sql_condition\": \"\\\"city_l\\\" = \\\"city_r\\\"\", \"label_for_charts\": \"Exact match on city\", \"m_probability\": 0.5625747223574914, \"u_probability\": 0.0551475711801453, \"bayes_factor\": 10.201260188228096, \"log2_bayes_factor\": 3.350675477967206, \"comparison_vector_value\": 1, \"bayes_factor_description\": \"If comparison level is `exact match on city` then comparison is 10.20 times more likely to be a match\", \"column_name\": \"city\", \"value_l\": \"Bradford\", \"value_r\": \"Bradford\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 6, \"record_number\": 0}, {\"sql_condition\": \"\\\"city_l\\\" = \\\"city_r\\\"\", \"label_for_charts\": \"Term freq adjustment on city with weight {cl.tf_adjustment_weight}\", \"m_probability\": null, \"u_probability\": null, \"bayes_factor\": 4.981663929939793, \"log2_bayes_factor\": 2.31662769862349, \"comparison_vector_value\": 1, \"bayes_factor_description\": \"Term frequency adjustment on city makes comparison 4.98 times more likely to be a match\", \"column_name\": \"tf_city\", \"value_l\": \"Bradford\", \"value_r\": \"Bradford\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 7, \"record_number\": 0}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"email\", \"value_l\": \"muhammadsmith@brooks.com\", \"value_r\": \"None\", \"term_frequency_adjustment\": false, 
\"bar_sort_order\": 8, \"record_number\": 0}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_email\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 9, \"record_number\": 0}, {\"column_name\": \"Final score\", \"label_for_charts\": \"Final score\", \"sql_condition\": null, \"log2_bayes_factor\": -6.000117137557306, \"bayes_factor\": 0.015623731402008185, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 10, \"record_number\": 0}, {\"column_name\": \"Prior\", \"label_for_charts\": \"Starting match weight (prior)\", \"sql_condition\": null, \"log2_bayes_factor\": -8.386106589561935, \"bayes_factor\": 0.002989030659078392, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 0, \"record_number\": 1}, {\"sql_condition\": \"jaro_winkler_similarity(\\\"first_name_l\\\", \\\"first_name_r\\\") >= 0.92\", \"label_for_charts\": \"Jaro-Winkler distance of first_name >= 0.92\", \"m_probability\": 0.15176057384758357, \"u_probability\": 0.0023429457903817435, \"bayes_factor\": 64.77340383656795, \"log2_bayes_factor\": 6.0173296544561365, \"comparison_vector_value\": 3, \"bayes_factor_description\": \"If comparison level is `jaro-winkler distance of first_name >= 0.92` then comparison is 64.77 times more likely to be a match\", \"column_name\": \"first_name\", \"value_l\": \"Thoas\", \"value_r\": \"Thomas\", 
\"term_frequency_adjustment\": false, \"bar_sort_order\": 1, \"record_number\": 1}, {\"sql_condition\": \"jaro_winkler_similarity(\\\"first_name_l\\\", \\\"first_name_r\\\") >= 0.92\", \"label_for_charts\": \"Jaro-Winkler distance of first_name >= 0.92\", \"m_probability\": 0.15176057384758357, \"u_probability\": 0.0023429457903817435, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 3, \"bayes_factor_description\": \"If comparison level is `jaro-winkler distance of first_name >= 0.92` then comparison is 64.77 times more likely to be a match\", \"column_name\": \"tf_first_name\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 2, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 0.22323745980048332, \"log2_bayes_factor\": -2.163348959593668, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"surname\", \"value_l\": \"Green\", \"value_r\": \"Gabriel\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 3, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.21797126539158398, \"u_probability\": 0.9764098981702893, \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  4.48 times less likely to be a match\", \"column_name\": \"tf_surname\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 4, \"record_number\": 1}, {\"sql_condition\": \"ELSE\", \"label_for_charts\": \"All other comparisons\", \"m_probability\": 0.4592001552927498, \"u_probability\": 
0.9966506506506506, \"bayes_factor\": 0.4607433457179473, \"log2_bayes_factor\": -1.1179647649923994, \"comparison_vector_value\": 0, \"bayes_factor_description\": \"If comparison level is `all other comparisons` then comparison is  2.17 times less likely to be a match\", \"column_name\": \"dob\", \"value_l\": \"1974-10-05\", \"value_r\": \"1976-08-15\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 5, \"record_number\": 1}, {\"sql_condition\": \"\\\"city_l\\\" IS NULL OR \\\"city_r\\\" IS NULL\", \"label_for_charts\": \"city is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `city is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"city\", \"value_l\": \"London\", \"value_r\": \"None\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 6, \"record_number\": 1}, {\"sql_condition\": \"\\\"city_l\\\" IS NULL OR \\\"city_r\\\" IS NULL\", \"label_for_charts\": \"city is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `city is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_city\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 7, \"record_number\": 1}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", \"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"email\", \"value_l\": \"thomas.green@clark.org\", \"value_r\": \"None\", \"term_frequency_adjustment\": false, \"bar_sort_order\": 8, \"record_number\": 1}, {\"sql_condition\": \"\\\"email_l\\\" IS NULL OR \\\"email_r\\\" IS NULL\", 
\"label_for_charts\": \"email is NULL\", \"bayes_factor\": 1.0, \"log2_bayes_factor\": 0.0, \"comparison_vector_value\": -1, \"bayes_factor_description\": \"If comparison level is `email is null` then comparison is 1.00 times more likely to be a match\", \"column_name\": \"tf_email\", \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": true, \"bar_sort_order\": 9, \"record_number\": 1}, {\"column_name\": \"Final score\", \"label_for_charts\": \"Final score\", \"sql_condition\": null, \"log2_bayes_factor\": -5.650090659691866, \"bayes_factor\": 0.019913758371815367, \"comparison_vector_value\": null, \"m_probability\": null, \"u_probability\": null, \"bayes_factor_description\": null, \"value_l\": \"\", \"value_r\": \"\", \"term_frequency_adjustment\": null, \"bar_sort_order\": 10, \"record_number\": 1}]}}, {\"mode\": \"vega-lite\"});\n",
              "</script>"
            ],
            "text/plain": [
              "alt.LayerChart(...)"
            ]
          },
          "execution_count": 6,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "# Note: I've picked a threshold match probability of 0.01 here because otherwise,\n",
        "# in this simple example, there would be no false positives\n",
        "splink_df = linker.evaluation.prediction_errors_from_labels_table(\n",
        "    labels_table,\n",
        "    include_false_negatives=False,\n",
        "    include_false_positives=True,\n",
        "    threshold_match_probability=0.01,\n",
        ")\n",
        "false_positives = splink_df.as_record_dict(limit=5)\n",
        "linker.visualisations.waterfall_chart(false_positives)"
      ]
    },
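    {
      "cell_type": "markdown",
      "id": "f2a91c10",
      "metadata": {},
      "source": [
        "The false positive and false negative counts identified against the labels feed into standard accuracy metrics. As a minimal sketch of how those metrics are derived (a plain calculation from confusion-matrix counts, not a Splink API call — the counts are taken from one row of the underlying data of the threshold selection chart in the next section), precision, recall and F1 can be computed directly:\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "id": "f2a91c11",
      "metadata": {},
      "outputs": [],
      "source": [
        "# Confusion-matrix counts from one row of the threshold selection chart's\n",
        "# underlying data (its lowest match weight threshold)\n",
        "tp, fp, tn, fn = 1709, 42, 1103, 322\n",
        "\n",
        "precision = tp / (tp + fp)  # of predicted matches, how many are true matches\n",
        "recall = tp / (tp + fn)  # of true matches, how many were found\n",
        "f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two\n",
        "\n",
        "print(f\"precision={precision:.3f}, recall={recall:.3f}, f1={f1:.3f}\")"
      ]
    },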
    {
      "cell_type": "markdown",
      "id": "030c4cde",
      "metadata": {},
      "source": [
        "## Threshold selection chart\n",
        "\n",
        "Splink includes an interactive dashboard that shows key accuracy statistics — precision, recall, F1 and the confusion matrix — as a function of the chosen match threshold:\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 7,
      "id": "e83d9645",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:20.911371Z",
          "iopub.status.busy": "2024-07-18T13:59:20.911132Z",
          "iopub.status.idle": "2024-07-18T13:59:22.696520Z",
          "shell.execute_reply": "2024-07-18T13:59:22.695643Z"
        }
      },
      "outputs": [
        {
          "data": {
            "text/html": [
              "\n",
              "<style>\n",
              "  #altair-viz-285b0c378a254f50857516179e15d53c.vega-embed {\n",
              "    width: 100%;\n",
              "    display: flex;\n",
              "  }\n",
              "\n",
              "  #altair-viz-285b0c378a254f50857516179e15d53c.vega-embed details,\n",
              "  #altair-viz-285b0c378a254f50857516179e15d53c.vega-embed details summary {\n",
              "    position: relative;\n",
              "  }\n",
              "</style>\n",
              "<div id=\"altair-viz-285b0c378a254f50857516179e15d53c\"></div>\n",
              "<script type=\"text/javascript\">\n",
              "  var VEGA_DEBUG = (typeof VEGA_DEBUG == \"undefined\") ? {} : VEGA_DEBUG;\n",
              "  (function(spec, embedOpt){\n",
              "    let outputDiv = document.currentScript.previousElementSibling;\n",
              "    if (outputDiv.id !== \"altair-viz-285b0c378a254f50857516179e15d53c\") {\n",
              "      outputDiv = document.getElementById(\"altair-viz-285b0c378a254f50857516179e15d53c\");\n",
              "    }\n",
              "    const paths = {\n",
              "      \"vega\": \"https://cdn.jsdelivr.net/npm/vega@5?noext\",\n",
              "      \"vega-lib\": \"https://cdn.jsdelivr.net/npm/vega-lib?noext\",\n",
              "      \"vega-lite\": \"https://cdn.jsdelivr.net/npm/vega-lite@5.17.0?noext\",\n",
              "      \"vega-embed\": \"https://cdn.jsdelivr.net/npm/vega-embed@6?noext\",\n",
              "    };\n",
              "\n",
              "    function maybeLoadScript(lib, version) {\n",
              "      var key = `${lib.replace(\"-\", \"\")}_version`;\n",
              "      return (VEGA_DEBUG[key] == version) ?\n",
              "        Promise.resolve(paths[lib]) :\n",
              "        new Promise(function(resolve, reject) {\n",
              "          var s = document.createElement('script');\n",
              "          document.getElementsByTagName(\"head\")[0].appendChild(s);\n",
              "          s.async = true;\n",
              "          s.onload = () => {\n",
              "            VEGA_DEBUG[key] = version;\n",
              "            return resolve(paths[lib]);\n",
              "          };\n",
              "          s.onerror = () => reject(`Error loading script: ${paths[lib]}`);\n",
              "          s.src = paths[lib];\n",
              "        });\n",
              "    }\n",
              "\n",
              "    function showError(err) {\n",
              "      outputDiv.innerHTML = `<div class=\"error\" style=\"color:red;\">${err}</div>`;\n",
              "      throw err;\n",
              "    }\n",
              "\n",
              "    function displayChart(vegaEmbed) {\n",
              "      vegaEmbed(outputDiv, spec, embedOpt)\n",
              "        .catch(err => showError(`Javascript Error: ${err.message}<br>This usually means there's a typo in your chart specification. See the javascript console for the full traceback.`));\n",
              "    }\n",
              "\n",
              "    if(typeof define === \"function\" && define.amd) {\n",
              "      requirejs.config({paths});\n",
              "      require([\"vega-embed\"], displayChart, err => showError(`Error loading script: ${err.message}`));\n",
              "    } else {\n",
              "      maybeLoadScript(\"vega\", \"5\")\n",
              "        .then(() => maybeLoadScript(\"vega-lite\", \"5.17.0\"))\n",
              "        .then(() => maybeLoadScript(\"vega-embed\", \"6\"))\n",
              "        .catch(showError)\n",
              "        .then(() => displayChart(vegaEmbed));\n",
              "    }\n",
              "  })({\"config\": {\"view\": {\"continuousWidth\": 300, \"continuousHeight\": 300, \"discreteHeight\": {\"step\": 150}, \"discreteWidth\": {\"step\": 150}}, \"axis\": {\"gridWidth\": 0.5, \"labelFontSize\": 12, \"titleFontSize\": 16}, \"axisX\": {\"format\": \"+.0f\", \"grid\": false, \"offset\": 20, \"values\": {\"expr\": \"[-25,-20,-15,-10,-5,0,5,10,15,20,25]\"}}, \"axisY\": {\"title\": \"Match probability threshold\", \"titleFontSize\": 16}, \"concat\": {\"spacing\": 40}}, \"hconcat\": [{\"vconcat\": [{\"layer\": [{\"layer\": [{\"mark\": {\"type\": \"rule\"}, \"encoding\": {\"opacity\": {\"condition\": {\"param\": \"threshold\", \"value\": 0.3, \"empty\": false}, \"value\": 0}, \"x\": {\"axis\": {\"orient\": \"bottom\"}, \"field\": \"truth_threshold\", \"scale\": {\"nice\": false}, \"title\": null, \"type\": \"quantitative\"}}, \"params\": [{\"name\": \"threshold\", \"select\": {\"type\": \"point\", \"encodings\": [\"x\"], \"fields\": [\"truth_threshold\"], \"nearest\": true, \"on\": \"mouseover\", \"toggle\": false}, \"value\": null}]}, {\"mark\": {\"type\": \"rule\"}, \"encoding\": {\"opacity\": {\"condition\": {\"param\": \"threshold\", \"value\": 0.3, \"empty\": false}, \"value\": 0}, \"y\": {\"axis\": {\"orient\": \"right\"}, \"field\": \"match_probability\", \"title\": \" \", \"type\": \"quantitative\"}}, \"params\": [{\"name\": \"prob\", \"select\": {\"type\": \"point\", \"encodings\": [\"y\"], \"fields\": [\"match_probability\"], \"nearest\": true, \"on\": \"mouseover\", \"toggle\": false}}]}]}, {\"layer\": [{\"mark\": {\"type\": \"text\", \"fontSize\": 14, \"fontWeight\": \"bold\", \"xOffset\": 25, \"yOffset\": 10}, \"encoding\": {\"text\": {\"aggregate\": \"min\", \"field\": \"truth_threshold\", \"format\": \"+.2f\"}, \"y\": {\"axis\": {\"orient\": \"left\"}, \"field\": \"match_probability\", \"title\": \"Match probability threshold\", \"type\": \"quantitative\"}}, \"transform\": [{\"filter\": {\"param\": \"threshold\", \"empty\": 
false}}]}, {\"mark\": {\"type\": \"text\", \"fontSize\": 14, \"xOffset\": -25, \"yOffset\": -10}, \"encoding\": {\"text\": {\"aggregate\": \"min\", \"field\": \"match_probability\", \"format\": \".3f\"}}, \"transform\": [{\"filter\": {\"param\": \"threshold\", \"empty\": false}}]}, {\"mark\": {\"type\": \"line\", \"color\": \"red\", \"opacity\": 0.5}}, {\"mark\": {\"type\": \"line\", \"color\": \"green\", \"opacity\": 0.5, \"strokeWidth\": 3}, \"transform\": [{\"filter\": \"datum.truth_threshold >= threshold.truth_threshold\"}]}, {\"mark\": {\"type\": \"point\", \"color\": \"green\", \"size\": 100}, \"encoding\": {\"opacity\": {\"condition\": {\"param\": \"threshold\", \"value\": 1, \"empty\": false}, \"value\": 0}}}], \"encoding\": {\"x\": {\"field\": \"truth_threshold\", \"type\": \"quantitative\", \"title\": \"Match weight threshold\", \"axis\": {\"orient\": \"top\"}}, \"y\": {\"field\": \"match_probability\", \"type\": \"quantitative\", \"title\": \"Match probability threshold\", \"axis\": {\"orient\": \"left\", \"titlePadding\": 10}}}}, {\"mark\": {\"type\": \"text\", \"align\": \"left\", \"color\": \"red\", \"fontSize\": 12, \"text\": \"Non-match\", \"x\": 0, \"y\": \"height\", \"yOffset\": 10}, \"data\": {\"values\": [{}]}}, {\"mark\": {\"type\": \"text\", \"align\": \"right\", \"color\": \"green\", \"fontSize\": 12, \"fontWeight\": \"bold\", \"text\": \"Match\", \"x\": \"width\", \"y\": 0, \"yOffset\": -10}, \"data\": {\"values\": [{}]}}], \"description\": \"Match weight vs probability\"}, {\"hconcat\": [{\"layer\": [{\"mark\": {\"type\": \"rect\", \"opacity\": 0.5}, \"encoding\": {\"color\": {\"field\": \"count\", \"legend\": null, \"scale\": {\"scheme\": \"reds\", \"zero\": true}, \"type\": \"quantitative\"}}, \"transform\": [{\"filter\": \"datum.predicted == 0\"}]}, {\"mark\": {\"type\": \"rect\", \"opacity\": 0.5}, \"encoding\": {\"color\": {\"field\": \"count\", \"legend\": null, \"scale\": {\"scheme\": \"greens\", \"zero\": true}, \"type\": 
\"quantitative\"}}, \"transform\": [{\"filter\": \"datum.predicted == 1\"}]}, {\"mark\": {\"type\": \"text\", \"fontSize\": 14, \"yOffset\": -40}, \"encoding\": {\"color\": {\"condition\": [{\"test\": \"datum.predicted==1 && datum.actual==1\", \"value\": \"darkgreen\"}, {\"test\": \"datum.predicted==0 && datum.actual==0\", \"value\": \"darkred\"}], \"value\": \"black\"}, \"opacity\": {\"condition\": {\"test\": \"datum.predicted != datum.actual\", \"value\": 1}, \"value\": 0.5}, \"text\": {\"field\": \"confusion_label\", \"type\": \"nominal\"}}}, {\"mark\": {\"type\": \"text\", \"fontSize\": 28, \"fontWeight\": \"bold\", \"yOffset\": 10}, \"encoding\": {\"color\": {\"condition\": [{\"test\": \"datum.predicted==1 && datum.actual==1\", \"value\": \"darkgreen\"}, {\"test\": \"datum.predicted==0 && datum.actual==0\", \"value\": \"darkred\"}], \"value\": \"black\"}, \"text\": {\"field\": \"count\", \"format\": \",\", \"type\": \"nominal\"}}}], \"description\": \"Confusion matrix\", \"encoding\": {\"x\": {\"field\": \"actual\", \"type\": \"nominal\", \"title\": \"Actual\", \"axis\": {\"domain\": false, \"labelAngle\": 0, \"labelExpr\": \"datum.label == 1 ? 'Match' : 'Non-match'\", \"labelFontSize\": 18, \"labelPadding\": 10, \"orient\": \"top\", \"ticks\": false, \"titleAngle\": 0, \"titleFontSize\": 20}, \"sort\": \"-x\"}, \"y\": {\"field\": \"predicted\", \"type\": \"nominal\", \"title\": \"Predicted\", \"axis\": {\"domain\": false, \"labelExpr\": \"datum.label == 1 ? 
'Match' : 'Non-match'\", \"labelFontSize\": 18, \"labelPadding\": 10, \"ticks\": false, \"titleAngle\": 0, \"titleFontSize\": 20, \"titlePadding\": -30}, \"sort\": \"-y\"}}, \"resolve\": {\"scale\": {\"color\": \"independent\"}}, \"transform\": [{\"filter\": {\"or\": [{\"param\": \"threshold\", \"empty\": false}, {\"and\": [{\"param\": \"threshold\", \"empty\": true}, \"datum.truth_threshold == datum.median_threshold\"]}]}}]}], \"transform\": [{\"fold\": [\"tp\", \"tn\", \"fp\", \"fn\"], \"as\": [\"label\", \"count\"]}, {\"calculate\": \"datum.label === 'tp' ? 'True Positive (TP)' : datum.label === 'tn' ? 'True Negative (TN)' : datum.label === 'fp' ? 'False Positive (FP)' : 'False Negative (FN)'\", \"as\": \"confusion_label\"}, {\"calculate\": \"datum.label === 'tp' || datum.label === 'fp' ? 1 : 0\", \"as\": \"predicted\"}, {\"calculate\": \"datum.label === 'tp' || datum.label === 'fn' ? 1 : 0\", \"as\": \"actual\"}, {\"joinaggregate\": [{\"op\": \"median\", \"field\": \"truth_threshold\", \"as\": \"median_threshold\"}]}]}]}, {\"layer\": [{\"layer\": [{\"mark\": {\"type\": \"point\", \"size\": 100}, \"encoding\": {\"opacity\": {\"condition\": {\"param\": \"threshold\", \"value\": 1, \"empty\": false}, \"value\": 0}, \"tooltip\": [{\"field\": \"truth_threshold\", \"format\": \".3f\", \"title\": \"Match weight threshold\", \"type\": \"quantitative\"}, {\"field\": \"match_probability\", \"format\": \".3%\", \"title\": \"Match probability threshold\", \"type\": \"quantitative\"}, {\"field\": \"precision\", \"format\": \".4f\", \"title\": \"Precision\", \"type\": \"quantitative\"}, {\"field\": \"recall\", \"format\": \".4f\", \"title\": \"Recall (TPR)\", \"type\": \"quantitative\"}, {\"field\": \"fp_rate\", \"format\": \".4f\", \"title\": \"FPR\", \"type\": \"quantitative\"}], \"x\": {\"axis\": {\"orient\": \"top\"}, \"field\": \"truth_threshold\", \"title\": \"Match weight threshold\"}}, \"params\": [{\"name\": \"metric\", \"select\": {\"type\": \"point\", \"fields\": 
[\"metric\"]}, \"bind\": \"legend\", \"value\": [{\"metric\": \"precision\"}, {\"metric\": \"recall\"}]}, {\"name\": \"threshold\", \"select\": {\"type\": \"point\", \"encodings\": [\"x\"], \"fields\": [\"truth_threshold\"], \"nearest\": true, \"on\": \"mouseover\", \"toggle\": false}, \"value\": null}], \"transform\": [{\"filter\": {\"param\": \"metric\", \"empty\": true}}]}, {\"mark\": {\"type\": \"line\"}, \"encoding\": {\"opacity\": {\"condition\": {\"param\": \"metric\", \"value\": 1}, \"value\": 0.1}, \"x\": {\"axis\": {\"orient\": \"bottom\"}, \"field\": \"truth_threshold\", \"title\": null}}}], \"encoding\": {\"color\": {\"field\": \"metric\", \"type\": \"nominal\", \"sort\": [\"precision\", \"recall\", \"f1\"], \"title\": [\"Performance\", \"Metric\"], \"legend\": {\"fillColor\": \"whitesmoke\", \"labelExpr\": \"{'precision': 'Precision (PPV)', 'recall': 'Recall (TPR)', 'specificity': 'Specificity (TNR)', 'accuracy': 'Accuracy', 'npv': 'NPV', 'f1': 'F1', 'f2': 'F2', 'f0_5': 'F0.5', 'p4': 'P4', 'phi': '\\u03c6 (MCC)'}[datum.value]\", \"labelFontSize\": 14, \"legendX\": 800, \"legendY\": 160, \"orient\": \"none\", \"padding\": 10, \"titleFontSize\": 16, \"titlePadding\": 15}}, \"x\": {\"type\": \"quantitative\", \"field\": \"truth_threshold\"}, \"y\": {\"field\": \"value\", \"type\": \"quantitative\", \"axis\": {\"labelFontSize\": 12, \"title\": \"Performance metric score\", \"titleFontSize\": 18, \"titlePadding\": 10, \"values\": {\"expr\": \"[0.5,0.55,0.6,0.65,0.7,0.75,0.8,0.85,0.9,0.95,1.0]\"}}, \"scale\": {\"domain\": [0.5, 1]}}}}, {\"layer\": [{\"mark\": {\"type\": \"rule\", \"color\": \"gray\"}, \"encoding\": {\"x\": {\"field\": \"truth_threshold\", \"title\": null, \"type\": \"quantitative\"}}}, {\"layer\": [{\"mark\": {\"type\": \"rect\", \"fill\": \"whitesmoke\", \"x\": 200, \"x2\": 10, \"y2Offset\": 20, \"yOffset\": -20}, \"encoding\": {\"y2\": {\"field\": \"score_index\"}}}, {\"mark\": {\"type\": \"text\", \"align\": \"right\", \"baseline\": 
\"middle\", \"fontSize\": 16, \"x\": 200, \"xOffset\": -10}}], \"encoding\": {\"color\": {\"field\": \"metric\", \"sort\": [\"precision\", \"recall\", \"f1\"]}, \"text\": {\"field\": \"y_text\"}, \"y\": {\"field\": \"score_index\", \"type\": \"quantitative\"}}, \"transform\": [{\"filter\": {\"param\": \"metric\", \"empty\": true}}]}, {\"mark\": {\"type\": \"text\", \"fontSize\": 14, \"fontWeight\": \"bold\", \"xOffset\": 20, \"y\": 0, \"yOffset\": -10}, \"encoding\": {\"text\": {\"condition\": {\"param\": \"threshold\", \"aggregate\": \"min\", \"empty\": false, \"field\": \"truth_threshold\", \"format\": \"+.2f\", \"type\": \"nominal\"}, \"value\": \"\"}, \"x\": {\"field\": \"truth_threshold\", \"type\": \"quantitative\"}}}], \"transform\": [{\"filter\": {\"param\": \"threshold\", \"empty\": false}}]}], \"description\": \"Accuracy chart\", \"height\": 700, \"transform\": [{\"fold\": [\"precision\", \"recall\", \"f1\"], \"as\": [\"metric\", \"value\"]}, {\"calculate\": \"0.6375 - 0.025*indexof(['precision', 'recall', 'f1'], datum.metric)\", \"as\": \"score_index\"}, {\"calculate\": \"{'precision': 'Precision (PPV)', 'recall': 'Recall (TPR)', 'specificity': 'Specificity (TNR)', 'accuracy': 'Accuracy', 'npv': 'NPV', 'f1': 'F1', 'f2': 'F2', 'f0_5': 'F0.5', 'p4': 'P4', 'phi': '\\u03c6 (MCC)'}[datum.metric]\", \"as\": \"metric_text\"}, {\"calculate\": \"datum.metric_text + ' = ' + format(datum.value, ',.3g')\", \"as\": \"y_text\"}], \"width\": 500}], \"data\": {\"name\": \"data-e60e69967135d02c37c20dd2774bdbba\"}, \"title\": {\"text\": \"Match Threshold Selection Tool\", \"anchor\": \"middle\", \"baseline\": \"line-bottom\", \"fontSize\": 28, \"subtitle\": [\"Hover over either line graph to show Confusion Matrix (bottom left) and selected performance metrics (right).\", \"\", \"Click a legend value to show a specific evaluation metric. 
Shift + Click to show multiple metrics\"], \"subtitleFontSize\": 14, \"subtitleFontStyle\": \"italic\"}, \"$schema\": \"https://vega.github.io/schema/vega-lite/v5.14.1.json\", \"datasets\": {\"data-e60e69967135d02c37c20dd2774bdbba\": [{\"truth_threshold\": -18.900000281631947, \"match_probability\": 2.0442410704611823e-06, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1709.0, \"tn\": 1103.0, \"fp\": 42.0, \"fn\": 322.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8414574101427869, \"tn_rate\": 0.9633187772925764, \"fp_rate\": 0.03668122270742358, \"fn_rate\": 0.1585425898572132, \"precision\": 0.9760137064534552, \"recall\": 0.8414574101427869, \"specificity\": 0.9633187772925764, \"npv\": 0.7740350877192983, \"accuracy\": 0.8853904282115869, \"f1\": 0.9037546271813856, \"f2\": 0.8653164556962025, \"f0_5\": 0.9457664637520753, \"p4\": 0.8804756275225732, \"phi\": 0.7769307620147627}, {\"truth_threshold\": -16.700000248849392, \"match_probability\": 9.392796608724036e-06, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1709.0, \"tn\": 1119.0, \"fp\": 26.0, \"fn\": 322.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8414574101427869, \"tn_rate\": 0.977292576419214, \"fp_rate\": 0.022707423580786028, \"fn_rate\": 0.1585425898572132, \"precision\": 0.985014409221902, \"recall\": 0.8414574101427869, \"specificity\": 0.977292576419214, \"npv\": 0.7765440666204025, \"accuracy\": 0.8904282115869018, \"f1\": 0.9075942644715879, \"f2\": 0.8667207627548433, \"f0_5\": 0.9525136551109129, \"p4\": 0.8860103770975539, \"phi\": 0.7896366201374305}, {\"truth_threshold\": -12.800000190734863, \"match_probability\": 0.00014020228918616167, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1709.0, \"tn\": 1125.0, \"fp\": 20.0, \"fn\": 322.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8414574101427869, 
\"tn_rate\": 0.982532751091703, \"fp_rate\": 0.017467248908296942, \"fn_rate\": 0.1585425898572132, \"precision\": 0.9884326200115674, \"recall\": 0.8414574101427869, \"specificity\": 0.982532751091703, \"npv\": 0.7774706288873532, \"accuracy\": 0.8923173803526449, \"f1\": 0.9090425531914894, \"f2\": 0.8672485537399777, \"f0_5\": 0.955068738124511, \"p4\": 0.8880763922377238, \"phi\": 0.7944159751353451}, {\"truth_threshold\": -12.500000186264515, \"match_probability\": 0.00017260367204143044, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1708.0, \"tn\": 1125.0, \"fp\": 20.0, \"fn\": 323.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8409650418513048, \"tn_rate\": 0.982532751091703, \"fp_rate\": 0.017467248908296942, \"fn_rate\": 0.15903495814869523, \"precision\": 0.9884259259259259, \"recall\": 0.8409650418513048, \"specificity\": 0.982532751091703, \"npv\": 0.7769337016574586, \"accuracy\": 0.8920025188916877, \"f1\": 0.9087523277467412, \"f2\": 0.8668290702395453, \"f0_5\": 0.9549368220954937, \"p4\": 0.887762700545028, \"phi\": 0.7938966961277768}, {\"truth_threshold\": -12.400000184774399, \"match_probability\": 0.00018498974370122882, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1705.0, \"tn\": 1132.0, \"fp\": 13.0, \"fn\": 326.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8394879369768586, \"tn_rate\": 0.988646288209607, \"fp_rate\": 0.011353711790393014, \"fn_rate\": 0.1605120630231413, \"precision\": 0.9924330616996507, \"recall\": 0.8394879369768586, \"specificity\": 0.988646288209607, \"npv\": 0.7764060356652949, \"accuracy\": 0.8932619647355163, \"f1\": 0.9095758869031741, \"f2\": 0.8661857346067873, \"f0_5\": 0.9575424014377176, \"p4\": 0.8892254223487883, \"phi\": 0.7979360689863448}, {\"truth_threshold\": -10.600000157952309, \"match_probability\": 0.0006438760580315065, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, 
\"n\": 1145.0, \"tp\": 1705.0, \"tn\": 1135.0, \"fp\": 10.0, \"fn\": 326.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8394879369768586, \"tn_rate\": 0.9912663755458515, \"fp_rate\": 0.008733624454148471, \"fn_rate\": 0.1605120630231413, \"precision\": 0.9941690962099126, \"recall\": 0.8394879369768586, \"specificity\": 0.9912663755458515, \"npv\": 0.7768651608487337, \"accuracy\": 0.8942065491183879, \"f1\": 0.9103043246129204, \"f2\": 0.866449842463665, \"f0_5\": 0.9588347767405241, \"p4\": 0.8902534117544227, \"phi\": 0.8003374501759957}, {\"truth_threshold\": -10.400000154972076, \"match_probability\": 0.0007395485633816526, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1705.0, \"tn\": 1137.0, \"fp\": 8.0, \"fn\": 326.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8394879369768586, \"tn_rate\": 0.9930131004366812, \"fp_rate\": 0.0069868995633187774, \"fn_rate\": 0.1605120630231413, \"precision\": 0.9953298307063632, \"recall\": 0.8394879369768586, \"specificity\": 0.9930131004366812, \"npv\": 0.7771701982228298, \"accuracy\": 0.8948362720403022, \"f1\": 0.9107905982905983, \"f2\": 0.8666260038629664, \"f0_5\": 0.9596983001238321, \"p4\": 0.89093806126407, \"phi\": 0.8019395709687499}, {\"truth_threshold\": -10.30000015348196, \"match_probability\": 0.0007925864548491303, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1705.0, \"tn\": 1138.0, \"fp\": 7.0, \"fn\": 326.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8394879369768586, \"tn_rate\": 0.993886462882096, \"fp_rate\": 0.00611353711790393, \"fn_rate\": 0.1605120630231413, \"precision\": 0.9959112149532711, \"recall\": 0.8394879369768586, \"specificity\": 0.993886462882096, \"npv\": 0.7773224043715847, \"accuracy\": 0.8951511335012594, \"f1\": 0.9110339300026716, \"f2\": 0.8667141114274095, \"f0_5\": 0.960130645342944, \"p4\": 0.8912801843020557, 
\"phi\": 0.8027409940046784}, {\"truth_threshold\": -9.300000138580799, \"match_probability\": 0.0015839175344616876, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1702.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 329.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8380108321024126, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.16198916789758738, \"precision\": 0.9988262910798122, \"recall\": 0.8380108321024126, \"specificity\": 0.9982532751091703, \"npv\": 0.7764945652173914, \"accuracy\": 0.8957808564231738, \"f1\": 0.9113788487282464, \"f2\": 0.8658933658933659, \"f0_5\": 0.9619079914095173, \"p4\": 0.8920475525203425, \"phi\": 0.8052161223509504}, {\"truth_threshold\": -9.200000137090683, \"match_probability\": 0.0016974078152024628, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1701.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 330.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8375184638109305, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.16248153618906944, \"precision\": 0.998825601879037, \"recall\": 0.8375184638109305, \"specificity\": 0.9982532751091703, \"npv\": 0.7759674134419552, \"accuracy\": 0.8954659949622166, \"f1\": 0.9110873058382432, \"f2\": 0.8654726773175944, \"f0_5\": 0.9617776772588488, \"p4\": 0.8917339167406245, \"phi\": 0.8047049805475134}, {\"truth_threshold\": -8.700000129640102, \"match_probability\": 0.002398810587356977, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1700.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 331.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8370260955194485, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.16297390448055146, \"precision\": 0.9988249118683902, \"recall\": 0.8370260955194485, \"specificity\": 0.9982532751091703, 
\"npv\": 0.7754409769335142, \"accuracy\": 0.8951511335012594, \"f1\": 0.9107956067506028, \"f2\": 0.8650519031141869, \"f0_5\": 0.96164724516348, \"p4\": 0.8914203373070146, \"phi\": 0.8041942080726912}, {\"truth_threshold\": -8.600000128149986, \"match_probability\": 0.0025705389597152823, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1699.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 332.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8365337272279665, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.1634662727720335, \"precision\": 0.9988242210464433, \"recall\": 0.8365337272279665, \"specificity\": 0.9982532751091703, \"npv\": 0.7749152542372881, \"accuracy\": 0.8948362720403022, \"f1\": 0.9105037513397642, \"f2\": 0.8646310432569975, \"f0_5\": 0.9615166949632145, \"p4\": 0.8911068140436404, \"phi\": 0.8036838042178126}, {\"truth_threshold\": -8.400000125169754, \"match_probability\": 0.0029516456585356845, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1695.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 336.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8345642540620384, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.1654357459379616, \"precision\": 0.9988214496169712, \"recall\": 0.8345642540620384, \"specificity\": 0.9982532751091703, \"npv\": 0.7728194726166329, \"accuracy\": 0.8935768261964736, \"f1\": 0.9093347639484979, \"f2\": 0.8629467467671317, \"f0_5\": 0.9609933098990815, \"p4\": 0.8898532791719257, \"phi\": 0.8016458608774719}, {\"truth_threshold\": -8.300000123679638, \"match_probability\": 0.0031628254468557835, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1694.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 337.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8340718857705564, \"tn_rate\": 0.9982532751091703, 
\"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.16592811422944362, \"precision\": 0.9988207547169812, \"recall\": 0.8340718857705564, \"specificity\": 0.9982532751091703, \"npv\": 0.7722972972972973, \"accuracy\": 0.8932619647355163, \"f1\": 0.9090421250335391, \"f2\": 0.8625254582484725, \"f0_5\": 0.9608621667612025, \"p4\": 0.8895400341185092, \"phi\": 0.8011372895453348}, {\"truth_threshold\": -8.100000120699406, \"match_probability\": 0.003631424511270156, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1691.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 340.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8325947808961103, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.1674052191038897, \"precision\": 0.9988186650915535, \"recall\": 0.8325947808961103, \"specificity\": 0.9982532751091703, \"npv\": 0.7707349966284558, \"accuracy\": 0.8923173803526449, \"f1\": 0.9081632653061225, \"f2\": 0.8612610777223184, \"f0_5\": 0.9604680222651368, \"p4\": 0.8886006289308176, \"phi\": 0.799613759156141}, {\"truth_threshold\": -8.00000011920929, \"match_probability\": 0.0038910502633927486, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1689.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 342.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8316100443131462, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.16838995568685378, \"precision\": 0.9994082840236687, \"recall\": 0.8316100443131462, \"specificity\": 0.9991266375545852, \"npv\": 0.7698519515477793, \"accuracy\": 0.8920025188916877, \"f1\": 0.9078204783660306, \"f2\": 0.8605054004483391, \"f0_5\": 0.9606415652371744, \"p4\": 0.8883156450550498, \"phi\": 0.7994077154940488}, {\"truth_threshold\": -7.900000117719173, \"match_probability\": 0.004169160079349993, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1687.0, 
\"tn\": 1144.0, \"fp\": 1.0, \"fn\": 344.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8306253077301822, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.16937469226981783, \"precision\": 0.9994075829383886, \"recall\": 0.8306253077301822, \"specificity\": 0.9991266375545852, \"npv\": 0.7688172043010753, \"accuracy\": 0.8913727959697733, \"f1\": 0.907233127184727, \"f2\": 0.8596616388096209, \"f0_5\": 0.9603780029602641, \"p4\": 0.8876898240848203, \"phi\": 0.79839589905505}, {\"truth_threshold\": -7.800000116229057, \"match_probability\": 0.004467058438231288, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1686.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 345.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8301329394387001, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.16986706056129985, \"precision\": 0.999407231772377, \"recall\": 0.8301329394387001, \"specificity\": 0.9991266375545852, \"npv\": 0.7683008730691739, \"accuracy\": 0.8910579345088161, \"f1\": 0.9069392146315223, \"f2\": 0.8592396289878708, \"f0_5\": 0.9602460416903975, \"p4\": 0.8873769943489517, \"phi\": 0.7978905302578927}, {\"truth_threshold\": -7.400000110268593, \"match_probability\": 0.005885918232687788, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1685.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 346.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8296405711472181, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17035942885278188, \"precision\": 0.9994068801897983, \"recall\": 0.8296405711472181, \"specificity\": 0.9991266375545852, \"npv\": 0.7677852348993288, \"accuracy\": 0.8907430730478589, \"f1\": 0.9066451439332796, \"f2\": 0.8588175331294597, \"f0_5\": 0.9601139601139601, \"p4\": 0.8870642182097721, \"phi\": 0.7973855201597584}, 
{\"truth_threshold\": -7.200000107288361, \"match_probability\": 0.006755232248084272, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1684.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 347.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8291482028557361, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.1708517971442639, \"precision\": 0.9994065281899109, \"recall\": 0.8291482028557361, \"specificity\": 0.9991266375545852, \"npv\": 0.7672702883970489, \"accuracy\": 0.8904282115869018, \"f1\": 0.9063509149623251, \"f2\": 0.8583953512080742, \"f0_5\": 0.959981758066355, \"p4\": 0.8867514954900549, \"phi\": 0.7968808680755595}, {\"truth_threshold\": -7.1000001057982445, \"match_probability\": 0.007236570039195372, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1680.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 351.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.827178729689808, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.172821270310192, \"precision\": 0.9994051160023796, \"recall\": 0.827178729689808, \"specificity\": 0.9991266375545852, \"npv\": 0.7652173913043478, \"accuracy\": 0.889168765743073, \"f1\": 0.9051724137931034, \"f2\": 0.8567057623661397, \"f0_5\": 0.9594517418617933, \"p4\": 0.8855011352578657, \"phi\": 0.7948658262269261}, {\"truth_threshold\": -6.800000101327896, \"match_probability\": 0.00889438522932807, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1678.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 353.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8261939931068439, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.1738060068931561, \"precision\": 0.9994044073853484, \"recall\": 0.8261939931068439, \"specificity\": 0.9991266375545852, \"npv\": 0.7641950567802271, \"accuracy\": 
0.8885390428211587, \"f1\": 0.9045822102425876, \"f2\": 0.8558604508823829, \"f0_5\": 0.9591860066308449, \"p4\": 0.8848762710434646, \"phi\": 0.7938604356798883}, {\"truth_threshold\": -6.70000009983778, \"match_probability\": 0.009526684411466419, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1677.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 354.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8257016248153619, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17429837518463812, \"precision\": 0.9994040524433849, \"recall\": 0.8257016248153619, \"specificity\": 0.9991266375545852, \"npv\": 0.7636849132176236, \"accuracy\": 0.8882241813602015, \"f1\": 0.90428686977622, \"f2\": 0.8554376657824934, \"f0_5\": 0.9590529566510351, \"p4\": 0.8845639172894136, \"phi\": 0.7933582706317808}, {\"truth_threshold\": -6.3000000938773155, \"match_probability\": 0.012532388771145032, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1672.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 359.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8232397833579518, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17676021664204825, \"precision\": 0.9994022713687986, \"recall\": 0.8232397833579518, \"specificity\": 0.9991266375545852, \"npv\": 0.761144377910845, \"accuracy\": 0.8866498740554156, \"f1\": 0.9028077753779697, \"f2\": 0.8533224456466265, \"f0_5\": 0.9583858764186634, \"p4\": 0.8830029249268768, \"phi\": 0.7908527207420626}, {\"truth_threshold\": -6.200000092387199, \"match_probability\": 0.013419810695865477, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1671.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 360.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8227474150664698, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, 
\"fn_rate\": 0.17725258493353027, \"precision\": 0.9994019138755981, \"recall\": 0.8227474150664698, \"specificity\": 0.9991266375545852, \"npv\": 0.7606382978723404, \"accuracy\": 0.8863350125944585, \"f1\": 0.9025114771806644, \"f2\": 0.8528991425071458, \"f0_5\": 0.9582520931299461, \"p4\": 0.8826908804876441, \"phi\": 0.7903526611483275}, {\"truth_threshold\": -6.100000090897083, \"match_probability\": 0.014369156816028038, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1668.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 363.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8212703101920237, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17872968980797638, \"precision\": 0.9994008388256441, \"recall\": 0.8212703101920237, \"specificity\": 0.9991266375545852, \"npv\": 0.7591240875912408, \"accuracy\": 0.8853904282115869, \"f1\": 0.9016216216216216, \"f2\": 0.8516287143878281, \"f0_5\": 0.957850005742506, \"p4\": 0.8817550520220102, \"phi\": 0.7888545711486582}, {\"truth_threshold\": -6.000000089406967, \"match_probability\": 0.015384614445865122, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1653.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 378.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8138847858197932, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.1861152141802068, \"precision\": 0.9993954050785974, \"recall\": 0.8138847858197932, \"specificity\": 0.9991266375545852, \"npv\": 0.7516425755584757, \"accuracy\": 0.8806675062972292, \"f1\": 0.8971506105834464, \"f2\": 0.8452648803436286, \"f0_5\": 0.9558228287267261, \"p4\": 0.8770826156331649, \"phi\": 0.78141055639527}, {\"truth_threshold\": -5.700000084936619, \"match_probability\": 0.01887356650421064, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1650.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 381.0, 
\"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8124076809453471, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.18759231905465287, \"precision\": 1.0, \"recall\": 0.8124076809453471, \"specificity\": 1.0, \"npv\": 0.7503276539973788, \"accuracy\": 0.8800377833753149, \"f1\": 0.8964955175224124, \"f2\": 0.8440761203192142, \"f0_5\": 0.9558567952728537, \"p4\": 0.8764894492452066, \"phi\": 0.7807508881411364}, {\"truth_threshold\": -5.600000083446503, \"match_probability\": 0.02020082327925431, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1647.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 384.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.810930576070901, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.18906942392909898, \"precision\": 1.0, \"recall\": 0.810930576070901, \"specificity\": 1.0, \"npv\": 0.7488554610856769, \"accuracy\": 0.8790931989924433, \"f1\": 0.8955954323001631, \"f2\": 0.842800122812404, \"f0_5\": 0.9554472676644622, \"p4\": 0.8755566203170421, \"phi\": 0.7792751699188473}, {\"truth_threshold\": -5.500000081956387, \"match_probability\": 0.02161936078957948, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1639.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 392.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8069916297390448, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.1930083702609552, \"precision\": 1.0, \"recall\": 0.8069916297390448, \"specificity\": 1.0, \"npv\": 0.7449577098243331, \"accuracy\": 0.8765743073047859, \"f1\": 0.8931880108991825, \"f2\": 0.8393936290074772, \"f0_5\": 0.9543495982298824, \"p4\": 0.873071109525203, \"phi\": 0.7753545230008044}, {\"truth_threshold\": -5.4000000804662704, \"match_probability\": 0.023135158452986655, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1637.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 394.0, \"P_rate\": 
0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8060068931560808, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19399310684391924, \"precision\": 1.0, \"recall\": 0.8060068931560808, \"specificity\": 1.0, \"npv\": 0.7439896036387265, \"accuracy\": 0.8759445843828715, \"f1\": 0.8925845147219194, \"f2\": 0.8385411330806269, \"f0_5\": 0.954073901387108, \"p4\": 0.8724501859995755, \"phi\": 0.7743776526794105}, {\"truth_threshold\": -5.300000078976154, \"match_probability\": 0.024754544222716376, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1635.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 396.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8050221565731167, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19497784342688332, \"precision\": 1.0, \"recall\": 0.8050221565731167, \"specificity\": 1.0, \"npv\": 0.7430240103828682, \"accuracy\": 0.8753148614609572, \"f1\": 0.8919803600654664, \"f2\": 0.837688287734399, \"f0_5\": 0.9537976898844942, \"p4\": 0.8718294412272184, \"phi\": 0.7734020889705577}, {\"truth_threshold\": -5.200000077486038, \"match_probability\": 0.02648420859582165, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1631.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 400.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8030526834071886, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19694731659281142, \"precision\": 1.0, \"recall\": 0.8030526834071886, \"specificity\": 1.0, \"npv\": 0.7411003236245954, \"accuracy\": 0.8740554156171285, \"f1\": 0.8907700709994538, \"f2\": 0.8359815479241415, \"f0_5\": 0.9532437171244886, \"p4\": 0.8705884820951986, \"phi\": 0.7714548616482155}, {\"truth_threshold\": -5.100000075995922, \"match_probability\": 0.02833121820332325, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1630.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 401.0, \"P_rate\": 0.6394836272040302, 
\"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8025603151157066, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19743968488429345, \"precision\": 1.0, \"recall\": 0.8025603151157066, \"specificity\": 1.0, \"npv\": 0.740620957309185, \"accuracy\": 0.8737405541561712, \"f1\": 0.8904670854957661, \"f2\": 0.8355546442485134, \"f0_5\": 0.9531049000116946, \"p4\": 0.8702783517473123, \"phi\": 0.7709688637547925}, {\"truth_threshold\": -5.000000074505806, \"match_probability\": 0.030303028785498974, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1629.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 402.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8020679468242246, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19793205317577547, \"precision\": 1.0, \"recall\": 0.8020679468242246, \"specificity\": 1.0, \"npv\": 0.7401422107304461, \"accuracy\": 0.8734256926952141, \"f1\": 0.8901639344262295, \"f2\": 0.8351276530298369, \"f0_5\": 0.952965952965953, \"p4\": 0.8699682648069582, \"phi\": 0.7704831882127677}, {\"truth_threshold\": -4.800000071525574, \"match_probability\": 0.03465289308554322, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1614.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 417.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.794682422451994, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.20531757754800592, \"precision\": 1.0, \"recall\": 0.794682422451994, \"specificity\": 1.0, \"npv\": 0.7330345710627401, \"accuracy\": 0.8687027707808564, \"f1\": 0.8855967078189301, \"f2\": 0.8287122612446087, \"f0_5\": 0.9508660303994344, \"p4\": 0.8653220445289462, \"phi\": 0.7632363255723595}, {\"truth_threshold\": -4.700000070035458, \"match_probability\": 0.037047907242669466, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1613.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 418.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 
0.36051636934280396, \"tp_rate\": 0.794190054160512, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.20580994583948795, \"precision\": 1.0, \"recall\": 0.794190054160512, \"specificity\": 1.0, \"npv\": 0.7325655790147153, \"accuracy\": 0.8683879093198993, \"f1\": 0.8852908891328211, \"f2\": 0.8282838656670433, \"f0_5\": 0.9507249793705057, \"p4\": 0.865012627066886, \"phi\": 0.762755725559516}, {\"truth_threshold\": -4.6000000685453415, \"match_probability\": 0.039601660807737325, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1610.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 421.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.792712949286066, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.20728705071393402, \"precision\": 1.0, \"recall\": 0.792712949286066, \"specificity\": 1.0, \"npv\": 0.731162196679438, \"accuracy\": 0.8674433249370277, \"f1\": 0.8843724251579237, \"f2\": 0.8269981508115882, \"f0_5\": 0.9503010270334081, \"p4\": 0.86408461556039, \"phi\": 0.7613157960637859}, {\"truth_threshold\": -4.500000067055225, \"match_probability\": 0.04232371044088178, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1609.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 422.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7922205809945839, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.20777941900541605, \"precision\": 1.0, \"recall\": 0.7922205809945839, \"specificity\": 1.0, \"npv\": 0.7306955966815571, \"accuracy\": 0.8671284634760705, \"f1\": 0.884065934065934, \"f2\": 0.8265694030617486, \"f0_5\": 0.9501594425416322, \"p4\": 0.8637753580651635, \"phi\": 0.7608364411180943}, {\"truth_threshold\": -4.400000065565109, \"match_probability\": 0.04522405175894309, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1604.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 427.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, 
\"tp_rate\": 0.7897587395371738, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2102412604628262, \"precision\": 1.0, \"recall\": 0.7897587395371738, \"specificity\": 1.0, \"npv\": 0.7283715012722646, \"accuracy\": 0.8655541561712846, \"f1\": 0.8825309491059147, \"f2\": 0.8244243421052632, \"f0_5\": 0.9494495087013141, \"p4\": 0.8622296597604054, \"phi\": 0.7584443016857485}, {\"truth_threshold\": -4.300000064074993, \"match_probability\": 0.048313119674570026, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1600.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 431.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7877892663712457, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2122107336287543, \"precision\": 1.0, \"recall\": 0.7877892663712457, \"specificity\": 1.0, \"npv\": 0.7265228426395939, \"accuracy\": 0.864294710327456, \"f1\": 0.8812999173781327, \"f2\": 0.8227067050596463, \"f0_5\": 0.9488791365199858, \"p4\": 0.8609937969203728, \"phi\": 0.7565361175813073}, {\"truth_threshold\": -4.200000062584877, \"match_probability\": 0.05160178526561565, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1598.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 433.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7868045297882816, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.21319547021171836, \"precision\": 1.0, \"recall\": 0.7868045297882816, \"specificity\": 1.0, \"npv\": 0.7256020278833967, \"accuracy\": 0.8636649874055415, \"f1\": 0.8806833838523009, \"f2\": 0.8218473565110059, \"f0_5\": 0.9485931378368753, \"p4\": 0.860376093318109, \"phi\": 0.7555838552816091}, {\"truth_threshold\": -4.000000059604645, \"match_probability\": 0.05882352712444066, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1597.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 434.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.7863121614967996, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2136878385032004, \"precision\": 1.0, \"recall\": 0.7863121614967996, \"specificity\": 1.0, \"npv\": 0.7251424952501583, \"accuracy\": 0.8633501259445844, \"f1\": 0.880374862183021, \"f2\": 0.8214175496348113, \"f0_5\": 0.9484499346715762, \"p4\": 0.8600672978149376, \"phi\": 0.7551081795566347}, {\"truth_threshold\": -3.9000000581145287, \"match_probability\": 0.06278043839004852, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1594.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 437.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7848350566223535, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.21516494337764647, \"precision\": 1.0, \"recall\": 0.7848350566223535, \"specificity\": 1.0, \"npv\": 0.7237673830594185, \"accuracy\": 0.8624055415617129, \"f1\": 0.879448275862069, \"f2\": 0.8201275982712493, \"f0_5\": 0.9480195075532295, \"p4\": 0.8591411342420673, \"phi\": 0.7536829672115799}, {\"truth_threshold\": -3.8000000566244125, \"match_probability\": 0.06698457743861425, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1582.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 449.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7789266371245692, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.22107336287543083, \"precision\": 1.0, \"recall\": 0.7789266371245692, \"specificity\": 1.0, \"npv\": 0.7183186951066499, \"accuracy\": 0.8586272040302267, \"f1\": 0.8757265430390258, \"f2\": 0.8149598186688646, \"f0_5\": 0.946285440842206, \"p4\": 0.8554397334681781, \"phi\": 0.7480090678348302}, {\"truth_threshold\": -3.6000000536441803, \"match_probability\": 0.0761862214703254, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1576.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 455.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.775972427375677, 
\"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.224027572624323, \"precision\": 1.0, \"recall\": 0.775972427375677, \"specificity\": 1.0, \"npv\": 0.715625, \"accuracy\": 0.8567380352644837, \"f1\": 0.8738563903520932, \"f2\": 0.8123711340206186, \"f0_5\": 0.9454109178164367, \"p4\": 0.8535909135793125, \"phi\": 0.7451880758175877}, {\"truth_threshold\": -3.500000052154064, \"match_probability\": 0.08121030044424019, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1572.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 459.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7740029542097489, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2259970457902511, \"precision\": 1.0, \"recall\": 0.7740029542097489, \"specificity\": 1.0, \"npv\": 0.7138403990024937, \"accuracy\": 0.8554785894206549, \"f1\": 0.8726061615320566, \"f2\": 0.8106435643564357, \"f0_5\": 0.9448250991705733, \"p4\": 0.8523590355378086, \"phi\": 0.743313243298003}, {\"truth_threshold\": -3.400000050663948, \"match_probability\": 0.08653465658300358, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1569.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 462.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7725258493353028, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2274741506646972, \"precision\": 1.0, \"recall\": 0.7725258493353028, \"specificity\": 1.0, \"npv\": 0.7125077784691972, \"accuracy\": 0.8545340050377834, \"f1\": 0.8716666666666667, \"f2\": 0.8093469514082328, \"f0_5\": 0.9443842542434092, \"p4\": 0.8514354692858483, \"phi\": 0.7419101540752266}, {\"truth_threshold\": -3.300000049173832, \"match_probability\": 0.09217307161544283, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1568.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 463.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7720334810438207, \"tn_rate\": 1.0, \"fp_rate\": 0.0, 
\"fn_rate\": 0.22796651895617923, \"precision\": 1.0, \"recall\": 0.7720334810438207, \"specificity\": 1.0, \"npv\": 0.7120646766169154, \"accuracy\": 0.8542191435768262, \"f1\": 0.8713531536537927, \"f2\": 0.8089145687164672, \"f0_5\": 0.9442370227628568, \"p4\": 0.8511276780405328, \"phi\": 0.741443032887153}, {\"truth_threshold\": -3.200000047683716, \"match_probability\": 0.09813940308831819, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1564.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 467.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7700640078778926, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.22993599212210733, \"precision\": 1.0, \"recall\": 0.7700640078778926, \"specificity\": 1.0, \"npv\": 0.7102977667493796, \"accuracy\": 0.8529596977329975, \"f1\": 0.8700973574408901, \"f2\": 0.8071841453344344, \"f0_5\": 0.9436466755158682, \"p4\": 0.8498968287858544, \"phi\": 0.7395774097751661}, {\"truth_threshold\": -3.1000000461935997, \"match_probability\": 0.10444750015659417, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1562.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 469.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7690792712949286, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.23092072870507138, \"precision\": 1.0, \"recall\": 0.7690792712949286, \"specificity\": 1.0, \"npv\": 0.7094175960346965, \"accuracy\": 0.8523299748110831, \"f1\": 0.8694684107987753, \"f2\": 0.8063183976873839, \"f0_5\": 0.9433506462133108, \"p4\": 0.8492815908935231, \"phi\": 0.7386463076480951}, {\"truth_threshold\": -3.0000000447034836, \"match_probability\": 0.11111110805075623, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1561.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 470.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7685869030034466, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 
0.23141309699655344, \"precision\": 1.0, \"recall\": 0.7685869030034466, \"specificity\": 1.0, \"npv\": 0.7089783281733746, \"accuracy\": 0.8520151133501259, \"f1\": 0.8691536748329621, \"f2\": 0.8058853897780073, \"f0_5\": 0.943202416918429, \"p4\": 0.8489740179546857, \"phi\": 0.7381811820598891}, {\"truth_threshold\": -2.8000000417232513, \"match_probability\": 0.1255586621587546, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1560.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 471.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7680945347119645, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.23190546528803546, \"precision\": 1.0, \"recall\": 0.7680945347119645, \"specificity\": 1.0, \"npv\": 0.7085396039603961, \"accuracy\": 0.8517002518891688, \"f1\": 0.8688387635756056, \"f2\": 0.80545229244114, \"f0_5\": 0.9430540442509975, \"p4\": 0.8486664754292597, \"phi\": 0.7377163394076073}, {\"truth_threshold\": -2.7000000402331352, \"match_probability\": 0.13336855415354743, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1552.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 479.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7641555883801083, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.23584441161989167, \"precision\": 1.0, \"recall\": 0.7641555883801083, \"specificity\": 1.0, \"npv\": 0.7050492610837439, \"accuracy\": 0.8491813602015114, \"f1\": 0.8663131454088753, \"f2\": 0.801984291029351, \"f0_5\": 0.9418618764413157, \"p4\": 0.8462072068136004, \"phi\": 0.7340077199460567}, {\"truth_threshold\": -2.500000037252903, \"match_probability\": 0.15022110152606716, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1543.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 488.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7597242737567701, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24027572624322993, 
\"precision\": 1.0, \"recall\": 0.7597242737567701, \"specificity\": 1.0, \"npv\": 0.7011635027556644, \"accuracy\": 0.8463476070528967, \"f1\": 0.863458310016788, \"f2\": 0.7980759284162615, \"f0_5\": 0.9405095696696331, \"p4\": 0.8434427172572686, \"phi\": 0.7298567893195214}, {\"truth_threshold\": -2.400000035762787, \"match_probability\": 0.1592855907727143, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1542.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 489.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7592319054652881, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24076809453471196, \"precision\": 1.0, \"recall\": 0.7592319054652881, \"specificity\": 1.0, \"npv\": 0.700734394124847, \"accuracy\": 0.8460327455919395, \"f1\": 0.8631402183039463, \"f2\": 0.79764121663563, \"f0_5\": 0.9403585803146726, \"p4\": 0.8431356888593929, \"phi\": 0.7293969490452176}, {\"truth_threshold\": -2.3000000342726707, \"match_probability\": 0.16878839957195682, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1537.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 494.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7567700640078779, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24322993599212211, \"precision\": 1.0, \"recall\": 0.7567700640078779, \"specificity\": 1.0, \"npv\": 0.6985967053081147, \"accuracy\": 0.8444584382871536, \"f1\": 0.8615470852017937, \"f2\": 0.7954663078356278, \"f0_5\": 0.9396014182662917, \"p4\": 0.8416009434305138, \"phi\": 0.7271018315144822}, {\"truth_threshold\": -2.1000000312924385, \"match_probability\": 0.18913982061899084, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1535.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 496.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7557853274249139, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24421467257508617, \"precision\": 1.0, 
\"recall\": 0.7557853274249139, \"specificity\": 1.0, \"npv\": 0.6977452772699574, \"accuracy\": 0.8438287153652393, \"f1\": 0.8609085810431857, \"f2\": 0.7945957138420127, \"f0_5\": 0.9392975156039652, \"p4\": 0.840987226631003, \"phi\": 0.7261856806910075}, {\"truth_threshold\": -1.9000000283122063, \"match_probability\": 0.2113212378007128, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1534.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 497.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7552929591334318, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2447070408665682, \"precision\": 1.0, \"recall\": 0.7552929591334318, \"specificity\": 1.0, \"npv\": 0.6973203410475031, \"accuracy\": 0.8435138539042821, \"f1\": 0.8605890603085554, \"f2\": 0.7941602816318079, \"f0_5\": 0.9391453410064895, \"p4\": 0.8406804063043731, \"phi\": 0.7257280095557167}, {\"truth_threshold\": -1.8000000268220901, \"match_probability\": 0.22310460998179016, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1531.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 500.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7538158542589857, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24618414574101427, \"precision\": 1.0, \"recall\": 0.7538158542589857, \"specificity\": 1.0, \"npv\": 0.6960486322188449, \"accuracy\": 0.8425692695214105, \"f1\": 0.8596294216732173, \"f2\": 0.7928534438114966, \"f0_5\": 0.9386879215205396, \"p4\": 0.839760095233761, \"phi\": 0.7243566071361862}, {\"truth_threshold\": -1.700000025331974, \"match_probability\": 0.2353489599091234, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1521.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 510.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7488921713441654, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2511078286558346, \"precision\": 1.0, \"recall\": 
0.7488921713441654, \"specificity\": 1.0, \"npv\": 0.6918429003021148, \"accuracy\": 0.8394206549118388, \"f1\": 0.856418918918919, \"f2\": 0.7884914463452566, \"f0_5\": 0.9371534195933456, \"p4\": 0.83669395520452, \"phi\": 0.7198025644829947}, {\"truth_threshold\": -1.600000023841858, \"match_probability\": 0.24805074388621665, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1497.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 534.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7370753323485968, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.26292466765140327, \"precision\": 1.0, \"recall\": 0.7370753323485968, \"specificity\": 1.0, \"npv\": 0.6819535437760572, \"accuracy\": 0.8318639798488665, \"f1\": 0.8486394557823129, \"f2\": 0.777985656376676, \"f0_5\": 0.9334081556303778, \"p4\": 0.8293440205305666, \"phi\": 0.7089789382802854}, {\"truth_threshold\": -1.5000000223517418, \"match_probability\": 0.2612038719739489, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1495.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 536.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7360905957656327, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2639094042343673, \"precision\": 1.0, \"recall\": 0.7360905957656327, \"specificity\": 1.0, \"npv\": 0.6811421772754312, \"accuracy\": 0.8312342569269522, \"f1\": 0.8479863868406126, \"f2\": 0.7771078074643933, \"f0_5\": 0.9330919985020597, \"p4\": 0.8287320234225032, \"phi\": 0.708083576332464}, {\"truth_threshold\": -1.4000000208616257, \"match_probability\": 0.2747995717943022, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1493.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 538.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7351058591826687, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.26489414081733137, \"precision\": 1.0, \"recall\": 0.7351058591826687, 
\"specificity\": 1.0, \"npv\": 0.6803327391562686, \"accuracy\": 0.8306045340050378, \"f1\": 0.8473325766174802, \"f2\": 0.776229593428304, \"f0_5\": 0.9327752092965138, \"p4\": 0.8281200951801486, \"phi\": 0.7071892128331477}, {\"truth_threshold\": -1.3000000193715096, \"match_probability\": 0.2888262766358852, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1480.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 551.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7287050713934022, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.27129492860659776, \"precision\": 1.0, \"recall\": 0.7287050713934022, \"specificity\": 1.0, \"npv\": 0.6751179245283019, \"accuracy\": 0.8265113350125944, \"f1\": 0.8430646539447451, \"f2\": 0.770512286547272, \"f0_5\": 0.9307005408124764, \"p4\": 0.8241441255231906, \"phi\": 0.7013999254293956}, {\"truth_threshold\": -1.2000000178813934, \"match_probability\": 0.3032695424040186, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1476.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 555.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7267355982274741, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.27326440177252587, \"precision\": 1.0, \"recall\": 0.7267355982274741, \"specificity\": 1.0, \"npv\": 0.6735294117647059, \"accuracy\": 0.8252518891687658, \"f1\": 0.8417450812660393, \"f2\": 0.76875, \"f0_5\": 0.9300567107750473, \"p4\": 0.8229212506550728, \"phi\": 0.6996269005567342}, {\"truth_threshold\": -1.1000000163912773, \"match_probability\": 0.318111997717226, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1472.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 559.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.724766125061546, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.27523387493845397, \"precision\": 1.0, \"recall\": 0.724766125061546, \"specificity\": 1.0, \"npv\": 
0.6719483568075117, \"accuracy\": 0.823992443324937, \"f1\": 0.840422495004282, \"f2\": 0.7669862442684452, \"f0_5\": 0.9294102790756409, \"p4\": 0.8216985877421169, \"phi\": 0.6978577267644555}, {\"truth_threshold\": -0.9000000134110451, \"match_probability\": 0.34891031813411577, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1470.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 561.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.723781388478582, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.276218611521418, \"precision\": 1.0, \"recall\": 0.723781388478582, \"specificity\": 1.0, \"npv\": 0.6711606096131302, \"accuracy\": 0.8233627204030227, \"f1\": 0.8397600685518424, \"f2\": 0.7661038148843027, \"f0_5\": 0.9290860826697004, \"p4\": 0.8210873315393467, \"phi\": 0.6969745748002024}, {\"truth_threshold\": -0.800000011920929, \"match_probability\": 0.36481689239780585, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1468.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 563.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7227966518956179, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2772033481043821, \"precision\": 1.0, \"recall\": 0.7227966518956179, \"specificity\": 1.0, \"npv\": 0.6703747072599532, \"accuracy\": 0.8227329974811083, \"f1\": 0.8390968848242355, \"f2\": 0.7652210175145955, \"f0_5\": 0.9287612299126914, \"p4\": 0.8204761232422636, \"phi\": 0.6960923745617381}, {\"truth_threshold\": -0.6000000089406967, \"match_probability\": 0.3975010577814427, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1467.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 564.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7223042836041359, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2776957163958641, \"precision\": 1.0, \"recall\": 0.7223042836041359, \"specificity\": 1.0, \"npv\": 0.6699824458747806, 
\"accuracy\": 0.8224181360201511, \"f1\": 0.8387650085763293, \"f2\": 0.7647794807632156, \"f0_5\": 0.9285985567793391, \"p4\": 0.8201705365264865, \"phi\": 0.6956516301964154}, {\"truth_threshold\": -0.5000000074505806, \"match_probability\": 0.41421356112001384, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1454.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 577.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7159034958148696, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2840965041851305, \"precision\": 1.0, \"recall\": 0.7159034958148696, \"specificity\": 1.0, \"npv\": 0.664924506387921, \"accuracy\": 0.8183249370277078, \"f1\": 0.8344332855093257, \"f2\": 0.7590311129672166, \"f0_5\": 0.926468714158277, \"p4\": 0.8161988630144861, \"phi\": 0.6899433154804018}, {\"truth_threshold\": -0.4000000059604645, \"match_probability\": 0.4311259267559445, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1453.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 578.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7154111275233875, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2845888724766125, \"precision\": 1.0, \"recall\": 0.7154111275233875, \"specificity\": 1.0, \"npv\": 0.6645385954730122, \"accuracy\": 0.8180100755667506, \"f1\": 0.8340987370838117, \"f2\": 0.7585882844314503, \"f0_5\": 0.9263037103149305, \"p4\": 0.8158934155135413, \"phi\": 0.6895058417955253}, {\"truth_threshold\": -0.30000000447034836, \"match_probability\": 0.4482004805735527, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1444.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 587.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7109798129000492, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.28902018709995075, \"precision\": 1.0, \"recall\": 0.7109798129000492, \"specificity\": 1.0, \"npv\": 0.6610854503464203, \"accuracy\": 
0.815176322418136, \"f1\": 0.8310791366906475, \"f2\": 0.7545986622073578, \"f0_5\": 0.9248110669911618, \"p4\": 0.8131447366204683, \"phi\": 0.6855788866339472}, {\"truth_threshold\": -0.20000000298023224, \"match_probability\": 0.4653980381052749, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1438.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 593.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.708025603151157, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2919743968488429, \"precision\": 1.0, \"recall\": 0.708025603151157, \"specificity\": 1.0, \"npv\": 0.6588032220943614, \"accuracy\": 0.8132871536523929, \"f1\": 0.829057365234938, \"f2\": 0.7519347416858397, \"f0_5\": 0.9238083001413336, \"p4\": 0.8113125802330422, \"phi\": 0.6829711184825358}, {\"truth_threshold\": -0.10000000149011612, \"match_probability\": 0.48267825490990723, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1427.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 604.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7026095519448547, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.29739044805514525, \"precision\": 1.0, \"recall\": 0.7026095519448547, \"specificity\": 1.0, \"npv\": 0.6546598056032018, \"accuracy\": 0.809823677581864, \"f1\": 0.8253325621746674, \"f2\": 0.7470421945346037, \"f0_5\": 0.9219537407933842, \"p4\": 0.8079540638890523, \"phi\": 0.6782110532062798}, {\"truth_threshold\": -0.0, \"match_probability\": 0.5, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1425.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 606.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7016248153618907, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2983751846381093, \"precision\": 1.0, \"recall\": 0.7016248153618907, \"specificity\": 1.0, \"npv\": 0.653912050256996, \"accuracy\": 0.8091939546599496, \"f1\": 0.8246527777777778, 
\"f2\": 0.7461514294690543, \"f0_5\": 0.9216142801707412, \"p4\": 0.8073434670308465, \"phi\": 0.6773484491194175}, {\"truth_threshold\": 0.10000000149011612, \"match_probability\": 0.5173217450900928, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1423.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 608.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7006400787789266, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.29935992122107336, \"precision\": 1.0, \"recall\": 0.7006400787789266, \"specificity\": 1.0, \"npv\": 0.6531660011409013, \"accuracy\": 0.8085642317380353, \"f1\": 0.8239722061378112, \"f2\": 0.74526029119095, \"f0_5\": 0.9212741162760585, \"p4\": 0.8067328787708493, \"phi\": 0.6764867171608602}, {\"truth_threshold\": 0.30000000447034836, \"match_probability\": 0.5517995194264473, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1417.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 614.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6976858690300345, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3023141309699655, \"precision\": 1.0, \"recall\": 0.6976858690300345, \"specificity\": 1.0, \"npv\": 0.6509380329732802, \"accuracy\": 0.8066750629722922, \"f1\": 0.8219257540603249, \"f2\": 0.7425846347343046, \"f0_5\": 0.9202493830367581, \"p4\": 0.8049011475731893, \"phi\": 0.6739067199692137}, {\"truth_threshold\": 0.4000000059604645, \"match_probability\": 0.5688740732440556, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1416.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 615.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6971935007385525, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.30280649926144754, \"precision\": 1.0, \"recall\": 0.6971935007385525, \"specificity\": 1.0, \"npv\": 0.6505681818181818, \"accuracy\": 0.806360201511335, \"f1\": 0.8215839860748477, \"f2\": 
0.7421383647798742, \"f0_5\": 0.9200779727095516, \"p4\": 0.8045958615658608, \"phi\": 0.6734774741228791}, {\"truth_threshold\": 0.5000000074505806, \"match_probability\": 0.5857864388799862, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1415.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 616.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6967011324470704, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3032988675529296, \"precision\": 1.0, \"recall\": 0.6967011324470704, \"specificity\": 1.0, \"npv\": 0.6501987507098239, \"accuracy\": 0.8060453400503779, \"f1\": 0.8212420197330238, \"f2\": 0.7416920012579935, \"f0_5\": 0.9199063840852945, \"p4\": 0.8042905756758165, \"phi\": 0.6730484424877639}, {\"truth_threshold\": 0.6000000089406967, \"match_probability\": 0.6024989422185573, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1413.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 618.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6957163958641064, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.30428360413589367, \"precision\": 1.0, \"recall\": 0.6957163958641064, \"specificity\": 1.0, \"npv\": 0.6494611457742484, \"accuracy\": 0.8054156171284634, \"f1\": 0.8205574912891986, \"f2\": 0.7407989933941491, \"f0_5\": 0.9195626708317064, \"p4\": 0.8036800033381418, \"phi\": 0.6721910201660187}, {\"truth_threshold\": 0.7000000104308128, \"match_probability\": 0.6189757403752982, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1409.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 622.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6937469226981783, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3062530773018218, \"precision\": 1.0, \"recall\": 0.6937469226981783, \"specificity\": 1.0, \"npv\": 0.6479909451046972, \"accuracy\": 0.8041561712846348, \"f1\": 0.8191860465116279, \"f2\": 0.7390118535613134, 
\"f0_5\": 0.9188730924742402, \"p4\": 0.8024588500496406, \"phi\": 0.6704787275541767}, {\"truth_threshold\": 0.800000011920929, \"match_probability\": 0.6351831076021942, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1401.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 630.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.689807976366322, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.310192023633678, \"precision\": 1.0, \"recall\": 0.689807976366322, \"specificity\": 1.0, \"npv\": 0.6450704225352113, \"accuracy\": 0.8016372795969773, \"f1\": 0.8164335664335665, \"f2\": 0.7354330708661417, \"f0_5\": 0.9174852652259332, \"p4\": 0.8000164577037014, \"phi\": 0.6670642568619476}, {\"truth_threshold\": 0.9000000134110451, \"match_probability\": 0.6510896818658842, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1395.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 636.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6868537666174298, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.31314623338257014, \"precision\": 1.0, \"recall\": 0.6868537666174298, \"specificity\": 1.0, \"npv\": 0.6428972487366648, \"accuracy\": 0.7997481108312342, \"f1\": 0.8143607705779334, \"f2\": 0.7327450362433029, \"f0_5\": 0.9164367363027197, \"p4\": 0.7981845302790667, \"phi\": 0.6645121495072613}, {\"truth_threshold\": 1.0000000149011612, \"match_probability\": 0.6666666689619328, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1388.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 643.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6834071885770556, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3165928114229444, \"precision\": 1.0, \"recall\": 0.6834071885770556, \"specificity\": 1.0, \"npv\": 0.6403803131991052, \"accuracy\": 0.797544080604534, \"f1\": 0.8119333138344546, \"f2\": 0.7296047098402019, \"f0_5\": 0.9152050639588554, 
\"p4\": 0.7960470632785187, \"phi\": 0.6615440344100268}, {\"truth_threshold\": 1.1000000163912773, \"match_probability\": 0.681888002282774, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1385.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 646.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6819300837026095, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.31806991629739045, \"precision\": 1.0, \"recall\": 0.6819300837026095, \"specificity\": 1.0, \"npv\": 0.6393076493579006, \"accuracy\": 0.7965994962216625, \"f1\": 0.8108899297423887, \"f2\": 0.7282574403196971, \"f0_5\": 0.9146744155329547, \"p4\": 0.7951309152535574, \"phi\": 0.6602750327237519}, {\"truth_threshold\": 1.2000000178813934, \"match_probability\": 0.6967304575959814, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1381.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 650.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6799606105366814, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.32003938946331856, \"precision\": 1.0, \"recall\": 0.6799606105366814, \"specificity\": 1.0, \"npv\": 0.637883008356546, \"accuracy\": 0.7953400503778337, \"f1\": 0.809495896834701, \"f2\": 0.7264597580220936, \"f0_5\": 0.913964262078094, \"p4\": 0.7939092844375716, \"phi\": 0.6585858484761816}, {\"truth_threshold\": 1.3000000193715096, \"match_probability\": 0.7111737233641148, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1370.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 661.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6745445593303792, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3254554406696209, \"precision\": 1.0, \"recall\": 0.6745445593303792, \"specificity\": 1.0, \"npv\": 0.6339977851605758, \"accuracy\": 0.7918765743073047, \"f1\": 0.8056453984122317, \"f2\": 0.7215083210448704, \"f0_5\": 0.9119957395819465, \"p4\": 0.7905490918185237, 
\"phi\": 0.6539569990508374}, {\"truth_threshold\": 1.4000000208616257, \"match_probability\": 0.7252004282056979, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1368.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 663.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6735598227474151, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.32644017725258495, \"precision\": 1.0, \"recall\": 0.6735598227474151, \"specificity\": 1.0, \"npv\": 0.6332964601769911, \"accuracy\": 0.7912468513853904, \"f1\": 0.8049426301853486, \"f2\": 0.7206068268015171, \"f0_5\": 0.9116353458616553, \"p4\": 0.789938018490438, \"phi\": 0.6531179460582748}, {\"truth_threshold\": 1.5000000223517418, \"match_probability\": 0.7387961280260511, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1360.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 671.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6696208764155588, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.33037912358444116, \"precision\": 1.0, \"recall\": 0.6696208764155588, \"specificity\": 1.0, \"npv\": 0.6305066079295154, \"accuracy\": 0.788727959697733, \"f1\": 0.8021232674727219, \"f2\": 0.7169970476592156, \"f0_5\": 0.9101860527372507, \"p4\": 0.7874932598198026, \"phi\": 0.6497694878859451}, {\"truth_threshold\": 1.600000023841858, \"match_probability\": 0.7519492561137834, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1347.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 684.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6632200886262924, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.33677991137370755, \"precision\": 1.0, \"recall\": 0.6632200886262924, \"specificity\": 1.0, \"npv\": 0.6260251503553854, \"accuracy\": 0.7846347607052897, \"f1\": 0.7975133214920072, \"f2\": 0.7111181501425404, \"f0_5\": 0.9078042862919531, \"p4\": 0.7835186824943725, \"phi\": 
0.6443542936156993}, {\"truth_threshold\": 1.700000025331974, \"match_probability\": 0.7646510400908766, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1341.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 690.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6602658788774003, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3397341211225997, \"precision\": 1.0, \"recall\": 0.6602658788774003, \"specificity\": 1.0, \"npv\": 0.6239782016348774, \"accuracy\": 0.7827455919395466, \"f1\": 0.7953736654804271, \"f2\": 0.7083993660855784, \"f0_5\": 0.9066937119675457, \"p4\": 0.7816833648970615, \"phi\": 0.6418656523781219}, {\"truth_threshold\": 1.8000000268220901, \"match_probability\": 0.7768953900182098, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1333.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 698.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6563269325455441, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3436730674544559, \"precision\": 1.0, \"recall\": 0.6563269325455441, \"specificity\": 1.0, \"npv\": 0.6212696690179056, \"accuracy\": 0.7802267002518891, \"f1\": 0.7925089179548157, \"f2\": 0.7047689542138099, \"f0_5\": 0.9052016840961564, \"p4\": 0.7792352667284765, \"phi\": 0.6385577625791636}, {\"truth_threshold\": 1.9000000283122063, \"match_probability\": 0.7886787621992872, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1324.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 707.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6518956179222059, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3481043820777942, \"precision\": 1.0, \"recall\": 0.6518956179222059, \"specificity\": 1.0, \"npv\": 0.6182505399568035, \"accuracy\": 0.7773929471032746, \"f1\": 0.7892697466467958, \"f2\": 0.7006773920406435, \"f0_5\": 0.903507574723625, \"p4\": 0.7764796300097458, \"phi\": 0.6348502325555829}, 
{\"truth_threshold\": 2.0000000298023224, \"match_probability\": 0.8000000033051833, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1318.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 713.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6489414081733137, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.35105859182668636, \"precision\": 1.0, \"recall\": 0.6489414081733137, \"specificity\": 1.0, \"npv\": 0.616254036598493, \"accuracy\": 0.7755037783375315, \"f1\": 0.7871006270528516, \"f2\": 0.6979453505613218, \"f0_5\": 0.9023688894974667, \"p4\": 0.7746415522808361, \"phi\": 0.6323865608176021}, {\"truth_threshold\": 2.1000000312924385, \"match_probability\": 0.8108601793810092, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1306.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 725.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6430329886755293, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.35696701132447073, \"precision\": 1.0, \"recall\": 0.6430329886755293, \"specificity\": 1.0, \"npv\": 0.6122994652406417, \"accuracy\": 0.7717254408060453, \"f1\": 0.7827389871141744, \"f2\": 0.6924708377518558, \"f0_5\": 0.9000689179875948, \"p4\": 0.7709627754494935, \"phi\": 0.627478091329186}, {\"truth_threshold\": 2.2000000327825546, \"match_probability\": 0.8212623941099038, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1303.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 728.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6415558838010832, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3584441161989168, \"precision\": 1.0, \"recall\": 0.6415558838010832, \"specificity\": 1.0, \"npv\": 0.611318739989322, \"accuracy\": 0.7707808564231738, \"f1\": 0.7816436712657469, \"f2\": 0.6911000318234858, \"f0_5\": 0.8994891619494685, \"p4\": 0.7700424935392581, \"phi\": 0.6262548478998099}, {\"truth_threshold\": 
2.3000000342726707, \"match_probability\": 0.8312116004280432, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1288.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 743.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6341703594288528, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3658296405711472, \"precision\": 1.0, \"recall\": 0.6341703594288528, \"specificity\": 1.0, \"npv\": 0.6064618644067796, \"accuracy\": 0.7660579345088161, \"f1\": 0.7761373907803555, \"f2\": 0.6842328941776455, \"f0_5\": 0.8965613253515244, \"p4\": 0.7654371961027976, \"phi\": 0.6201613810378227}, {\"truth_threshold\": 2.400000035762787, \"match_probability\": 0.8407144092272857, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1284.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 747.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6322008862629247, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.36779911373707536, \"precision\": 1.0, \"recall\": 0.6322008862629247, \"specificity\": 1.0, \"npv\": 0.6051797040169133, \"accuracy\": 0.7647984886649875, \"f1\": 0.7746606334841629, \"f2\": 0.6823979591836735, \"f0_5\": 0.895772289660946, \"p4\": 0.7642079467115986, \"phi\": 0.6185427594175095}, {\"truth_threshold\": 2.500000037252903, \"match_probability\": 0.8497788984739328, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1278.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 753.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6292466765140325, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3707533234859675, \"precision\": 1.0, \"recall\": 0.6292466765140325, \"specificity\": 1.0, \"npv\": 0.6032665964172813, \"accuracy\": 0.7629093198992444, \"f1\": 0.772438803263826, \"f2\": 0.6796426292278238, \"f0_5\": 0.8945821083578328, \"p4\": 0.7623630801644073, \"phi\": 0.6161197130813998}, {\"truth_threshold\": 2.600000038743019, 
\"match_probability\": 0.8584144256340188, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1272.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 759.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6262924667651403, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.37370753323485967, \"precision\": 1.0, \"recall\": 0.6262924667651403, \"specificity\": 1.0, \"npv\": 0.6013655462184874, \"accuracy\": 0.7610201511335013, \"f1\": 0.7702089009990918, \"f2\": 0.6768837803320562, \"f0_5\": 0.8933839022334598, \"p4\": 0.7605169691954441, \"phi\": 0.6137024615957984}, {\"truth_threshold\": 2.7000000402331352, \"match_probability\": 0.8666314458464526, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1264.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 767.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6223535204332841, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3776464795667159, \"precision\": 1.0, \"recall\": 0.6223535204332841, \"specificity\": 1.0, \"npv\": 0.5988493723849372, \"accuracy\": 0.7585012594458438, \"f1\": 0.7672230652503793, \"f2\": 0.6731998295696634, \"f0_5\": 0.8917736701001834, \"p4\": 0.7580534470944266, \"phi\": 0.6104883415045929}, {\"truth_threshold\": 2.8000000417232513, \"match_probability\": 0.8744413378412453, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1261.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 770.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.620876415558838, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.379123584441162, \"precision\": 1.0, \"recall\": 0.620876415558838, \"specificity\": 1.0, \"npv\": 0.597911227154047, \"accuracy\": 0.7575566750629723, \"f1\": 0.7660996354799514, \"f2\": 0.6718167288225892, \"f0_5\": 0.8911660777385159, \"p4\": 0.7571289984268484, \"phi\": 0.6092856305032893}, {\"truth_threshold\": 2.9000000432133675, \"match_probability\": 
0.8818562391739494, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1253.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 778.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6169374692269818, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3830625307730182, \"precision\": 1.0, \"recall\": 0.6169374692269818, \"specificity\": 1.0, \"npv\": 0.5954238169526781, \"accuracy\": 0.7550377833753149, \"f1\": 0.7630937880633374, \"f2\": 0.6681241335181828, \"f0_5\": 0.8895357092148233, \"p4\": 0.7546620475767662, \"phi\": 0.6060851942988336}, {\"truth_threshold\": 3.0000000447034836, \"match_probability\": 0.8888888919492438, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1233.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 798.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6070901033973413, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3929098966026588, \"precision\": 1.0, \"recall\": 0.6070901033973413, \"specificity\": 1.0, \"npv\": 0.5892949047864128, \"accuracy\": 0.7487405541561712, \"f1\": 0.7555147058823529, \"f2\": 0.6588650208400129, \"f0_5\": 0.8853942266264541, \"p4\": 0.7484826603385862, \"phi\": 0.5981263283607483}, {\"truth_threshold\": 3.1000000461935997, \"match_probability\": 0.8955524998434058, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1222.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 809.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6016740521910389, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3983259478089611, \"precision\": 1.0, \"recall\": 0.6016740521910389, \"specificity\": 1.0, \"npv\": 0.5859774820880246, \"accuracy\": 0.7452770780856424, \"f1\": 0.7513064863203197, \"f2\": 0.6537556173764177, \"f0_5\": 0.8830755889579419, \"p4\": 0.745076023777957, \"phi\": 0.5937739015320593}, {\"truth_threshold\": 3.200000047683716, \"match_probability\": 0.9018605969116819, 
\"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1216.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 815.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5987198424421467, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4012801575578533, \"precision\": 1.0, \"recall\": 0.5987198424421467, \"specificity\": 1.0, \"npv\": 0.5841836734693877, \"accuracy\": 0.7433879093198993, \"f1\": 0.7489990760702187, \"f2\": 0.6509635974304069, \"f0_5\": 0.8817984046410442, \"p4\": 0.7432152820546354, \"phi\": 0.5914070991600171}, {\"truth_threshold\": 3.300000049173832, \"match_probability\": 0.9078269283845571, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1206.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 825.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5937961595273265, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.40620384047267355, \"precision\": 1.0, \"recall\": 0.5937961595273265, \"specificity\": 1.0, \"npv\": 0.5812182741116751, \"accuracy\": 0.7402392947103275, \"f1\": 0.7451343836886005, \"f2\": 0.6463022508038585, \"f0_5\": 0.8796498905908097, \"p4\": 0.7401097807801229, \"phi\": 0.5874735560130462}, {\"truth_threshold\": 3.400000050663948, \"match_probability\": 0.9134653434169965, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1194.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 837.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5878877400295421, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4121122599704579, \"precision\": 1.0, \"recall\": 0.5878877400295421, \"specificity\": 1.0, \"npv\": 0.577699293642785, \"accuracy\": 0.7364609571788413, \"f1\": 0.7404651162790697, \"f2\": 0.6406954282034771, \"f0_5\": 0.8770383428823271, \"p4\": 0.7363757518418079, \"phi\": 0.5827712519988607}, {\"truth_threshold\": 3.500000052154064, \"match_probability\": 0.9187896995557598, \"total_clerical_labels\": 
3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1185.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 846.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5834564254062038, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.41654357459379615, \"precision\": 1.0, \"recall\": 0.5834564254062038, \"specificity\": 1.0, \"npv\": 0.5750878955298845, \"accuracy\": 0.7336272040302267, \"f1\": 0.7369402985074627, \"f2\": 0.6364808250080567, \"f0_5\": 0.8750553832521045, \"p4\": 0.7335695980796055, \"phi\": 0.5792570481403251}, {\"truth_threshold\": 3.6000000536441803, \"match_probability\": 0.9238137785296746, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1172.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 859.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5770556376169375, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.42294436238306254, \"precision\": 1.0, \"recall\": 0.5770556376169375, \"specificity\": 1.0, \"npv\": 0.5713572854291418, \"accuracy\": 0.7295340050377834, \"f1\": 0.731813924445832, \"f2\": 0.6303786574870912, \"f0_5\": 0.8721535942848638, \"p4\": 0.7295072086227041, \"phi\": 0.5741993927638691}, {\"truth_threshold\": 3.7000000551342964, \"match_probability\": 0.9285512128432143, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1160.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 871.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5711472181191531, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.42885278188084686, \"precision\": 1.0, \"recall\": 0.5711472181191531, \"specificity\": 1.0, \"npv\": 0.5679563492063492, \"accuracy\": 0.7257556675062973, \"f1\": 0.7270448135380758, \"f2\": 0.6247307195174494, \"f0_5\": 0.8694348673362314, \"p4\": 0.7257472383888707, \"phi\": 0.5695495490844643}, {\"truth_threshold\": 3.8000000566244125, \"match_probability\": 0.9330154225613858, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, 
\"n\": 1145.0, \"tp\": 1152.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 879.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5672082717872969, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4327917282127031, \"precision\": 1.0, \"recall\": 0.5672082717872969, \"specificity\": 1.0, \"npv\": 0.5657114624505929, \"accuracy\": 0.7232367758186398, \"f1\": 0.7238454288407163, \"f2\": 0.6209573091849935, \"f0_5\": 0.8676005422503389, \"p4\": 0.7232349168764273, \"phi\": 0.5664593727239978}, {\"truth_threshold\": 3.9000000581145287, \"match_probability\": 0.9372195616099515, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1145.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 886.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5637616937469226, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4362383062530773, \"precision\": 1.0, \"recall\": 0.5637616937469226, \"specificity\": 1.0, \"npv\": 0.5637616937469226, \"accuracy\": 0.7210327455919395, \"f1\": 0.7210327455919395, \"f2\": 0.6176502319559823, \"f0_5\": 0.8659809408561489, \"p4\": 0.7210327455919395, \"phi\": 0.5637616937469226}, {\"truth_threshold\": 4.000000059604645, \"match_probability\": 0.9411764728755594, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1132.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 899.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5573609059576563, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4426390940423437, \"precision\": 1.0, \"recall\": 0.5573609059576563, \"specificity\": 1.0, \"npv\": 0.5601761252446184, \"accuracy\": 0.7169395465994962, \"f1\": 0.7157761618716408, \"f2\": 0.611495246326707, \"f0_5\": 0.8629364232352493, \"p4\": 0.7169329315586919, \"phi\": 0.5587667426236015}, {\"truth_threshold\": 4.100000061094761, \"match_probability\": 0.9448986513716398, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 
1122.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 909.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.552437223042836, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.44756277695716395, \"precision\": 1.0, \"recall\": 0.552437223042836, \"specificity\": 1.0, \"npv\": 0.5574488802336903, \"accuracy\": 0.7137909319899244, \"f1\": 0.7117031398667936, \"f2\": 0.6067488643737833, \"f0_5\": 0.8605614358030372, \"p4\": 0.7137699020051912, \"phi\": 0.5549373941127399}, {\"truth_threshold\": 4.200000062584877, \"match_probability\": 0.9483982147343843, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1114.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 917.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5484982767109798, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.45150172328902016, \"precision\": 1.0, \"recall\": 0.5484982767109798, \"specificity\": 1.0, \"npv\": 0.555286129970902, \"accuracy\": 0.711272040302267, \"f1\": 0.7084260731319555, \"f2\": 0.6029443602511366, \"f0_5\": 0.8586403576383537, \"p4\": 0.7112333614409344, \"phi\": 0.5518817675648915}, {\"truth_threshold\": 4.300000064074993, \"match_probability\": 0.9516868803254299, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1109.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 922.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5460364352535697, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.45396356474643035, \"precision\": 1.0, \"recall\": 0.5460364352535697, \"specificity\": 1.0, \"npv\": 0.5539429124334785, \"accuracy\": 0.7096977329974811, \"f1\": 0.7063694267515923, \"f2\": 0.6005631972273368, \"f0_5\": 0.8574300293799288, \"p4\": 0.7096451676361052, \"phi\": 0.5499754660338557}, {\"truth_threshold\": 4.400000065565109, \"match_probability\": 0.9547759482410569, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1097.0, \"tn\": 1145.0, 
\"fp\": 0.0, \"fn\": 934.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5401280157557853, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.45987198424421466, \"precision\": 1.0, \"recall\": 0.5401280157557853, \"specificity\": 1.0, \"npv\": 0.5507455507455508, \"accuracy\": 0.7059193954659949, \"f1\": 0.7014066496163683, \"f2\": 0.5948378700791671, \"f0_5\": 0.8544944695435426, \"p4\": 0.7058242094166463, \"phi\": 0.5454109473695239}, {\"truth_threshold\": 4.500000067055225, \"match_probability\": 0.9576762895591182, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1090.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 941.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5366814377154111, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4633185622845889, \"precision\": 1.0, \"recall\": 0.5366814377154111, \"specificity\": 1.0, \"npv\": 0.5488974113135187, \"accuracy\": 0.7037153652392947, \"f1\": 0.6984940724126882, \"f2\": 0.5914912090297374, \"f0_5\": 0.8527616961351902, \"p4\": 0.7035890482972262, \"phi\": 0.5427550569658532}, {\"truth_threshold\": 4.6000000685453415, \"match_probability\": 0.9603983391922627, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1084.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 947.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.533727227966519, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.46627277203348105, \"precision\": 1.0, \"recall\": 0.533727227966519, \"specificity\": 1.0, \"npv\": 0.5473231357552581, \"accuracy\": 0.7018261964735516, \"f1\": 0.6959871589085073, \"f2\": 0.5886185925282363, \"f0_5\": 0.8512643317103816, \"p4\": 0.701669388939399, \"phi\": 0.5404824326919393}, {\"truth_threshold\": 4.700000070035458, \"match_probability\": 0.9629520927573305, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1077.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 954.0, 
\"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5302806499261448, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.46971935007385524, \"precision\": 1.0, \"recall\": 0.5302806499261448, \"specificity\": 1.0, \"npv\": 0.5454978561219629, \"accuracy\": 0.6996221662468514, \"f1\": 0.693050193050193, \"f2\": 0.5852624714704924, \"f0_5\": 0.8495030761949834, \"p4\": 0.6994252208594803, \"phi\": 0.53783543735763}, {\"truth_threshold\": 4.800000071525574, \"match_probability\": 0.9653471069144568, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1065.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 966.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5243722304283605, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4756277695716396, \"precision\": 1.0, \"recall\": 0.5243722304283605, \"specificity\": 1.0, \"npv\": 0.5423969682614874, \"accuracy\": 0.6958438287153652, \"f1\": 0.687984496124031, \"f2\": 0.5794972249428665, \"f0_5\": 0.8464473056747734, \"p4\": 0.6955662411444608, \"phi\": 0.5333084548597151}, {\"truth_threshold\": 4.90000007301569, \"match_probability\": 0.9675925026740654, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1061.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 970.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5224027572624323, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4775972427375677, \"precision\": 1.0, \"recall\": 0.5224027572624323, \"specificity\": 1.0, \"npv\": 0.541371158392435, \"accuracy\": 0.6945843828715366, \"f1\": 0.6862871927554981, \"f2\": 0.577572128470332, \"f0_5\": 0.8454183266932271, \"p4\": 0.6942764887415704, \"phi\": 0.5318023936074047}, {\"truth_threshold\": 5.000000074505806, \"match_probability\": 0.969696971214501, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1050.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 981.0, \"P_rate\": 0.6394836272040302, 
\"N_rate\": 0.36051636934280396, \"tp_rate\": 0.51698670605613, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.48301329394387, \"precision\": 1.0, \"recall\": 0.51698670605613, \"specificity\": 1.0, \"npv\": 0.5385700846660395, \"accuracy\": 0.6911209068010076, \"f1\": 0.6815968841285297, \"f2\": 0.5722694571615435, \"f0_5\": 0.8425613866153105, \"p4\": 0.6907205167261657, \"phi\": 0.5276680529005587}, {\"truth_threshold\": 5.100000075995922, \"match_probability\": 0.9716687817966767, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1035.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 996.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5096011816838996, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.49039881831610044, \"precision\": 1.0, \"recall\": 0.5096011816838996, \"specificity\": 1.0, \"npv\": 0.5347968239140588, \"accuracy\": 0.6863979848866498, \"f1\": 0.675146771037182, \"f2\": 0.5650180150671471, \"f0_5\": 0.8385999027710258, \"p4\": 0.6858489665170626, \"phi\": 0.5220470222378447}, {\"truth_threshold\": 5.200000077486038, \"match_probability\": 0.9735157914041783, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1025.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1006.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5046774987690793, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.49532250123092075, \"precision\": 1.0, \"recall\": 0.5046774987690793, \"specificity\": 1.0, \"npv\": 0.5323105532310554, \"accuracy\": 0.6832493702770781, \"f1\": 0.6708115183246073, \"f2\": 0.5601705104382992, \"f0_5\": 0.8359158375468928, \"p4\": 0.6825861647803277, \"phi\": 0.518309905918297}, {\"truth_threshold\": 5.300000078976154, \"match_probability\": 0.9752454557772836, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1017.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1014.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, 
\"tp_rate\": 0.5007385524372231, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.49926144756277696, \"precision\": 1.0, \"recall\": 0.5007385524372231, \"specificity\": 1.0, \"npv\": 0.5303381194997684, \"accuracy\": 0.6807304785894207, \"f1\": 0.6673228346456693, \"f2\": 0.5562848703642927, \"f0_5\": 0.8337432365961633, \"p4\": 0.6799668560937839, \"phi\": 0.5153258602676495}]}}, {\"mode\": \"vega-lite\"});\n",
              "</script>"
            ],
            "text/plain": [
              "alt.HConcatChart(...)"
            ]
          },
          "execution_count": 7,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "linker.evaluation.accuracy_analysis_from_labels_table(\n",
        "    labels_table, output_type=\"threshold_selection\", add_metrics=[\"f1\"]\n",
        ")"
      ]
    },
    {
      "cell_type": "markdown",
      "id": "81e4396d",
      "metadata": {},
      "source": [
        "## Receiver operating characteristic curve\n",
        "\n",
        "A [ROC chart](https://en.wikipedia.org/wiki/Receiver_operating_characteristic) plots the true positive rate against the false positive rate, showing how the trade-off between false positives and false negatives varies with the match threshold. The match threshold is the match weight used as a cutoff: pairwise comparisons scoring at or above it are accepted as matches.\n"
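,
        "\n",
        "Each point on the curve comes from the confusion matrix at one threshold. As a minimal illustration (plain Python, not a Splink API call), using the confusion-matrix counts for the lowest threshold in the chart data below:\n",
        "\n",
        "```python\n",
        "# Confusion-matrix counts at a single match threshold (taken from the chart data)\n",
        "tp, fp, tn, fn = 1709, 42, 1103, 322\n",
        "\n",
        "tp_rate = tp / (tp + fn)  # recall: share of true matches accepted\n",
        "fp_rate = fp / (fp + tn)  # share of true non-matches wrongly accepted\n",
        "\n",
        "# One (fp_rate, tp_rate) point on the ROC curve\n",
        "print(round(fp_rate, 4), round(tp_rate, 4))  # 0.0367 0.8415\n",
        "```"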
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 8,
      "id": "01dd7eec",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:22.701493Z",
          "iopub.status.busy": "2024-07-18T13:59:22.701163Z",
          "iopub.status.idle": "2024-07-18T13:59:23.282190Z",
          "shell.execute_reply": "2024-07-18T13:59:23.281409Z"
        }
      },
      "outputs": [
        {
          "data": {
            "text/html": [
              "\n",
              "<style>\n",
              "  #altair-viz-8395e35b1ea245b09765af6368361ecf.vega-embed {\n",
              "    width: 100%;\n",
              "    display: flex;\n",
              "  }\n",
              "\n",
              "  #altair-viz-8395e35b1ea245b09765af6368361ecf.vega-embed details,\n",
              "  #altair-viz-8395e35b1ea245b09765af6368361ecf.vega-embed details summary {\n",
              "    position: relative;\n",
              "  }\n",
              "</style>\n",
              "<div id=\"altair-viz-8395e35b1ea245b09765af6368361ecf\"></div>\n",
              "<script type=\"text/javascript\">\n",
              "  var VEGA_DEBUG = (typeof VEGA_DEBUG == \"undefined\") ? {} : VEGA_DEBUG;\n",
              "  (function(spec, embedOpt){\n",
              "    let outputDiv = document.currentScript.previousElementSibling;\n",
              "    if (outputDiv.id !== \"altair-viz-8395e35b1ea245b09765af6368361ecf\") {\n",
              "      outputDiv = document.getElementById(\"altair-viz-8395e35b1ea245b09765af6368361ecf\");\n",
              "    }\n",
              "    const paths = {\n",
              "      \"vega\": \"https://cdn.jsdelivr.net/npm/vega@5?noext\",\n",
              "      \"vega-lib\": \"https://cdn.jsdelivr.net/npm/vega-lib?noext\",\n",
              "      \"vega-lite\": \"https://cdn.jsdelivr.net/npm/vega-lite@5.17.0?noext\",\n",
              "      \"vega-embed\": \"https://cdn.jsdelivr.net/npm/vega-embed@6?noext\",\n",
              "    };\n",
              "\n",
              "    function maybeLoadScript(lib, version) {\n",
              "      var key = `${lib.replace(\"-\", \"\")}_version`;\n",
              "      return (VEGA_DEBUG[key] == version) ?\n",
              "        Promise.resolve(paths[lib]) :\n",
              "        new Promise(function(resolve, reject) {\n",
              "          var s = document.createElement('script');\n",
              "          document.getElementsByTagName(\"head\")[0].appendChild(s);\n",
              "          s.async = true;\n",
              "          s.onload = () => {\n",
              "            VEGA_DEBUG[key] = version;\n",
              "            return resolve(paths[lib]);\n",
              "          };\n",
              "          s.onerror = () => reject(`Error loading script: ${paths[lib]}`);\n",
              "          s.src = paths[lib];\n",
              "        });\n",
              "    }\n",
              "\n",
              "    function showError(err) {\n",
              "      outputDiv.innerHTML = `<div class=\"error\" style=\"color:red;\">${err}</div>`;\n",
              "      throw err;\n",
              "    }\n",
              "\n",
              "    function displayChart(vegaEmbed) {\n",
              "      vegaEmbed(outputDiv, spec, embedOpt)\n",
              "        .catch(err => showError(`Javascript Error: ${err.message}<br>This usually means there's a typo in your chart specification. See the javascript console for the full traceback.`));\n",
              "    }\n",
              "\n",
              "    if(typeof define === \"function\" && define.amd) {\n",
              "      requirejs.config({paths});\n",
              "      require([\"vega-embed\"], displayChart, err => showError(`Error loading script: ${err.message}`));\n",
              "    } else {\n",
              "      maybeLoadScript(\"vega\", \"5\")\n",
              "        .then(() => maybeLoadScript(\"vega-lite\", \"5.17.0\"))\n",
              "        .then(() => maybeLoadScript(\"vega-embed\", \"6\"))\n",
              "        .catch(showError)\n",
              "        .then(() => displayChart(vegaEmbed));\n",
              "    }\n",
              "  })({\"config\": {\"view\": {\"continuousWidth\": 300, \"continuousHeight\": 300}}, \"data\": {\"name\": \"data-cd98a9741c0e2c0c024155628b33a1f1\"}, \"mark\": {\"type\": \"line\", \"clip\": true, \"point\": true}, \"encoding\": {\"tooltip\": [{\"field\": \"truth_threshold\", \"format\": \".4f\", \"type\": \"quantitative\"}, {\"field\": \"match_probability\", \"format\": \".4%\", \"type\": \"quantitative\"}, {\"field\": \"fp_rate\", \"format\": \".4f\", \"title\": \"FP_rate\", \"type\": \"quantitative\"}, {\"field\": \"tp_rate\", \"format\": \".4f\", \"title\": \"TP_rate\", \"type\": \"quantitative\"}, {\"field\": \"tp\", \"format\": \",.0f\", \"title\": \"TP\", \"type\": \"quantitative\"}, {\"field\": \"tn\", \"format\": \",.0f\", \"title\": \"TN\", \"type\": \"quantitative\"}, {\"field\": \"fp\", \"format\": \",.0f\", \"title\": \"FP\", \"type\": \"quantitative\"}, {\"field\": \"fn\", \"format\": \",.0f\", \"title\": \"FN\", \"type\": \"quantitative\"}, {\"field\": \"precision\", \"format\": \".4f\", \"type\": \"quantitative\"}, {\"field\": \"recall\", \"format\": \".4f\", \"type\": \"quantitative\"}, {\"field\": \"f1\", \"format\": \".4f\", \"title\": \"F1\", \"type\": \"quantitative\"}], \"x\": {\"field\": \"fp_rate\", \"sort\": [\"truth_threshold\"], \"title\": \"False Positive Rate amongst clerically reviewed records\", \"type\": \"quantitative\"}, \"y\": {\"field\": \"tp_rate\", \"sort\": [\"truth_threshold\"], \"title\": \"True Positive Rate amongst clerically reviewed records\", \"type\": \"quantitative\"}}, \"height\": 400, \"params\": [{\"name\": \"mouse_zoom\", \"select\": {\"type\": \"interval\", \"encodings\": [\"x\"]}, \"bind\": \"scales\"}], \"title\": \"Receiver operating characteristic curve\", \"width\": 400, \"$schema\": \"https://vega.github.io/schema/vega-lite/v5.9.3.json\", \"datasets\": {\"data-cd98a9741c0e2c0c024155628b33a1f1\": [{\"truth_threshold\": -18.900000281631947, \"match_probability\": 2.0442410704611823e-06, 
\"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1709.0, \"tn\": 1103.0, \"fp\": 42.0, \"fn\": 322.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8414574101427869, \"tn_rate\": 0.9633187772925764, \"fp_rate\": 0.03668122270742358, \"fn_rate\": 0.1585425898572132, \"precision\": 0.9760137064534552, \"recall\": 0.8414574101427869, \"specificity\": 0.9633187772925764, \"npv\": 0.7740350877192983, \"accuracy\": 0.8853904282115869, \"f1\": 0.9037546271813856, \"f2\": 0.8653164556962025, \"f0_5\": 0.9457664637520753, \"p4\": 0.8804756275225732, \"phi\": 0.7769307620147627}, {\"truth_threshold\": -16.700000248849392, \"match_probability\": 9.392796608724036e-06, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1709.0, \"tn\": 1119.0, \"fp\": 26.0, \"fn\": 322.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8414574101427869, \"tn_rate\": 0.977292576419214, \"fp_rate\": 0.022707423580786028, \"fn_rate\": 0.1585425898572132, \"precision\": 0.985014409221902, \"recall\": 0.8414574101427869, \"specificity\": 0.977292576419214, \"npv\": 0.7765440666204025, \"accuracy\": 0.8904282115869018, \"f1\": 0.9075942644715879, \"f2\": 0.8667207627548433, \"f0_5\": 0.9525136551109129, \"p4\": 0.8860103770975539, \"phi\": 0.7896366201374305}, {\"truth_threshold\": -12.800000190734863, \"match_probability\": 0.00014020228918616167, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1709.0, \"tn\": 1125.0, \"fp\": 20.0, \"fn\": 322.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8414574101427869, \"tn_rate\": 0.982532751091703, \"fp_rate\": 0.017467248908296942, \"fn_rate\": 0.1585425898572132, \"precision\": 0.9884326200115674, \"recall\": 0.8414574101427869, \"specificity\": 0.982532751091703, \"npv\": 0.7774706288873532, \"accuracy\": 0.8923173803526449, \"f1\": 0.9090425531914894, \"f2\": 0.8672485537399777, 
\"f0_5\": 0.955068738124511, \"p4\": 0.8880763922377238, \"phi\": 0.7944159751353451}, {\"truth_threshold\": -12.500000186264515, \"match_probability\": 0.00017260367204143044, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1708.0, \"tn\": 1125.0, \"fp\": 20.0, \"fn\": 323.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8409650418513048, \"tn_rate\": 0.982532751091703, \"fp_rate\": 0.017467248908296942, \"fn_rate\": 0.15903495814869523, \"precision\": 0.9884259259259259, \"recall\": 0.8409650418513048, \"specificity\": 0.982532751091703, \"npv\": 0.7769337016574586, \"accuracy\": 0.8920025188916877, \"f1\": 0.9087523277467412, \"f2\": 0.8668290702395453, \"f0_5\": 0.9549368220954937, \"p4\": 0.887762700545028, \"phi\": 0.7938966961277768}, {\"truth_threshold\": -12.400000184774399, \"match_probability\": 0.00018498974370122882, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1705.0, \"tn\": 1132.0, \"fp\": 13.0, \"fn\": 326.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8394879369768586, \"tn_rate\": 0.988646288209607, \"fp_rate\": 0.011353711790393014, \"fn_rate\": 0.1605120630231413, \"precision\": 0.9924330616996507, \"recall\": 0.8394879369768586, \"specificity\": 0.988646288209607, \"npv\": 0.7764060356652949, \"accuracy\": 0.8932619647355163, \"f1\": 0.9095758869031741, \"f2\": 0.8661857346067873, \"f0_5\": 0.9575424014377176, \"p4\": 0.8892254223487883, \"phi\": 0.7979360689863448}, {\"truth_threshold\": -10.600000157952309, \"match_probability\": 0.0006438760580315065, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1705.0, \"tn\": 1135.0, \"fp\": 10.0, \"fn\": 326.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8394879369768586, \"tn_rate\": 0.9912663755458515, \"fp_rate\": 0.008733624454148471, \"fn_rate\": 0.1605120630231413, \"precision\": 0.9941690962099126, \"recall\": 
0.8394879369768586, \"specificity\": 0.9912663755458515, \"npv\": 0.7768651608487337, \"accuracy\": 0.8942065491183879, \"f1\": 0.9103043246129204, \"f2\": 0.866449842463665, \"f0_5\": 0.9588347767405241, \"p4\": 0.8902534117544227, \"phi\": 0.8003374501759957}, {\"truth_threshold\": -10.400000154972076, \"match_probability\": 0.0007395485633816526, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1705.0, \"tn\": 1137.0, \"fp\": 8.0, \"fn\": 326.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8394879369768586, \"tn_rate\": 0.9930131004366812, \"fp_rate\": 0.0069868995633187774, \"fn_rate\": 0.1605120630231413, \"precision\": 0.9953298307063632, \"recall\": 0.8394879369768586, \"specificity\": 0.9930131004366812, \"npv\": 0.7771701982228298, \"accuracy\": 0.8948362720403022, \"f1\": 0.9107905982905983, \"f2\": 0.8666260038629664, \"f0_5\": 0.9596983001238321, \"p4\": 0.89093806126407, \"phi\": 0.8019395709687499}, {\"truth_threshold\": -10.30000015348196, \"match_probability\": 0.0007925864548491303, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1705.0, \"tn\": 1138.0, \"fp\": 7.0, \"fn\": 326.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8394879369768586, \"tn_rate\": 0.993886462882096, \"fp_rate\": 0.00611353711790393, \"fn_rate\": 0.1605120630231413, \"precision\": 0.9959112149532711, \"recall\": 0.8394879369768586, \"specificity\": 0.993886462882096, \"npv\": 0.7773224043715847, \"accuracy\": 0.8951511335012594, \"f1\": 0.9110339300026716, \"f2\": 0.8667141114274095, \"f0_5\": 0.960130645342944, \"p4\": 0.8912801843020557, \"phi\": 0.8027409940046784}, {\"truth_threshold\": -9.300000138580799, \"match_probability\": 0.0015839175344616876, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1702.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 329.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.8380108321024126, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.16198916789758738, \"precision\": 0.9988262910798122, \"recall\": 0.8380108321024126, \"specificity\": 0.9982532751091703, \"npv\": 0.7764945652173914, \"accuracy\": 0.8957808564231738, \"f1\": 0.9113788487282464, \"f2\": 0.8658933658933659, \"f0_5\": 0.9619079914095173, \"p4\": 0.8920475525203425, \"phi\": 0.8052161223509504}, {\"truth_threshold\": -9.200000137090683, \"match_probability\": 0.0016974078152024628, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1701.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 330.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8375184638109305, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.16248153618906944, \"precision\": 0.998825601879037, \"recall\": 0.8375184638109305, \"specificity\": 0.9982532751091703, \"npv\": 0.7759674134419552, \"accuracy\": 0.8954659949622166, \"f1\": 0.9110873058382432, \"f2\": 0.8654726773175944, \"f0_5\": 0.9617776772588488, \"p4\": 0.8917339167406245, \"phi\": 0.8047049805475134}, {\"truth_threshold\": -8.700000129640102, \"match_probability\": 0.002398810587356977, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1700.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 331.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8370260955194485, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.16297390448055146, \"precision\": 0.9988249118683902, \"recall\": 0.8370260955194485, \"specificity\": 0.9982532751091703, \"npv\": 0.7754409769335142, \"accuracy\": 0.8951511335012594, \"f1\": 0.9107956067506028, \"f2\": 0.8650519031141869, \"f0_5\": 0.96164724516348, \"p4\": 0.8914203373070146, \"phi\": 0.8041942080726912}, {\"truth_threshold\": -8.600000128149986, \"match_probability\": 0.0025705389597152823, \"total_clerical_labels\": 
3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1699.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 332.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8365337272279665, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.1634662727720335, \"precision\": 0.9988242210464433, \"recall\": 0.8365337272279665, \"specificity\": 0.9982532751091703, \"npv\": 0.7749152542372881, \"accuracy\": 0.8948362720403022, \"f1\": 0.9105037513397642, \"f2\": 0.8646310432569975, \"f0_5\": 0.9615166949632145, \"p4\": 0.8911068140436404, \"phi\": 0.8036838042178126}, {\"truth_threshold\": -8.400000125169754, \"match_probability\": 0.0029516456585356845, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1695.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 336.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8345642540620384, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.1654357459379616, \"precision\": 0.9988214496169712, \"recall\": 0.8345642540620384, \"specificity\": 0.9982532751091703, \"npv\": 0.7728194726166329, \"accuracy\": 0.8935768261964736, \"f1\": 0.9093347639484979, \"f2\": 0.8629467467671317, \"f0_5\": 0.9609933098990815, \"p4\": 0.8898532791719257, \"phi\": 0.8016458608774719}, {\"truth_threshold\": -8.300000123679638, \"match_probability\": 0.0031628254468557835, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1694.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 337.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8340718857705564, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.16592811422944362, \"precision\": 0.9988207547169812, \"recall\": 0.8340718857705564, \"specificity\": 0.9982532751091703, \"npv\": 0.7722972972972973, \"accuracy\": 0.8932619647355163, \"f1\": 0.9090421250335391, \"f2\": 0.8625254582484725, \"f0_5\": 0.9608621667612025, 
\"p4\": 0.8895400341185092, \"phi\": 0.8011372895453348}, {\"truth_threshold\": -8.100000120699406, \"match_probability\": 0.003631424511270156, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1691.0, \"tn\": 1143.0, \"fp\": 2.0, \"fn\": 340.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8325947808961103, \"tn_rate\": 0.9982532751091703, \"fp_rate\": 0.0017467248908296944, \"fn_rate\": 0.1674052191038897, \"precision\": 0.9988186650915535, \"recall\": 0.8325947808961103, \"specificity\": 0.9982532751091703, \"npv\": 0.7707349966284558, \"accuracy\": 0.8923173803526449, \"f1\": 0.9081632653061225, \"f2\": 0.8612610777223184, \"f0_5\": 0.9604680222651368, \"p4\": 0.8886006289308176, \"phi\": 0.799613759156141}, {\"truth_threshold\": -8.00000011920929, \"match_probability\": 0.0038910502633927486, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1689.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 342.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8316100443131462, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.16838995568685378, \"precision\": 0.9994082840236687, \"recall\": 0.8316100443131462, \"specificity\": 0.9991266375545852, \"npv\": 0.7698519515477793, \"accuracy\": 0.8920025188916877, \"f1\": 0.9078204783660306, \"f2\": 0.8605054004483391, \"f0_5\": 0.9606415652371744, \"p4\": 0.8883156450550498, \"phi\": 0.7994077154940488}, {\"truth_threshold\": -7.900000117719173, \"match_probability\": 0.004169160079349993, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1687.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 344.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8306253077301822, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.16937469226981783, \"precision\": 0.9994075829383886, \"recall\": 0.8306253077301822, 
\"specificity\": 0.9991266375545852, \"npv\": 0.7688172043010753, \"accuracy\": 0.8913727959697733, \"f1\": 0.907233127184727, \"f2\": 0.8596616388096209, \"f0_5\": 0.9603780029602641, \"p4\": 0.8876898240848203, \"phi\": 0.79839589905505}, {\"truth_threshold\": -7.800000116229057, \"match_probability\": 0.004467058438231288, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1686.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 345.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8301329394387001, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.16986706056129985, \"precision\": 0.999407231772377, \"recall\": 0.8301329394387001, \"specificity\": 0.9991266375545852, \"npv\": 0.7683008730691739, \"accuracy\": 0.8910579345088161, \"f1\": 0.9069392146315223, \"f2\": 0.8592396289878708, \"f0_5\": 0.9602460416903975, \"p4\": 0.8873769943489517, \"phi\": 0.7978905302578927}, {\"truth_threshold\": -7.400000110268593, \"match_probability\": 0.005885918232687788, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1685.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 346.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8296405711472181, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17035942885278188, \"precision\": 0.9994068801897983, \"recall\": 0.8296405711472181, \"specificity\": 0.9991266375545852, \"npv\": 0.7677852348993288, \"accuracy\": 0.8907430730478589, \"f1\": 0.9066451439332796, \"f2\": 0.8588175331294597, \"f0_5\": 0.9601139601139601, \"p4\": 0.8870642182097721, \"phi\": 0.7973855201597584}, {\"truth_threshold\": -7.200000107288361, \"match_probability\": 0.006755232248084272, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1684.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 347.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8291482028557361, 
\"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.1708517971442639, \"precision\": 0.9994065281899109, \"recall\": 0.8291482028557361, \"specificity\": 0.9991266375545852, \"npv\": 0.7672702883970489, \"accuracy\": 0.8904282115869018, \"f1\": 0.9063509149623251, \"f2\": 0.8583953512080742, \"f0_5\": 0.959981758066355, \"p4\": 0.8867514954900549, \"phi\": 0.7968808680755595}, {\"truth_threshold\": -7.1000001057982445, \"match_probability\": 0.007236570039195372, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1680.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 351.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.827178729689808, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.172821270310192, \"precision\": 0.9994051160023796, \"recall\": 0.827178729689808, \"specificity\": 0.9991266375545852, \"npv\": 0.7652173913043478, \"accuracy\": 0.889168765743073, \"f1\": 0.9051724137931034, \"f2\": 0.8567057623661397, \"f0_5\": 0.9594517418617933, \"p4\": 0.8855011352578657, \"phi\": 0.7948658262269261}, {\"truth_threshold\": -6.800000101327896, \"match_probability\": 0.00889438522932807, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1678.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 353.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8261939931068439, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.1738060068931561, \"precision\": 0.9994044073853484, \"recall\": 0.8261939931068439, \"specificity\": 0.9991266375545852, \"npv\": 0.7641950567802271, \"accuracy\": 0.8885390428211587, \"f1\": 0.9045822102425876, \"f2\": 0.8558604508823829, \"f0_5\": 0.9591860066308449, \"p4\": 0.8848762710434646, \"phi\": 0.7938604356798883}, {\"truth_threshold\": -6.70000009983778, \"match_probability\": 0.009526684411466419, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 
1145.0, \"tp\": 1677.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 354.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8257016248153619, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17429837518463812, \"precision\": 0.9994040524433849, \"recall\": 0.8257016248153619, \"specificity\": 0.9991266375545852, \"npv\": 0.7636849132176236, \"accuracy\": 0.8882241813602015, \"f1\": 0.90428686977622, \"f2\": 0.8554376657824934, \"f0_5\": 0.9590529566510351, \"p4\": 0.8845639172894136, \"phi\": 0.7933582706317808}, {\"truth_threshold\": -6.3000000938773155, \"match_probability\": 0.012532388771145032, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1672.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 359.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8232397833579518, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17676021664204825, \"precision\": 0.9994022713687986, \"recall\": 0.8232397833579518, \"specificity\": 0.9991266375545852, \"npv\": 0.761144377910845, \"accuracy\": 0.8866498740554156, \"f1\": 0.9028077753779697, \"f2\": 0.8533224456466265, \"f0_5\": 0.9583858764186634, \"p4\": 0.8830029249268768, \"phi\": 0.7908527207420626}, {\"truth_threshold\": -6.200000092387199, \"match_probability\": 0.013419810695865477, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1671.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 360.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8227474150664698, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17725258493353027, \"precision\": 0.9994019138755981, \"recall\": 0.8227474150664698, \"specificity\": 0.9991266375545852, \"npv\": 0.7606382978723404, \"accuracy\": 0.8863350125944585, \"f1\": 0.9025114771806644, \"f2\": 0.8528991425071458, \"f0_5\": 0.9582520931299461, \"p4\": 0.8826908804876441, 
\"phi\": 0.7903526611483275}, {\"truth_threshold\": -6.100000090897083, \"match_probability\": 0.014369156816028038, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1668.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 363.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8212703101920237, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.17872968980797638, \"precision\": 0.9994008388256441, \"recall\": 0.8212703101920237, \"specificity\": 0.9991266375545852, \"npv\": 0.7591240875912408, \"accuracy\": 0.8853904282115869, \"f1\": 0.9016216216216216, \"f2\": 0.8516287143878281, \"f0_5\": 0.957850005742506, \"p4\": 0.8817550520220102, \"phi\": 0.7888545711486582}, {\"truth_threshold\": -6.000000089406967, \"match_probability\": 0.015384614445865122, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1653.0, \"tn\": 1144.0, \"fp\": 1.0, \"fn\": 378.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8138847858197932, \"tn_rate\": 0.9991266375545852, \"fp_rate\": 0.0008733624454148472, \"fn_rate\": 0.1861152141802068, \"precision\": 0.9993954050785974, \"recall\": 0.8138847858197932, \"specificity\": 0.9991266375545852, \"npv\": 0.7516425755584757, \"accuracy\": 0.8806675062972292, \"f1\": 0.8971506105834464, \"f2\": 0.8452648803436286, \"f0_5\": 0.9558228287267261, \"p4\": 0.8770826156331649, \"phi\": 0.78141055639527}, {\"truth_threshold\": -5.700000084936619, \"match_probability\": 0.01887356650421064, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1650.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 381.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8124076809453471, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.18759231905465287, \"precision\": 1.0, \"recall\": 0.8124076809453471, \"specificity\": 1.0, \"npv\": 0.7503276539973788, \"accuracy\": 0.8800377833753149, \"f1\": 
0.8964955175224124, \"f2\": 0.8440761203192142, \"f0_5\": 0.9558567952728537, \"p4\": 0.8764894492452066, \"phi\": 0.7807508881411364}, {\"truth_threshold\": -5.600000083446503, \"match_probability\": 0.02020082327925431, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1647.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 384.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.810930576070901, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.18906942392909898, \"precision\": 1.0, \"recall\": 0.810930576070901, \"specificity\": 1.0, \"npv\": 0.7488554610856769, \"accuracy\": 0.8790931989924433, \"f1\": 0.8955954323001631, \"f2\": 0.842800122812404, \"f0_5\": 0.9554472676644622, \"p4\": 0.8755566203170421, \"phi\": 0.7792751699188473}, {\"truth_threshold\": -5.500000081956387, \"match_probability\": 0.02161936078957948, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1639.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 392.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8069916297390448, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.1930083702609552, \"precision\": 1.0, \"recall\": 0.8069916297390448, \"specificity\": 1.0, \"npv\": 0.7449577098243331, \"accuracy\": 0.8765743073047859, \"f1\": 0.8931880108991825, \"f2\": 0.8393936290074772, \"f0_5\": 0.9543495982298824, \"p4\": 0.873071109525203, \"phi\": 0.7753545230008044}, {\"truth_threshold\": -5.4000000804662704, \"match_probability\": 0.023135158452986655, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1637.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 394.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8060068931560808, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19399310684391924, \"precision\": 1.0, \"recall\": 0.8060068931560808, \"specificity\": 1.0, \"npv\": 0.7439896036387265, \"accuracy\": 0.8759445843828715, \"f1\": 0.8925845147219194, 
\"f2\": 0.8385411330806269, \"f0_5\": 0.954073901387108, \"p4\": 0.8724501859995755, \"phi\": 0.7743776526794105}, {\"truth_threshold\": -5.300000078976154, \"match_probability\": 0.024754544222716376, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1635.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 396.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8050221565731167, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19497784342688332, \"precision\": 1.0, \"recall\": 0.8050221565731167, \"specificity\": 1.0, \"npv\": 0.7430240103828682, \"accuracy\": 0.8753148614609572, \"f1\": 0.8919803600654664, \"f2\": 0.837688287734399, \"f0_5\": 0.9537976898844942, \"p4\": 0.8718294412272184, \"phi\": 0.7734020889705577}, {\"truth_threshold\": -5.200000077486038, \"match_probability\": 0.02648420859582165, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1631.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 400.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8030526834071886, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19694731659281142, \"precision\": 1.0, \"recall\": 0.8030526834071886, \"specificity\": 1.0, \"npv\": 0.7411003236245954, \"accuracy\": 0.8740554156171285, \"f1\": 0.8907700709994538, \"f2\": 0.8359815479241415, \"f0_5\": 0.9532437171244886, \"p4\": 0.8705884820951986, \"phi\": 0.7714548616482155}, {\"truth_threshold\": -5.100000075995922, \"match_probability\": 0.02833121820332325, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1630.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 401.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8025603151157066, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19743968488429345, \"precision\": 1.0, \"recall\": 0.8025603151157066, \"specificity\": 1.0, \"npv\": 0.740620957309185, \"accuracy\": 0.8737405541561712, \"f1\": 0.8904670854957661, \"f2\": 
0.8355546442485134, \"f0_5\": 0.9531049000116946, \"p4\": 0.8702783517473123, \"phi\": 0.7709688637547925}, {\"truth_threshold\": -5.000000074505806, \"match_probability\": 0.030303028785498974, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1629.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 402.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.8020679468242246, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.19793205317577547, \"precision\": 1.0, \"recall\": 0.8020679468242246, \"specificity\": 1.0, \"npv\": 0.7401422107304461, \"accuracy\": 0.8734256926952141, \"f1\": 0.8901639344262295, \"f2\": 0.8351276530298369, \"f0_5\": 0.952965952965953, \"p4\": 0.8699682648069582, \"phi\": 0.7704831882127677}, {\"truth_threshold\": -4.800000071525574, \"match_probability\": 0.03465289308554322, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1614.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 417.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.794682422451994, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.20531757754800592, \"precision\": 1.0, \"recall\": 0.794682422451994, \"specificity\": 1.0, \"npv\": 0.7330345710627401, \"accuracy\": 0.8687027707808564, \"f1\": 0.8855967078189301, \"f2\": 0.8287122612446087, \"f0_5\": 0.9508660303994344, \"p4\": 0.8653220445289462, \"phi\": 0.7632363255723595}, {\"truth_threshold\": -4.700000070035458, \"match_probability\": 0.037047907242669466, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1613.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 418.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.794190054160512, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.20580994583948795, \"precision\": 1.0, \"recall\": 0.794190054160512, \"specificity\": 1.0, \"npv\": 0.7325655790147153, \"accuracy\": 0.8683879093198993, \"f1\": 0.8852908891328211, \"f2\": 0.8282838656670433, 
\"f0_5\": 0.9507249793705057, \"p4\": 0.865012627066886, \"phi\": 0.762755725559516}, {\"truth_threshold\": -4.6000000685453415, \"match_probability\": 0.039601660807737325, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1610.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 421.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.792712949286066, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.20728705071393402, \"precision\": 1.0, \"recall\": 0.792712949286066, \"specificity\": 1.0, \"npv\": 0.731162196679438, \"accuracy\": 0.8674433249370277, \"f1\": 0.8843724251579237, \"f2\": 0.8269981508115882, \"f0_5\": 0.9503010270334081, \"p4\": 0.86408461556039, \"phi\": 0.7613157960637859}, {\"truth_threshold\": -4.500000067055225, \"match_probability\": 0.04232371044088178, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1609.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 422.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7922205809945839, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.20777941900541605, \"precision\": 1.0, \"recall\": 0.7922205809945839, \"specificity\": 1.0, \"npv\": 0.7306955966815571, \"accuracy\": 0.8671284634760705, \"f1\": 0.884065934065934, \"f2\": 0.8265694030617486, \"f0_5\": 0.9501594425416322, \"p4\": 0.8637753580651635, \"phi\": 0.7608364411180943}, {\"truth_threshold\": -4.400000065565109, \"match_probability\": 0.04522405175894309, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1604.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 427.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7897587395371738, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2102412604628262, \"precision\": 1.0, \"recall\": 0.7897587395371738, \"specificity\": 1.0, \"npv\": 0.7283715012722646, \"accuracy\": 0.8655541561712846, \"f1\": 0.8825309491059147, \"f2\": 0.8244243421052632, \"f0_5\": 
0.9494495087013141, \"p4\": 0.8622296597604054, \"phi\": 0.7584443016857485}, {\"truth_threshold\": -4.300000064074993, \"match_probability\": 0.048313119674570026, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1600.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 431.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7877892663712457, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2122107336287543, \"precision\": 1.0, \"recall\": 0.7877892663712457, \"specificity\": 1.0, \"npv\": 0.7265228426395939, \"accuracy\": 0.864294710327456, \"f1\": 0.8812999173781327, \"f2\": 0.8227067050596463, \"f0_5\": 0.9488791365199858, \"p4\": 0.8609937969203728, \"phi\": 0.7565361175813073}, {\"truth_threshold\": -4.200000062584877, \"match_probability\": 0.05160178526561565, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1598.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 433.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7868045297882816, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.21319547021171836, \"precision\": 1.0, \"recall\": 0.7868045297882816, \"specificity\": 1.0, \"npv\": 0.7256020278833967, \"accuracy\": 0.8636649874055415, \"f1\": 0.8806833838523009, \"f2\": 0.8218473565110059, \"f0_5\": 0.9485931378368753, \"p4\": 0.860376093318109, \"phi\": 0.7555838552816091}, {\"truth_threshold\": -4.000000059604645, \"match_probability\": 0.05882352712444066, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1597.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 434.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7863121614967996, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2136878385032004, \"precision\": 1.0, \"recall\": 0.7863121614967996, \"specificity\": 1.0, \"npv\": 0.7251424952501583, \"accuracy\": 0.8633501259445844, \"f1\": 0.880374862183021, \"f2\": 0.8214175496348113, \"f0_5\": 0.9484499346715762, 
\"p4\": 0.8600672978149376, \"phi\": 0.7551081795566347}, {\"truth_threshold\": -3.9000000581145287, \"match_probability\": 0.06278043839004852, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1594.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 437.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7848350566223535, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.21516494337764647, \"precision\": 1.0, \"recall\": 0.7848350566223535, \"specificity\": 1.0, \"npv\": 0.7237673830594185, \"accuracy\": 0.8624055415617129, \"f1\": 0.879448275862069, \"f2\": 0.8201275982712493, \"f0_5\": 0.9480195075532295, \"p4\": 0.8591411342420673, \"phi\": 0.7536829672115799}, {\"truth_threshold\": -3.8000000566244125, \"match_probability\": 0.06698457743861425, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1582.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 449.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7789266371245692, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.22107336287543083, \"precision\": 1.0, \"recall\": 0.7789266371245692, \"specificity\": 1.0, \"npv\": 0.7183186951066499, \"accuracy\": 0.8586272040302267, \"f1\": 0.8757265430390258, \"f2\": 0.8149598186688646, \"f0_5\": 0.946285440842206, \"p4\": 0.8554397334681781, \"phi\": 0.7480090678348302}, {\"truth_threshold\": -3.6000000536441803, \"match_probability\": 0.0761862214703254, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1576.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 455.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.775972427375677, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.224027572624323, \"precision\": 1.0, \"recall\": 0.775972427375677, \"specificity\": 1.0, \"npv\": 0.715625, \"accuracy\": 0.8567380352644837, \"f1\": 0.8738563903520932, \"f2\": 0.8123711340206186, \"f0_5\": 0.9454109178164367, \"p4\": 0.8535909135793125, 
\"phi\": 0.7451880758175877}, {\"truth_threshold\": -3.500000052154064, \"match_probability\": 0.08121030044424019, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1572.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 459.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7740029542097489, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2259970457902511, \"precision\": 1.0, \"recall\": 0.7740029542097489, \"specificity\": 1.0, \"npv\": 0.7138403990024937, \"accuracy\": 0.8554785894206549, \"f1\": 0.8726061615320566, \"f2\": 0.8106435643564357, \"f0_5\": 0.9448250991705733, \"p4\": 0.8523590355378086, \"phi\": 0.743313243298003}, {\"truth_threshold\": -3.400000050663948, \"match_probability\": 0.08653465658300358, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1569.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 462.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7725258493353028, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2274741506646972, \"precision\": 1.0, \"recall\": 0.7725258493353028, \"specificity\": 1.0, \"npv\": 0.7125077784691972, \"accuracy\": 0.8545340050377834, \"f1\": 0.8716666666666667, \"f2\": 0.8093469514082328, \"f0_5\": 0.9443842542434092, \"p4\": 0.8514354692858483, \"phi\": 0.7419101540752266}, {\"truth_threshold\": -3.300000049173832, \"match_probability\": 0.09217307161544283, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1568.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 463.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7720334810438207, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.22796651895617923, \"precision\": 1.0, \"recall\": 0.7720334810438207, \"specificity\": 1.0, \"npv\": 0.7120646766169154, \"accuracy\": 0.8542191435768262, \"f1\": 0.8713531536537927, \"f2\": 0.8089145687164672, \"f0_5\": 0.9442370227628568, \"p4\": 0.8511276780405328, \"phi\": 
0.741443032887153}, {\"truth_threshold\": -3.200000047683716, \"match_probability\": 0.09813940308831819, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1564.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 467.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7700640078778926, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.22993599212210733, \"precision\": 1.0, \"recall\": 0.7700640078778926, \"specificity\": 1.0, \"npv\": 0.7102977667493796, \"accuracy\": 0.8529596977329975, \"f1\": 0.8700973574408901, \"f2\": 0.8071841453344344, \"f0_5\": 0.9436466755158682, \"p4\": 0.8498968287858544, \"phi\": 0.7395774097751661}, {\"truth_threshold\": -3.1000000461935997, \"match_probability\": 0.10444750015659417, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1562.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 469.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7690792712949286, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.23092072870507138, \"precision\": 1.0, \"recall\": 0.7690792712949286, \"specificity\": 1.0, \"npv\": 0.7094175960346965, \"accuracy\": 0.8523299748110831, \"f1\": 0.8694684107987753, \"f2\": 0.8063183976873839, \"f0_5\": 0.9433506462133108, \"p4\": 0.8492815908935231, \"phi\": 0.7386463076480951}, {\"truth_threshold\": -3.0000000447034836, \"match_probability\": 0.11111110805075623, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1561.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 470.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7685869030034466, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.23141309699655344, \"precision\": 1.0, \"recall\": 0.7685869030034466, \"specificity\": 1.0, \"npv\": 0.7089783281733746, \"accuracy\": 0.8520151133501259, \"f1\": 0.8691536748329621, \"f2\": 0.8058853897780073, \"f0_5\": 0.943202416918429, \"p4\": 0.8489740179546857, \"phi\": 
0.7381811820598891}, {\"truth_threshold\": -2.8000000417232513, \"match_probability\": 0.1255586621587546, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1560.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 471.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7680945347119645, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.23190546528803546, \"precision\": 1.0, \"recall\": 0.7680945347119645, \"specificity\": 1.0, \"npv\": 0.7085396039603961, \"accuracy\": 0.8517002518891688, \"f1\": 0.8688387635756056, \"f2\": 0.80545229244114, \"f0_5\": 0.9430540442509975, \"p4\": 0.8486664754292597, \"phi\": 0.7377163394076073}, {\"truth_threshold\": -2.7000000402331352, \"match_probability\": 0.13336855415354743, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1552.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 479.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7641555883801083, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.23584441161989167, \"precision\": 1.0, \"recall\": 0.7641555883801083, \"specificity\": 1.0, \"npv\": 0.7050492610837439, \"accuracy\": 0.8491813602015114, \"f1\": 0.8663131454088753, \"f2\": 0.801984291029351, \"f0_5\": 0.9418618764413157, \"p4\": 0.8462072068136004, \"phi\": 0.7340077199460567}, {\"truth_threshold\": -2.500000037252903, \"match_probability\": 0.15022110152606716, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1543.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 488.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7597242737567701, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24027572624322993, \"precision\": 1.0, \"recall\": 0.7597242737567701, \"specificity\": 1.0, \"npv\": 0.7011635027556644, \"accuracy\": 0.8463476070528967, \"f1\": 0.863458310016788, \"f2\": 0.7980759284162615, \"f0_5\": 0.9405095696696331, \"p4\": 0.8434427172572686, \"phi\": 0.7298567893195214}, 
{\"truth_threshold\": -2.400000035762787, \"match_probability\": 0.1592855907727143, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1542.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 489.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7592319054652881, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24076809453471196, \"precision\": 1.0, \"recall\": 0.7592319054652881, \"specificity\": 1.0, \"npv\": 0.700734394124847, \"accuracy\": 0.8460327455919395, \"f1\": 0.8631402183039463, \"f2\": 0.79764121663563, \"f0_5\": 0.9403585803146726, \"p4\": 0.8431356888593929, \"phi\": 0.7293969490452176}, {\"truth_threshold\": -2.3000000342726707, \"match_probability\": 0.16878839957195682, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1537.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 494.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7567700640078779, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24322993599212211, \"precision\": 1.0, \"recall\": 0.7567700640078779, \"specificity\": 1.0, \"npv\": 0.6985967053081147, \"accuracy\": 0.8444584382871536, \"f1\": 0.8615470852017937, \"f2\": 0.7954663078356278, \"f0_5\": 0.9396014182662917, \"p4\": 0.8416009434305138, \"phi\": 0.7271018315144822}, {\"truth_threshold\": -2.1000000312924385, \"match_probability\": 0.18913982061899084, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1535.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 496.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7557853274249139, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24421467257508617, \"precision\": 1.0, \"recall\": 0.7557853274249139, \"specificity\": 1.0, \"npv\": 0.6977452772699574, \"accuracy\": 0.8438287153652393, \"f1\": 0.8609085810431857, \"f2\": 0.7945957138420127, \"f0_5\": 0.9392975156039652, \"p4\": 0.840987226631003, \"phi\": 0.7261856806910075}, {\"truth_threshold\": 
-1.9000000283122063, \"match_probability\": 0.2113212378007128, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1534.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 497.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7552929591334318, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2447070408665682, \"precision\": 1.0, \"recall\": 0.7552929591334318, \"specificity\": 1.0, \"npv\": 0.6973203410475031, \"accuracy\": 0.8435138539042821, \"f1\": 0.8605890603085554, \"f2\": 0.7941602816318079, \"f0_5\": 0.9391453410064895, \"p4\": 0.8406804063043731, \"phi\": 0.7257280095557167}, {\"truth_threshold\": -1.8000000268220901, \"match_probability\": 0.22310460998179016, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1531.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 500.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7538158542589857, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.24618414574101427, \"precision\": 1.0, \"recall\": 0.7538158542589857, \"specificity\": 1.0, \"npv\": 0.6960486322188449, \"accuracy\": 0.8425692695214105, \"f1\": 0.8596294216732173, \"f2\": 0.7928534438114966, \"f0_5\": 0.9386879215205396, \"p4\": 0.839760095233761, \"phi\": 0.7243566071361862}, {\"truth_threshold\": -1.700000025331974, \"match_probability\": 0.2353489599091234, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1521.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 510.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7488921713441654, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2511078286558346, \"precision\": 1.0, \"recall\": 0.7488921713441654, \"specificity\": 1.0, \"npv\": 0.6918429003021148, \"accuracy\": 0.8394206549118388, \"f1\": 0.856418918918919, \"f2\": 0.7884914463452566, \"f0_5\": 0.9371534195933456, \"p4\": 0.83669395520452, \"phi\": 0.7198025644829947}, {\"truth_threshold\": -1.600000023841858, 
\"match_probability\": 0.24805074388621665, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1497.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 534.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7370753323485968, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.26292466765140327, \"precision\": 1.0, \"recall\": 0.7370753323485968, \"specificity\": 1.0, \"npv\": 0.6819535437760572, \"accuracy\": 0.8318639798488665, \"f1\": 0.8486394557823129, \"f2\": 0.777985656376676, \"f0_5\": 0.9334081556303778, \"p4\": 0.8293440205305666, \"phi\": 0.7089789382802854}, {\"truth_threshold\": -1.5000000223517418, \"match_probability\": 0.2612038719739489, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1495.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 536.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7360905957656327, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2639094042343673, \"precision\": 1.0, \"recall\": 0.7360905957656327, \"specificity\": 1.0, \"npv\": 0.6811421772754312, \"accuracy\": 0.8312342569269522, \"f1\": 0.8479863868406126, \"f2\": 0.7771078074643933, \"f0_5\": 0.9330919985020597, \"p4\": 0.8287320234225032, \"phi\": 0.708083576332464}, {\"truth_threshold\": -1.4000000208616257, \"match_probability\": 0.2747995717943022, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1493.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 538.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7351058591826687, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.26489414081733137, \"precision\": 1.0, \"recall\": 0.7351058591826687, \"specificity\": 1.0, \"npv\": 0.6803327391562686, \"accuracy\": 0.8306045340050378, \"f1\": 0.8473325766174802, \"f2\": 0.776229593428304, \"f0_5\": 0.9327752092965138, \"p4\": 0.8281200951801486, \"phi\": 0.7071892128331477}, {\"truth_threshold\": -1.3000000193715096, \"match_probability\": 
0.2888262766358852, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1480.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 551.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7287050713934022, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.27129492860659776, \"precision\": 1.0, \"recall\": 0.7287050713934022, \"specificity\": 1.0, \"npv\": 0.6751179245283019, \"accuracy\": 0.8265113350125944, \"f1\": 0.8430646539447451, \"f2\": 0.770512286547272, \"f0_5\": 0.9307005408124764, \"p4\": 0.8241441255231906, \"phi\": 0.7013999254293956}, {\"truth_threshold\": -1.2000000178813934, \"match_probability\": 0.3032695424040186, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1476.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 555.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7267355982274741, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.27326440177252587, \"precision\": 1.0, \"recall\": 0.7267355982274741, \"specificity\": 1.0, \"npv\": 0.6735294117647059, \"accuracy\": 0.8252518891687658, \"f1\": 0.8417450812660393, \"f2\": 0.76875, \"f0_5\": 0.9300567107750473, \"p4\": 0.8229212506550728, \"phi\": 0.6996269005567342}, {\"truth_threshold\": -1.1000000163912773, \"match_probability\": 0.318111997717226, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1472.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 559.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.724766125061546, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.27523387493845397, \"precision\": 1.0, \"recall\": 0.724766125061546, \"specificity\": 1.0, \"npv\": 0.6719483568075117, \"accuracy\": 0.823992443324937, \"f1\": 0.840422495004282, \"f2\": 0.7669862442684452, \"f0_5\": 0.9294102790756409, \"p4\": 0.8216985877421169, \"phi\": 0.6978577267644555}, {\"truth_threshold\": -0.9000000134110451, \"match_probability\": 0.34891031813411577, 
\"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1470.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 561.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.723781388478582, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.276218611521418, \"precision\": 1.0, \"recall\": 0.723781388478582, \"specificity\": 1.0, \"npv\": 0.6711606096131302, \"accuracy\": 0.8233627204030227, \"f1\": 0.8397600685518424, \"f2\": 0.7661038148843027, \"f0_5\": 0.9290860826697004, \"p4\": 0.8210873315393467, \"phi\": 0.6969745748002024}, {\"truth_threshold\": -0.800000011920929, \"match_probability\": 0.36481689239780585, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1468.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 563.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7227966518956179, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2772033481043821, \"precision\": 1.0, \"recall\": 0.7227966518956179, \"specificity\": 1.0, \"npv\": 0.6703747072599532, \"accuracy\": 0.8227329974811083, \"f1\": 0.8390968848242355, \"f2\": 0.7652210175145955, \"f0_5\": 0.9287612299126914, \"p4\": 0.8204761232422636, \"phi\": 0.6960923745617381}, {\"truth_threshold\": -0.6000000089406967, \"match_probability\": 0.3975010577814427, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1467.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 564.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7223042836041359, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2776957163958641, \"precision\": 1.0, \"recall\": 0.7223042836041359, \"specificity\": 1.0, \"npv\": 0.6699824458747806, \"accuracy\": 0.8224181360201511, \"f1\": 0.8387650085763293, \"f2\": 0.7647794807632156, \"f0_5\": 0.9285985567793391, \"p4\": 0.8201705365264865, \"phi\": 0.6956516301964154}, {\"truth_threshold\": -0.5000000074505806, \"match_probability\": 0.41421356112001384, 
\"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1454.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 577.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7159034958148696, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2840965041851305, \"precision\": 1.0, \"recall\": 0.7159034958148696, \"specificity\": 1.0, \"npv\": 0.664924506387921, \"accuracy\": 0.8183249370277078, \"f1\": 0.8344332855093257, \"f2\": 0.7590311129672166, \"f0_5\": 0.926468714158277, \"p4\": 0.8161988630144861, \"phi\": 0.6899433154804018}, {\"truth_threshold\": -0.4000000059604645, \"match_probability\": 0.4311259267559445, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1453.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 578.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7154111275233875, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2845888724766125, \"precision\": 1.0, \"recall\": 0.7154111275233875, \"specificity\": 1.0, \"npv\": 0.6645385954730122, \"accuracy\": 0.8180100755667506, \"f1\": 0.8340987370838117, \"f2\": 0.7585882844314503, \"f0_5\": 0.9263037103149305, \"p4\": 0.8158934155135413, \"phi\": 0.6895058417955253}, {\"truth_threshold\": -0.30000000447034836, \"match_probability\": 0.4482004805735527, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1444.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 587.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7109798129000492, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.28902018709995075, \"precision\": 1.0, \"recall\": 0.7109798129000492, \"specificity\": 1.0, \"npv\": 0.6610854503464203, \"accuracy\": 0.815176322418136, \"f1\": 0.8310791366906475, \"f2\": 0.7545986622073578, \"f0_5\": 0.9248110669911618, \"p4\": 0.8131447366204683, \"phi\": 0.6855788866339472}, {\"truth_threshold\": -0.20000000298023224, \"match_probability\": 0.4653980381052749, 
\"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1438.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 593.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.708025603151157, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2919743968488429, \"precision\": 1.0, \"recall\": 0.708025603151157, \"specificity\": 1.0, \"npv\": 0.6588032220943614, \"accuracy\": 0.8132871536523929, \"f1\": 0.829057365234938, \"f2\": 0.7519347416858397, \"f0_5\": 0.9238083001413336, \"p4\": 0.8113125802330422, \"phi\": 0.6829711184825358}, {\"truth_threshold\": -0.10000000149011612, \"match_probability\": 0.48267825490990723, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1427.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 604.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7026095519448547, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.29739044805514525, \"precision\": 1.0, \"recall\": 0.7026095519448547, \"specificity\": 1.0, \"npv\": 0.6546598056032018, \"accuracy\": 0.809823677581864, \"f1\": 0.8253325621746674, \"f2\": 0.7470421945346037, \"f0_5\": 0.9219537407933842, \"p4\": 0.8079540638890523, \"phi\": 0.6782110532062798}, {\"truth_threshold\": -0.0, \"match_probability\": 0.5, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1425.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 606.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7016248153618907, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.2983751846381093, \"precision\": 1.0, \"recall\": 0.7016248153618907, \"specificity\": 1.0, \"npv\": 0.653912050256996, \"accuracy\": 0.8091939546599496, \"f1\": 0.8246527777777778, \"f2\": 0.7461514294690543, \"f0_5\": 0.9216142801707412, \"p4\": 0.8073434670308465, \"phi\": 0.6773484491194175}, {\"truth_threshold\": 0.10000000149011612, \"match_probability\": 0.5173217450900928, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, 
\"n\": 1145.0, \"tp\": 1423.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 608.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.7006400787789266, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.29935992122107336, \"precision\": 1.0, \"recall\": 0.7006400787789266, \"specificity\": 1.0, \"npv\": 0.6531660011409013, \"accuracy\": 0.8085642317380353, \"f1\": 0.8239722061378112, \"f2\": 0.74526029119095, \"f0_5\": 0.9212741162760585, \"p4\": 0.8067328787708493, \"phi\": 0.6764867171608602}, {\"truth_threshold\": 0.30000000447034836, \"match_probability\": 0.5517995194264473, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1417.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 614.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6976858690300345, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3023141309699655, \"precision\": 1.0, \"recall\": 0.6976858690300345, \"specificity\": 1.0, \"npv\": 0.6509380329732802, \"accuracy\": 0.8066750629722922, \"f1\": 0.8219257540603249, \"f2\": 0.7425846347343046, \"f0_5\": 0.9202493830367581, \"p4\": 0.8049011475731893, \"phi\": 0.6739067199692137}, {\"truth_threshold\": 0.4000000059604645, \"match_probability\": 0.5688740732440556, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1416.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 615.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6971935007385525, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.30280649926144754, \"precision\": 1.0, \"recall\": 0.6971935007385525, \"specificity\": 1.0, \"npv\": 0.6505681818181818, \"accuracy\": 0.806360201511335, \"f1\": 0.8215839860748477, \"f2\": 0.7421383647798742, \"f0_5\": 0.9200779727095516, \"p4\": 0.8045958615658608, \"phi\": 0.6734774741228791}, {\"truth_threshold\": 0.5000000074505806, \"match_probability\": 0.5857864388799862, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 
1415.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 616.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6967011324470704, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3032988675529296, \"precision\": 1.0, \"recall\": 0.6967011324470704, \"specificity\": 1.0, \"npv\": 0.6501987507098239, \"accuracy\": 0.8060453400503779, \"f1\": 0.8212420197330238, \"f2\": 0.7416920012579935, \"f0_5\": 0.9199063840852945, \"p4\": 0.8042905756758165, \"phi\": 0.6730484424877639}, {\"truth_threshold\": 0.6000000089406967, \"match_probability\": 0.6024989422185573, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1413.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 618.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6957163958641064, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.30428360413589367, \"precision\": 1.0, \"recall\": 0.6957163958641064, \"specificity\": 1.0, \"npv\": 0.6494611457742484, \"accuracy\": 0.8054156171284634, \"f1\": 0.8205574912891986, \"f2\": 0.7407989933941491, \"f0_5\": 0.9195626708317064, \"p4\": 0.8036800033381418, \"phi\": 0.6721910201660187}, {\"truth_threshold\": 0.7000000104308128, \"match_probability\": 0.6189757403752982, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1409.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 622.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6937469226981783, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3062530773018218, \"precision\": 1.0, \"recall\": 0.6937469226981783, \"specificity\": 1.0, \"npv\": 0.6479909451046972, \"accuracy\": 0.8041561712846348, \"f1\": 0.8191860465116279, \"f2\": 0.7390118535613134, \"f0_5\": 0.9188730924742402, \"p4\": 0.8024588500496406, \"phi\": 0.6704787275541767}, {\"truth_threshold\": 0.800000011920929, \"match_probability\": 0.6351831076021942, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1401.0, \"tn\": 1145.0, 
\"fp\": 0.0, \"fn\": 630.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.689807976366322, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.310192023633678, \"precision\": 1.0, \"recall\": 0.689807976366322, \"specificity\": 1.0, \"npv\": 0.6450704225352113, \"accuracy\": 0.8016372795969773, \"f1\": 0.8164335664335665, \"f2\": 0.7354330708661417, \"f0_5\": 0.9174852652259332, \"p4\": 0.8000164577037014, \"phi\": 0.6670642568619476}, {\"truth_threshold\": 0.9000000134110451, \"match_probability\": 0.6510896818658842, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1395.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 636.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6868537666174298, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.31314623338257014, \"precision\": 1.0, \"recall\": 0.6868537666174298, \"specificity\": 1.0, \"npv\": 0.6428972487366648, \"accuracy\": 0.7997481108312342, \"f1\": 0.8143607705779334, \"f2\": 0.7327450362433029, \"f0_5\": 0.9164367363027197, \"p4\": 0.7981845302790667, \"phi\": 0.6645121495072613}, {\"truth_threshold\": 1.0000000149011612, \"match_probability\": 0.6666666689619328, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1388.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 643.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6834071885770556, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3165928114229444, \"precision\": 1.0, \"recall\": 0.6834071885770556, \"specificity\": 1.0, \"npv\": 0.6403803131991052, \"accuracy\": 0.797544080604534, \"f1\": 0.8119333138344546, \"f2\": 0.7296047098402019, \"f0_5\": 0.9152050639588554, \"p4\": 0.7960470632785187, \"phi\": 0.6615440344100268}, {\"truth_threshold\": 1.1000000163912773, \"match_probability\": 0.681888002282774, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1385.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 646.0, 
\"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6819300837026095, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.31806991629739045, \"precision\": 1.0, \"recall\": 0.6819300837026095, \"specificity\": 1.0, \"npv\": 0.6393076493579006, \"accuracy\": 0.7965994962216625, \"f1\": 0.8108899297423887, \"f2\": 0.7282574403196971, \"f0_5\": 0.9146744155329547, \"p4\": 0.7951309152535574, \"phi\": 0.6602750327237519}, {\"truth_threshold\": 1.2000000178813934, \"match_probability\": 0.6967304575959814, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1381.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 650.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6799606105366814, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.32003938946331856, \"precision\": 1.0, \"recall\": 0.6799606105366814, \"specificity\": 1.0, \"npv\": 0.637883008356546, \"accuracy\": 0.7953400503778337, \"f1\": 0.809495896834701, \"f2\": 0.7264597580220936, \"f0_5\": 0.913964262078094, \"p4\": 0.7939092844375716, \"phi\": 0.6585858484761816}, {\"truth_threshold\": 1.3000000193715096, \"match_probability\": 0.7111737233641148, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1370.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 661.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6745445593303792, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3254554406696209, \"precision\": 1.0, \"recall\": 0.6745445593303792, \"specificity\": 1.0, \"npv\": 0.6339977851605758, \"accuracy\": 0.7918765743073047, \"f1\": 0.8056453984122317, \"f2\": 0.7215083210448704, \"f0_5\": 0.9119957395819465, \"p4\": 0.7905490918185237, \"phi\": 0.6539569990508374}, {\"truth_threshold\": 1.4000000208616257, \"match_probability\": 0.7252004282056979, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1368.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 663.0, \"P_rate\": 
0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6735598227474151, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.32644017725258495, \"precision\": 1.0, \"recall\": 0.6735598227474151, \"specificity\": 1.0, \"npv\": 0.6332964601769911, \"accuracy\": 0.7912468513853904, \"f1\": 0.8049426301853486, \"f2\": 0.7206068268015171, \"f0_5\": 0.9116353458616553, \"p4\": 0.789938018490438, \"phi\": 0.6531179460582748}, {\"truth_threshold\": 1.5000000223517418, \"match_probability\": 0.7387961280260511, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1360.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 671.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6696208764155588, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.33037912358444116, \"precision\": 1.0, \"recall\": 0.6696208764155588, \"specificity\": 1.0, \"npv\": 0.6305066079295154, \"accuracy\": 0.788727959697733, \"f1\": 0.8021232674727219, \"f2\": 0.7169970476592156, \"f0_5\": 0.9101860527372507, \"p4\": 0.7874932598198026, \"phi\": 0.6497694878859451}, {\"truth_threshold\": 1.600000023841858, \"match_probability\": 0.7519492561137834, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1347.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 684.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6632200886262924, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.33677991137370755, \"precision\": 1.0, \"recall\": 0.6632200886262924, \"specificity\": 1.0, \"npv\": 0.6260251503553854, \"accuracy\": 0.7846347607052897, \"f1\": 0.7975133214920072, \"f2\": 0.7111181501425404, \"f0_5\": 0.9078042862919531, \"p4\": 0.7835186824943725, \"phi\": 0.6443542936156993}, {\"truth_threshold\": 1.700000025331974, \"match_probability\": 0.7646510400908766, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1341.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 690.0, \"P_rate\": 0.6394836272040302, 
\"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6602658788774003, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3397341211225997, \"precision\": 1.0, \"recall\": 0.6602658788774003, \"specificity\": 1.0, \"npv\": 0.6239782016348774, \"accuracy\": 0.7827455919395466, \"f1\": 0.7953736654804271, \"f2\": 0.7083993660855784, \"f0_5\": 0.9066937119675457, \"p4\": 0.7816833648970615, \"phi\": 0.6418656523781219}, {\"truth_threshold\": 1.8000000268220901, \"match_probability\": 0.7768953900182098, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1333.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 698.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6563269325455441, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3436730674544559, \"precision\": 1.0, \"recall\": 0.6563269325455441, \"specificity\": 1.0, \"npv\": 0.6212696690179056, \"accuracy\": 0.7802267002518891, \"f1\": 0.7925089179548157, \"f2\": 0.7047689542138099, \"f0_5\": 0.9052016840961564, \"p4\": 0.7792352667284765, \"phi\": 0.6385577625791636}, {\"truth_threshold\": 1.9000000283122063, \"match_probability\": 0.7886787621992872, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1324.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 707.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6518956179222059, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3481043820777942, \"precision\": 1.0, \"recall\": 0.6518956179222059, \"specificity\": 1.0, \"npv\": 0.6182505399568035, \"accuracy\": 0.7773929471032746, \"f1\": 0.7892697466467958, \"f2\": 0.7006773920406435, \"f0_5\": 0.903507574723625, \"p4\": 0.7764796300097458, \"phi\": 0.6348502325555829}, {\"truth_threshold\": 2.0000000298023224, \"match_probability\": 0.8000000033051833, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1318.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 713.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 
0.36051636934280396, \"tp_rate\": 0.6489414081733137, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.35105859182668636, \"precision\": 1.0, \"recall\": 0.6489414081733137, \"specificity\": 1.0, \"npv\": 0.616254036598493, \"accuracy\": 0.7755037783375315, \"f1\": 0.7871006270528516, \"f2\": 0.6979453505613218, \"f0_5\": 0.9023688894974667, \"p4\": 0.7746415522808361, \"phi\": 0.6323865608176021}, {\"truth_threshold\": 2.1000000312924385, \"match_probability\": 0.8108601793810092, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1306.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 725.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6430329886755293, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.35696701132447073, \"precision\": 1.0, \"recall\": 0.6430329886755293, \"specificity\": 1.0, \"npv\": 0.6122994652406417, \"accuracy\": 0.7717254408060453, \"f1\": 0.7827389871141744, \"f2\": 0.6924708377518558, \"f0_5\": 0.9000689179875948, \"p4\": 0.7709627754494935, \"phi\": 0.627478091329186}, {\"truth_threshold\": 2.2000000327825546, \"match_probability\": 0.8212623941099038, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1303.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 728.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6415558838010832, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3584441161989168, \"precision\": 1.0, \"recall\": 0.6415558838010832, \"specificity\": 1.0, \"npv\": 0.611318739989322, \"accuracy\": 0.7707808564231738, \"f1\": 0.7816436712657469, \"f2\": 0.6911000318234858, \"f0_5\": 0.8994891619494685, \"p4\": 0.7700424935392581, \"phi\": 0.6262548478998099}, {\"truth_threshold\": 2.3000000342726707, \"match_probability\": 0.8312116004280432, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1288.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 743.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, 
\"tp_rate\": 0.6341703594288528, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3658296405711472, \"precision\": 1.0, \"recall\": 0.6341703594288528, \"specificity\": 1.0, \"npv\": 0.6064618644067796, \"accuracy\": 0.7660579345088161, \"f1\": 0.7761373907803555, \"f2\": 0.6842328941776455, \"f0_5\": 0.8965613253515244, \"p4\": 0.7654371961027976, \"phi\": 0.6201613810378227}, {\"truth_threshold\": 2.400000035762787, \"match_probability\": 0.8407144092272857, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1284.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 747.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6322008862629247, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.36779911373707536, \"precision\": 1.0, \"recall\": 0.6322008862629247, \"specificity\": 1.0, \"npv\": 0.6051797040169133, \"accuracy\": 0.7647984886649875, \"f1\": 0.7746606334841629, \"f2\": 0.6823979591836735, \"f0_5\": 0.895772289660946, \"p4\": 0.7642079467115986, \"phi\": 0.6185427594175095}, {\"truth_threshold\": 2.500000037252903, \"match_probability\": 0.8497788984739328, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1278.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 753.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6292466765140325, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3707533234859675, \"precision\": 1.0, \"recall\": 0.6292466765140325, \"specificity\": 1.0, \"npv\": 0.6032665964172813, \"accuracy\": 0.7629093198992444, \"f1\": 0.772438803263826, \"f2\": 0.6796426292278238, \"f0_5\": 0.8945821083578328, \"p4\": 0.7623630801644073, \"phi\": 0.6161197130813998}, {\"truth_threshold\": 2.600000038743019, \"match_probability\": 0.8584144256340188, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1272.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 759.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.6262924667651403, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.37370753323485967, \"precision\": 1.0, \"recall\": 0.6262924667651403, \"specificity\": 1.0, \"npv\": 0.6013655462184874, \"accuracy\": 0.7610201511335013, \"f1\": 0.7702089009990918, \"f2\": 0.6768837803320562, \"f0_5\": 0.8933839022334598, \"p4\": 0.7605169691954441, \"phi\": 0.6137024615957984}, {\"truth_threshold\": 2.7000000402331352, \"match_probability\": 0.8666314458464526, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1264.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 767.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6223535204332841, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3776464795667159, \"precision\": 1.0, \"recall\": 0.6223535204332841, \"specificity\": 1.0, \"npv\": 0.5988493723849372, \"accuracy\": 0.7585012594458438, \"f1\": 0.7672230652503793, \"f2\": 0.6731998295696634, \"f0_5\": 0.8917736701001834, \"p4\": 0.7580534470944266, \"phi\": 0.6104883415045929}, {\"truth_threshold\": 2.8000000417232513, \"match_probability\": 0.8744413378412453, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1261.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 770.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.620876415558838, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.379123584441162, \"precision\": 1.0, \"recall\": 0.620876415558838, \"specificity\": 1.0, \"npv\": 0.597911227154047, \"accuracy\": 0.7575566750629723, \"f1\": 0.7660996354799514, \"f2\": 0.6718167288225892, \"f0_5\": 0.8911660777385159, \"p4\": 0.7571289984268484, \"phi\": 0.6092856305032893}, {\"truth_threshold\": 2.9000000432133675, \"match_probability\": 0.8818562391739494, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1253.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 778.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6169374692269818, 
\"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3830625307730182, \"precision\": 1.0, \"recall\": 0.6169374692269818, \"specificity\": 1.0, \"npv\": 0.5954238169526781, \"accuracy\": 0.7550377833753149, \"f1\": 0.7630937880633374, \"f2\": 0.6681241335181828, \"f0_5\": 0.8895357092148233, \"p4\": 0.7546620475767662, \"phi\": 0.6060851942988336}, {\"truth_threshold\": 3.0000000447034836, \"match_probability\": 0.8888888919492438, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1233.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 798.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6070901033973413, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3929098966026588, \"precision\": 1.0, \"recall\": 0.6070901033973413, \"specificity\": 1.0, \"npv\": 0.5892949047864128, \"accuracy\": 0.7487405541561712, \"f1\": 0.7555147058823529, \"f2\": 0.6588650208400129, \"f0_5\": 0.8853942266264541, \"p4\": 0.7484826603385862, \"phi\": 0.5981263283607483}, {\"truth_threshold\": 3.1000000461935997, \"match_probability\": 0.8955524998434058, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1222.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 809.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.6016740521910389, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.3983259478089611, \"precision\": 1.0, \"recall\": 0.6016740521910389, \"specificity\": 1.0, \"npv\": 0.5859774820880246, \"accuracy\": 0.7452770780856424, \"f1\": 0.7513064863203197, \"f2\": 0.6537556173764177, \"f0_5\": 0.8830755889579419, \"p4\": 0.745076023777957, \"phi\": 0.5937739015320593}, {\"truth_threshold\": 3.200000047683716, \"match_probability\": 0.9018605969116819, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1216.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 815.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5987198424421467, \"tn_rate\": 1.0, 
\"fp_rate\": 0.0, \"fn_rate\": 0.4012801575578533, \"precision\": 1.0, \"recall\": 0.5987198424421467, \"specificity\": 1.0, \"npv\": 0.5841836734693877, \"accuracy\": 0.7433879093198993, \"f1\": 0.7489990760702187, \"f2\": 0.6509635974304069, \"f0_5\": 0.8817984046410442, \"p4\": 0.7432152820546354, \"phi\": 0.5914070991600171}, {\"truth_threshold\": 3.300000049173832, \"match_probability\": 0.9078269283845571, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1206.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 825.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5937961595273265, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.40620384047267355, \"precision\": 1.0, \"recall\": 0.5937961595273265, \"specificity\": 1.0, \"npv\": 0.5812182741116751, \"accuracy\": 0.7402392947103275, \"f1\": 0.7451343836886005, \"f2\": 0.6463022508038585, \"f0_5\": 0.8796498905908097, \"p4\": 0.7401097807801229, \"phi\": 0.5874735560130462}, {\"truth_threshold\": 3.400000050663948, \"match_probability\": 0.9134653434169965, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1194.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 837.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5878877400295421, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4121122599704579, \"precision\": 1.0, \"recall\": 0.5878877400295421, \"specificity\": 1.0, \"npv\": 0.577699293642785, \"accuracy\": 0.7364609571788413, \"f1\": 0.7404651162790697, \"f2\": 0.6406954282034771, \"f0_5\": 0.8770383428823271, \"p4\": 0.7363757518418079, \"phi\": 0.5827712519988607}, {\"truth_threshold\": 3.500000052154064, \"match_probability\": 0.9187896995557598, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1185.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 846.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5834564254062038, \"tn_rate\": 1.0, \"fp_rate\": 0.0, 
\"fn_rate\": 0.41654357459379615, \"precision\": 1.0, \"recall\": 0.5834564254062038, \"specificity\": 1.0, \"npv\": 0.5750878955298845, \"accuracy\": 0.7336272040302267, \"f1\": 0.7369402985074627, \"f2\": 0.6364808250080567, \"f0_5\": 0.8750553832521045, \"p4\": 0.7335695980796055, \"phi\": 0.5792570481403251}, {\"truth_threshold\": 3.6000000536441803, \"match_probability\": 0.9238137785296746, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1172.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 859.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5770556376169375, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.42294436238306254, \"precision\": 1.0, \"recall\": 0.5770556376169375, \"specificity\": 1.0, \"npv\": 0.5713572854291418, \"accuracy\": 0.7295340050377834, \"f1\": 0.731813924445832, \"f2\": 0.6303786574870912, \"f0_5\": 0.8721535942848638, \"p4\": 0.7295072086227041, \"phi\": 0.5741993927638691}, {\"truth_threshold\": 3.7000000551342964, \"match_probability\": 0.9285512128432143, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1160.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 871.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5711472181191531, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.42885278188084686, \"precision\": 1.0, \"recall\": 0.5711472181191531, \"specificity\": 1.0, \"npv\": 0.5679563492063492, \"accuracy\": 0.7257556675062973, \"f1\": 0.7270448135380758, \"f2\": 0.6247307195174494, \"f0_5\": 0.8694348673362314, \"p4\": 0.7257472383888707, \"phi\": 0.5695495490844643}, {\"truth_threshold\": 3.8000000566244125, \"match_probability\": 0.9330154225613858, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1152.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 879.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5672082717872969, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 
0.4327917282127031, \"precision\": 1.0, \"recall\": 0.5672082717872969, \"specificity\": 1.0, \"npv\": 0.5657114624505929, \"accuracy\": 0.7232367758186398, \"f1\": 0.7238454288407163, \"f2\": 0.6209573091849935, \"f0_5\": 0.8676005422503389, \"p4\": 0.7232349168764273, \"phi\": 0.5664593727239978}, {\"truth_threshold\": 3.9000000581145287, \"match_probability\": 0.9372195616099515, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1145.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 886.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5637616937469226, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4362383062530773, \"precision\": 1.0, \"recall\": 0.5637616937469226, \"specificity\": 1.0, \"npv\": 0.5637616937469226, \"accuracy\": 0.7210327455919395, \"f1\": 0.7210327455919395, \"f2\": 0.6176502319559823, \"f0_5\": 0.8659809408561489, \"p4\": 0.7210327455919395, \"phi\": 0.5637616937469226}, {\"truth_threshold\": 4.000000059604645, \"match_probability\": 0.9411764728755594, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1132.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 899.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5573609059576563, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4426390940423437, \"precision\": 1.0, \"recall\": 0.5573609059576563, \"specificity\": 1.0, \"npv\": 0.5601761252446184, \"accuracy\": 0.7169395465994962, \"f1\": 0.7157761618716408, \"f2\": 0.611495246326707, \"f0_5\": 0.8629364232352493, \"p4\": 0.7169329315586919, \"phi\": 0.5587667426236015}, {\"truth_threshold\": 4.100000061094761, \"match_probability\": 0.9448986513716398, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1122.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 909.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.552437223042836, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.44756277695716395, 
\"precision\": 1.0, \"recall\": 0.552437223042836, \"specificity\": 1.0, \"npv\": 0.5574488802336903, \"accuracy\": 0.7137909319899244, \"f1\": 0.7117031398667936, \"f2\": 0.6067488643737833, \"f0_5\": 0.8605614358030372, \"p4\": 0.7137699020051912, \"phi\": 0.5549373941127399}, {\"truth_threshold\": 4.200000062584877, \"match_probability\": 0.9483982147343843, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1114.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 917.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5484982767109798, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.45150172328902016, \"precision\": 1.0, \"recall\": 0.5484982767109798, \"specificity\": 1.0, \"npv\": 0.555286129970902, \"accuracy\": 0.711272040302267, \"f1\": 0.7084260731319555, \"f2\": 0.6029443602511366, \"f0_5\": 0.8586403576383537, \"p4\": 0.7112333614409344, \"phi\": 0.5518817675648915}, {\"truth_threshold\": 4.300000064074993, \"match_probability\": 0.9516868803254299, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1109.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 922.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5460364352535697, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.45396356474643035, \"precision\": 1.0, \"recall\": 0.5460364352535697, \"specificity\": 1.0, \"npv\": 0.5539429124334785, \"accuracy\": 0.7096977329974811, \"f1\": 0.7063694267515923, \"f2\": 0.6005631972273368, \"f0_5\": 0.8574300293799288, \"p4\": 0.7096451676361052, \"phi\": 0.5499754660338557}, {\"truth_threshold\": 4.400000065565109, \"match_probability\": 0.9547759482410569, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1097.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 934.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5401280157557853, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.45987198424421466, \"precision\": 1.0, 
\"recall\": 0.5401280157557853, \"specificity\": 1.0, \"npv\": 0.5507455507455508, \"accuracy\": 0.7059193954659949, \"f1\": 0.7014066496163683, \"f2\": 0.5948378700791671, \"f0_5\": 0.8544944695435426, \"p4\": 0.7058242094166463, \"phi\": 0.5454109473695239}, {\"truth_threshold\": 4.500000067055225, \"match_probability\": 0.9576762895591182, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1090.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 941.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5366814377154111, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4633185622845889, \"precision\": 1.0, \"recall\": 0.5366814377154111, \"specificity\": 1.0, \"npv\": 0.5488974113135187, \"accuracy\": 0.7037153652392947, \"f1\": 0.6984940724126882, \"f2\": 0.5914912090297374, \"f0_5\": 0.8527616961351902, \"p4\": 0.7035890482972262, \"phi\": 0.5427550569658532}, {\"truth_threshold\": 4.6000000685453415, \"match_probability\": 0.9603983391922627, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1084.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 947.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.533727227966519, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.46627277203348105, \"precision\": 1.0, \"recall\": 0.533727227966519, \"specificity\": 1.0, \"npv\": 0.5473231357552581, \"accuracy\": 0.7018261964735516, \"f1\": 0.6959871589085073, \"f2\": 0.5886185925282363, \"f0_5\": 0.8512643317103816, \"p4\": 0.701669388939399, \"phi\": 0.5404824326919393}, {\"truth_threshold\": 4.700000070035458, \"match_probability\": 0.9629520927573305, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1077.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 954.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5302806499261448, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.46971935007385524, \"precision\": 1.0, \"recall\": 
0.5302806499261448, \"specificity\": 1.0, \"npv\": 0.5454978561219629, \"accuracy\": 0.6996221662468514, \"f1\": 0.693050193050193, \"f2\": 0.5852624714704924, \"f0_5\": 0.8495030761949834, \"p4\": 0.6994252208594803, \"phi\": 0.53783543735763}, {\"truth_threshold\": 4.800000071525574, \"match_probability\": 0.9653471069144568, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1065.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 966.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5243722304283605, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4756277695716396, \"precision\": 1.0, \"recall\": 0.5243722304283605, \"specificity\": 1.0, \"npv\": 0.5423969682614874, \"accuracy\": 0.6958438287153652, \"f1\": 0.687984496124031, \"f2\": 0.5794972249428665, \"f0_5\": 0.8464473056747734, \"p4\": 0.6955662411444608, \"phi\": 0.5333084548597151}, {\"truth_threshold\": 4.90000007301569, \"match_probability\": 0.9675925026740654, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1061.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 970.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5224027572624323, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.4775972427375677, \"precision\": 1.0, \"recall\": 0.5224027572624323, \"specificity\": 1.0, \"npv\": 0.541371158392435, \"accuracy\": 0.6945843828715366, \"f1\": 0.6862871927554981, \"f2\": 0.577572128470332, \"f0_5\": 0.8454183266932271, \"p4\": 0.6942764887415704, \"phi\": 0.5318023936074047}, {\"truth_threshold\": 5.000000074505806, \"match_probability\": 0.969696971214501, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1050.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 981.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.51698670605613, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.48301329394387, \"precision\": 1.0, \"recall\": 0.51698670605613, \"specificity\": 1.0, 
\"npv\": 0.5385700846660395, \"accuracy\": 0.6911209068010076, \"f1\": 0.6815968841285297, \"f2\": 0.5722694571615435, \"f0_5\": 0.8425613866153105, \"p4\": 0.6907205167261657, \"phi\": 0.5276680529005587}, {\"truth_threshold\": 5.100000075995922, \"match_probability\": 0.9716687817966767, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1035.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 996.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5096011816838996, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.49039881831610044, \"precision\": 1.0, \"recall\": 0.5096011816838996, \"specificity\": 1.0, \"npv\": 0.5347968239140588, \"accuracy\": 0.6863979848866498, \"f1\": 0.675146771037182, \"f2\": 0.5650180150671471, \"f0_5\": 0.8385999027710258, \"p4\": 0.6858489665170626, \"phi\": 0.5220470222378447}, {\"truth_threshold\": 5.200000077486038, \"match_probability\": 0.9735157914041783, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1025.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1006.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5046774987690793, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.49532250123092075, \"precision\": 1.0, \"recall\": 0.5046774987690793, \"specificity\": 1.0, \"npv\": 0.5323105532310554, \"accuracy\": 0.6832493702770781, \"f1\": 0.6708115183246073, \"f2\": 0.5601705104382992, \"f0_5\": 0.8359158375468928, \"p4\": 0.6825861647803277, \"phi\": 0.518309905918297}, {\"truth_threshold\": 5.300000078976154, \"match_probability\": 0.9752454557772836, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1017.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1014.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.5007385524372231, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.49926144756277696, \"precision\": 1.0, \"recall\": 0.5007385524372231, \"specificity\": 1.0, \"npv\": 
0.5303381194997684, \"accuracy\": 0.6807304785894207, \"f1\": 0.6673228346456693, \"f2\": 0.5562848703642927, \"f0_5\": 0.8337432365961633, \"p4\": 0.6799668560937839, \"phi\": 0.5153258602676495}, {\"truth_threshold\": 5.4000000804662704, \"match_probability\": 0.9768648415470134, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1008.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1023.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4963072378138848, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5036927621861153, \"precision\": 1.0, \"recall\": 0.4963072378138848, \"specificity\": 1.0, \"npv\": 0.5281365313653137, \"accuracy\": 0.677896725440806, \"f1\": 0.6633761105626851, \"f2\": 0.5519053876478318, \"f0_5\": 0.8312716476991588, \"p4\": 0.6770101709023063, \"phi\": 0.511974592211884}, {\"truth_threshold\": 5.500000081956387, \"match_probability\": 0.9783806392104205, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1001.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1030.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.49286065977351057, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5071393402264894, \"precision\": 1.0, \"recall\": 0.49286065977351057, \"specificity\": 1.0, \"npv\": 0.5264367816091954, \"accuracy\": 0.6756926952141058, \"f1\": 0.6602902374670184, \"f2\": 0.5484931506849315, \"f0_5\": 0.8293289146644574, \"p4\": 0.674703015175954, \"phi\": 0.5093721424586857}, {\"truth_threshold\": 5.600000083446503, \"match_probability\": 0.9797991767207457, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 998.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1033.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4913835548990645, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5086164451009355, \"precision\": 1.0, \"recall\": 0.4913835548990645, \"specificity\": 1.0, \"npv\": 0.5257116620752984, 
\"accuracy\": 0.6747481108312342, \"f1\": 0.6589633542423242, \"f2\": 0.5470291602718702, \"f0_5\": 0.8284907853229287, \"p4\": 0.6737121749549234, \"phi\": 0.508257872897662}, {\"truth_threshold\": 5.700000084936619, \"match_probability\": 0.9811264334957893, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 995.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1036.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4899064500246184, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5100935499753816, \"precision\": 1.0, \"recall\": 0.4899064500246184, \"specificity\": 1.0, \"npv\": 0.5249885373681797, \"accuracy\": 0.6738035264483627, \"f1\": 0.6576338400528751, \"f2\": 0.5455642066016011, \"f0_5\": 0.8276493095990684, \"p4\": 0.6727200795968197, \"phi\": 0.5071442306145872}, {\"truth_threshold\": 5.800000086426735, \"match_probability\": 0.9823680546749124, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 986.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1045.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4854751354012802, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5145248645987198, \"precision\": 1.0, \"recall\": 0.4854751354012802, \"specificity\": 1.0, \"npv\": 0.5228310502283106, \"accuracy\": 0.6709697732997482, \"f1\": 0.6536294332117998, \"f2\": 0.5411635565312843, \"f0_5\": 0.8251046025104602, \"p4\": 0.6697361249633867, \"phi\": 0.5038069817912239}, {\"truth_threshold\": 5.900000087916851, \"match_probability\": 0.9835293654795508, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 982.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1049.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.483505662235352, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5164943377646479, \"precision\": 1.0, \"recall\": 0.483505662235352, \"specificity\": 1.0, \"npv\": 0.5218778486782133, \"accuracy\": 
0.6697103274559194, \"f1\": 0.6518420179223365, \"f2\": 0.5392049198330771, \"f0_5\": 0.8239637523074341, \"p4\": 0.6684061590335132, \"phi\": 0.5023254869416048}, {\"truth_threshold\": 6.000000089406967, \"match_probability\": 0.9846153855541349, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 976.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1055.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.48055145248645986, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5194485475135401, \"precision\": 1.0, \"recall\": 0.48055145248645986, \"specificity\": 1.0, \"npv\": 0.5204545454545455, \"accuracy\": 0.6678211586901763, \"f1\": 0.6491519787163286, \"f2\": 0.5362637362637362, \"f0_5\": 0.8222409435551812, \"p4\": 0.6664067677092192, \"phi\": 0.5001051767092219}, {\"truth_threshold\": 6.100000090897083, \"match_probability\": 0.985630843183972, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 971.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1060.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.47808961102904973, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5219103889709503, \"precision\": 1.0, \"recall\": 0.47808961102904973, \"specificity\": 1.0, \"npv\": 0.5192743764172335, \"accuracy\": 0.6662468513853904, \"f1\": 0.6469020652898068, \"f2\": 0.5338097855964816, \"f0_5\": 0.8207945900253593, \"p4\": 0.6647364629140795, \"phi\": 0.49825664535324315}, {\"truth_threshold\": 6.200000092387199, \"match_probability\": 0.9865801893041345, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 960.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1071.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4726735598227474, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5273264401772526, \"precision\": 1.0, \"recall\": 0.4726735598227474, \"specificity\": 1.0, \"npv\": 0.5166967509025271, \"accuracy\": 0.6627833753148614, 
\"f1\": 0.641925777331996, \"f2\": 0.5284015852047557, \"f0_5\": 0.8175779253960143, \"p4\": 0.6610481781257823, \"phi\": 0.49419519685843255}, {\"truth_threshold\": 6.3000000938773155, \"match_probability\": 0.987467611228855, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 949.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1082.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4672575086164451, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5327424913835549, \"precision\": 1.0, \"recall\": 0.4672575086164451, \"specificity\": 1.0, \"npv\": 0.5141445891333633, \"accuracy\": 0.6593198992443325, \"f1\": 0.6369127516778523, \"f2\": 0.5229802711341343, \"f0_5\": 0.8143126823408272, \"p4\": 0.6573405717493672, \"phi\": 0.4901407142720151}, {\"truth_threshold\": 6.400000095367432, \"match_probability\": 0.9882970460445225, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 932.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1099.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4588872476612506, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5411127523387493, \"precision\": 1.0, \"recall\": 0.4588872476612506, \"specificity\": 1.0, \"npv\": 0.5102495543672014, \"accuracy\": 0.6539672544080605, \"f1\": 0.6290921363482956, \"f2\": 0.5145759717314488, \"f0_5\": 0.8091682583781906, \"p4\": 0.6515708689560343, \"phi\": 0.4838873976701033}, {\"truth_threshold\": 6.500000096857548, \"match_probability\": 0.9890721936212699, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 929.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1102.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.45741014278680453, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5425898572131955, \"precision\": 1.0, \"recall\": 0.45741014278680453, \"specificity\": 1.0, \"npv\": 0.5095683133066311, \"accuracy\": 0.6530226700251889, \"f1\": 
0.6277027027027027, \"f2\": 0.5130895835634596, \"f0_5\": 0.8082477814511919, \"p4\": 0.6505474921288833, \"phi\": 0.48278537151535283}, {\"truth_threshold\": 6.600000098347664, \"match_probability\": 0.9897965292084853, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 922.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1109.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.45396356474643035, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5460364352535697, \"precision\": 1.0, \"recall\": 0.45396356474643035, \"specificity\": 1.0, \"npv\": 0.5079858030168589, \"accuracy\": 0.6508186397984886, \"f1\": 0.6244497121571283, \"f2\": 0.5096175105018793, \"f0_5\": 0.8060849798915894, \"p4\": 0.6481533639812846, \"phi\": 0.48021562446271077}, {\"truth_threshold\": 6.70000009983778, \"match_probability\": 0.9904733155885336, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 916.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1115.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.45100935499753814, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5489906450024619, \"precision\": 1.0, \"recall\": 0.45100935499753814, \"specificity\": 1.0, \"npv\": 0.5066371681415929, \"accuracy\": 0.6489294710327456, \"f1\": 0.6216491347132678, \"f2\": 0.5066371681415929, \"f0_5\": 0.8042142230026339, \"p4\": 0.6460941632868983, \"phi\": 0.4780147512591208}, {\"truth_threshold\": 6.800000101327896, \"match_probability\": 0.9911056147706719, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 911.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1120.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.448547513540128, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.551452486459872, \"precision\": 1.0, \"recall\": 0.448547513540128, \"specificity\": 1.0, \"npv\": 0.5055187637969095, \"accuracy\": 0.6473551637279596, \"f1\": 0.619306594153637, \"f2\": 
0.5041505257332596, \"f0_5\": 0.8026431718061674, \"p4\": 0.6443730598755232, \"phi\": 0.47618188179411347}, {\"truth_threshold\": 6.900000102818012, \"match_probability\": 0.9916962992137202, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 908.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1123.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.44707040866568193, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5529295913343181, \"precision\": 1.0, \"recall\": 0.44707040866568193, \"specificity\": 1.0, \"npv\": 0.5048500881834215, \"accuracy\": 0.6464105793450882, \"f1\": 0.6178972439605308, \"f2\": 0.5026572187776793, \"f0_5\": 0.8016952145505916, \"p4\": 0.6433381357110801, \"phi\": 0.4750826614801553}, {\"truth_threshold\": 7.000000104308128, \"match_probability\": 0.9922480625716311, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 904.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1127.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4451009354997538, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5548990645002462, \"precision\": 1.0, \"recall\": 0.4451009354997538, \"specificity\": 1.0, \"npv\": 0.5039612676056338, \"accuracy\": 0.6451511335012594, \"f1\": 0.6160136286201022, \"f2\": 0.5006645990252547, \"f0_5\": 0.8004250044271295, \"p4\": 0.6419555618126742, \"phi\": 0.47361760067264114}, {\"truth_threshold\": 7.1000001057982445, \"match_probability\": 0.9927634299608046, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 895.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1136.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.44066962087641554, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5593303791235844, \"precision\": 1.0, \"recall\": 0.44066962087641554, \"specificity\": 1.0, \"npv\": 0.5019728189390618, \"accuracy\": 0.6423173803526449, \"f1\": 0.6117566643882434, \"f2\": 0.49617474221088814, 
\"f0_5\": 0.7975405453573338, \"p4\": 0.6388333899578278, \"phi\": 0.47032347571872485}, {\"truth_threshold\": 7.200000107288361, \"match_probability\": 0.9932447677519157, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 890.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1141.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4382077794190054, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5617922205809945, \"precision\": 1.0, \"recall\": 0.4382077794190054, \"specificity\": 1.0, \"npv\": 0.5008748906386702, \"accuracy\": 0.6407430730478589, \"f1\": 0.609380349195481, \"f2\": 0.4936765032172177, \"f0_5\": 0.7959220175281703, \"p4\": 0.637091902562231, \"phi\": 0.4684946889704395}, {\"truth_threshold\": 7.300000108778477, \"match_probability\": 0.9936942928922654, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 881.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1150.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.43377646479566717, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5662235352043329, \"precision\": 1.0, \"recall\": 0.43377646479566717, \"specificity\": 1.0, \"npv\": 0.4989106753812636, \"accuracy\": 0.6379093198992444, \"f1\": 0.6050824175824175, \"f2\": 0.48917268184342033, \"f0_5\": 0.7929792979297929, \"p4\": 0.6339443948668317, \"phi\": 0.46520501826152216}, {\"truth_threshold\": 7.400000110268593, \"match_probability\": 0.9941140817673122, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 871.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1160.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.42885278188084686, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5711472181191531, \"precision\": 1.0, \"recall\": 0.42885278188084686, \"specificity\": 1.0, \"npv\": 0.4967462039045553, \"accuracy\": 0.6347607052896725, \"f1\": 0.600275671950379, \"f2\": 0.4841578654808227, \"f0_5\": 
0.7896645512239348, \"p4\": 0.6304272931567985, \"phi\": 0.46155280459901765}, {\"truth_threshold\": 7.500000111758709, \"match_probability\": 0.9945060786121668, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 863.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1168.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.42491383554899065, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5750861644510094, \"precision\": 1.0, \"recall\": 0.42491383554899065, \"specificity\": 1.0, \"npv\": 0.4950281020319931, \"accuracy\": 0.6322418136020151, \"f1\": 0.5964063579820318, \"f2\": 0.48013797707800154, \"f0_5\": 0.7869779317891665, \"p4\": 0.6275980948521758, \"phi\": 0.458633066338387}, {\"truth_threshold\": 7.600000113248825, \"match_probability\": 0.9948721034855129, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 847.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1184.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4170359428852782, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5829640571147218, \"precision\": 1.0, \"recall\": 0.4170359428852782, \"specificity\": 1.0, \"npv\": 0.49162730785744957, \"accuracy\": 0.6272040302267002, \"f1\": 0.5886031966643502, \"f2\": 0.4720766915616988, \"f0_5\": 0.78150950359845, \"p4\": 0.621896736471326, \"phi\": 0.4527982529565263}, {\"truth_threshold\": 7.700000114738941, \"match_probability\": 0.9952138598197071, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 842.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1189.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.41457410142786805, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.585425898572132, \"precision\": 1.0, \"recall\": 0.41457410142786805, \"specificity\": 1.0, \"npv\": 0.49057412167952014, \"accuracy\": 0.6256297229219143, \"f1\": 0.5861468847894187, \"f2\": 0.4695516395271024, \"f0_5\": 0.7797740322281904, 
\"p4\": 0.6201029345529268, \"phi\": 0.45097597017918}, {\"truth_threshold\": 7.800000116229057, \"match_probability\": 0.9955329415617687, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 830.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1201.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.4086656819300837, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5913343180699163, \"precision\": 1.0, \"recall\": 0.4086656819300837, \"specificity\": 1.0, \"npv\": 0.4880647911338448, \"accuracy\": 0.6218513853904282, \"f1\": 0.5802167074449494, \"f2\": 0.4634800089345544, \"f0_5\": 0.7755559708465707, \"p4\": 0.6157733816591411, \"phi\": 0.4466042215371196}, {\"truth_threshold\": 7.900000117719173, \"match_probability\": 0.99583083992065, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 821.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1210.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.40423436730674545, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.5957656326932546, \"precision\": 1.0, \"recall\": 0.40423436730674545, \"specificity\": 1.0, \"npv\": 0.4861995753715499, \"accuracy\": 0.6190176322418136, \"f1\": 0.5757363253856943, \"f2\": 0.4589155953046395, \"f0_5\": 0.7723424270931326, \"p4\": 0.6125029320545232, \"phi\": 0.44332671669450147}, {\"truth_threshold\": 8.00000011920929, \"match_probability\": 0.9961089497366072, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 811.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1220.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3993106843919252, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6006893156080748, \"precision\": 1.0, \"recall\": 0.3993106843919252, \"specificity\": 1.0, \"npv\": 0.48414376321353064, \"accuracy\": 0.6158690176322418, \"f1\": 0.5707248416608023, \"f2\": 0.45383324006715164, \"f0_5\": 0.7687203791469195, \"p4\": 0.6088448866523514, 
\"phi\": 0.43968599867732555}, {\"truth_threshold\": 8.100000120699406, \"match_probability\": 0.9963685754887298, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 806.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1225.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.396848842934515, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6031511570654849, \"precision\": 1.0, \"recall\": 0.396848842934515, \"specificity\": 1.0, \"npv\": 0.4831223628691983, \"accuracy\": 0.614294710327456, \"f1\": 0.5682058512513218, \"f2\": 0.45128779395296753, \"f0_5\": 0.7668886774500476, \"p4\": 0.60700605364999, \"phi\": 0.4378659049302997}, {\"truth_threshold\": 8.200000122189522, \"match_probability\": 0.9966109369567457, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 799.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1232.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3934022648941408, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6065977351058592, \"precision\": 1.0, \"recall\": 0.3934022648941408, \"specificity\": 1.0, \"npv\": 0.48169962137147665, \"accuracy\": 0.6120906801007556, \"f1\": 0.5646643109540636, \"f2\": 0.4477193768911801, \"f0_5\": 0.7643007461258848, \"p4\": 0.604420434102115, \"phi\": 0.4353179551157854}, {\"truth_threshold\": 8.300000123679638, \"match_probability\": 0.9968371745531442, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 791.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1240.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3894633185622846, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6105366814377154, \"precision\": 1.0, \"recall\": 0.3894633185622846, \"specificity\": 1.0, \"npv\": 0.480083857442348, \"accuracy\": 0.6095717884130982, \"f1\": 0.5605953224663359, \"f2\": 0.44363432417274257, \"f0_5\": 0.7613089509143407, \"p4\": 0.6014490106949208, \"phi\": 0.43240611964642633}, 
{\"truth_threshold\": 8.400000125169754, \"match_probability\": 0.9970483543414643, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 784.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1247.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3860167405219104, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6139832594780896, \"precision\": 1.0, \"recall\": 0.3860167405219104, \"specificity\": 1.0, \"npv\": 0.4786789297658863, \"accuracy\": 0.607367758186398, \"f1\": 0.5570159857904086, \"f2\": 0.4400538841490795, \"f0_5\": 0.7586607315657055, \"p4\": 0.5988342917486007, \"phi\": 0.4298582094420716}, {\"truth_threshold\": 8.50000012665987, \"match_probability\": 0.997245472756309, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 776.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1255.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.38207779419005417, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6179222058099458, \"precision\": 1.0, \"recall\": 0.38207779419005417, \"specificity\": 1.0, \"npv\": 0.47708333333333336, \"accuracy\": 0.6048488664987406, \"f1\": 0.5529034556465978, \"f2\": 0.43595505617977526, \"f0_5\": 0.7555988315481986, \"p4\": 0.5958287894168168, \"phi\": 0.4269460711200401}, {\"truth_threshold\": 8.600000128149986, \"match_probability\": 0.9974294610402847, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 770.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1261.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.379123584441162, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.620876415558838, \"precision\": 1.0, \"recall\": 0.379123584441162, \"specificity\": 1.0, \"npv\": 0.47589359933499586, \"accuracy\": 0.6029596977329975, \"f1\": 0.5498036415565869, \"f2\": 0.43287609624465934, \"f0_5\": 0.7532772451575034, \"p4\": 0.5935623079687247, \"phi\": 0.42476168282048443}, {\"truth_threshold\": 
8.700000129640102, \"match_probability\": 0.997601189412643, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 763.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1268.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.37567700640078777, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6243229935992122, \"precision\": 1.0, \"recall\": 0.37567700640078777, \"specificity\": 1.0, \"npv\": 0.4745130542892665, \"accuracy\": 0.6007556675062973, \"f1\": 0.5461703650680029, \"f2\": 0.4292787217283673, \"f0_5\": 0.7505410190832186, \"p4\": 0.5909043870322787, \"phi\": 0.42221279437445536}, {\"truth_threshold\": 8.800000131130219, \"match_probability\": 0.997761470983937, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 760.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1271.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3741999015263417, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6258000984736583, \"precision\": 1.0, \"recall\": 0.3741999015263417, \"specificity\": 1.0, \"npv\": 0.47392384105960267, \"accuracy\": 0.5998110831234257, \"f1\": 0.5446076675026872, \"f2\": 0.42773525438991444, \"f0_5\": 0.7493591007690791, \"p4\": 0.5897606850495294, \"phi\": 0.4211202377652835}, {\"truth_threshold\": 8.900000132620335, \"match_probability\": 0.9979110654305032, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 752.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1279.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3702609551944855, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6297390448055146, \"precision\": 1.0, \"recall\": 0.3702609551944855, \"specificity\": 1.0, \"npv\": 0.47235973597359737, \"accuracy\": 0.5972921914357683, \"f1\": 0.5404240028745958, \"f2\": 0.423614240648941, \"f0_5\": 0.7461797975788848, \"p4\": 0.5866970744697614, \"phi\": 0.4182061298414923}, {\"truth_threshold\": 9.00000013411045, 
\"match_probability\": 0.9980506824420605, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 745.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1286.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.36681437715411125, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6331856228458888, \"precision\": 1.0, \"recall\": 0.36681437715411125, \"specificity\": 1.0, \"npv\": 0.4709995886466475, \"accuracy\": 0.595088161209068, \"f1\": 0.5367435158501441, \"f2\": 0.4200022550456647, \"f0_5\": 0.7433645978846538, \"p4\": 0.5839996987663111, \"phi\": 0.415655411066983}, {\"truth_threshold\": 9.100000135600567, \"match_probability\": 0.9981809849551747, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 738.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1293.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.36336779911373707, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6366322008862629, \"precision\": 1.0, \"recall\": 0.36336779911373707, \"specificity\": 1.0, \"npv\": 0.4696472518457752, \"accuracy\": 0.5928841309823678, \"f1\": 0.5330444203683641, \"f2\": 0.4163845633039946, \"f0_5\": 0.74051776038531, \"p4\": 0.5812863439396199, \"phi\": 0.41310372579173665}, {\"truth_threshold\": 9.200000137090683, \"match_probability\": 0.9983025921847976, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 726.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1305.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.35745937961595275, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6425406203840472, \"precision\": 1.0, \"recall\": 0.35745937961595275, \"specificity\": 1.0, \"npv\": 0.4673469387755102, \"accuracy\": 0.5891057934508817, \"f1\": 0.5266594124047879, \"f2\": 0.4101694915254237, \"f0_5\": 0.7355623100303952, \"p4\": 0.5765966357046058, \"phi\": 0.40872673854313535}, {\"truth_threshold\": 9.300000138580799, \"match_probability\": 
0.9984160824655384, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 709.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1322.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.34908911866075826, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6509108813392418, \"precision\": 1.0, \"recall\": 0.34908911866075826, \"specificity\": 1.0, \"npv\": 0.4641264693960276, \"accuracy\": 0.5837531486146096, \"f1\": 0.5175182481751824, \"f2\": 0.40133589946790443, \"f0_5\": 0.7283747688514486, \"p4\": 0.5698668774463831, \"phi\": 0.40251894383816106}, {\"truth_threshold\": 9.400000140070915, \"match_probability\": 0.9985219959137808, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 704.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1327.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3466272772033481, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6533727227966519, \"precision\": 1.0, \"recall\": 0.3466272772033481, \"specificity\": 1.0, \"npv\": 0.46318770226537215, \"accuracy\": 0.5821788413098237, \"f1\": 0.5148080438756856, \"f2\": 0.3987313094698686, \"f0_5\": 0.7262224056117186, \"p4\": 0.5678676238912578, \"phi\": 0.4006912677739821}, {\"truth_threshold\": 9.500000141561031, \"match_probability\": 0.9986208369212233, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 700.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1331.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.34465780403741997, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.65534219596258, \"precision\": 1.0, \"recall\": 0.34465780403741997, \"specificity\": 1.0, \"npv\": 0.4624394184168013, \"accuracy\": 0.5809193954659949, \"f1\": 0.5126327352618089, \"f2\": 0.39664551223934724, \"f0_5\": 0.724487683709377, \"p4\": 0.5662615170898467, \"phi\": 0.39922844895106907}, {\"truth_threshold\": 9.600000143051147, \"match_probability\": 0.9987130764898899, 
\"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 694.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1337.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3417035942885278, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6582964057114722, \"precision\": 1.0, \"recall\": 0.3417035942885278, \"specificity\": 1.0, \"npv\": 0.4613215149073328, \"accuracy\": 0.5790302267002518, \"f1\": 0.5093577981651376, \"f2\": 0.3935132683148106, \"f0_5\": 0.7218639484085708, \"p4\": 0.5638409987258001, \"phi\": 0.39703302100261667}, {\"truth_threshold\": 9.700000144541264, \"match_probability\": 0.9987991544181472, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 677.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1354.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3333333333333333, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6666666666666666, \"precision\": 1.0, \"recall\": 0.3333333333333333, \"specificity\": 1.0, \"npv\": 0.45818327330932374, \"accuracy\": 0.5736775818639799, \"f1\": 0.5, \"f2\": 0.38461538461538464, \"f0_5\": 0.7142857142857143, \"p4\": 0.5569066147859922, \"phi\": 0.39080398893790036}, {\"truth_threshold\": 9.80000014603138, \"match_probability\": 0.9988794813467569, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 655.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1376.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3225012309207287, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6774987690792713, \"precision\": 1.0, \"recall\": 0.3225012309207287, \"specificity\": 1.0, \"npv\": 0.45418484728282427, \"accuracy\": 0.5667506297229219, \"f1\": 0.48771407297096053, \"f2\": 0.3730493222462695, \"f0_5\": 0.7041496452375833, \"p4\": 0.5477568608833787, \"phi\": 0.38272074978272863}, {\"truth_threshold\": 9.900000147521496, \"match_probability\": 0.9989544406735176, \"total_clerical_labels\": 3176.0, 
\"p\": 2031.0, \"n\": 1145.0, \"tp\": 651.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1380.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3205317577548006, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6794682422451994, \"precision\": 1.0, \"recall\": 0.3205317577548006, \"specificity\": 1.0, \"npv\": 0.4534653465346535, \"accuracy\": 0.5654911838790933, \"f1\": 0.4854586129753915, \"f2\": 0.37094017094017095, \"f0_5\": 0.7022653721682848, \"p4\": 0.5460709222975572, \"phi\": 0.3812480093136779}, {\"truth_threshold\": 10.000000149011612, \"match_probability\": 0.9990243903445719, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 645.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1386.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3175775480059084, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6824224519940916, \"precision\": 1.0, \"recall\": 0.3175775480059084, \"specificity\": 1.0, \"npv\": 0.4523903595416831, \"accuracy\": 0.5636020151133502, \"f1\": 0.4820627802690583, \"f2\": 0.3677728361272665, \"f0_5\": 0.6994144437215355, \"p4\": 0.5435286584827342, \"phi\": 0.37903696538036896}, {\"truth_threshold\": 10.100000150501728, \"match_probability\": 0.9990896645300149, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 637.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1394.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.31363860167405216, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6863613983259478, \"precision\": 1.0, \"recall\": 0.31363860167405216, \"specificity\": 1.0, \"npv\": 0.4509649468294604, \"accuracy\": 0.5610831234256927, \"f1\": 0.4775112443778111, \"f2\": 0.36354297454628465, \"f0_5\": 0.6955667176239354, \"p4\": 0.5401135374024728, \"phi\": 0.3760851171312224}, {\"truth_threshold\": 10.200000151991844, \"match_probability\": 0.9991505751910027, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, 
\"n\": 1145.0, \"tp\": 630.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1401.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.310192023633678, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.689807976366322, \"precision\": 1.0, \"recall\": 0.310192023633678, \"specificity\": 1.0, \"npv\": 0.4497250589159466, \"accuracy\": 0.5588790931989924, \"f1\": 0.47350620067643745, \"f2\": 0.3598355037697053, \"f0_5\": 0.6921555702043507, \"p4\": 0.5371008948889416, \"phi\": 0.3734984954506678}, {\"truth_threshold\": 10.30000015348196, \"match_probability\": 0.9992074135451509, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 623.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1408.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3067454455933038, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6932545544066963, \"precision\": 1.0, \"recall\": 0.3067454455933038, \"specificity\": 1.0, \"npv\": 0.44849197023110066, \"accuracy\": 0.5566750629722922, \"f1\": 0.4694800301431801, \"f2\": 0.3561220990053733, \"f0_5\": 0.6887021888127349, \"p4\": 0.5340649007128252, \"phi\": 0.3709081682216741}, {\"truth_threshold\": 10.400000154972076, \"match_probability\": 0.9992604514366183, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 616.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1415.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.3032988675529296, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.6967011324470704, \"precision\": 1.0, \"recall\": 0.3032988675529296, \"specificity\": 1.0, \"npv\": 0.447265625, \"accuracy\": 0.554471032745592, \"f1\": 0.4654325651681148, \"f2\": 0.3524027459954233, \"f0_5\": 0.6852057842046718, \"p4\": 0.5310049980284561, \"phi\": 0.36831393885903}, {\"truth_threshold\": 10.500000156462193, \"match_probability\": 0.9993099426168967, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 608.0, \"tn\": 
1145.0, \"fp\": 0.0, \"fn\": 1423.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.29935992122107336, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7006400787789266, \"precision\": 1.0, \"recall\": 0.29935992122107336, \"specificity\": 1.0, \"npv\": 0.44587227414330216, \"accuracy\": 0.5519521410579346, \"f1\": 0.4607805987116332, \"f2\": 0.34814475492441593, \"f0_5\": 0.6811561729778176, \"p4\": 0.5274779562426515, \"phi\": 0.36534406914879536}, {\"truth_threshold\": 10.600000157952309, \"match_probability\": 0.9993561239419685, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 601.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1430.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2959133431806992, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7040866568193008, \"precision\": 1.0, \"recall\": 0.2959133431806992, \"specificity\": 1.0, \"npv\": 0.4446601941747573, \"accuracy\": 0.5497481108312342, \"f1\": 0.4566869300911854, \"f2\": 0.34441260744985674, \"f0_5\": 0.677564825253664, \"p4\": 0.5243648749561851, \"phi\": 0.36274079538650084}, {\"truth_threshold\": 10.700000159442425, \"match_probability\": 0.9993992164911604, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 582.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1449.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2865583456425406, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7134416543574594, \"precision\": 1.0, \"recall\": 0.2865583456425406, \"specificity\": 1.0, \"npv\": 0.44140323824209715, \"accuracy\": 0.5437657430730478, \"f1\": 0.4454649827784156, \"f2\": 0.33425223983459684, \"f0_5\": 0.6675843083275981, \"p4\": 0.5157834304021511, \"phi\": 0.3556512079438443}, {\"truth_threshold\": 10.800000160932541, \"match_probability\": 0.9994394266126935, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 572.0, \"tn\": 1145.0, \"fp\": 
0.0, \"fn\": 1459.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2816346627277203, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7183653372722797, \"precision\": 1.0, \"recall\": 0.2816346627277203, \"specificity\": 1.0, \"npv\": 0.4397081413210445, \"accuracy\": 0.5406171284634761, \"f1\": 0.4394928928159816, \"f2\": 0.32888684452621897, \"f0_5\": 0.6621903218337578, \"p4\": 0.5111863478106634, \"phi\": 0.35190489351468984}, {\"truth_threshold\": 10.900000162422657, \"match_probability\": 0.9994769469006325, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 567.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1464.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2791728212703102, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7208271787296898, \"precision\": 1.0, \"recall\": 0.2791728212703102, \"specificity\": 1.0, \"npv\": 0.43886546569566887, \"accuracy\": 0.5390428211586902, \"f1\": 0.43648960739030024, \"f2\": 0.3261995167414567, \"f0_5\": 0.6594556873691556, \"p4\": 0.5088661529526017, \"phi\": 0.35002758493634245}, {\"truth_threshold\": 11.000000163912773, \"match_probability\": 0.9995119571076428, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 559.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1472.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.27523387493845397, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.724766125061546, \"precision\": 1.0, \"recall\": 0.27523387493845397, \"specificity\": 1.0, \"npv\": 0.4375238823079862, \"accuracy\": 0.5365239294710328, \"f1\": 0.43166023166023165, \"f2\": 0.3218933548312795, \"f0_5\": 0.6550269510194516, \"p4\": 0.5051230066125968, \"phi\": 0.34701785761793746}, {\"truth_threshold\": 11.10000016540289, \"match_probability\": 0.9995446249976983, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 546.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 
1485.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2688330871491876, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7311669128508124, \"precision\": 1.0, \"recall\": 0.2688330871491876, \"specificity\": 1.0, \"npv\": 0.435361216730038, \"accuracy\": 0.5324307304785895, \"f1\": 0.42374854481955765, \"f2\": 0.314878892733564, \"f0_5\": 0.6476868327402135, \"p4\": 0.4989569646924318, \"phi\": 0.34211036219115415}, {\"truth_threshold\": 11.200000166893005, \"match_probability\": 0.9995751071426191, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 539.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1492.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2653865091088134, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7346134908911867, \"precision\": 1.0, \"recall\": 0.2653865091088134, \"specificity\": 1.0, \"npv\": 0.4342055365946151, \"accuracy\": 0.5302267002518891, \"f1\": 0.41945525291828795, \"f2\": 0.31109315479625993, \"f0_5\": 0.6436589443515643, \"p4\": 0.4955925822721991, \"phi\": 0.33945882164492946}, {\"truth_threshold\": 11.300000168383121, \"match_probability\": 0.9996035496660847, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 532.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1499.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2619399310684392, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7380600689315608, \"precision\": 1.0, \"recall\": 0.2619399310684392, \"specificity\": 1.0, \"npv\": 0.43305597579425115, \"accuracy\": 0.5280226700251889, \"f1\": 0.4151385095591104, \"f2\": 0.30730129390018485, \"f0_5\": 0.6395768213512864, \"p4\": 0.49219626037015723, \"phi\": 0.33680061230395913}, {\"truth_threshold\": 11.400000169873238, \"match_probability\": 0.99963008893853, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 528.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1503.0, \"P_rate\": 
0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.25997045790251105, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.740029542097489, \"precision\": 1.0, \"recall\": 0.25997045790251105, \"specificity\": 1.0, \"npv\": 0.4324018126888218, \"accuracy\": 0.5267632241813602, \"f1\": 0.4126611957796014, \"f2\": 0.30513176144244103, \"f0_5\": 0.6372194062273715, \"p4\": 0.4902408570943766, \"phi\": 0.33527853680572633}, {\"truth_threshold\": 11.500000171363354, \"match_probability\": 0.999654852226126, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 524.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1507.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.25800098473658295, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.741999015263417, \"precision\": 1.0, \"recall\": 0.25800098473658295, \"specificity\": 1.0, \"npv\": 0.43174962292609353, \"accuracy\": 0.5255037783375315, \"f1\": 0.4101761252446184, \"f2\": 0.30296022201665124, \"f0_5\": 0.6348437121395687, \"p4\": 0.4882746099115319, \"phi\": 0.3337541428575539}, {\"truth_threshold\": 11.60000017285347, \"match_probability\": 0.9996779582968373, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 520.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1511.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.25603151157065485, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7439684884293452, \"precision\": 1.0, \"recall\": 0.25603151157065485, \"specificity\": 1.0, \"npv\": 0.4310993975903614, \"accuracy\": 0.5242443324937027, \"f1\": 0.40768326146609174, \"f2\": 0.3007866728366497, \"f0_5\": 0.6324495256628557, \"p4\": 0.48629736299660126, \"phi\": 0.33222737756280557}, {\"truth_threshold\": 11.700000174343586, \"match_probability\": 0.9996995179863626, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 510.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1521.0, \"P_rate\": 
0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2511078286558346, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7488921713441654, \"precision\": 1.0, \"recall\": 0.2511078286558346, \"specificity\": 1.0, \"npv\": 0.42948237059264815, \"accuracy\": 0.521095717884131, \"f1\": 0.4014167650531287, \"f2\": 0.2953439888811675, \"f0_5\": 0.6263817243920413, \"p4\": 0.4813050748446082, \"phi\": 0.3283997343565922}, {\"truth_threshold\": 11.800000175833702, \"match_probability\": 0.9997196347265854, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 501.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1530.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2466765140324963, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7533234859675036, \"precision\": 1.0, \"recall\": 0.2466765140324963, \"specificity\": 1.0, \"npv\": 0.4280373831775701, \"accuracy\": 0.5182619647355163, \"f1\": 0.3957345971563981, \"f2\": 0.29043478260869565, \"f0_5\": 0.620817843866171, \"p4\": 0.47675027425949934, \"phi\": 0.32494117861212185}, {\"truth_threshold\": 11.900000177323818, \"match_probability\": 0.9997384050389891, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 495.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1536.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.24372230428360414, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7562776957163959, \"precision\": 1.0, \"recall\": 0.24372230428360414, \"specificity\": 1.0, \"npv\": 0.42707944796717645, \"accuracy\": 0.5163727959697733, \"f1\": 0.3919239904988123, \"f2\": 0.28715628263139575, \"f0_5\": 0.6170531039640987, \"p4\": 0.4736802517268613, \"phi\": 0.3226279392283468}, {\"truth_threshold\": 12.000000178813934, \"match_probability\": 0.9997559189953416, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 486.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1545.0, \"P_rate\": 0.6394836272040302, 
\"N_rate\": 0.36051636934280396, \"tp_rate\": 0.23929098966026588, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7607090103397341, \"precision\": 1.0, \"recall\": 0.23929098966026588, \"specificity\": 1.0, \"npv\": 0.4256505576208178, \"accuracy\": 0.5135390428211587, \"f1\": 0.38617401668653156, \"f2\": 0.28222996515679444, \"f0_5\": 0.6113207547169811, \"p4\": 0.46902349984986647, \"phi\": 0.319146272393286}, {\"truth_threshold\": 12.10000018030405, \"match_probability\": 0.9997722606477963, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 484.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1547.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.23830625307730183, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7616937469226982, \"precision\": 1.0, \"recall\": 0.23830625307730183, \"specificity\": 1.0, \"npv\": 0.425334323922734, \"accuracy\": 0.5129093198992444, \"f1\": 0.3848906560636183, \"f2\": 0.28113382899628253, \"f0_5\": 0.6100327703554324, \"p4\": 0.46798006157343497, \"phi\": 0.31837058444396854}, {\"truth_threshold\": 12.200000181794167, \"match_probability\": 0.9997875084304283, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 479.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1552.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.23584441161989167, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7641555883801083, \"precision\": 1.0, \"recall\": 0.23584441161989167, \"specificity\": 1.0, \"npv\": 0.42454579162031886, \"accuracy\": 0.5113350125944585, \"f1\": 0.3816733067729084, \"f2\": 0.2783912588631873, \"f0_5\": 0.6067899670635926, \"p4\": 0.46535750619184146, \"phi\": 0.3164281157394128}, {\"truth_threshold\": 12.300000183284283, \"match_probability\": 0.9998017355340825, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 461.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1570.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 
0.36051636934280396, \"tp_rate\": 0.22698178237321517, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7730182176267848, \"precision\": 1.0, \"recall\": 0.22698178237321517, \"specificity\": 1.0, \"npv\": 0.42173112338858193, \"accuracy\": 0.5056675062972292, \"f1\": 0.36998394863563405, \"f2\": 0.2684915550378567, \"f0_5\": 0.5948387096774194, \"p4\": 0.4557459851493697, \"phi\": 0.3093950259280176}, {\"truth_threshold\": 12.400000184774399, \"match_probability\": 0.9998150102562988, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 457.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1574.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.22501230920728704, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7749876907927129, \"precision\": 1.0, \"recall\": 0.22501230920728704, \"specificity\": 1.0, \"npv\": 0.42111070246414123, \"accuracy\": 0.5044080604534005, \"f1\": 0.36736334405144694, \"f2\": 0.2662859806549353, \"f0_5\": 0.592122311479658, \"p4\": 0.45357265449199585, \"phi\": 0.30782314986589165}, {\"truth_threshold\": 12.500000186264515, \"match_probability\": 0.9998273963279586, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 445.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1586.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.2191038897095027, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7808961102904973, \"precision\": 1.0, \"recall\": 0.2191038897095027, \"specificity\": 1.0, \"npv\": 0.4192603441962651, \"accuracy\": 0.5006297229219143, \"f1\": 0.3594507269789984, \"f2\": 0.25965690278912357, \"f0_5\": 0.583836263447914, \"p4\": 0.44696743745394574, \"phi\": 0.3030867404132794}, {\"truth_threshold\": 12.600000187754631, \"match_probability\": 0.9998389532181915, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 436.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1595.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 
0.36051636934280396, \"tp_rate\": 0.21467257508616444, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7853274249138356, \"precision\": 1.0, \"recall\": 0.21467257508616444, \"specificity\": 1.0, \"npv\": 0.41788321167883213, \"accuracy\": 0.49779596977329976, \"f1\": 0.3534657478719092, \"f2\": 0.2546728971962617, \"f0_5\": 0.5774834437086093, \"p4\": 0.4419269349297068, \"phi\": 0.2995130466880727}, {\"truth_threshold\": 12.700000189244747, \"match_probability\": 0.9998497364189812, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 429.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1602.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.21122599704579026, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7887740029542097, \"precision\": 1.0, \"recall\": 0.21122599704579026, \"specificity\": 1.0, \"npv\": 0.4168183472879505, \"accuracy\": 0.49559193954659947, \"f1\": 0.348780487804878, \"f2\": 0.2507891967730621, \"f0_5\": 0.5724579663730984, \"p4\": 0.43795337341921126, \"phi\": 0.2967201897291045}, {\"truth_threshold\": 12.800000190734863, \"match_probability\": 0.9998597977108138, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 426.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1605.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.20974889217134415, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.7902511078286558, \"precision\": 1.0, \"recall\": 0.20974889217134415, \"specificity\": 1.0, \"npv\": 0.4163636363636364, \"accuracy\": 0.494647355163728, \"f1\": 0.34676434676434675, \"f2\": 0.24912280701754386, \"f0_5\": 0.570281124497992, \"p4\": 0.43623582599130023, \"phi\": 0.295519561903616}, {\"truth_threshold\": 12.90000019222498, \"match_probability\": 0.9998691854106266, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 418.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1613.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 
0.36051636934280396, \"tp_rate\": 0.20580994583948795, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.794190054160512, \"precision\": 1.0, \"recall\": 0.20580994583948795, \"specificity\": 1.0, \"npv\": 0.41515591007976793, \"accuracy\": 0.49212846347607053, \"f1\": 0.34136382196815024, \"f2\": 0.24467337859985952, \"f0_5\": 0.5644072373751012, \"p4\": 0.4316118892793445, \"phi\": 0.2923067145456299}, {\"truth_threshold\": 13.000000193715096, \"match_probability\": 0.9998779446032292, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 398.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1633.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.19596258000984737, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8040374199901527, \"precision\": 1.0, \"recall\": 0.19596258000984737, \"specificity\": 1.0, \"npv\": 0.41216702663786897, \"accuracy\": 0.4858312342569269, \"f1\": 0.3277068752573075, \"f2\": 0.23351325979816945, \"f0_5\": 0.5492685619652222, \"p4\": 0.4197617119306842, \"phi\": 0.28419942634520623}, {\"truth_threshold\": 13.100000195205212, \"match_probability\": 0.9998861173572945, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 388.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1643.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.19103889709502708, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8089611029049729, \"precision\": 1.0, \"recall\": 0.19103889709502708, \"specificity\": 1.0, \"npv\": 0.41068866571018653, \"accuracy\": 0.48268261964735515, \"f1\": 0.32079371641174037, \"f2\": 0.22791353383458646, \"f0_5\": 0.541445715880547, \"p4\": 0.4136731134125541, \"phi\": 0.2801026771501877}, {\"truth_threshold\": 13.200000196695328, \"match_probability\": 0.9998937429269453, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 383.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1648.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 
0.36051636934280396, \"tp_rate\": 0.18857705563761692, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.811422944362383, \"precision\": 1.0, \"recall\": 0.18857705563761692, \"specificity\": 1.0, \"npv\": 0.40995345506623704, \"accuracy\": 0.4811083123425693, \"f1\": 0.31731565865782935, \"f2\": 0.22510873398377806, \"f0_5\": 0.5374684254841425, \"p4\": 0.4105860003688893, \"phi\": 0.2780428303424835}, {\"truth_threshold\": 13.300000198185444, \"match_probability\": 0.9999008579398913, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 378.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1653.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.1861152141802068, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8138847858197932, \"precision\": 1.0, \"recall\": 0.1861152141802068, \"specificity\": 1.0, \"npv\": 0.40922087205146535, \"accuracy\": 0.4795340050377834, \"f1\": 0.3138231631382316, \"f2\": 0.22230063514467185, \"f0_5\": 0.5334462320067739, \"p4\": 0.4074695693495442, \"phi\": 0.2759750536712865}, {\"truth_threshold\": 13.40000019967556, \"match_probability\": 0.999907496573012, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 373.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1658.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.18365337272279667, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8163466272772033, \"precision\": 1.0, \"recall\": 0.18365337272279667, \"specificity\": 1.0, \"npv\": 0.4084909026043525, \"accuracy\": 0.47795969773299746, \"f1\": 0.3103161397670549, \"f2\": 0.21948923149346827, \"f0_5\": 0.529378370706784, \"p4\": 0.40432322000651333, \"phi\": 0.27389912739888156}, {\"truth_threshold\": 13.500000201165676, \"match_probability\": 0.9999136907162209, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 369.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1662.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 
0.36051636934280396, \"tp_rate\": 0.18168389955686853, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8183161004431314, \"precision\": 1.0, \"recall\": 0.18168389955686853, \"specificity\": 1.0, \"npv\": 0.4079087994299964, \"accuracy\": 0.47670025188916876, \"f1\": 0.3075, \"f2\": 0.21723772518544684, \"f0_5\": 0.5260906757912746, \"p4\": 0.40178418596158894, \"phi\": 0.27223236645190135}, {\"truth_threshold\": 13.600000202655792, \"match_probability\": 0.9999194701253888, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 360.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1671.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.17725258493353027, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8227474150664698, \"precision\": 1.0, \"recall\": 0.17725258493353027, \"specificity\": 1.0, \"npv\": 0.40660511363636365, \"accuracy\": 0.47386649874055414, \"f1\": 0.30112923462986196, \"f2\": 0.21216407355021216, \"f0_5\": 0.5185825410544511, \"p4\": 0.3959982275188506, \"phi\": 0.268461929217603}, {\"truth_threshold\": 13.700000204145908, \"match_probability\": 0.9999248625650565, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 352.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1679.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.17331363860167406, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.826686361398326, \"precision\": 1.0, \"recall\": 0.17331363860167406, \"specificity\": 1.0, \"npv\": 0.4054532577903683, \"accuracy\": 0.47134760705289674, \"f1\": 0.2954259336970206, \"f2\": 0.20764511562057575, \"f0_5\": 0.5117766792672288, \"p4\": 0.3907676489102373, \"phi\": 0.26508598490027957}, {\"truth_threshold\": 13.800000205636024, \"match_probability\": 0.999929893941616, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 345.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1686.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, 
\"tp_rate\": 0.16986706056129985, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8301329394387001, \"precision\": 1.0, \"recall\": 0.16986706056129985, \"specificity\": 1.0, \"npv\": 0.4044507241257506, \"accuracy\": 0.46914357682619645, \"f1\": 0.2904040404040404, \"f2\": 0.2036840240878498, \"f0_5\": 0.5057167985927881, \"p4\": 0.3861210486188493, \"phi\": 0.26211229587550916}, {\"truth_threshold\": 13.90000020712614, \"match_probability\": 0.9999345884275949, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 342.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1689.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.16838995568685378, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8316100443131462, \"precision\": 1.0, \"recall\": 0.16838995568685378, \"specificity\": 1.0, \"npv\": 0.4040225829216655, \"accuracy\": 0.4681989924433249, \"f1\": 0.28824273072060685, \"f2\": 0.20198440822111977, \"f0_5\": 0.5030891438658429, \"p4\": 0.38410918552991574, \"phi\": 0.26083202417392587}, {\"truth_threshold\": 14.000000208616257, \"match_probability\": 0.9999389685776376, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 338.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1693.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.16642048252092564, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8335795174790743, \"precision\": 1.0, \"recall\": 0.16642048252092564, \"specificity\": 1.0, \"npv\": 0.4034531360112755, \"accuracy\": 0.4669395465994962, \"f1\": 0.28535246939636977, \"f2\": 0.19971637910659418, \"f0_5\": 0.4995566065622229, \"p4\": 0.3814072232423753, \"phi\": 0.25911940407768985}, {\"truth_threshold\": 14.100000210106373, \"match_probability\": 0.9999430554367367, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 333.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1698.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, 
\"tp_rate\": 0.16395864106351551, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8360413589364845, \"precision\": 1.0, \"recall\": 0.16395864106351551, \"specificity\": 1.0, \"npv\": 0.4027435807245867, \"accuracy\": 0.46536523929471035, \"f1\": 0.2817258883248731, \"f2\": 0.1968783256473927, \"f0_5\": 0.4950936663693131, \"p4\": 0.3779979300007138, \"phi\": 0.2569694343548615}, {\"truth_threshold\": 14.200000211596489, \"match_probability\": 0.9999468686412301, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 325.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1706.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.16001969473165928, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8399803052683407, \"precision\": 1.0, \"recall\": 0.16001969473165928, \"specificity\": 1.0, \"npv\": 0.4016134689582603, \"accuracy\": 0.4628463476070529, \"f1\": 0.2758913412563667, \"f2\": 0.1923304533080838, \"f0_5\": 0.48784148904232966, \"p4\": 0.37246767025663613, \"phi\": 0.2535075239570288}, {\"truth_threshold\": 14.300000213086605, \"match_probability\": 0.9999504265130488, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 321.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1710.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.15805022156573117, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8419497784342689, \"precision\": 1.0, \"recall\": 0.15805022156573117, \"specificity\": 1.0, \"npv\": 0.4010507880910683, \"accuracy\": 0.4615869017632242, \"f1\": 0.2729591836734694, \"f2\": 0.19005328596802842, \"f0_5\": 0.4841628959276018, \"p4\": 0.369666887936757, \"phi\": 0.2517660936601759}, {\"truth_threshold\": 14.400000214576721, \"match_probability\": 0.9999537461476637, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 315.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1716.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.155096011816839, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.844903988183161, \"precision\": 1.0, \"recall\": 0.155096011816839, \"specificity\": 1.0, \"npv\": 0.400209716882209, \"accuracy\": 0.4596977329974811, \"f1\": 0.26854219948849106, \"f2\": 0.18663348738002133, \"f0_5\": 0.4785779398359161, \"p4\": 0.36541997841978086, \"phi\": 0.2491403840784887}, {\"truth_threshold\": 14.500000216066837, \"match_probability\": 0.9999568434961527, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 314.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1717.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.15460364352535697, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8453963564746431, \"precision\": 1.0, \"recall\": 0.15460364352535697, \"specificity\": 1.0, \"npv\": 0.4000698812019567, \"accuracy\": 0.4593828715365239, \"f1\": 0.2678038379530917, \"f2\": 0.1860630481156672, \"f0_5\": 0.4776391846668695, \"p4\": 0.36470673862472397, \"phi\": 0.2487011485670688}, {\"truth_threshold\": 14.600000217556953, \"match_probability\": 0.9999597334417798, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 310.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1721.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.15263417035942886, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8473658296405712, \"precision\": 1.0, \"recall\": 0.15263417035942886, \"specificity\": 1.0, \"npv\": 0.39951151430565246, \"accuracy\": 0.4581234256926952, \"f1\": 0.26484408372490387, \"f2\": 0.1837799383447949, \"f0_5\": 0.4738612045246102, \"p4\": 0.3618380393770922, \"phi\": 0.24693948354826198}, {\"truth_threshold\": 14.70000021904707, \"match_probability\": 0.9999624298714548, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 307.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1724.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.15115706548498276, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8488429345150172, \"precision\": 1.0, \"recall\": 0.15115706548498276, \"specificity\": 1.0, \"npv\": 0.399093760892297, \"accuracy\": 0.45717884130982367, \"f1\": 0.262617621899059, \"f2\": 0.18206618431977226, \"f0_5\": 0.4710033752684873, \"p4\": 0.35966979322171594, \"phi\": 0.24561319539032303}, {\"truth_threshold\": 14.800000220537186, \"match_probability\": 0.9999649457424121, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 305.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1726.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.1501723289020187, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8498276710979813, \"precision\": 1.0, \"recall\": 0.1501723289020187, \"specificity\": 1.0, \"npv\": 0.39881574364332983, \"accuracy\": 0.45654911838790935, \"f1\": 0.2611301369863014, \"f2\": 0.18092300391505517, \"f0_5\": 0.46908643494309443, \"p4\": 0.3582162273053647, \"phi\": 0.24472655970635737}, {\"truth_threshold\": 14.900000222027302, \"match_probability\": 0.9999672931444318, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 303.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1728.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.14918759231905465, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8508124076809453, \"precision\": 1.0, \"recall\": 0.14918759231905465, \"specificity\": 1.0, \"npv\": 0.39853811347024015, \"accuracy\": 0.45591939546599497, \"f1\": 0.2596401028277635, \"f2\": 0.17977928088287648, \"f0_5\": 0.4671600370027752, \"p4\": 0.3567561397717773, \"phi\": 0.24383794125607963}, {\"truth_threshold\": 15.000000223517418, \"match_probability\": 0.9999694833578969, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 302.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1729.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.14869522402757263, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8513047759724274, \"precision\": 1.0, \"recall\": 0.14869522402757263, \"specificity\": 1.0, \"npv\": 0.39839944328462074, \"accuracy\": 0.4556045340050378, \"f1\": 0.25889412773253323, \"f2\": 0.17920721576074056, \"f0_5\": 0.46619326952763196, \"p4\": 0.3560236322925244, \"phi\": 0.243392880897669}, {\"truth_threshold\": 15.100000225007534, \"match_probability\": 0.9999715269079685, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 294.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1737.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.1447562776957164, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8552437223042836, \"precision\": 1.0, \"recall\": 0.1447562776957164, \"specificity\": 1.0, \"npv\": 0.397293546148508, \"accuracy\": 0.45308564231738035, \"f1\": 0.25290322580645164, \"f2\": 0.17462580185317178, \"f0_5\": 0.4583723105706268, \"p4\": 0.35010346944394827, \"phi\": 0.23981395892022075}, {\"truth_threshold\": 15.20000022649765, \"match_probability\": 0.9999734336151354, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 290.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1741.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.14278680452978829, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8572131954702117, \"precision\": 1.0, \"recall\": 0.14278680452978829, \"specificity\": 1.0, \"npv\": 0.39674289674289676, \"accuracy\": 0.45182619647355166, \"f1\": 0.24989228780697975, \"f2\": 0.17233182790587118, \"f0_5\": 0.4544030084612974, \"p4\": 0.3471025353224262, \"phi\": 0.23801187038845348}, {\"truth_threshold\": 15.300000227987766, \"match_probability\": 0.9999752126423825, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 289.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1742.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.14229443623830626, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8577055637616937, \"precision\": 1.0, \"recall\": 0.14229443623830626, \"specificity\": 1.0, \"npv\": 0.39660547280914443, \"accuracy\": 0.45151133501259444, \"f1\": 0.24913793103448276, \"f2\": 0.17175799358136218, \"f0_5\": 0.45340445560087855, \"p4\": 0.346347962973042, \"phi\": 0.2375599969742467}, {\"truth_threshold\": 15.400000229477882, \"match_probability\": 0.9999768725392036, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 285.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1746.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.14032496307237813, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8596750369276218, \"precision\": 1.0, \"recall\": 0.14032496307237813, \"specificity\": 1.0, \"npv\": 0.3960567277758561, \"accuracy\": 0.45025188916876574, \"f1\": 0.24611398963730569, \"f2\": 0.16946129147342134, \"f0_5\": 0.44938505203405865, \"p4\": 0.34331208180785255, \"phi\": 0.2357469951021941}, {\"truth_threshold\": 15.500000230967999, \"match_probability\": 0.9999784212826682, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 281.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1750.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.13835548990645002, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8616445100935499, \"precision\": 1.0, \"recall\": 0.13835548990645002, \"specificity\": 1.0, \"npv\": 0.3955094991364421, \"accuracy\": 0.44899244332493704, \"f1\": 0.2430795847750865, \"f2\": 0.16716240333135038, \"f0_5\": 0.44532488114104596, \"p4\": 0.34024766819652713, \"phi\": 0.23392501045351508}, {\"truth_threshold\": 15.600000232458115, \"match_probability\": 0.9999798663157408, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 276.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1755.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.1358936484490399, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8641063515509602, \"precision\": 1.0, \"recall\": 0.1358936484490399, \"specificity\": 1.0, \"npv\": 0.39482758620689656, \"accuracy\": 0.4474181360201511, \"f1\": 0.23927178153446033, \"f2\": 0.16428571428571428, \"f0_5\": 0.44019138755980863, \"p4\": 0.33637622790175986, \"phi\": 0.23163454232472105}, {\"truth_threshold\": 15.70000023394823, \"match_probability\": 0.9999812145830361, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 274.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1757.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.13490891186607581, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8650910881339242, \"precision\": 1.0, \"recall\": 0.13490891186607581, \"specificity\": 1.0, \"npv\": 0.3945554789800138, \"accuracy\": 0.4467884130982368, \"f1\": 0.23774403470715835, \"f2\": 0.16313407954274828, \"f0_5\": 0.4381196034537896, \"p4\": 0.33481470493206833, \"phi\": 0.23071421789736327}, {\"truth_threshold\": 15.800000235438347, \"match_probability\": 0.9999824725641815, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 270.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1761.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.1329394387001477, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8670605612998523, \"precision\": 1.0, \"recall\": 0.1329394387001477, \"specificity\": 1.0, \"npv\": 0.39401238816242257, \"accuracy\": 0.44552896725440805, \"f1\": 0.23468057366362452, \"f2\": 0.16082916368834882, \"f0_5\": 0.4339440694310511, \"p4\": 0.33166908726630484, \"phi\": 0.22886630534706762}, {\"truth_threshold\": 15.900000236928463, \"match_probability\": 0.9999836463049459, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 265.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1766.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.13047759724273758, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8695224027572624, \"precision\": 1.0, \"recall\": 0.13047759724273758, \"specificity\": 1.0, \"npv\": 0.39333562349708007, \"accuracy\": 0.4439546599496222, \"f1\": 0.2308362369337979, \"f2\": 0.1579449278817499, \"f0_5\": 0.4286638628275639, \"p4\": 0.32769401905091045, \"phi\": 0.22654246194449523}, {\"truth_threshold\": 16.00000023841858, \"match_probability\": 0.9999847414462861, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 263.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1768.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.1294928606597735, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8705071393402265, \"precision\": 1.0, \"recall\": 0.1294928606597735, \"specificity\": 1.0, \"npv\": 0.3930655681428081, \"accuracy\": 0.4433249370277078, \"f1\": 0.22929380993897122, \"f2\": 0.15679027065696913, \"f0_5\": 0.4265325981187155, \"p4\": 0.3260903699195752, \"phi\": 0.22560847689231753}, {\"truth_threshold\": 16.100000239908695, \"match_probability\": 0.9999857632514492, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 256.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1775.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.12604628261939932, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8739537173806007, \"precision\": 1.0, \"recall\": 0.12604628261939932, \"specificity\": 1.0, \"npv\": 0.3921232876712329, \"accuracy\": 0.4411209068010076, \"f1\": 0.22387407083515523, \"f2\": 0.15274463007159905, \"f0_5\": 0.41898527004909986, \"p4\": 0.3204149478514069, \"phi\": 0.22231887625538288}, {\"truth_threshold\": 16.20000024139881, \"match_probability\": 0.9999867166312594, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 249.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1782.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.12259970457902511, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8774002954209749, \"precision\": 1.0, \"recall\": 0.12259970457902511, \"specificity\": 1.0, \"npv\": 0.3911855141783396, \"accuracy\": 0.4389168765743073, \"f1\": 0.21842105263157896, \"f2\": 0.14869222500895737, \"f0_5\": 0.41129831516352827, \"p4\": 0.3146395889340626, \"phi\": 0.21899595538241903}, {\"truth_threshold\": 16.400000244379044, \"match_probability\": 0.9999884361359999, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 243.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1788.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.11964549483013294, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.880354505169867, \"precision\": 1.0, \"recall\": 0.11964549483013294, \"specificity\": 1.0, \"npv\": 0.3903852710535288, \"accuracy\": 0.43702770780856426, \"f1\": 0.21372031662269128, \"f2\": 0.14521333811401937, \"f0_5\": 0.4045954045954046, \"p4\": 0.30960718661223074, \"phi\": 0.21611996420875845}, {\"truth_threshold\": 16.50000024586916, \"match_probability\": 0.9999892105250341, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 241.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1790.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.11866075824716889, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8813392417528311, \"precision\": 1.0, \"recall\": 0.11866075824716889, \"specificity\": 1.0, \"npv\": 0.3901192504258944, \"accuracy\": 0.4363979848866499, \"f1\": 0.21214788732394366, \"f2\": 0.14405260011954574, \"f0_5\": 0.4023372287145242, \"p4\": 0.3079124729407039, \"phi\": 0.2151553997982709}, {\"truth_threshold\": 16.600000247359276, \"match_probability\": 0.9999899330566321, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 239.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1792.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.11767602166420482, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8823239783357951, \"precision\": 1.0, \"recall\": 0.11767602166420482, \"specificity\": 1.0, \"npv\": 0.38985359210078313, \"accuracy\": 0.4357682619647355, \"f1\": 0.2105726872246696, \"f2\": 0.14289130694726773, \"f0_5\": 0.4000669568128557, \"p4\": 0.3062089971097263, \"phi\": 0.21418781419567226}, {\"truth_threshold\": 16.700000248849392, \"match_probability\": 0.9999906072033913, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 234.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1797.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.11521418020679468, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8847858197932054, \"precision\": 1.0, \"recall\": 0.11521418020679468, \"specificity\": 1.0, \"npv\": 0.3891910265125765, \"accuracy\": 0.43419395465994964, \"f1\": 0.20662251655629138, \"f2\": 0.1399856424982053, \"f0_5\": 0.3943377148634985, \"p4\": 0.30191141261310905, \"phi\": 0.21175534246740363}, {\"truth_threshold\": 16.800000250339508, \"match_probability\": 0.9999912362053778, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 227.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1804.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.11176760216642048, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8882323978335795, \"precision\": 1.0, \"recall\": 0.11176760216642048, \"specificity\": 1.0, \"npv\": 0.3882672092234656, \"accuracy\": 0.43198992443324935, \"f1\": 0.20106288751107174, \"f2\": 0.13591186684229434, \"f0_5\": 0.3861857774753317, \"p4\": 0.29579930054729386, \"phi\": 0.20831633391252508}, {\"truth_threshold\": 16.900000251829624, \"match_probability\": 0.999991823085696, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 223.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1808.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.10979812900049236, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8902018709995076, \"precision\": 1.0, \"recall\": 0.10979812900049236, \"specificity\": 1.0, \"npv\": 0.3877412800541822, \"accuracy\": 0.43073047858942065, \"f1\": 0.19787045252883761, \"f2\": 0.13358092727926202, \"f0_5\": 0.3814574067738625, \"p4\": 0.2922553226557823, \"phi\": 0.2063329035471685}, {\"truth_threshold\": 17.00000025331974, \"match_probability\": 0.9999923706650156, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 217.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1814.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.1068439192516002, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8931560807483998, \"precision\": 1.0, \"recall\": 0.1068439192516002, \"specificity\": 1.0, \"npv\": 0.38695505238256167, \"accuracy\": 0.4288413098236776, \"f1\": 0.19306049822064056, \"f2\": 0.13008032609998801, \"f0_5\": 0.3742669886167644, \"p4\": 0.2868673597096055, \"phi\": 0.20333173478520547}, {\"truth_threshold\": 17.100000254809856, \"match_probability\": 0.9999928815751264, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 211.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1820.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.10388970950270802, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.896110290497292, \"precision\": 1.0, \"recall\": 0.10388970950270802, \"specificity\": 1.0, \"npv\": 0.38617200674536256, \"accuracy\": 0.4269521410579345, \"f1\": 0.18822479928635147, \"f2\": 0.1265746850629874, \"f0_5\": 0.36695652173913046, \"p4\": 0.2813906764114958, \"phi\": 0.2002980219544205}, {\"truth_threshold\": 17.200000256299973, \"match_probability\": 0.999993358271586, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 205.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1826.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.10093549975381585, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.8990645002461841, \"precision\": 1.0, \"recall\": 0.10093549975381585, \"specificity\": 1.0, \"npv\": 0.38539212386401883, \"accuracy\": 0.4250629722921914, \"f1\": 0.18336314847942756, \"f2\": 0.12306399327650379, \"f0_5\": 0.3595229743949491, \"p4\": 0.27582256169212693, \"phi\": 0.1972301868969333}, {\"truth_threshold\": 17.30000025779009, \"match_probability\": 0.999993803045519, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 201.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1830.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.09896602658788774, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9010339734121122, \"precision\": 1.0, \"recall\": 0.09896602658788774, \"specificity\": 1.0, \"npv\": 0.38487394957983195, \"accuracy\": 0.4238035264483627, \"f1\": 0.18010752688172044, \"f2\": 0.12072072072072072, \"f0_5\": 0.3544973544973545, \"p4\": 0.2720583020072346, \"phi\": 0.19516517498545435}, {\"truth_threshold\": 17.400000259280205, \"match_probability\": 0.9999942180346287, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 192.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1839.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.09453471196454949, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9054652880354506, \"precision\": 1.0, \"recall\": 0.09453471196454949, \"specificity\": 1.0, \"npv\": 0.3837131367292225, \"accuracy\": 0.4209697732997481, \"f1\": 0.17273954116059378, \"f2\": 0.11544011544011544, \"f0_5\": 0.3429796355841372, \"p4\": 0.26343105650125237, \"phi\": 0.19045789786120934}, {\"truth_threshold\": 17.50000026077032, \"match_probability\": 0.9999946052334694, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 184.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1847.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.09059576563269325, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9094042343673068, \"precision\": 1.0, \"recall\": 0.09059576563269325, \"specificity\": 1.0, \"npv\": 0.3826871657754011, \"accuracy\": 0.41845088161209065, \"f1\": 0.16613995485327313, \"f2\": 0.11073663938372653, \"f0_5\": 0.3324900614383809, \"p4\": 0.25557237360658436, \"phi\": 0.18619838017885085}, {\"truth_threshold\": 17.600000262260437, \"match_probability\": 0.999994966503032, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 177.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1854.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.08714918759231906, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.912850812407681, \"precision\": 1.0, \"recall\": 0.08714918759231906, \"specificity\": 1.0, \"npv\": 0.3817939313104368, \"accuracy\": 0.4162468513853904, \"f1\": 0.16032608695652173, \"f2\": 0.10661366100469823, \"f0_5\": 0.3231106243154436, \"p4\": 0.24854306779885504, \"phi\": 0.18240896617595923}, {\"truth_threshold\": 17.700000263750553, \"match_probability\": 0.9999953035796879, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 171.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1860.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.08419497784342689, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9158050221565731, \"precision\": 1.0, \"recall\": 0.08419497784342689, \"specificity\": 1.0, \"npv\": 0.3810316139767055, \"accuracy\": 0.41435768261964734, \"f1\": 0.1553133514986376, \"f2\": 0.10307414104882459, \"f0_5\": 0.3149171270718232, \"p4\": 0.24240004456907277, \"phi\": 0.1791115526603851}, {\"truth_threshold\": 17.80000026524067, \"match_probability\": 0.9999956180835331, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 165.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1866.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.08124076809453472, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9187592319054653, \"precision\": 1.0, \"recall\": 0.08124076809453472, \"specificity\": 1.0, \"npv\": 0.38027233477250083, \"accuracy\": 0.41246851385390426, \"f1\": 0.15027322404371585, \"f2\": 0.09952949692363373, \"f0_5\": 0.30657748049052397, \"p4\": 0.23614444277786112, \"phi\": 0.1757658003196868}, {\"truth_threshold\": 17.900000266730785, \"match_probability\": 0.9999959115261747, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 161.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1870.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.0792712949286066, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9207287050713934, \"precision\": 1.0, \"recall\": 0.0792712949286066, \"specificity\": 1.0, \"npv\": 0.37976782752902155, \"accuracy\": 0.41120906801007556, \"f1\": 0.1468978102189781, \"f2\": 0.0971635485817743, \"f0_5\": 0.30093457943925234, \"p4\": 0.23190967417285194, \"phi\": 0.17350702423950815}, {\"truth_threshold\": 18.100000269711018, \"match_probability\": 0.999996440774932, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 155.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1876.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.07631708517971443, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9236829148202855, \"precision\": 1.0, \"recall\": 0.07631708517971443, \"specificity\": 1.0, \"npv\": 0.3790135716650116, \"accuracy\": 0.4093198992443325, \"f1\": 0.14181152790484905, \"f2\": 0.09361033941297259, \"f0_5\": 0.29234251225952473, \"p4\": 0.22545812557563438, \"phi\": 0.1700741339329014}, {\"truth_threshold\": 18.200000271201134, \"match_probability\": 0.9999966791247992, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 151.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1880.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.07434761201378631, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9256523879862137, \"precision\": 1.0, \"recall\": 0.07434761201378631, \"specificity\": 1.0, \"npv\": 0.3785123966942149, \"accuracy\": 0.4080604534005038, \"f1\": 0.1384051329055912, \"f2\": 0.09123867069486405, \"f0_5\": 0.286527514231499, \"p4\": 0.22108910954393457, \"phi\": 0.16775426317035838}, {\"truth_threshold\": 18.30000027269125, \"match_probability\": 0.999996901513191, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 146.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1885.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.07188577055637617, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9281142294436239, \"precision\": 1.0, \"recall\": 0.07188577055637617, \"specificity\": 1.0, \"npv\": 0.3778877887788779, \"accuracy\": 0.40648614609571787, \"f1\": 0.13412953605879652, \"f2\": 0.08827085852478839, \"f0_5\": 0.27915869980879543, \"p4\": 0.21554921241757904, \"phi\": 0.16481733792357756}, {\"truth_threshold\": 18.400000274181366, \"match_probability\": 0.9999971090089864, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 142.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1889.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.06991629739044805, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.930083702609552, \"precision\": 1.0, \"recall\": 0.06991629739044805, \"specificity\": 1.0, \"npv\": 0.3773895847066579, \"accuracy\": 0.40522670025188917, \"f1\": 0.13069489185457892, \"f2\": 0.08589402371158965, \"f0_5\": 0.27318199307425933, \"p4\": 0.21105285310447533, \"phi\": 0.16243670286117107}, {\"truth_threshold\": 18.500000275671482, \"match_probability\": 0.9999973026094866, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 126.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1905.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.0620384047267356, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9379615952732644, \"precision\": 1.0, \"recall\": 0.0620384047267356, \"specificity\": 1.0, \"npv\": 0.37540983606557377, \"accuracy\": 0.4001889168765743, \"f1\": 0.1168289290681502, \"f2\": 0.07636363636363637, \"f0_5\": 0.2485207100591716, \"p4\": 0.19246681908459196, \"phi\": 0.1526100499581647}, {\"truth_threshold\": 18.600000277161598, \"match_probability\": 0.999997483245208, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 119.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1912.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.0585918266863614, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9414081733136386, \"precision\": 1.0, \"recall\": 0.0585918266863614, \"specificity\": 1.0, \"npv\": 0.37455021262675825, \"accuracy\": 0.3979848866498741, \"f1\": 0.11069767441860465, \"f2\": 0.07218245784301831, \"f0_5\": 0.23733546071001196, \"p4\": 0.18401722202939574, \"phi\": 0.1481404102315328}, {\"truth_threshold\": 18.700000278651714, \"match_probability\": 0.9999976517843541, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 114.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1917.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.056129985228951254, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9438700147710487, \"precision\": 1.0, \"recall\": 0.056129985228951254, \"specificity\": 1.0, \"npv\": 0.37393860222077074, \"accuracy\": 0.39641057934508817, \"f1\": 0.1062937062937063, \"f2\": 0.06919155134741442, \"f0_5\": 0.22919179734620024, \"p4\": 0.177856625322802, \"phi\": 0.14487638944695766}, {\"truth_threshold\": 18.80000028014183, \"match_probability\": 0.9999978090369889, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 109.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1922.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.05366814377154111, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9463318562284588, \"precision\": 1.0, \"recall\": 0.05366814377154111, \"specificity\": 1.0, \"npv\": 0.3733289859797848, \"accuracy\": 0.39483627204030225, \"f1\": 0.10186915887850467, \"f2\": 0.06619701202477833, \"f0_5\": 0.22091609241994326, \"p4\": 0.17158817188926406, \"phi\": 0.14154813207402897}, {\"truth_threshold\": 18.900000281631947, \"match_probability\": 0.9999979557589296, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 103.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1928.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.05071393402264894, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9492860659773511, \"precision\": 1.0, \"recall\": 0.05071393402264894, \"specificity\": 1.0, \"npv\": 0.3726000650829808, \"accuracy\": 0.3929471032745592, \"f1\": 0.09653233364573571, \"f2\": 0.06259876017989546, \"f0_5\": 0.21080638559148587, \"p4\": 0.1639190460769093, \"phi\": 0.1374627772069697}, {\"truth_threshold\": 19.000000283122063, \"match_probability\": 0.9999980926553794, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 99.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1932.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.04874446085672083, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9512555391432792, \"precision\": 1.0, \"recall\": 0.04874446085672083, \"specificity\": 1.0, \"npv\": 0.3721156971075723, \"accuracy\": 0.3916876574307305, \"f1\": 0.09295774647887324, \"f2\": 0.060197008391098136, \"f0_5\": 0.20395550061804696, \"p4\": 0.15871449033683513, \"phi\": 0.13467954199443746}, {\"truth_threshold\": 19.10000028461218, \"match_probability\": 0.9999982203843173, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 97.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1934.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.04775972427375677, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9522402757262433, \"precision\": 1.0, \"recall\": 0.04775972427375677, \"specificity\": 1.0, \"npv\": 0.37187398506008446, \"accuracy\": 0.3910579345088161, \"f1\": 0.09116541353383459, \"f2\": 0.05899525605157523, \"f0_5\": 0.20049607275733775, \"p4\": 0.15608399431118705, \"phi\": 0.13326889731311195}, {\"truth_threshold\": 19.200000286102295, \"match_probability\": 0.9999983395596597, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 88.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1943.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.043328409650418516, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9566715903495815, \"precision\": 1.0, \"recall\": 0.043328409650418516, \"specificity\": 1.0, \"npv\": 0.3707901554404145, \"accuracy\": 0.38822418136020154, \"f1\": 0.0830580462482303, \"f2\": 0.0535801266439357, \"f0_5\": 0.18464120856063784, \"p4\": 0.1440066829619842, \"phi\": 0.12675073076422336}, {\"truth_threshold\": 19.30000028759241, \"match_probability\": 0.9999984507542113, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 87.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1944.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.04283604135893648, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9571639586410635, \"precision\": 1.0, \"recall\": 0.04283604135893648, \"specificity\": 1.0, \"npv\": 0.370670119779864, \"accuracy\": 0.3879093198992443, \"f1\": 0.0821529745042493, \"f2\": 0.052977712824260136, \"f0_5\": 0.1828499369482976, \"p4\": 0.1426399013699101, \"phi\": 0.12600809728510384}, {\"truth_threshold\": 19.400000289082527, \"match_probability\": 0.9999985545024187, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 83.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1948.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.04086656819300837, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9591334318069916, \"precision\": 1.0, \"recall\": 0.04086656819300837, \"specificity\": 1.0, \"npv\": 0.3701907533139347, \"accuracy\": 0.3866498740554156, \"f1\": 0.07852412488174078, \"f2\": 0.05056658949677105, \"f0_5\": 0.17562420651713923, \"p4\": 0.1371215936029642, \"phi\": 0.1229976652816022}, {\"truth_threshold\": 19.60000029206276, \"match_probability\": 0.9999987416210334, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 74.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1957.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.03643525356967011, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9635647464303299, \"precision\": 1.0, \"recall\": 0.03643525356967011, \"specificity\": 1.0, \"npv\": 0.36911669890393295, \"accuracy\": 0.383816120906801, \"f1\": 0.07030878859857483, \"f2\": 0.0451329592583557, \"f0_5\": 0.15900300816501933, \"p4\": 0.12439700011341519, \"phi\": 0.11596922230214521}, {\"truth_threshold\": 19.700000293552876, \"match_probability\": 0.9999988258908107, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 69.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1962.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.033973412112259974, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.96602658788774, \"precision\": 1.0, \"recall\": 0.033973412112259974, \"specificity\": 1.0, \"npv\": 0.3685226906984229, \"accuracy\": 0.3822418136020151, \"f1\": 0.06571428571428571, \"f2\": 0.04210911753936287, \"f0_5\": 0.14954486345903772, \"p4\": 0.11713607088211223, \"phi\": 0.11189268628385163}, {\"truth_threshold\": 19.80000029504299, \"match_probability\": 0.9999989045173057, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 67.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1964.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.032988675529295915, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.967011324470704, \"precision\": 1.0, \"recall\": 0.032988675529295915, \"specificity\": 1.0, \"npv\": 0.3682856223866195, \"accuracy\": 0.38161209068010077, \"f1\": 0.06387035271687322, \"f2\": 0.04089854718593578, \"f0_5\": 0.14571552849064812, \"p4\": 0.11419202241119845, \"phi\": 0.11022365852672915}, {\"truth_threshold\": 20.000000298023224, \"match_probability\": 0.99999904632679, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 66.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1965.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.03249630723781388, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9675036927621861, \"precision\": 1.0, \"recall\": 0.03249630723781388, \"specificity\": 1.0, \"npv\": 0.36816720257234725, \"accuracy\": 0.38129722921914355, \"f1\": 0.06294706723891273, \"f2\": 0.040293040293040296, \"f0_5\": 0.1437908496732026, \"p4\": 0.11271134775969976, \"phi\": 0.10938041200177231}, {\"truth_threshold\": 20.10000029951334, \"match_probability\": 0.9999991101913761, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 64.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1967.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.03151157065484983, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9684884293451502, \"precision\": 1.0, \"recall\": 0.03151157065484983, \"specificity\": 1.0, \"npv\": 0.3679305912596401, \"accuracy\": 0.38066750629722923, \"f1\": 0.06109785202863962, \"f2\": 0.039081582804103565, \"f0_5\": 0.139921294271972, \"p4\": 0.10973250829301784, \"phi\": 0.10767576710921925}, {\"truth_threshold\": 20.200000301003456, \"match_probability\": 0.9999991697791492, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 62.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1969.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.03052683407188577, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9694731659281143, \"precision\": 1.0, \"recall\": 0.03052683407188577, \"specificity\": 1.0, \"npv\": 0.36769428387925496, \"accuracy\": 0.38003778337531485, \"f1\": 0.05924510272336359, \"f2\": 0.03786953334962131, \"f0_5\": 0.13602457218078104, \"p4\": 0.10673009231574156, \"phi\": 0.1059459408998895}, {\"truth_threshold\": 20.300000302493572, \"match_probability\": 0.9999992253765136, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 58.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1973.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.028557360905957656, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9714426390940424, \"precision\": 1.0, \"recall\": 0.028557360905957656, \"specificity\": 1.0, \"npv\": 0.36722257857601026, \"accuracy\": 0.37877833753148615, \"f1\": 0.05552896122546673, \"f2\": 0.0354436568076265, \"f0_5\": 0.12814847547503314, \"p4\": 0.10065327628990903, \"phi\": 0.10240560389554626}, {\"truth_threshold\": 20.40000030398369, \"match_probability\": 0.9999992772506945, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 57.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1974.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.028064992614475627, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9719350073855244, \"precision\": 1.0, \"recall\": 0.028064992614475627, \"specificity\": 1.0, \"npv\": 0.367104841295287, \"accuracy\": 0.378463476070529, \"f1\": 0.05459770114942529, \"f2\": 0.034836817015034834, \"f0_5\": 0.12616201859229748, \"p4\": 0.09911884237575404, \"phi\": 0.10150268301720144}, {\"truth_threshold\": 20.60000030696392, \"match_probability\": 0.9999993708101274, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 55.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1976.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.02708025603151157, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9729197439684885, \"precision\": 1.0, \"recall\": 0.02708025603151157, \"specificity\": 1.0, \"npv\": 0.3668695930791413, \"accuracy\": 0.3778337531486146, \"f1\": 0.052732502396931925, \"f2\": 0.03362269226066757, \"f0_5\": 0.12216792536650378, \"p4\": 0.0960314132133735, \"phi\": 0.09967408143925688}, {\"truth_threshold\": 20.800000309944153, \"match_probability\": 0.9999994522583585, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 53.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1978.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.026095519448547513, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9739044805514525, \"precision\": 1.0, \"recall\": 0.026095519448547513, \"specificity\": 1.0, \"npv\": 0.36663464617355107, \"accuracy\": 0.37720403022670024, \"f1\": 0.0508637236084453, \"f2\": 0.03240797358444417, \"f0_5\": 0.1181453410610789, \"p4\": 0.0929189583154697, \"phi\": 0.09781370834261033}, {\"truth_threshold\": 20.90000031143427, \"match_probability\": 0.9999994889389594, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 52.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1979.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.025603151157065487, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9743968488429345, \"precision\": 1.0, \"recall\": 0.025603151157065487, \"specificity\": 1.0, \"npv\": 0.36651728553137003, \"accuracy\": 0.3768891687657431, \"f1\": 0.04992798847815651, \"f2\": 0.031800391389432484, \"f0_5\": 0.11612326931665923, \"p4\": 0.09135324084214064, \"phi\": 0.0968710352124772}, {\"truth_threshold\": 21.000000312924385, \"match_probability\": 0.9999995231631726, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 51.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1980.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.025110782865583457, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9748892171344166, \"precision\": 1.0, \"recall\": 0.025110782865583457, \"specificity\": 1.0, \"npv\": 0.3664, \"accuracy\": 0.3765743073047859, \"f1\": 0.04899135446685879, \"f2\": 0.031192660550458717, \"f0_5\": 0.11409395973154363, \"p4\": 0.08978113973386223, \"phi\": 0.09591971039337942}, {\"truth_threshold\": 21.1000003144145, \"match_probability\": 0.9999995550954947, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 50.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1981.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.024618414574101428, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9753815854258986, \"precision\": 1.0, \"recall\": 0.024618414574101428, \"specificity\": 1.0, \"npv\": 0.36628278950735765, \"accuracy\": 0.3762594458438287, \"f1\": 0.048053820278712155, \"f2\": 0.030584781012967948, \"f0_5\": 0.11205737337516809, \"p4\": 0.08820261179873627, \"phi\": 0.09495947326860263}, {\"truth_threshold\": 21.200000315904617, \"match_probability\": 0.9999995848894065, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 47.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1984.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.023141309699655343, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9768586903003447, \"precision\": 1.0, \"recall\": 0.023141309699655343, \"specificity\": 1.0, \"npv\": 0.36593160754234577, \"accuracy\": 0.37531486146095716, \"f1\": 0.04523580365736285, \"f2\": 0.028760249663443888, \"f0_5\": 0.10590356016223525, \"p4\": 0.08342802927538613, \"phi\": 0.09202247909630645}, {\"truth_threshold\": 21.300000317394733, \"match_probability\": 0.9999996126881108, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 42.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1989.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.0206794682422452, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9793205317577548, \"precision\": 1.0, \"recall\": 0.0206794682422452, \"specificity\": 1.0, \"npv\": 0.3653477983407786, \"accuracy\": 0.3737405541561713, \"f1\": 0.04052098408104197, \"f2\": 0.025716385011021307, \"f0_5\": 0.09549795361527967, \"p4\": 0.0753377096255321, \"phi\": 0.08692064307839845}, {\"truth_threshold\": 21.40000031888485, \"match_probability\": 0.9999996386252203, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 40.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1991.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.019694731659281144, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9803052683407188, \"precision\": 1.0, \"recall\": 0.019694731659281144, \"specificity\": 1.0, \"npv\": 0.3651147959183674, \"accuracy\": 0.3731108312342569, \"f1\": 0.0386286817962337, \"f2\": 0.02449779519843214, \"f0_5\": 0.09128251939753537, \"p4\": 0.07205407201867428, \"phi\": 0.08479880854378465}, {\"truth_threshold\": 21.800000324845314, \"match_probability\": 0.999999726129107, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 39.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1992.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.019202363367799114, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9807976366322009, \"precision\": 1.0, \"recall\": 0.019202363367799114, \"specificity\": 1.0, \"npv\": 0.3649984061204973, \"accuracy\": 0.37279596977329976, \"f1\": 0.03768115942028986, \"f2\": 0.02388827636898199, \"f0_5\": 0.08916323731138547, \"p4\": 0.07040188432050476, \"phi\": 0.0837187674478865}, {\"truth_threshold\": 21.90000032633543, \"match_probability\": 0.9999997444694171, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 38.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1993.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.018709995076317085, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9812900049236829, \"precision\": 1.0, \"recall\": 0.018709995076317085, \"specificity\": 1.0, \"npv\": 0.3648820905035054, \"accuracy\": 0.37248110831234255, \"f1\": 0.03673272112131464, \"f2\": 0.02327860818426856, \"f0_5\": 0.08703618873110398, \"p4\": 0.06874271998243119, \"phi\": 0.08262531159854632}, {\"truth_threshold\": 22.20000033080578, \"match_probability\": 0.9999997924446623, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 36.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1995.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.01772525849335303, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.982274741506647, \"precision\": 1.0, \"recall\": 0.01772525849335303, \"specificity\": 1.0, \"npv\": 0.3646496815286624, \"accuracy\": 0.3718513853904282, \"f1\": 0.03483309143686502, \"f2\": 0.022058823529411766, \"f0_5\": 0.08275862068965517, \"p4\": 0.06540326659328236, \"phi\": 0.08039595676782757}, {\"truth_threshold\": 22.40000033378601, \"match_probability\": 0.9999998193125794, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 35.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1996.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.017232890201871, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.982767109798129, \"precision\": 1.0, \"recall\": 0.017232890201871, \"specificity\": 1.0, \"npv\": 0.36453358802929003, \"accuracy\": 0.371536523929471, \"f1\": 0.03388189738625363, \"f2\": 0.02144870694938105, \"f0_5\": 0.08060801473975127, \"p4\": 0.06372287901796007, \"phi\": 0.07925886257954269}, {\"truth_threshold\": 22.600000336766243, \"match_probability\": 0.9999998427024609, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 34.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1997.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.01674052191038897, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.983259478089611, \"precision\": 1.0, \"recall\": 0.01674052191038897, \"specificity\": 1.0, \"npv\": 0.364417568427753, \"accuracy\": 0.37122166246851385, \"f1\": 0.03292978208232446, \"f2\": 0.02083844079431233, \"f0_5\": 0.07844946931241348, \"p4\": 0.06203531774376609, \"phi\": 0.07810595552706255}, {\"truth_threshold\": 23.000000342726707, \"match_probability\": 0.999999880790753, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 33.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 1998.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.01624815361890694, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.983751846381093, \"precision\": 1.0, \"recall\": 0.01624815361890694, \"specificity\": 1.0, \"npv\": 0.36430162265351573, \"accuracy\": 0.3709068010075567, \"f1\": 0.03197674418604651, \"f2\": 0.02022802500919456, \"f0_5\": 0.07628294036061026, \"p4\": 0.06034053235728111, \"phi\": 0.07693652402137358}, {\"truth_threshold\": 23.100000344216824, \"match_probability\": 0.9999998887738388, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 31.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2000.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.015263417035942885, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9847365829640571, \"precision\": 1.0, \"recall\": 0.015263417035942885, \"specificity\": 1.0, \"npv\": 0.3640699523052464, \"accuracy\": 0.3702770780856423, \"f1\": 0.030067895247332686, \"f2\": 0.01900674432863274, \"f0_5\": 0.07192575406032482, \"p4\": 0.0569290852372513, \"phi\": 0.0745449630242769}, {\"truth_threshold\": 23.20000034570694, \"match_probability\": 0.9999998962223214, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 30.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2001.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 
0.014771048744460856, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9852289512555391, \"precision\": 1.0, \"recall\": 0.014771048744460856, \"specificity\": 1.0, \"npv\": 0.3639542275905912, \"accuracy\": 0.36996221662468515, \"f1\": 0.02911208151382824, \"f2\": 0.01839587932303164, \"f0_5\": 0.0697350069735007, \"p4\": 0.05521232030378831, \"phi\": 0.07332111317003598}, {\"truth_threshold\": 23.400000348687172, \"match_probability\": 0.9999999096562825, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 27.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2004.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.013293943870014771, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9867060561299852, \"precision\": 1.0, \"recall\": 0.013293943870014771, \"specificity\": 1.0, \"npv\": 0.3636074944426802, \"accuracy\": 0.3690176322418136, \"f1\": 0.026239067055393587, \"f2\": 0.016562384983437616, \"f0_5\": 0.06311360448807854, \"p4\": 0.050017230584043997, \"phi\": 0.06952537394245138}, {\"truth_threshold\": 23.500000350177288, \"match_probability\": 0.9999999157063305, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 24.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2007.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.011816838995568686, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9881831610044313, \"precision\": 1.0, \"recall\": 0.011816838995568686, \"specificity\": 1.0, \"npv\": 0.36326142131979694, \"accuracy\": 0.36807304785894207, \"f1\": 0.02335766423357664, \"f2\": 0.014727540500736377, \"f0_5\": 0.056417489421720736, \"p4\": 0.04475382343492924, \"phi\": 0.06551794967058633}, {\"truth_threshold\": 23.70000035315752, \"match_probability\": 0.9999999266180979, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 22.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2009.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, 
\"tp_rate\": 0.010832102412604629, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9891678975873953, \"precision\": 1.0, \"recall\": 0.010832102412604629, \"specificity\": 1.0, \"npv\": 0.3630310716550412, \"accuracy\": 0.3674433249370277, \"f1\": 0.02143205065757428, \"f2\": 0.013503560029462312, \"f0_5\": 0.05191127890514394, \"p4\": 0.04120620154151108, \"phi\": 0.06270876930003502}, {\"truth_threshold\": 23.800000354647636, \"match_probability\": 0.9999999315322641, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 21.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2010.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.0103397341211226, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9896602658788775, \"precision\": 1.0, \"recall\": 0.0103397341211226, \"specificity\": 1.0, \"npv\": 0.3629160063391442, \"accuracy\": 0.36712846347607053, \"f1\": 0.02046783625730994, \"f2\": 0.01289134438305709, \"f0_5\": 0.04964539007092199, \"p4\": 0.03942061774542593, \"phi\": 0.06125728539403615}, {\"truth_threshold\": 23.900000356137753, \"match_probability\": 0.9999999361173434, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 19.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2012.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.009354997538158542, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9906450024618415, \"precision\": 1.0, \"recall\": 0.009354997538158542, \"specificity\": 1.0, \"npv\": 0.3626860943934115, \"accuracy\": 0.36649874055415615, \"f1\": 0.018536585365853658, \"f2\": 0.011666461991894878, \"f0_5\": 0.045087802562885616, \"p4\": 0.035825619558433386, \"phi\": 0.05824884136336706}, {\"truth_threshold\": 24.00000035762787, \"match_probability\": 0.9999999403953735, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 18.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2013.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, 
\"tp_rate\": 0.008862629246676515, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9911373707533235, \"precision\": 1.0, \"recall\": 0.008862629246676515, \"specificity\": 1.0, \"npv\": 0.36257124762507914, \"accuracy\": 0.366183879093199, \"f1\": 0.017569546120058566, \"f2\": 0.01105379513633014, \"f0_5\": 0.042796005706134094, \"p4\": 0.034016089560848325, \"phi\": 0.05668628179027108}, {\"truth_threshold\": 24.100000359117985, \"match_probability\": 0.9999999443869169, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 16.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2015.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.007877892663712457, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9921221073362876, \"precision\": 1.0, \"recall\": 0.007877892663712457, \"specificity\": 1.0, \"npv\": 0.3623417721518987, \"accuracy\": 0.3655541561712846, \"f1\": 0.015632633121641426, \"f2\": 0.009828009828009828, \"f0_5\": 0.03818615751789976, \"p4\": 0.030372674540296223, \"phi\": 0.053427423563110484}, {\"truth_threshold\": 24.300000362098217, \"match_probability\": 0.999999951585999, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 15.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2016.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.007385524372230428, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9926144756277696, \"precision\": 1.0, \"recall\": 0.007385524372230428, \"specificity\": 1.0, \"npv\": 0.3622271433090794, \"accuracy\": 0.36523929471032746, \"f1\": 0.01466275659824047, \"f2\": 0.009214891264283081, \"f0_5\": 0.035868005738880916, \"p4\": 0.028538670521671944, \"phi\": 0.05172269709897784}, {\"truth_threshold\": 24.400000363588333, \"match_probability\": 0.9999999548281396, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 14.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2017.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 
0.36051636934280396, \"tp_rate\": 0.006893156080748399, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9931068439192516, \"precision\": 1.0, \"recall\": 0.006893156080748399, \"specificity\": 1.0, \"npv\": 0.362112586970272, \"accuracy\": 0.3649244332493703, \"f1\": 0.013691931540342298, \"f2\": 0.00860162202015237, \"f0_5\": 0.033540967896502155, \"p4\": 0.026696388534875385, \"phi\": 0.04996097057493643}, {\"truth_threshold\": 24.70000036805868, \"match_probability\": 0.999999963309048, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 13.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2018.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.006400787789266372, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9935992122107337, \"precision\": 1.0, \"recall\": 0.006400787789266372, \"specificity\": 1.0, \"npv\": 0.3619981030667088, \"accuracy\": 0.3646095717884131, \"f1\": 0.012720156555772993, \"f2\": 0.007988202040063905, \"f0_5\": 0.031204992798847815, \"p4\": 0.024845767623218985, \"phi\": 0.04813598485381783}, {\"truth_threshold\": 24.800000369548798, \"match_probability\": 0.9999999657661313, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 12.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2019.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.005908419497784343, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9940915805022157, \"precision\": 1.0, \"recall\": 0.005908419497784343, \"specificity\": 1.0, \"npv\": 0.36188369152970923, \"accuracy\": 0.3642947103274559, \"f1\": 0.011747430249632892, \"f2\": 0.007374631268436578, \"f0_5\": 0.02886002886002886, \"p4\": 0.022986746233599045, \"phi\": 0.046240249339339734}, {\"truth_threshold\": 24.900000371038914, \"match_probability\": 0.999999968058671, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 11.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2020.0, \"P_rate\": 0.6394836272040302, 
\"N_rate\": 0.36051636934280396, \"tp_rate\": 0.0054160512063023145, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9945839487936977, \"precision\": 1.0, \"recall\": 0.0054160512063023145, \"specificity\": 1.0, \"npv\": 0.3617693522906793, \"accuracy\": 0.36397984886649876, \"f1\": 0.010773751224289911, \"f2\": 0.006760909649661954, \"f0_5\": 0.02650602409638554, \"p4\": 0.021119262209180464, \"phi\": 0.04426467368994309}, {\"truth_threshold\": 25.00000037252903, \"match_probability\": 0.9999999701976862, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 9.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2022.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.004431314623338257, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9955686853766618, \"precision\": 1.0, \"recall\": 0.004431314623338257, \"specificity\": 1.0, \"npv\": 0.361540890432586, \"accuracy\": 0.3633501259445844, \"f1\": 0.008823529411764706, \"f2\": 0.005533013648100332, \"f0_5\": 0.02177068214804064, \"p4\": 0.017358654565300884, \"phi\": 0.040026259314463214}, {\"truth_threshold\": 25.100000374019146, \"match_probability\": 0.9999999721934579, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 6.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2025.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.0029542097488921715, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9970457902511078, \"precision\": 1.0, \"recall\": 0.0029542097488921715, \"specificity\": 1.0, \"npv\": 0.361198738170347, \"accuracy\": 0.36240554156171284, \"f1\": 0.005891016200294551, \"f2\": 0.0036900369003690036, \"f0_5\": 0.014598540145985401, \"p4\": 0.011652683870064942, \"phi\": 0.032665835877723835}, {\"truth_threshold\": 25.400000378489494, \"match_probability\": 0.9999999774140695, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 5.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2026.0, \"P_rate\": 
0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.002461841457410143, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9975381585425899, \"precision\": 1.0, \"recall\": 0.002461841457410143, \"specificity\": 1.0, \"npv\": 0.36108483128350677, \"accuracy\": 0.3620906801007557, \"f1\": 0.004911591355599214, \"f2\": 0.0030754090294009104, \"f0_5\": 0.01218917601170161, \"p4\": 0.009733083985039102, \"phi\": 0.02981498964104606}, {\"truth_threshold\": 25.900000385940075, \"match_probability\": 0.9999999840293354, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 4.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2027.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.0019694731659281144, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.9980305268340719, \"precision\": 1.0, \"recall\": 0.0019694731659281144, \"specificity\": 1.0, \"npv\": 0.3609709962168979, \"accuracy\": 0.36177581863979846, \"f1\": 0.003931203931203931, \"f2\": 0.0024606299212598425, \"f0_5\": 0.009770395701025891, \"p4\": 0.007804568825263287, \"phi\": 0.026663133550419747}, {\"truth_threshold\": 26.600000396370888, \"match_probability\": 0.9999999901689027, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 2.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 2029.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.0009847365829640572, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.999015263417036, \"precision\": 1.0, \"recall\": 0.0009847365829640572, \"specificity\": 1.0, \"npv\": 0.36074354127284186, \"accuracy\": 0.36114609571788414, \"f1\": 0.001967535661583866, \"f2\": 0.0012306177701206006, \"f0_5\": 0.004904364884747425, \"p4\": 0.003920522953249476, \"phi\": 0.018847741566547744}, {\"truth_threshold\": 27.700000412762165, \"match_probability\": 0.999999995413631, \"total_clerical_labels\": 3176.0, \"p\": 2031.0, \"n\": 1145.0, \"tp\": 1.0, \"tn\": 1145.0, \"fp\": 0.0, \"fn\": 
2030.0, \"P_rate\": 0.6394836272040302, \"N_rate\": 0.36051636934280396, \"tp_rate\": 0.0004923682914820286, \"tn_rate\": 1.0, \"fp_rate\": 0.0, \"fn_rate\": 0.999507631708518, \"precision\": 1.0, \"recall\": 0.0004923682914820286, \"specificity\": 1.0, \"npv\": 0.3606299212598425, \"accuracy\": 0.3608312342569269, \"f1\": 0.000984251968503937, \"f2\": 0.0006153846153846154, \"f0_5\": 0.002457002457002457, \"p4\": 0.001964855681779181, \"phi\": 0.013325266908696693}]}}, {\"mode\": \"vega-lite\"});\n",
              "</script>"
            ],
            "text/plain": [
              "alt.Chart(...)"
            ]
          },
          "execution_count": 8,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "linker.evaluation.accuracy_analysis_from_labels_table(labels_table, output_type=\"roc\")"
      ]
    },
    {
      "cell_type": "markdown",
      "id": "12e6ba74",
      "metadata": {},
      "source": [
        "## Truth table\n",
        "\n",
        "Finally, Splink can also report the underlying table used to construct the ROC and precision recall curves.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 9,
      "id": "f7c283ba",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:23.286740Z",
          "iopub.status.busy": "2024-07-18T13:59:23.286467Z",
          "iopub.status.idle": "2024-07-18T13:59:23.494911Z",
          "shell.execute_reply": "2024-07-18T13:59:23.494348Z"
        }
      },
      "outputs": [
        {
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>truth_threshold</th>\n",
              "      <th>match_probability</th>\n",
              "      <th>total_clerical_labels</th>\n",
              "      <th>p</th>\n",
              "      <th>n</th>\n",
              "      <th>tp</th>\n",
              "      <th>tn</th>\n",
              "      <th>fp</th>\n",
              "      <th>fn</th>\n",
              "      <th>P_rate</th>\n",
              "      <th>N_rate</th>\n",
              "      <th>tp_rate</th>\n",
              "      <th>tn_rate</th>\n",
              "      <th>fp_rate</th>\n",
              "      <th>fn_rate</th>\n",
              "      <th>precision</th>\n",
              "      <th>recall</th>\n",
              "      <th>specificity</th>\n",
              "      <th>npv</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>f1</th>\n",
              "      <th>f2</th>\n",
              "      <th>f0_5</th>\n",
              "      <th>p4</th>\n",
              "      <th>phi</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>-18.9</td>\n",
              "      <td>0.000002</td>\n",
              "      <td>3176.0</td>\n",
              "      <td>2031.0</td>\n",
              "      <td>1145.0</td>\n",
              "      <td>1709.0</td>\n",
              "      <td>1103.0</td>\n",
              "      <td>42.0</td>\n",
              "      <td>322.0</td>\n",
              "      <td>0.639484</td>\n",
              "      <td>0.360516</td>\n",
              "      <td>0.841457</td>\n",
              "      <td>0.963319</td>\n",
              "      <td>0.036681</td>\n",
              "      <td>0.158543</td>\n",
              "      <td>0.976014</td>\n",
              "      <td>0.841457</td>\n",
              "      <td>0.963319</td>\n",
              "      <td>0.774035</td>\n",
              "      <td>0.885390</td>\n",
              "      <td>0.903755</td>\n",
              "      <td>0.865316</td>\n",
              "      <td>0.945766</td>\n",
              "      <td>0.880476</td>\n",
              "      <td>0.776931</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>-16.7</td>\n",
              "      <td>0.000009</td>\n",
              "      <td>3176.0</td>\n",
              "      <td>2031.0</td>\n",
              "      <td>1145.0</td>\n",
              "      <td>1709.0</td>\n",
              "      <td>1119.0</td>\n",
              "      <td>26.0</td>\n",
              "      <td>322.0</td>\n",
              "      <td>0.639484</td>\n",
              "      <td>0.360516</td>\n",
              "      <td>0.841457</td>\n",
              "      <td>0.977293</td>\n",
              "      <td>0.022707</td>\n",
              "      <td>0.158543</td>\n",
              "      <td>0.985014</td>\n",
              "      <td>0.841457</td>\n",
              "      <td>0.977293</td>\n",
              "      <td>0.776544</td>\n",
              "      <td>0.890428</td>\n",
              "      <td>0.907594</td>\n",
              "      <td>0.866721</td>\n",
              "      <td>0.952514</td>\n",
              "      <td>0.886010</td>\n",
              "      <td>0.789637</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2</th>\n",
              "      <td>-12.8</td>\n",
              "      <td>0.000140</td>\n",
              "      <td>3176.0</td>\n",
              "      <td>2031.0</td>\n",
              "      <td>1145.0</td>\n",
              "      <td>1709.0</td>\n",
              "      <td>1125.0</td>\n",
              "      <td>20.0</td>\n",
              "      <td>322.0</td>\n",
              "      <td>0.639484</td>\n",
              "      <td>0.360516</td>\n",
              "      <td>0.841457</td>\n",
              "      <td>0.982533</td>\n",
              "      <td>0.017467</td>\n",
              "      <td>0.158543</td>\n",
              "      <td>0.988433</td>\n",
              "      <td>0.841457</td>\n",
              "      <td>0.982533</td>\n",
              "      <td>0.777471</td>\n",
              "      <td>0.892317</td>\n",
              "      <td>0.909043</td>\n",
              "      <td>0.867249</td>\n",
              "      <td>0.955069</td>\n",
              "      <td>0.888076</td>\n",
              "      <td>0.794416</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>3</th>\n",
              "      <td>-12.5</td>\n",
              "      <td>0.000173</td>\n",
              "      <td>3176.0</td>\n",
              "      <td>2031.0</td>\n",
              "      <td>1145.0</td>\n",
              "      <td>1708.0</td>\n",
              "      <td>1125.0</td>\n",
              "      <td>20.0</td>\n",
              "      <td>323.0</td>\n",
              "      <td>0.639484</td>\n",
              "      <td>0.360516</td>\n",
              "      <td>0.840965</td>\n",
              "      <td>0.982533</td>\n",
              "      <td>0.017467</td>\n",
              "      <td>0.159035</td>\n",
              "      <td>0.988426</td>\n",
              "      <td>0.840965</td>\n",
              "      <td>0.982533</td>\n",
              "      <td>0.776934</td>\n",
              "      <td>0.892003</td>\n",
              "      <td>0.908752</td>\n",
              "      <td>0.866829</td>\n",
              "      <td>0.954937</td>\n",
              "      <td>0.887763</td>\n",
              "      <td>0.793897</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>4</th>\n",
              "      <td>-12.4</td>\n",
              "      <td>0.000185</td>\n",
              "      <td>3176.0</td>\n",
              "      <td>2031.0</td>\n",
              "      <td>1145.0</td>\n",
              "      <td>1705.0</td>\n",
              "      <td>1132.0</td>\n",
              "      <td>13.0</td>\n",
              "      <td>326.0</td>\n",
              "      <td>0.639484</td>\n",
              "      <td>0.360516</td>\n",
              "      <td>0.839488</td>\n",
              "      <td>0.988646</td>\n",
              "      <td>0.011354</td>\n",
              "      <td>0.160512</td>\n",
              "      <td>0.992433</td>\n",
              "      <td>0.839488</td>\n",
              "      <td>0.988646</td>\n",
              "      <td>0.776406</td>\n",
              "      <td>0.893262</td>\n",
              "      <td>0.909576</td>\n",
              "      <td>0.866186</td>\n",
              "      <td>0.957542</td>\n",
              "      <td>0.889225</td>\n",
              "      <td>0.797936</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "   truth_threshold  match_probability  total_clerical_labels       p       n  \\\n",
              "0            -18.9           0.000002                 3176.0  2031.0  1145.0   \n",
              "1            -16.7           0.000009                 3176.0  2031.0  1145.0   \n",
              "2            -12.8           0.000140                 3176.0  2031.0  1145.0   \n",
              "3            -12.5           0.000173                 3176.0  2031.0  1145.0   \n",
              "4            -12.4           0.000185                 3176.0  2031.0  1145.0   \n",
              "\n",
              "       tp      tn    fp     fn    P_rate    N_rate   tp_rate   tn_rate  \\\n",
              "0  1709.0  1103.0  42.0  322.0  0.639484  0.360516  0.841457  0.963319   \n",
              "1  1709.0  1119.0  26.0  322.0  0.639484  0.360516  0.841457  0.977293   \n",
              "2  1709.0  1125.0  20.0  322.0  0.639484  0.360516  0.841457  0.982533   \n",
              "3  1708.0  1125.0  20.0  323.0  0.639484  0.360516  0.840965  0.982533   \n",
              "4  1705.0  1132.0  13.0  326.0  0.639484  0.360516  0.839488  0.988646   \n",
              "\n",
              "    fp_rate   fn_rate  precision    recall  specificity       npv  accuracy  \\\n",
              "0  0.036681  0.158543   0.976014  0.841457     0.963319  0.774035  0.885390   \n",
              "1  0.022707  0.158543   0.985014  0.841457     0.977293  0.776544  0.890428   \n",
              "2  0.017467  0.158543   0.988433  0.841457     0.982533  0.777471  0.892317   \n",
              "3  0.017467  0.159035   0.988426  0.840965     0.982533  0.776934  0.892003   \n",
              "4  0.011354  0.160512   0.992433  0.839488     0.988646  0.776406  0.893262   \n",
              "\n",
              "         f1        f2      f0_5        p4       phi  \n",
              "0  0.903755  0.865316  0.945766  0.880476  0.776931  \n",
              "1  0.907594  0.866721  0.952514  0.886010  0.789637  \n",
              "2  0.909043  0.867249  0.955069  0.888076  0.794416  \n",
              "3  0.908752  0.866829  0.954937  0.887763  0.793897  \n",
              "4  0.909576  0.866186  0.957542  0.889225  0.797936  "
            ]
          },
          "execution_count": 9,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "roc_table = linker.evaluation.accuracy_analysis_from_labels_table(\n",
        "    labels_table, output_type=\"table\"\n",
        ")\n",
        "roc_table.as_pandas_dataframe(limit=5)"
      ]
    },
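    {
      "cell_type": "markdown",
      "id": "3c9e7f21",
      "metadata": {},
      "source": [
        "One use of this table is to choose an operating threshold for your pipeline. As a minimal sketch (assuming you want the threshold that maximises the F1 score; any of the other metric columns could be substituted), you could filter the pandas DataFrame produced above:\n",
        "\n",
        "```python\n",
        "# Sketch: select the row with the highest F1 score.\n",
        "# 'roc_table' is the table computed in the previous cell.\n",
        "df = roc_table.as_pandas_dataframe()\n",
        "best = df.loc[df[\"f1\"].idxmax()]\n",
        "print(best[\"truth_threshold\"], best[\"match_probability\"])\n",
        "```"
      ]
    },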
    {
      "cell_type": "markdown",
      "id": "a0d6e855",
      "metadata": {},
      "source": [
        "## Unlinkables chart\n",
        "\n",
        "Finally, it can be interesting to analyse whether your dataset contains any 'unlinkable' records.\n",
        "\n",
        "'Unlinkable records' are records with such poor data quality they don't even link to themselves at a high enough probability to be accepted as matches\n",
        "\n",
        "For example, in a typical linkage problem, a 'John Smith' record with nulls for their address and postcode may be unlinkable.  By 'unlinkable' we don't mean there are no matches; rather, we mean it is not possible to determine whether there are matches.UnicodeTranslateError\n",
        "\n",
        "A high proportion of unlinkable records is an indication of poor quality in the input dataset"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 10,
      "id": "421013d0",
      "metadata": {
        "execution": {
          "iopub.execute_input": "2024-07-18T13:59:23.498601Z",
          "iopub.status.busy": "2024-07-18T13:59:23.498116Z",
          "iopub.status.idle": "2024-07-18T13:59:23.867219Z",
          "shell.execute_reply": "2024-07-18T13:59:23.866532Z"
        }
      },
      "outputs": [
        {
          "data": {
            "text/html": [
              "\n",
              "<style>\n",
              "  #altair-viz-edfcff50df7d4afda12bcd5f27e587d8.vega-embed {\n",
              "    width: 100%;\n",
              "    display: flex;\n",
              "  }\n",
              "\n",
              "  #altair-viz-edfcff50df7d4afda12bcd5f27e587d8.vega-embed details,\n",
              "  #altair-viz-edfcff50df7d4afda12bcd5f27e587d8.vega-embed details summary {\n",
              "    position: relative;\n",
              "  }\n",
              "</style>\n",
              "<div id=\"altair-viz-edfcff50df7d4afda12bcd5f27e587d8\"></div>\n",
              "<script type=\"text/javascript\">\n",
              "  var VEGA_DEBUG = (typeof VEGA_DEBUG == \"undefined\") ? {} : VEGA_DEBUG;\n",
              "  (function(spec, embedOpt){\n",
              "    let outputDiv = document.currentScript.previousElementSibling;\n",
              "    if (outputDiv.id !== \"altair-viz-edfcff50df7d4afda12bcd5f27e587d8\") {\n",
              "      outputDiv = document.getElementById(\"altair-viz-edfcff50df7d4afda12bcd5f27e587d8\");\n",
              "    }\n",
              "    const paths = {\n",
              "      \"vega\": \"https://cdn.jsdelivr.net/npm/vega@5?noext\",\n",
              "      \"vega-lib\": \"https://cdn.jsdelivr.net/npm/vega-lib?noext\",\n",
              "      \"vega-lite\": \"https://cdn.jsdelivr.net/npm/vega-lite@5.17.0?noext\",\n",
              "      \"vega-embed\": \"https://cdn.jsdelivr.net/npm/vega-embed@6?noext\",\n",
              "    };\n",
              "\n",
              "    function maybeLoadScript(lib, version) {\n",
              "      var key = `${lib.replace(\"-\", \"\")}_version`;\n",
              "      return (VEGA_DEBUG[key] == version) ?\n",
              "        Promise.resolve(paths[lib]) :\n",
              "        new Promise(function(resolve, reject) {\n",
              "          var s = document.createElement('script');\n",
              "          document.getElementsByTagName(\"head\")[0].appendChild(s);\n",
              "          s.async = true;\n",
              "          s.onload = () => {\n",
              "            VEGA_DEBUG[key] = version;\n",
              "            return resolve(paths[lib]);\n",
              "          };\n",
              "          s.onerror = () => reject(`Error loading script: ${paths[lib]}`);\n",
              "          s.src = paths[lib];\n",
              "        });\n",
              "    }\n",
              "\n",
              "    function showError(err) {\n",
              "      outputDiv.innerHTML = `<div class=\"error\" style=\"color:red;\">${err}</div>`;\n",
              "      throw err;\n",
              "    }\n",
              "\n",
              "    function displayChart(vegaEmbed) {\n",
              "      vegaEmbed(outputDiv, spec, embedOpt)\n",
              "        .catch(err => showError(`Javascript Error: ${err.message}<br>This usually means there's a typo in your chart specification. See the javascript console for the full traceback.`));\n",
              "    }\n",
              "\n",
              "    if(typeof define === \"function\" && define.amd) {\n",
              "      requirejs.config({paths});\n",
              "      require([\"vega-embed\"], displayChart, err => showError(`Error loading script: ${err.message}`));\n",
              "    } else {\n",
              "      maybeLoadScript(\"vega\", \"5\")\n",
              "        .then(() => maybeLoadScript(\"vega-lite\", \"5.17.0\"))\n",
              "        .then(() => maybeLoadScript(\"vega-embed\", \"6\"))\n",
              "        .catch(showError)\n",
              "        .then(() => displayChart(vegaEmbed));\n",
              "    }\n",
              "  })({\"config\": {\"view\": {\"continuousWidth\": 400, \"continuousHeight\": 300}}, \"layer\": [{\"mark\": {\"type\": \"line\"}, \"encoding\": {\"x\": {\"axis\": {\"format\": \"+\", \"title\": \"Threshold match weight\"}, \"field\": \"match_weight\", \"type\": \"quantitative\"}, \"y\": {\"axis\": {\"format\": \"%\", \"title\": \"Percentage of unlinkable records\"}, \"field\": \"cum_prop\", \"type\": \"quantitative\"}}}, {\"mark\": {\"type\": \"point\"}, \"encoding\": {\"opacity\": {\"condition\": {\"param\": \"x_match_weight_y_cum_prop_coords_of_mouse\", \"value\": 1, \"empty\": false}, \"value\": 0}, \"tooltip\": [{\"field\": \"match_weight\", \"format\": \"+.5\", \"title\": \"Match weight\", \"type\": \"quantitative\"}, {\"field\": \"match_probability\", \"format\": \".5\", \"title\": \"Match probability\", \"type\": \"quantitative\"}, {\"field\": \"cum_prop\", \"format\": \".3%\", \"title\": \"Proportion of unlinkable records\", \"type\": \"quantitative\"}], \"x\": {\"axis\": {\"title\": \"Threshold match weight\"}, \"field\": \"match_weight\", \"type\": \"quantitative\"}, \"y\": {\"axis\": {\"format\": \"%\", \"title\": \"Percentage of unlinkable records\"}, \"field\": \"cum_prop\", \"type\": \"quantitative\"}}, \"name\": \"mouse_coords\"}, {\"mark\": {\"type\": \"rule\", \"color\": \"gray\"}, \"encoding\": {\"x\": {\"field\": \"match_weight\", \"type\": \"quantitative\"}}, \"transform\": [{\"filter\": {\"param\": \"x_match_weight_y_cum_prop_coords_of_mouse\", \"empty\": false}}]}, {\"mark\": {\"type\": \"rule\", \"color\": \"gray\"}, \"encoding\": {\"y\": {\"field\": \"cum_prop\", \"type\": \"quantitative\"}}, \"transform\": [{\"filter\": {\"param\": \"x_match_weight_y_cum_prop_coords_of_mouse\", \"empty\": false}}]}], \"data\": {\"name\": \"data-6a3e3ca5c2af2f39ba3584a28942c6e4\"}, \"height\": 400, \"params\": [{\"name\": \"x_match_weight_y_cum_prop_coords_of_mouse\", \"select\": {\"type\": \"point\", \"fields\": [\"match_weight\", 
\"cum_prop\"], \"nearest\": true, \"on\": \"mouseover\"}, \"views\": [\"mouse_coords\"]}], \"title\": {\"text\": \"Unlinkable records\", \"subtitle\": \"Records with insufficient information to exceed a given match threshold\"}, \"width\": 400, \"$schema\": \"https://vega.github.io/schema/vega-lite/v5.9.3.json\", \"datasets\": {\"data-6a3e3ca5c2af2f39ba3584a28942c6e4\": [{\"match_weight\": -0.58, \"match_probability\": 0.40099, \"prop\": 0.0020000000949949026, \"cum_prop\": 0.0020000000949949026}, {\"match_weight\": 3.29, \"match_probability\": 0.90709, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.003000000142492354}, {\"match_weight\": 4.09, \"match_probability\": 0.9445, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.004000000189989805}, {\"match_weight\": 4.45, \"match_probability\": 0.95627, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.0050000002374872565}, {\"match_weight\": 4.51, \"match_probability\": 0.95795, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.006000000284984708}, {\"match_weight\": 5.09, \"match_probability\": 0.97146, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.007000000332482159}, {\"match_weight\": 5.91, \"match_probability\": 0.98367, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.00800000037997961}, {\"match_weight\": 5.94, \"match_probability\": 0.98393, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.009000000427477062}, {\"match_weight\": 6.19, \"match_probability\": 0.98649, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.010000000474974513}, {\"match_weight\": 6.31, \"match_probability\": 0.98757, \"prop\": 0.0020000000949949026, \"cum_prop\": 0.012000000569969416}, {\"match_weight\": 6.61, \"match_probability\": 0.98983, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.013000000617466867}, {\"match_weight\": 7.01, \"match_probability\": 0.99229, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.014000000664964318}, {\"match_weight\": 7.09, \"match_probability\": 0.99274, \"prop\": 0.0010000000474974513, 
\"cum_prop\": 0.01500000071246177}, {\"match_weight\": 7.19, \"match_probability\": 0.9932, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.01600000075995922}, {\"match_weight\": 7.27, \"match_probability\": 0.99357, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.017000000807456672}, {\"match_weight\": 7.3, \"match_probability\": 0.99369, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.018000000854954123}, {\"match_weight\": 7.5, \"match_probability\": 0.9945, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.019000000902451575}, {\"match_weight\": 7.59, \"match_probability\": 0.99485, \"prop\": 0.003000000026077032, \"cum_prop\": 0.022000000928528607}, {\"match_weight\": 7.9, \"match_probability\": 0.99582, \"prop\": 0.003000000026077032, \"cum_prop\": 0.02500000095460564}, {\"match_weight\": 7.91, \"match_probability\": 0.99587, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.02600000100210309}, {\"match_weight\": 8.19, \"match_probability\": 0.99659, \"prop\": 0.003000000026077032, \"cum_prop\": 0.029000001028180122}, {\"match_weight\": 8.26, \"match_probability\": 0.99674, \"prop\": 0.003000000026077032, \"cum_prop\": 0.032000001054257154}, {\"match_weight\": 8.41, \"match_probability\": 0.99707, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.033000001101754606}, {\"match_weight\": 8.59, \"match_probability\": 0.99742, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.03400000114925206}, {\"match_weight\": 8.76, \"match_probability\": 0.99769, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.03500000119674951}, {\"match_weight\": 8.83, \"match_probability\": 0.9978, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.03600000124424696}, {\"match_weight\": 9.3, \"match_probability\": 0.99841, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.03700000129174441}, {\"match_weight\": 9.44, \"match_probability\": 0.99856, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.03800000133924186}, {\"match_weight\": 9.48, \"match_probability\": 0.9986, \"prop\": 
0.0010000000474974513, \"cum_prop\": 0.039000001386739314}, {\"match_weight\": 9.5, \"match_probability\": 0.99862, \"prop\": 0.0020000000949949026, \"cum_prop\": 0.041000001481734216}, {\"match_weight\": 9.59, \"match_probability\": 0.99871, \"prop\": 0.0020000000949949026, \"cum_prop\": 0.04300000157672912}, {\"match_weight\": 9.73, \"match_probability\": 0.99883, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.04400000162422657}, {\"match_weight\": 9.83, \"match_probability\": 0.9989, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.04500000167172402}, {\"match_weight\": 9.89, \"match_probability\": 0.99895, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.04600000171922147}, {\"match_weight\": 9.94, \"match_probability\": 0.99898, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.047000001766718924}, {\"match_weight\": 10.12, \"match_probability\": 0.9991, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.048000001814216375}, {\"match_weight\": 10.18, \"match_probability\": 0.99914, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.04900000186171383}, {\"match_weight\": 10.27, \"match_probability\": 0.99919, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.05000000190921128}, {\"match_weight\": 10.36, \"match_probability\": 0.99924, \"prop\": 0.0020000000949949026, \"cum_prop\": 0.05200000200420618}, {\"match_weight\": 10.45, \"match_probability\": 0.99928, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.05300000205170363}, {\"match_weight\": 10.53, \"match_probability\": 0.99932, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.05400000209920108}, {\"match_weight\": 10.56, \"match_probability\": 0.99934, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.055000002146698534}, {\"match_weight\": 10.59, \"match_probability\": 0.99935, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.056000002194195986}, {\"match_weight\": 10.61, \"match_probability\": 0.99936, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.05700000224169344}, {\"match_weight\": 10.64, 
\"match_probability\": 0.99937, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.05800000228919089}, {\"match_weight\": 10.76, \"match_probability\": 0.99942, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.05900000233668834}, {\"match_weight\": 10.94, \"match_probability\": 0.99949, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.06000000238418579}, {\"match_weight\": 10.95, \"match_probability\": 0.9995, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.06100000243168324}, {\"match_weight\": 11.06, \"match_probability\": 0.99953, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.062000002479180694}, {\"match_weight\": 11.1, \"match_probability\": 0.99955, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.06300000252667814}, {\"match_weight\": 11.15, \"match_probability\": 0.99956, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.0640000025741756}, {\"match_weight\": 11.27, \"match_probability\": 0.9996, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.06500000262167305}, {\"match_weight\": 11.34, \"match_probability\": 0.99961, \"prop\": 0.0020000000949949026, \"cum_prop\": 0.06700000271666795}, {\"match_weight\": 11.54, \"match_probability\": 0.99966, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.0680000027641654}, {\"match_weight\": 11.76, \"match_probability\": 0.99971, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.06900000281166285}, {\"match_weight\": 11.8, \"match_probability\": 0.99972, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.0700000028591603}, {\"match_weight\": 11.98, \"match_probability\": 0.99975, \"prop\": 0.003000000026077032, \"cum_prop\": 0.07300000288523734}, {\"match_weight\": 12.25, \"match_probability\": 0.99979, \"prop\": 0.003000000026077032, \"cum_prop\": 0.07600000291131437}, {\"match_weight\": 12.37, \"match_probability\": 0.99981, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.07700000295881182}, {\"match_weight\": 12.47, \"match_probability\": 0.99982, \"prop\": 0.004999999888241291, \"cum_prop\": 0.08200000284705311}, 
{\"match_weight\": 12.56, \"match_probability\": 0.99983, \"prop\": 0.0010000000474974513, \"cum_prop\": 0.08300000289455056}, {\"match_weight\": 12.73, \"match_probability\": 0.99985, \"prop\": 0.0020000000949949026, \"cum_prop\": 0.08500000298954546}, {\"match_weight\": 12.84, \"match_probability\": 0.99986, \"prop\": 0.004000000189989805, \"cum_prop\": 0.08900000317953527}, {\"match_weight\": 12.93, \"match_probability\": 0.99987, \"prop\": 0.003000000026077032, \"cum_prop\": 0.0920000032056123}, {\"match_weight\": 13.08, \"match_probability\": 0.99988, \"prop\": 0.006000000052154064, \"cum_prop\": 0.09800000325776637}, {\"match_weight\": 13.15, \"match_probability\": 0.99989, \"prop\": 0.004000000189989805, \"cum_prop\": 0.10200000344775617}, {\"match_weight\": 13.35, \"match_probability\": 0.9999, \"prop\": 0.004000000189989805, \"cum_prop\": 0.10600000363774598}, {\"match_weight\": 13.48, \"match_probability\": 0.99991, \"prop\": 0.004000000189989805, \"cum_prop\": 0.11000000382773578}, {\"match_weight\": 13.68, \"match_probability\": 0.99992, \"prop\": 0.009999999776482582, \"cum_prop\": 0.12000000360421836}, {\"match_weight\": 13.86, \"match_probability\": 0.99993, \"prop\": 0.004999999888241291, \"cum_prop\": 0.12500000349245965}, {\"match_weight\": 14.15, \"match_probability\": 0.99994, \"prop\": 0.004000000189989805, \"cum_prop\": 0.12900000368244946}, {\"match_weight\": 14.41, \"match_probability\": 0.99995, \"prop\": 0.013000000268220901, \"cum_prop\": 0.14200000395067036}, {\"match_weight\": 14.79, \"match_probability\": 0.99996, \"prop\": 0.013000000268220901, \"cum_prop\": 0.15500000421889126}, {\"match_weight\": 15.28, \"match_probability\": 0.99997, \"prop\": 0.02199999988079071, \"cum_prop\": 0.17700000409968197}, {\"match_weight\": 16.02, \"match_probability\": 0.99998, \"prop\": 0.03400000184774399, \"cum_prop\": 0.21100000594742596}, {\"match_weight\": 17.57, \"match_probability\": 0.99999, \"prop\": 0.07599999755620956, \"cum_prop\": 
0.2870000035036355}]}}, {\"mode\": \"vega-lite\"});\n",
              "</script>"
            ],
            "text/plain": [
              "alt.LayerChart(...)"
            ]
          },
          "execution_count": 10,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "linker.evaluation.unlinkables_chart()"
      ]
    },
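    {
      "cell_type": "markdown",
      "id": "unlinkables-save-note",
      "metadata": {},
      "source": [
        "As the `alt.LayerChart` output above suggests, `unlinkables_chart()` returns a standard Altair chart object, so if you want to keep a copy you can save it with Altair's usual `save` method (the filename here is illustrative):\n",
        "\n",
        "```python\n",
        "chart = linker.evaluation.unlinkables_chart()\n",
        "chart.save('unlinkables_chart.html')\n",
        "```"
      ]
    },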
    {
      "cell_type": "markdown",
      "id": "08199ce7",
      "metadata": {},
      "source": [
        "For this dataset and this trained model, we can see that most records are (theoretically) linkable: at a match weight of 6, around 99% of records could be linked to themselves."
      ]
    },
    {
      "cell_type": "markdown",
      "id": "f2d6d5f4",
      "metadata": {},
      "source": [
        "!!! note \"Further Reading\"\n",
        "\n",
        "    :material-tools: For more on the quality assurance tools in Splink, please refer to the [Evaluation API documentation](../../api_docs/evaluation.md).\n",
        "\n",
        "    :bar_chart: For more on the charts used in this tutorial, please refer to the [Charts Gallery](../../charts/index.md#model-evaluation).\n",
        "\n",
        "    :material-thumbs-up-down: For more on the Evaluation Metrics used in this tutorial, please refer to the [Edge Metrics guide](../../topic_guides/evaluation/edge_metrics.md).\n"
      ]
    },
    {
      "cell_type": "markdown",
      "id": "b7ee8f2d",
      "metadata": {},
      "source": [
        "## :material-flag-checkered: That's it!\n",
        "\n",
        "That wraps up the Splink tutorial! Don't worry, there are still plenty of resources to help you on the next steps of your Splink journey:\n",
        "\n",
        ":octicons-link-16: For some end-to-end notebooks of Splink pipelines, check out our [Examples](../examples/examples_index.md)\n",
        "\n",
        ":simple-readme: For deeper dives into the different aspects of Splink, and record linkage more generally, check out our [Topic Guides](../../topic_guides/topic_guides_index.md)\n",
        "\n",
        ":material-tools: For a reference on all the functionality available in Splink, see our [Documentation](../../api_docs/api_docs_index.md)\n"
      ]
    }
  ],
  "metadata": {
    "kernelspec": {
      "display_name": "Python 3 (ipykernel)",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.10.8"
    }
  },
  "nbformat": 4,
  "nbformat_minor": 5
}
