field | type | min | max
hexsha | stringlengths | 40 | 40
size | int64 | 6 | 14.9M
ext | stringclasses | 1 value |
lang | stringclasses | 1 value |
max_stars_repo_path | stringlengths | 6 | 260
max_stars_repo_name | stringlengths | 6 | 119
max_stars_repo_head_hexsha | stringlengths | 40 | 41
max_stars_repo_licenses | sequence | |
max_stars_count | int64 | 1 | 191k
max_stars_repo_stars_event_min_datetime | stringlengths | 24 | 24
max_stars_repo_stars_event_max_datetime | stringlengths | 24 | 24
max_issues_repo_path | stringlengths | 6 | 260
max_issues_repo_name | stringlengths | 6 | 119
max_issues_repo_head_hexsha | stringlengths | 40 | 41
max_issues_repo_licenses | sequence | |
max_issues_count | int64 | 1 | 67k
max_issues_repo_issues_event_min_datetime | stringlengths | 24 | 24
max_issues_repo_issues_event_max_datetime | stringlengths | 24 | 24
max_forks_repo_path | stringlengths | 6 | 260
max_forks_repo_name | stringlengths | 6 | 119
max_forks_repo_head_hexsha | stringlengths | 40 | 41
max_forks_repo_licenses | sequence | |
max_forks_count | int64 | 1 | 105k
max_forks_repo_forks_event_min_datetime | stringlengths | 24 | 24
max_forks_repo_forks_event_max_datetime | stringlengths | 24 | 24
avg_line_length | float64 | 2 | 1.04M
max_line_length | int64 | 2 | 11.2M
alphanum_fraction | float64 | 0 | 1
cells | sequence | |
cell_types | sequence | |
cell_type_groups | sequence | |
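The fields above are the flattened schema header of a notebooks dataset (one row per .ipynb file; the rows follow below). A minimal pandas sketch of how rows with this schema could be loaded and filtered; the parquet file name is a placeholder assumption, not something the dump states:

import pandas as pd

# Hypothetical file name; the dump does not say where the rows are stored.
df = pd.read_parquet("notebooks.parquet")

# Keep permissively licensed notebooks without extreme line lengths.
mask = df["max_line_length"].lt(1000) & df["max_stars_repo_licenses"].apply(lambda ls: "MIT" in ls)
print(df.loc[mask, ["max_stars_repo_name", "size", "alphanum_fraction"]].head())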
d025075d3ab613af40a8acce1c9e5ca463e36fe4
77,130
ipynb
Jupyter Notebook
dcrunch_R_example.ipynb
rwarnung/datacrunch-notebooks
0a0874682c4ed68f9bd347d3db9d527e9daf05dd
[ "MIT" ]
1
2022-02-02T05:22:27.000Z
2022-02-02T05:22:27.000Z
r/dcrunch_R_example.ipynb
ronnyfahrudin/crunchdao-notebooks
82f52d41539068f4d4c3c4bcb7480653ce9abeb6
[ "MIT" ]
null
null
null
r/dcrunch_R_example.ipynb
ronnyfahrudin/crunchdao-notebooks
82f52d41539068f4d4c3c4bcb7480653ce9abeb6
[ "MIT" ]
1
2022-02-01T00:48:49.000Z
2022-02-01T00:48:49.000Z
119.211747
18,638
0.815623
[ [ [ "<a href=\"https://colab.research.google.com/github/rwarnung/datacrunch-notebooks/blob/master/dcrunch_R_example.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ], [ "**Data crunch example R script**\n\n\n---\n\n\nauthor: sweet-richard\n\ndate: Jan 30, 2022\n\nrequired packages:\n\n* `tidyverse` for data handling\n* `feather` for efficient loading of data\n* `xgboost` for predictive modelling\n* `httr` for the automatic upload.\n\n", "_____no_output_____" ] ], [ [ "library(tidyverse)\nlibrary(feather)", "_____no_output_____" ] ], [ [ "First, we set some **parameters**.\n\n\n* `is_download` controls whether you want to download data or just read prevously downloaded data\n* `is_upload` set this to TRUE for automatic upload.\n* `nrounds` is a parameter for `xgboost` that we set to 100 for illustration. You might want to adjust the paramters of xgboost.\n\n\n\n\n\n\n\n\n\n", "_____no_output_____" ] ], [ [ "#' ## Parameters\nfile_name_train = \"train_data.feather\"\nfile_name_test =\"test_data.feather\"\nis_download = TRUE # set this to true to download new data or to FALSE to load data in feather format\nis_upload = FALSE # set this to true to upload a submission\nnrounds = 300 # you might want to adjust this one and other parameters of xgboost", "_____no_output_____" ] ], [ [ "In the **functions** section we defined the correlation measure that we use to measure performance.", "_____no_output_____" ] ], [ [ "#' ## Functions\n#+ \n\ngetCorrMeasure = function(actual, predicted) {\n cor_measure = cor(actual, predicted, method=\"spearman\")\n return(cor_measure)\n}", "_____no_output_____" ] ], [ [ "Now, we either **download** the current data from the servers or load them in feather format. Furthermore, we define the features that we actually want to use. In this illustration we use all of them but `id` and `Moons`.", "_____no_output_____" ] ], [ [ "#' ## Download data\n#' after the download, data is stored in feather format to be read on demand quickly. 
Data is stored in integer format to save memory.\n#+\n\nif( is_download ) {\n \n cat(\"\\n start download\")\n train_datalink_X = 'https://tournament.datacrunch.com/data/X_train.csv' \n train_datalink_y = 'https://tournament.datacrunch.com/data/y_train.csv' \n hackathon_data_link = 'https://tournament.datacrunch.com/data/X_test.csv' \n \n train_dataX = read_csv(url(train_datalink_X))\n train_dataY = read_csv(url(train_datalink_y))\n test_data = read_csv(url(hackathon_data_link))\n \n train_data = \n bind_cols( train_dataX, train_dataY)\n \n train_data = train_data %>% mutate_at(vars(starts_with(\"feature_\")), list(~as.integer(.*100)))\n feather::write_feather(train_data, path = paste0(\"./\", file_name_train))\n \n test_data = test_data %>% mutate_at(vars(starts_with(\"feature_\")), list(~as.integer(.*100)))\n feather::write_feather(test_data, path = paste0(\"./\", file_name_test))\n \n names(train_data)\n \n nrow(train_data)\n nrow(test_data)\n \n cat(\"\\n data is downloaded\")\n} else {\n train_data = feather::read_feather(path = paste0(\"./\", file_name_train))\n test_data = feather::read_feather(path = paste0(\"./\", file_name_test))\n}\n\n## set vars used for modelling \nmodel_vars = setdiff(names(test_data), c(\"id\",\"Moons\"))", "\n start download" ] ], [ [ "Next we fit our go-to algorithm **xgboost** with mainly default parameters, only `eta` and `max_depth` are set.", "_____no_output_____" ] ], [ [ "#' ## Fit xgboost\n#+ cache = TRUE\n\nlibrary(xgboost, warn.conflicts = FALSE)\n\n# custom loss function for eval \ncorrmeasure <- function(preds, dtrain) {\n labels <- getinfo(dtrain, \"label\")\n corrm <- as.numeric(cor(labels, preds, method=\"spearman\"))\n return(list(metric = \"corr\", value = corrm))\n}\n \neval_metric_string = \"rmse\"\nmy_objective = \"reg:squarederror\"\n\ntree.params = list(\n booster = \"gbtree\", eta = 0.01, max_depth = 5,\n tree_method = \"hist\", # tree_method = \"auto\",\n objective = my_objective)\n\ncat(\"\\n starting xgboost \\n\")\n", "\n starting xgboost \n" ] ], [ [ "**First target** `target_r`", "_____no_output_____" ] ], [ [ "# first target target_r then g and b\n################\ncurrent_target = \"target_r\"\n \ndtrain = xgb.DMatrix(train_data %>% select(one_of(model_vars)) %>% as.matrix(), label = train_data %>% select(one_of(current_target)) %>% as.matrix())\n \nxgb.model.tree = xgb.train(data = dtrain, \n params = tree.params, nrounds = nrounds, verbose = 1, \n print_every_n = 50L, eval_metric = corrmeasure) \n \nxgboost_tree_train_pred1 = predict(xgb.model.tree, train_data %>% select(one_of(model_vars)) %>% as.matrix())\nxgboost_tree_live_pred1 = predict(xgb.model.tree, test_data %>% select(one_of(model_vars)) %>% as.matrix())\n \ncor_train = getCorrMeasure(train_data %>% select(one_of(current_target)), xgboost_tree_train_pred1)\n \ncat(\"\\n : metric: \", eval_metric_string, \"\\n\")\nprint(paste0(\"Corrm on train: \", round(cor_train,4)))\n\nprint(paste(\"xgboost\", current_target, \"ready\"))", "\n : metric: rmse \n[1] \"Corrm on train: 0.0867\"\n[1] \"xgboost target_r ready\"\n" ] ], [ [ "**Second target** `target_g`", "_____no_output_____" ] ], [ [ "# second target target_g\n################\ncurrent_target = \"target_g\"\n \ndtrain = xgb.DMatrix(train_data %>% select(one_of(model_vars)) %>% as.matrix(), label = train_data %>% select(one_of(current_target)) %>% as.matrix())\n\nxgb.model.tree = xgb.train(data = dtrain, \n params = tree.params, nrounds = nrounds, verbose = 1, \n print_every_n = 50L, eval_metric = corrmeasure) 
\n\nxgboost_tree_train_pred2 = predict(xgb.model.tree, train_data %>% select(one_of(model_vars)) %>% as.matrix())\nxgboost_tree_live_pred2 = predict(xgb.model.tree, test_data %>% select(one_of(model_vars)) %>% as.matrix())\n\ncor_train = getCorrMeasure(train_data %>% select(one_of(current_target)), xgboost_tree_train_pred2)\n\ncat(\"\\n : metric: \", eval_metric_string, \"\\n\")\nprint(paste0(\"Corrm on train: \", round(cor_train,4)))\n\nprint(paste(\"xgboost\", current_target, \"ready\"))", "\n : metric: rmse \n[1] \"Corrm on train: 0.1099\"\n[1] \"xgboost target_g ready\"\n" ] ], [ [ "**Third target** `target_b`", "_____no_output_____" ] ], [ [ "# third target target_b\n################\ncurrent_target = \"target_b\"\n\ndtrain = xgb.DMatrix(train_data %>% select(one_of(model_vars)) %>% as.matrix(), label = train_data %>% select(one_of(current_target)) %>% as.matrix())\n\nxgb.model.tree = xgb.train(data = dtrain, \n params = tree.params, nrounds = nrounds, verbose = 1, \n print_every_n = 50L, eval_metric = corrmeasure) \n\nxgboost_tree_train_pred3 = predict(xgb.model.tree, train_data %>% select(one_of(model_vars)) %>% as.matrix())\nxgboost_tree_live_pred3 = predict(xgb.model.tree, test_data %>% select(one_of(model_vars)) %>% as.matrix())\n\ncor_train = getCorrMeasure(train_data %>% select(one_of(current_target)), xgboost_tree_train_pred3)\n\ncat(\"\\n : metric: \", eval_metric_string, \"\\n\")\nprint(paste0(\"Corrm on train: \", round(cor_train,4)))\n\nprint(paste(\"xgboost\", current_target, \"ready\"))", "\n : metric: rmse \n[1] \"Corrm on train: 0.1232\"\n[1] \"xgboost target_b ready\"\n" ] ], [ [ "Then we produce simply histogram plots to see whether the predictions are plausible and prepare a **submission file**:", "_____no_output_____" ] ], [ [ "#' ## Submission\n#' simple histograms to check the submissions\n#+\nhist(xgboost_tree_live_pred1)\nhist(xgboost_tree_live_pred2)\nhist(xgboost_tree_live_pred3)\n \n#' create submission file\n#+\nsub_df = tibble(target_r = xgboost_tree_live_pred1, \n target_g = xgboost_tree_live_pred2, \n target_b = xgboost_tree_live_pred3)\n \nfile_name_submission = paste0(\"gbTree_\", gsub(\"-\",\"\",Sys.Date()), \".csv\")\n\nsub_df %>% readr::write_csv(file = paste0(\"./\", file_name_submission))\nnrow(sub_df)\n\ncat(\"\\n submission file written\")", "_____no_output_____" ] ], [ [ "Finally, we can **automatically upload** the file to the server:", "_____no_output_____" ] ], [ [ "#' ## Upload submission\n#+\nif( is_upload ) {\n library(httr)\n \n API_KEY = \"YourKeyHere\"\n \n response <- POST(\n url = \"https://tournament.crunchdao.com/api/v2/submissions\",\n query = list(apiKey = API_KEY),\n body = list(\n file = upload_file(path = paste0(\"./\", file_name_submission))\n ),\n encode = c(\"multipart\")\n );\n \n status <- status_code(response)\n \n if (status == 200) {\n print(\"Submission submitted :)\")\n } else if (status == 400) {\n print(\"ERR: The file must not be empty\")\n print(\"You have send a empty file.\")\n } else if (status == 401) {\n print(\"ERR: Your email hasn't been verified\")\n print(\"Please verify your email or contact a cruncher.\")\n } else if (status == 403) {\n print(\"ERR: Not authentified\")\n print(\"Is the API Key valid?\")\n } else if (status == 404) {\n print(\"ERR: Unknown API Key\")\n print(\"You should check that the provided API key is valid and is the same as the one you've received by email.\")\n } else if (status == 409) {\n print(\"ERR: Duplicate submission\")\n print(\"Your work has already been submitted with 
the same exact results, if you think that this a false positive, contact a cruncher.\")\n print(\"MD5 collision probability: 1/2^128 (source: https://stackoverflow.com/a/288519/7292958)\")\n } else if (status == 422) {\n print(\"ERR: API Key is missing or empty\")\n print(\"Did you forget to fill the API_KEY variable?\")\n } else if (status == 423) {\n print(\"ERR: Submissions are close\")\n print(\"You can only submit during rounds eg: Friday 7pm GMT+1 to Sunday midnight GMT+1.\")\n print(\"Or the server is currently crunching the submitted files, please wait some time before retrying.\")\n } else if (status == 423) {\n print(\"ERR: Too many submissions\")\n } else {\n content <- httr::content(response)\n print(\"ERR: Server returned: \" + toString(status))\n print(\"Ouch! It seems that we were not expecting this kind of result from the server, if the probleme persist, contact a cruncher.\")\n print(paste(\"Message:\", content$message, sep=\" \"))\n }\n \n # DEVELOPER WARNING:\n # THE API ERROR CODE WILL BE HANDLER DIFFERENTLY IN A NEAR FUTURE!\n # PLEASE STAY UPDATED BY JOINING THE DISCORD (https://discord.gg/veAtzsYn3M) AND READING THE NEWSLETTER EMAIL\n}", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d025186eb6f09425ea2418026728f9b3f792f7a2
13,467
ipynb
Jupyter Notebook
Datasets/Terrain/srtm_mtpi.ipynb
dmendelo/earthengine-py-notebooks
515567fa2702b436daf449fff02f5c690003cf94
[ "MIT" ]
2
2020-02-05T02:36:18.000Z
2021-03-23T11:02:39.000Z
Datasets/Terrain/srtm_mtpi.ipynb
Fernigithub/earthengine-py-notebooks
32689dc5da4a86e46ea30d8b22241866c1f7cf61
[ "MIT" ]
null
null
null
Datasets/Terrain/srtm_mtpi.ipynb
Fernigithub/earthengine-py-notebooks
32689dc5da4a86e46ea30d8b22241866c1f7cf61
[ "MIT" ]
3
2021-01-06T17:33:08.000Z
2022-02-18T02:14:18.000Z
79.686391
8,252
0.838717
[ [ [ "<table class=\"ee-notebook-buttons\" align=\"left\">\n <td><a target=\"_blank\" href=\"https://github.com/giswqs/earthengine-py-notebooks/tree/master/Datasets/Terrain/srtm_mtpi.ipynb\"><img width=32px src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" /> View source on GitHub</a></td>\n <td><a target=\"_blank\" href=\"https://nbviewer.jupyter.org/github/giswqs/earthengine-py-notebooks/blob/master/Datasets/Terrain/srtm_mtpi.ipynb\"><img width=26px src=\"https://upload.wikimedia.org/wikipedia/commons/thumb/3/38/Jupyter_logo.svg/883px-Jupyter_logo.svg.png\" />Notebook Viewer</a></td>\n <td><a target=\"_blank\" href=\"https://mybinder.org/v2/gh/giswqs/earthengine-py-notebooks/master?filepath=Datasets/Terrain/srtm_mtpi.ipynb\"><img width=58px src=\"https://mybinder.org/static/images/logo_social.png\" />Run in binder</a></td>\n <td><a target=\"_blank\" href=\"https://colab.research.google.com/github/giswqs/earthengine-py-notebooks/blob/master/Datasets/Terrain/srtm_mtpi.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" /> Run in Google Colab</a></td>\n</table>", "_____no_output_____" ], [ "## Install Earth Engine API\nInstall the [Earth Engine Python API](https://developers.google.com/earth-engine/python_install) and [geehydro](https://github.com/giswqs/geehydro). The **geehydro** Python package builds on the [folium](https://github.com/python-visualization/folium) package and implements several methods for displaying Earth Engine data layers, such as `Map.addLayer()`, `Map.setCenter()`, `Map.centerObject()`, and `Map.setOptions()`.\nThe magic command `%%capture` can be used to hide output from a specific cell. Uncomment these lines if you are running this notebook for the first time.", "_____no_output_____" ] ], [ [ "# %%capture\n# !pip install earthengine-api\n# !pip install geehydro", "_____no_output_____" ] ], [ [ "Import libraries", "_____no_output_____" ] ], [ [ "import ee\nimport folium\nimport geehydro", "_____no_output_____" ] ], [ [ "Authenticate and initialize Earth Engine API. You only need to authenticate the Earth Engine API once. Uncomment the line `ee.Authenticate()` \nif you are running this notebook for the first time or if you are getting an authentication error. ", "_____no_output_____" ] ], [ [ "# ee.Authenticate()\nee.Initialize()", "_____no_output_____" ] ], [ [ "## Create an interactive map \nThis step creates an interactive map using [folium](https://github.com/python-visualization/folium). The default basemap is the OpenStreetMap. Additional basemaps can be added using the `Map.setOptions()` function. \nThe optional basemaps can be `ROADMAP`, `SATELLITE`, `HYBRID`, `TERRAIN`, or `ESRI`.", "_____no_output_____" ] ], [ [ "Map = folium.Map(location=[40, -100], zoom_start=4)\nMap.setOptions('HYBRID')", "_____no_output_____" ] ], [ [ "## Add Earth Engine Python script ", "_____no_output_____" ] ], [ [ "dataset = ee.Image('CSP/ERGo/1_0/Global/SRTM_mTPI')\nsrtmMtpi = dataset.select('elevation')\nsrtmMtpiVis = {\n 'min': -200.0,\n 'max': 200.0,\n 'palette': ['0b1eff', '4be450', 'fffca4', 'ffa011', 'ff0000'],\n}\nMap.setCenter(-105.8636, 40.3439, 11)\nMap.addLayer(srtmMtpi, srtmMtpiVis, 'SRTM mTPI')\n", "_____no_output_____" ] ], [ [ "## Display Earth Engine data layers ", "_____no_output_____" ] ], [ [ "Map.setControlVisibility(layerControl=True, fullscreenControl=True, latLngPopup=True)\nMap", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d02518d7c5a916840d084dcceb0a53ca3785c42d
204,749
ipynb
Jupyter Notebook
learning/ud170/lesson-1.ipynb
WL152/project-omega
04445bc432c773c843e98d9ba09be7745e208aee
[ "MIT" ]
1
2021-08-19T08:03:20.000Z
2021-08-19T08:03:20.000Z
learning/ud170/lesson-1.ipynb
ajmal017/project-omega
c512ea2381396a965444ce97d63c4d88f7a8ee5e
[ "MIT" ]
6
2021-05-29T06:07:07.000Z
2021-08-17T01:24:57.000Z
learning/ud170/lesson-1.ipynb
ajmal017/project-omega
c512ea2381396a965444ce97d63c4d88f7a8ee5e
[ "MIT" ]
2
2021-08-19T08:03:18.000Z
2021-09-14T07:32:10.000Z
298.90365
40,629
0.687842
[ [ [ "#Introduction to Data Science\n\nSee [Lesson 1](https://www.udacity.com/course/intro-to-data-analysis--ud170)\n\nYou should run it in local Jupyter env as this notebook refers to local dataset", "_____no_output_____" ] ], [ [ "import unicodecsv\nfrom datetime import datetime as dt\n\nenrollments_filename = 'dataset/enrollments.csv'\nengagement_filename = 'dataset/daily_engagement.csv'\nsubmissions_filename = 'dataset/project_submissions.csv'\n\n## Longer version of code (replaced with shorter, equivalent version below)\n\ndef read_csv(filename):\n with open(filename, 'rb') as f:\n reader = unicodecsv.DictReader(f)\n return list(reader)\n\nenrollments = read_csv(enrollments_filename)\ndaily_engagement = read_csv(engagement_filename)\nproject_submissions = read_csv(submissions_filename)\n\n\ndef renameKey(data, fromKey, toKey):\n for rec in data:\n if fromKey in rec:\n rec[toKey] = rec[fromKey]\n del rec[fromKey]\nrenameKey(daily_engagement, 'acct', 'account_key')\n\ndef cleanDataTypes():\n def fixIntFloat(data, field):\n if field not in data:\n print(f'WARNING : Field {field} is not in {data}')\n value = data[field]\n\n if value == '':\n data[field] = None\n else:\n data[field] = int(float(value))\n\n def fixFloat(data, field):\n if field not in data:\n print(f'WARNING : Field {field} is not in {data}')\n value = data[field]\n\n if value == '':\n data[field] = None\n else:\n data[field] = float(value)\n\n def fixDate(data, field):\n if field not in data:\n print(f'WARNING : Field {field} is not in {data}')\n value = data[field]\n\n if value == '':\n data[field] = None\n else:\n data[field] = dt.strptime(value, '%Y-%m-%d')\n\n def fixBool(data, field):\n if field not in data:\n print(f'WARNING : Field {field} is not in {data}')\n value = data[field]\n\n if value == 'True':\n data[field] = True\n elif value == 'False':\n data[field] = False\n else:\n print(f\"WARNING: invalid boolean '{value}' value converted to False in {data}\")\n data[field] = False\n\n def fixInt(data, field):\n if field not in data:\n print(f'WARNING : Field {field} is not in {data}')\n value = data[field]\n if value == '':\n data[field] = None\n else:\n data[field] = int(value)\n\n #clean data types\n for rec in enrollments:\n fixInt(rec, 'days_to_cancel')\n fixDate(rec, 'join_date')\n fixDate(rec, 'cancel_date')\n fixBool(rec, 'is_udacity')\n fixBool(rec, 'is_canceled')\n\n for rec in daily_engagement:\n fixDate(rec, 'utc_date')\n fixIntFloat(rec, 'num_courses_visited')\n fixFloat(rec, 'total_minutes_visited')\n fixIntFloat(rec, 'lessons_completed')\n fixIntFloat(rec, 'projects_completed')\n\n for rec in project_submissions:\n fixDate(rec, 'creation_date')\n fixDate(rec, 'completion_date')\n\ncleanDataTypes()\n\nprint(f\"enrollments[0] = {enrollments[0]}\\n\")\nprint(f\"daily_engagement[0] = {daily_engagement[0]}\\n\")\nprint(f\"project_submissions[0] = {project_submissions[0]}\\n\")", "enrollments[0] = {'account_key': '448', 'status': 'canceled', 'join_date': datetime.datetime(2014, 11, 10, 0, 0), 'cancel_date': datetime.datetime(2015, 1, 14, 0, 0), 'days_to_cancel': 65, 'is_udacity': True, 'is_canceled': True}\n\ndaily_engagement[0] = {'utc_date': datetime.datetime(2015, 1, 9, 0, 0), 'num_courses_visited': 1, 'total_minutes_visited': 11.6793745, 'lessons_completed': 0, 'projects_completed': 0, 'account_key': '0'}\n\nproject_submissions[0] = {'creation_date': datetime.datetime(2015, 1, 14, 0, 0), 'completion_date': datetime.datetime(2015, 1, 16, 0, 0), 'assigned_rating': 'UNGRADED', 'account_key': '256', 
'lesson_key': '3176718735', 'processing_state': 'EVALUATED'}\n\n" ], [ "from collections import defaultdict\n\ndef getUniqueAccounts(data):\n accts = defaultdict(list)\n i = 0\n for record in data:\n accountKey = record['account_key']\n accts[accountKey].append(i)\n i+=1\n \n return accts\n\nenrollment_num_rows = len(enrollments)\nenrollment_unique_students = getUniqueAccounts(enrollments)\nenrollment_num_unique_students = len(enrollment_unique_students)\n\nengagement_num_rows = len(daily_engagement)\nengagement_unique_students = getUniqueAccounts(daily_engagement)\nengagement_num_unique_students = len(engagement_unique_students)\n\nsubmission_num_rows = len(project_submissions)\nsubmission_unique_students = getUniqueAccounts(project_submissions)\nsubmission_num_unique_students = len(submission_unique_students)\n\nprint(f\"enrollments total={enrollment_num_rows}, unique={enrollment_num_unique_students}\")\n\nprint(f\"engagements total={engagement_num_rows}, unique={engagement_num_unique_students}\")\n\nprint(f\"submissions total={submission_num_rows} unique={submission_num_unique_students}\")\n", "enrollments total=1640, unique=1302\nengagements total=136240, unique=1237\nsubmissions total=3642 unique=743\n" ], [ "for enrollment_acct in enrollment_unique_students:\n if enrollment_acct not in engagement_unique_students:\n #print(enrollment_unique_students[enrollment])\n enrollment_id = enrollment_unique_students[enrollment_acct][0]\n enrollment = enrollments[enrollment_id]\n print(f\"Strange student : enrollment={enrollment}\")\n break\n", "Strange student : enrollment={'account_key': '1219', 'status': 'canceled', 'join_date': datetime.datetime(2014, 11, 12, 0, 0), 'cancel_date': datetime.datetime(2014, 11, 12, 0, 0), 'days_to_cancel': 0, 'is_udacity': False, 'is_canceled': True}\n" ], [ "strange_enrollments_num_by_different_date = 0\n\nfor enrollment_acct in enrollment_unique_students:\n if enrollment_acct not in engagement_unique_students:\n for enrollment_id in enrollment_unique_students[enrollment_acct]:\n enrollment = enrollments[enrollment_id]\n if enrollment['join_date'] != enrollment['cancel_date']:\n strange_enrollments_num_by_different_date += 1\n #print(f\"Strange student with different dates : enrollments[{enrollment_id}]={enrollment}\\n\")\n\nprint(f\"Number of enrolled and cancelled at different dates but not engaged (problemactic accounts) : {strange_enrollments_num_by_different_date}\\n\")", "Number of enrolled and cancelled at different dates but not engaged (problemactic accounts) : 3\n\n" ], [ "num_problems = 0\nfor enrollment in enrollments:\n student = enrollment['account_key']\n if student not in engagement_unique_students and enrollment['join_date'] != enrollment['cancel_date']:\n num_problems += 1\n #print(enrollment)\n\nprint(f'Number of problematic account records : {num_problems}')", "Number of problematic account records : 3\n" ], [ "def getRealAccounts(enrollmentData):\n result = []\n for rec in enrollmentData:\n if not rec['is_udacity']:\n result.append(rec)\n return result\n\nreal_enrollments = getRealAccounts(enrollments)\nprint(f'Real account : {len(real_enrollments)}')", "Real account : 1622\n" ], [ "def getPaidStudents(enrollmentData):\n freePeriodDays = 7\n result = {}\n #result1 = {}\n for rec in enrollmentData:\n if rec['cancel_date'] == None or rec['days_to_cancel'] > freePeriodDays:\n accountKey = rec['account_key']\n joinDate = rec['join_date']\n if accountKey not in result or joinDate > result[accountKey]:\n result[accountKey] = joinDate\n 
#result1[accountKey] = joinDate\n '''\n for accountKey, joinDate in result.items():\n joinDate1 = result1[accountKey]\n if joinDate != joinDate1:\n print(f\"{accountKey} : {joinDate} != {joinDate1}\")\n '''\n return result\n\npaid_students = getPaidStudents(real_enrollments)\n\nprint(f'Paid students : {len(paid_students)}')", "Paid students : 995\n" ], [ "def isEngagementWithingOneWeek(joinDate, engagementDate):\n #if joinDate > engagementDate:\n # print(f'WARNING: join date is after engagement date')\n timeDelta = engagementDate - joinDate\n return 0 <= timeDelta.days and timeDelta.days < 7\n\n\ndef collectPaidEnagagementsInTheFirstWeek():\n result = []\n i = 0\n for engagement in daily_engagement:\n accountKey = engagement['account_key']\n if accountKey in paid_students:\n joinDate = paid_students[accountKey]\n engagementDate = engagement['utc_date']\n if isEngagementWithingOneWeek(joinDate, engagementDate):\n result.append(i)\n i+=1\n return result\n\npaid_engagement_in_first_week = collectPaidEnagagementsInTheFirstWeek()\n\nprint(f'Number of paid engagements in the first week : {len(paid_engagement_in_first_week)}')", "Number of paid engagements in the first week : 6919\n" ], [ "from collections import defaultdict\nimport numpy as np\n\ndef groupEngagementsByAccounts(engagements):\n result = defaultdict(list)\n for engagementId in engagements:\n engagement = daily_engagement[engagementId]\n accountKey = engagement['account_key']\n result[accountKey].append(engagementId)\n return result\n\nfirst_week_paid_engagements_by_account = groupEngagementsByAccounts(paid_engagement_in_first_week)\n\ndef sumEngagementsStatByAccount(engagements, getStatValue):\n result = {}\n for accountKey, engagementIds in engagements.items():\n stat_sum = 0\n for engagementId in engagementIds:\n engagement = daily_engagement[engagementId]\n stat_sum += getStatValue(engagement)\n result[accountKey] = stat_sum\n return result\n\ndef printStats(getStatValue, statLabel):\n first_week_paid_engagements_sum_stat_by_account = sumEngagementsStatByAccount(first_week_paid_engagements_by_account, getStatValue)\n first_week_paid_engagements_sum_stat = list(first_week_paid_engagements_sum_stat_by_account.values())\n\n print(f'Average {statLabel} spent by paid accounts during the first week : {np.mean(first_week_paid_engagements_sum_stat)}')\n print(f'StdDev {statLabel} spent by paid accounts during the first week : {np.std(first_week_paid_engagements_sum_stat)}')\n print(f'Min {statLabel} spent by paid accounts during the first week : {np.min(first_week_paid_engagements_sum_stat)}')\n print(f'Max {statLabel} spent by paid accounts during the first week : {np.max(first_week_paid_engagements_sum_stat)}')\n\n print('\\n')\n\nprintStats((lambda data : data['total_minutes_visited']), 'minutes')\nprintStats((lambda data : data['lessons_completed']), 'lessons')\nprintStats((lambda data : 1 if data['num_courses_visited'] > 0 else 0), 'days')\n", "Average minutes spent by paid accounts during the first week : 306.70832675342825\nStdDev minutes spent by paid accounts during the first week : 412.99693340852957\nMin minutes spent by paid accounts during the first week : 0.0\nMax minutes spent by paid accounts during the first week : 3564.7332644989997\n\n\nAverage lessons spent by paid accounts during the first week : 1.636180904522613\nStdDev lessons spent by paid accounts during the first week : 3.002561299829423\nMin lessons spent by paid accounts during the first week : 0\nMax lessons spent by paid accounts during the first week : 
36\n\n\nAverage days spent by paid accounts during the first week : 2.8673366834170855\nStdDev days spent by paid accounts during the first week : 2.2551980029196814\nMin days spent by paid accounts during the first week : 0\nMax days spent by paid accounts during the first week : 7\n\n\n" ], [ "######################################\n# 11 #\n######################################\n\n## Create two lists of engagement data for paid students in the first week.\n## The first list should contain data for students who eventually pass the\n## subway project, and the second list should contain data for students\n## who do not.\n\nsubway_project_lesson_keys = {'746169184', '3176718735'}\npassing_grades = {'DISTINCTION', 'PASSED'} #{'', 'INCOMPLETE', 'DISTINCTION', 'PASSED', 'UNGRADED'}\n#passing_grades = {'PASSED'} #{'', 'INCOMPLETE', 'DISTINCTION', 'PASSED', 'UNGRADED'}\n\n\npassing_engagement = []\nnon_passing_engagement = []\n\nfor accountKey, engagementIds in first_week_paid_engagements_by_account.items():\n if accountKey in submission_unique_students:\n submissionIds = submission_unique_students[accountKey]\n isPassed = False\n for submissionId in submissionIds:\n submission = project_submissions[submissionId]\n if submission['assigned_rating'] in passing_grades and submission['lesson_key'] in subway_project_lesson_keys:\n isPassed = True\n break\n if isPassed:\n passing_engagement += engagementIds\n else:\n non_passing_engagement += engagementIds\n else:\n non_passing_engagement += engagementIds\n\nprint(f'First week engagements with passing grade : {len(passing_engagement)}')\nprint(f'First week engagements with non-passing grade : {len(non_passing_engagement)}')\n", "First week engagements with passing grade : 4527\nFirst week engagements with non-passing grade : 2392\n" ], [ "######################################\n# 12 #\n######################################\n\n## Compute some metrics you're interested in and see how they differ for\n## students who pass the subway project vs. students who don't. 
A good\n## starting point would be the metrics we looked at earlier (minutes spent\n## in the classroom, lessons completed, and days visited).\n\npassing_engagement_by_account = groupEngagementsByAccounts(passing_engagement)\nnon_passing_engagement_by_account = groupEngagementsByAccounts(non_passing_engagement)\n\ndef getArgStatEngagements(engagementIds, getStatValue):\n stat_sum = 0\n stat_num = 0\n for engagementId in engagementIds:\n engagement = daily_engagement[engagementId]\n stat_sum += getStatValue(engagement)\n stat_num += 1\n if stat_num > 0:\n return stat_sum / stat_num\n else:\n return 0\n\n#sumEngagementsStatByAccount(first_week_paid_engagements_by_account, getStatValue)\npassed_minutes = list(sumEngagementsStatByAccount(passing_engagement_by_account, (lambda data : data['total_minutes_visited'])).values())\nnon_passed_minutes = list(sumEngagementsStatByAccount(non_passing_engagement_by_account, (lambda data : data['total_minutes_visited'])).values())\npassed_lessons = list(sumEngagementsStatByAccount(passing_engagement_by_account, (lambda data : data['lessons_completed'])).values())\nnon_passed_lessons = list(sumEngagementsStatByAccount(non_passing_engagement_by_account, (lambda data : data['lessons_completed'])).values())\npassed_days = list(sumEngagementsStatByAccount(passing_engagement_by_account, (lambda data : 1 if data['num_courses_visited'] > 0 else 0)).values())\nnon_passed_days = list(sumEngagementsStatByAccount(non_passing_engagement_by_account, (lambda data : 1 if data['num_courses_visited'] > 0 else 0)).values())\n\n\nprint(f'Passed Avg Minutes = {np.mean(passed_minutes)}')\nprint(f'Non passed Avg Minutes = {np.mean(non_passed_minutes)}')\nprint(f'Passed Avg Lessons = {np.mean(passed_lessons)}')\nprint(f'Non passed Avg Lessons = {np.mean(non_passed_lessons)}')\nprint(f'Passed Avg Days = {np.mean(passed_days)}')\nprint(f'Non passed Avg Days = {np.mean(non_passed_days)}')\n", "Passed Avg Minutes = 394.58604648350865\nNon passed Avg Minutes = 143.32647426675584\nPassed Avg Lessons = 2.052550231839258\nNon passed Avg Lessons = 0.8620689655172413\nPassed Avg Days = 3.384853168469861\nNon passed Avg Days = 1.9051724137931034\n" ], [ "######################################\n# 13 #\n######################################\n\n## Make histograms of the three metrics we looked at earlier for both\n## students who passed the subway project and students who didn't. You\n## might also want to make histograms of any other metrics you examined.\n\n%matplotlib inline\n\nimport matplotlib.pyplot as plt\nimport seaborn as sns\n\nplt.hist(passed_minutes, color ='green')\nplt.hist(non_passed_minutes, color ='lightblue')\nplt.xlabel('Number of minutes')\nplt.title('Passed (green) VS Non-passed (light-blue) students')\n#sns.displot(passed_minutes, color ='green')\n#sns.displot(non_passed_minutes, color ='lightblue')", "_____no_output_____" ], [ "plt.hist(passed_lessons, color ='green')\nplt.hist(non_passed_lessons, color ='lightblue')\nplt.xlabel('Number of lessons')\nplt.title('Passed (green) VS Non-passed (light-blue) students')", "_____no_output_____" ], [ "plt.hist(passed_days, color ='green', bins = 8)\nplt.xlabel('Number of days')\nplt.title('Passed students')", "_____no_output_____" ], [ "plt.hist(non_passed_days, color ='lightblue', bins = 8)\nplt.xlabel('Number of days')\nplt.title('Non-passed students')", "_____no_output_____" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0251bdf6e19d3b0f11c1fdcdf6761bbcbfa563c
43,281
ipynb
Jupyter Notebook
3_ML/src/ANN_feature_selection/.ipynb_checkpoints/Keras_FFN_multitask-checkpoint.ipynb
IBPA/MutationDB
eb1648e511cf6cc4a9e2cc72abafae4a8b20ac30
[ "Apache-2.0" ]
1
2019-09-25T20:51:20.000Z
2019-09-25T20:51:20.000Z
3_ML/src/ANN_feature_selection/.ipynb_checkpoints/Keras_FFN_multitask-checkpoint.ipynb
IBPA/MutationDB
eb1648e511cf6cc4a9e2cc72abafae4a8b20ac30
[ "Apache-2.0" ]
null
null
null
3_ML/src/ANN_feature_selection/.ipynb_checkpoints/Keras_FFN_multitask-checkpoint.ipynb
IBPA/MutationDB
eb1648e511cf6cc4a9e2cc72abafae4a8b20ac30
[ "Apache-2.0" ]
null
null
null
70.490228
6,214
0.698066
[ [ [ "import pandas as pd\nimport numpy as np\nimport keras\nfrom keras.models import Sequential,Model\nfrom keras.layers import Dense, Dropout,BatchNormalization,Input\nfrom keras.optimizers import RMSprop\nfrom keras.regularizers import l2,l1\nfrom keras.optimizers import Adam\n\nfrom sklearn.model_selection import LeaveOneOut\nfrom sklearn.metrics import roc_curve, auc\nimport matplotlib.pyplot as plt\nfrom keras.callbacks import EarlyStopping", "Using TensorFlow backend.\n" ], [ "df = pd.read_csv(\"../../out_data/MLDB.csv\")", "_____no_output_____" ], [ "first_gene_index = df.columns.get_loc(\"rrrD\")", "_____no_output_____" ], [ "X, Y = np.split(df, [first_gene_index], axis=1)\nX = X.values\nX = X-0.5\nY1 = Y.values[:,1]\nY2 = Y.values[:,1]", "_____no_output_____" ], [ "X.shape", "_____no_output_____" ], [ "import collections\nModel_setting = collections.namedtuple('Model_setting','num_layers num_node alpha drop_rate act_method lr regularization \\\npatience')", "_____no_output_____" ], [ "setting_ = [1,100, 0.5, 0.2, 'tanh', 0.01, 'l2', 3]\nsetting = Model_setting(*setting_)\nsetting = setting._asdict()", "_____no_output_____" ], [ "setting", "_____no_output_____" ], [ "def getModel(setting,num_input=84):\n regularizer = l1(setting['alpha']) if setting['regularization']=='l1' else l2(setting['alpha'])\n \n model = Sequential()\n for i in range(setting['num_layers']):\n if i==0:\n model.add(Dense(setting['num_node'], input_shape=(num_input,), activation=setting['act_method'],\\\n kernel_regularizer = regularizer))\n model.add(Dropout(setting['drop_rate']))\n else:\n model.add(Dense(setting['num_node']//(2**i), activation=setting['act_method']))\n model.add(Dropout(setting['drop_rate']))\n model.add(Dense(1, activation='sigmoid'))\n model.compile(loss='binary_crossentropy', optimizer=Adam(lr=setting['lr']), metrics=['accuracy'])\n return model", "_____no_output_____" ], [ "num_output_ = 3\ndef create_model(num_input = 84,num_output = num_output_):\n X_input = Input(shape=(num_input,))\n\n X = Dense(64)(X_input)\n X = Dropout(0.2)(X)\n X = Dense(32)(X)\n Ys= []\n for i in range(num_output):\n Ys.append(Dense(1, activation = 'sigmoid')(X))\n model = Model(inputs=[X_input],outputs = Ys)\n model.compile(loss=['binary_crossentropy']*num_output,loss_weights=[1.]*num_output,optimizer=Adam(lr=setting['lr']), metrics=['accuracy'])\n return model", "_____no_output_____" ], [ "model = create_model()\ncallbacks = [EarlyStopping(monitor='loss',min_delta=0,patience=setting['patience'])]", "_____no_output_____" ], [ "ys = [*((Y.values).T[:num_output_])]", "_____no_output_____" ], [ "model.fit(X,ys,epochs = 50, verbose = 1,callbacks =callbacks)", "Epoch 1/50\n178/178 [==============================] - 0s - loss: 1.1467 - dense_8_loss: 0.5328 - dense_9_loss: 0.2510 - dense_10_loss: 0.3629 - dense_8_acc: 0.8258 - dense_9_acc: 0.8876 - dense_10_acc: 0.8371 \nEpoch 2/50\n178/178 [==============================] - 0s - loss: 0.6560 - dense_8_loss: 0.3028 - dense_9_loss: 0.1767 - dense_10_loss: 0.1766 - dense_8_acc: 0.8820 - dense_9_acc: 0.9775 - dense_10_acc: 0.9775 \nEpoch 3/50\n178/178 [==============================] - 0s - loss: 0.6066 - dense_8_loss: 0.3507 - dense_9_loss: 0.1281 - dense_10_loss: 0.1278 - dense_8_acc: 0.9213 - dense_9_acc: 0.9775 - dense_10_acc: 0.9719 \nEpoch 4/50\n178/178 [==============================] - 0s - loss: 0.4402 - dense_8_loss: 0.1729 - dense_9_loss: 0.1714 - dense_10_loss: 0.0959 - dense_8_acc: 0.9382 - dense_9_acc: 0.9775 - dense_10_acc: 0.9775 \nEpoch 5/50\n178/178 
[==============================] - 0s - loss: 0.3534 - dense_8_loss: 0.1642 - dense_9_loss: 0.1119 - dense_10_loss: 0.0773 - dense_8_acc: 0.9270 - dense_9_acc: 0.9719 - dense_10_acc: 0.9831 \nEpoch 6/50\n178/178 [==============================] - 0s - loss: 0.3680 - dense_8_loss: 0.1766 - dense_9_loss: 0.0904 - dense_10_loss: 0.1010 - dense_8_acc: 0.9494 - dense_9_acc: 0.9775 - dense_10_acc: 0.9775 \nEpoch 7/50\n178/178 [==============================] - 0s - loss: 0.3434 - dense_8_loss: 0.1552 - dense_9_loss: 0.0953 - dense_10_loss: 0.0929 - dense_8_acc: 0.9551 - dense_9_acc: 0.9775 - dense_10_acc: 0.9775 \nEpoch 8/50\n178/178 [==============================] - 0s - loss: 0.3200 - dense_8_loss: 0.1422 - dense_9_loss: 0.0737 - dense_10_loss: 0.1041 - dense_8_acc: 0.9438 - dense_9_acc: 0.9775 - dense_10_acc: 0.9775 \nEpoch 9/50\n178/178 [==============================] - 0s - loss: 0.2566 - dense_8_loss: 0.1182 - dense_9_loss: 0.0744 - dense_10_loss: 0.0640 - dense_8_acc: 0.9607 - dense_9_acc: 0.9775 - dense_10_acc: 0.9888 \nEpoch 10/50\n178/178 [==============================] - 0s - loss: 0.2773 - dense_8_loss: 0.1246 - dense_9_loss: 0.0789 - dense_10_loss: 0.0738 - dense_8_acc: 0.9607 - dense_9_acc: 0.9775 - dense_10_acc: 0.9775 \nEpoch 11/50\n178/178 [==============================] - 0s - loss: 0.2452 - dense_8_loss: 0.1198 - dense_9_loss: 0.0804 - dense_10_loss: 0.0451 - dense_8_acc: 0.9663 - dense_9_acc: 0.9775 - dense_10_acc: 0.9719 \nEpoch 12/50\n178/178 [==============================] - 0s - loss: 0.2215 - dense_8_loss: 0.1145 - dense_9_loss: 0.0667 - dense_10_loss: 0.0403 - dense_8_acc: 0.9607 - dense_9_acc: 0.9719 - dense_10_acc: 0.9944 \nEpoch 13/50\n178/178 [==============================] - 0s - loss: 0.2623 - dense_8_loss: 0.1333 - dense_9_loss: 0.0784 - dense_10_loss: 0.0507 - dense_8_acc: 0.9494 - dense_9_acc: 0.9775 - dense_10_acc: 0.9775 \nEpoch 14/50\n178/178 [==============================] - 0s - loss: 0.2615 - dense_8_loss: 0.0934 - dense_9_loss: 0.0827 - dense_10_loss: 0.0854 - dense_8_acc: 0.9719 - dense_9_acc: 0.9775 - dense_10_acc: 0.9551 \nEpoch 15/50\n178/178 [==============================] - 0s - loss: 0.2276 - dense_8_loss: 0.1153 - dense_9_loss: 0.0632 - dense_10_loss: 0.0492 - dense_8_acc: 0.9663 - dense_9_acc: 0.9831 - dense_10_acc: 0.9775 \nEpoch 16/50\n178/178 [==============================] - 0s - loss: 0.2404 - dense_8_loss: 0.1128 - dense_9_loss: 0.0941 - dense_10_loss: 0.0335 - dense_8_acc: 0.9438 - dense_9_acc: 0.9775 - dense_10_acc: 0.9831 \n" ], [ "history = final_model.fit(X_train, [Y_train, Y_train2], \n nb_epoch = 100, \n batch_size = 256, \n verbose=1, \n validation_data=(X_test, [Y_test, Y_test2]),\n callbacks=[reduce_lr, checkpointer],\n shuffle=True)", "_____no_output_____" ], [ "callbacks = [EarlyStopping(monitor='loss',min_delta=0,patience=setting['patience'])]", "_____no_output_____" ], [ "def cross_validation(X,Y,setting,num_input):\n model = getModel(setting,num_input)\n preds = []\n for train, test in LeaveOneOut().split(X, Y):\n model.fit(X[train,:],Y[train],epochs=20,verbose=0, callbacks =callbacks)\n probas_ = model.predict(X[test,:])\n preds.append(probas_[0][0])\n # Compute ROC curve and area the curve\n fpr, tpr, thresholds = roc_curve(Y, preds)\n roc_auc = auc(fpr, tpr)\n if roc_auc < 0.5:\n roc_auc = 1 - roc_auc\n return roc_auc", "_____no_output_____" ], [ "\ndef backward_selection(X,Y,setting):\n survive_index=[i for i in range(X.shape[1])]\n best_perf=0 \n for i in range(len(survive_index)-1):\n perfs = []\n\n 
print(survive_index)\n for index in survive_index:\n print(index)\n survive_index_copy = [i for i in survive_index if i!=index]\n perfs.append(cross_validation(X[:,survive_index_copy],Y,setting,num_input = len(survive_index)-1))\n print(\"best_perf\",best_perf)\n max_index = np.argmax(perfs)\n current_best = np.max(perfs)\n\n print(\"current_best\",current_best)\n if current_best > best_perf:\n best_perf = current_best\n survive_index.remove(survive_index[max_index])\n else:\n break\n return (survive_index,best_perf)", "_____no_output_____" ], [ "backward_selection(X[:,0:10],Y,setting)", "[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]\n0\n1\n2\n3\n4\n5\n6\n7\n8\n9\nbest_perf 0\ncurrent_best 1.0\n[1, 2, 3, 4, 5, 6, 7, 8, 9]\n1\n" ], [ "fpr, tpr, thresholds = roc_curve(Y, preds)\nroc_auc = auc(fpr, tpr)\nplt.plot(fpr, tpr, lw=1, alpha=0.3)\nplt.title('(AUC = %0.2f)' % (roc_auc))\nplt.show()", "_____no_output_____" ], [ "def cross_validation(X=X,Y=Y,epochs_=20,num_input_ = 84):\n model = getModel(num_input=num_input_)\n preds = []\n for train, test in LeaveOneOut().split(X, Y):\n model.fit(X,Y,epochs=epochs_,verbose=0)\n # print(test)\n probas_ = model.predict(X[test,:])\n preds.append(probas_[0][0])\n # Compute ROC curve and area the curve\n fpr, tpr, thresholds = roc_curve(Y, preds)\n roc_auc = auc(fpr, tpr)\n return roc_auc", "_____no_output_____" ], [ "survive_index=[i for i in range(4)]\ndef backward_selection(survive_index):\n for i in range(len(survive_index)-1):\n perfs = []\n best_perf=0\n for index in survive_index:\n print(index,\"\\n\")\n survive_index_copy = [i for i in survive_index if i!=index]\n perfs.append(cross_validation(X=X[:,survive_index_copy],Y=Y,epochs_=20,num_input_ = len(survive_index)-1))\n \n max_index = np.argmax(perfs)\n current_best = np.max(perfs)\n print(current_best)\n if current_best > best_perf:\n best_perf = current_best\n survive_index.remove(survive_index[max_index])\n else:\n break\n return survive_index", "_____no_output_____" ], [ "backward_selection(survive_index)", "0 \n\n1 \n\n2 \n\n3 \n\n0.807926829268\n0 \n\n1 \n\n2 \n\n0.81881533101\n0 \n\n" ], [ "max_index = np.argmax(perfs)\nsurvive_index[max_index]", "_____no_output_____" ], [ "fpr, tpr, thresholds = roc_curve(Y, preds)\nroc_auc = auc(fpr, tpr)\nplt.plot(fpr, tpr, lw=1, alpha=0.3)\nplt.title('(AUC = %0.2f)' % (roc_auc))\nplt.show()", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d02522b65bd61f622c0244e9bfff472944a14c58
850
ipynb
Jupyter Notebook
cclhm0069/_build/jupyter_execute/mod4b/sem14.ipynb
ericbrasiln/intro-historia-digital
5733dc55396beffeb916693c552fd4eb987472d0
[ "MIT" ]
null
null
null
cclhm0069/_build/jupyter_execute/mod4b/sem14.ipynb
ericbrasiln/intro-historia-digital
5733dc55396beffeb916693c552fd4eb987472d0
[ "MIT" ]
null
null
null
cclhm0069/_build/jupyter_execute/mod4b/sem14.ipynb
ericbrasiln/intro-historia-digital
5733dc55396beffeb916693c552fd4eb987472d0
[ "MIT" ]
null
null
null
19.767442
135
0.487059
[ [ [ "# Semana 14\n\n## Módulo 4b: Programação para História\n\n**Período**: 24/01/2022 a 28/01/2022\n\n**CH**: 2h30", "_____no_output_____" ], [ "## Live Coding 2 (AS)\n\n**Tema**: Python do Zero II\n\n**Data**: 02/02/2022, às 19h\n\n**CH**: 2h30\n\n**Plataforma**: `Google Meet` - link enviado por e-mail.\n\n```{Attention}\n[_Clique aqui para acessar a apresentação da aula_](https://ericbrasiln.github.io/intro-historia-digital/mod4b/sem13_ap.html).\n```", "_____no_output_____" ] ] ]
[ "markdown" ]
[ [ "markdown", "markdown" ] ]
d02540be05faa26ecc7f7187e0f71092d55c79a1
18,825
ipynb
Jupyter Notebook
01_Linear_Regression.ipynb
sarincr/Data-Analytics-with-PyTorch
2e13164ce0b16897ddf0e93c6d19c8383a11b945
[ "MIT" ]
null
null
null
01_Linear_Regression.ipynb
sarincr/Data-Analytics-with-PyTorch
2e13164ce0b16897ddf0e93c6d19c8383a11b945
[ "MIT" ]
null
null
null
01_Linear_Regression.ipynb
sarincr/Data-Analytics-with-PyTorch
2e13164ce0b16897ddf0e93c6d19c8383a11b945
[ "MIT" ]
null
null
null
86.751152
12,618
0.79081
[ [ [ "!pip install torch torchvision\nimport torch\nimport torch.nn as nn\nimport numpy as np\nimport matplotlib.pyplot as plt", "Requirement already satisfied: torch in /usr/local/lib/python3.6/dist-packages (1.6.0+cu101)\nRequirement already satisfied: torchvision in /usr/local/lib/python3.6/dist-packages (0.7.0+cu101)\nRequirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from torch) (1.18.5)\nRequirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from torch) (0.16.0)\nRequirement already satisfied: pillow>=4.1.1 in /usr/local/lib/python3.6/dist-packages (from torchvision) (7.0.0)\n" ], [ "# Hyper-parameters\ninput_size = 1\noutput_size = 1\nnum_epochs = 60\nlearning_rate = 0.001", "_____no_output_____" ], [ "\n# Toy dataset\nx_train = np.array([[3.3], [4.4], [5.5], [6.71], [6.93], [4.168], \n [9.779], [6.182], [7.59], [2.167], [7.042], \n [10.791], [5.313], [7.997], [3.1]], dtype=np.float32)\n\ny_train = np.array([[1.7], [2.76], [2.09], [3.19], [1.694], [1.573], \n [3.366], [2.596], [2.53], [1.221], [2.827], \n [3.465], [1.65], [2.904], [1.3]], dtype=np.float32)", "_____no_output_____" ], [ "# Linear regression model\nmodel = nn.Linear(input_size, output_size)", "_____no_output_____" ], [ "# Loss and optimizer\ncriterion = nn.MSELoss()\noptimizer = torch.optim.SGD(model.parameters(), lr=learning_rate) ", "_____no_output_____" ], [ "# Train the model\nfor epoch in range(num_epochs):\n # Convert numpy arrays to torch tensors\n inputs = torch.from_numpy(x_train)\n targets = torch.from_numpy(y_train)\n\n # Forward pass\n outputs = model(inputs)\n loss = criterion(outputs, targets)\n \n # Backward and optimize\n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n \n if (epoch+1) % 5 == 0:\n print ('Epoch [{}/{}], Loss: {:.4f}'.format(epoch+1, num_epochs, loss.item()))\n", "Epoch [5/60], Loss: 1.3239\nEpoch [10/60], Loss: 0.6392\nEpoch [15/60], Loss: 0.3618\nEpoch [20/60], Loss: 0.2494\nEpoch [25/60], Loss: 0.2038\nEpoch [30/60], Loss: 0.1854\nEpoch [35/60], Loss: 0.1779\nEpoch [40/60], Loss: 0.1749\nEpoch [45/60], Loss: 0.1736\nEpoch [50/60], Loss: 0.1731\nEpoch [55/60], Loss: 0.1729\nEpoch [60/60], Loss: 0.1728\n" ], [ "# Plot the graph\npredicted = model(torch.from_numpy(x_train)).detach().numpy()\nplt.plot(x_train, y_train, 'ro', label='Original data')\nplt.plot(x_train, predicted, label='Fitted line')\nplt.legend()\nplt.show()", "_____no_output_____" ], [ "# Save the model checkpoint\ntorch.save(model.state_dict(), 'model.ckpt')", "_____no_output_____" ], [ "", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d025454ca249b5b7926952bcc0b9c5ea914be4f0
2,324
ipynb
Jupyter Notebook
GUI_Application.ipynb
SarahRebulado/OOP1_2
33870eda978131b60ca3f33247c4ca74d67ef7db
[ "Apache-2.0" ]
null
null
null
GUI_Application.ipynb
SarahRebulado/OOP1_2
33870eda978131b60ca3f33247c4ca74d67ef7db
[ "Apache-2.0" ]
null
null
null
GUI_Application.ipynb
SarahRebulado/OOP1_2
33870eda978131b60ca3f33247c4ca74d67ef7db
[ "Apache-2.0" ]
null
null
null
30.986667
232
0.49957
[ [ [ "<a href=\"https://colab.research.google.com/github/SarahRebulado/OOP1_2/blob/main/GUI_Application.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ] ], [ [ "#@title Students Grade in OOP\n\nStudent_Name1= \"Enter the student name\"#@param{type: \"string\"}\nprelim= 90#@param{type: \"number\"}\nmidterm= 95#@param{type: \"number\"}\nfinal= 100#@param{type: \"number\"}\nsemestral_grade=(prelim+midterm+final)/3\n\nprint(\"The prelim grade of student 1 is\"+\" \"+ str(prelim))\nprint(\"The midterm grade of student 1 is\"+\" \"+ str(midterm))\nprint(\"The final grade of student 1 is\"+\" \"+ str(final))\nprint(\"The semestral grade of student1 is \" + str(semestral_grade))\n\n#@title Gender\nGender=\"Female\" #@param[\"Male\", \"Female\"]\nprint(Gender)\n\nBirthdate= '2003-02-12' #@param{type: \"date\"}", "The prelim grade of student 1 is 90\nThe midterm grade of student 1 is 95\nThe final grade of student 1 is 100\nThe semestral grade of student1 is 95.0\nFemale\n" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code" ] ]
d025478c23538317bcb04a1349f710b067c26690
522,064
ipynb
Jupyter Notebook
Model Comparison/Model_comparison.ipynb
asmolina/ML-project-fairness-aware-classification
0c8d9d6e3d320a10b685388710fb680b0aaf3cae
[ "MIT" ]
null
null
null
Model Comparison/Model_comparison.ipynb
asmolina/ML-project-fairness-aware-classification
0c8d9d6e3d320a10b685388710fb680b0aaf3cae
[ "MIT" ]
null
null
null
Model Comparison/Model_comparison.ipynb
asmolina/ML-project-fairness-aware-classification
0c8d9d6e3d320a10b685388710fb680b0aaf3cae
[ "MIT" ]
1
2021-03-23T20:14:52.000Z
2021-03-23T20:14:52.000Z
73.758689
28,650
0.228476
[ [ [ "This notebook contains code for model comparison. Optimal hyperparameters for models are supposed to be already found.", "_____no_output_____" ], [ "# Imports", "_____no_output_____" ] ], [ [ "#imports\n!pip install scipydirect\n\nimport math\nimport numpy as np\nimport pandas as pd\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.model_selection import cross_val_score\n\nfrom sklearn import preprocessing\nfrom sklearn.preprocessing import normalize\n\nfrom sklearn.ensemble import AdaBoostClassifier\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.neighbors import NearestNeighbors\nfrom sklearn.tree import DecisionTreeClassifier\n\nfrom sklearn.metrics import precision_recall_curve\nfrom sklearn.metrics import f1_score\nfrom sklearn.metrics import accuracy_score\nfrom sklearn.metrics import average_precision_score\nfrom sklearn.metrics import confusion_matrix\nfrom sklearn.metrics import balanced_accuracy_score\n\nimport collections\nfrom collections import Counter\nfrom imblearn.over_sampling import SMOTE\n\n\n%matplotlib inline\nimport matplotlib\nimport matplotlib.pyplot as plt\nfrom matplotlib.colors import ListedColormap\n\nfrom IPython import display\n\nfrom scipydirect import minimize # for DIvided RECTangles (DIRECT) method\nimport time", "Collecting scipydirect\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/c2/dd/657e6c53838b3ff50e50bda4e905c8ec7e4b715f966f33d0566088391d75/scipydirect-1.3.tar.gz (49kB)\n\r\u001b[K |██████▋ | 10kB 15.9MB/s eta 0:00:01\r\u001b[K |█████████████▏ | 20kB 21.2MB/s eta 0:00:01\r\u001b[K |███████████████████▊ | 30kB 24.8MB/s eta 0:00:01\r\u001b[K |██████████████████████████▍ | 40kB 24.2MB/s eta 0:00:01\r\u001b[K |████████████████████████████████| 51kB 4.7MB/s \n\u001b[?25hRequirement already satisfied: numpy in /usr/local/lib/python3.7/dist-packages (from scipydirect) (1.19.5)\nBuilding wheels for collected packages: scipydirect\n Building wheel for scipydirect (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n Created wheel for scipydirect: filename=scipydirect-1.3-cp37-cp37m-linux_x86_64.whl size=119893 sha256=201aad9d5036ecc1328b332fbd9c6adc620fa9b0d03010fc0ccea07c5bea8a47\n Stored in directory: /root/.cache/pip/wheels/b7/de/2b/c5550296e6dac87737c622568c6a10bc87ae76e98b52ff210e\nSuccessfully built scipydirect\nInstalling collected packages: scipydirect\nSuccessfully installed scipydirect-1.3\n" ] ], [ [ "# 1) Describe classes of the following models: AdaFair, SMOTEBoost, ASR", "_____no_output_____" ], [ "#### 1.1) AdaFair", "_____no_output_____" ] ], [ [ "#AdaFair\nclass AdaFairClassifier(AdaBoostClassifier):\n def __init__(self,\n base_estimator=None, *,\n n_estimators=50,\n learning_rate=1,\n algorithm='SAMME',\n random_state=42,\n protected=None,\n epsilon = 0):\n\n super().__init__(\n base_estimator=base_estimator,\n n_estimators=n_estimators,\n algorithm = algorithm,\n learning_rate=learning_rate,\n random_state=random_state)\n\n self.protected = np.array(protected)\n self.algorithm = algorithm\n self.epsilon = epsilon\n \n def _boost_discrete(self, iboost, X, y, sample_weight, random_state):\n \"\"\"Implement a single boost using the SAMME discrete algorithm.\"\"\"\n estimator = self._make_estimator(random_state=random_state)\n\n estimator.fit(X, y, sample_weight=sample_weight)\n\n y_predict = estimator.predict(X)\n\n if iboost == 0:\n self.classes_ = getattr(estimator, 'classes_', None)\n self.n_classes_ = len(self.classes_)\n\n # Instances incorrectly classified\n incorrect = y_predict != y\n\n # Error fraction\n estimator_error = np.mean(\n np.average(incorrect, weights=sample_weight, axis=0))\n\n # Stop if classification is perfect\n if estimator_error <= 0:\n return sample_weight, 1., 0.\n\n n_classes = self.n_classes_\n\n # Stop if the error is at least as bad as random guessing\n if estimator_error >= 1. - (1. / n_classes):\n self.estimators_.pop(-1)\n if len(self.estimators_) == 0:\n print (\"BaseClassifier in AdaBoostClassifier ensemble is worse than random, ensemble can not be fit.\")\n raise ValueError('BaseClassifier in AdaBoostClassifier '\n 'ensemble is worse than random, ensemble '\n 'can not be fit.')\n return None, None, None\n\n if len(self.protected) != len(y):\n print (\"Error: not given or given incorrect list of protected objects\")\n return None, None, None\n\n #Compute fairness-related costs\n #CUMULATIVE prediction \n \n y_cumulative_pred = list(self.staged_predict(X))[0]\n\n u = np.array(self.get_fairness_related_costs(y, y_cumulative_pred, self.protected))\n\n # Boost weight using multi-class AdaBoost SAMME alg\n estimator_weight = self.learning_rate * (\n np.log((1. 
- estimator_error) / estimator_error) +\n np.log(n_classes - 1.))\n\n # Only boost the weights if I will fit again\n if not iboost == self.n_estimators - 1:\n # Only boost positive weights\n\n sample_weight = sample_weight * np.exp(estimator_weight * incorrect * (sample_weight > 0)) * (np.ones(len(u)) + u)\n\n return sample_weight, estimator_weight, estimator_error\n\n def get_fairness_related_costs(self, y, y_pred_t, protected):\n\n y_true_protected, y_pred_protected, y_true_non_protected, y_pred_non_protected = separate_protected_from_non_protected(y, y_pred_t, protected)\n #Rates for non protected group\n tp, tn, fp, fn = tp_tn_fp_fn(y_true_non_protected, y_pred_non_protected)\n FPR_non_protected = fp / (fp + tn)\n FNR_non_protected = fn / (fn + tp)\n #Rates for protected group\n tp, tn, fp, fn = tp_tn_fp_fn(y_true_protected, y_pred_protected)\n FPR_protected = fp / (fp + tn)\n FNR_protected = fn / (fn + tp)\n \n delta_FPR = - FPR_non_protected + FPR_protected \n delta_FNR = - FNR_non_protected + FNR_protected\n\n self.delta_FPR = delta_FPR\n self.delta_FNR = delta_FNR\n #Compute fairness related costs\n u = []\n for y_i, y_pred_t__i, protection in zip(y, y_pred_t, protected):\n if y_i == 1 and y_pred_t__i == 0 and abs(delta_FNR) > self.epsilon and protection == 1 and delta_FNR > 0:\n u.append(abs(delta_FNR))\n elif y_i == 1 and y_pred_t__i == 0 and abs(delta_FNR) > self.epsilon and protection == 0 and delta_FNR < 0:\n u.append(abs(delta_FNR))\n elif y_i == 0 and y_pred_t__i == 1 and abs(delta_FPR) > self.epsilon and protection == 1 and delta_FPR > 0:\n u.append(abs(delta_FPR))\n elif y_i == 0 and y_pred_t__i == 1 and abs(delta_FPR) > self.epsilon and protection == 0 and delta_FPR < 0:\n u.append(abs(delta_FPR))\n else: u.append(0)\n return u", "_____no_output_____" ] ], [ [ "#### 1.2 SMOTEBoost", "_____no_output_____" ] ], [ [ "#SMOTEBoost\n\nrandom_state = 42\n\n#4, y - true label\ndef ada_boost_eps(y, y_pred_t, distribution):\n eps = np.sum((1 - (y == y_pred_t) + (np.logical_not(y) == y_pred_t)) * distribution)\n return eps\n\n#5\ndef ada_boost_betta(eps):\n betta = eps/(1 - eps)\n return betta\n\ndef ada_boost_w(y, y_pred_t):\n w = 0.5 * (1 + (y == y_pred_t) - (np.logical_not(y) == y_pred_t))\n return w\n\n#6\ndef ada_boost_distribution(distribution, betta, w):\n distribution = distribution * betta ** w / np.sum(distribution)\n return distribution\n\ndef min_target(y):\n minority_target = min(Counter(y), key=Counter(y).get)\n return minority_target\n\nclass SMOTEBoost():\n \n def __init__(self,\n n_samples = 100,\n k_neighbors = 5, \n n_estimators = 50, #n_estimators = T\n base_classifier = None,\n random_state = 42,\n get_eps = ada_boost_eps,\n get_betta = ada_boost_betta,\n get_w = ada_boost_w,\n update_distribution=ada_boost_distribution):\n self.n_samples = n_samples\n self.k_neighbors = k_neighbors\n self.n_estimators = n_estimators\n self.base_classifier = base_classifier\n self.random_state = random_state\n self.get_eps = get_eps\n self.get_betta = get_betta\n self.get_w = get_w\n self.update_distribution = update_distribution\n \n def fit(self, X, y):\n X = np.array(X)\n distribution = np.ones(X.shape[0], dtype=float) / X.shape[0]\n self.classifiers = []\n self.betta = []\n y = np.array(y)\n for i in range(self.n_estimators):\n minority_class = min_target(y)\n X_min = X[np.where(y == minority_class)]\n\n # create a new classifier\n self.classifiers.append(self.base_classifier())\n\n # SMOTE\n self.smote = SMOTE(n_samples=self.n_samples,\n k_neighbors=self.k_neighbors,\n 
 ] ], [ [ "#### 1.2 SMOTEBoost", "_____no_output_____" ] ], [ [ "#SMOTEBoost\n\nrandom_state = 42\n\n# (4) pseudo-loss of the current hypothesis (y is the true label)\ndef ada_boost_eps(y, y_pred_t, distribution):\n    eps = np.sum((1 - (y == y_pred_t) + (np.logical_not(y) == y_pred_t)) * distribution)\n    return eps\n\n# (5) weight-update parameter betta_t = eps_t / (1 - eps_t)\ndef ada_boost_betta(eps):\n    betta = eps/(1 - eps)\n    return betta\n\ndef ada_boost_w(y, y_pred_t):\n    w = 0.5 * (1 + (y == y_pred_t) - (np.logical_not(y) == y_pred_t))\n    return w\n\n# (6) distribution update and normalization\ndef ada_boost_distribution(distribution, betta, w):\n    distribution = distribution * betta ** w / np.sum(distribution)\n    return distribution\n\ndef min_target(y):\n    minority_target = min(Counter(y), key=Counter(y).get)\n    return minority_target\n\nclass SMOTEBoost():\n\n    def __init__(self,\n                 n_samples = 100,\n                 k_neighbors = 5,\n                 n_estimators = 50,  # n_estimators = T, the number of boosting rounds\n                 base_classifier = None,\n                 random_state = 42,\n                 get_eps = ada_boost_eps,\n                 get_betta = ada_boost_betta,\n                 get_w = ada_boost_w,\n                 update_distribution=ada_boost_distribution):\n        self.n_samples = n_samples\n        self.k_neighbors = k_neighbors\n        self.n_estimators = n_estimators\n        self.base_classifier = base_classifier\n        self.random_state = random_state\n        self.get_eps = get_eps\n        self.get_betta = get_betta\n        self.get_w = get_w\n        self.update_distribution = update_distribution\n\n    def fit(self, X, y):\n        X = np.array(X)\n        distribution = np.ones(X.shape[0], dtype=float) / X.shape[0]\n        self.classifiers = []\n        self.betta = []\n        y = np.array(y)\n        for i in range(self.n_estimators):\n            minority_class = min_target(y)\n            X_min = X[np.where(y == minority_class)]\n\n            # create a new classifier\n            self.classifiers.append(self.base_classifier())\n\n            # SMOTE: oversample the minority class of the current round\n            self.smote = SMOTE(n_samples=self.n_samples,\n                               k_neighbors=self.k_neighbors,\n                               random_state=self.random_state)\n            self.smote.fit(X_min)\n            X_syn = self.smote.sample()\n            y_syn = np.full(X_syn.shape[0], fill_value=minority_class, dtype=np.int64)\n\n            # Modify distribution: synthetic instances get weight 1/N, then re-normalize\n            distribution_syn = np.empty(X_syn.shape[0], dtype=np.float64)\n            distribution_syn[:] = 1. / X.shape[0]\n            mod_distribution = np.append(distribution, distribution_syn).reshape(1, -1)\n            mod_distribution = np.squeeze(normalize(mod_distribution, axis=1, norm='l1'))\n            # Concatenate original and synthetic datasets for training a weak learner\n            mod_X = np.vstack((X, X_syn))\n            mod_y = np.append(y, y_syn)\n\n            # Train a weak learner\n            self.classifiers[-1].fit(mod_X, mod_y, sample_weight=mod_distribution)\n\n            # Make a prediction for the original dataset\n            y_pred_t = self.classifiers[-1].predict(X)\n\n            # Compute the pseudo-loss of the hypothesis (via the hooks stored in __init__)\n            eps_t = self.get_eps(y, y_pred_t, distribution)\n            betta_t = self.get_betta(eps_t)\n            w_t = self.get_w(y, y_pred_t)\n            self.betta.append(betta_t)\n\n            # Update distribution and normalize\n            distribution = self.update_distribution(distribution, betta_t, w_t)\n\n    def predict(self, X):\n        final_predictions_0 = np.zeros(X.shape[0])\n        final_predictions_1 = np.zeros(X.shape[0])\n        y_pred = np.empty(X.shape[0])\n        # get the weighted votes of the classifiers\n        for i in range(len(self.betta)):\n            h_i = self.classifiers[i].predict(X)\n            final_predictions_0 = final_predictions_0 + math.log(1/self.betta[i])*(h_i == 0)\n            final_predictions_1 = final_predictions_1 + math.log(1/self.betta[i])*(h_i == 1)\n        for i in range(len(final_predictions_0)):\n            if final_predictions_0[i] > final_predictions_1[i]:\n                y_pred[i] = 0\n            else:\n                y_pred[i] = 1\n        return y_pred\n\n\nclass SMOTE():\n\n    def __init__(self, n_samples, k_neighbors=5, random_state=None):\n        self.n_samples = n_samples\n        self.k = k_neighbors\n        self.random_state = random_state\n\n    def fit(self, X):\n        self.X = X\n        self.n_features = self.X.shape[1]\n        # k + 1 because the query point itself is returned first and dropped in sample()\n        self.neigh = NearestNeighbors(n_neighbors=self.k + 1)\n        self.neigh.fit(self.X)\n        return self\n\n    def sample(self):\n        np.random.seed(seed=self.random_state)\n        S = np.zeros(shape=(self.n_samples, self.n_features))\n\n        for i in range(self.n_samples):\n            j = np.random.randint(0, self.X.shape[0])\n            nn = self.neigh.kneighbors(self.X[j].reshape(1, -1),\n                                       return_distance=False)[:, 1:]\n            nn_index = np.random.choice(nn[0])\n            # print(self.X[nn_index], self.X[j])  # debug print; when enabled it floods the cell output (see the truncated dump in section 3)\n            dif = self.X[nn_index] - self.X[j]\n            gap = np.random.random()\n            S[i, :] = self.X[j, :] + gap * dif[:]\n        return S", "_____no_output_____"
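 ] ], [ [ "# Quick sanity check of the SMOTE interpolation step (added for illustration; the\n# toy matrix and parameter values below are arbitrary assumptions, not part of the\n# original experiments). Each synthetic point S[i] = x_j + gap * (x_nn - x_j) lies on\n# the segment between a random minority point x_j and one of its nearest neighbours,\n# so with this toy grid every synthetic sample must stay inside the unit square.\ntoy_minority = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])\ntoy_samples = SMOTE(n_samples=3, k_neighbors=2, random_state=0).fit(toy_minority).sample()\nassert ((toy_samples >= 0) & (toy_samples <= 1)).all()\ntoy_samples", "_____no_output_____"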
 ] ], [ [ "#### 1.3 Adaptive sensitive reweighting", "_____no_output_____" ] ], [ [ "#Adaptive sensitive reweighting\nclass ReweightedClassifier:\n    def __init__(self, baze_clf, alpha, beta, params = {}):\n        \"\"\"\n        Input:\n        baze_clf - object from sklearn with methods .fit(sample_weight=), .predict(), .predict_proba()\n        alpha - list of alphas for sensitive and non-sensitive objects [alpha, alpha']\n        beta - list of betas for sensitive and non-sensitive objects [beta, beta']\n        params - **kwargs compatible with baze_clf\n        \"\"\"\n        self.baze_clf = baze_clf\n        self.model = None\n        self.alpha = np.array(alpha)\n        self.alphas = None\n        self.beta = np.array(beta)\n        self.betas = None\n        self.weights = None\n        self.prev_weights = None\n        self.params = params\n\n    def reweight_dataset(self, length, error, minority_idx):\n        \"\"\"\n        This function recalculates the weights and saves their previous values\n        \"\"\"\n        if self.alphas is None or self.betas is None:\n            # If alpha_0, alpha_1 or beta_0, beta_1 are not defined yet,\n            # then assign alpha_0 and beta_0 to every object from the non-sensitive class,\n            # and alpha_1 and beta_1 to every object from the sensitive class (minority).\n            self.alphas = np.ones(length) * self.alpha[0]\n            self.betas = np.ones(length) * self.beta[0]\n            self.alphas[minority_idx] = self.alpha[1]\n            self.betas[minority_idx] = self.beta[1]\n\n        # w_i_prev <- w_i for all i in dataset\n        self.prev_weights = self.weights.copy()\n\n        # w_i = alpha_i * L_{beta_i} (P'(y_pred_i != y_true_i))\n        #       + (1 - alpha_i) * L_{beta_i} (-P'(y_pred_i != y_true_i)),\n        # where\n        # L_{beta_i} (x) = exp(beta_i * x)\n        self.weights = (self.alphas * np.exp(self.betas * error)\n                        + (1 - self.alphas) * np.exp(- self.betas * error))\n\n    def pRule(self, prediction, minority_idx):\n        \"\"\"\n        This function calculates\n                     | P(y_pred_i = 1 | i in S)      P(y_pred_i = 1 | i not in S) |\n        pRule = min {  ----------------------------, ---------------------------- }\n                     | P(y_pred_i = 1 | i not in S)  P(y_pred_i = 1 | i in S)     |\n\n        S - the group of sensitive objects\n        ---------\n        Input:\n        prediction - labels ({0,1}) of a sample for which pRule is calculated\n        minority_idx - indexes of objects from the sensitive group\n        \"\"\"\n        # majority indexes = set of all indexes minus the set of minority indexes,\n        # where the set of all indexes = all numbers from 0 to the sample size (= len(prediction))\n        majority_idx = set(np.linspace(0, len(prediction) - 1, len(prediction), dtype = int)).difference(minority_idx)\n\n        # minority = P(y_pred_i = 1 | i in minority)\n        # majority = P(y_pred_i = 1 | i in majority)\n        minority = prediction[minority_idx].mean()\n        majority = prediction[list(majority_idx)].mean()\n\n        minority = np.clip(minority, 1e-10, 1 - 1e-10)\n        majority = np.clip(majority, 1e-10, 1 - 1e-10)\n\n        return min(minority/majority, majority/minority)
\n\n    def fit(self, X_train, y_train, X_test, y_test, minority_idx, verbose=True, max_iter=30):\n        # Initialize equal weights w_i = 1\n        self.weights = np.ones(len(y_train))\n        self.prev_weights = np.zeros(len(y_train))\n\n        # Lists for saving metrics\n        accuracys = []\n        pRules = []\n        differences = []\n        accuracy_plus_prule = []\n\n        # Adaptive Sensitive Reweighting\n        iteration = 0\n\n        while ((self.prev_weights - self.weights) ** 2).mean() > 10**(-6) and iteration < max_iter:\n\n            iteration += 1\n            # Train classifier on X_train with weights w_i\n            self.model = self.baze_clf(**self.params)\n            self.model.fit(X_train, y_train, sample_weight = self.weights)\n\n            # Use the classifier to obtain P'(y_pred_i != y_true_i) (this quantity is called 'error' below)\n            prediction_proba = self.model.predict_proba(X_train)[:, 1]\n            error = (y_train == 1) * (1 - prediction_proba) + (y_train == 0) * prediction_proba\n\n            # Update weights\n            self.reweight_dataset(len(y_train), error, minority_idx)\n\n            # Get metrics on X_train\n            prediction = self.model.predict(X_train)\n            accuracys.append(accuracy_score(prediction, y_train))\n            pRules.append(self.pRule(prediction, minority_idx))\n            accuracy_plus_prule.append(accuracys[-1] + pRules[-1])\n            differences.append(((self.prev_weights - self.weights)**2).mean()**0.5)\n\n            # Visualize metrics if needed\n            if verbose:\n                display.clear_output(True)\n                fig, axes = plt.subplots(ncols=2, nrows=2, figsize=(16, 7))\n\n                metrics = [accuracys, pRules, accuracy_plus_prule, differences]\n                metrics_names = [\"Accuracy score\", \"pRule\", \"Accuracy + pRule\", \"Mean of weight edits\"]\n                for name, metric, ax in zip(metrics_names, metrics, axes.flat):\n                    ax.plot(metric, label='train')\n                    ax.set_title(f'{name}, iteration {iteration}')\n                    ax.legend()\n                    if name == \"Mean of weight edits\":\n                        ax.set_yscale('log')\n                plt.show()\n\n        return accuracys, pRules, accuracy_plus_prule\n\n    def predict(self, X):\n        return self.model.predict(X)\n\n    def predict_proba(self, X):\n        return self.model.predict_proba(X)\n\n    def get_metrics_test(self, X_test, y_test, minority_idx_test):\n        \"\"\"\n        Obtain pRule and accuracy for the trained model\n        \"\"\"\n        # Obtain predictions on X_test to calculate metrics\n        prediction_test = self.model.predict(X_test)\n\n        # Get metrics on test\n        accuracy_test = accuracy_score(prediction_test, y_test)\n        pRule_test = self.pRule(prediction_test, minority_idx_test)\n\n        return accuracy_test, pRule_test\n\ndef prep_train_model(X_train, y_train, X_test, y_test, minority_idx):\n\n    def train_model(a):\n        \"\"\"\n        Function of 4 variables (a[0], a[1], a[2], a[3]) that will be minimized by the DIRECT method.\n        a[0], a[1] = alpha, alpha'\n        a[2], a[3] = beta, beta'\n        \"\"\"\n        model = ReweightedClassifier(LogisticRegression, [a[0], a[1]], [a[2], a[3]], params = {\"max_iter\": 4000})\n        _, _, accuracy_plus_prule = model.fit(X_train, y_train, X_test, y_test, minority_idx)\n\n        # We maximize [acc + pRule] as obtained at the last iteration of Adaptive Sensitive Reweighting\n        return - accuracy_plus_prule[-1]\n\n    return train_model  # return function for optimization", "_____no_output_____"
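 ] ], [ [ "# How prep_train_model is meant to be used (illustrative sketch, left commented out\n# because the search is expensive; the bounds and evaluation budget below are\n# assumptions, not values from the original runs). The 4-variable objective returned\n# by prep_train_model is passed to the DIRECT optimizer from the scipydirect package\n# installed at the top of the notebook; the hard-coded a_1 vectors in section 2 are\n# the kind of optima this search produces.\n#\n# from scipydirect import minimize\n# train_model = prep_train_model(X_train, y_train, X_test, y_test, minority_idx)\n# res = minimize(train_model,\n#                bounds=[(0, 1), (0, 1), (0, 3), (0, 3)],  # alpha, alpha', beta, beta'\n#                maxf=200)  # budget of objective evaluations\n# a_1 = res.x", "_____no_output_____"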
 ] ], [ [ "#### 1.4 Some functions used for fitting models, calculating metrics, and data separation", "_____no_output_____" ] ], [ [ "# Returns a binary list: 1 if the corresponding instance belongs to the protected group, 0 otherwise\ndef get_protected_instances(X, feature, label):\n    protected = []\n    for i in range(len(X)):\n        if X.iloc[i][feature] == label:\n            protected.append(1)\n        else:\n            protected.append(0)\n    return protected\n\n# To calculate TPR and TNR for the protected and non-protected groups, first separate them\ndef separate_protected_from_non_protected(y_true, y_pred, protected):\n    y_true_protected = []\n    y_pred_protected = []\n    y_true_non_protected = []\n    y_pred_non_protected = []\n    for true_label, pred_label, is_protected in zip(y_true, y_pred, protected):\n        if is_protected == 1:\n            y_true_protected.append(true_label)\n            y_pred_protected.append(pred_label)\n        elif is_protected == 0:\n            y_true_non_protected.append(true_label)\n            y_pred_non_protected.append(pred_label)\n        else:\n            print(\"Error: invalid value in protected array \", is_protected)\n            return 0, 0, 0, 0\n    return (np.array(y_true_protected), np.array(y_pred_protected),\n            np.array(y_true_non_protected), np.array(y_pred_non_protected))\n\ndef tp_tn_fp_fn(y_true, y_pred):\n    matrix = confusion_matrix(y_true, y_pred)\n    tp = matrix[1][1]\n    tn = matrix[0][0]\n    fp = matrix[0][1]\n    fn = matrix[1][0]\n    return (tp, tn, fp, fn)\n\n# Same pRule as in ASR, but used for computing this metric on other classifiers' predictions\ndef pRule(prediction, minority_idx):\n    \"\"\"\n    This function calculates\n                 | P(y_pred_i = 1 | i in S)      P(y_pred_i = 1 | i not in S) |\n    pRule = min {  ----------------------------, ---------------------------- }\n                 | P(y_pred_i = 1 | i not in S)  P(y_pred_i = 1 | i in S)     |\n\n    S - the group of sensitive objects\n    ---------\n    Input:\n    prediction - labels ({0,1}) of a sample for which pRule is calculated\n    minority_idx - indexes of objects from the sensitive group\n    \"\"\"\n    # majority indexes = set of all indexes minus the set of minority indexes\n    majority_idx = set(np.linspace(0, len(prediction) - 1, len(prediction), dtype = int)).difference(minority_idx)\n\n    # minority = P(y_pred_i = 1 | i in minority)\n    # majority = P(y_pred_i = 1 | i in majority)\n    minority = prediction[minority_idx].mean()\n    majority = prediction[list(majority_idx)].mean()\n\n    minority = np.clip(minority, 1e-10, 1 - 1e-10)\n    majority = np.clip(majority, 1e-10, 1 - 1e-10)\n\n    return min(minority/majority, majority/minority)", "_____no_output_____"
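 ] ], [ [ "# Tiny worked example of pRule (added for illustration; the labels are made up).\n# Instances 0 and 1 form the protected group: it receives 1/2 positive predictions,\n# the non-protected group receives 2/2, so pRule = 0.5 / 1.0 = 0.5 (up to clipping).\n_y_pred_demo = np.array([1, 0, 1, 1])\n_minority_demo = [0, 1]\nprint(pRule(_y_pred_demo, _minority_demo))  # expected: approximately 0.5", "_____no_output_____"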
pd.read_csv(\"splits/y_train_preprocessed_bank.csv\")['y']\ny_test = pd.read_csv(\"splits/y_test_preprocessed_bank.csv\")['y']\nreweight_prediction = \"/content/predictions/y_pred_test_bank.csv\"\n\n#X_test, X_train = preprocess_adult_census(X_test.drop('old_id', axis = 1)), preprocess_adult_census(X_train.drop('old_id', axis = 1))\n#y_test, y_train = adult_label_transform(y_test)['income'], adult_label_transform(y_train)['income']\n#Obtain protected group (used in AdaFair)\nprotected_test = get_protected_instances(X_test, 'age', 1)\nprotected_train = get_protected_instances(X_train, 'age', 1)\n\n# Obtain indexes of sensitive class (for regression algorithm)\nminority_idx = X_train.reset_index(drop=True).index.values[X_train[\"age\"] == 1]\nminority_idx_test = X_test.reset_index(drop=True).index.values[X_test[\"age\"] == 1]\n\n#best hyperparameters for AdaFair\nadafair_max_depth = 2\nadafair_n_estimators = 9\n\n#result of ASR optimizing\na_1 = [0.87037037, 0.01851852, 2.72222222, 1.57407407] \n#print(len(X_train),len(y_train))\n#X_train.head()", "_____no_output_____" ] ], [ [ "Compass", "_____no_output_____" ] ], [ [ "X_train = pd.read_csv(\"splits/X_train_preprocessed_compas.csv\").drop(\"Unnamed: 0\", axis = 1)#, names = adult_census_names).iloc[1:]\nX_test = pd.read_csv(\"splits/X_test_preprocessed_compas.csv\").drop(\"Unnamed: 0\", axis = 1)#, names = adult_census_names).iloc[1:]\n\ny_train = pd.read_csv(\"splits/y_train_preprocessed_compas.csv\")['two_year_recid']\ny_test = pd.read_csv(\"splits/y_test_preprocessed_compas.csv\")['two_year_recid']\nreweight_prediction = \"/content/predictions/y_pred_test_compas.csv\"\n\n#X_test, X_train = preprocess_adult_census(X_test.drop('old_id', axis = 1)), preprocess_adult_census(X_train.drop('old_id', axis = 1))\n#y_test, y_train = adult_label_transform(y_test)['income'], adult_label_transform(y_train)['income']\n#Obtain protected group (used in AdaFair)\nprotected_test = get_protected_instances(X_test, 'race', 0)\nprotected_train = get_protected_instances(X_train, 'race', 0)\n\n# Obtain indexes of sensitive class (for regression algorithm)\nminority_idx = X_train.reset_index(drop=True).index.values[X_train[\"race\"] == 0]\nminority_idx_test = X_test.reset_index(drop=True).index.values[X_test[\"race\"] == 0]\n\n#best hyperparameters for AdaFair\nadafair_max_depth = 4\nadafair_n_estimators = 5\n\n#result of ASR optimizing\na_1 = [0.0308642, 0.72222222, 0.5, 0.45061728]\n#print(len(X_train),len(y_train))\n#y_train.head()", "_____no_output_____" ] ], [ [ "KDD Census", "_____no_output_____" ] ], [ [ "#adult_census_names = ['old_id' ,'age','workclass','fnlwgt','education','education_num','marital_status','occupation','relationship','race','sex','capital_gain','capital_loss','hours_per_week','native_country']\nX_train = pd.read_csv(\"splits/X_train_preprocessed_kdd.csv\").drop(\"Unnamed: 0\", axis = 1)#, names = adult_census_names).iloc[1:]\nX_test = pd.read_csv(\"splits/X_test_preprocessed_kdd.csv\").drop(\"Unnamed: 0\", axis = 1)#, names = adult_census_names).iloc[1:]\n\ny_train = pd.read_csv(\"splits/y_train_preprocessed_kdd.csv\")['income_50k']\ny_test = pd.read_csv(\"splits/y_test_preprocessed_kdd.csv\")['income_50k']\nreweight_prediction = \"/content/predictions/y_pred_test_kdd.csv\"\n\n#X_test, X_train = preprocess_adult_census(X_test.drop('old_id', axis = 1)), preprocess_adult_census(X_train.drop('old_id', axis = 1))\n#y_test, y_train = adult_label_transform(y_test)['income'], adult_label_transform(y_train)['income']\n#Obtain protected group 
 ] ], [ [ "X_train = pd.read_csv(\"splits/X_train_preprocessed_kdd.csv\").drop(\"Unnamed: 0\", axis = 1)\nX_test = pd.read_csv(\"splits/X_test_preprocessed_kdd.csv\").drop(\"Unnamed: 0\", axis = 1)\n\ny_train = pd.read_csv(\"splits/y_train_preprocessed_kdd.csv\")['income_50k']\ny_test = pd.read_csv(\"splits/y_test_preprocessed_kdd.csv\")['income_50k']\nreweight_prediction = \"/content/predictions/y_pred_test_kdd.csv\"\n\n# Obtain the protected group (used in AdaFair)\nprotected_test = get_protected_instances(X_test, 'sex', 1)\nprotected_train = get_protected_instances(X_train, 'sex', 1)\n\n# Obtain indexes of the sensitive class (for the regression algorithm)\nminority_idx = X_train.reset_index(drop=True).index.values[X_train[\"sex\"] == 1]\nminority_idx_test = X_test.reset_index(drop=True).index.values[X_test[\"sex\"] == 1]\n\n# best hyperparameters for AdaFair\nadafair_max_depth = 5\nadafair_n_estimators = 11\n\n# result of optimizing ASR's (alpha, alpha', beta, beta')\na_1 = [0.01851852, 0.99382716, 1.16666667, 2.94444444]\nprint(len(X_train), len(y_train))\n#X_train.head()", "98147 98147\n" ] ], [ [ "# 3) Create models, train classifiers", "_____no_output_____" ] ], [ [ "#Regression\n# Create the ASR+CULEP model with the hyperparameters (alpha, alpha', beta, beta') found for this dataset\nmodel_reweighted_classifier = ReweightedClassifier(LogisticRegression, [a_1[0], a_1[1]], [a_1[2], a_1[3]], params = {\"max_iter\": 4})  # max_iter=4 keeps each refit cheap but triggers the ConvergenceWarning below\n\n# Train model on X_train\nmodel_reweighted_classifier.fit(X_train, y_train, X_test, y_test, minority_idx, verbose=False)\n\n# Calculate metrics (pRule, accuracy) on X_test\naccuracy_test, pRule_test = model_reweighted_classifier.get_metrics_test(X_test, y_test, minority_idx_test)\n\n#print('ASR+CULEP for X_test')\n#print(f\"prule = {pRule_test:.6}, accuracy = {accuracy_test:.6}\")\n#print(f\"prule + accuracy = {(pRule_test + accuracy_test):.6}\")", "/usr/local/lib/python3.7/dist-packages/sklearn/linear_model/_logistic.py:940: ConvergenceWarning: lbfgs failed to converge (status=1):\nSTOP: TOTAL NO. of ITERATIONS REACHED LIMIT.\n\nIncrease the number of iterations (max_iter) or scale the data as shown in:\n    https://scikit-learn.org/stable/modules/preprocessing.html\nPlease also refer to the documentation for alternative solver options:\n    https://scikit-learn.org/stable/modules/linear_model.html#logistic-regression\n  extra_warning_msg=_LOGISTIC_SOLVER_CONVERGENCE_MSG)\n[the same ConvergenceWarning was emitted once per ASR refit; the remaining repetitions are omitted]\n
" ] ], [ [ "#SMOTEBOOST\nmax_depth = 2\nn_samples = 100\nk_neighbors = 5\nn_estimators = 5  # T, the number of boosting rounds\nrandom_state = 42\n\nget_base_clf = lambda: DecisionTreeClassifier(max_depth = max_depth)\nsmoteboost1 = SMOTEBoost(n_samples = n_samples,\n                         k_neighbors = k_neighbors,\n                         n_estimators = n_estimators,\n                         base_classifier = get_base_clf,\n                         random_state = random_state)\nsmoteboost1.fit(X_train, y_train)\n#smote_boost = smoteboost1.predict(X_test)", "Output was truncated to the last few lines (5000).\n[most of the repeated array dump produced by the debug print in SMOTE.sample is omitted]\n
0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[0.83024454 1.60223677 3.13303155 0. 0. 0.\n 0. 1.22688566 3.01794989 5.6084 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [1.37419785 1.85859465 2.87194559 0. 0. 0.\n 0. 1.20763446 3.01794989 5.6084 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 1.91815117 0.25635788 0.17405731 0. 45.37759597 0.\n 0. 0.84879683 2.0119666 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 
0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 1.88952205 0.76907365 0.34811462 0. 45.37759597 0.\n 0. 0.6755532 3.01794989 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 1.28831049 2.75584724 0.78325789 1. 0. 0.\n 0. 0.74453286 3.01794989 11.2168 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 1.68911819 3.07629459 0.87028654 1. 0. 0.\n 0. 0.67416501 3.01794989 11.2168 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 
0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[1.173794 0.25635788 2.95897425 0. 0. 0.\n 0.49820054 1.17411694 0.50299165 0. 1.30023581 2.0803359\n 1.00529816 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [1.14516488 0.25635788 2.95897425 0. 0. 0.\n 0. 1.3862301 0.50299165 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 
0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 1.34556873 2.11495253 0.69622923 0. 0. 0.\n 11.95681284 0.51027163 3.01794989 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 1.66048907 2.88402618 0.52217193 0. 0. 0.\n 12.45501338 0.16380159 3.01794989 0. 0.65011791 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 0. 0. ]\n[1.11653576 0.25635788 0.17405731 0. 0. 0.\n 0. 1.74132762 0.50299165 0. 1.30023581 2.12279174\n 0.99471607 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 
0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [1.20242312 0.25635788 0.17405731 0. 0. 0.\n 0. 0.44438961 0.50299165 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[0.97339015 2.6276683 0.69622923 1. 0. 0.\n 0. 0.55421784 2.51495825 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [0.97339015 2.6276683 0.60920058 1. 0. 0.\n 0. 0.6991583 3.01794989 0. 1.30023581 2.03788007\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 
0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[1.63185995 2.75584724 0.78325789 0. 0. 0.\n 0.24910027 1.31320058 3.01794989 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [1.54597258 2.75584724 0.17405731 0. 0. 0.\n 0. 1.86499768 3.01794989 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 
0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 1.14516488 0.70498418 0.17405731 0. 34.00094768 0.\n 0.57293062 0.57411152 2.0119666 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 1.03064839 0.25635788 0.17405731 0. 34.00094768 0.\n 0.49820054 0.65458109 1.50897495 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[1.57460171 0.25635788 2.95897425 0. 0. 0.\n 4.98200535 0.77034645 3.01794989 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 
0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [1.48871434 0.57680524 3.0460029 0. 0. 0.\n 4.98200535 0.43494757 3.01794989 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. ]\n[ 1.25968137 2.37131042 0.17405731 0. 0. 63.67395845\n 7.47300803 0.67673489 2.0119666 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 1.46008522 2.69175777 0.60920058 0. 0. 63.67395845\n 9.9640107 0.23941821 0.50299165 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 
0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 1.23105224 2.6276683 0.69622923 1. 0. 0.\n 0. 1.02482833 3.01794989 11.2168 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 1.25968137 0.76907365 0.34811462 0. 0. 0.\n 0. 1.09903954 3.01794989 11.2168 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 
0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 1.00201927 0.19226841 0.17405731 0. 34.00094768 0.\n 2.49100268 1.43771962 3.01794989 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 1.51734346 0.25635788 0.17405731 1. 34.00094768 0.\n 2.49100268 1.75781391 2.51495825 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 0.83024454 2.11495253 1.39245847 0. 62.97779366 0.\n 0. 1.49897814 0.50299165 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 1. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 
0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 0.88750278 2.11495253 1.39245847 1. 62.97779366 0.\n 0. 0.93582306 0.50299165 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 1.40282698 2.75584724 0.17405731 1. 0. 59.534492\n 3.98560428 1.56896739 2.51495825 11.2168 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 2.0899259 2.05086306 1.39245847 0. 0. 
57.31974562\n 6.81040131 0.58146552 0.50299165 11.2168 1.30023581 2.12279174\n 1.00529816 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[ 1.46008522 2.179042 1.47948712 0. 0. 0.\n 8.96760963 0.93351131 3.01794989 11.2168 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [ 1.71774732 2.05086306 1.39245847 0. 0. 0.\n 9.9640107 0.84560742 1.50897495 11.2168 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 
0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[1.48871434 2.05086306 1.56651578 0. 0. 0.\n 0. 1.30072401 0.50299165 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [1.68911819 2.11495253 1.39245847 0. 0. 0.\n 0. 0.62126435 0.50299165 0. 1.30023581 2.20770341\n 1.00529816 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[1.08790663 1.02543153 1.04434385 1. 0. 0.\n 0. 1.03931262 2.51495825 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 
0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [1.03064839 0.83316312 0.34811462 0. 0. 0.\n 0. 0.52084947 2.51495825 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[1.37419785 0.83316312 3.13303155 0. 0. 0.\n 0.23913626 0.74674709 3.01794989 5.6084 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 
0. 0. 0. 0. 1.\n 1. 0. ] [1.14516488 0.51271577 3.13303155 0. 0. 0.\n 0. 0.98278659 2.0119666 5.6084 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ]\n[1.28831049 2.69175777 0.60920058 0. 0. 0.\n 0. 0.64587331 0.50299165 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 1. 0. 1. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 1.\n 1. 0. ] [1.48871434 2.69175777 0.60920058 0. 0. 0.\n 0.09964011 0.23620011 0.50299165 0. 1.30023581 2.20770341\n 0.99471607 0. 0. 0. 0. 0.\n 1. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 1. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 1. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 1. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 1. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 
 ... [output truncated: pages of printed feature-vector pairs omitted — each pair is a row of scaled numeric columns followed by long runs of 0./1. one-hot indicator entries; no information beyond the raw dump] ...
]\n" ], [ "#AdaFair\n\n#Tolerance to unfairness\nepsilon = 0\n\n\nget_base_clf = lambda: DecisionTreeClassifier(max_depth=adafair_max_depth)\n\nada_fair = AdaFairClassifier(DecisionTreeClassifier(max_depth=adafair_max_depth),\n algorithm=\"SAMME\",\n n_estimators=adafair_n_estimators,\n protected = protected_train,\n epsilon = epsilon)\n\nada_fair.fit(X_train, y_train)", "/usr/local/lib/python3.7/dist-packages/sklearn/ensemble/_weight_boosting.py:758: RuntimeWarning: invalid value encountered in true_divide\n yield (tmp_pred / norm).sum(axis=1)\n" ], [ "#Adaboost\n\n\nada_boost_sklearn = AdaBoostClassifier(DecisionTreeClassifier(max_depth=max_depth),\n algorithm=\"SAMME\",\n n_estimators=n_estimators)\n\nada_boost_sklearn.fit(X_train, y_train)", "_____no_output_____" ] ], [ [ "# 4) Compute and plot metrics", "_____no_output_____" ], [ "#### 4.1 Compute", "_____no_output_____" ] ], [ [ "names = ['ada_fair','ada_boost_sklearn', 'smoteboost', \"reweighted_classifier\"]\nclassifiers = [ada_fair, ada_boost_sklearn, smoteboost1, model_reweighted_classifier]\n\naccuracy = {}\nbal_accuracy = {}\nTPR = {}\nTNR = {}\neq_odds = {}\np_rule = {}\n#DELETA\n#y_test = y_test[:][1]\n\nfor i, clf in enumerate(classifiers):\n print(names[i])\n prediction = clf.predict(X_test)\n if i == 3:\n \n prediction = np.array(pd.read_csv(reweight_prediction, names = ['idx', 'pred'])['pred'][1:])\n print((prediction), (y_test))\n print(len(prediction), len(y_test))\n\n accuracy[names[i]] = (prediction == y_test).sum() * 1. / len(y_test)\n bal_accuracy[names[i]] = balanced_accuracy_score(y_test, prediction)\n print('accuracy {}: {}'.format(names[i], (prediction == y_test).sum() * 1. / len(y_test))) \n print('balanced accuracy {}: {}'.format(names[i], balanced_accuracy_score(y_test, prediction))) \n \n y_true_protected, y_pred_protected, y_true_non_protected, y_pred_non_protected = separate_protected_from_non_protected(y_test, prediction, protected_test)\n \n #TPR for protected group\n tp, tn, fp, fn = tp_tn_fp_fn(y_true_protected, y_pred_protected)\n TPR_protected = tp / (tp + fn)\n TNR_protected = tn / (tn + fp)\n FPR_protected = fp / (fp + tn)\n FNR_protected = fn / (fn + tp)\n print('TPR protected {}: {}'.format(names[i], TPR_protected))\n print('TNR protected {}: {}'.format(names[i], TNR_protected)) \n\n TPR[names[i] + ' protected'] = TPR_protected\n TNR[names[i] + ' protected'] = TNR_protected\n\n #TPR for non protected group\n tp, tn, fp, fn = tp_tn_fp_fn(y_true_non_protected, y_pred_non_protected)\n TPR_non_protected = tp / (tp + fn)\n TNR_non_protected = tn / (tn + fp)\n FPR_non_protected = fp / (fp + tn)\n FNR_non_protected = fn / (fn + tp)\n print('TPR non protected {}: {}'.format(names[i], TPR_non_protected))\n print('TNR non protected {}: {}'.format(names[i], TNR_non_protected))\n\n delta_FPR = -FPR_non_protected + FPR_protected\n delta_FNR = -FNR_non_protected + FNR_protected\n eq_odds[names[i]] = abs(delta_FPR) + abs(delta_FNR)\n\n TPR[names[i] + ' non protected'] = TPR_non_protected\n TNR[names[i] + ' non protected'] = TNR_non_protected\n\n p_rule[names[i]] = pRule(prediction, minority_idx_test)\n print('pRule {}: {}'.format(names[i],p_rule[names[i]]))\n ", "ada_fair\naccuracy ada_fair: 0.9228809846454807\nbalanced accuracy ada_fair: 0.7176952582693732\nTPR protected ada_fair: 0.6626686656671664\nTNR protected ada_fair: 0.9330514588008977\nTPR non protected ada_fair: 0.4335117332235488\nTNR non protected ada_fair: 0.9756010558607405\npRule ada_fair: 0.8097128558491885\nada_boost_sklearn\naccuracy 
ada_boost_sklearn: 0.946417109030332\nbalanced accuracy ada_boost_sklearn: 0.6529104164181531\nTPR protected ada_boost_sklearn: 0.08320839580209895\nTNR protected ada_boost_sklearn: 0.9992385379929465\nTPR non protected ada_boost_sklearn: 0.3812268423219432\nTNR non protected ada_boost_sklearn: 0.976409597869254\npRule ada_boost_sklearn: 0.047964583961589244\nsmoteboost\naccuracy smoteboost: 0.9368192609045616\nbalanced accuracy smoteboost: 0.5606524639130167\nTPR protected smoteboost: 0.13793103448275862\nTNR protected smoteboost: 0.984550336646361\nTPR non protected smoteboost: 0.12803622890078223\nTNR non protected smoteboost: 0.9989536515183943\npRule smoteboost: 0.7617401588170212\nreweighted_classifier\n[0 0 0 ... 0 0 0] 0 0\n1 0\n2 0\n3 0\n4 0\n ..\n98142 0\n98143 0\n98144 0\n98145 0\n98146 0\nName: income_50k, Length: 98147, dtype: int64\n98147 98147\naccuracy reweighted_classifier: 0.9402528859771567\nbalanced accuracy reweighted_classifier: 0.5368035200272799\nTPR protected reweighted_classifier: 0.08095952023988005\nTNR protected reweighted_classifier: 0.9981965373517153\nTPR non protected reweighted_classifier: 0.07348703170028818\nTNR non protected reweighted_classifier: 0.9988823095764666\npRule reweighted_classifier: 0.4486914878692678\n" ] ], [ [ "#### 4.2 Plot", "_____no_output_____" ] ], [ [ "labels = ['Accuracy', 'Bal. accuracy', 'Eq. odds','TPR prot.', 'TPR non-prot', 'TNR prot.', 'TNR non-prot.', 'pRule']\nadaFair_metrics = [accuracy['ada_fair'], bal_accuracy['ada_fair'], eq_odds['ada_fair'], TPR['ada_fair protected'], TPR['ada_fair non protected'], TNR['ada_fair protected'], TNR['ada_fair non protected'], p_rule['ada_fair']]\nadaBoost_metrics = [accuracy['ada_boost_sklearn'], bal_accuracy['ada_boost_sklearn'], eq_odds['ada_boost_sklearn'], TPR['ada_boost_sklearn protected'], TPR['ada_boost_sklearn non protected'], TNR['ada_boost_sklearn protected'], TNR['ada_boost_sklearn non protected'], p_rule['ada_boost_sklearn']]\nSMOTEBoost_metrics = [accuracy['smoteboost'], bal_accuracy['smoteboost'], eq_odds['smoteboost'], TPR['smoteboost protected'], TPR['smoteboost non protected'], TNR['smoteboost protected'], TNR['smoteboost non protected'], p_rule['smoteboost']]\nreweighted_class_metrics = [accuracy['reweighted_classifier'], bal_accuracy['reweighted_classifier'], eq_odds['reweighted_classifier'], TPR['reweighted_classifier protected'], TPR['reweighted_classifier non protected'], TNR['reweighted_classifier protected'], TNR['reweighted_classifier non protected'],p_rule['reweighted_classifier']]\n\nx = np.arange(len(labels)) # the label locations\nwidth = 0.20 # the width of the bars\n\nfont = {'family' : 'normal',\n 'weight' : 'normal',\n 'size' : 19}\n\nmatplotlib.rc('font', **font)\n\nfig, ax = plt.subplots(figsize = [15, 5])\nrects1 = ax.bar(x - 1.5*width, adaBoost_metrics, width, label='AdaBoost', color='gray')\nrects2 = ax.bar(x - width/2 , SMOTEBoost_metrics, width, label='SMOTEBoost', color='red')\nrects3 = ax.bar(x + width/2, adaFair_metrics, width, label='AdaFair', color='green')\nrects4 = ax.bar(x + 1.5*width, reweighted_class_metrics, width, label='ASR', color='blue')\n\nax.set_ylabel('Scores')\n#ax.set_title('Scores by algoritms')\nax.set_xticks(x)\nax.set_xticklabels(labels)\nax.legend(loc=9, ncol = 4, bbox_to_anchor=(0.5, 1.15))\nax.grid()\n\nfig.tight_layout()\n\nplt.show()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d02562ed2d085c138a990c1f5efc3179f182064c
13,528
ipynb
Jupyter Notebook
dictionary/dialect/pahang.ipynb
huseinzol05/Malay-Dataset
e27b7617c74395c86bb5ed9f3f194b3cac2f66f6
[ "Apache-2.0" ]
51
2020-05-20T13:26:18.000Z
2021-05-13T07:21:17.000Z
dictionary/dialect/pahang.ipynb
huseinzol05/Malay-Dataset
e27b7617c74395c86bb5ed9f3f194b3cac2f66f6
[ "Apache-2.0" ]
3
2020-05-21T13:12:46.000Z
2021-05-12T03:26:43.000Z
dictionary/dialect/pahang.ipynb
huseinzol05/Malaya-Dataset
c9c1917a6b1cab823aef5a73bd10e0fab0bff42d
[ "Apache-2.0" ]
21
2019-02-08T05:17:24.000Z
2020-05-05T09:28:50.000Z
40.624625
76
0.353785
[ [ [ "import json\nimport pandas as pd\nfrom bs4 import BeautifulSoup", "_____no_output_____" ], [ "with open('pahang.json') as fopen:\n pahang = json.load(fopen)", "_____no_output_____" ], [ "res = []\n\nfor j in pahang:\n s = BeautifulSoup(j)\n s.tbody.table.decompose()\n table_rows = s.tbody.find_all('tr')\n for tr in table_rows:\n td = tr.find_all('td')\n row = [tr.text.strip() for tr in td if tr.text.strip()]\n if row:\n res.append(row)", "_____no_output_____" ], [ "df = pd.DataFrame(res)\ndf", "_____no_output_____" ], [ "df.to_csv('pahang.csv', index = False)", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code" ] ]
d0256441b14747ca8dffb4322c21c5836680fe9c
678,580
ipynb
Jupyter Notebook
Python_Stock/Portfolio_Strategies/Apple_Tesla_Split.ipynb
linusqzdeng/Stock_Analysis_For_Quant
de6232caed5328a2b1fa40bec1f45bbd822c88fa
[ "MIT" ]
null
null
null
Python_Stock/Portfolio_Strategies/Apple_Tesla_Split.ipynb
linusqzdeng/Stock_Analysis_For_Quant
de6232caed5328a2b1fa40bec1f45bbd822c88fa
[ "MIT" ]
null
null
null
Python_Stock/Portfolio_Strategies/Apple_Tesla_Split.ipynb
linusqzdeng/Stock_Analysis_For_Quant
de6232caed5328a2b1fa40bec1f45bbd822c88fa
[ "MIT" ]
null
null
null
499.323032
103,053
0.843198
[ [ [ "# Apple and Tesla Split on 8/31", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport math\n\nimport warnings\nwarnings.filterwarnings(\"ignore\")\n\n# for fetching data\nimport yfinance as yf", "_____no_output_____" ], [ "# input\n# Coronavirus 2nd Wave\ntitle = \"Apple and Tesla\"\nsymbols = ['AAPL', 'TSLA']\nstart = '2020-01-01'\nend = '2020-08-31'", "_____no_output_____" ], [ "df = pd.DataFrame()\nfor symbol in symbols:\n df[symbol] = yf.download(symbol, start, end)['Adj Close']", "[*********************100%***********************] 1 of 1 completed\n[*********************100%***********************] 1 of 1 completed\n" ], [ "from datetime import datetime\nfrom dateutil import relativedelta\n\nd1 = datetime.strptime(start, \"%Y-%m-%d\")\nd2 = datetime.strptime(end, \"%Y-%m-%d\")\ndelta = relativedelta.relativedelta(d2, d1)\nprint('How many years of investing?')\nprint('%s years' % delta.years)", "How many years of investing?\n0 years\n" ], [ "number_of_years = delta.years", "_____no_output_____" ], [ "days = (df.index[-1] - df.index[0]).days\ndays", "_____no_output_____" ], [ "df.head()", "_____no_output_____" ], [ "df.tail()", "_____no_output_____" ], [ "df.min()", "_____no_output_____" ], [ "df.max()", "_____no_output_____" ], [ "df.describe()", "_____no_output_____" ], [ "plt.figure(figsize=(12,8))\nplt.plot(df)\nplt.title(title + ' Closing Price')\nplt.legend(labels=df.columns)\nplt.show()", "_____no_output_____" ], [ "# Normalize the data\nnormalize = (df - df.min())/ (df.max() - df.min())", "_____no_output_____" ], [ "plt.figure(figsize=(18,12))\nplt.plot(normalize)\nplt.title(title + ' Stocks Normalize')\nplt.legend(labels=normalize.columns)\nplt.show()", "_____no_output_____" ], [ "stock_rets = df.pct_change().dropna()", "_____no_output_____" ], [ "plt.figure(figsize=(12,8))\nplt.plot(stock_rets)\nplt.title(title + ' Stocks Returns')\nplt.legend(labels=stock_rets.columns)\nplt.show()", "_____no_output_____" ], [ "plt.figure(figsize=(12,8))\nplt.plot(stock_rets.cumsum())\nplt.title(title + ' Stocks Returns Cumulative Sum')\nplt.legend(labels=stock_rets.columns)\nplt.show()", "_____no_output_____" ], [ "sns.set(style='ticks')\nax = sns.pairplot(stock_rets, diag_kind='hist')\n\nnplot = len(stock_rets.columns)\nfor i in range(nplot) :\n for j in range(nplot) :\n ax.axes[i, j].locator_params(axis='x', nbins=6, tight=True)", "_____no_output_____" ], [ "ax = sns.PairGrid(stock_rets)\nax.map_upper(plt.scatter, color='purple')\nax.map_lower(sns.kdeplot, color='blue')\nax.map_diag(plt.hist, bins=30)\nfor i in range(nplot) :\n for j in range(nplot) :\n ax.axes[i, j].locator_params(axis='x', nbins=6, tight=True)", "_____no_output_____" ], [ "plt.figure(figsize=(10,10))\ncorr = stock_rets.corr()\n\n# plot the heatmap\nsns.heatmap(corr, \n xticklabels=corr.columns,\n yticklabels=corr.columns,\n cmap=\"Reds\")", "_____no_output_____" ], [ "# Box plot\nstock_rets.plot(kind='box',figsize=(24,8))", "_____no_output_____" ], [ "rets = stock_rets.dropna()\n\nplt.figure(figsize=(16,8))\nplt.scatter(rets.std(), rets.mean(),alpha = 0.5)\n\nplt.title('Stocks Risk & Returns')\nplt.xlabel('Risk')\nplt.ylabel('Expected Returns')\nplt.grid(which='major')\n\nfor label, x, y in zip(rets.columns, rets.std(), rets.mean()):\n plt.annotate(\n label, \n xy = (x, y), xytext = (50, 50),\n textcoords = 'offset points', ha = 'right', va = 'bottom',\n arrowprops = dict(arrowstyle = '-', connectionstyle = 'arc3,rad=-0.3'))", 
"_____no_output_____" ], [ "rets = stock_rets.dropna()\narea = np.pi*20.0\n\nsns.set(style='darkgrid')\nplt.figure(figsize=(16,8))\nplt.scatter(rets.std(), rets.mean(), s=area)\nplt.xlabel(\"Risk\", fontsize=15)\nplt.ylabel(\"Expected Return\", fontsize=15)\nplt.title(\"Return vs. Risk for Stocks\", fontsize=20)\n\nfor label, x, y in zip(rets.columns, rets.std(), rets.mean()) : \n plt.annotate(label, xy=(x,y), xytext=(50, 0), textcoords='offset points',\n arrowprops=dict(arrowstyle='-', connectionstyle='bar,angle=180,fraction=-0.2'),\n bbox=dict(boxstyle=\"round\", fc=\"w\"))", "_____no_output_____" ], [ "def annual_risk_return(stock_rets):\n tradeoff = stock_rets.agg([\"mean\", \"std\"]).T\n tradeoff.columns = [\"Return\", \"Risk\"]\n tradeoff.Return = tradeoff.Return*252\n tradeoff.Risk = tradeoff.Risk * np.sqrt(252)\n return tradeoff", "_____no_output_____" ], [ "tradeoff = annual_risk_return(stock_rets)\ntradeoff", "_____no_output_____" ], [ "import itertools\n\ncolors = itertools.cycle([\"r\", \"b\", \"g\"])\n\ntradeoff.plot(x = \"Risk\", y = \"Return\", kind = \"scatter\", figsize = (13,9), s = 20, fontsize = 15, c='g')\nfor i in tradeoff.index:\n plt.annotate(i, xy=(tradeoff.loc[i, \"Risk\"]+0.002, tradeoff.loc[i, \"Return\"]+0.002), size = 15)\nplt.xlabel(\"Annual Risk\", fontsize = 15)\nplt.ylabel(\"Annual Return\", fontsize = 15)\nplt.title(\"Return vs. Risk for \" + title + \" Stocks\", fontsize = 20)\nplt.show()", "_____no_output_____" ], [ "rest_rets = rets.corr()\npair_value = rest_rets.abs().unstack()\npair_value.sort_values(ascending = False)", "_____no_output_____" ], [ "# Normalized Returns Data\nNormalized_Value = ((rets[:] - rets[:].min()) /(rets[:].max() - rets[:].min()))\nNormalized_Value.head()", "_____no_output_____" ], [ "Normalized_Value.corr()", "_____no_output_____" ], [ "normalized_rets = Normalized_Value.corr()\nnormalized_pair_value = normalized_rets.abs().unstack()\nnormalized_pair_value.sort_values(ascending = False)", "_____no_output_____" ], [ "print(\"Stock returns: \")\nprint(rets.mean())\nprint('-' * 50)\nprint(\"Stock risks:\")\nprint(rets.std())", "Stock returns: \nAAPL 0.003572\nTSLA 0.011590\ndtype: float64\n--------------------------------------------------\nStock risks:\nAAPL 0.031211\nTSLA 0.059151\ndtype: float64\n" ], [ "table = pd.DataFrame()\ntable['Returns'] = rets.mean()\ntable['Risk'] = rets.std()\ntable.sort_values(by='Returns')", "_____no_output_____" ], [ "table.sort_values(by='Risk')", "_____no_output_____" ], [ "rf = 0.01\ntable['Sharpe Ratio'] = (table['Returns'] - rf) / table['Risk']\ntable", "_____no_output_____" ], [ "table['Max Returns'] = rets.max()", "_____no_output_____" ], [ "table['Min Returns'] = rets.min()", "_____no_output_____" ], [ "table['Median Returns'] = rets.median()", "_____no_output_____" ], [ "total_return = stock_rets[-1:].transpose()\ntable['Total Return'] = 100 * total_return\ntable", "_____no_output_____" ], [ "table['Average Return Days'] = (1 + total_return)**(1 / days) - 1\ntable", "_____no_output_____" ], [ "initial_value = df.iloc[0]\nending_value = df.iloc[-1]\ntable['CAGR'] = ((ending_value / initial_value) ** (252.0 / days)) -1\ntable", "_____no_output_____" ], [ "table.sort_values(by='Average Return Days')", "_____no_output_____" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0257854ca677253e85407b94878225459771021
750,741
ipynb
Jupyter Notebook
ML-Predictions/.ipynb_checkpoints/Water-Level TSF - Copy (2)-checkpoint.ipynb
romilshah525/SIH-2019
35555a4826e097a4a1178e7a1a8580dc4f80a64e
[ "MIT" ]
null
null
null
ML-Predictions/.ipynb_checkpoints/Water-Level TSF - Copy (2)-checkpoint.ipynb
romilshah525/SIH-2019
35555a4826e097a4a1178e7a1a8580dc4f80a64e
[ "MIT" ]
6
2021-07-20T06:46:21.000Z
2022-03-08T23:26:58.000Z
ML-Predictions/.ipynb_checkpoints/Water-Level TSF - Copy (2)-checkpoint.ipynb
romilshah525/Water.io
35555a4826e097a4a1178e7a1a8580dc4f80a64e
[ "MIT" ]
null
null
null
317.303888
57,852
0.920407
[ [ [ "import numpy as np\nimport matplotlib.pyplot as plt\n%matplotlib inline\nimport pandas as pd\nfrom matplotlib.pylab import rcParams\nrcParams['figure.figsize'] = 15, 6\nimport seaborn as sns\nimport warnings\nwarnings.filterwarnings('ignore')", "_____no_output_____" ], [ "ori=pd.read_csv('website_data_20190225.csv')\nori.drop(['STATE','DISTRICT','WLCODE','SITE_TYPE','TEH_NAME'],axis=1,inplace=True)\nori.replace(to_replace=\"'0\",value=0,inplace=True)\nori.head()", "_____no_output_____" ], [ "dataset=pd.DataFrame().reindex_like(ori)\ndataset1=pd.DataFrame().reindex_like(ori)\ndataset.dropna(inplace=True)\ndataset1.dropna(inplace=True)\n\n# j=0\n# for i in range(0,ori.shape[0]):\n# if ori['STATE'][i]=='RJ':\n# dataset.loc[j] = ori.iloc[i]\n# j+=1\n# dataset.drop(['STATE'],axis=1,inplace=True)\n\n# j=0\n# for i in range(0,ori.shape[0]):\n# if ori['DISTRICT'][i]=='Ajmer':\n# dataset.loc[j] = ori.iloc[i]\n# j+=1\n# dataset.drop(['DISTRICT'],axis=1,inplace=True)\n\nj=0\nfor i in range(0,ori.shape[0]):\n if ori['BLOCK_NAME'][i]=='Arain':\n dataset1.loc[j] = ori.iloc[i]\n j+=1\ndataset1.drop(['BLOCK_NAME'],axis=1,inplace=True)\ndataset.drop(['BLOCK_NAME'],axis=1,inplace=True)\n\nj=0\nfor i in range(0,dataset1.shape[0]):\n if dataset1['SITE_NAME'][i]=='Sanpla':\n dataset.loc[j] = dataset1.iloc[i]\n j+=1\nlat=dataset[\"LAT\"][0]\nlon=dataset[\"LON\"][0]\ndataset.drop(['SITE_NAME','LAT','LON'],axis=1,inplace=True)\n \ndataset", "_____no_output_____" ], [ "for i in range(0,dataset.shape[0]):\n dataset['MONSOON'][i]=float(dataset['MONSOON'][i])\n dataset['POMRB'][i]=float(dataset['POMRB'][i])\n dataset['POMKH'][i]=float(dataset['POMKH'][i])\n dataset['PREMON'][i]=float(dataset['PREMON'][i])\n dataset['YEAR_OBS'][i]=int(dataset['YEAR_OBS'][i])", "_____no_output_____" ], [ "first=list(dataset['MONSOON'])\nsecond=list(dataset['POMRB'])\nthird=list(dataset['POMKH'])\nfourth=list(dataset['PREMON'])\ndataset['MONSOON']=pd.core.frame.DataFrame(x+y+z+w for x, y,z,w in zip(first, second, third, fourth))\ndataset.drop(['POMRB','POMKH','PREMON'],axis=1,inplace=True)\ndataset = dataset.iloc[::-1]\ndataset", "_____no_output_____" ], [ "dataset['YEAR_OBS']=(dataset['YEAR_OBS']).apply(np.int64)", "_____no_output_____" ], [ "dataset['YEAR_OBS']=pd.to_datetime(dataset['YEAR_OBS'],yearfirst=True,format='%Y',infer_datetime_format=True)\nindexedDataset=dataset.set_index(['YEAR_OBS'])", "_____no_output_____" ], [ "from datetime import datetime\nindexedDataset.head(50)", "_____no_output_____" ], [ "plt.xlabel('Years')\nplt.ylabel('Water-Level')\nplt.plot(indexedDataset)", "_____no_output_____" ] ], [ [ "- A stationary time series is one whose statistical properties such as mean, variance, autocorrelation, etc. are all constant over time. Most statistical forecasting methods are based on the assumption that the time series can be rendered approximately stationary (i.e., \"stationarized\") through the use of mathematical transformations. A stationarized series is relatively easy to predict: you simply predict that its statistical properties will be the same in the future as they have been in the past!", "_____no_output_____" ], [ "- We can check stationarity using the following:\n\n- - Plotting Rolling Statistics: We can plot the moving average or moving variance and see if it varies with time. This is more of a visual technique.\n- - Dickey-Fuller Test: This is one of the statistical tests for checking stationarity. Here the null hypothesis is that the TimeSeries is non-stationary. 
The test results comprise a Test Statistic and some Critical Values for different confidence levels. If the ‘Test Statistic’ is less than the ‘Critical Value’, we can reject the null hypothesis and say that the series is stationary.", "_____no_output_____" ] ], [ [ "from statsmodels.tsa.stattools import adfuller\ndef test_stationary(timeseries):\n    \n    #Determining rolling statistics\n    moving_average=timeseries.rolling(window=12).mean()\n    standard_deviation=timeseries.rolling(window=12).std()\n    \n    #Plot rolling statistics:\n    plt.plot(timeseries,color='blue',label=\"Original\")\n    plt.plot(moving_average,color='red',label='Mean')\n    plt.plot(standard_deviation,color='black',label='Standard Deviation')\n    plt.legend(loc='best') #'best' lets matplotlib pick the legend position\n    plt.title('Rolling Mean & Deviation')\n#     plt.show()\n    plt.show(block=False)\n    \n    #Perform Dickey-Fuller test:\n    print('Results Of Dickey-Fuller Test')\n    tstest=adfuller(timeseries['MONSOON'],autolag='AIC')\n    tsoutput=pd.Series(tstest[0:4],index=['Test Statistics','P-value','#Lags used',\"#Obs. used\"])\n    #The Test Statistic should be less than the Critical Value for stationarity;\n    #the smaller the p-value, the stronger the evidence for stationarity\n    for key,value in tstest[4].items():\n        tsoutput['Critical Value (%s)'%key]=value\n    print(tsoutput)", "_____no_output_____" ], [ "test_stationary(indexedDataset)", "_____no_output_____" ] ], [ [ "- There are two major reasons behind the non-stationarity of a TS:\n- - Trend – a varying mean over time, e.g. a mean level that keeps growing from period to period.\n- - Seasonality – variations at specific time-frames, e.g. people might have a tendency to buy cars in a particular month because of a pay increment or festivals.", "_____no_output_____" ], [ "## Indexed Dataset Logscale", "_____no_output_____" ] ], [ [ "indexedDataset_logscale=np.log(indexedDataset)\ntest_stationary(indexedDataset_logscale)", "_____no_output_____" ] ], [ [ "## Dataset Log Minus Moving Average (dl_ma)", "_____no_output_____" ] ], [ [ "rolmeanlog=indexedDataset_logscale.rolling(window=12).mean()\ndl_ma=indexedDataset_logscale-rolmeanlog\ndl_ma.head(12)", "_____no_output_____" ], [ "dl_ma.dropna(inplace=True)\ndl_ma.head(12)", "_____no_output_____" ], [ "test_stationary(dl_ma)", "_____no_output_____" ] ], [ [ "## Exponential Decay Weighted Average (edwa)", "_____no_output_____" ] ], [ [ "edwa=indexedDataset_logscale.ewm(halflife=12,min_periods=0,adjust=True).mean()\nplt.plot(indexedDataset_logscale)\nplt.plot(edwa,color='red')", "_____no_output_____" ] ], [ [ "## Dataset Logscale Minus Moving Exponential Decay Average (dlmeda)", "_____no_output_____" ] ], [ [ "dlmeda=indexedDataset_logscale-edwa\ntest_stationary(dlmeda)", "_____no_output_____" ] ], [ [ "## Eliminating Trend and Seasonality", "_____no_output_____" ], [ "- Differencing – taking the difference with a particular time lag\n- Decomposition – modeling both trend and seasonality and removing them from the model.", "_____no_output_____" ], [ "# Differencing", "_____no_output_____" ], [ "## Dataset Log Diff Shifting (dlds)", "_____no_output_____" ] ], [ [ "#Before Shifting\nindexedDataset_logscale.head()", "_____no_output_____" ], [ "#After Shifting\nindexedDataset_logscale.shift().head()", "_____no_output_____" ], [ "dlds=indexedDataset_logscale-indexedDataset_logscale.shift()\ndlds.dropna(inplace=True)\ntest_stationary(dlds)", "_____no_output_____" ] ], [ [ "# Decomposition", "_____no_output_____" ] ], [ [ "from statsmodels.tsa.seasonal import seasonal_decompose\ndecomposition = seasonal_decompose(indexedDataset_logscale,freq=10)\n\ntrend=decomposition.trend\nseasonal=decomposition.seasonal\nresidual=decomposition.resid\n\nplt.subplot(411)\nplt.plot(indexedDataset_logscale,label='Original')\nplt.legend(loc='best')\n\nplt.subplot(412)\nplt.plot(trend,label='Trend')\nplt.legend(loc='best')\n\nplt.subplot(413)\nplt.plot(seasonal,label='Seasonal')\nplt.legend(loc='best')\n\nplt.subplot(414)\nplt.plot(residual,label='Residual')\nplt.legend(loc='best')\n\nplt.tight_layout() #To show multiple graphs in one output, use plt.subplot(abc)", "_____no_output_____" ] ], [ [ "- Here trend and seasonality are separated out from the data and we can model the residuals. Let's check the stationarity of the residuals:", "_____no_output_____" ] ], [ [ "decomposedlogdata=residual\ndecomposedlogdata.dropna(inplace=True)\ntest_stationary(decomposedlogdata)", "_____no_output_____" ] ], [ [ "# Forecasting a Time Series", "_____no_output_____" ], [ "- ARIMA stands for Auto-Regressive Integrated Moving Average. ARIMA forecasting for a stationary time series is nothing but a linear equation (like a linear regression). The predictors depend on the parameters (p,d,q) of the ARIMA model:\n\n- - Number of AR (Auto-Regressive) terms (p): AR terms are just lags of the dependent variable. For instance if p is 5, the predictors for x(t) will be x(t-1)…x(t-5).\n- - Number of MA (Moving Average) terms (q): MA terms are lagged forecast errors in the prediction equation. For instance if q is 5, the predictors for x(t) will be e(t-1)…e(t-5) where e(i) is the difference between the moving average at the ith instant and the actual value.\n- - Number of Differences (d): This is the number of nonseasonal differences, i.e. in this case we took the first-order difference. So either we can pass that differenced variable and put d=0 or pass the original variable and put d=1. Both will generate the same results.\n\n\n- An important concern here is how to determine the values of ‘p’ and ‘q’. We use two plots to determine these numbers.\n\n- - Autocorrelation Function (ACF): It is a measure of the correlation of the TS with a lagged version of itself. For instance at lag 5, ACF would compare the series at time instants ‘t1’…’t2’ with the series at instants ‘t1-5’…’t2-5’ (t1-5 and t2 being end points).\n- - Partial Autocorrelation Function (PACF): This measures the correlation of the TS with a lagged version of itself but after eliminating the variations already explained by the intervening comparisons. E.g. at lag 5, it will check the correlation but remove the effects already explained by lags 1 to 4.", "_____no_output_____" ], [ "## ACF & PACF Plots", "_____no_output_____" ] ], [ [ "from statsmodels.tsa.stattools import acf,pacf\nlag_acf=acf(dlds,nlags=20)\nlag_pacf=pacf(dlds,nlags=20,method='ols')\n\nplt.subplot(121)\nplt.plot(lag_acf)\nplt.axhline(y=0, linestyle='--',color='gray')\nplt.axhline(y=1.96/np.sqrt(len(dlds)),linestyle='--',color='gray')\nplt.axhline(y=-1.96/np.sqrt(len(dlds)),linestyle='--',color='gray')\nplt.title('AutoCorrelation Function')\n\nplt.subplot(122)\nplt.plot(lag_pacf)\nplt.axhline(y=0, linestyle='--',color='gray')\nplt.axhline(y=1.96/np.sqrt(len(dlds)),linestyle='--',color='gray')\nplt.axhline(y=-1.96/np.sqrt(len(dlds)),linestyle='--',color='gray')\nplt.title('PartialAutoCorrelation Function')\n\nplt.tight_layout()", "_____no_output_____" ] ], [ [ "- In this plot, the two dotted lines on either side of 0 are the confidence intervals.
These can be used to determine the ‘p’ and ‘q’ values as:\n\n- - p – The lag value where the PACF chart crosses the upper confidence interval for the first time. If we notice closely, in this case p=2.\n- - q – The lag value where the ACF chart crosses the upper confidence interval for the first time. If we notice closely, in this case q=2.", "_____no_output_____" ] ], [ [ "from statsmodels.tsa.arima_model import ARIMA\n\nmodel=ARIMA(indexedDataset_logscale,order=(5,1,0))\nresults_AR=model.fit(disp=-1)\nplt.plot(dlds)\nplt.plot(results_AR.fittedvalues,color='red')\nplt.title('RSS: %.4f'%sum((results_AR.fittedvalues-dlds['MONSOON'])**2))\nprint('Plotting AR Model')", "Plotting AR Model\n" ], [ "model = ARIMA(indexedDataset_logscale, order=(0, 1, 2)) #0,1,2\nresults_MA = model.fit(disp=-1) \nplt.plot(dlds)\nplt.plot(results_MA.fittedvalues, color='red')\nplt.title('RSS: %.4f'%sum((results_MA.fittedvalues-dlds['MONSOON'])**2))\nprint('Plotting MA Model')", "Plotting MA Model\n" ], [ "import itertools\n\n# grid-search ARIMA orders by AIC; orders that fail to converge are skipped\na={}\np=d=q=range(0,6)\npdq=list(itertools.product(p,d,q))\nfor i in pdq:\n    try:\n        model_arima=(ARIMA(indexedDataset_logscale,order=i)).fit()\n        a[i]=model_arima.aic\n    except Exception:\n        continue\n# best order by AIC: min(a, key=a.get)", "_____no_output_____" ], [ "# refit each candidate order and keep only the finite residual sums of squares\nRSS=[]\nRSS1=[]\nfor i in a.keys(): \n    model = ARIMA(indexedDataset_logscale, order=i)\n    results_ARIMA = model.fit(disp=-1)\n    RSS.append(sum((results_ARIMA.fittedvalues-dlds['MONSOON'])**2))\nfor i in range(0,len(RSS)):\n    if not np.isnan(RSS[i]):\n        RSS1.append(RSS[i])\nmin(RSS1)", "_____no_output_____" ], [ "model = ARIMA(indexedDataset_logscale, order=(5, 1, 2)) \nresults_ARIMA = model.fit(disp=-1) \nplt.plot(dlds)\nplt.plot(results_ARIMA.fittedvalues, color='red')\nplt.title('RSS: %.4f'%sum((results_ARIMA.fittedvalues-dlds['MONSOON'])**2))\nprint('Plotting Combined Model')", "Plotting Combined Model\n" ] ], [ [ "# Taking it back to original scale from residual scale", "_____no_output_____" ] ], [ [ "#storing the predicted results as a separate series\npredictions_ARIMA_diff = pd.Series(results_ARIMA.fittedvalues, copy=True)\npredictions_ARIMA_diff.head()", "_____no_output_____" ] ], [ [ "- Notice that these start from the second observation and not the first. Why? This is because we took a lag of 1 and the first element doesn’t have anything before it to subtract from. The way to convert the differencing back to log scale is to add these differences consecutively to the base number.
An easy way to do it is to first determine the cumulative sum at each index and then add it to the base number.", "_____no_output_____" ] ], [ [ "#convert to cumulative sum\npredictions_ARIMA_diff_cumsum = predictions_ARIMA_diff.cumsum()\npredictions_ARIMA_diff_cumsum", "_____no_output_____" ], [ "#a constant series holding the base (first) log value\npredictions_ARIMA_log = pd.Series(indexedDataset_logscale['MONSOON'].iloc[0], index=indexedDataset_logscale.index)\npredictions_ARIMA_log", "_____no_output_____" ], [ "predictions_ARIMA_log = predictions_ARIMA_log.add(predictions_ARIMA_diff_cumsum,fill_value=0)\npredictions_ARIMA_log", "_____no_output_____" ] ], [ [ "- Here the first element is the base number itself and from there on the values are cumulatively added.", "_____no_output_____" ] ], [ [ "#Last step is to take the exponent and compare with the original series.\npredictions_ARIMA = np.exp(predictions_ARIMA_log)\nplt.plot(indexedDataset)\nplt.plot(predictions_ARIMA)\nplt.title('RMSE: %.4f'% np.sqrt(sum((predictions_ARIMA-indexedDataset['MONSOON'])**2)/len(indexedDataset)))", "_____no_output_____" ] ], [ [ "- Finally we have a forecast at the original scale.", "_____no_output_____" ] ], [ [ "results_ARIMA.plot_predict(1,26)\n\n#start = 1st observation\n#end = 26th observation, i.e. a few years beyond the yearly sample\n\n#The band around the forecast is its confidence interval", "_____no_output_____" ], [ "x=results_ARIMA.forecast(steps=5)\nprint(x)\n#values are on the differenced log scale", "(array([2.61731503, 2.74811248, 2.79018171, 2.72974134, 2.71475412]), array([0.23507659, 0.33764008, 0.33764014, 0.3376707 , 0.33861065]), array([[2.15657338, 3.07805668],\n [2.08635009, 3.40987487],\n [2.12841919, 3.45194423],\n [2.06791892, 3.39156376],\n [2.05108943, 3.3784188 ]]))\n" ], [ "# forecast, standard error and confidence interval for each step\nfor i in range(0,5):\n    print(x[0][i],end='')\n    print('\\t',x[1][i],end='')\n    print('\\t',x[2][i])", "2.617315031254026\t 0.2350765910977012\t [2.15657338 3.07805668]\n2.7481124785660804\t 0.3376400750662718\t [2.08635009 3.40987487]\n2.790181712476766\t 0.3376401428052289\t [2.12841919 3.45194423]\n2.7297413397725365\t 0.3376707032327619\t [2.06791892 3.39156376]\n2.7147541154092676\t 0.3386106520100188\t [2.05108943 3.3784188 ]\n" ], [ "np.exp(results_ARIMA.forecast(steps=5)[0])", "_____no_output_____" ], [ "predictions_ARIMA_diff = pd.Series(results_ARIMA.forecast(steps=5)[0], copy=True)\npredictions_ARIMA_diff.head()", "_____no_output_____" ], [ "predictions_ARIMA_diff_cumsum = predictions_ARIMA_diff.cumsum()\npredictions_ARIMA_diff_cumsum.head()", "_____no_output_____" ], [ "predictions_ARIMA_log=[]\nfor i in range(0,len(predictions_ARIMA_diff_cumsum)):\n    # 3.411478 is the hard-coded last in-sample log level, used as the forecast base\n    predictions_ARIMA_log.append(predictions_ARIMA_diff_cumsum[i]+3.411478)\npredictions_ARIMA_log", "_____no_output_____" ], [ "#Last step is to take the exponent and compare with the original series.\npredictions_ARIMA = np.exp(predictions_ARIMA_log)\nplt.subplot(121)\nplt.plot(indexedDataset)\nplt.subplot(122)\nplt.plot(predictions_ARIMA)\nplt.tight_layout()\n# plt.title('RMSE: %.4f'% np.sqrt(sum((predictions_ARIMA-indexedDataset['MONSOON'])**2)/len(indexedDataset)))", "_____no_output_____" ], [ "np.exp(predictions_ARIMA_log)", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d02581b83a83a80c39d56cc597c59e4ac1b0803e
8,760
ipynb
Jupyter Notebook
Python_Basic_Assignments/Assignment_11.ipynb
dataqueenpend/-Assignments_fsDS_OneNeuron
60ec0f1d357b738dd6c753254506a0a6f8241ceb
[ "MIT" ]
null
null
null
Python_Basic_Assignments/Assignment_11.ipynb
dataqueenpend/-Assignments_fsDS_OneNeuron
60ec0f1d357b738dd6c753254506a0a6f8241ceb
[ "MIT" ]
null
null
null
Python_Basic_Assignments/Assignment_11.ipynb
dataqueenpend/-Assignments_fsDS_OneNeuron
60ec0f1d357b738dd6c753254506a0a6f8241ceb
[ "MIT" ]
null
null
null
44.242424
973
0.604338
[ [ [ "## 1. Create an assert statement that throws an AssertionError if the variable spam is a negative integer.", "_____no_output_____" ] ], [ [ "import pyinputplus as pyip\n\ndef is_pos_integer():\n n = pyip.inputInt(prompt='Enter a positive integer: ')\n assert n > 0, 'This is a negative integer'", "_____no_output_____" ], [ "is_pos_integer()", "Enter a positive integer: -1\n" ] ], [ [ "## 2. Write an assert statement that triggers an AssertionError if the variables eggs and bacon contain strings that are the same as each other, even if their cases are different (that is, &#39;hello&#39; and &#39;hello&#39; are considered the same, and &#39;goodbye&#39; and &#39;GOODbye&#39; are also considered the same).", "_____no_output_____" ] ], [ [ "import re\nimport pyinputplus as pyip\n\ndef same_or_different():\n eggs = pyip.inputStr(prompt='What is eggs: ')\n bacon = pyip.inputStr(prompt='What is bacon: ')\n \n assert eggs.lower() != bacon.lower(), 'Strings are the same!'", "_____no_output_____" ], [ "same_or_different()", "What is eggs: hello\nWhat is bacon: Hello\n" ] ], [ [ "## 3. Create an assert statement that throws an AssertionError every time.", "_____no_output_____" ] ], [ [ "assert 0 !=0 , 'Assertion Error every time!'", "_____no_output_____" ] ], [ [ "## 4. What are the two lines that must be present in your software in order to call logging.debug()?\n > import logging\n > logging.basicConfig(level=logging.DEBUG, format='%(asctime)s %(levelname)s %(message)s')\n\n## 5. What are the two lines that your program must have in order to have logging.debug() send a logging message to a file named programLog.txt?\n > import logging\n > logging.basicConfig(filename='programLog.txt', level=logging.DEBUG, format='%(asctime)s %(levelname)s %(message)s')\n\n## 6. What are the five levels of logging?\n > DEBUG, INFO, WARNING, ERROR, CRITICAL\n\n## 7. What line of code would you add to your software to disable all logging messages?\n > lg.disable(lg.CRITICAL)\n\n## 8.Why is using logging messages better than using print() to display the same message?\n > Because of timestamps, we have a log of erros, without the messages in console. \n\n## 9. What are the differences between the Step Over, Step In, and Step Out buttons in the debugger?\n * Step - executes the current line\n * Over - quickly executes function without stepping into it\n * Out - executes the rest of the code until it steps out to the place it is currently\n\n## 10.After you click Continue, when will the debugger stop ?\n > Upon breakpoint.\n\n## 11. What is the concept of a breakpoint?\n > Debugger stops at breakpoint. And this is the main concept behind it. ", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ] ]
d02584addcf5a242644d3f17895ece75b40dd5e0
20,964
ipynb
Jupyter Notebook
examples/Advection-diffusion on a triangle.ipynb
ApproxFun/DiskFun.jl
cf0d0bacc8c5f78c0f5c48b21b38a7a08c33e2e3
[ "MIT" ]
5
2018-07-21T21:56:58.000Z
2021-09-23T16:08:10.000Z
examples/Advection-diffusion on a triangle.ipynb
ApproxFun/DiskFun.jl
cf0d0bacc8c5f78c0f5c48b21b38a7a08c33e2e3
[ "MIT" ]
41
2018-02-28T11:17:28.000Z
2022-02-04T18:59:09.000Z
examples/Advection-diffusion on a triangle.ipynb
ApproxFun/DiskFun.jl
cf0d0bacc8c5f78c0f5c48b21b38a7a08c33e2e3
[ "MIT" ]
7
2018-09-12T23:05:16.000Z
2022-03-24T13:17:41.000Z
43.5842
1,129
0.635327
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d02588d73318038c341739e2c325be6b6b9a18f0
18,364
ipynb
Jupyter Notebook
content/04. Not quite intelligent robots/04.2 Robot navigation using dead reckoning.ipynb
mmh352/tm129-robotics2020
a73a00fcfcd7eb7857bc9b6ce28449e9e79a79bc
[ "OML" ]
8
2020-02-12T06:20:18.000Z
2021-10-08T23:45:24.000Z
content/04. Not quite intelligent robots/04.2 Robot navigation using dead reckoning.ipynb
mmh352/tm129-robotics2020
a73a00fcfcd7eb7857bc9b6ce28449e9e79a79bc
[ "OML" ]
73
2020-02-07T18:55:07.000Z
2021-10-01T12:09:06.000Z
content/04. Not quite intelligent robots/04.2 Robot navigation using dead reckoning.ipynb
mmh352/tm129-robotics2020
a73a00fcfcd7eb7857bc9b6ce28449e9e79a79bc
[ "OML" ]
4
2020-06-25T09:46:53.000Z
2021-06-29T11:48:37.000Z
39.32334
690
0.654596
[ [ [ "# 2 Dead reckoning\n\n*Dead reckoning* is a means of navigation that does not rely on external observations. Instead, a robot’s position is estimated by summing its incremental movements relative to a known starting point.\n\nEstimates of the distance traversed are usually obtained from measuring how many times the wheels have turned, and how many times they have turned in relation to each other. For example, the wheels of the robot could be attached to an odometer, similar to the device that records the mileage of a car.\n\nIn RoboLab we will calculate the position of a robot from how long it moves in a straight line or rotates about its centre. We will assume that the length of time for which the motors are switched on is directly related to the distance travelled by the wheels.", "_____no_output_____" ], [ "*By design, the simulator does not provide the robot with access to any magical GPS-style service. In principle, we could create a magical ‘simulated-GPS’ sensor that would allow the robot to identify its location from the simulator’s point of view; but in the real world we can’t always guarantee that external location services are available. For example, GPS doesn’t work indoors or underground, or even in many cities where line-of-sight access to four or more GPS satellites is not available.*\n\n*Furthermore, the robot cannot magically teleport itself to a new location from within a program. Only the magics can teleport the robot to a specific location...*\n\n*Although the simulator is omniscient and does keep track of where the robot is, the robot must figure out for itself where it is based on things like how far the motors have turned, or from its own sensor readings (ultrasound-based distance to a target, for example, or gyroscope heading); you will learn how to make use of sensors for navigation in later notebooks.*", "_____no_output_____" ], [ "## 2.1 Activity – Dead reckoning\n\nAn environment for the simulated robot to navigate is shown below, based on the 2018 First Lego League ‘Into Orbit’ challenge.\n\nThe idea is that the robot must get to the target satellite from its original starting point by avoiding the obstacles in its direct path.\n\n![Space scene showing the robot, some satellites against a ‘space’ background, and some wall-like obstacles between the robot’s starting point and a target satellite.](../images/Section_00_02_-_Jupyter_Notebook.png)", "_____no_output_____" ], [ "The [First Lego League (FLL)](https://www.firstlegoleague.org/) is a friendly international youth-based robot competition in which teams compete at national and international level on an annual basis. School teams are often coached by volunteers. In the UK, volunteers often coach teams under the auspices of the [STEM Ambassadors Scheme](https://www.stem.org.uk/stem-ambassadors). Many companies run volunteering schemes that allow employees to volunteer their skills in company time using schemes such as STEM Ambassadors.", "_____no_output_____" ], [ "Load in the simulator in the usual way:", "_____no_output_____" ] ], [ [ "from nbev3devsim.load_nbev3devwidget import roboSim, eds\n\n%load_ext nbev3devsim", "_____no_output_____" ] ], [ [ "To navigate the environment, we will use a small robot configuration within the simulator. The robot configuration can be set via the simulator user interface, or by passing the `-r Small_Robot` parameter setting in the simulator magic.\n\nThe following program should drive the robot from its starting point to the target, whilst avoiding the obstacles. 
We define the obstacle as being avoided if it is not crossed by the robot’s *pen down* trail.\n\nLoad the *FLL_2018_Into_Orbit* background into the simulator. Run the following code cell to download the program to the simulator and then, with the *pen down*, run the program in the simulator.", "_____no_output_____" ], [ "Remember, you can use the `-P / --pencolor` flag to change the pen colour and the `-C / --clear` option to clear the pen trace.", "_____no_output_____" ], [ "Does the robot reach the target satellite without encountering any obstacles?", "_____no_output_____" ] ], [ [ "%%sim_magic_preloaded -b FLL_2018_Into_Orbit -p -r Small_Robot\n\n# Turn on the spot to the right\ntank_turn.on_for_rotations(100, SpeedPercent(70), 1.7 )\n\n# Go forwards\ntank_drive.on_for_rotations(SpeedPercent(30), SpeedPercent(30), 20)\n\n# Slight graceful turn to left\ntank_drive.on_for_rotations(SpeedPercent(35), SpeedPercent(50), 8.5)\n\n# Turn on the spot to the left\ntank_turn.on_for_rotations(-100, SpeedPercent(75), 0.8)\n\n# Forwards a bit\ntank_drive.on_for_rotations(SpeedPercent(30), SpeedPercent(30), 2.0)\n\n# Turn on the spot a bit more to the right\ntank_turn.on_for_rotations(100, SpeedPercent(60), 0.4 )\n\n# Go forwards a bit more and dock on the satellite\ntank_drive.on_for_rotations(SpeedPercent(30), SpeedPercent(30), 1.0)\n\nsay(\"Hopefully I have docked with the satellite...\")", "_____no_output_____" ] ], [ [ "*Add your notes on how well the simulated robot performed the task here.*", "_____no_output_____" ], [ "To set the speeds and times, I used a bit of trial and error.\n\nIf the route had been much more complex, then I would have been tempted to comment out the steps up I had already run and add new steps that would be applied from wherever the robot was currently located.\n\nNote that the robot could have taken other routes to get to the satellite – I just thought I should avoid the asteroid!", "_____no_output_____" ], [ "### 2.1.1 Using motor tacho counts to identify how far the robot has travelled\n\nIn the above example, the motors were turned on for a specific amount of time to move the robot on each leg of its journey. This would not be an appropriate control strategy if we wanted to collect sensor data along the route, because the `on_for_X()` motor commands are blocking commands.\n\nHowever, suppose we replaced the forward driving `tank_drive.on_for_rotations()` commands with commands of the form:\n\n```python\nfrom time import sleep\n\ntank_drive.on(SPEED)\nwhile int(tank_drive.left_motor.position) < DISTANCE:\n # We need something that takes a finite time\n # to run in the loop or the program will hang\n sleep(0.1)\n```\n\nNow we could drive the robot forwards until the motor tacho count exceeds a specified `DISTANCE` and at the same time, optionally include additional commands, such as sensor data-logging commands, inside the body of each `while` loop.\n\n*As well as `tank_drive.left_motor.position` we can also refer to `tank_drive.right_motor.position`. 
Also note that these values are returned as strings and need to be cast to integers for numerical comparisons.*", "_____no_output_____" ], [ "### 2.1.2 Activity – Dead reckoning over distances (optional)\n\nUse the `.left_motor.position` and/or `.right_motor.position` motor tacho counts in a program that allows the robot to navigate from its home base to the satellite rendezvous.", "_____no_output_____" ], [ "*Your design notes here.*", "_____no_output_____" ] ], [ [ "# YOUR CODE HERE", "_____no_output_____" ] ], [ [ "*Your notes and observations here.*", "_____no_output_____" ], [ "## 2.2 Challenge – Reaching the moon base", "_____no_output_____" ], [ "In the following code cell, write a program to move the simulated robot from its location servicing the satellite to the moon base identified as the circular area marked on the moon in the top right-hand corner of the simulated world.\n\nIn the simulator, set the robot’s *x* location to `1250` and *y* location to `450`.\n\nUse the following code cell to write your own dead-reckoning program to drive the robot to the moon base at location `(2150, 950)`.", "_____no_output_____" ] ], [ [ "%%sim_magic_preloaded\n\n# YOUR CODE HERE\n", "_____no_output_____" ] ], [ [ "## 2.3 Dead reckoning with noise\n\nThe robot traverses its path using timing information for dead reckoning. In principle, if the simulated robot had a map then it could calculate all the distances and directions for itself, convert these to times, and dead reckon its way to the target. However, there is a problem with dead reckoning: *noise*.\n\nIn many physical systems, a perfect intended behaviour is subject to *noise* – random perturbations that arise within the system as time goes on as a side effect of its operation. In a robot, noise might arise in the behaviour of the motors, the transmission or the wheels. The result is that the robot does not execute its motion without error. We can model noise effects in the mobility system of our robot by adding a small amount of noise to the motor speeds as the simulator runs. This noise component may speed up or slow down the speed of each motor, in a random way. As with real systems, the noise represents slight random deviations from the theoretical, ideal behaviour.\n\nFor the following experiment, create a new, empty background cleared of pen traces.", "_____no_output_____" ] ], [ [ "%sim_magic -b Empty_Map --clear", "_____no_output_____" ] ], [ [ "Run the following code cell to download the program to the simulator using an empty background (select the *Empty_Map*) and the *Pen Down* mode selected. Also reset the initial location of the robot to an *x* value of `150` and *y* value of `400`.\n\nRun the program in the simulator and observe what happens.", "_____no_output_____" ] ], [ [ "%%sim_magic_preloaded -b Empty_Map -p -x 150 -y 400 -r Small_Robot --noisecontrols\n\n\ntank_drive.on_for_rotations(SpeedPercent(30),\n SpeedPercent(30), 10)", "_____no_output_____" ] ], [ [ "*Record your observations here describing what happens when you run the program.*", "_____no_output_____" ], [ "When you run the program, you should see the robot drive forwards a short way in a straight line, leaving a straight line trail behind it.\n\nReset the location of the robot. Within the simulator, use the *Noise controls* to increase the *Wheel noise* value from zero by dragging the slider to the right a little way. Alternatively, add noise in the range `0...500` using the `--motornoise / -M` magic flag. 
\n\nRun the program in the simulator again.\n\nYou should notice this time that the robot does not travel in a straight line. Instead, it drifts from side to side, although possibly to one side of the line.\n\nMove the robot back to the start position, or rerun the previous code cell to do so, and run the program in the simulator again. This time, you should see it follows yet another different path.\n\nDepending on how severe the noise setting is, the robot will travel closer (low noise) to the original straight line, or follow an ever-more erratic path (high noise).", "_____no_output_____" ], [ "*Record your own notes and observations here describing the behaviour of the robot for different levels of motor noise.*", "_____no_output_____" ], [ "Clear the pen traces from the simulator by running the following line magic:", "_____no_output_____" ] ], [ [ "%sim_magic -C", "_____no_output_____" ] ], [ [ "Now run the original satellite-finding dead-reckoning program again, using the *FLL_2018_Into_Orbit* background, but in the presence of *Wheel noise*. How well does it perform this time compared to previously?", "_____no_output_____" ] ], [ [ "%%sim_magic_preloaded -b FLL_2018_Into_Orbit -p -r Small_Robot\n\n# Turn on the spot to the right\ntank_turn.on_for_rotations(100, SpeedPercent(70), 1.7 )\n\n# Go forwards\ntank_drive.on_for_rotations(SpeedPercent(30), SpeedPercent(30), 20)\n\n# Slight graceful turn to left\ntank_drive.on_for_rotations(SpeedPercent(35), SpeedPercent(50), 8.5)\n\n# Turn on the spot to the left\ntank_turn.on_for_rotations(-100, SpeedPercent(75), 0.8)\n\n# Forwards a bit\ntank_drive.on_for_rotations(SpeedPercent(30), SpeedPercent(30), 2.0)\n\n# Turn on the spot a bit more to the right\ntank_turn.on_for_rotations(100, SpeedPercent(60), 0.4 )\n\n# Go forwards a bit more and dock on the satellite\ntank_drive.on_for_rotations(SpeedPercent(30), SpeedPercent(30), 1.0)\n\nsay(\"Did I avoid crashing and dock with the satellite?\")", "_____no_output_____" ] ], [ [ "Reset the robot to its original location and run the program in the simulator again. Even with the same level of motor noise as on the previous run, how does the path followed by the robot this time compare with the previous run?", "_____no_output_____" ], [ "*Add your own notes and observations here.*", "_____no_output_____" ], [ "## 2.4 Summary\n\nIn this notebook, you have seen how we can use dead reckoning to move the robot along a specified path. Using the robot’s motor speeds and by monitoring how long the motors are switched on for, we can use distance–time calculations to estimate the robot’s path. If we add in accurate measurements regarding how far we want the robot to travel, and in what direction, this provides one way of helping the robot to navigate to a particular waypoint.\n\nHowever, in the presence of noise, this approach is likely to be very unreliable: whilst the robot may think it is following one path, as determined by how long it has turned its motors on, and at what speed, it may in fact be following another path. In a real robot, the noise may be introduced in all sorts of ways, including from friction in the motor bearings, the time taken to accelerate from a standing start and get up to speed, and loss of traction effects such as wheel spin and slip as the robot’s wheels turn. \n\nWhilst in some cases it may reach the target safely, in others it may end somewhere completely different, or encounter an obstacle along the way.\n\n<!-- JD: should we say what's coming up in the next notebook? 
-->", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ] ]
d025923f45524e3a2695221cb7ebf18bf34ec741
1,712
ipynb
Jupyter Notebook
DateTime.ipynb
helloprasanna/python
1f218ddf84bc082dca5906833238389011ae344b
[ "MIT" ]
null
null
null
DateTime.ipynb
helloprasanna/python
1f218ddf84bc082dca5906833238389011ae344b
[ "MIT" ]
null
null
null
DateTime.ipynb
helloprasanna/python
1f218ddf84bc082dca5906833238389011ae344b
[ "MIT" ]
null
null
null
20.380952
84
0.486565
[ [ [ "import datetime\ndob = '20110101'\n\n\ntoday = datetime.datetime.now()\nyyyy = int(dob[0:4])\nmm = int(dob[4:6])\ndd = int(dob[6:8])\ndob = datetime.datetime(yyyy,mm,dd)\nage_in_days = (today - dob).days\nage_in_years = age_in_days/365\nprint(int(age_in_years))\n\n", "7\n" ], [ "# Formatt date\ndob1 = datetime.date(2011,1,1)\nprint(dob1)\n\nfrm = 'year {:%A, %B, %d, %Y}'\nprint(dob1.strftime('%A, %B, %d, %Y')) # old method\nprint(frm.format(dob1))\n\ndob3 = datetime.datetime.strptime('Jun 1 2005 1:33PM', '%b %d %Y %I:%M%p')\nprint(dob3)", "2011-01-01\nSaturday, January, 01, 2011\nyear Saturday, January, 01, 2011\n2005-06-01 13:33:00\n" ] ] ]
[ "code" ]
[ [ "code", "code" ] ]
d025a2826feadb0833b80edfb314c69b919604f4
7,787
ipynb
Jupyter Notebook
database_updater.ipynb
tlkh/reverse-image-search
df672ec5576916fd3c78c28ed6b2b28f68feed0a
[ "MIT" ]
null
null
null
database_updater.ipynb
tlkh/reverse-image-search
df672ec5576916fd3c78c28ed6b2b28f68feed0a
[ "MIT" ]
null
null
null
database_updater.ipynb
tlkh/reverse-image-search
df672ec5576916fd3c78c28ed6b2b28f68feed0a
[ "MIT" ]
null
null
null
28.010791
390
0.546038
[ [ [ "## What this code does\nIn short, it is a reverse meme search, that identifies the source of the meme. It takes an image copypasta, extracts the individual *subimages* and compares it with a database of pictures (the database should be made up of copypastas, which is in TODO)\n\n### TODO\n\n### Clean up the code\nThere are many repetitive import statements. <br\\>\nThe code is saving the picture as file so that you can load it into model. <br\\>\nAnything that you cannot explain in this code <br\\>\nChange VGG16 to Xception (because I can't upgrade both TF and keras for reasons)\n\n#### Feature vector robustness check\nTo what extent the following transformations affects the feature vector?\n- crop (a little, add bounding boxes)\n- photoshop - e.g. cropping a face onto a body\n- rotate the image (a little, a lot)\n- add text (different sizes)\n- vandalised - scribbling markers over \n- add noise (Gaussian etc)\n- compression changes\n- recoloring - grey-scale\n- picture effects - e.g. twisted picture meme\n- special effects - e.g. shining eyes meme\n\n#### Image separation testing\nWe need to ensure the individual pictures are separated correctly.\n- pictures now don't have borders\n- pictures are no longer rectangular\n- whether does it identify the source of the cropped face\n\n#### Database management\nWe need to do preprocessing of the database. Currently the feature vector is only calculated when you start this notebook. \n\nMoreover, since the database of copypasta will not be single images, we need to process that aspect as well. From the copypastas we need to identify its subimages and then calculate their feature vector. There also needs to be some way to associate the feature vector and the location of the subimages to the image copypasta, together with its metadata - in a manner that is scalable.", "_____no_output_____" ], [ "### import imagenet model", "_____no_output_____" ] ], [ [ "%run image_database_helper.ipynb", "Using TensorFlow backend.\n" ], [ "model = init_model()", "_____no_output_____" ] ], [ [ "### making a list of all the files", "_____no_output_____" ] ], [ [ "!rm 'imgs/.DS_Store'", "rm: imgs/.DS_Store: No such file or directory\r\n" ], [ "images = findfiles(\"new/\")\nprint(len(images))", "24\n" ] ], [ [ "### Processing pictures", "_____no_output_____" ] ], [ [ "from PIL import Image\nfrom matplotlib.pyplot import imshow\nimport matplotlib.pyplot as plt\nimport cv2", "_____no_output_____" ], [ "import csv\n\nfieldnames = ['img_file_name',\n 'number_of_subimages',\n 'subimage_number',\n 'x',\n 'y',\n 'w',\n 'h',\n 'feature_vector']\n\nimport os\nif not os.path.exists('index_subimage.csv'):\n with open('index_subimage.csv', 'w') as csvfile:\n db = csv.DictWriter(csvfile, fieldnames=fieldnames)\n db.writeheader()", "_____no_output_____" ], [ "import subprocess\nimport csv\n\n\nfor img_name in images:\n path_image_to_analyse = \"./new/\"+img_name\n print(img_name)\n \n img = cv2.imread(path_image_to_analyse)\n output_boxes = get_bounding_boxes(img)\n\n for i, box in enumerate(output_boxes):\n [x,y,w,h] = box\n output_img = np.array(img[y:y+h, x:x+w])\n cv2.imwrite(\"temp.jpg\",output_img)\n feature_vector = calc_feature_vector(model, \"temp.jpg\")\n\n dict_to_write = {'img_file_name':img_name,\n 'number_of_subimages':len(output_boxes),\n 'subimage_number':i,\n 'x':x,\n 'y':y,\n 'w':w,\n 'h':h,\n 'feature_vector':feature_vector}\n\n with open('index_subimage.csv', 'a') as csvfile:\n db = csv.DictWriter(csvfile, fieldnames=fieldnames)\n 
db.writerow(dict_to_write)\n \n subprocess.run(\"mv ./new/{} ./database/{}\".format(img_name,img_name),shell=True)", "our-singapore-malaysian-durian-fake-768x791.jpg\ndog-meat-satay-post.jpg\nScreen Shot 2018-04-02 at 05.19.52 AM.png\nScreen-Shot-2017-10-12-at-5.33.37-PM.png\nge14fakenews.png\npm-lee-gst-meme.jpg\ncasket-1-768x1366.jpg\n17626490_10155249769684175_8841223787064818741_n.jpg\ncity-harvest-church-headline-changed.jpg\n17951666_1758817064340674_3343590308811084675_n-768x1365.jpg\nsexdoll-768x603.jpg\npmo-lky-hoax.jpg\nBear-hoax-singapore-768x432.jpg\nnotice-sammyboy-e1520150158500-768x620.jpg\nntucfairpricewarning.jpg\nIMG_7EF6C6D23BCD-1-768x1366.jpg\nntuc27e_2x.jpg\nflyer-breast-cancer-yishun-fake.jpg\nPunggo_Waterway_Terraces_roof_top_floors_collapse_1-768x801.jpg\ncross-with-phone-fine-fake-768x930.jpg\nfake copy.jpg\nScreen-Shot-2018-02-09-at-10.26.30.png\nASS-Prakash-2-768x787.jpg\nnude-2.jpg\n" ], [ "!cp ./database/* ./new/", "_____no_output_____" ] ] ]
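A hedged sketch of the query side this index enables: scan `index_subimage.csv` and rank stored subimages by cosine similarity to a query feature vector. How `calc_feature_vector` serialised the vectors is an assumption here (a list-like string parsed with `ast.literal_eval`).

```python
# Sketch only: nearest-neighbour lookup over the CSV index built above.
# The feature_vector parsing is an assumption about its stored format.
import ast
import csv
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query_vector, index_path='index_subimage.csv'):
    best_name, best_score = None, -1.0
    with open(index_path) as csvfile:
        for row in csv.DictReader(csvfile):
            stored = np.array(ast.literal_eval(row['feature_vector']), dtype=float)
            score = cosine_similarity(query_vector, stored)
            if score > best_score:
                best_name, best_score = row['img_file_name'], score
    return best_name, best_score
```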
[ "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ] ]
d025a2be6548d4737fd8e906e064257dd5d55f6c
944,975
ipynb
Jupyter Notebook
3_Inference.ipynb
mohamed11981198/udacity-CVND-Image-Captioning
d7022de998e95ae3a93c7883aae6729f0f5de8fa
[ "MIT" ]
1
2020-05-16T05:08:30.000Z
2020-05-16T05:08:30.000Z
3_Inference.ipynb
mohamed11981198/udacity-CVND-Image-Captioning
d7022de998e95ae3a93c7883aae6729f0f5de8fa
[ "MIT" ]
null
null
null
3_Inference.ipynb
mohamed11981198/udacity-CVND-Image-Captioning
d7022de998e95ae3a93c7883aae6729f0f5de8fa
[ "MIT" ]
null
null
null
1,702.657658
284,276
0.960467
[ [ [ "# Computer Vision Nanodegree\n\n## Project: Image Captioning\n\n---\n\nIn this notebook, you will use your trained model to generate captions for images in the test dataset.\n\nThis notebook **will be graded**. \n\nFeel free to use the links below to navigate the notebook:\n- [Step 1](#step1): Get Data Loader for Test Dataset \n- [Step 2](#step2): Load Trained Models\n- [Step 3](#step3): Finish the Sampler\n- [Step 4](#step4): Clean up Captions\n- [Step 5](#step5): Generate Predictions!", "_____no_output_____" ], [ "<a id='step1'></a>\n## Step 1: Get Data Loader for Test Dataset\n\nBefore running the code cell below, define the transform in `transform_test` that you would like to use to pre-process the test images. \n\nMake sure that the transform that you define here agrees with the transform that you used to pre-process the training images (in **2_Training.ipynb**). For instance, if you normalized the training images, you should also apply the same normalization procedure to the test images.", "_____no_output_____" ] ], [ [ "import sys\nsys.path.append('/opt/cocoapi/PythonAPI')\nfrom pycocotools.coco import COCO\nfrom data_loader import get_loader\nfrom torchvision import transforms\n\n# TODO #1: Define a transform to pre-process the testing images.\ntransform_test = transforms.Compose([ \n transforms.Resize(256), # smaller edge of image resized to 256\n transforms.RandomCrop(224), # get 224x224 crop from random location\n transforms.RandomHorizontalFlip(), # horizontally flip image with probability=0.5\n transforms.ToTensor(), # convert the PIL Image to a tensor\n transforms.Normalize((0.485, 0.456, 0.406), # normalize image for pre-trained model\n (0.229, 0.224, 0.225))])\n\n#-#-#-# Do NOT modify the code below this line. #-#-#-#\n\n# Create the data loader.\ndata_loader = get_loader(transform=transform_test, \n mode='test')", "Vocabulary successfully loaded from vocab.pkl file!\n" ] ], [ [ "Run the code cell below to visualize an example test image, before pre-processing is applied.", "_____no_output_____" ] ], [ [ "import numpy as np\nimport matplotlib.pyplot as plt\n%matplotlib inline\n\n# Obtain sample image before and after pre-processing.\norig_image, image = next(iter(data_loader))\n\n# Visualize sample image, before pre-processing.\nplt.imshow(np.squeeze(orig_image))\nplt.title('example image')\nplt.show()", "_____no_output_____" ] ], [ [ "<a id='step2'></a>\n## Step 2: Load Trained Models\n\nIn the next code cell we define a `device` that you will use move PyTorch tensors to GPU (if CUDA is available). Run this code cell before continuing.", "_____no_output_____" ] ], [ [ "import torch\n\ndevice = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")", "_____no_output_____" ] ], [ [ "Before running the code cell below, complete the following tasks.\n\n### Task #1\n\nIn the next code cell, you will load the trained encoder and decoder from the previous notebook (**2_Training.ipynb**). To accomplish this, you must specify the names of the saved encoder and decoder files in the `models/` folder (e.g., these names should be `encoder-5.pkl` and `decoder-5.pkl`, if you trained the model for 5 epochs and saved the weights after each epoch). 
\n\n### Task #2\n\nPlug in both the embedding size and the size of the hidden layer of the decoder corresponding to the selected pickle file in `decoder_file`.", "_____no_output_____" ] ], [ [ "# Watch for any changes in model.py, and re-load it automatically.\n% load_ext autoreload\n% autoreload 2\n\nimport os\nimport torch\nfrom model import EncoderCNN, DecoderRNN\n\n# TODO #2: Specify the saved models to load.\nencoder_file = \"encoder-1.pkl\" \ndecoder_file = \"decoder-1.pkl\"\n\n# TODO #3: Select appropriate values for the Python variables below.\nembed_size = 256 #512 #300\nhidden_size = 512\n\n# The size of the vocabulary.\nvocab_size = len(data_loader.dataset.vocab)\n\n# Initialize the encoder and decoder, and set each to inference mode.\nencoder = EncoderCNN(embed_size)\nencoder.eval()\ndecoder = DecoderRNN(embed_size, hidden_size, vocab_size)\ndecoder.eval()\n\n# Load the trained weights.\nencoder.load_state_dict(torch.load(os.path.join('./models', encoder_file)))\ndecoder.load_state_dict(torch.load(os.path.join('./models', decoder_file)))\n\n# Move models to GPU if CUDA is available.\nencoder.to(device)\ndecoder.to(device)", "Downloading: \"https://download.pytorch.org/models/resnet50-19c8e357.pth\" to /root/.torch/models/resnet50-19c8e357.pth\n100%|██████████| 102502400/102502400 [00:01<00:00, 58944421.27it/s]\n" ] ], [ [ "<a id='step3'></a>\n## Step 3: Finish the Sampler\n\nBefore executing the next code cell, you must write the `sample` method in the `DecoderRNN` class in **model.py**. This method should accept as input a PyTorch tensor `features` containing the embedded input features corresponding to a single image.\n\nIt should return as output a Python list `output`, indicating the predicted sentence. `output[i]` is a nonnegative integer that identifies the predicted `i`-th token in the sentence. The correspondence between integers and tokens can be explored by examining either `data_loader.dataset.vocab.word2idx` (or `data_loader.dataset.vocab.idx2word`).\n\nAfter implementing the `sample` method, run the code cell below. If the cell returns an assertion error, then please follow the instructions to modify your code before proceeding. Do **not** modify the code in the cell below. ", "_____no_output_____" ] ], [ [ "# Move image Pytorch Tensor to GPU if CUDA is available.\nimage = image.to(device)\n\n# Obtain the embedded image features.\nfeatures = encoder(image).unsqueeze(1)\n\n# Pass the embedded image features through the model to get a predicted caption.\noutput = decoder.sample(features)\nprint('example output:', output)\n\nassert (type(output)==list), \"Output needs to be a Python list\" \nassert all([type(x)==int for x in output]), \"Output should be a list of integers.\" \nassert all([x in data_loader.dataset.vocab.idx2word for x in output]), \"Each entry in the output needs to correspond to an integer that indicates a token in the vocabulary.\"", "example output: [0, 3, 2436, 170, 77, 3, 204, 21, 3, 769, 77, 32, 297, 18, 1, 1, 18, 1, 1, 18]\n" ] ], [ [ "<a id='step4'></a>\n## Step 4: Clean up the Captions\n\nIn the code cell below, complete the `clean_sentence` function. It should take a list of integers (corresponding to the variable `output` in **Step 3**) as input and return the corresponding predicted sentence (as a single Python string). 
", "_____no_output_____" ] ], [ [ "# TODO #4: Complete the function.\ndef clean_sentence(output):\n \n seperator = \" \"\n word_list = [];\n for word_index in output:\n if word_index not in [0,2]: # 0: '<start>', 1: '<end>', 2: '<unk>', 18: '.'\n if word_index == 1:\n break\n word = data_loader.dataset.vocab.idx2word[word_index]\n word_list.append(word)\n \n sentence = seperator.join(word_list) \n return sentence ", "_____no_output_____" ] ], [ [ "After completing the `clean_sentence` function above, run the code cell below. If the cell returns an assertion error, then please follow the instructions to modify your code before proceeding.", "_____no_output_____" ] ], [ [ "sentence = clean_sentence(output)\nprint('example sentence:', sentence)\n\nassert type(sentence)==str, 'Sentence needs to be a Python string!'", "example sentence: a giraffe standing in a field with a tree in the background .\n" ] ], [ [ "<a id='step5'></a>\n## Step 5: Generate Predictions!\n\nIn the code cell below, we have written a function (`get_prediction`) that you can use to use to loop over images in the test dataset and print your model's predicted caption.", "_____no_output_____" ] ], [ [ "def get_prediction():\n orig_image, image = next(iter(data_loader))\n plt.imshow(np.squeeze(orig_image))\n plt.title('Sample Image')\n plt.show()\n image = image.to(device)\n features = encoder(image).unsqueeze(1)\n output = decoder.sample(features) \n sentence = clean_sentence(output)\n print(sentence)", "_____no_output_____" ] ], [ [ "Run the code cell below (multiple times, if you like!) to test how this function works.", "_____no_output_____" ] ], [ [ "get_prediction()", "_____no_output_____" ] ], [ [ "As the last task in this project, you will loop over the images until you find four image-caption pairs of interest:\n- Two should include image-caption pairs that show instances when the model performed well.\n- Two should highlight image-caption pairs that highlight instances where the model did not perform well.\n\nUse the four code cells below to complete this task.", "_____no_output_____" ], [ "### The model performed well!\n\nUse the next two code cells to loop over captions. Save the notebook when you encounter two images with relatively accurate captions.", "_____no_output_____" ] ], [ [ "get_prediction()", "_____no_output_____" ], [ "get_prediction()", "_____no_output_____" ] ], [ [ "### The model could have performed better ...\n\nUse the next two code cells to loop over captions. Save the notebook when you encounter two images with relatively inaccurate captions.", "_____no_output_____" ] ], [ [ "get_prediction()", "_____no_output_____" ], [ "get_prediction()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ] ]
d025c7be048c5717b19eefc1aeb8ae1933f29fba
34,894
ipynb
Jupyter Notebook
1_microglia_segmentation/OGD_3_full_segmentation_pipeline-Copy1.ipynb
Nance-Lab/microFIBER
5433b7d045a7f0e1611edc354c34e5963a25bb07
[ "MIT" ]
null
null
null
1_microglia_segmentation/OGD_3_full_segmentation_pipeline-Copy1.ipynb
Nance-Lab/microFIBER
5433b7d045a7f0e1611edc354c34e5963a25bb07
[ "MIT" ]
null
null
null
1_microglia_segmentation/OGD_3_full_segmentation_pipeline-Copy1.ipynb
Nance-Lab/microFIBER
5433b7d045a7f0e1611edc354c34e5963a25bb07
[ "MIT" ]
null
null
null
36.047521
133
0.464492
[ [ [ "# Purpose: To run the full segmentation using the best scored method from 2_compare_auto_to_manual_threshold", "_____no_output_____" ], [ "Date Created: January 7, 2022", "_____no_output_____" ], [ "Dates Edited: January 26, 2022 - changed the ogd severity study to be the otsu data as the yen data did not run on all samples.", "_____no_output_____" ], [ "*Step 1: Import Necessary Packages*", "_____no_output_____" ] ], [ [ "# import major packages\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport skimage\nimport PIL as Image\nimport os\nimport pandas as pd\n\n# import specific package functions\nfrom skimage.filters import threshold_otsu\nfrom skimage import morphology\nfrom scipy import ndimage\nfrom skimage.measure import label\nfrom skimage import io\nfrom skimage import measure", "_____no_output_____" ] ], [ [ "__OGD Severity Study__", "_____no_output_____" ] ], [ [ "im_folder_location = '/Users/hhelmbre/Desktop/ogd_severity_undergrad/10_4_21_redownload/'", "_____no_output_____" ], [ "im_paths = []\nfiles = []\nfor file in os.listdir(im_folder_location):\n if file.endswith(\".tif\"):\n file_name = os.path.join(im_folder_location, file)\n files.append(file)\n im_paths.append(file_name)", "_____no_output_____" ], [ "files", "_____no_output_____" ], [ "properties_list = ('area', 'bbox_area', 'centroid', 'convex_area', \n 'eccentricity', 'equivalent_diameter', 'euler_number', \n 'extent', 'filled_area', 'major_axis_length', \n 'minor_axis_length', 'orientation', 'perimeter', 'solidity')", "_____no_output_____" ], [ "source_dir = '/Users/hhelmbre/Desktop/microfiber/ogd_severity_segmentations/'", "_____no_output_____" ], [ "j = 0\nfor image in im_paths:\n \n short_im_name = image.rsplit('/', 1)\n short_im_name = short_im_name[1]\n \n im = io.imread(image)\n microglia_im = im[:,:,1]\n #otsu threshold\n thresh_otsu = skimage.filters.threshold_otsu(microglia_im)\n binary_otsu = microglia_im > thresh_otsu\n new_binary_otsu = morphology.remove_small_objects(binary_otsu, min_size=71)\n new_binary_otsu = ndimage.binary_fill_holes(new_binary_otsu)\n label_image = measure.label(new_binary_otsu)\n props = measure.regionprops_table(label_image, properties=(properties_list))\n \n np.save(str(source_dir + short_im_name[:-4] + '_otsu_thresh'), new_binary_otsu)\n\n if j == 0:\n df = pd.DataFrame(props)\n df['filename'] = image\n else:\n df2 = pd.DataFrame(props)\n df2['filename'] = image\n df = df.append(df2)\n\n j = 1\n \n ", "_____no_output_____" ], [ "df['circularity'] = 4*np.pi*df.area/df.perimeter**2", "_____no_output_____" ], [ "df['aspect_ratio'] = df.major_axis_length/df.minor_axis_length", "_____no_output_____" ], [ "df", "_____no_output_____" ], [ "df.to_csv('/Users/hhelmbre/Desktop/microfiber/ogd_severity_study_features_otsu.csv' )", "_____no_output_____" ], [ "%load_ext watermark\n\n%watermark -v -m -p numpy,pandas,scipy,skimage,matplotlib,wget\n\n%watermark -u -n -t -z", "Python implementation: CPython\nPython version : 3.7.4\nIPython version : 7.8.0\n\nnumpy : 1.21.5\npandas : 1.3.5\nscipy : 1.3.1\nskimage : 0.17.2\nmatplotlib: 3.1.1\nwget : 3.2\n\nCompiler : Clang 4.0.1 (tags/RELEASE_401/final)\nOS : Darwin\nRelease : 20.6.0\nMachine : x86_64\nProcessor : i386\nCPU cores : 8\nArchitecture: 64bit\n\nLast updated: Wed Jan 26 2022 14:39:49PST\n\n" ] ] ]
[ "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d025db7c935b64b2e7be0da162fada983a3a2d17
869,948
ipynb
Jupyter Notebook
renderer/rendergan_tf.ipynb
jjbits/RenderGAN
bf02f6d6fdb271599e2bf698fe920c704ce8c97a
[ "MIT" ]
2
2020-08-26T22:39:45.000Z
2021-05-02T11:52:08.000Z
renderer/rendergan_tf.ipynb
jjbits/RenderGAN
bf02f6d6fdb271599e2bf698fe920c704ce8c97a
[ "MIT" ]
null
null
null
renderer/rendergan_tf.ipynb
jjbits/RenderGAN
bf02f6d6fdb271599e2bf698fe920c704ce8c97a
[ "MIT" ]
null
null
null
920.579894
254,888
0.949411
[ [ [ "# TensorFlow pix2pix implementation", "_____no_output_____" ], [ "from __future__ import absolute_import, division, print_function, unicode_literals\n\ntry:\n # %tensorflow_version only exists in Colab.\n %tensorflow_version 2.x\nexcept Exception:\n pass\nimport tensorflow as tf\n\nimport os\nimport time\n\nfrom matplotlib import pyplot as plt\nfrom IPython import display", "_____no_output_____" ], [ "print(\"Num GPUs Available: \", len(tf.config.experimental.list_physical_devices('GPU')))", "Num GPUs Available: 0\n" ], [ "PATH = \"/Volumes/Data/projects/cs230/Project/RenderGAN/pix2pix/data/train_data/10-10000/AB/\"\n\nBUFFER_SIZE = 400\nBATCH_SIZE = 1\nIMG_WIDTH = 256\nIMG_HEIGHT = 256", "_____no_output_____" ], [ "def load(image_file):\n image = tf.io.read_file(image_file)\n image = tf.image.decode_png(image)\n\n w = tf.shape(image)[1]\n\n w = w // 2\n real_image = image[:, w:, :]\n input_image = image[:, :w, :]\n\n input_image = tf.cast(input_image, tf.float32)\n real_image = tf.cast(real_image, tf.float32)\n\n return input_image, real_image", "_____no_output_____" ], [ "inp, re = load(PATH+'train/8.png')\n# casting to int for matplotlib to show the image\nplt.figure()\nplt.imshow(inp/255.0)\nplt.figure()\nplt.imshow(re/255.0)", "_____no_output_____" ], [ "def resize(input_image, real_image, height, width):\n input_image = tf.image.resize(input_image, [height, width],\n method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)\n real_image = tf.image.resize(real_image, [height, width],\n method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)\n\n return input_image, real_image\n\ndef random_crop(input_image, real_image):\n stacked_image = tf.stack([input_image, real_image], axis=0)\n cropped_image = tf.image.random_crop(\n stacked_image, size=[2, IMG_HEIGHT, IMG_WIDTH, 3])\n\n return cropped_image[0], cropped_image[1]\n\n# normalizing the images to [-1, 1]\n\ndef normalize(input_image, real_image):\n input_image = (input_image / 127.5) - 1\n real_image = (real_image / 127.5) - 1\n\n return input_image, real_image\n\n@tf.function()\ndef random_jitter(input_image, real_image):\n # resizing to 286 x 286 x 3\n input_image, real_image = resize(input_image, real_image, 286, 286)\n\n # randomly cropping to 256 x 256 x 3\n input_image, real_image = random_crop(input_image, real_image)\n\n if tf.random.uniform(()) > 0.5:\n # random mirroring\n input_image = tf.image.flip_left_right(input_image)\n real_image = tf.image.flip_left_right(real_image)\n\n return input_image, real_image\n", "_____no_output_____" ], [ "plt.figure(figsize=(6, 6))\nfor i in range(4):\n rj_inp, rj_re = random_jitter(inp, re)\n plt.subplot(2, 2, i+1)\n plt.imshow(rj_inp/255.0)\n plt.axis('off')\nplt.show()", "_____no_output_____" ], [ "def load_image_train(image_file):\n input_image, real_image = load(image_file)\n input_image, real_image = random_jitter(input_image, real_image)\n input_image, real_image = normalize(input_image, real_image)\n\n return input_image, real_image", "_____no_output_____" ], [ "def load_image_test(image_file):\n input_image, real_image = load(image_file)\n input_image, real_image = resize(input_image, real_image,\n IMG_HEIGHT, IMG_WIDTH)\n input_image, real_image = normalize(input_image, real_image)\n\n return input_image, real_image", "_____no_output_____" ], [ "train_dataset = tf.data.Dataset.list_files(PATH+'train/*.png')\ntrain_dataset = train_dataset.map(load_image_train,\n num_parallel_calls=tf.data.experimental.AUTOTUNE)\ntrain_dataset = train_dataset.shuffle(BUFFER_SIZE)\ntrain_dataset = 
train_dataset.batch(BATCH_SIZE)", "_____no_output_____" ], [ "test_dataset = tf.data.Dataset.list_files(PATH+'test/*.png')\ntest_dataset = test_dataset.map(load_image_test)\ntest_dataset = test_dataset.batch(BATCH_SIZE)", "_____no_output_____" ], [ "OUTPUT_CHANNELS = 3", "_____no_output_____" ], [ "def downsample(filters, size, apply_batchnorm=True):\n initializer = tf.random_normal_initializer(0., 0.02)\n\n result = tf.keras.Sequential()\n result.add(\n tf.keras.layers.Conv2D(filters, size, strides=2, padding='same',\n kernel_initializer=initializer, use_bias=False))\n\n if apply_batchnorm:\n result.add(tf.keras.layers.BatchNormalization())\n\n result.add(tf.keras.layers.LeakyReLU())\n\n return result", "_____no_output_____" ], [ "down_model = downsample(3, 4)\ndown_result = down_model(tf.expand_dims(inp, 0))\nprint (down_result.shape)", "(1, 256, 256, 3)\n" ], [ "def upsample(filters, size, apply_dropout=False):\n initializer = tf.random_normal_initializer(0., 0.02)\n\n result = tf.keras.Sequential()\n result.add(\n tf.keras.layers.Conv2DTranspose(filters, size, strides=2,\n padding='same',\n kernel_initializer=initializer,\n use_bias=False))\n\n result.add(tf.keras.layers.BatchNormalization())\n\n if apply_dropout:\n result.add(tf.keras.layers.Dropout(0.5))\n\n result.add(tf.keras.layers.ReLU())\n\n return result", "_____no_output_____" ], [ "up_model = upsample(3, 4)\nup_result = up_model(down_result)\nprint (up_result.shape)", "(1, 512, 512, 3)\n" ], [ "def Generator():\n inputs = tf.keras.layers.Input(shape=[256,256,3])\n\n down_stack = [\n downsample(64, 4, apply_batchnorm=False), # (bs, 128, 128, 64)\n downsample(128, 4), # (bs, 64, 64, 128)\n downsample(256, 4), # (bs, 32, 32, 256)\n downsample(512, 4), # (bs, 16, 16, 512)\n downsample(512, 4), # (bs, 8, 8, 512)\n downsample(512, 4), # (bs, 4, 4, 512)\n downsample(512, 4), # (bs, 2, 2, 512)\n downsample(512, 4), # (bs, 1, 1, 512)\n ]\n\n up_stack = [\n upsample(512, 4, apply_dropout=True), # (bs, 2, 2, 1024)\n upsample(512, 4, apply_dropout=True), # (bs, 4, 4, 1024)\n upsample(512, 4, apply_dropout=True), # (bs, 8, 8, 1024)\n upsample(512, 4), # (bs, 16, 16, 1024)\n upsample(256, 4), # (bs, 32, 32, 512)\n upsample(128, 4), # (bs, 64, 64, 256)\n upsample(64, 4), # (bs, 128, 128, 128)\n ]\n\n initializer = tf.random_normal_initializer(0., 0.02)\n last = tf.keras.layers.Conv2DTranspose(OUTPUT_CHANNELS, 4,\n strides=2,\n padding='same',\n kernel_initializer=initializer,\n activation='tanh') # (bs, 256, 256, 3)\n\n x = inputs\n\n # Downsampling through the model\n skips = []\n for down in down_stack:\n x = down(x)\n skips.append(x)\n\n skips = reversed(skips[:-1])\n\n # Upsampling and establishing the skip connections\n for up, skip in zip(up_stack, skips):\n x = up(x)\n x = tf.keras.layers.Concatenate()([x, skip])\n\n x = last(x)\n\n return tf.keras.Model(inputs=inputs, outputs=x)", "_____no_output_____" ], [ "generator = Generator()\ntf.keras.utils.plot_model(generator, show_shapes=True, dpi=64)", "_____no_output_____" ], [ "gen_output = generator(inp[tf.newaxis,...], training=False)\nplt.imshow(gen_output[0,...])", "Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).\n" ], [ "LAMBDA = 100", "_____no_output_____" ], [ "def generator_loss(disc_generated_output, gen_output, target):\n gan_loss = loss_object(tf.ones_like(disc_generated_output), disc_generated_output)\n\n # mean absolute error\n l1_loss = tf.reduce_mean(tf.abs(target - gen_output))\n\n total_gen_loss = gan_loss + 
(LAMBDA * l1_loss)\n\n return total_gen_loss, gan_loss, l1_loss", "_____no_output_____" ], [ "def Discriminator():\n initializer = tf.random_normal_initializer(0., 0.02)\n\n inp = tf.keras.layers.Input(shape=[256, 256, 3], name='input_image')\n tar = tf.keras.layers.Input(shape=[256, 256, 3], name='target_image')\n\n x = tf.keras.layers.concatenate([inp, tar]) # (bs, 256, 256, channels*2)\n\n down1 = downsample(64, 4, False)(x) # (bs, 128, 128, 64)\n down2 = downsample(128, 4)(down1) # (bs, 64, 64, 128)\n down3 = downsample(256, 4)(down2) # (bs, 32, 32, 256)\n\n zero_pad1 = tf.keras.layers.ZeroPadding2D()(down3) # (bs, 34, 34, 256)\n conv = tf.keras.layers.Conv2D(512, 4, strides=1,\n kernel_initializer=initializer,\n use_bias=False)(zero_pad1) # (bs, 31, 31, 512)\n\n batchnorm1 = tf.keras.layers.BatchNormalization()(conv)\n\n leaky_relu = tf.keras.layers.LeakyReLU()(batchnorm1)\n\n zero_pad2 = tf.keras.layers.ZeroPadding2D()(leaky_relu) # (bs, 33, 33, 512)\n\n last = tf.keras.layers.Conv2D(1, 4, strides=1,\n kernel_initializer=initializer)(zero_pad2) # (bs, 30, 30, 1)\n\n return tf.keras.Model(inputs=[inp, tar], outputs=last)", "_____no_output_____" ], [ "discriminator = Discriminator()\ntf.keras.utils.plot_model(discriminator, show_shapes=True, dpi=64)", "_____no_output_____" ], [ "disc_out = discriminator([inp[tf.newaxis,...], gen_output], training=False)\nplt.imshow(disc_out[0,...,-1], vmin=-20, vmax=20, cmap='RdBu_r')\nplt.colorbar()", "_____no_output_____" ], [ "loss_object = tf.keras.losses.BinaryCrossentropy(from_logits=True)", "_____no_output_____" ], [ "def discriminator_loss(disc_real_output, disc_generated_output):\n real_loss = loss_object(tf.ones_like(disc_real_output), disc_real_output)\n\n generated_loss = loss_object(tf.zeros_like(disc_generated_output), disc_generated_output)\n\n total_disc_loss = real_loss + generated_loss\n\n return total_disc_loss", "_____no_output_____" ], [ "generator_optimizer = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)\ndiscriminator_optimizer = tf.keras.optimizers.Adam(2e-4, beta_1=0.5)", "_____no_output_____" ], [ "checkpoint_dir = './training_checkpoints'\ncheckpoint_prefix = os.path.join(checkpoint_dir, \"ckpt\")\ncheckpoint = tf.train.Checkpoint(generator_optimizer=generator_optimizer,\n discriminator_optimizer=discriminator_optimizer,\n generator=generator,\n discriminator=discriminator)", "_____no_output_____" ], [ "def generate_images(model, test_input, tar):\n prediction = model(test_input, training=True)\n plt.figure(figsize=(15,15))\n\n display_list = [test_input[0], tar[0], prediction[0]]\n title = ['Input Image', 'Ground Truth', 'Predicted Image']\n\n for i in range(3):\n plt.subplot(1, 3, i+1)\n plt.title(title[i])\n # getting the pixel values between [0, 1] to plot it.\n plt.imshow(display_list[i] * 0.5 + 0.5)\n plt.axis('off')\n plt.show()", "_____no_output_____" ], [ "for example_input, example_target in test_dataset.take(1):\n generate_images(generator, example_input, example_target)", "_____no_output_____" ], [ "EPOCHS = 10", "_____no_output_____" ], [ "import datetime\nlog_dir=\"logs/\"\n\nsummary_writer = tf.summary.create_file_writer(\n log_dir + \"fit/\" + datetime.datetime.now().strftime(\"%Y%m%d-%H%M%S\"))", "_____no_output_____" ], [ "@tf.function\ndef train_step(input_image, target, epoch):\n with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:\n gen_output = generator(input_image, training=True)\n\n disc_real_output = discriminator([input_image, target], training=True)\n disc_generated_output = 
discriminator([input_image, gen_output], training=True)\n\n gen_total_loss, gen_gan_loss, gen_l1_loss = generator_loss(disc_generated_output, gen_output, target)\n disc_loss = discriminator_loss(disc_real_output, disc_generated_output)\n\n generator_gradients = gen_tape.gradient(gen_total_loss,\n generator.trainable_variables)\n discriminator_gradients = disc_tape.gradient(disc_loss,\n discriminator.trainable_variables)\n\n generator_optimizer.apply_gradients(zip(generator_gradients,\n generator.trainable_variables))\n discriminator_optimizer.apply_gradients(zip(discriminator_gradients,\n discriminator.trainable_variables))\n\n with summary_writer.as_default():\n tf.summary.scalar('gen_total_loss', gen_total_loss, step=epoch)\n tf.summary.scalar('gen_gan_loss', gen_gan_loss, step=epoch)\n tf.summary.scalar('gen_l1_loss', gen_l1_loss, step=epoch)\n tf.summary.scalar('disc_loss', disc_loss, step=epoch)", "_____no_output_____" ], [ "def fit(train_ds, epochs, test_ds):\n for epoch in range(epochs):\n start = time.time()\n\n display.clear_output(wait=True)\n\n for example_input, example_target in test_ds.take(1):\n generate_images(generator, example_input, example_target)\n print(\"Epoch: \", epoch)\n\n # Train\n for n, (input_image, target) in train_ds.enumerate():\n print('.', end='')\n if (n+1) % 100 == 0:\n print()\n train_step(input_image, target, epoch)\n print()\n\n # saving (checkpoint) the model every 20 epochs\n if (epoch + 1) % 20 == 0:\n checkpoint.save(file_prefix = checkpoint_prefix)\n\n print ('Time taken for epoch {} is {} sec\\n'.format(epoch + 1,\n time.time()-start))\n checkpoint.save(file_prefix = checkpoint_prefix)", "_____no_output_____" ], [ "%load_ext tensorboard\n%tensorboard --logdir {log_dir}", "The tensorboard extension is already loaded. To reload it, use:\n %reload_ext tensorboard\n" ], [ "fit(train_dataset, EPOCHS, test_dataset)", "_____no_output_____" ], [ "!ls {checkpoint_dir}", "ls: ./training_checkpoints: No such file or directory\r\n" ] ] ]
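After `fit()` has saved weights, a hedged restore-and-sample step might look like this; `tf.train.latest_checkpoint` picks the newest file under `checkpoint_dir`, and the other names come from the cells above.

```python
# Sketch: reload the most recent checkpoint saved above, then generate
# on a few test batches; assumes training has produced checkpoints.
checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir))

for example_input, example_target in test_dataset.take(5):
    generate_images(generator, example_input, example_target)
```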
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d025dda576ceb330c702d13822e033439bec3b9d
8,374
ipynb
Jupyter Notebook
stats-279/SLU19 - Workflow/Exercise notebook.ipynb
hershaw/stats-279
4bcfaaace35563f3d17efcfa398da2b1d4ecc732
[ "MIT" ]
12
2019-07-06T09:06:17.000Z
2020-11-13T00:58:42.000Z
data-science-101/SLU19 - Workflow/Exercise notebook.ipynb
DareData/data-science-101
5ef71321dffe3b9b51c0d8c171c7b4a0550ecd3a
[ "MIT" ]
29
2019-07-01T14:19:49.000Z
2021-03-24T13:29:50.000Z
data-science-101/SLU19 - Workflow/Exercise notebook.ipynb
DareData/data-science-101
5ef71321dffe3b9b51c0d8c171c7b4a0550ecd3a
[ "MIT" ]
36
2019-07-05T15:53:35.000Z
2021-07-04T04:18:02.000Z
26.087227
226
0.566754
[ [ [ "# Basic Workflow", "_____no_output_____" ] ], [ [ "# Always have your imports at the top\nimport pandas as pd\nfrom sklearn.pipeline import make_pipeline\nfrom sklearn.impute import SimpleImputer\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.base import TransformerMixin\n\nfrom hashlib import sha1 # just for grading purposes\nimport json # just for grading purposes\n\ndef _hash(obj, salt='none'):\n if type(obj) is not str:\n obj = json.dumps(obj)\n to_encode = obj + salt\n return sha1(to_encode.encode()).hexdigest()", "_____no_output_____" ] ], [ [ "# Workflow steps\n\nWhat are the basic workflow steps?\n\nIt's incredibly obvious what the steps are since you can see them graded in plain text. However we deem it worth actually making you type each one of the steps and take a moment to think about it and internalize them.\n\nPlease do actually type them rather than just copy-pasting as fast as you can. Type it out character by character and internalize.", "_____no_output_____" ] ], [ [ "# step_1 = ...\n# step_2 = ...\n# step_2_a = ...\n# step_2_b = ...\n# step_2_c = ...\n# step_2_d = ...\n# step_3 = ...\n# step_4 = ...\n# step_5 = ...\n\n# YOUR CODE HERE\nraise NotImplementedError()\n", "_____no_output_____" ], [ "### BEGIN TESTS\nassert step_1 == 'Get the data'\nassert step_2 == 'Data analysis and preparation'\nassert step_2_a == 'Data analysis'\nassert step_2_b == 'Dealing with data problems'\nassert step_2_c == 'Feature engineering'\nassert step_2_d == 'Feature selection'\nassert step_3 == 'Train model'\nassert step_4 == 'Evaluate results'\nassert step_5 == 'Iterate'\n### END TESTS", "_____no_output_____" ] ], [ [ "# Specific workflow questions\n\nHere are some more specific questions about individual workflow steps.", "_____no_output_____" ] ], [ [ "# True or False, it's super easy to gather your dataset in a production environment\n# real_world_dataset_gathering_easy = ...\n\n# True or False, it's super easy to gather your dataset in the context of the academy\n# academy_dataset_gathering_easy = ...\n\n# True or False, you should try as hard as you can to get the best possible score\n# on your test set by iterating until you can't get your test set score any higher\n# by any means possible\n# test_set_optimization_is_good = ...\n\n# True or False, you should choose one metric by which to evaluate your model and\n# never consider using another one\n# one_metric_should_rule_them_all = ...\n\n# YOUR CODE HERE\nraise NotImplementedError()", "_____no_output_____" ], [ "### BEGIN TESTS\nassert _hash(real_world_dataset_gathering_easy, 'salt1') == '63b5b9a8f2d359e1fc175c3b01b907ef87590484'\nassert _hash(academy_dataset_gathering_easy, 'salt2') == 'dd7dee495a153c95d28c7aa95289c0415242f5d8'\nassert _hash(test_set_optimization_is_good, 'salt3') == 'f24a294afb4a09f7f9df9ee13eb18e7d341c439d'\nassert _hash(one_metric_should_rule_them_all, 'salt4') == '2360691a582e4f0fbefa238ab6ced1cbfbfe8a50'\n### END TESTS", "_____no_output_____" ] ], [ [ "# scikit pipelines\n\nMake a simple pipeline that\n1. Drops all columns that start with the string `evil`\n1. Fills all nulls with the median", "_____no_output_____" ] ], [ [ "# Create a pipeline step called RemoveEvilColumns the removed any\n# column whose name starts with the string 'evil'\n\n# YOUR CODE HERE\nraise NotImplementedError()\n\n\n# Create an pipeline using make_pipeline\n# 1. removes evil columns\n# 2. imputes with the mean\n# 3. 
has a random forest classifier as the last step\n\n# YOUR CODE HERE\nraise NotImplementedError()", "_____no_output_____" ], [ "X = pd.DataFrame({\n 'evil_1': ['a'] * 100,\n 'evil_2': ['b'] * 100,\n 'not_so_evil': list(range(0, 100))\n})\ny = pd.Series([x % 2 for x in range(0, 100)])\n\npipeline.fit(X, y)\n\n### BEGIN TESTS\nassert pipeline.steps[0][0] == 'removeevilcolumns', pipeline.steps[0][0]\nassert pipeline.steps[1][0] == 'simpleimputer', pipeline.steps[1][0]\nassert pipeline.steps[2][0] == 'randomforestclassifier', pipeline.steps[2][0]\n### END TESTS", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ] ]
d025e50a95d00b4f5b87ae15a2a3d037f37db54c
8,327
ipynb
Jupyter Notebook
src/data/geoPandas_int.ipynb
dansmi-hub/mosquito-jsdm
a4787a9deae3afeb43030db3c1497aa880d34b1d
[ "BSD-3-Clause" ]
null
null
null
src/data/geoPandas_int.ipynb
dansmi-hub/mosquito-jsdm
a4787a9deae3afeb43030db3c1497aa880d34b1d
[ "BSD-3-Clause" ]
null
null
null
src/data/geoPandas_int.ipynb
dansmi-hub/mosquito-jsdm
a4787a9deae3afeb43030db3c1497aa880d34b1d
[ "BSD-3-Clause" ]
null
null
null
53.722581
3,937
0.416116
[ [ [ "import geopandas", "_____no_output_____" ], [ "help(geopandas)", "Help on package geopandas:\n\nNAME\n geopandas\n\nPACKAGE CONTENTS\n _compat\n _config\n _vectorized\n _version\n array\n base\n conftest\n datasets (package)\n geodataframe\n geoseries\n io (package)\n plotting\n sindex\n testing\n tests (package)\n tools (package)\n\nDATA\n options = Options(\n display_precision: None [default: Non...USE_PYGEO...\n\nVERSION\n 0.9.0\n\nFILE\n /home/daniel/.local/lib/python3.9/site-packages/geopandas/__init__.py\n\n\n" ], [ "mosquito_points = geopandas.read_file(\"../../data/interim/mosquito_points.shp\")\n", "_____no_output_____" ], [ "mosquito_points", "_____no_output_____" ], [ "mosquito_points.plot", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code" ] ]
d026089d8b8dcf8bd48f11093f51d39a08dd01f1
61,272
ipynb
Jupyter Notebook
Assignment_3.ipynb
Sarbajit097/Assignment
0594e55f49eea0e72706170f5456fa878282e069
[ "Apache-2.0" ]
null
null
null
Assignment_3.ipynb
Sarbajit097/Assignment
0594e55f49eea0e72706170f5456fa878282e069
[ "Apache-2.0" ]
null
null
null
Assignment_3.ipynb
Sarbajit097/Assignment
0594e55f49eea0e72706170f5456fa878282e069
[ "Apache-2.0" ]
null
null
null
33.084233
231
0.265276
[ [ [ "<a href=\"https://colab.research.google.com/github/Sarbajit097/Assignment/blob/main/Assignment_3.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ] ], [ [ "import pandas as pd\npath=\"https://raw.githubusercontent.com/Sarbajit097/Assignment/main/Toyota.csv\"\ndata =pd.read_csv(path)\ndata ", "_____no_output_____" ], [ "type(data)", "_____no_output_____" ], [ "data.shape", "_____no_output_____" ], [ "data.info()", "<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 1436 entries, 0 to 1435\nData columns (total 11 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 Unnamed: 0 1436 non-null int64 \n 1 Price 1436 non-null int64 \n 2 Age 1336 non-null float64\n 3 KM 1436 non-null object \n 4 FuelType 1336 non-null object \n 5 HP 1436 non-null object \n 6 MetColor 1286 non-null float64\n 7 Automatic 1436 non-null int64 \n 8 CC 1436 non-null int64 \n 9 Doors 1436 non-null object \n 10 Weight 1436 non-null int64 \ndtypes: float64(2), int64(5), object(4)\nmemory usage: 123.5+ KB\n" ], [ "data.index", "_____no_output_____" ], [ "data.columns", "_____no_output_____" ], [ "data.head()", "_____no_output_____" ], [ "data.tail()", "_____no_output_____" ], [ "data.head(5)", "_____no_output_____" ], [ "data[['Price',\"Age\"]].head(10)", "_____no_output_____" ], [ "data.isnull().sum()", "_____no_output_____" ], [ "data.dropna(inplace=True)\ndata.isnull().sum()", "_____no_output_____" ], [ "data.shape", "_____no_output_____" ], [ "data.head(10)", "_____no_output_____" ], [ "data['MetColor'].mean()", "_____no_output_____" ], [ "data['MetColor'].head()", "_____no_output_____" ], [ "import numpy as np\ndata['MetColor'].replace(np.NaN,data['MetColor'].mean()).head()", "_____no_output_____" ], [ "data.head(10)", "_____no_output_____" ], [ "data['CC'].mean()", "_____no_output_____" ], [ "data['CC'].head()", "_____no_output_____" ], [ "data[['Age',\"KM\"]].head(20)", "_____no_output_____" ], [ "", "_____no_output_____" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0260cab1b0e583b5bf98590f3e1771ea82c46e9
8,283
ipynb
Jupyter Notebook
examples/menpo.model.linear.PCAModel.ipynb
ikassi/menpo
ca702fc814a1ad50b27c44c6544ba364d3aa7e31
[ "BSD-3-Clause" ]
null
null
null
examples/menpo.model.linear.PCAModel.ipynb
ikassi/menpo
ca702fc814a1ad50b27c44c6544ba364d3aa7e31
[ "BSD-3-Clause" ]
null
null
null
examples/menpo.model.linear.PCAModel.ipynb
ikassi/menpo
ca702fc814a1ad50b27c44c6544ba364d3aa7e31
[ "BSD-3-Clause" ]
1
2021-04-14T12:09:00.000Z
2021-04-14T12:09:00.000Z
25.1
340
0.527707
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d026177d8ac1d8b5ba74224a126d1fb7c95f3023
12,926
ipynb
Jupyter Notebook
Loop_Statement.ipynb
kathleenmei/CPEN-21A-ECE-2-1
5f0437e6322f1f988819075bf2ff89267eb96a56
[ "Apache-2.0" ]
null
null
null
Loop_Statement.ipynb
kathleenmei/CPEN-21A-ECE-2-1
5f0437e6322f1f988819075bf2ff89267eb96a56
[ "Apache-2.0" ]
null
null
null
Loop_Statement.ipynb
kathleenmei/CPEN-21A-ECE-2-1
5f0437e6322f1f988819075bf2ff89267eb96a56
[ "Apache-2.0" ]
null
null
null
22.020443
239
0.360978
[ [ [ "<a href=\"https://colab.research.google.com/github/kathleenmei/CPEN-21A-ECE-2-1/blob/main/Loop_Statement.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ], [ "##For Loop\n", "_____no_output_____" ] ], [ [ "week = [\"Sunday\",\"Monday\",\"Tuesday\",\"Wednesday\",\"Thursday\",\"Friday\",\"Saturday\"]\nfor x in week:\n print(x)", "Sunday\nMonday\nTuesday\nWednesday\nThursday\nFriday\nSaturday\n" ] ], [ [ "The Break Statement", "_____no_output_____" ] ], [ [ "week = [\"Sunday\",\"Monday\",\"Tuesday\",\"Wednesday\",\"Thursday\",\"Friday\",\"Saturday\"]\nfor x in week:\n print(x)\n if x==\"Thursday\":\n break\n ", "Sunday\nMonday\nTuesday\nWednesday\nThursday\n" ], [ "week = [\"Sunday\",\"Monday\",\"Tuesday\",\"Wednesday\",\"Thursday\",\"Friday\",\"Saturday\"]\nfor x in week:\n if x==\"Thursday\":\n break\n print(x)", "Sunday\nMonday\nTuesday\nWednesday\n" ] ], [ [ "Looping through string", "_____no_output_____" ] ], [ [ "for x in \"Python Programming\":\n print(x)", "P\ny\nt\nh\no\nn\n \nP\nr\no\ng\nr\na\nm\nm\ni\nn\ng\n" ] ], [ [ "The range () function", "_____no_output_____" ] ], [ [ "for x in range(16):\n print(x)", "0\n1\n2\n3\n4\n5\n6\n7\n8\n9\n10\n11\n12\n13\n14\n15\n" ], [ "for x in range(2,16):\n print(x)", "2\n3\n4\n5\n6\n7\n8\n9\n10\n11\n12\n13\n14\n15\n" ] ], [ [ "Nested Loops", "_____no_output_____" ] ], [ [ "adjective=[\"red\",\"big\",\"tasty\"]\nfruits = [\"apple\",\"banana\",\"cherry\"]\nfor x in adjective:\n for y in fruits:\n print(x,y)", "red apple\nred banana\nred cherry\nbig apple\nbig banana\nbig cherry\ntasty apple\ntasty banana\ntasty cherry\n" ] ], [ [ "##While Loop", "_____no_output_____" ] ], [ [ "i=1\nwhile i<=6:\n print(i)\n i+=1 #Assignment operator for addition", "1\n2\n3\n4\n5\n6\n" ] ], [ [ "The break statement", "_____no_output_____" ] ], [ [ "i=1\nwhile i<6:\n print(i)\n if i==3:\n break\n i+=1", "1\n2\n3\n" ] ], [ [ "The continue statement", "_____no_output_____" ] ], [ [ "i = 0\nwhile i<6:\n i+=1 #Assignment operator for addition\n if i==3:\n continue\n print(i)", "1\n2\n4\n5\n6\n" ] ], [ [ "The else statement", "_____no_output_____" ] ], [ [ "i = 1\nwhile i<=6:\n print(i)\n i+=1\nelse:\n print(\"i is no longer less than 6\")", "1\n2\n3\n4\n5\n6\ni is no longer less than 6\n" ] ], [ [ "Application 1", "_____no_output_____" ] ], [ [ "#Create a Python program that displays Hello 0 to Hello 10 in vertical sequence\n\nhello=[\"Hello\"]\nnum=[\"0\",\"1\",\"2\",\"3\",\"4\",\"5\",\"6\",\"7\",\"8\",\"9\",\"10\"]\n\n#for loop\nfor x in hello:\n for y in num:\n print(x,y)\n\n#while loop\ni=0\nwhile i<=10:\n print(\"Hello\",i)\n i+=1 #Assignment operator to increment i", "Hello 0\nHello 1\nHello 2\nHello 3\nHello 4\nHello 5\nHello 6\nHello 7\nHello 8\nHello 9\nHello 10\n" ] ], [ [ "Application 2", "_____no_output_____" ] ], [ [ "#Create a Python program that displays integers less than 10 but not less than 3\n\ni = 0\nwhile i<10:\n i+=1 #Assignment operator to increment i\n if i<3:\n continue\n if i==10:\n break\n print(i)", "3\n4\n5\n6\n7\n8\n9\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d026333d7ddf09a0df79969f3bcc890b2555c138
45,737
ipynb
Jupyter Notebook
codes/labs_lecture07/lab01_mlp/.ipynb_checkpoints/mlp_exercise-checkpoint.ipynb
wesleyjtann/Deep-learning-course-CE7454-2018
ec29057f5fd741359b99392ae08b6574c8d4882a
[ "MIT" ]
2
2019-11-11T08:37:14.000Z
2021-02-16T02:21:57.000Z
codes/labs_lecture07/lab01_mlp/mlp_exercise.ipynb
wesleyjtann/Deep-learning-course-CE7454-2018
ec29057f5fd741359b99392ae08b6574c8d4882a
[ "MIT" ]
null
null
null
codes/labs_lecture07/lab01_mlp/mlp_exercise.ipynb
wesleyjtann/Deep-learning-course-CE7454-2018
ec29057f5fd741359b99392ae08b6574c8d4882a
[ "MIT" ]
null
null
null
155.568027
33,776
0.897851
[ [ [ "# Lab 11: MLP -- exercise\n\n# Understanding the training loop ", "_____no_output_____" ] ], [ [ "import torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.optim as optim\nfrom random import randint\nimport utils", "_____no_output_____" ] ], [ [ "### Download the data and print the sizes", "_____no_output_____" ] ], [ [ "train_data=torch.load('../data/fashion-mnist/train_data.pt')\n\nprint(train_data.size())", "torch.Size([60000, 28, 28])\n" ], [ "train_label=torch.load('../data/fashion-mnist/train_label.pt')\n\nprint(train_label.size())", "torch.Size([60000])\n" ], [ "test_data=torch.load('../data/fashion-mnist/test_data.pt')\n\nprint(test_data.size())", "torch.Size([10000, 28, 28])\n" ] ], [ [ "### Make a ONE layer net class. The network output are the scores! No softmax needed! You have only one line to write in the forward function", "_____no_output_____" ] ], [ [ "class one_layer_net(nn.Module):\n\n def __init__(self, input_size, output_size):\n super(one_layer_net , self).__init__()\n self.linear_layer = nn.Linear(input_size, output_size, bias=False)# complete here\n \n def forward(self, x):\n scores = self.linear_layer(x) # complete here\n return scores", "_____no_output_____" ] ], [ [ "### Build the net", "_____no_output_____" ] ], [ [ "net= one_layer_net(784,10)# complete here\nprint(net)", "one_layer_net(\n (linear_layer): Linear(in_features=784, out_features=10, bias=False)\n)\n" ] ], [ [ "### Choose the criterion and the optimizer: use the CHEAT SHEET to see the correct syntax. \n\n### Remember that the optimizer need to have access to the parameters of the network (net.parameters()).\n\n### Set the batchize and learning rate to be:\n### batchize = 50\n### learning rate = 0.01\n\n\n\n\n\n", "_____no_output_____" ] ], [ [ "# make the criterion\ncriterion = nn.CrossEntropyLoss()# complete here\n\n# make the SGD optimizer. \noptimizer=torch.optim.SGD(net.parameters(), lr=0.01) #complete here )\n\n# set up the batch size \nbs=50", "_____no_output_____" ] ], [ [ "### Complete the training loop", "_____no_output_____" ] ], [ [ "for iter in range(1,5000):\n \n # Set dL/dU, dL/dV, dL/dW to be filled with zeros\n optimizer.zero_grad()\n\n # create a minibatch\n indices = torch.LongTensor(bs).random_(0,60000)\n minibatch_data = train_data[indices]\n minibatch_label = train_label[indices]\n\n # reshape the minibatch\n inputs = minibatch_data.view(bs, 784)\n\n # tell Pytorch to start tracking all operations that will be done on \"inputs\"\n inputs.requires_grad_()\n \n # forward the minibatch through the net \n scores = net(inputs)\n\n # Compute the average of the losses of the data points in the minibatch\n loss = criterion(scores, minibatch_label)\n\n # backward pass to compute dL/dU, dL/dV and dL/dW \n loss.backward()\n\n # do one step of stochastic gradient descent: U=U-lr(dL/dU), V=V-lr(dL/dU), ...\n optimizer.step()", "_____no_output_____" ] ], [ [ "### Choose image at random from the test set and see how good/bad are the predictions", "_____no_output_____" ] ], [ [ "# choose a picture at random\nidx=randint(0, 10000-1)\nim=test_data[idx]\n\n# diplay the picture\nutils.show(im)\n\n# feed it to the net and display the confidence scores\nscores = net( im.view(1,784)) \nprobs= F.softmax(scores, dim=1)\nutils.show_prob_fashion_mnist(probs)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d026451c642a17bd4a921731ea60d7a5d4102ed9
18,613
ipynb
Jupyter Notebook
docs/source/examples/rossmann/tensorflow.ipynb
lgardenhire/NVTabular
225352cd2f95d6409c630b655f36f1a3e859ede7
[ "Apache-2.0" ]
null
null
null
docs/source/examples/rossmann/tensorflow.ipynb
lgardenhire/NVTabular
225352cd2f95d6409c630b655f36f1a3e859ede7
[ "Apache-2.0" ]
null
null
null
docs/source/examples/rossmann/tensorflow.ipynb
lgardenhire/NVTabular
225352cd2f95d6409c630b655f36f1a3e859ede7
[ "Apache-2.0" ]
null
null
null
36.424658
593
0.571643
[ [ [ "# Copyright 2020 NVIDIA Corporation. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================", "_____no_output_____" ] ], [ [ "<img src=\"http://developer.download.nvidia.com/compute/machine-learning/frameworks/nvidia_logo.png\" style=\"width: 90px; float: right;\">\n\n# NVTabular demo on Rossmann data - TensorFlow\n\n## Overview\n\nNVTabular is a feature engineering and preprocessing library for tabular data designed to quickly and easily manipulate terabyte scale datasets used to train deep learning based recommender systems. It provides a high level abstraction to simplify code and accelerates computation on the GPU using the RAPIDS cuDF library.", "_____no_output_____" ], [ "### Learning objectives\n\nIn the previous notebooks ([rossmann-store-sales-preproc.ipynb](https://github.com/NVIDIA/NVTabular/blob/main/examples/rossmann/rossmann-store-sales-preproc.ipynb) and [rossmann-store-sales-feature-engineering.ipynb](https://github.com/NVIDIA/NVTabular/blob/main/examples/rossmann/rossmann-store-sales-feature-engineering.ipynb)), we downloaded, preprocessed and created features for the dataset. Now, we are ready to train our deep learning model on the dataset. In this notebook, we use **TensorFlow** with the NVTabular data loader for TensorFlow to accelereate the training pipeline.", "_____no_output_____" ] ], [ [ "import os\nimport math\nimport json\nimport nvtabular as nvt\nimport glob", "_____no_output_____" ] ], [ [ "## Loading NVTabular workflow\nThis time, we only need to define our data directories. We can load the data schema from the NVTabular workflow.", "_____no_output_____" ] ], [ [ "DATA_DIR = os.environ.get(\"OUTPUT_DATA_DIR\", \"./data\")\nINPUT_DATA_DIR = os.environ.get(\"INPUT_DATA_DIR\", \"./data\")\nPREPROCESS_DIR = os.path.join(INPUT_DATA_DIR, 'ross_pre/')\nPREPROCESS_DIR_TRAIN = os.path.join(PREPROCESS_DIR, 'train')\nPREPROCESS_DIR_VALID = os.path.join(PREPROCESS_DIR, 'valid')", "_____no_output_____" ] ], [ [ "What files are available to train on in our directories?", "_____no_output_____" ] ], [ [ "!ls $PREPROCESS_DIR", "stats.json train valid\r\n" ], [ "!ls $PREPROCESS_DIR_TRAIN", "0.274616a5186f4ae6b9688cad2a1ac1e5.parquet _metadata\r\n_file_list.txt\t\t\t\t _metadata.json\r\n" ], [ "!ls $PREPROCESS_DIR_VALID", "_metadata part.0.parquet\n" ] ], [ [ "We load the data schema and statistic information from `stats.json`. 
We created the file in the previous notebook `rossmann-store-sales-feature-engineering`.", "_____no_output_____" ] ], [ [ "stats = json.load(open(PREPROCESS_DIR + \"/stats.json\", \"r\"))", "_____no_output_____" ], [ "CATEGORICAL_COLUMNS = stats['CATEGORICAL_COLUMNS']\nCONTINUOUS_COLUMNS = stats['CONTINUOUS_COLUMNS']\nLABEL_COLUMNS = stats['LABEL_COLUMNS']\nCOLUMNS = CATEGORICAL_COLUMNS + CONTINUOUS_COLUMNS + LABEL_COLUMNS", "_____no_output_____" ] ], [ [ "The embedding table shows the cardinality of each categorical variable along with its associated embedding size. Each entry is of the form `(cardinality, embedding_size)`.", "_____no_output_____" ] ], [ [ "EMBEDDING_TABLE_SHAPES = stats['EMBEDDING_TABLE_SHAPES']\nEMBEDDING_TABLE_SHAPES", "_____no_output_____" ] ], [ [ "## Training a Network\n\nNow that our data is preprocessed and saved out, we can leverage `dataset`s to read through the preprocessed parquet files in an online fashion to train neural networks.\n\nWe'll start by setting some universal hyperparameters for our model and optimizer. These settings will be the same across all of the frameworks that we explore in the different notebooks.", "_____no_output_____" ], [ "If you're interested in contributing to NVTabular, feel free to take this challenge on and submit a pull request if successful. 12% RMSPE is achievable using the Novograd optimizer, but we know of no Novograd implementation for TensorFlow that supports sparse gradients, and so we are not including that solution below.", "_____no_output_____" ] ], [ [ "EMBEDDING_DROPOUT_RATE = 0.04\nDROPOUT_RATES = [0.001, 0.01]\nHIDDEN_DIMS = [1000, 500]\nBATCH_SIZE = 65536\nLEARNING_RATE = 0.001\nEPOCHS = 25\n\n# TODO: Calculate on the fly rather than recalling from previous analysis.\nMAX_SALES_IN_TRAINING_SET = 38722.0\nMAX_LOG_SALES_PREDICTION = 1.2 * math.log(MAX_SALES_IN_TRAINING_SET + 1.0)\n\nTRAIN_PATHS = sorted(glob.glob(os.path.join(PREPROCESS_DIR_TRAIN, '*.parquet')))\nVALID_PATHS = sorted(glob.glob(os.path.join(PREPROCESS_DIR_VALID, '*.parquet')))", "_____no_output_____" ] ], [ [ "## TensorFlow\n<a id=\"TensorFlow\"></a>\n", "_____no_output_____" ], [ "### TensorFlow: Preparing Datasets\n\n`KerasSequenceLoader` wraps a lightweight iterator around a `dataset` object to handle chunking, shuffling, and application of any workflows (which can be applied online as a preprocessing step). 
For column names, can use either a list of string names or a list of TensorFlow `feature_columns` that will be used to feed the network", "_____no_output_____" ] ], [ [ "import tensorflow as tf\n\n# we can control how much memory to give tensorflow with this environment variable\n# IMPORTANT: make sure you do this before you initialize TF's runtime, otherwise\n# it's too late and TF will have claimed all free GPU memory\nos.environ['TF_MEMORY_ALLOCATION'] = \"8192\" # explicit MB\nos.environ['TF_MEMORY_ALLOCATION'] = \"0.5\" # fraction of free memory\nfrom nvtabular.loader.tensorflow import KerasSequenceLoader, KerasSequenceValidater\n\n# cheap wrapper to keep things some semblance of neat\ndef make_categorical_embedding_column(name, dictionary_size, embedding_dim):\n return tf.feature_column.embedding_column(\n tf.feature_column.categorical_column_with_identity(name, dictionary_size),\n embedding_dim\n )\n\n# instantiate our columns\ncategorical_columns = [\n make_categorical_embedding_column(name, *EMBEDDING_TABLE_SHAPES[name]) for\n name in CATEGORICAL_COLUMNS\n]\ncontinuous_columns = [\n tf.feature_column.numeric_column(name, (1,)) for name in CONTINUOUS_COLUMNS\n]\n\n# feed them to our datasets\ntrain_dataset = KerasSequenceLoader(\n TRAIN_PATHS, # you could also use a glob pattern\n feature_columns=categorical_columns+continuous_columns,\n batch_size=BATCH_SIZE,\n label_names=LABEL_COLUMNS,\n shuffle=True,\n buffer_size=0.06 # amount of data, as a fraction of GPU memory, to load at once\n)\n\nvalid_dataset = KerasSequenceLoader(\n VALID_PATHS, # you could also use a glob pattern\n feature_columns=categorical_columns+continuous_columns,\n batch_size=BATCH_SIZE*4,\n label_names=LABEL_COLUMNS,\n shuffle=False,\n buffer_size=0.06 # amount of data, as a fraction of GPU memory, to load at once\n)", "_____no_output_____" ] ], [ [ "### TensorFlow: Defining a Model\n\nUsing Keras, we can define the layers of our model and their parameters explicitly. 
Here, for the sake of consistency, we'll mimic fast.ai's [TabularModel](https://docs.fast.ai/tabular.learner.html).", "_____no_output_____" ] ], [ [ "# DenseFeatures layer needs a dictionary of {feature_name: input}\ncategorical_inputs = {}\nfor column_name in CATEGORICAL_COLUMNS:\n categorical_inputs[column_name] = tf.keras.Input(name=column_name, shape=(1,), dtype=tf.int64)\ncategorical_embedding_layer = tf.keras.layers.DenseFeatures(categorical_columns)\ncategorical_x = categorical_embedding_layer(categorical_inputs)\ncategorical_x = tf.keras.layers.Dropout(EMBEDDING_DROPOUT_RATE)(categorical_x)\n\n# Just concatenating continuous, so can use a list\ncontinuous_inputs = []\nfor column_name in CONTINUOUS_COLUMNS:\n continuous_inputs.append(tf.keras.Input(name=column_name, shape=(1,), dtype=tf.float32))\ncontinuous_embedding_layer = tf.keras.layers.Concatenate(axis=1)\ncontinuous_x = continuous_embedding_layer(continuous_inputs)\ncontinuous_x = tf.keras.layers.BatchNormalization(epsilon=1e-5, momentum=0.1)(continuous_x)\n\n# concatenate and build MLP\nx = tf.keras.layers.Concatenate(axis=1)([categorical_x, continuous_x])\nfor dim, dropout_rate in zip(HIDDEN_DIMS, DROPOUT_RATES):\n x = tf.keras.layers.Dense(dim, activation='relu')(x)\n x = tf.keras.layers.BatchNormalization(epsilon=1e-5, momentum=0.1)(x)\n x = tf.keras.layers.Dropout(dropout_rate)(x)\nx = tf.keras.layers.Dense(1, activation='linear')(x)\n\n# TODO: Initialize model weights to fix saturation issues.\n# For now, we'll just scale the output of our model directly before\n# hitting the sigmoid.\nx = 0.1 * x\n\nx = MAX_LOG_SALES_PREDICTION * tf.keras.activations.sigmoid(x)\n\n# combine all our inputs into a single list\n# (note that you can still use .fit, .predict, etc. on a dict\n# that maps input tensor names to input values)\ninputs = list(categorical_inputs.values()) + continuous_inputs\ntf_model = tf.keras.Model(inputs=inputs, outputs=x)", "_____no_output_____" ] ], [ [ "### TensorFlow: Training", "_____no_output_____" ] ], [ [ "def rmspe_tf(y_true, y_pred):\n # map back into \"true\" space by undoing transform\n y_true = tf.exp(y_true) - 1\n y_pred = tf.exp(y_pred) - 1\n\n percent_error = (y_true - y_pred) / y_true\n return tf.sqrt(tf.reduce_mean(percent_error**2))", "_____no_output_____" ], [ "%%time\nfrom time import time\n\noptimizer = tf.keras.optimizers.Adam(learning_rate=LEARNING_RATE)\ntf_model.compile(optimizer, 'mse', metrics=[rmspe_tf])\n\nvalidation_callback = KerasSequenceValidater(valid_dataset)\nstart = time()\nhistory = tf_model.fit(\n train_dataset,\n callbacks=[validation_callback],\n epochs=EPOCHS,\n)\nt_final = time() - start\ntotal_rows = train_dataset.num_rows_processed + valid_dataset.num_rows_processed\nprint(f\"run_time: {t_final} - rows: {total_rows} - epochs: {EPOCHS} - dl_thru: { (EPOCHS * total_rows) / t_final}\")", "Epoch 1/25\n13/13 [==============================] - 7s 168ms/step - loss: 6.3708 - rmspe_tf: 0.8916\nEpoch 2/25\n13/13 [==============================] - 2s 166ms/step - loss: 5.3491 - rmspe_tf: 0.8906\nEpoch 3/25\n13/13 [==============================] - 2s 168ms/step - loss: 4.7029 - rmspe_tf: 0.8801\nEpoch 4/25\n13/13 [==============================] - 2s 164ms/step - loss: 3.9542 - rmspe_tf: 0.8585\nEpoch 5/25\n13/13 [==============================] - 2s 168ms/step - loss: 3.0444 - rmspe_tf: 0.8195\nEpoch 6/25\n13/13 [==============================] - 2s 165ms/step - loss: 2.0530 - rmspe_tf: 0.7533\nEpoch 7/25\n13/13 [==============================] - 2s 167ms/step - loss: 
1.1581 - rmspe_tf: 0.6474\nEpoch 8/25\n13/13 [==============================] - 2s 166ms/step - loss: 0.5232 - rmspe_tf: 0.5006\nEpoch 9/25\n13/13 [==============================] - 2s 164ms/step - loss: 0.1878 - rmspe_tf: 0.3450\nEpoch 10/25\n13/13 [==============================] - 2s 164ms/step - loss: 0.0650 - rmspe_tf: 0.2355\nEpoch 11/25\n13/13 [==============================] - 2s 166ms/step - loss: 0.0372 - rmspe_tf: 0.2073\nEpoch 12/25\n13/13 [==============================] - 2s 166ms/step - loss: 0.0329 - rmspe_tf: 0.2094\nEpoch 13/25\n13/13 [==============================] - 2s 165ms/step - loss: 0.0317 - rmspe_tf: 0.2090\nEpoch 14/25\n13/13 [==============================] - 2s 168ms/step - loss: 0.0301 - rmspe_tf: 0.2035\nEpoch 15/25\n13/13 [==============================] - 2s 170ms/step - loss: 0.0292 - rmspe_tf: 0.1987\nEpoch 16/25\n13/13 [==============================] - 2s 168ms/step - loss: 0.0283 - rmspe_tf: 0.1952\nEpoch 17/25\n13/13 [==============================] - 2s 166ms/step - loss: 0.0276 - rmspe_tf: 0.1905\nEpoch 18/25\n13/13 [==============================] - 2s 165ms/step - loss: 0.0274 - rmspe_tf: 0.1877\nEpoch 19/25\n13/13 [==============================] - 2s 165ms/step - loss: 0.0270 - rmspe_tf: 0.1956\nEpoch 20/25\n13/13 [==============================] - 2s 165ms/step - loss: 0.0248 - rmspe_tf: 0.1789\nEpoch 21/25\n13/13 [==============================] - 2s 166ms/step - loss: 0.0244 - rmspe_tf: 0.1792\nEpoch 22/25\n13/13 [==============================] - 2s 167ms/step - loss: 0.0239 - rmspe_tf: 0.1785\nEpoch 23/25\n13/13 [==============================] - 2s 167ms/step - loss: 0.0234 - rmspe_tf: 0.1775\nEpoch 24/25\n13/13 [==============================] - 2s 166ms/step - loss: 0.0233 - rmspe_tf: 0.1747\nEpoch 25/25\n13/13 [==============================] - 2s 168ms/step - loss: 0.0228 - rmspe_tf: 0.1725\nCPU times: user 2min 52s, sys: 13 s, total: 3min 5s\nWall time: 1min 8s\n" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d0264ad9006e244f99aea97cc3dd62ada8e7e6f6
1,101
ipynb
Jupyter Notebook
hisim/inputs/loadprofiles/electrical-load_2-smart-appliances/process_data.ipynb
guptakaran55/HiSim
3574c2719b194b4e7d5ec68513d62af34c08ce85
[ "MIT" ]
12
2021-10-05T11:38:24.000Z
2022-03-25T09:56:08.000Z
hisim/inputs/loadprofiles/electrical-load_2-smart-appliances/process_data.ipynb
guptakaran55/HiSim
3574c2719b194b4e7d5ec68513d62af34c08ce85
[ "MIT" ]
6
2021-10-06T13:27:55.000Z
2022-03-10T12:55:15.000Z
hisim/inputs/loadprofiles/electrical-load_2-smart-appliances/process_data.ipynb
guptakaran55/HiSim
3574c2719b194b4e7d5ec68513d62af34c08ce85
[ "MIT" ]
4
2022-02-21T19:00:50.000Z
2022-03-22T11:01:38.000Z
18.982759
96
0.541326
[ [ [ "import os\nimport shutil\n\ndef raw2processed(filename):\n shutil.copy(os.path.join('data_raw',filename),os.path.join('data_processed',filename))", "_____no_output_____" ], [ "raw2processed('FlexibilityEvents.HH1.json')\nraw2processed('FlexibilityEvents2.HH1.json')", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code" ] ]
d0268595013b88d8e32177838e57b1bdf1531363
13,691
ipynb
Jupyter Notebook
Recommending System for Pictures - 4th place @ Yandex ML Competition.ipynb
dremovd/pictures-recommendation-yandex-ml-2019
339e9407a5718fbef74f9aa7986b51904c0e8d42
[ "MIT" ]
4
2019-06-11T17:34:32.000Z
2020-04-19T02:02:32.000Z
Recommending System for Pictures - 4th place @ Yandex ML Competition.ipynb
dremovd/pictures-recommendation-yandex-ml-2019
339e9407a5718fbef74f9aa7986b51904c0e8d42
[ "MIT" ]
null
null
null
Recommending System for Pictures - 4th place @ Yandex ML Competition.ipynb
dremovd/pictures-recommendation-yandex-ml-2019
339e9407a5718fbef74f9aa7986b51904c0e8d42
[ "MIT" ]
1
2020-11-23T08:48:28.000Z
2020-11-23T08:48:28.000Z
24.189046
138
0.523702
[ [ [ "## Main points\n* Solution should be reasonably simple because the contest is only 24 hours long \n* Metric is based on the prediction of clicked pictures one week ahead, so clicks are the most important information\n* More recent information is more important\n* Only pictures that were shown to a user could be clicked, so pictures popularity is important\n* Metric is MAPK@100\n* Link https://contest.yandex.ru/contest/12899/problems (Russian)", "_____no_output_____" ], [ "## Plan\n* Build a classic recommending system based on user click history\n* Only use recent days of historical data\n* Take into consideration projected picture popularity", "_____no_output_____" ], [ "## Magic constants\n### ALS recommending system:", "_____no_output_____" ] ], [ [ "# Factors for ALS\nfactors_count=100\n\n# Last days of click history used\ntrail_days=14 \n\n# number of best candidates generated by ALS \noutput_candidates_count=2000 \n\n# Last days of history with more weight\nlast_days=1\n\n# Coefficient for additional weight\nlast_days_weight=4", "_____no_output_____" ] ], [ [ "## Popular pictures prediction model:", "_____no_output_____" ] ], [ [ "import lightgbm\n\nlightgbm.__version__", "_____no_output_____" ], [ "popularity_model = lightgbm.LGBMRegressor(seed=0)\nheuristic_alpha = 0.2", "_____no_output_____" ], [ "import datetime\nimport tqdm\nimport pandas as pd\nfrom scipy.sparse import coo_matrix\n\nimport implicit\nimplicit.__version__", "_____no_output_____" ], [ "test_users = pd.read_csv('Blitz/test_users.csv')\ndata = pd.read_csv('Blitz/train_clicks.csv', parse_dates=['day'])", "_____no_output_____" ] ], [ [ "## Split last 7 days to calculate clicks similar to test set\n", "_____no_output_____" ] ], [ [ "train, target_week = (\n data[data.day <= datetime.datetime(2019, 3, 17)].copy(),\n data[data.day > datetime.datetime(2019, 3, 17)],\n)\ntrain.day.nunique(), target_week.day.nunique()", "_____no_output_____" ], [ "last_date = train.day.max()\ntrain.loc[:, 'delta_days'] = 1 + (last_date - train.day).apply(lambda d: d.days)\n\nlast_date = data.day.max()\ndata.loc[:, 'delta_days'] = 1 + (last_date - data.day).apply(lambda d: d.days)", "_____no_output_____" ], [ "def picture_features(data):\n \"\"\"Generating clicks count for every picture in last days\"\"\"\n days = range(1, 3)\n features = []\n names = []\n for delta_days in days:\n features.append(\n data[(data.delta_days == delta_days)].groupby(['picture_id'])['user_id'].count()\n )\n names.append('%s_%d' % ('click', delta_days))\n \n features = pd.concat(features, axis=1).fillna(0)\n features.columns = names\n features = features.reindex(data.picture_id.unique())\n return features.fillna(0)", "_____no_output_____" ], [ "X = picture_features(train)\nX.mean(axis=0)", "_____no_output_____" ], [ "def clicks_count(data, index):\n return data.groupby('picture_id')['user_id'].count().reindex(index).fillna(0)\n \ny = clicks_count(target_week, X.index)\ny.shape, y.mean()", "_____no_output_____" ] ], [ [ "## Train a model predicting popular pictures next week", "_____no_output_____" ] ], [ [ "popularity_model.fit(X, y)", "_____no_output_____" ], [ "X_test = picture_features(data)\nX_test.mean(axis=0)", "_____no_output_____" ], [ "X_test['p'] = popularity_model.predict(X_test)\nX_test.loc[X_test['p'] < 0, 'p'] = 0\nX_test['p'].mean()", "_____no_output_____" ] ], [ [ "## Generate dict with predicted clicks for every picture\n", "_____no_output_____" ] ], [ [ "# This prediction would be used to correct recommender score\npicture = 
dict(X_test['p'])", "_____no_output_____" ] ], [ [ "# Recommender part", "_____no_output_____" ], [ "## Generate prediction using ALS approach", "_____no_output_____" ] ], [ [ "import os\nos.environ['OPENBLAS_NUM_THREADS'] = \"1\"\n\ndef als_baseline(\n train, test_users, \n factors_n, last_days, trail_days, output_candidates_count, last_days_weight\n):\n train = train[train.delta_days <= trail_days].drop_duplicates([\n 'user_id', 'picture_id'\n ])\n \n users = train.user_id\n items = train.picture_id\n weights = 1 + last_days_weight * (train.delta_days <= last_days)\n \n user_item = coo_matrix((weights, (users, items)))\n model = implicit.als.AlternatingLeastSquares(factors=factors_n, iterations=factors_n)\n model.fit(user_item.T.tocsr())\n \n user_item_csr = user_item.tocsr()\n \n rows = []\n for user_id in tqdm.tqdm_notebook(test_users.user_id.values):\n items = [(picture_id, score) for picture_id, score in model.recommend(user_id, user_item_csr, N=output_candidates_count)]\n rows.append(items)\n\n test_users['predictions_full'] = [\n p\n for p, user_id in zip(\n rows,\n test_users.user_id.values\n )\n ]\n test_users['predictions'] = [\n [x[0] for x in p]\n for p, user_id in zip(\n rows,\n test_users.user_id.values\n )\n ]\n return test_users", "_____no_output_____" ], [ "test_users = als_baseline(\n data, test_users, factors_count, last_days, trail_days, output_candidates_count, last_days_weight)", "100%|██████████| 100.0/100 [11:00<00:00, 6.78s/it]\n" ] ], [ [ "## Calculate history clicks to exclude them from results. Such clicks are excluded from test set according to task", "_____no_output_____" ] ], [ [ "clicked = data.groupby('user_id').agg({'picture_id': set})\n\ndef substract_clicked(p, c):\n filtered = [picture for picture in p if picture not in c][:100]\n return filtered", "_____no_output_____" ] ], [ [ "## Heuristical approach to reweight ALS score according to picture predicted popularity", "_____no_output_____" ], [ "Recommender returns (picture, score) pairs sorted decreasing for every user.\n\nFor every user we replace picture $score_p$ with $score_p \\cdot (1 + popularity_{p})^{0.2}$\n\n$popularity_{p}$ - popularity predicted for this picture for next week\n\nThis slightly moves popular pictures to the top of list for every user", "_____no_output_____" ] ], [ [ "import math\n\nrows = test_users['predictions_full']\n\ndef correct_with_popularity(items, picture, alpha):\n return sorted([\n (score * (1 + picture.get(picture_id, 0)) ** alpha, picture_id, score, picture.get(picture_id, 0)) \n for picture_id, score in items], reverse=True\n )\n\ncorrected_rows = [\n [x[1] for x in correct_with_popularity(items, picture, heuristic_alpha)]\n for items in rows\n]", "_____no_output_____" ] ], [ [ "## Submission formatting", "_____no_output_____" ] ], [ [ "test_users['predictions'] = [\n ' '.join(map(str,\n substract_clicked(p, {} if user_id not in clicked.index else clicked.loc[user_id][0])\n ))\n for p, user_id in zip(\n corrected_rows,\n test_users.user_id.values\n )\n]", "_____no_output_____" ], [ "test_users[['user_id', 'predictions']].to_csv('submit.csv', index=False)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d02691eeb9161ccab8d5a3c0f7dd13c8a556aac7
544,241
ipynb
Jupyter Notebook
R_Advertisement_Prediction.ipynb
joymuli10/Advertising-Prediction-R
90fc8f21081166f4eb34701aacbe86a50b9dd2a9
[ "MIT" ]
null
null
null
R_Advertisement_Prediction.ipynb
joymuli10/Advertising-Prediction-R
90fc8f21081166f4eb34701aacbe86a50b9dd2a9
[ "MIT" ]
null
null
null
R_Advertisement_Prediction.ipynb
joymuli10/Advertising-Prediction-R
90fc8f21081166f4eb34701aacbe86a50b9dd2a9
[ "MIT" ]
null
null
null
168.756899
122,862
0.74921
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d026cb201d6586b1b323cff0ee2d9de997288ec4
4,795
ipynb
Jupyter Notebook
jupyter/.ipynb_checkpoints/Untitled-checkpoint.ipynb
sasano8/magnet-migrade
b5669b34a6a3b845df8df96dfedaf967df6b88e2
[ "MIT" ]
null
null
null
jupyter/.ipynb_checkpoints/Untitled-checkpoint.ipynb
sasano8/magnet-migrade
b5669b34a6a3b845df8df96dfedaf967df6b88e2
[ "MIT" ]
4
2021-03-24T23:38:22.000Z
2021-03-31T07:24:30.000Z
jupyter/.ipynb_checkpoints/Untitled-checkpoint.ipynb
sasano8/magnet-migrade
b5669b34a6a3b845df8df96dfedaf967df6b88e2
[ "MIT" ]
null
null
null
25.236842
113
0.475287
[ [ [ "!pip install pydantic", "Requirement already satisfied: pydantic in /usr/local/lib/python3.8/dist-packages (1.6.1)\n\u001b[33mWARNING: You are using pip version 20.2.2; however, version 20.2.4 is available.\nYou should consider upgrading via the '/usr/bin/python3 -m pip install --upgrade pip' command.\u001b[0m\n" ], [ "from tzlocal import get_localzone # $ pip install tzlocal\n\n# get local timezone \nlocal_tz = get_localzone()\n\nprint(local_tz)", "UTC\n" ], [ "from typing import Union\n\ndef func(name: str, /, *, age: Union[int, str] = \"a\", parent: str = \"\"):\n print(name)\n\n# func.__defaults__ = (\"bob\",) # デフォルトを変更できる\nfunc.__kwdefaults__\n\nprint(func.__defaults__)\nprint(func.__kwdefaults__)\nprint(func.__annotations__)\n", "None\n{'age': 'a', 'parent': ''}\n{'name': <class 'str'>, 'age': typing.Union[int, str], 'parent': <class 'str'>}\n" ], [ "from pydantic import BaseModel\n\nclass Sample(BaseModel):\n name: str = \"bob\"\n age: int = 20\n\n \n\ndef create_func(model: BaseModel):\n \n __annotations__ = {}\n kwdefaults = {}\n \n fields_required = []\n fields_non_required = []\n \n # sort required first\n for item in model.__fields__.values():\n if item.required:\n fields_required.append(item)\n else:\n fields_non_required.append(item)\n \n fields = fields_required\n fields += fields_non_required\n \n for item in fields:\n __annotations__[item.name] = item.type_\n if item.required:\n kwdefaults[item.name] = item.default\n \n # kwdefaults = tuple(kwdefaults)\n codes = []\n codes.append(\"def overload(*args\")\n \n for item in fields:\n codes.append(f\", {item.name} = None\")\n \n codes.append(\"):\\n if len(args) > 1: Exception()\")\n codes.append(\"\\n return args[0] if len(args) else model(\")\n \n for item in fields:\n codes.append(f\"{item.name}={item.name},\")\n\n codes.append(\")\")\n codes.append(\"\\n\")\n code = \"\".join(codes)\n print(code)\n namespace = {\"model\": model}\n \n exec(code, namespace, namespace)\n func = eval(\"overload\", namespace, namespace)\n func.__annotations__ = __annotations__\n func.__kwdefaults__ = kwdefaults\n return func\n\n \nfunc = create_func(Sample)\n\n# print(func())\nprint(func(name=\"a\", age=20))\n\n\ndef func(model):\n if model:\n return model\n else:\n return mymodel(\n name=name,\n age=age\n )\n ", "def overload(*args, name, age):\n if len(args) > 1: Exception()\n return args[0] if len(args) else model(name=name,age=age,)\n\nname='a' age=20\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code" ] ]
d026df3c7114b94cc3be1435ec3a079065ef68b2
118,583
ipynb
Jupyter Notebook
notebooks/predict.ipynb
maxdel/span_ae
76dde200a68fdca30bfe312fb9e47328f3212577
[ "MIT" ]
1
2019-11-27T10:55:06.000Z
2019-11-27T10:55:06.000Z
notebooks/predict.ipynb
maxdel/span_ae
76dde200a68fdca30bfe312fb9e47328f3212577
[ "MIT" ]
null
null
null
notebooks/predict.ipynb
maxdel/span_ae
76dde200a68fdca30bfe312fb9e47328f3212577
[ "MIT" ]
null
null
null
241.022358
32,184
0.916177
[ [ [ "# Load predictor", "_____no_output_____" ] ], [ [ "%matplotlib inline\nimport os\nimport matplotlib\nimport numpy as np\nimport matplotlib.pyplot as plt", "_____no_output_____" ], [ "matplotlib.use(\"Agg\")", "/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/__main__.py:1: UserWarning: \nThis call to matplotlib.use() has no effect because the backend has already\nbeen chosen; matplotlib.use() must be called *before* pylab, matplotlib.pyplot,\nor matplotlib.backends is imported for the first time.\n\nThe backend was *originally* set to 'module://ipykernel.pylab.backend_inline' by the following code:\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/runpy.py\", line 193, in _run_module_as_main\n \"__main__\", mod_spec)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/runpy.py\", line 85, in _run_code\n exec(code, run_globals)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/__main__.py\", line 3, in <module>\n app.launch_new_instance()\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/traitlets/config/application.py\", line 658, in launch_instance\n app.start()\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/kernelapp.py\", line 486, in start\n self.io_loop.start()\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/tornado/ioloop.py\", line 888, in start\n handler_func(fd_obj, events)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/tornado/stack_context.py\", line 277, in null_wrapper\n return fn(*args, **kwargs)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py\", line 450, in _handle_events\n self._handle_recv()\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py\", line 480, in _handle_recv\n self._run_callback(callback, msg)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/zmq/eventloop/zmqstream.py\", line 432, in _run_callback\n callback(*args, **kwargs)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/tornado/stack_context.py\", line 277, in null_wrapper\n return fn(*args, **kwargs)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/kernelbase.py\", line 283, in dispatcher\n return self.dispatch_shell(stream, msg)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/kernelbase.py\", line 233, in dispatch_shell\n handler(stream, idents, msg)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/kernelbase.py\", line 399, in execute_request\n user_expressions, allow_stdin)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/ipkernel.py\", line 208, in do_execute\n res = shell.run_cell(code, store_history=store_history, silent=silent)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/zmqshell.py\", line 537, in run_cell\n return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/IPython/core/interactiveshell.py\", line 2739, in run_cell\n self.events.trigger('post_run_cell')\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/IPython/core/events.py\", line 73, in trigger\n func(*args, **kwargs)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/ipykernel/pylab/backend_inline.py\", line 160, in configure_once\n 
activate_matplotlib(backend)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/IPython/core/pylabtools.py\", line 308, in activate_matplotlib\n matplotlib.pyplot.switch_backend(backend)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/matplotlib/pyplot.py\", line 231, in switch_backend\n matplotlib.use(newbackend, warn=False, force=True)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/matplotlib/__init__.py\", line 1400, in use\n reload(sys.modules['matplotlib.backends'])\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/importlib/__init__.py\", line 166, in reload\n _bootstrap._exec(spec, module)\n File \"/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/matplotlib/backends/__init__.py\", line 16, in <module>\n line for line in traceback.format_stack()\n\n\n if __name__ == '__main__':\n" ], [ "os.getcwd()", "_____no_output_____" ], [ "os.chdir('/home/del/research/span_ae')", "_____no_output_____" ], [ "import span_ae\nfrom allennlp.models.archival import load_archive\nfrom allennlp.service.predictors import Predictor", "/home/del/anaconda3/envs/allennmt/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n from ._conv import register_converters as _register_converters\n" ], [ "archive = load_archive(\"models/baseline/model.tar.gz\")", "_____no_output_____" ], [ "predictor = Predictor.from_archive(archive, 'span_ae')", "_____no_output_____" ] ], [ [ "## Func", "_____no_output_____" ] ], [ [ "def predict_plot(sentence):\n # predict\n result = predictor.predict_json(sentence)\n attention_matrix = result['attention_matrix']\n predicted_tokens = result['predicted_tokens']\n survived_span_ids = result['top_spans']\n input_sentence = ['BOS'] + sentence['src'].split() + ['EOS']\n predicted_tokens = predicted_tokens + ['EOS']\n survived_spans = []\n for span_id in survived_span_ids:\n ind_from = span_id[0]\n ind_to = span_id[1] + 1\n survived_spans.append(\" \".join(input_sentence[ind_from:ind_to]))\n attention_matrix_local = attention_matrix[0:len(predicted_tokens)]\n att_matrix_np = np.array([np.array(xi) for xi in attention_matrix_local])\n \n \n #print\n print('ORIGINAL :', \" \".join(input_sentence))\n #print('TOP SPANs:', \" \\n \".join(survived_spans))\n print('PREDICTED:', \" \".join(predicted_tokens))\n #print('span scores:', result['top_spans_scores'])\n print('\\nAttnetion matrix:')\n \n # plot\n plt.figure(figsize=(9, 9), dpi= 80, facecolor='w', edgecolor='k')\n plt.imshow(att_matrix_np.transpose(), interpolation=\"nearest\", cmap=\"Greys\")\n plt.xlabel(\"target\")\n plt.ylabel(\"source\")\n plt.gca().set_xticks([i for i in range(0, len(predicted_tokens))])\n plt.gca().set_yticks([i for i in range(0, len(survived_spans))])\n plt.gca().set_xticklabels(predicted_tokens, rotation='vertical')\n plt.gca().set_yticklabels(survived_spans)\n plt.tight_layout()", "_____no_output_____" ] ], [ [ "## Inference", "_____no_output_____" ] ], [ [ "# change it\nsentence = \"to school\"\n\n# do not change it\npredict_plot({'src': sentence})", "ORIGINAL : BOS to school EOS\nPREDICTED: to studies EOS\n\nAttnetion matrix:\n" ], [ "# change it\nsentence = \"school\"\n\n# do not change it\npredict_plot({'src': sentence})", "ORIGINAL : BOS school EOS\nPREDICTED: poverty EOS\n\nAttnetion matrix:\n" ], [ "# change it\nsentence = \"it is spring already , 
but there are a lot of snow out there\"\n\n# do not change it\npredict_plot({'src': sentence})", "ORIGINAL : BOS it is spring already , but there are a lot of snow out there EOS\nPREDICTED: it is already spring , but there are a lot of snow there there EOS\n\nAttnetion matrix:\n" ], [ "# change it\nsentence = \"state of the art\"\n\n# do not change it\npredict_plot({'src': sentence})", "ORIGINAL : BOS state of the art EOS\nPREDICTED: state of the EOS\n\nAttnetion matrix:\n" ], [ "# change it\nsentence = \"let us discard our entire human knowledge\"\n\n# do not change it\npredict_plot({'src': sentence})", "ORIGINAL : BOS let us discard our entire human knowledge EOS\nPREDICTED: let us discard our entire development knowledge EOS\n\nAttnetion matrix:\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d026e34a0ea41412fbdd766d5f9aacb24c19bf62
10,024
ipynb
Jupyter Notebook
Crypto/imposter/imposter_writeup.ipynb
NeSE-Team/XNUCA2020Qualifier
c5cff6a63d968de53483e39fc314c0fd63f49d6c
[ "Apache-2.0" ]
76
2020-11-01T05:47:46.000Z
2022-03-21T05:56:34.000Z
Crypto/imposter/imposter_writeup.ipynb
NeSE-Team/XNUCA2020Qualifier
c5cff6a63d968de53483e39fc314c0fd63f49d6c
[ "Apache-2.0" ]
2
2020-11-07T10:45:06.000Z
2021-01-19T03:09:59.000Z
Crypto/imposter/imposter_writeup.ipynb
NeSE-Team/XNUCA2020Qualifier
c5cff6a63d968de53483e39fc314c0fd63f49d6c
[ "Apache-2.0" ]
5
2020-11-01T16:27:29.000Z
2022-01-18T08:17:15.000Z
42.295359
511
0.589585
[ [ [ "This challenge implements an instantiation of OTR based on AES block cipher with modified version 1.0. OTR, which stands for Offset Two-Round, is a blockcipher mode of operation to realize an authenticated encryption with associated data (see [[1]](#1)). AES-OTR algorithm is a campaign of CAESAR competition, it has successfully entered the third round of screening by virtue of its unique advantages, you can see the whole algorithms and structure of AES-OTR from the design document (see [[2]](#2)).\n\nHowever, the first version is vulnerable to forgery attacks in the known plaintext conditions and association data and public message number are reused, many attacks can be applied here to forge an excepted ciphertext with a valid tag (see [[3]](#3)).\n\nFor example, in this challenge we can build the following three plaintexts:", "_____no_output_____" ] ], [ [ "M_0 = [b'Uid=16112\\xffUserNa', b'me=AdministratoR', b'\\xffT=111111111111\\xff', b'Cmd=Give_Me_FlaG', b'\\xff???????????????']\nM_1 = [b'Uid=16111\\xffUserNa', b'me=Administrator', b'r\\xffT=11111111111\\xff', b'Cmd=Give_Me_FlaG', b'\\xff???????????????']\nM_2 = [b'Uid=16112\\xffUserNa', b'me=AdministratoR', b'\\xffT=111111111111\\xff', b'Cmd=Give_Me_Flag', b'g\\xff??????????????']", "_____no_output_____" ] ], [ [ "Here `'111111111111'` can represent any value since the server won't check whether the message and its corresponding hash value match, so we just need to make sure that they are at the right length. If you look closely, you will find that none of the three plaintexts contains illegal fields, so we can use the encrypt Oracle provided by the server to get their corresponding ciphertexts easily. Next, noticed that these plaintexts satisfied:", "_____no_output_____" ] ], [ [ "from Crypto.Util.strxor import strxor\n\nM_0 = [b'Uid=16112\\xffUserNa', b'me=AdministratoR', b'\\xffT=111111111111\\xff', b'Cmd=Give_Me_FlaG', b'\\xff???????????????']\nM_1 = [b'Uid=16111\\xffUserNa', b'me=Administrator', b'r\\xffT=11111111111\\xff', b'Cmd=Give_Me_FlaG', b'\\xff???????????????']\nM_2 = [b'Uid=16112\\xffUserNa', b'me=AdministratoR', b'\\xffT=111111111111\\xff', b'Cmd=Give_Me_Flag', b'g\\xff??????????????']\n\nstrxor(M_0[1], M_0[3]) == strxor(M_1[1], M_2[3])", "_____no_output_____" ] ], [ [ "So according to the forgery attacks described in [[3]](#3), suppose their corresponding ciphertexts are `C_0`, `C_1` and `C_2`, then we can forge a valid ciphertext and tag using:", "_____no_output_____" ] ], [ [ "from Toy_AE import Toy_AE\n\ndef unpack(r):\n data = r.split(b\"\\xff\")\n uid, uname, token, cmd, appendix = int(data[0][4:]), data[1][9:], data[2][2:], data[3][4:], data[4]\n return (uid, uname, token, cmd, appendix)\n\nae = Toy_AE()\n\nM_0 = [b'Uid=16112\\xffUserNa', b'me=AdministratoR', b'\\xffT=111111111111\\xff', b'Cmd=Give_Me_FlaG', b'\\xff???????????????']\nM_1 = [b'Uid=16111\\xffUserNa', b'me=Administrator', b'r\\xffT=11111111111\\xff', b'Cmd=Give_Me_FlaG', b'\\xff???????????????']\nM_2 = [b'Uid=16112\\xffUserNa', b'me=AdministratoR', b'\\xffT=111111111111\\xff', b'Cmd=Give_Me_Flag', b'g\\xff??????????????']\n\nC_0, T_0 = ae.encrypt(b''.join(M_0))\nC_1, T_1 = ae.encrypt(b''.join(M_1))\nC_2, T_2 = ae.encrypt(b''.join(M_2))\nC_forge = C_1[:32] + C_2[32:64] + C_0[64:]\n\nT_forge = T_0\n\n_, uname, _, cmd, _ = unpack(ae.decrypt(C_forge, T_forge))\nuname == b\"Administrator\" and cmd == b\"Give_Me_Flag\"", "_____no_output_____" ] ], [ [ "Here is my final exp:", "_____no_output_____" ] ], [ [ "import string\nfrom pwn import *\nfrom 
hashlib import sha256\nfrom Crypto.Util.strxor import strxor\nfrom Crypto.Util.number import long_to_bytes, bytes_to_long\n\ndef bypass_POW(io):\n chall = io.recvline()\n post = chall[14:30]\n tar = chall[38:-2]\n io.recvuntil(':')\n found = iters.bruteforce(lambda x:sha256((x + post.decode()).encode()).hexdigest() == tar.decode(), string.ascii_letters + string.digits, 4)\n io.sendline(found.encode())\n\nC = []\nT = []\n\nio = remote(\"123.57.4.93\", 45216)\nbypass_POW(io)\n\nio.sendlineafter(b\"Your option:\", '1')\nio.sendlineafter(b\"Set up your user id:\", '16108')\nio.sendlineafter(b\"Your username:\", 'AdministratoR')\nio.sendlineafter(b\"Your command:\", 'Give_Me_FlaG')\nio.sendlineafter(b\"Any Appendix?\", \"???????????????\")\n\n_ = io.recvuntil(b\"Your ticket:\")\nC.append(long_to_bytes(int(io.recvline().strip(), 16)))\n_ = io.recvuntil(b\"With my Auth:\")\nT.append(long_to_bytes(int(io.recvline().strip(), 16)))\n\nio.sendlineafter(b\"Your option:\", '1')\nio.sendlineafter(b\"Set up your user id:\", '16107')\nio.sendlineafter(b\"Your username:\", 'Administratorr')\nio.sendlineafter(b\"Your command:\", 'Give_Me_FlaG')\nio.sendlineafter(b\"Any Appendix?\", \"???????????????\")\n\n_ = io.recvuntil(b\"Your ticket:\")\nC.append(long_to_bytes(int(io.recvline().strip(), 16)))\n_ = io.recvuntil(b\"With my Auth:\")\nT.append(long_to_bytes(int(io.recvline().strip(), 16)))\n\nio.sendlineafter(b\"Your option:\", '1')\nio.sendlineafter(b\"Set up your user id:\", '16108')\nio.sendlineafter(b\"Your username:\", 'AdministratoR')\nio.sendlineafter(b\"Your command:\", 'Give_Me_Flagg')\nio.sendlineafter(b\"Any Appendix?\", \"??????????????\")\n\n_ = io.recvuntil(b\"Your ticket:\")\nC.append(long_to_bytes(int(io.recvline().strip(), 16)))\n_ = io.recvuntil(b\"With my Auth:\")\nT.append(long_to_bytes(int(io.recvline().strip(), 16)))\n\nct = (C[1][:32] + C[2][32:64] + C[0][64:]).hex()\nte = T[0].hex()\n\nio.sendlineafter(b\"Your option:\", '2')\nio.sendlineafter(b\"Ticket:\", ct)\nio.sendlineafter(b\"Auth:\", te)\nflag = io.recvline().strip().decode()\nprint(flag)", "_____no_output_____" ] ], [ [ "b'X-NUCA{Gentlem3n_as_0f_th1s_mOment_I aM_th4t_sec0nd_mouse}'", "_____no_output_____" ], [ "**P.S.**\n\n* The version used in this challenge is v 1.0, some vulnerabilities have been fixed in subsequent versions(v 2.0, v 3.0 and v 3.1), you can see the final version at [[4]](#4). Also, for some attacks on the new version, see [[5]](#5) and [[6]](#6).\n\n* The content of the FLAG is a quote from movie *Catch Me If You Can* \"Two little mice fell in a bucket of cream. The first mouse quickly gave up and drowned. The second mouse, wouldn't quit. He struggled so hard that eventually he churned that cream into butter and crawled out. Gentlemen, as of this moment, I am that second mouse.\"\n\n**References**\n\n<a id=\"1\" href=\"https://eprint.iacr.org/2013/628.pdf\"> [1] Minematsu K. Parallelizable rate-1 authenticated encryption from pseudorandom functions[C]//Annual International Conference on the Theory and Applications of Cryptographic Techniques. Springer, Berlin, Heidelberg, 2014: 275-292.</a>\n\n<a id=\"2\" href=\"https://competitions.cr.yp.to/round1/aesotrv1.pdf\"> [2] Minematsu K. AES-OTR v1 design document.</a>\n\n<a id=\"3\" href=\"http://www.shcas.net/jsjyup/pdf/2017/10/对认证加密算法AES-OTR的伪造攻击.pdf\"> [3] Xiulin Zheng, Yipeng Fu, Haiyan Song. Forging attacks on authenticated encryption algorithm AES-OTR[J]. 
Computer Applications and Software, 2017, 034(010):320-324,329.</a>\n\n<a id=\"4\" href=\"https://competitions.cr.yp.to/round1/aesotrv1.pdf\"> [4] Minematsu K. AES-OTR v3.1 design document.</a>\n\n<a id=\"5\" href=\"https://eprint.iacr.org/2017/332.pdf\">[5] Forler, Christian, et al. \"Reforgeability of authenticated encryption schemes.\" Australasian Conference on Information Security and Privacy. Springer, Cham, 2017.</a>\n\n<a id=\"6\" href=\"https://eprint.iacr.org/2017/1147.pdf\">[6] Vaudenay, Serge, and Damian Vizár. \"Under Pressure: Security of Caesar Candidates beyond their Guarantees.\" IACR Cryptol. ePrint Arch. 2017 (2017): 1147.</a>", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ] ]
d026f3e4d6c336693446265b87d2fae00ea0174e
7036
ipynb
Jupyter Notebook
Sessions/Session14/Day4/MJH_hackday.ipynb
hopkins9942/LSSTC-DSFP-Sessions
6856ea7345a2fc2f0bf2ee920168444887c100ca
[ "MIT" ]
null
null
null
Sessions/Session14/Day4/MJH_hackday.ipynb
hopkins9942/LSSTC-DSFP-Sessions
6856ea7345a2fc2f0bf2ee920168444887c100ca
[ "MIT" ]
null
null
null
Sessions/Session14/Day4/MJH_hackday.ipynb
hopkins9942/LSSTC-DSFP-Sessions
6856ea7345a2fc2f0bf2ee920168444887c100ca
[ "MIT" ]
null
null
null
34.490196
1314
0.564525
[ [ [ "import os\nimport numpy as np\nimport pandas as pd\nimport seaborn as sns\nimport matplotlib.pyplot as plt\n%matplotlib inline\nimport torch\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.ensemble import IsolationForest\n", "_____no_output_____" ], [ "cubism_path = \"/home/hopkinsl/Downloads/wikiart/wikiart/Cubism\"\nlistdir = os.listdir(cubism_path)\nimage_names = []\nlabels = []\nfor file in listdir:\n if file.startswith('pablo-picasso'):\n image_names.append(file)\n labels.append(0)\n elif file.startswith('marevna'):\n image_names.append(file)\n labels.append(1)\n elif file.startswith('fernand-leger'):\n image_names.append(file)\n labels.append(2)\nprint(labels.count(0))\nprint(labels.count(1))\nprint(labels.count(2))\nprint(listdir[0])", "203\n36\n141\npablo-picasso_las-meninas-velazquez-1957-1.jpg\n" ], [ "#Reading, reshaping, and normalizing images\n#images = plt.imread(data)\n#labels = pandas.read_csv()\n\nimages = torch.zeros\nfor file in image_names:\n \n\nimages = images.reshape(len(images),-1)\nimages = (images - images.mean()) / images.std())\nimages_train, images_test, labels_train, labels_test = train_test_split(images, labels, test_size=0.33, random_state=42)", "1\n" ], [ "\n\nclass MLP(torch.nn.Module):\n # this defines the model\n def __init__(self, sizes):\n super(MLP, self).__init__()\n print(sizes)\n self.sizes = sizes\n self.layers = [torch.nn.Linear(sizes[i], sizes[i+1]) for i in range(len(sizes)-1)]\n self.sigmoid = torch.nn.Sigmoid()\n self.relu = torch.nn.ReLU()\n self.softmax = torch.nn.Softmax()\n def forward(self, x):\n temp_activation = x\n temp_layer = self.layers[0](temp_activation)\n for i in range(1, len(layers)):\n temp_activation = self.sigmoid(temp_layer)\n temp_layer = self.layers[i](temp_activation)\n return self.softmax(temp_layer)\n\n", "_____no_output_____" ], [ "def train_model(training_data, test_data,training_labels,test_labels, model):\n # define the optimization\n criterion = torch.nn.BCELoss()\n optimizer = torch.optim.SGD(model.parameters(), lr=0.007,momentum=0.9)\n for epoch in range(10):\n # clear the gradient\n optimizer.zero_grad()\n # compute the model output\n myoutput = model(training_data)\n # calculate loss\n loss = criterion(myoutput, training_labels)\n # credit assignment\n loss.backward()\n # update model weights\n optimizer.step()\n\n # STUDENTS ADD THIS PART\n output_test = model(test_data)\n loss_test = criterion(output_test, test_labels)\n plt.plot(epoch,loss.detach().numpy(),'ko')\n plt.plot(epoch,loss_test.detach().numpy(),'ro')\n print(epoch,loss.detach().numpy())\n plt.show()\n\nsizes = [len(images[0,:]), 30,30,23]\ntrain_model(images_train, images_test, labels_train, labels_test, MLP())", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code" ] ]
d026f4e06cec45f321b1c06d6eac850f67f585b3
15,649
ipynb
Jupyter Notebook
docs/src/man/VC_Examples.ipynb
OpenMendel/QuasiCopula.jl
3addff703c63b158fcadff3ca52575b3b486b321
[ "MIT" ]
null
null
null
docs/src/man/VC_Examples.ipynb
OpenMendel/QuasiCopula.jl
3addff703c63b158fcadff3ca52575b3b486b321
[ "MIT" ]
1
2022-03-17T20:44:08.000Z
2022-03-17T20:44:08.000Z
docs/src/man/VC_Examples.ipynb
OpenMendel/QuasiCopula.jl
3addff703c63b158fcadff3ca52575b3b486b321
[ "MIT" ]
null
null
null
33.726293
373
0.525401
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d026f5e659ec15b5e5d4f9c452e48dc8cd05c91a
33,461
ipynb
Jupyter Notebook
notebooks/sentiment_analysis/textblob.ipynb
ClaasM/streamed-sentiment-topic-intent
76f6e8686ab629391fd714228547ed1de097466c
[ "MIT" ]
null
null
null
notebooks/sentiment_analysis/textblob.ipynb
ClaasM/streamed-sentiment-topic-intent
76f6e8686ab629391fd714228547ed1de097466c
[ "MIT" ]
8
2020-03-24T15:33:52.000Z
2022-03-11T23:16:16.000Z
notebooks/sentiment_analysis/textblob.ipynb
ClaasM/Bachelors-Thesis
76f6e8686ab629391fd714228547ed1de097466c
[ "MIT" ]
null
null
null
221.596026
29524
0.905054
[ [ [ "# Testing different SA methods 4/5\n## Textblob\n", "_____no_output_____" ] ], [ [ "import csv\nimport re\nimport random\n\nfrom textblob import TextBlob\n\n# Ugly hackery, but necessary: stackoverflow.com/questions/4383571/importing-files-from-different-folder\nimport sys\nsys.path.append('../../../')\n \nfrom src.streaming import spark_functions", "_____no_output_____" ], [ "preprocess = spark_functions.preprocessor()\ntokenize = spark_functions.tokenizer()\n\nwith open('./../../../data/interim/sanders_hydrated.csv') as csv_file:\n iterator = csv.reader(csv_file, delimiter=',')\n # Load the parts we need and preprocess as well as tokenize the text\n tweets = [(text, sentiment) for (topic, sentiment, id, text) in iterator if sentiment=='positive' or sentiment=='negative']\n # Shuffle for good measure\n random.shuffle(tweets)", "_____no_output_____" ], [ "import matplotlib.pyplot as plt\n\nresults = {\n \"positive\":{\"color\":\"green\",\"x\":[],\"y\":[]},\n \"neutral\":{\"color\":\"orange\",\"x\":[],\"y\":[]},\n \"negative\":{\"color\":\"red\",\"x\":[],\"y\":[]}\n}\n \n# Create plot\nfig = plt.figure()\nax = fig.add_subplot(1, 1, 1)\n \nfor (tweet,sentiment) in tweets[:200]:\n analysis = TextBlob(preprocess(tweet)).sentiment\n results[sentiment][\"x\"].append(analysis.polarity)\n results[sentiment][\"y\"].append(analysis.subjectivity)\n \nfor key in results:\n ax.scatter(results[key][\"x\"], results[key][\"y\"], alpha=0.8, c=results[key][\"color\"], edgecolors='none', label=key)\n \nplt.xlabel('polarity')\nplt.ylabel('subjectivity')\n\nplt.legend(loc=2)\nplt.savefig('textblob.pdf', format='pdf')\nplt.show()", "_____no_output_____" ], [ "# This is fucking hopeless\nlabeled_correctly = 0\nfor (tweet,sentiment) in tweets:\n analysis = TextBlob(preprocess(tweet)).sentiment\n if (analysis.polarity < 0 and sentiment == 'negative') \\\n or (analysis.polarity >= 0 and sentiment == 'positive'):\n labeled_correctly += 1\n\nprint(\"Labeled correctly: %d/%d = %.2d percent\" % (labeled_correctly, len(tweets), labeled_correctly/len(tweets)*100))", "Labeled correctly: 580/946 = 61 percent\n" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code" ] ]
d02706cbf2067692ba72c9634f9cc1435b55f07c
31,530
ipynb
Jupyter Notebook
Some_strings_and_regex_operation_in_Python.ipynb
LanguegeEngineering/demo-igor-skorzybot
d056757b65d87d4691e39655c122dbd4d7941a02
[ "MIT" ]
null
null
null
Some_strings_and_regex_operation_in_Python.ipynb
LanguegeEngineering/demo-igor-skorzybot
d056757b65d87d4691e39655c122dbd4d7941a02
[ "MIT" ]
null
null
null
Some_strings_and_regex_operation_in_Python.ipynb
LanguegeEngineering/demo-igor-skorzybot
d056757b65d87d4691e39655c122dbd4d7941a02
[ "MIT" ]
null
null
null
24.984152
366
0.415128
[ [ [ "### Easy string manipulation", "_____no_output_____" ] ], [ [ "x = 'a string'\ny = \"a string\"\nif x == y:\n print(\"they are the same\")", "they are the same\n" ], [ "fox = \"tHe qUICk bROWn fOx.\"", "_____no_output_____" ] ], [ [ "To convert the entire string into upper-case or lower-case, you can use the ``upper()`` or ``lower()`` methods respectively:", "_____no_output_____" ] ], [ [ "fox.upper()", "_____no_output_____" ], [ "fox.lower()", "_____no_output_____" ] ], [ [ "A common formatting need is to capitalize just the first letter of each word, or perhaps the first letter of each sentence.\nThis can be done with the ``title()`` and ``capitalize()`` methods:", "_____no_output_____" ] ], [ [ "fox.title()", "_____no_output_____" ], [ "fox.capitalize()", "_____no_output_____" ] ], [ [ "The cases can be swapped using the ``swapcase()`` method:", "_____no_output_____" ] ], [ [ "fox.swapcase()", "_____no_output_____" ], [ "line = ' this is the content '\nline.strip()", "_____no_output_____" ] ], [ [ "To remove just space to the right or left, use ``rstrip()`` or ``lstrip()`` respectively:", "_____no_output_____" ] ], [ [ "line.rstrip()", "_____no_output_____" ], [ "line.lstrip()", "_____no_output_____" ] ], [ [ "To remove characters other than spaces, you can pass the desired character to the ``strip()`` method:", "_____no_output_____" ] ], [ [ "num = \"000000000000435\"\nnum.strip('0')", "_____no_output_____" ], [ "line = 'the quick brown fox jumped over a lazy dog'\nline.find('fox')", "_____no_output_____" ], [ "line.index('fox')", "_____no_output_____" ], [ "line[16:21]", "_____no_output_____" ] ], [ [ "The only difference between ``find()`` and ``index()`` is their behavior when the search string is not found; ``find()`` returns ``-1``, while ``index()`` raises a ``ValueError``:", "_____no_output_____" ] ], [ [ "line.find('bear')", "_____no_output_____" ], [ "line.index('bear')", "_____no_output_____" ], [ "line.partition('fox')", "_____no_output_____" ] ], [ [ "The ``rpartition()`` method is similar, but searches from the right of the string.\n\nThe ``split()`` method is perhaps more useful; it finds *all* instances of the split-point and returns the substrings in between.\nThe default is to split on any whitespace, returning a list of the individual words in a string:", "_____no_output_____" ] ], [ [ "line_list = line.split()\nprint(line_list)", "['the', 'quick', 'brown', 'fox', 'jumped', 'over', 'a', 'lazy', 'dog']\n" ], [ "print(line_list[1])", "quick\n" ] ], [ [ "A related method is ``splitlines()``, which splits on newline characters.\nLet's do this with a Haiku, popularly attributed to the 17th-century poet Matsuo Bashō:", "_____no_output_____" ] ], [ [ "haiku = \"\"\"matsushima-ya\naah matsushima-ya\nmatsushima-ya\"\"\"\n\nhaiku.splitlines()", "_____no_output_____" ] ], [ [ "Note that if you would like to undo a ``split()``, you can use the ``join()`` method, which returns a string built from a splitpoint and an iterable:", "_____no_output_____" ] ], [ [ "'--'.join(['1', '2', '3'])", "_____no_output_____" ] ], [ [ "A common pattern is to use the special character ``\"\\n\"`` (newline) to join together lines that have been previously split, and recover the input:", "_____no_output_____" ] ], [ [ "print(\"\\n\".join(['matsushima-ya', 'aah matsushima-ya', 'matsushima-ya']))", "matsushima-ya\naah matsushima-ya\nmatsushima-ya\n" ], [ "pi = 3.14159\nstr(pi)", "_____no_output_____" ], [ "print (\"The value of pi is \" + pi)", "_____no_output_____" ] ], [ [ "Pi is a float 
number, so it must be converted to a string first.", "_____no_output_____" ] ], [ [ "print(\"The value of pi is \" + str(pi))", "The value of pi is 3.14159\n" ] ], [ [ "A more flexible way to do this is to use *format strings*, which are strings with special markers (noted by curly braces) into which string-formatted values will be inserted.\nHere is a basic example:", "_____no_output_____" ] ], [ [ "\"The value of pi is {}\".format(pi)", "_____no_output_____" ] ], [ [ "### Easy regex manipulation!", "_____no_output_____" ] ], [ [ "import re", "_____no_output_____" ], [ "line = 'the quick brown fox jumped over a lazy dog'", "_____no_output_____" ] ], [ [ "With this, we can see that the ``regex.search()`` method operates a lot like ``str.index()`` or ``str.find()``:", "_____no_output_____" ] ], [ [ "line.index('fox')", "_____no_output_____" ], [ "regex = re.compile('fox')\nmatch = regex.search(line)\nmatch.start()", "_____no_output_____" ] ], [ [ "Similarly, the ``regex.sub()`` method operates much like ``str.replace()``:", "_____no_output_____" ] ], [ [ "line.replace('fox', 'BEAR')", "_____no_output_____" ], [ "regex.sub('BEAR', line)", "_____no_output_____" ] ], [ [ "The following is a table of the repetition markers available for use in regular expressions:\n\n| Character | Description | Example |\n|-----------|-------------|---------|\n| ``?`` | Match zero or one repetitions of preceding | ``\"ab?\"`` matches ``\"a\"`` or ``\"ab\"`` |\n| ``*`` | Match zero or more repetitions of preceding | ``\"ab*\"`` matches ``\"a\"``, ``\"ab\"``, ``\"abb\"``, ``\"abbb\"``... |\n| ``+`` | Match one or more repetitions of preceding | ``\"ab+\"`` matches ``\"ab\"``, ``\"abb\"``, ``\"abbb\"``... but not ``\"a\"`` |\n| ``.`` | Any character | ``.*`` matches everything | \n| ``{n}`` | Match ``n`` repetitions of preceding | ``\"ab{2}\"`` matches ``\"abb\"`` |\n| ``{m,n}`` | Match between ``m`` and ``n`` repetitions of preceding | ``\"ab{2,3}\"`` matches ``\"abb\"`` or ``\"abbb\"`` |", "_____no_output_____" ] ], [ [ "bool(re.search(r'ab', \"Boabab\"))", "_____no_output_____" ], [ "bool(re.search(r'.*ma.*', \"Ala ma kota\"))", "_____no_output_____" ], [ "bool(re.search(r'.*(psa|kota).*', \"Ala ma kota\"))", "_____no_output_____" ], [ "bool(re.search(r'.*(psa|kota).*', \"Ala ma psa\"))", "_____no_output_____" ], [ "bool(re.search(r'.*(psa|kota).*', \"Ala ma chomika\"))", "_____no_output_____" ], [ "zdanie = \"Ala ma kota.\"\nwzor = r'.*' # matches every sentence\nzamiennik = r\"Ala ma psa.\"", "_____no_output_____" ], [ "re.sub(wzor, zamiennik, zdanie)", "_____no_output_____" ], [ "wzor = r'(.*)kota.'\nzamiennik = r\"\\1 psa.\"", "_____no_output_____" ], [ "re.sub(wzor, zamiennik, zdanie)", "_____no_output_____" ], [ "wzor = r'(.*)ma(.*)'\nzamiennik = r\"\\1 posiada \\2\"", "_____no_output_____" ], [ "re.sub(wzor, zamiennik, zdanie)", "_____no_output_____" ] ] ]
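As a small extension of the group substitutions above, Python's ``re`` also supports *named* groups, which keep longer replacement patterns readable. A short self-contained example reusing the same sentence:

import re

zdanie = "Ala ma kota."
wzor = r'(?P<owner>.*)ma(?P<pet>.*)'    # the last pattern above, with named groups
zamiennik = r"\g<owner>posiada\g<pet>"  # refer to groups by name instead of \1, \2

re.sub(wzor, zamiennik, zdanie)         # 'Ala posiada kota.'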
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0270cb19510138627ba78c9a9765757b15cdc77
30,455
ipynb
Jupyter Notebook
.ipynb_checkpoints/05.1 Generating Text in the Style of an Example Text-checkpoint.ipynb
WillKoehrsen/deep_learning_cookbook
6127041766367575135e3dfc2c00600153d238d2
[ "Apache-2.0" ]
11
2018-10-11T09:44:48.000Z
2021-09-28T09:22:38.000Z
.ipynb_checkpoints/05.1 Generating Text in the Style of an Example Text-checkpoint.ipynb
WillKoehrsen/deep_learning_cookbook
6127041766367575135e3dfc2c00600153d238d2
[ "Apache-2.0" ]
null
null
null
.ipynb_checkpoints/05.1 Generating Text in the Style of an Example Text-checkpoint.ipynb
WillKoehrsen/deep_learning_cookbook
6127041766367575135e3dfc2c00600153d238d2
[ "Apache-2.0" ]
17
2018-10-04T04:56:56.000Z
2022-01-25T15:00:09.000Z
33.39364
144
0.446331
[ [ [ "import re\nimport os\nimport keras.backend as K\nimport numpy as np\nimport pandas as pd\nfrom keras import layers, models, utils\nimport json", "Using TensorFlow backend.\n" ], [ "def reset_everything():\n import tensorflow as tf\n %reset -f in out dhist\n tf.reset_default_graph()\n K.set_session(tf.InteractiveSession())", "_____no_output_____" ], [ "# Constants for our networks. We keep these deliberately small to reduce training time.\n\nVOCAB_SIZE = 250000\nEMBEDDING_SIZE = 100\nMAX_DOC_LEN = 128\nMIN_DOC_LEN = 12", "_____no_output_____" ], [ "def extract_stackexchange(filename, limit=1000000):\n json_file = filename + 'limit=%s.json' % limit\n\n rows = []\n for i, line in enumerate(os.popen('7z x -so \"%s\" Posts.xml' % filename)):\n line = str(line)\n if not line.startswith(' <row'):\n continue\n \n if i % 1000 == 0:\n print('\\r%05d/%05d' % (i, limit), end='', flush=True)\n\n parts = line[6:-5].split('\"')\n record = {}\n for i in range(0, len(parts), 2):\n k = parts[i].replace('=', '').strip()\n v = parts[i+1].strip()\n record[k] = v\n rows.append(record)\n \n if len(rows) > limit:\n break\n \n with open(json_file, 'w') as fout:\n json.dump(rows, fout)\n \n return rows\n\n\nxml_7z = utils.get_file(\n fname='travel.stackexchange.com.7z',\n origin='https://ia800107.us.archive.org/27/items/stackexchange/travel.stackexchange.com.7z',\n)\nprint()\n\nrows = extract_stackexchange(xml_7z)", "\n87000/1000000" ] ], [ [ "# Data Exploration\n\nNow that we have extracted our data, let's clean it up and take a look at what we have to work with.", "_____no_output_____" ] ], [ [ "df = pd.DataFrame.from_records(rows) \ndf = df.set_index('Id', drop=False)\ndf['Title'] = df['Title'].fillna('').astype('str')\ndf['Tags'] = df['Tags'].fillna('').astype('str')\ndf['Body'] = df['Body'].fillna('').astype('str')\ndf['Id'] = df['Id'].astype('int')\ndf['PostTypeId'] = df['PostTypeId'].astype('int')\ndf['ViewCount'] = df['ViewCount'].astype('float')\n\ndf.head()", "_____no_output_____" ], [ "list(df[df['ViewCount'] > 250000]['Title'])", "_____no_output_____" ], [ "from keras.preprocessing.text import Tokenizer\nfrom keras.preprocessing.sequence import pad_sequences\n\ntokenizer = Tokenizer(num_words=VOCAB_SIZE)\ntokenizer.fit_on_texts(df['Body'] + df['Title'])", "_____no_output_____" ], [ "# Compute TF/IDF Values\n\ntotal_count = sum(tokenizer.word_counts.values())\nidf = { k: np.log(total_count/v) for (k,v) in tokenizer.word_counts.items() }", "_____no_output_____" ], [ "# Download pre-trained word2vec embeddings\n\nimport gensim\n\nglove_100d = utils.get_file(\n fname='glove.6B.100d.txt',\n origin='https://storage.googleapis.com/deep-learning-cookbook/glove.6B.100d.txt',\n)\n\nw2v_100d = glove_100d + '.w2v'\nfrom gensim.scripts.glove2word2vec import glove2word2vec\nglove2word2vec(glove_100d, w2v_100d)\nw2v_model = gensim.models.KeyedVectors.load_word2vec_format(w2v_100d)\n\nw2v_weights = np.zeros((VOCAB_SIZE, w2v_model.syn0.shape[1]))\nidf_weights = np.zeros((VOCAB_SIZE, 1))\n\nfor k, v in tokenizer.word_index.items():\n if v >= VOCAB_SIZE:\n continue\n \n if k in w2v_model:\n w2v_weights[v] = w2v_model[k]\n \n idf_weights[v] = idf[k]\n \ndel w2v_model", "_____no_output_____" ], [ "df['title_tokens'] = tokenizer.texts_to_sequences(df['Title'])\ndf['body_tokens'] = tokenizer.texts_to_sequences(df['Body'])", "_____no_output_____" ], [ "import random\n\n# We can create a data generator that will randomly title and body tokens for questions. 
We'll use random text\n# from other questions as a negative example when necessary.\ndef data_generator(batch_size, negative_samples=1):\n questions = df[df['PostTypeId'] == 1]\n all_q_ids = list(questions.index)\n \n batch_x_a = []\n batch_x_b = []\n batch_y = []\n \n def _add(x_a, x_b, y):\n batch_x_a.append(x_a[:MAX_DOC_LEN])\n batch_x_b.append(x_b[:MAX_DOC_LEN])\n batch_y.append(y)\n \n while True:\n questions = questions.sample(frac=1.0)\n \n for i, q in questions.iterrows():\n _add(q['title_tokens'], q['body_tokens'], 1)\n \n negative_q = random.sample(all_q_ids, negative_samples)\n for nq_id in negative_q:\n _add(q['title_tokens'], df.at[nq_id, 'body_tokens'], 0) \n \n if len(batch_y) >= batch_size:\n yield ({\n 'title': pad_sequences(batch_x_a, maxlen=None),\n 'body': pad_sequences(batch_x_b, maxlen=None),\n }, np.asarray(batch_y))\n \n batch_x_a = []\n batch_x_b = []\n batch_y = []\n\n# dg = data_generator(1, 2)\n# next(dg)\n# next(dg)", "_____no_output_____" ] ], [ [ "# Embedding Lookups\n\nLet's define a helper class for looking up our embedding results. We'll use it\nto verify our models.", "_____no_output_____" ] ], [ [ "questions = df[df['PostTypeId'] == 1]['Title'].reset_index(drop=True)\nquestion_tokens = pad_sequences(tokenizer.texts_to_sequences(questions))\n\nclass EmbeddingWrapper(object):\n def __init__(self, model):\n self._r = questions\n self._i = {i:s for (i, s) in enumerate(questions)}\n self._w = model.predict({'title': question_tokens}, verbose=1, batch_size=1024)\n self._model = model\n self._norm = np.sqrt(np.sum(self._w * self._w + 1e-5, axis=1))\n\n def nearest(self, sentence, n=10):\n x = tokenizer.texts_to_sequences([sentence])\n if len(x[0]) < MIN_DOC_LEN:\n x[0] += [0] * (MIN_DOC_LEN - len(x))\n e = self._model.predict(np.asarray(x))[0]\n norm_e = np.sqrt(np.dot(e, e))\n dist = np.dot(self._w, e) / (norm_e * self._norm)\n\n top_idx = np.argsort(dist)[-n:]\n return pd.DataFrame.from_records([\n {'question': self._r[i], 'dist': float(dist[i])}\n for i in top_idx\n ])", "_____no_output_____" ], [ "# Our first model will just sum up the embeddings of each token.\n# The similarity between documents will be the dot product of the final embedding.\n\nimport tensorflow as tf\n\ndef sum_model(embedding_size, vocab_size, embedding_weights=None, idf_weights=None):\n title = layers.Input(shape=(None,), dtype='int32', name='title')\n body = layers.Input(shape=(None,), dtype='int32', name='body')\n\n def make_embedding(name):\n if embedding_weights is not None:\n embedding = layers.Embedding(mask_zero=True, input_dim=vocab_size, output_dim=w2v_weights.shape[1], \n weights=[w2v_weights], trainable=False, \n name='%s/embedding' % name)\n else:\n embedding = layers.Embedding(mask_zero=True, input_dim=vocab_size, output_dim=embedding_size,\n name='%s/embedding' % name)\n\n if idf_weights is not None:\n idf = layers.Embedding(mask_zero=True, input_dim=vocab_size, output_dim=1, \n weights=[idf_weights], trainable=False,\n name='%s/idf' % name)\n else:\n idf = layers.Embedding(mask_zero=True, input_dim=vocab_size, output_dim=1,\n name='%s/idf' % name)\n \n return embedding, idf\n \n embedding_a, idf_a = make_embedding('a')\n embedding_b, idf_b = embedding_a, idf_a\n# embedding_b, idf_b = make_embedding('b')\n\n mask = layers.Masking(mask_value=0)\n def _combine_and_sum(args):\n [embedding, idf] = args\n return K.sum(embedding * K.abs(idf), axis=1)\n\n sum_layer = layers.Lambda(_combine_and_sum, name='combine_and_sum')\n\n sum_a = sum_layer([mask(embedding_a(title)), 
idf_a(title)])\n sum_b = sum_layer([mask(embedding_b(body)), idf_b(body)])\n\n sim = layers.dot([sum_a, sum_b], axes=1, normalize=True)\n sim_model = models.Model(\n inputs=[title, body],\n outputs=[sim],\n )\n sim_model.compile(loss='binary_crossentropy', optimizer='nadam', metrics=['accuracy'])\n sim_model.summary()\n\n embedding_model = models.Model(\n inputs=[title],\n outputs=[sum_a]\n )\n return sim_model, embedding_model", "_____no_output_____" ], [ "# Try using our model with pretrained weights from word2vec\n\nsum_model_precomputed, sum_embedding_precomputed = sum_model(\n embedding_size=EMBEDDING_SIZE, vocab_size=VOCAB_SIZE,\n embedding_weights=w2v_weights, idf_weights=idf_weights\n)\n\nx, y = next(data_generator(batch_size=4096))\nsum_model_precomputed.evaluate(x, y)", "_____no_output_____" ], [ "SAMPLE_QUESTIONS = [\n 'Roundtrip ticket versus one way',\n 'Shinkansen from Kyoto to Hiroshima',\n 'Bus tour of Germany',\n]\n\ndef evaluate_sample(lookup):\n pd.set_option('display.max_colwidth', 100)\n results = []\n for q in SAMPLE_QUESTIONS:\n print(q)\n q_res = lookup.nearest(q, n=4)\n q_res['result'] = q_res['question']\n q_res['question'] = q\n results.append(q_res)\n\n return pd.concat(results)\n\nlookup = EmbeddingWrapper(model=sum_embedding_precomputed)\nevaluate_sample(lookup)", "_____no_output_____" ] ], [ [ "# Training our own network\n\nThe results are okay but not great... instead of using the word2vec embeddings, what happens if we train our network end-to-end?", "_____no_output_____" ] ], [ [ "sum_model_trained, sum_embedding_trained = sum_model(\n embedding_size=EMBEDDING_SIZE, vocab_size=VOCAB_SIZE, \n embedding_weights=None,\n idf_weights=None\n)\nsum_model_trained.fit_generator(\n data_generator(batch_size=128),\n epochs=10,\n steps_per_epoch=1000\n)", "_____no_output_____" ], [ "lookup = EmbeddingWrapper(model=sum_embedding_trained)\nevaluate_sample(lookup)", "_____no_output_____" ] ], [ [ "## CNN Model\n\nUsing a sum-of-embeddings model works well. What happens if we try to make a simple CNN model?", "_____no_output_____" ] ], [ [ "def cnn_model(embedding_size, vocab_size):\n title = layers.Input(shape=(None,), dtype='int32', name='title')\n body = layers.Input(shape=(None,), dtype='int32', name='body')\n\n embedding = layers.Embedding(\n mask_zero=False,\n input_dim=vocab_size,\n output_dim=embedding_size,\n )\n\n\n def _combine_sum(v):\n return K.sum(v, axis=1)\n\n cnn_1 = layers.Convolution1D(256, 3)\n cnn_2 = layers.Convolution1D(256, 3)\n cnn_3 = layers.Convolution1D(256, 3)\n \n global_pool = layers.GlobalMaxPooling1D()\n local_pool = layers.MaxPooling1D(strides=2, pool_size=3)\n\n def forward(input):\n embed = embedding(input)\n return global_pool(\n cnn_2(local_pool(cnn_1(embed))))\n\n sum_a = forward(title)\n sum_b = forward(body)\n\n sim = layers.dot([sum_a, sum_b], axes=1, normalize=False)\n sim_model = models.Model(\n inputs=[title, body],\n outputs=[sim],\n )\n sim_model.compile(loss='binary_crossentropy', optimizer='rmsprop', metrics=['accuracy'])\n\n embedding_model = models.Model(\n inputs=[title],\n outputs=[sum_a]\n )\n return sim_model, embedding_model", "_____no_output_____" ], [ "cnn, cnn_embedding = cnn_model(embedding_size=25, vocab_size=VOCAB_SIZE)\ncnn.summary()\ncnn.fit_generator(\n data_generator(batch_size=128),\n epochs=10,\n steps_per_epoch=1000,\n)", "_____no_output_____" ], [ "lookup = EmbeddingWrapper(model=cnn_embedding)\nevaluate_sample(lookup)", "_____no_output_____" ] ], [ [ "## LSTM Model\n\nWe can also make an LSTM model. 
Warning, this will be very slow to train and evaluate unless you have a relatively fast GPU to run it on!", "_____no_output_____" ] ], [ [ "def lstm_model(embedding_size, vocab_size):\n title = layers.Input(shape=(None,), dtype='int32', name='title')\n body = layers.Input(shape=(None,), dtype='int32', name='body')\n\n embedding = layers.Embedding(\n mask_zero=True,\n input_dim=vocab_size,\n output_dim=embedding_size,\n# weights=[w2v_weights],\n# trainable=False\n )\n\n lstm_1 = layers.LSTM(units=512, return_sequences=True)\n lstm_2 = layers.LSTM(units=512, return_sequences=False)\n \n sum_a = lstm_2(lstm_1(embedding(title)))\n sum_b = lstm_2(lstm_1(embedding(body)))\n\n sim = layers.dot([sum_a, sum_b], axes=1, normalize=True)\n# sim = layers.Activation(activation='sigmoid')(sim)\n sim_model = models.Model(\n inputs=[title, body],\n outputs=[sim],\n )\n sim_model.compile(loss='binary_crossentropy', optimizer='rmsprop')\n\n embedding_model = models.Model(\n inputs=[title],\n outputs=[sum_a]\n )\n return sim_model, embedding_model", "_____no_output_____" ], [ "lstm, lstm_embedding = lstm_model(embedding_size=EMBEDDING_SIZE, vocab_size=VOCAB_SIZE)\nlstm.summary()\nlstm.fit_generator(\n data_generator(batch_size=128),\n epochs=10,\n steps_per_epoch=100,\n)", "_____no_output_____" ], [ "lookup = EmbeddingWrapper(model=lstm_embedding)\nevaluate_sample(lookup)", "_____no_output_____" ] ] ]
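To see the core idea of ``sum_model`` without the Keras machinery, here is a tiny NumPy sketch of the same computation: an IDF-weighted sum of token embeddings scored with cosine similarity. The vocabulary size, embedding table and IDF weights are made-up stand-ins, not values from this notebook.

import numpy as np

rng = np.random.default_rng(0)
vocab_size, embedding_size = 10, 4
embeddings = rng.normal(size=(vocab_size, embedding_size))  # stand-in for the learned table
idf = rng.uniform(1.0, 3.0, size=vocab_size)                # stand-in for the IDF weights

def embed(token_ids):
    # IDF-weighted sum of token embeddings, as in sum_model's combine_and_sum
    vecs = embeddings[token_ids] * idf[token_ids, None]
    return vecs.sum(axis=0)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

title = embed([1, 2, 3])
body = embed([1, 2, 3, 7])
print(cosine(title, body))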
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ] ]
d027103293a2aa22126ad2bf23384f474ac63428
194,494
ipynb
Jupyter Notebook
render_demo.ipynb
frankhome61/nerf
2756d244a978efed478753e9fe700f38e7544df8
[ "MIT" ]
null
null
null
render_demo.ipynb
frankhome61/nerf
2756d244a978efed478753e9fe700f38e7544df8
[ "MIT" ]
null
null
null
render_demo.ipynb
frankhome61/nerf
2756d244a978efed478753e9fe700f38e7544df8
[ "MIT" ]
null
null
null
488.678392
177924
0.935957
[ [ [ "import os, sys\n# os.environ['TF_FORCE_GPU_ALLOW_GROWTH'] = 'true'\nos.environ['CUDA_VISIBLE_DEVICES'] = '1'\nimport tensorflow as tf\ntf.compat.v1.enable_eager_execution()\n\nimport numpy as np\nimport imageio\nimport json\nimport random\nimport time\nimport pprint\n\nimport matplotlib.pyplot as plt\n\nimport run_nerf\n\nfrom load_llff import load_llff_data\nfrom load_deepvoxels import load_dv_data\nfrom load_blender import load_blender_data\n", "_____no_output_____" ], [ "basedir = './logs'\nexpname = 'fern_example'\n\nconfig = os.path.join(basedir, expname, 'config.txt')\nprint('Args:')\nprint(open(config, 'r').read())\nparser = run_nerf.config_parser()\n\nargs = parser.parse_args('--config {} --ft_path {}'.format(config, os.path.join(basedir, expname, 'model_200000.npy')))\nprint('loaded args')\n\nimages, poses, bds, render_poses, i_test = load_llff_data(args.datadir, args.factor, \n recenter=True, bd_factor=.75, \n spherify=args.spherify)\nH, W, focal = poses[0,:3,-1].astype(np.float32)\n\nH = int(H)\nW = int(W)\nhwf = [H, W, focal]\n\nimages = images.astype(np.float32)\nposes = poses.astype(np.float32)\n\nif args.no_ndc:\n near = tf.reduce_min(bds) * .9\n far = tf.reduce_max(bds) * 1.\nelse:\n near = 0.\n far = 1.\n\n", "Args:\ndatadir = ./data/nerf_llff_data/fern\ndataset_type = llff\nfactor = 4\n\nno_batching = False\ni_embed = 0\nN_samples = 64\nN_importance = 128\nuse_viewdirs = True\nlrate_decay = 250\n\nllffhold = 8\n\nN_rand = 4096\nloaded args\nLoaded image data (756, 1008, 3, 20) [ 756. 1008. 815.13158322]\nLoaded ./data/nerf_llff_data/fern 16.985296178676084 80.00209740336334\nrecentered (3, 5)\n[[ 1.0000000e+00 0.0000000e+00 0.0000000e+00 1.4901161e-09]\n [ 0.0000000e+00 1.0000000e+00 -1.8730975e-09 -9.6857544e-09]\n [-0.0000000e+00 1.8730975e-09 1.0000000e+00 0.0000000e+00]]\nData:\n(20, 3, 5) (20, 756, 1008, 3) (20, 2)\nHOLDOUT view is 12\n" ], [ "# Create nerf model\n_, render_kwargs_test, start, grad_vars, models = run_nerf.create_nerf(args)\n\nbds_dict = {\n 'near' : tf.cast(near, tf.float32),\n 'far' : tf.cast(far, tf.float32),\n}\nrender_kwargs_test.update(bds_dict)\n\nprint('Render kwargs:')\npprint.pprint(render_kwargs_test)\n\n\ndown = 4\nrender_kwargs_fast = {k : render_kwargs_test[k] for k in render_kwargs_test}\nrender_kwargs_fast['N_importance'] = 0\n\nc2w = np.eye(4)[:3,:4].astype(np.float32) # identity pose matrix\ntest = run_nerf.render(H//down, W//down, focal/down, c2w=c2w, **render_kwargs_fast)\nimg = np.clip(test[0],0,1)\nplt.imshow(img)\nplt.show()", "2021-10-24 23:28:33.855478: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1\n2021-10-24 23:28:33.898606: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1618] Found device 0 with properties: \nname: NVIDIA GeForce GTX 1080 Ti major: 6 minor: 1 memoryClockRate(GHz): 1.6575\npciBusID: 0000:03:00.0\n2021-10-24 23:28:33.898823: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0\n2021-10-24 23:28:33.900023: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0\n2021-10-24 23:28:33.901057: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0\n2021-10-24 23:28:33.901293: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0\n2021-10-24 23:28:33.902667: I 
tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0\n2021-10-24 23:28:33.904100: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0\n2021-10-24 23:28:33.907445: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7\n2021-10-24 23:28:33.909616: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1746] Adding visible gpu devices: 0\n2021-10-24 23:28:33.910383: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA\n2021-10-24 23:28:33.939562: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 3491830000 Hz\n2021-10-24 23:28:33.941574: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55967b42abf0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:\n2021-10-24 23:28:33.941629: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version\n2021-10-24 23:28:34.097235: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55967b298ef0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:\n2021-10-24 23:28:34.097277: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce GTX 1080 Ti, Compute Capability 6.1\n2021-10-24 23:28:34.099082: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1618] Found device 0 with properties: \nname: NVIDIA GeForce GTX 1080 Ti major: 6 minor: 1 memoryClockRate(GHz): 1.6575\npciBusID: 0000:03:00.0\n2021-10-24 23:28:34.099153: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0\n2021-10-24 23:28:34.099182: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0\n2021-10-24 23:28:34.099208: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0\n2021-10-24 23:28:34.099254: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0\n2021-10-24 23:28:34.099283: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0\n2021-10-24 23:28:34.099309: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0\n2021-10-24 23:28:34.099335: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7\n2021-10-24 23:28:34.103922: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1746] Adding visible gpu devices: 0\n2021-10-24 23:28:34.104020: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0\n2021-10-24 23:28:34.105825: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1159] Device interconnect StreamExecutor with strength 1 edge matrix:\n2021-10-24 23:28:34.105852: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1165] 0 \n2021-10-24 23:28:34.105864: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1178] 0: N \n2021-10-24 23:28:34.108282: W tensorflow/core/common_runtime/gpu/gpu_bfc_allocator.cc:39] Overriding allow_growth setting because the TF_FORCE_GPU_ALLOW_GROWTH environment variable is set. 
Original config value was 0.\n2021-10-24 23:28:34.108311: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1304] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10482 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce GTX 1080 Ti, pci bus id: 0000:03:00.0, compute capability: 6.1)\n" ], [ "down = 8 # trade off resolution+aliasing for render speed to make this video faster\nframes = []\nfor i, c2w in enumerate(render_poses):\n if i%8==0: print(i)\n test = run_nerf.render(H//down, W//down, focal/down, c2w=c2w[:3,:4], **render_kwargs_fast)\n frames.append((255*np.clip(test[0],0,1)).astype(np.uint8))\n \nprint('done, saving')\nf = 'logs/fern_example/video.mp4'\nimageio.mimwrite(f, frames, fps=30, quality=8)\n\nfrom IPython.display import Video\nVideo(f, height=320)", "0\n8\n16\n24\n32\n40\n48\n56\n64\n72\n80\n88\n96\n104\n112\n" ], [ "%matplotlib inline\nfrom ipywidgets import interactive, widgets\nimport matplotlib.pyplot as plt\nimport numpy as np\n\n\ndef f(x, y, z):\n \n c2w = tf.convert_to_tensor([\n [1,0,0,x],\n [0,1,0,y],\n [0,0,1,z],\n [0,0,0,1],\n ], dtype=tf.float32)\n \n test = run_nerf.render(H//down, W//down, focal/down, c2w=c2w, **render_kwargs_fast)\n img = np.clip(test[0],0,1)\n \n plt.figure(2, figsize=(20,6))\n plt.imshow(img)\n plt.show()\n \n\nsldr = lambda : widgets.FloatSlider(\n value=0.,\n min=-1.,\n max=1.,\n step=.01,\n)\n\nnames = ['x', 'y', 'z']\n \ninteractive_plot = interactive(f, **{n : sldr() for n in names})\ninteractive_plot", "_____no_output_____" ], [ "!conda install -c conda-forge ipywidgets", "Collecting package metadata (current_repodata.json): | " ] ] ]
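The interactive widget above builds its translation-only camera-to-world matrix inline; factoring that into a small helper makes scripted renders easier. A sketch (NumPy only — the sliders never rotate the camera, and rotations would need extra handling):

import numpy as np

def translate_c2w(x, y, z):
    # camera-to-world matrix for a pure translation, matching the widget's layout
    c2w = np.eye(4, dtype=np.float32)
    c2w[:3, 3] = [x, y, z]
    return c2w

# e.g. a short dolly move along z (poses usable with run_nerf.render as above)
poses = [translate_c2w(0.0, 0.0, z) for z in np.linspace(-0.5, 0.5, 5)]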
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code" ] ]
d02714fb1e7527756a971551ad2df31dbc80204f
509276
ipynb
Jupyter Notebook
DCGAN_V2.ipynb
SAKARA96/Offspring-Face-Generator
b249fbed98441b68ce991dc1ad63b8ebbaadb585
[ "MIT" ]
null
null
null
DCGAN_V2.ipynb
SAKARA96/Offspring-Face-Generator
b249fbed98441b68ce991dc1ad63b8ebbaadb585
[ "MIT" ]
null
null
null
DCGAN_V2.ipynb
SAKARA96/Offspring-Face-Generator
b249fbed98441b68ce991dc1ad63b8ebbaadb585
[ "MIT" ]
null
null
null
46.726856
25168
0.626972
[ [ [ "from platform import python_version\nimport tensorflow as tf\n\nprint(tf.test.is_gpu_available())\nprint(python_version())", "True\n3.7.5\n" ], [ "import os\nimport numpy as np\nfrom os import listdir\nfrom PIL import Image\nimport time\nimport tensorflow as tf\nfrom tensorflow.keras import layers,models,optimizers\nfrom keras import backend as K\nimport matplotlib.pyplot as plt", "_____no_output_____" ], [ "path1=\"datasets/ofg_family/\"\npath2=\"datasets/TSKinFace_Data/TSKinFace_cropped/\"\n\nrandomiser = np.random.RandomState(123)", "_____no_output_____" ], [ "img_size = 64\nmean = 0.0009\nstd_dev = 0.009\nlr = 0.0005\nb1 = 0.875\nb2 = 0.975\nsd_random_normal_init = 0.02\n\nEPOCHS = 10\nbatch = 10", "_____no_output_____" ], [ "def generate_image_1(family_dir):\n dic={}\n sub=[a for a in listdir(path1+\"/\"+family_dir)]\n \n for ele in sub:\n if ele == '.DS_Store':\n continue;\n mypath = path1+\"/\"+family_dir+\"/\"+ele+\"/\"\n onlyfiles = [mypath+f for f in listdir(mypath)]\n \n addr = randomiser.choice(onlyfiles)\n original_img = np.array(Image.open(addr).resize((64,64),Image.ANTIALIAS))\n if ele[0].lower()=='f':\n dic['father'] = original_img\n elif ele[0].lower()=='m':\n dic['mother'] = original_img\n elif ele.lower()=='child_male':\n dic['child'] = original_img \n dic['gender']=np.zeros((original_img.shape))\n elif ele.lower()=='child_female':\n dic['child'] = original_img \n dic['gender'] = np.ones((original_img.shape))\n return [dic['father'],dic['mother'],dic['gender'],dic['child']]\n\ndef generate_image_2(family_dir, family_number, gender):\n dic={}\n sub = [\"F\" , \"M\", gender]\n family_pth = path2+\"/\"+family_dir+\"/\" + family_dir + \"-\" + str(family_number) + \"-\"\n for ele in sub:\n addr = family_pth+ele+\".jpg\"\n original_img = np.array(Image.open(addr).resize((64,64),Image.ANTIALIAS))\n if ele =='F':\n dic['father'] = original_img\n elif ele == 'M':\n dic['mother'] = original_img\n elif ele == 'S':\n dic['child'] = original_img \n dic['gender']=np.zeros((original_img.shape))\n elif ele == 'D':\n dic['child'] = original_img \n dic['gender'] = np.ones((original_img.shape))\n return [dic['father'],dic['mother'],dic['gender'],dic['child']]\n\ndef generate_batch(families_batch):\n np_images=[]\n for family in families_batch:\n if(len(family) == 3):\n res = generate_image_2(family[0], family[1], family[2])\n elif(len(family) == 1):\n res = generate_image_1(family[0])\n if( res != None):\n np_images.append(res)\n \n return np_images", "_____no_output_____" ], [ "for r, d, f in os.walk(path1):\n all_families = d\n break \nall_families = [[family] for family in all_families] \n\nfor i in range(285):\n all_families.append(['FMS', i+1, 'S'])\nfor i in range(274):\n all_families.append(['FMD', i+1, 'D'])\n \nfor i in range(228):\n all_families.append(['FMSD', i+1, 'D']) \n all_families.append(['FMSD', i+1, 'S']) \n\n\nrandomiser.shuffle(all_families)\n\ntrain_families = all_families[:-100]\ntest_families = all_families[-100:]", "_____no_output_____" ], [ "OUTPUT_CHANNELS = 3", "_____no_output_____" ], [ "def gen_downsample_parent(filters, size, apply_batchnorm=True, apply_dropout=False):\n initializer = tf.random_normal_initializer(mean, std_dev) \n \n\n result = tf.keras.Sequential()\n result.add(\n tf.keras.layers.Conv2D(filters, size, strides=2, padding='same',\n kernel_initializer=initializer,\n use_bias=False))\n\n if apply_batchnorm:\n result.add(tf.keras.layers.BatchNormalization())\n \n result.add(tf.keras.layers.ELU())\n \n if apply_dropout:\n 
result.add(tf.keras.layers.Dropout(rate = 0.5))\n\n return result", "_____no_output_____" ] ], [ [ "def gen_downsample_noise(filters, size, apply_batchnorm=True):\n initializer = tf.random_normal_initializer(mean, std_dev) \n \n\n result = tf.keras.Sequential()\n result.add(\n tf.keras.layers.Conv2DTranspose(filters, size, strides=2, padding='same',\n kernel_initializer=initializer,\n use_bias=False))\n\n if apply_batchnorm:\n result.add(tf.keras.layers.BatchNormalization())\n \n result.add(tf.keras.layers.ELU())\n\n return result", "_____no_output_____" ] ], [ [ "def gen_upsample(filters, size,apply_batchnorm = False):\n initializer = tf.random_normal_initializer(mean, std_dev)\n\n result = tf.keras.Sequential()\n result.add(\n tf.keras.layers.Conv2DTranspose(filters, size, strides=2,\n padding='same',\n kernel_initializer=initializer,\n use_bias=False))\n if apply_batchnorm:\n result.add(tf.keras.layers.BatchNormalization())\n result.add(tf.keras.layers.ELU())\n\n return result", "_____no_output_____" ], [ "def EncoderNN():\n down_stack_parent = [\n gen_downsample_parent(32,4,apply_batchnorm=True, apply_dropout=False),\n gen_downsample_parent(64,4,apply_batchnorm=True, apply_dropout=False)\n ]\n \n# down_stack_noise =[\n# # z = 4x4x64\n# gen_downsample_noise(64,4,apply_batchnorm=True), #8x8x64\n# gen_downsample_noise(32,4,apply_batchnorm=True) #16x16x32 \n# ]\n \n final_conv =[\n gen_upsample(32,4 ,apply_batchnorm = True)\n ]\n \n initializer = tf.random_normal_initializer(mean, sd_random_normal_init)\n last = tf.keras.layers.Conv2DTranspose(OUTPUT_CHANNELS, 4,\n strides=2,\n padding='same',\n kernel_initializer=initializer,\n activation='tanh')\n\n concat = tf.keras.layers.Concatenate()\n\n father = tf.keras.layers.Input(shape=(img_size,img_size,3))\n mother = tf.keras.layers.Input(shape=(img_size,img_size,3))\n\n \n \n x1 = father\n for down in down_stack_parent:\n x1 = down(x1)\n \n# print(x1.shape)\n \n x2 = mother\n for down in down_stack_parent:\n x2 = down(x2) \n \n# print(x2.shape)\n \n final = concat([x1,x2])\n# print(final.shape)\n final = final_conv[0](final)\n \n final = last(final)\n# print(final.shape)\n return tf.keras.Model(inputs=[father, mother], outputs=final)", "_____no_output_____" ], [ "encoder_optimizer = tf.keras.optimizers.Adam(learning_rate = lr, beta_1=b1)", "_____no_output_____" ], [ "def tensor_to_array(tensor1):\n return tensor1.numpy()", "_____no_output_____" ], [ "def train_encoder(father_batch, mother_batch, target_batch, b_size):\n with tf.GradientTape() as enc_tape:\n gen_outputs = encoder([father_batch, mother_batch], training=True)\n \n diff = tf.abs(target_batch - gen_outputs)\n flatten_diff = tf.reshape(diff, (b_size, img_size*img_size*3))\n \n encoder_loss_batch = tf.reduce_mean(flatten_diff, axis=1)\n encoder_loss = tf.reduce_mean(encoder_loss_batch)\n \n print(\"ENCODER_LOSS: \",tensor_to_array(encoder_loss))\n #calculate gradients\n encoder_gradients = enc_tape.gradient(encoder_loss,encoder.trainable_variables)\n\n #apply gradients on optimizer\n encoder_optimizer.apply_gradients(zip(encoder_gradients,encoder.trainable_variables))\n \n", "_____no_output_____" ], [ "def fit_encoder(train_ds, epochs, test_ds, batch):\n losses=np.array([])\n for epoch in range(epochs):\n print(\"______________________________EPOCH %d_______________________________\"%(epoch+1))\n start = time.time()\n for i in range(len(train_ds)//batch):\n batch_data = np.asarray(generate_batch(train_ds[i*batch:(i+1)*batch]))\n batch_data = batch_data / 255 * 2 -1\n \n \n 
print(\"Generated batch\", batch_data.shape)\n\n X_Father_train = tf.convert_to_tensor(batch_data[:,0],dtype =tf.float32)\n X_Mother_train = tf.convert_to_tensor(batch_data[:,1],dtype =tf.float32)\n Y_train = tf.convert_to_tensor(batch_data[:,3],dtype =tf.float32)\n \n train_encoder(X_Father_train, X_Mother_train, Y_train,batch)\n \n print(\"Trained for batch %d/%d\"%(i+1,(len(train_ds)//batch)))\n print(\"______________________________TRAINING COMPLETED_______________________________\")", "_____no_output_____" ], [ "train_dataset = all_families[:-100]\ntest_dataset = all_families[-100:]\nencoder = EncoderNN()\n\nwith tf.device('/gpu:0'):\n fit_encoder(train_dataset, EPOCHS, test_dataset,batch)", "______________________________EPOCH 1_______________________________\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.40341973\nTrained for batch 1/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3747058\nTrained for batch 2/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.4229235\nTrained for batch 3/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.4062442\nTrained for batch 4/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.37259284\nTrained for batch 5/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.40570754\nTrained for batch 6/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.43097487\nTrained for batch 7/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.37538522\nTrained for batch 8/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3661008\nTrained for batch 9/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.36800843\nTrained for batch 10/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3724699\nTrained for batch 11/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.43448296\nTrained for batch 12/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35709864\nTrained for batch 13/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3950496\nTrained for batch 14/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3688807\nTrained for batch 15/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3860542\nTrained for batch 16/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.44605786\nTrained for batch 17/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34664398\nTrained for batch 18/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.39454833\nTrained for batch 19/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.33471403\nTrained for batch 20/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3614666\nTrained for batch 21/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.40210095\nTrained for batch 22/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.42147255\nTrained for batch 23/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35217696\nTrained for batch 24/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.38657412\nTrained for batch 25/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.41433\nTrained for batch 26/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35280803\nTrained for batch 27/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.32902092\nTrained for batch 28/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.365082\nTrained for batch 29/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.36655417\nTrained for batch 30/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.40149218\nTrained for batch 31/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35119835\nTrained 
for batch 32/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.33154935\nTrained for batch 33/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.38874537\nTrained for batch 34/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3770574\nTrained for batch 35/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.37216097\nTrained for batch 36/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.30604583\nTrained for batch 37/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34849322\nTrained for batch 38/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34970385\nTrained for batch 39/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35399386\nTrained for batch 40/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.37763038\nTrained for batch 41/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3642118\nTrained for batch 42/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.32215607\nTrained for batch 43/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34084243\nTrained for batch 44/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3115418\nTrained for batch 45/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.405502\nTrained for batch 46/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.33786473\nTrained for batch 47/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34356505\nTrained for batch 48/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34688368\nTrained for batch 49/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35725087\nTrained for batch 50/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.40099317\nTrained for batch 51/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.31783652\nTrained for batch 52/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3588978\nTrained for batch 53/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3546448\nTrained for batch 54/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.38290676\nTrained for batch 55/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35456428\nTrained for batch 56/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.4008767\nTrained for batch 57/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.39036384\nTrained for batch 58/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3118421\nTrained for batch 59/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.40766144\nTrained for batch 60/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.33152086\nTrained for batch 61/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.318183\nTrained for batch 62/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3289036\nTrained for batch 63/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.30764475\nTrained for batch 64/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.2862015\nTrained for batch 65/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3151431\nTrained for batch 66/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35122877\nTrained for batch 67/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.38102263\nTrained for batch 68/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.31507578\nTrained for batch 69/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3562093\nTrained for batch 70/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34080186\nTrained for batch 71/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34523338\nTrained for batch 72/297\nGenerated batch (10, 4, 64, 64, 
3)\nENCODER_LOSS: 0.3719592\nTrained for batch 73/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35613218\nTrained for batch 74/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.36692142\nTrained for batch 75/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3916842\nTrained for batch 76/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.31589094\nTrained for batch 77/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3126195\nTrained for batch 78/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.36948365\nTrained for batch 79/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.40351337\nTrained for batch 80/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.33806187\nTrained for batch 81/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.31995544\nTrained for batch 82/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3503887\nTrained for batch 83/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3623436\nTrained for batch 84/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.35348386\nTrained for batch 85/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.39553097\nTrained for batch 86/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3288309\nTrained for batch 87/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.40606794\nTrained for batch 88/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.32926646\nTrained for batch 89/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34279853\nTrained for batch 90/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3104426\nTrained for batch 91/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.38569975\nTrained for batch 92/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.38053983\nTrained for batch 93/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.34726876\nTrained for batch 94/297\nGenerated batch (10, 4, 64, 64, 3)\nENCODER_LOSS: 0.3098844\nTrained for batch 95/297\n" ], [ "f_no = 1106\nfamily_data = generate_batch([all_families[f_no]])\ninp = [family_data[0][0],family_data[0][1]]\ninp = tf.cast(inp, tf.float32)\n\nfather_inp = inp[0][tf.newaxis,...]\nmother_inp = inp[1][tf.newaxis,...]\n\nwith tf.device('/cpu:0'):\n gen_output = encoder([father_inp, mother_inp], training=True)\ntemp = gen_output.numpy()\nplt.imshow(np.squeeze(temp))\n# print(temp)\nprint(np.amin(temp))\nprint(np.amax(temp))", "Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).\n" ], [ "target = family_data[0][3]\nplt.imshow(target)", "_____no_output_____" ] ], [ [ "###############################################################################################################################", "_____no_output_____" ] ], [ [ "def disc_downsample_parent_target(filters, size, apply_batchnorm=True):\n initializer = tf.random_normal_initializer(mean, std_dev) \n \n\n result = tf.keras.Sequential()\n result.add(\n tf.keras.layers.Conv2D(filters, size, strides=2, padding='same',\n kernel_initializer=initializer,\n use_bias=False))\n\n if apply_batchnorm:\n result.add(tf.keras.layers.BatchNormalization())\n \n result.add(tf.keras.layers.LeakyReLU(alpha = 0.2))\n\n return result", "_____no_output_____" ], [ "def disc_loss(filters, size,apply_batchnorm = False):\n initializer = tf.random_normal_initializer(mean, std_dev)\n\n result = tf.keras.Sequential()\n result.add(\n tf.keras.layers.Conv2D(filters, size, strides=2,\n padding='same',\n kernel_initializer=initializer,\n use_bias=False))\n 
if apply_batchnorm:\n result.add(tf.keras.layers.BatchNormalization())\n \n result.add(tf.keras.layers.LeakyReLU(alpha = 0.2))\n \n return result", "_____no_output_____" ], [ "def Discriminator():\n\n father = tf.keras.layers.Input(shape=(img_size,img_size,3))\n mother = tf.keras.layers.Input(shape=(img_size,img_size,3))\n target = tf.keras.layers.Input(shape=(img_size,img_size,3))\n \n down_stack_parent_target = [\n disc_downsample_parent_target(32,4,apply_batchnorm=False), #32x32x32\n disc_downsample_parent_target(64,4,apply_batchnorm=True) #16x16x64\n ]\n \n down_stack_combined =[\n disc_loss(192,4,apply_batchnorm=True),\n disc_loss(256,4,apply_batchnorm=False)\n ]\n \n initializer = tf.random_normal_initializer(mean, sd_random_normal_init)\n last = tf.keras.layers.Conv2D(1, 4, strides=1,padding='same',\n kernel_initializer=initializer) # linear layer\n \n \n concat = tf.keras.layers.Concatenate()\n x1 = father\n for down in down_stack_parent_target:\n x1 = down(x1)\n \n x2 = mother\n for down in down_stack_parent_target:\n x2 = down(x2)\n \n x3 = target\n for down in down_stack_parent_target:\n x3 = down(x3)\n \n combined = concat([x1,x2,x3])\n # combined is Batchx16x16x192\n \n x4 = combined\n for down in down_stack_combined:\n x4 = down(x4)\n# print(x4.shape)\n \n output = last(x4) #4X4 \n print(output.shape)\n\n return tf.keras.Model(inputs=[father,mother,target], outputs=output)", "_____no_output_____" ], [ "discriminator = Discriminator()", "(None, 4, 4, 1)\n" ], [ "# family_data = generate_image(all_families[126])\n# p1 = tf.cast(family_data[0], tf.float32)\n# p2 = tf.cast(family_data[1], tf.float32)\n# c = tf.cast(family_data[2], tf.float32)\n\n# discriminator = Discriminator()\n# with tf.device('/cpu:0'):\n# disc_out = discriminator(inputs = [p1,p2,c], training=True)", "_____no_output_____" ], [ "LAMBDA = 1", "_____no_output_____" ], [ "def tensor_to_array(tensor1):\n return tensor1.numpy()", "_____no_output_____" ], [ "def discriminator_loss(disc_real_output, disc_generated_output,b_size):\n real_loss_diff = tf.abs(tf.ones_like(disc_real_output) - disc_real_output)\n real_flatten_diff = tf.reshape(real_loss_diff, (b_size, 4*4*1))\n real_loss_batch = tf.reduce_mean(real_flatten_diff, axis=1)\n real_loss = tf.reduce_mean(real_loss_batch)\n \n gen_loss_diff = tf.abs(tf.zeros_like(disc_generated_output) - disc_generated_output)\n gen_flatten_diff = tf.reshape(gen_loss_diff, (b_size, 4*4*1))\n gen_loss_batch = tf.reduce_mean(gen_flatten_diff, axis=1)\n gen_loss = tf.reduce_mean(gen_loss_batch)\n\n total_disc_loss = real_loss + gen_loss\n return total_disc_loss", "_____no_output_____" ], [ "def generator_loss(disc_generated_output, gen_output, target,b_size):\n gen_loss_diff = tf.abs(tf.ones_like(disc_generated_output) - disc_generated_output)\n gen_flatten_diff = tf.reshape(gen_loss_diff, (b_size, 4*4*1))\n gen_loss_batch = tf.reduce_mean(gen_flatten_diff, axis=1)\n gen_loss = tf.reduce_mean(gen_loss_batch)\n \n l1_loss_diff = tf.abs(target - gen_output)\n l1_flatten_diff = tf.reshape(l1_loss_diff, (b_size, img_size*img_size*3))\n l1_loss_batch = tf.reduce_mean(l1_flatten_diff, axis=1)\n l1_loss = tf.reduce_mean(l1_loss_batch)\n \n total_gen_loss = gen_loss + LAMBDA * l1_loss \n# print(\"Reconstruction loss: {}, GAN loss: {}\".format(l1_loss, gen_loss))\n return total_gen_loss", "_____no_output_____" ], [ "generator_optimizer = tf.keras.optimizers.Adam(lr, beta_1=b1 ,beta_2 = b2)\ndiscriminator_optimizer = tf.keras.optimizers.Adam(lr, beta_1=b1, beta_2 = b2)", "_____no_output_____" 
], [ "def train_step(father_batch, mother_batch, target_batch,b_size):\n with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:\n \n gen_outputs = encoder([father_batch, mother_batch], training=True)\n# print(\"Generated outputs\",gen_outputs.shape)\n \n disc_real_output = discriminator([father_batch, mother_batch, target_batch], training=True)\n# print(\"disc_real_output \", disc_real_output.shape)\n \n disc_generated_output = discriminator([father_batch, mother_batch, gen_outputs], training=True)\n# print(\"disc_generated_output \", disc_generated_output.shape)\n \n gen_loss = generator_loss(disc_generated_output, gen_outputs, target_batch,b_size)\n disc_loss = discriminator_loss(disc_real_output, disc_generated_output,b_size)\n \n \n print(\"GEN_LOSS\",tensor_to_array(gen_loss))\n print(\"DISC_LOSS\",tensor_to_array(disc_loss))\n\n generator_gradients = gen_tape.gradient(gen_loss,encoder.trainable_variables)\n discriminator_gradients = disc_tape.gradient(disc_loss,discriminator.trainable_variables)\n\n generator_optimizer.apply_gradients(zip(generator_gradients,encoder.trainable_variables))\n discriminator_optimizer.apply_gradients(zip(discriminator_gradients,discriminator.trainable_variables))", "_____no_output_____" ], [ "def fit(train_ds, epochs, test_ds,batch):\n for epoch in range(epochs):\n print(\"______________________________EPOCH %d_______________________________\"%(epoch))\n start = time.time()\n for i in range(len(train_ds)//batch):\n batch_data = np.asarray(generate_batch(train_ds[i*batch:(i+1)*batch]))\n batch_data = batch_data / 255 * 2 -1\n \n print(\"Generated batch\", batch_data.shape)\n\n X_father_train = tf.convert_to_tensor(batch_data[:,0],dtype =tf.float32)\n X_mother_train = tf.convert_to_tensor(batch_data[:,1],dtype =tf.float32)\n# print(\"Xtrain\",X_train.shape)\n# print(\"Batch converted to tensor\")\n\n Y_train = tf.convert_to_tensor(batch_data[:,3],dtype =tf.float32)\n train_step(X_father_train, X_mother_train, Y_train, batch)\n print(\"Trained for batch %d/%d\"%(i+1,(len(train_ds)//batch)))\n \n# family_no = 400\n# family_data = generate_image(all_families[family_no][0], all_families[family_no][1], all_families[family_no][2])\n# inp = [family_data[0],family_data[1]]\n# inp = tf.cast(inp, tf.float32)\n# father_inp = inp[0][tf.newaxis,...]\n# mother_inp = inp[1][tf.newaxis,...]\n# gen_output = encoder([father_inp, mother_inp], training=True)\n# print(tf.reduce_min(gen_output))\n# print(tf.reduce_max(gen_output))\n# plt.figure()\n# plt.imshow(gen_output[0,...])\n# plt.show()\n \n print(\"______________________________TRAINING COMPLETED_______________________________\")\n checkpoint.save(file_prefix = checkpoint_prefix)", "_____no_output_____" ], [ "concat = tf.keras.layers.Concatenate()", "_____no_output_____" ], [ "train_dataset = all_families[:-10]\ntest_dataset = all_families[-10:]\nencoder = EncoderNN()\ndiscriminator = Discriminator()", "_____no_output_____" ], [ "img_size = 64\nmean = 0.\nstd_dev = 0.02\nlr = 0.0005\nb1 = 0.9\nb2 = 0.999\nsd_random_normal_init = 0.02\n\nEPOCHS = 5\nbatch = 25", "_____no_output_____" ], [ "checkpoint_dir = './checkpoint'\ncheckpoint_prefix = os.path.join(checkpoint_dir, \"ckpt\")\ncheckpoint = tf.train.Checkpoint(generator_optimizer=generator_optimizer,\n discriminator_optimizer=discriminator_optimizer,\n generator=encoder,\n discriminator=discriminator)", "_____no_output_____" ], [ "with tf.device('/gpu:0'):\n fit(train_dataset, EPOCHS, test_dataset,batch)", "______________________________EPOCH 
0_______________________________\nGenerated batch (25, 4, 64, 64, 3)\nGEN_LOSS 1.8714736\nDISC_LOSS 1.86838\nTrained for batch 1/40\n... [batches 2-39 omitted; losses fluctuate before settling near GEN_LOSS ~1.0 / DISC_LOSS ~0.9] ...\nGenerated batch (25, 4, 64, 64, 3)\nGEN_LOSS 1.1007025\nDISC_LOSS 0.86110675\nTrained for batch 40/40\n______________________________EPOCH 1_______________________________\nGenerated batch (25, 4, 64, 64, 3)\nGEN_LOSS 1.0465988\nDISC_LOSS 0.8305098\nTrained for batch 1/40\n... [batches 2-39 omitted; DISC_LOSS drifts down toward ~0.5] ...\nGenerated batch (25, 4, 64, 64, 3)\nGEN_LOSS 1.279568\nDISC_LOSS 0.5244153\nTrained for batch 40/40\n______________________________EPOCH 2_______________________________\nGenerated batch (25, 4, 64, 64, 3)\nGEN_LOSS 1.3625059\nDISC_LOSS 0.49883476\nTrained for batch 1/40\nGenerated batch (25, 4, 64, 64, 3)\nGEN_LOSS 1.4587033\nDISC_LOSS 0.4600417\nTrained for batch 2/40\nGenerated batch (25, 4, 64, 64, 3)\n" ], [ "family_no = 1011\nfamily_data = generate_image(all_families[family_no][0], all_families[family_no][1], all_families[family_no][2])\ninp = [family_data[0],family_data[1]]\ninp = tf.cast(inp, tf.float32)\nfather_inp = inp[0][tf.newaxis,...]\nmother_inp = inp[1][tf.newaxis,...]\nwith tf.device('/gpu:0'):\n gen_output = encoder([father_inp, mother_inp], training=True)\ntemp = gen_output.numpy()\nplt.imshow(np.squeeze(temp))\nprint(np.amin(temp))\nprint(np.amax(temp))", "Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).\n" ], [ "family_no = 1011\nfamily_data = generate_image(all_families[family_no][0], all_families[family_no][1], all_families[family_no][2])\ninp = [family_data[0],family_data[1]]\ninp = tf.cast(inp, tf.float32)\nfather_inp = inp[0][tf.newaxis,...]\nmother_inp = inp[1][tf.newaxis,...]\nwith tf.device('/gpu:0'):\n gen_output = encoder([father_inp, mother_inp], training=True)\ntemp = gen_output.numpy()\nplt.imshow(np.squeeze(temp))\nprint(np.amin(temp))\nprint(np.amax(temp))", "Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).\n" ] ] ]
[ "code", "raw", "code", "markdown", "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code" ], [ "raw" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
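The record above computes its adversarial and reconstruction terms by reshaping each tensor, averaging per example, and then averaging over the batch; since every reshaped row has the same number of elements, this reduces to a single global mean. Below is a minimal self-contained sketch of the two objectives, assuming TensorFlow 2.x; the 64x64x3 image and 4x4x1 patch-score shapes follow the notebook, and the random tensors are illustrative placeholders only.

```python
# Minimal sketch of the notebook's generator/discriminator objectives,
# assuming TensorFlow 2.x. The random tensors stand in for real data.
import tensorflow as tf

LAMBDA = 1  # weight on the L1 reconstruction term, as in the notebook

def generator_loss(disc_generated_output, gen_output, target):
    # adversarial term: push patch scores for generated images toward 1
    gan_loss = tf.reduce_mean(tf.abs(1.0 - disc_generated_output))
    # reconstruction term: mean absolute error against the real child image
    l1_loss = tf.reduce_mean(tf.abs(target - gen_output))
    return gan_loss + LAMBDA * l1_loss

def discriminator_loss(disc_real_output, disc_generated_output):
    # real patches are scored toward 1, generated patches toward 0
    real_loss = tf.reduce_mean(tf.abs(1.0 - disc_real_output))
    fake_loss = tf.reduce_mean(tf.abs(disc_generated_output))
    return real_loss + fake_loss

# smoke test with the notebook's shapes (batch of 25, 4x4x1 patch scores)
b = 25
scores_real = tf.random.uniform((b, 4, 4, 1))
scores_fake = tf.random.uniform((b, 4, 4, 1))
child_fake = tf.random.uniform((b, 64, 64, 3), minval=-1.0, maxval=1.0)
child_real = tf.random.uniform((b, 64, 64, 3), minval=-1.0, maxval=1.0)
print(float(generator_loss(scores_fake, child_fake, child_real)))
print(float(discriminator_loss(scores_real, scores_fake)))
```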
d0272bb5933732522f97c6c48377049a2ddb4905
1,712
ipynb
Jupyter Notebook
docs/_src/3.Theory/04.graph-simplification.ipynb
snystrom/Mycelia
67a978ec2de3c53fced46b98adbdfa2c4ca82889
[ "MIT" ]
null
null
null
docs/_src/3.Theory/04.graph-simplification.ipynb
snystrom/Mycelia
67a978ec2de3c53fced46b98adbdfa2c4ca82889
[ "MIT" ]
27
2021-06-24T17:53:36.000Z
2022-03-05T19:26:01.000Z
docs/_src/3.Theory/04.graph-simplification.ipynb
snystrom/Mycelia
67a978ec2de3c53fced46b98adbdfa2c4ca82889
[ "MIT" ]
1
2022-01-08T14:45:20.000Z
2022-01-08T14:45:20.000Z
34.24
88
0.663551
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d0272de970b9aff6a278cbf226ebe388310bd87f
10,988
ipynb
Jupyter Notebook
aws_marketplace/using_algorithms/amazon_demo_product/Using_Algorithm_Arn_From_AWS_Marketplace.ipynb
Amirosimani/amazon-sagemaker-examples
bc35e7a9da9e2258e77f98098254c2a8e308041a
[ "Apache-2.0" ]
2,610
2020-10-01T14:14:53.000Z
2022-03-31T18:02:31.000Z
aws_marketplace/using_algorithms/amazon_demo_product/Using_Algorithm_Arn_From_AWS_Marketplace.ipynb
Amirosimani/amazon-sagemaker-examples
bc35e7a9da9e2258e77f98098254c2a8e308041a
[ "Apache-2.0" ]
1,959
2020-09-30T20:22:42.000Z
2022-03-31T23:58:37.000Z
aws_marketplace/using_algorithms/amazon_demo_product/Using_Algorithm_Arn_From_AWS_Marketplace.ipynb
Amirosimani/amazon-sagemaker-examples
bc35e7a9da9e2258e77f98098254c2a8e308041a
[ "Apache-2.0" ]
2,052
2020-09-30T22:11:46.000Z
2022-03-31T23:02:51.000Z
30.269972
327
0.610666
[ [ [ "# AWS Marketplace Product Usage Demonstration - Algorithms\n\n## Using Algorithm ARN with Amazon SageMaker APIs\n\nThis sample notebook demonstrates two new functionalities added to Amazon SageMaker:\n1. Using an Algorithm ARN to run training jobs and use that result for inference\n2. Using an AWS Marketplace product ARN - we will use [Scikit Decision Trees](https://aws.amazon.com/marketplace/pp/prodview-ha4f3kqugba3u?qid=1543169069960&sr=0-1&ref_=srh_res_product_title)\n\n## Overall flow diagram\n<img src=\"images/AlgorithmE2EFlow.jpg\">\n\n## Compatibility\nThis notebook is compatible only with [Scikit Decision Trees](https://aws.amazon.com/marketplace/pp/prodview-ha4f3kqugba3u?qid=1543169069960&sr=0-1&ref_=srh_res_product_title) sample algorithm published to AWS Marketplace. \n\n***Pre-Requisite:*** Please subscribe to this free product before proceeding with this notebook", "_____no_output_____" ], [ "## Set up the environment", "_____no_output_____" ] ], [ [ "import sagemaker as sage\nfrom sagemaker import get_execution_role\n\nrole = get_execution_role()\n\n# S3 prefixes\ncommon_prefix = \"DEMO-scikit-byo-iris\"\ntraining_input_prefix = common_prefix + \"/training-input-data\"\nbatch_inference_input_prefix = common_prefix + \"/batch-inference-input-data\"", "_____no_output_____" ] ], [ [ "### Create the session\n\nThe session remembers our connection parameters to Amazon SageMaker. We'll use it to perform all of our Amazon SageMaker operations.", "_____no_output_____" ] ], [ [ "sagemaker_session = sage.Session()", "_____no_output_____" ] ], [ [ "## Upload the data for training\n\nWhen training large models with huge amounts of data, you'll typically use big data tools, like Amazon Athena, AWS Glue, or Amazon EMR, to create your data in S3. For the purposes of this example, we're using some the classic [Iris dataset](https://en.wikipedia.org/wiki/Iris_flower_data_set), which we have included. \n\nWe can use use the tools provided by the Amazon SageMaker Python SDK to upload the data to a default bucket. ", "_____no_output_____" ] ], [ [ "TRAINING_WORKDIR = \"data/training\"\n\ntraining_input = sagemaker_session.upload_data(TRAINING_WORKDIR, key_prefix=training_input_prefix)\nprint(\"Training Data Location \" + training_input)", "_____no_output_____" ] ], [ [ "## Creating Training Job using Algorithm ARN\n\nPlease put in the algorithm arn you want to use below. 
This can either be an AWS Marketplace algorithm you subscribed to, or one of the algorithms you created in your own account.\n\nThe algorithm ARN listed below belongs to the [Scikit Decision Trees](https://aws.amazon.com/marketplace/pp/prodview-ha4f3kqugba3u?qid=1543169069960&sr=0-1&ref_=srh_res_product_title) product.", "_____no_output_____" ] ], [ [ "from src.scikit_product_arns import ScikitArnProvider\n\nalgorithm_arn = ScikitArnProvider.get_algorithm_arn(sagemaker_session.boto_region_name)", "_____no_output_____" ], [ "import json\nimport time\nfrom sagemaker.algorithm import AlgorithmEstimator\n\nalgo = AlgorithmEstimator(\n    algorithm_arn=algorithm_arn,\n    role=role,\n    train_instance_count=1,\n    train_instance_type=\"ml.c4.xlarge\",\n    base_job_name=\"scikit-from-aws-marketplace\",\n)", "_____no_output_____" ] ], [ [ "## Run Training Job", "_____no_output_____" ] ], [ [ "print(\n    \"Now run the training job using algorithm arn %s in region %s\"\n    % (algorithm_arn, sagemaker_session.boto_region_name)\n)\nalgo.fit({\"training\": training_input})", "_____no_output_____" ] ], [ [ "## Automated Model Tuning (optional)\n\nSince this algorithm supports tunable hyperparameters with a tuning objective metric, we can run a Hyperparameter Tuning Job to obtain the best training job hyperparameters and its corresponding model artifacts. \n\n<img src=\"images/HPOFlow.jpg\">\n", "_____no_output_____" ] ], [ [ "from sagemaker.tuner import HyperparameterTuner, IntegerParameter\n\n## This demo algorithm supports max_leaf_nodes as the only tunable hyperparameter.\nhyperparameter_ranges = {\"max_leaf_nodes\": IntegerParameter(1, 100000)}\n\ntuner = HyperparameterTuner(\n    estimator=algo,\n    base_tuning_job_name=\"some-name\",\n    objective_metric_name=\"validation:accuracy\",\n    hyperparameter_ranges=hyperparameter_ranges,\n    max_jobs=2,\n    max_parallel_jobs=2,\n)\n\ntuner.fit({\"training\": training_input}, include_cls_metadata=False)\ntuner.wait()", "_____no_output_____" ] ], [ [ "## Batch Transform Job\n\nNow let's use the model we built to run a batch inference job and verify it works.\n\n### Batch Transform Input Preparation\n\nThe snippet below removes the \"label\" column (column indexed at 0) and retains the rest as batch transform's input. \n\n***NOTE:*** This is the same training data, which is a no-no from an ML science perspective. 
But the aim of this notebook is to demonstrate how things work end-to-end.", "_____no_output_____" ] ], [ [ "import pandas as pd\n\n## Remove first column that contains the label\nshape = pd.read_csv(TRAINING_WORKDIR + \"/iris.csv\", header=None).drop([0], axis=1)\n\nTRANSFORM_WORKDIR = \"data/transform\"\nshape.to_csv(TRANSFORM_WORKDIR + \"/batchtransform_test.csv\", index=False, header=False)\n\ntransform_input = (\n sagemaker_session.upload_data(TRANSFORM_WORKDIR, key_prefix=batch_inference_input_prefix)\n + \"/batchtransform_test.csv\"\n)\nprint(\"Transform input uploaded to \" + transform_input)", "_____no_output_____" ], [ "transformer = algo.transformer(1, \"ml.m4.xlarge\")\ntransformer.transform(transform_input, content_type=\"text/csv\")\ntransformer.wait()\n\nprint(\"Batch Transform output saved to \" + transformer.output_path)", "_____no_output_____" ] ], [ [ "#### Inspect the Batch Transform Output in S3", "_____no_output_____" ] ], [ [ "from urllib.parse import urlparse\n\nparsed_url = urlparse(transformer.output_path)\nbucket_name = parsed_url.netloc\nfile_key = \"{}/{}.out\".format(parsed_url.path[1:], \"batchtransform_test.csv\")\n\ns3_client = sagemaker_session.boto_session.client(\"s3\")\n\nresponse = s3_client.get_object(Bucket=sagemaker_session.default_bucket(), Key=file_key)\nresponse_bytes = response[\"Body\"].read().decode(\"utf-8\")\nprint(response_bytes)", "_____no_output_____" ] ], [ [ "## Live Inference Endpoint\n\nFinally, we demonstrate the creation of an endpoint for live inference using this AWS Marketplace algorithm generated model\n", "_____no_output_____" ] ], [ [ "from sagemaker.predictor import csv_serializer\n\npredictor = algo.deploy(1, \"ml.m4.xlarge\", serializer=csv_serializer)", "_____no_output_____" ] ], [ [ "### Choose some data and use it for a prediction\n\nIn order to do some predictions, we'll extract some of the data we used for training and do predictions against it. This is, of course, bad statistical practice, but a good way to see how the mechanism works.\n", "_____no_output_____" ] ], [ [ "shape = pd.read_csv(TRAINING_WORKDIR + \"/iris.csv\", header=None)\n\nimport itertools\n\na = [50 * i for i in range(3)]\nb = [40 + i for i in range(10)]\nindices = [i + j for i, j in itertools.product(a, b)]\n\ntest_data = shape.iloc[indices[:-1]]\ntest_X = test_data.iloc[:, 1:]\ntest_y = test_data.iloc[:, 0]", "_____no_output_____" ] ], [ [ "Prediction is as easy as calling predict with the predictor we got back from deploy and the data we want to do predictions with. The serializers take care of doing the data conversions for us.", "_____no_output_____" ] ], [ [ "print(predictor.predict(test_X.values).decode(\"utf-8\"))", "_____no_output_____" ] ], [ [ "### Cleanup the endpoint", "_____no_output_____" ] ], [ [ "algo.delete_endpoint()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
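The notebook in the record above is written against version 1 of the SageMaker Python SDK (`train_instance_count`, `csv_serializer`). The sketch below shows what the likely SDK v2 equivalents of its estimator, deploy, and cleanup calls look like; it assumes the notebook's `role`, `algorithm_arn`, and `training_input` are already in scope, and the instance types are carried over unchanged rather than re-verified.

```python
# Hedged SDK v2 sketch; `role`, `algorithm_arn`, and `training_input`
# come from the notebook's earlier cells and are not redefined here.
from sagemaker.algorithm import AlgorithmEstimator
from sagemaker.serializers import CSVSerializer

algo = AlgorithmEstimator(
    algorithm_arn=algorithm_arn,
    role=role,
    instance_count=1,               # SDK v2 rename of train_instance_count
    instance_type="ml.c4.xlarge",   # SDK v2 rename of train_instance_type
    base_job_name="scikit-from-aws-marketplace",
)
algo.fit({"training": training_input})

# SDK v2 deploy takes a serializer object in place of the old
# `from sagemaker.predictor import csv_serializer` import.
predictor = algo.deploy(
    initial_instance_count=1,
    instance_type="ml.m4.xlarge",
    serializer=CSVSerializer(),
)
predictor.delete_endpoint()  # endpoint cleanup moves to the predictor in v2
```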
d0273109250ac5476a0af13f6e062b70f7c14afa
18,731
ipynb
Jupyter Notebook
examples/notebooks/statespace_structural_harvey_jaeger.ipynb
yarikoptic/statsmodels
f990cb1a1ef0c9883c9394444e6f9d027efabec6
[ "BSD-3-Clause" ]
null
null
null
examples/notebooks/statespace_structural_harvey_jaeger.ipynb
yarikoptic/statsmodels
f990cb1a1ef0c9883c9394444e6f9d027efabec6
[ "BSD-3-Clause" ]
null
null
null
examples/notebooks/statespace_structural_harvey_jaeger.ipynb
yarikoptic/statsmodels
f990cb1a1ef0c9883c9394444e6f9d027efabec6
[ "BSD-3-Clause" ]
null
null
null
43.158986
343
0.590305
[ [ [ "# Detrending, Stylized Facts and the Business Cycle\n\nIn an influential article, Harvey and Jaeger (1993) described the use of unobserved components models (also known as \"structural time series models\") to derive stylized facts of the business cycle.\n\nTheir paper begins:\n\n \"Establishing the 'stylized facts' associated with a set of time series is widely considered a crucial step\n in macroeconomic research ... For such facts to be useful they should (1) be consistent with the stochastic\n properties of the data and (2) present meaningful information.\"\n \nIn particular, they make the argument that these goals are often better met using the unobserved components approach rather than the popular Hodrick-Prescott filter or Box-Jenkins ARIMA modeling techniques.\n\nStatsmodels has the ability to perform all three types of analysis, and below we follow the steps of their paper, using a slightly updated dataset.", "_____no_output_____" ] ], [ [ "%matplotlib inline\n\nimport numpy as np\nimport pandas as pd\nimport statsmodels.api as sm\nimport matplotlib.pyplot as plt\n\nfrom IPython.display import display, Latex", "_____no_output_____" ] ], [ [ "## Unobserved Components\n\nThe unobserved components model available in Statsmodels can be written as:\n\n$$\ny_t = \\underbrace{\\mu_{t}}_{\\text{trend}} + \\underbrace{\\gamma_{t}}_{\\text{seasonal}} + \\underbrace{c_{t}}_{\\text{cycle}} + \\sum_{j=1}^k \\underbrace{\\beta_j x_{jt}}_{\\text{explanatory}} + \\underbrace{\\varepsilon_t}_{\\text{irregular}}\n$$\n\nsee Durbin and Koopman 2012, Chapter 3 for notation and additional details. Notice that different specifications for the different individual components can support a wide range of models. The specific models considered in the paper and below are specializations of this general equation.\n\n### Trend\n\nThe trend component is a dynamic extension of a regression model that includes an intercept and linear time-trend.\n\n$$\n\\begin{align}\n\\underbrace{\\mu_{t+1}}_{\\text{level}} & = \\mu_t + \\nu_t + \\eta_{t+1} \\qquad & \\eta_{t+1} \\sim N(0, \\sigma_\\eta^2) \\\\\\\\\n\\underbrace{\\nu_{t+1}}_{\\text{trend}} & = \\nu_t + \\zeta_{t+1} & \\zeta_{t+1} \\sim N(0, \\sigma_\\zeta^2) \\\\\n\\end{align}\n$$\n\nwhere the level is a generalization of the intercept term that can dynamically vary across time, and the trend is a generalization of the time-trend such that the slope can dynamically vary across time.\n\nFor both elements (level and trend), we can consider models in which:\n\n- The element is included vs excluded (if the trend is included, there must also be a level included).\n- The element is deterministic vs stochastic (i.e. 
whether or not the variance on the error term is confined to be zero)\n\nThe only additional parameters to be estimated via MLE are the variances of any included stochastic components.\n\nThis leads to the following specifications:\n\n|                                                                      | Level | Trend | Stochastic Level | Stochastic Trend |\n|----------------------------------------------------------------------|-------|-------|------------------|------------------|\n| Constant                                                             | ✓     |       |                  |                  |\n| Local Level <br /> (random walk)                                     | ✓     |       | ✓                |                  |\n| Deterministic trend                                                  | ✓     | ✓     |                  |                  |\n| Local level with deterministic trend <br /> (random walk with drift) | ✓     | ✓     | ✓                |                  |\n| Local linear trend                                                   | ✓     | ✓     | ✓                | ✓                |\n| Smooth trend <br /> (integrated random walk)                         | ✓     | ✓     |                  | ✓                |\n\n### Seasonal\n\nThe seasonal component is written as:\n\n<span>$$\n\\gamma_t = - \\sum_{j=1}^{s-1} \\gamma_{t+1-j} + \\omega_t \\qquad \\omega_t \\sim N(0, \\sigma_\\omega^2)\n$$</span>\n\nThe periodicity (number of seasons) is `s`, and the defining character is that (without the error term), the seasonal components sum to zero across one complete cycle. The inclusion of an error term allows the seasonal effects to vary over time.\n\nThe variants of this model are:\n\n- The periodicity `s`\n- Whether or not to make the seasonal effects stochastic.\n\nIf the seasonal effect is stochastic, then there is one additional parameter to estimate via MLE (the variance of the error term).\n\n### Cycle\n\nThe cyclical component is intended to capture cyclical effects at time frames much longer than captured by the seasonal component. For example, in economics the cyclical term is often intended to capture the business cycle, and is then expected to have a period between \"1.5 and 12 years\" (see Durbin and Koopman).\n\nThe cycle is written as:\n\n<span>$$\n\\begin{align}\nc_{t+1} & = c_t \\cos \\lambda_c + c_t^* \\sin \\lambda_c + \\tilde \\omega_t \\qquad & \\tilde \\omega_t \\sim N(0, \\sigma_{\\tilde \\omega}^2) \\\\\\\\\nc_{t+1}^* & = -c_t \\sin \\lambda_c + c_t^* \\cos \\lambda_c + \\tilde \\omega_t^* & \\tilde \\omega_t^* \\sim N(0, \\sigma_{\\tilde \\omega}^2)\n\\end{align}\n$$</span>\n\nThe parameter $\\lambda_c$ (the frequency of the cycle) is an additional parameter to be estimated by MLE. If the cycle is stochastic, then there is one additional parameter to estimate (the variance of the error term - note that both of the error terms here share the same variance, but are assumed to have independent draws).\n\n### Irregular\n\nThe irregular component is assumed to be a white noise error term. 
Its variance is a parameter to be estimated by MLE; i.e.\n\n$$\n\\varepsilon_t \\sim N(0, \\sigma_\\varepsilon^2)\n$$\n\nIn some cases, we may want to generalize the irregular component to allow for autoregressive effects:\n\n$$\n\\varepsilon_t = \\rho(L) \\varepsilon_{t-1} + \\epsilon_t, \\qquad \\epsilon_t \\sim N(0, \\sigma_\\epsilon^2)\n$$\n\nIn this case, the autoregressive parameters would also be estimated via MLE.\n\n### Regression effects\n\nWe may want to allow for explanatory variables by including additional terms\n\n<span>$$\n\\sum_{j=1}^k \\beta_j x_{jt}\n$$</span>\n\nor for intervention effects by including\n\n<span>$$\n\\begin{align}\n\\delta w_t \\qquad \\text{where} \\qquad w_t & = 0, \\qquad t < \\tau, \\\\\\\\\n& = 1, \\qquad t \\ge \\tau\n\\end{align}\n$$</span>\n\nThese additional parameters could be estimated via MLE or by including them as components of the state space formulation.\n", "_____no_output_____" ], [ "## Data\n\nFollowing Harvey and Jaeger, we will consider the following time series:\n\n- US real GNP, \"output\", ([GNPC96](https://research.stlouisfed.org/fred2/series/GNPC96))\n- US GNP implicit price deflator, \"prices\", ([GNPDEF](https://research.stlouisfed.org/fred2/series/GNPDEF))\n- US monetary base, \"money\", ([AMBSL](https://research.stlouisfed.org/fred2/series/AMBSL))\n\nThe time frame in the original paper varied across series, but was broadly 1954-1989. Below we use data from the period 1948-2008 for all series. Although the unobserved components approach allows isolating a seasonal component within the model, the series considered in the paper, and here, are already seasonally adjusted.\n\nAll data series considered here are taken from [Federal Reserve Economic Data (FRED)](https://research.stlouisfed.org/fred2/). Conveniently, the Python library [Pandas](http://pandas.pydata.org/) has the ability to download data from FRED directly.", "_____no_output_____" ] ], [ [ "# Datasets\nfrom pandas.io.data import DataReader\n\n# Get the raw data\nstart = '1948-01'\nend = '2008-01'\nus_gnp = DataReader('GNPC96', 'fred', start=start, end=end)\nus_gnp_deflator = DataReader('GNPDEF', 'fred', start=start, end=end)\nus_monetary_base = DataReader('AMBSL', 'fred', start=start, end=end).resample('QS')\nrecessions = DataReader('USRECQ', 'fred', start=start, end=end).resample('QS', how='last').values[:,0]\n\n# Construct the dataframe\ndta = pd.concat(map(np.log, (us_gnp, us_gnp_deflator, us_monetary_base)), axis=1)\ndta.columns = ['US GNP','US Prices','US monetary base']\ndates = dta.index._mpl_repr()", "_____no_output_____" ] ], [ [ "To get a sense of these three variables over the timeframe, we can plot them:", "_____no_output_____" ] ], [ [ "# Plot the data\nax = dta.plot(figsize=(13,3))\nylim = ax.get_ylim()\nax.xaxis.grid()\nax.fill_between(dates, ylim[0]+1e-5, ylim[1]-1e-5, recessions, facecolor='k', alpha=0.1);", "_____no_output_____" ] ], [ [ "## Model\n\nSince the data is already seasonally adjusted and there are no obvious explanatory variables, the generic model considered is:\n\n$$\ny_t = \\underbrace{\\mu_{t}}_{\\text{trend}} + \\underbrace{c_{t}}_{\\text{cycle}} + \\underbrace{\\varepsilon_t}_{\\text{irregular}}\n$$\n\nThe irregular will be assumed to be white noise, and the cycle will be stochastic and damped. The final modeling choice is the specification to use for the trend component. Harvey and Jaeger consider two models:\n\n1. Local linear trend (the \"unrestricted\" model)\n2. 
Smooth trend (the \"restricted\" model, since we are forcing $\\sigma_\\eta = 0$)\n\nBelow, we construct `kwargs` dictionaries for each of these model types. Notice that there are two ways to specify the models. One way is to specify components directly, as in the table above. The other way is to use string names which map to various specifications.", "_____no_output_____" ] ], [ [ "# Model specifications\n\n# Unrestricted model, using string specification\nunrestricted_model = {\n    'level': 'local linear trend', 'cycle': True, 'damped_cycle': True, 'stochastic_cycle': True\n}\n\n# Unrestricted model, setting components directly\n# This is an equivalent, but less convenient, way to specify a\n# local linear trend model with a stochastic damped cycle:\n# unrestricted_model = {\n#     'irregular': True, 'level': True, 'stochastic_level': True, 'trend': True, 'stochastic_trend': True,\n#     'cycle': True, 'damped_cycle': True, 'stochastic_cycle': True\n# }\n\n# The restricted model forces a smooth trend\nrestricted_model = {\n    'level': 'smooth trend', 'cycle': True, 'damped_cycle': True, 'stochastic_cycle': True\n}\n\n# Restricted model, setting components directly\n# This is an equivalent, but less convenient, way to specify a\n# smooth trend model with a stochastic damped cycle. Notice\n# that the difference from the local linear trend model is that\n# `stochastic_level=False` here.\n# restricted_model = {\n#     'irregular': True, 'level': True, 'stochastic_level': False, 'trend': True, 'stochastic_trend': True,\n#     'cycle': True, 'damped_cycle': True, 'stochastic_cycle': True\n# }", "_____no_output_____" ] ], [ [ "We now fit the following models:\n\n1. Output, unrestricted model\n2. Prices, unrestricted model\n3. Prices, restricted model\n4. Money, unrestricted model\n5. Money, restricted model", "_____no_output_____" ] ], [ [ "# Output\noutput_mod = sm.tsa.UnobservedComponents(dta['US GNP'], **unrestricted_model)\noutput_res = output_mod.fit(method='powell', disp=False)\n\n# Prices\nprices_mod = sm.tsa.UnobservedComponents(dta['US Prices'], **unrestricted_model)\nprices_res = prices_mod.fit(method='powell', disp=False)\n\nprices_restricted_mod = sm.tsa.UnobservedComponents(dta['US Prices'], **restricted_model)\nprices_restricted_res = prices_restricted_mod.fit(method='powell', disp=False)\n\n# Money\nmoney_mod = sm.tsa.UnobservedComponents(dta['US monetary base'], **unrestricted_model)\nmoney_res = money_mod.fit(method='powell', disp=False)\n\nmoney_restricted_mod = sm.tsa.UnobservedComponents(dta['US monetary base'], **restricted_model)\nmoney_restricted_res = money_restricted_mod.fit(method='powell', disp=False)", "_____no_output_____" ] ], [ [ "Once we have fit these models, there are a variety of ways to display the information. Looking at the model of US GNP, we can summarize the fit of the model using the `summary` method on the fit object.", "_____no_output_____" ] ], [ [ "print(output_res.summary())", "_____no_output_____" ] ], [ [ "For unobserved components models, and in particular when exploring stylized facts in line with point (2) from the introduction, it is often more instructive to plot the estimated unobserved components (e.g. 
the level, trend, and cycle) themselves to see if they provide a meaningful description of the data.\n\nThe `plot_components` method of the fit object can be used to show plots and confidence intervals of each of the estimated states, as well as a plot of the observed data versus the one-step-ahead predictions of the model to assess fit.", "_____no_output_____" ] ], [ [ "fig = output_res.plot_components(legend_loc='lower right', figsize=(15, 9));", "_____no_output_____" ] ], [ [ "Finally, Harvey and Jaeger summarize the models in another way to highlight the relative importances of the trend and cyclical components; below we replicate their Table I. The values we find are broadly consistent with, but different in the particulars from, the values from their table.", "_____no_output_____" ] ], [ [ "# Create Table I\ntable_i = np.zeros((5,6))\n\nstart = dta.index[0]\nend = dta.index[-1]\ntime_range = '%d:%d-%d:%d' % (start.year, start.quarter, end.year, end.quarter)\nmodels = [\n ('US GNP', time_range, 'None'),\n ('US Prices', time_range, 'None'),\n ('US Prices', time_range, r'$\\sigma_\\eta^2 = 0$'),\n ('US monetary base', time_range, 'None'),\n ('US monetary base', time_range, r'$\\sigma_\\eta^2 = 0$'),\n]\nindex = pd.MultiIndex.from_tuples(models, names=['Series', 'Time range', 'Restrictions'])\nparameter_symbols = [\n r'$\\sigma_\\zeta^2$', r'$\\sigma_\\eta^2$', r'$\\sigma_\\kappa^2$', r'$\\rho$',\n r'$2 \\pi / \\lambda_c$', r'$\\sigma_\\varepsilon^2$',\n]\n\ni = 0\nfor res in (output_res, prices_res, prices_restricted_res, money_res, money_restricted_res):\n if res.model.stochastic_level:\n (sigma_irregular, sigma_level, sigma_trend,\n sigma_cycle, frequency_cycle, damping_cycle) = res.params\n else:\n (sigma_irregular, sigma_level,\n sigma_cycle, frequency_cycle, damping_cycle) = res.params\n sigma_trend = np.nan\n period_cycle = 2 * np.pi / frequency_cycle\n \n table_i[i, :] = [\n sigma_level*1e7, sigma_trend*1e7,\n sigma_cycle*1e7, damping_cycle, period_cycle,\n sigma_irregular*1e7\n ]\n i += 1\n \npd.set_option('float_format', lambda x: '%.4g' % np.round(x, 2) if not np.isnan(x) else '-')\ntable_i = pd.DataFrame(table_i, index=index, columns=parameter_symbols)\ntable_i", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
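One portability note on the record above: `pandas.io.data` was later removed from pandas, and its `DataReader` now lives in the separate `pandas-datareader` package; current pandas also requires an explicit aggregation on `resample()`. Below is a minimal, hedged modernization of the data loading and the unrestricted fit; the FRED series code and model kwargs are the ones used in the notebook, and network access is assumed.

```python
# Hedged modernization of the record's data loading and model fit.
# Assumes the pandas-datareader package is installed.
import numpy as np
import statsmodels.api as sm
from pandas_datareader.data import DataReader

us_gnp = DataReader('GNPC96', 'fred', start='1948-01', end='2008-01')
log_gnp = np.log(us_gnp['GNPC96'])
# a monthly series like AMBSL would today need e.g. .resample('QS').mean()

# unrestricted specification from the notebook: local linear trend
# plus a stochastic, damped cycle
unrestricted_model = {
    'level': 'local linear trend',
    'cycle': True, 'damped_cycle': True, 'stochastic_cycle': True,
}
mod = sm.tsa.UnobservedComponents(log_gnp, **unrestricted_model)
res = mod.fit(method='powell', disp=False)
print(res.summary())
```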
d0274cae8108a3c472d32c456c56908aa7f0a963
40,207
ipynb
Jupyter Notebook
01.Neural-networks-Deep-learning/Week2/Logistic Regression as a Neural Network/Logistic Regression with a Neural Network mindset v3.ipynb
navicester/deeplearning.ai-Assignments
fb3bdbf1b436b4399b16961782f5f2baa0274c87
[ "MIT" ]
2
2017-05-18T06:22:35.000Z
2017-05-18T07:04:19.000Z
01.Neural-networks-Deep-learning/Week2/Logistic Regression as a Neural Network/Logistic Regression with a Neural Network mindset v3.ipynb
navicester/deeplearning.ai-Assignments
fb3bdbf1b436b4399b16961782f5f2baa0274c87
[ "MIT" ]
null
null
null
01.Neural-networks-Deep-learning/Week2/Logistic Regression as a Neural Network/Logistic Regression with a Neural Network mindset v3.ipynb
navicester/deeplearning.ai-Assignments
fb3bdbf1b436b4399b16961782f5f2baa0274c87
[ "MIT" ]
2
2018-05-24T19:31:14.000Z
2018-06-08T04:33:52.000Z
35.487202
434
0.545427
[ [ [ "# Logistic Regression with a Neural Network mindset\n\nWelcome to your first (required) programming assignment! You will build a logistic regression classifier to recognize cats. This assignment will step you through how to do this with a Neural Network mindset, and so will also hone your intuitions about deep learning.\n\n**Instructions:**\n- Do not use loops (for/while) in your code, unless the instructions explicitly ask you to do so.\n\n**You will learn to:**\n- Build the general architecture of a learning algorithm, including:\n - Initializing parameters\n - Calculating the cost function and its gradient\n - Using an optimization algorithm (gradient descent) \n- Gather all three functions above into a main model function, in the right order.", "_____no_output_____" ], [ "## 1 - Packages ##\n\nFirst, let's run the cell below to import all the packages that you will need during this assignment. \n- [numpy](www.numpy.org) is the fundamental package for scientific computing with Python.\n- [h5py](http://www.h5py.org) is a common package to interact with a dataset that is stored on an H5 file.\n- [matplotlib](http://matplotlib.org) is a famous library to plot graphs in Python.\n- [PIL](http://www.pythonware.com/products/pil/) and [scipy](https://www.scipy.org/) are used here to test your model with your own picture at the end.", "_____no_output_____" ] ], [ [ "import numpy as np\nimport matplotlib.pyplot as plt\nimport h5py\nimport scipy\nfrom PIL import Image\nfrom scipy import ndimage\nfrom lr_utils import load_dataset\n\n%matplotlib inline", "_____no_output_____" ] ], [ [ "## 2 - Overview of the Problem set ##\n\n**Problem Statement**: You are given a dataset (\"data.h5\") containing:\n - a training set of m_train images labeled as cat (y=1) or non-cat (y=0)\n - a test set of m_test images labeled as cat or non-cat\n - each image is of shape (num_px, num_px, 3) where 3 is for the 3 channels (RGB). Thus, each image is square (height = num_px) and (width = num_px).\n\nYou will build a simple image-recognition algorithm that can correctly classify pictures as cat or non-cat.\n\nLet's get more familiar with the dataset. Load the data by running the following code.", "_____no_output_____" ] ], [ [ "# Loading the data (cat/non-cat)\ntrain_set_x_orig, train_set_y, test_set_x_orig, test_set_y, classes = load_dataset()", "_____no_output_____" ] ], [ [ "We added \"_orig\" at the end of image datasets (train and test) because we are going to preprocess them. After preprocessing, we will end up with train_set_x and test_set_x (the labels train_set_y and test_set_y don't need any preprocessing).\n\nEach line of your train_set_x_orig and test_set_x_orig is an array representing an image. You can visualize an example by running the following code. Feel free also to change the `index` value and re-run to see other images. ", "_____no_output_____" ] ], [ [ "# Example of a picture\nindex = 25\nplt.imshow(train_set_x_orig[index])\nprint (\"y = \" + str(train_set_y[:, index]) + \", it's a '\" + classes[np.squeeze(train_set_y[:, index])].decode(\"utf-8\") + \"' picture.\")", "_____no_output_____" ] ], [ [ "Many software bugs in deep learning come from having matrix/vector dimensions that don't fit. If you can keep your matrix/vector dimensions straight you will go a long way toward eliminating many bugs. 
\n\n**Exercise:** Find the values for:\n    - m_train (number of training examples)\n    - m_test (number of test examples)\n    - num_px (= height = width of a training image)\nRemember that `train_set_x_orig` is a numpy-array of shape (m_train, num_px, num_px, 3). For instance, you can access `m_train` by writing `train_set_x_orig.shape[0]`.", "_____no_output_____" ] ], [ [ "### START CODE HERE ### (≈ 3 lines of code)\nm_train = None\nm_test = None\nnum_px = None\n### END CODE HERE ###\n\nprint (\"Number of training examples: m_train = \" + str(m_train))\nprint (\"Number of testing examples: m_test = \" + str(m_test))\nprint (\"Height/Width of each image: num_px = \" + str(num_px))\nprint (\"Each image is of size: (\" + str(num_px) + \", \" + str(num_px) + \", 3)\")\nprint (\"train_set_x shape: \" + str(train_set_x_orig.shape))\nprint (\"train_set_y shape: \" + str(train_set_y.shape))\nprint (\"test_set_x shape: \" + str(test_set_x_orig.shape))\nprint (\"test_set_y shape: \" + str(test_set_y.shape))", "_____no_output_____" ] ], [ [ "**Expected Output for m_train, m_test and num_px**: \n<table style=\"width:15%\">\n  <tr>\n    <td>**m_train**</td>\n    <td> 209 </td> \n  </tr>\n  \n  <tr>\n    <td>**m_test**</td>\n    <td> 50 </td> \n  </tr>\n  \n  <tr>\n    <td>**num_px**</td>\n    <td> 64 </td> \n  </tr>\n  \n</table>\n", "_____no_output_____" ], [ "For convenience, you should now reshape images of shape (num_px, num_px, 3) into a numpy-array of shape (num_px $*$ num_px $*$ 3, 1). After this, our training (and test) dataset is a numpy-array where each column represents a flattened image. There should be m_train (respectively m_test) columns.\n\n**Exercise:** Reshape the training and test data sets so that images of size (num_px, num_px, 3) are flattened into single vectors of shape (num\\_px $*$ num\\_px $*$ 3, 1).\n\nA trick when you want to flatten a matrix X of shape (a,b,c,d) to a matrix X_flatten of shape (b$*$c$*$d, a) is to use: \n```python\nX_flatten = X.reshape(X.shape[0], -1).T      # X.T is the transpose of X\n```", "_____no_output_____" ] ], [ [ "# Reshape the training and test examples\n\n### START CODE HERE ### (≈ 2 lines of code)\ntrain_set_x_flatten = None\ntest_set_x_flatten = None\n### END CODE HERE ###\n\nprint (\"train_set_x_flatten shape: \" + str(train_set_x_flatten.shape))\nprint (\"train_set_y shape: \" + str(train_set_y.shape))\nprint (\"test_set_x_flatten shape: \" + str(test_set_x_flatten.shape))\nprint (\"test_set_y shape: \" + str(test_set_y.shape))\nprint (\"sanity check after reshaping: \" + str(train_set_x_flatten[0:5,0]))", "_____no_output_____" ] ], [ [ "**Expected Output**: \n\n<table style=\"width:35%\">\n  <tr>\n    <td>**train_set_x_flatten shape**</td>\n    <td> (12288, 209)</td> \n  </tr>\n  <tr>\n    <td>**train_set_y shape**</td>\n    <td>(1, 209)</td> \n  </tr>\n  <tr>\n    <td>**test_set_x_flatten shape**</td>\n    <td>(12288, 50)</td> \n  </tr>\n  <tr>\n    <td>**test_set_y shape**</td>\n    <td>(1, 50)</td> \n  </tr>\n  <tr>\n  <td>**sanity check after reshaping**</td>\n  <td>[17 31 56 22 33]</td> \n  </tr>\n</table>", "_____no_output_____" ], [ "To represent color images, the red, green and blue channels (RGB) must be specified for each pixel, and so the pixel value is actually a vector of three numbers ranging from 0 to 255.\n\nOne common preprocessing step in machine learning is to center and standardize your dataset, meaning that you subtract the mean of the whole numpy array from each example, and then divide each example by the standard deviation of the whole numpy array. 
But for picture datasets, it is simpler and more convenient and works almost as well to just divide every row of the dataset by 255 (the maximum value of a pixel channel).\n\n<!-- During the training of your model, you're going to multiply weights and add biases to some initial inputs in order to observe neuron activations. Then you backpropogate with the gradients to train the model. But, it is extremely important for each feature to have a similar range such that our gradients don't explode. You will see that more in detail later in the lectures. !--> \n\nLet's standardize our dataset.", "_____no_output_____" ] ], [ [ "train_set_x = train_set_x_flatten/255.\ntest_set_x = test_set_x_flatten/255.", "_____no_output_____" ] ], [ [ "<font color='blue'>\n**What you need to remember:**\n\nCommon steps for pre-processing a new dataset are:\n- Figure out the dimensions and shapes of the problem (m_train, m_test, num_px, ...)\n- Reshape the datasets such that each example is now a vector of size (num_px \\* num_px \\* 3, 1)\n- \"Standardize\" the data", "_____no_output_____" ], [ "## 3 - General Architecture of the learning algorithm ##\n\nIt's time to design a simple algorithm to distinguish cat images from non-cat images.\n\nYou will build a Logistic Regression, using a Neural Network mindset. The following Figure explains why **Logistic Regression is actually a very simple Neural Network!**\n\n<img src=\"images/LogReg_kiank.png\" style=\"width:650px;height:400px;\">\n\n**Mathematical expression of the algorithm**:\n\nFor one example $x^{(i)}$:\n$$z^{(i)} = w^T x^{(i)} + b \\tag{1}$$\n$$\\hat{y}^{(i)} = a^{(i)} = sigmoid(z^{(i)})\\tag{2}$$ \n$$ \\mathcal{L}(a^{(i)}, y^{(i)}) = - y^{(i)} \\log(a^{(i)}) - (1-y^{(i)} ) \\log(1-a^{(i)})\\tag{3}$$\n\nThe cost is then computed by summing over all training examples:\n$$ J = \\frac{1}{m} \\sum_{i=1}^m \\mathcal{L}(a^{(i)}, y^{(i)})\\tag{6}$$\n\n**Key steps**:\nIn this exercise, you will carry out the following steps: \n - Initialize the parameters of the model\n - Learn the parameters for the model by minimizing the cost \n - Use the learned parameters to make predictions (on the test set)\n - Analyse the results and conclude", "_____no_output_____" ], [ "## 4 - Building the parts of our algorithm ## \n\nThe main steps for building a Neural Network are:\n1. Define the model structure (such as number of input features) \n2. Initialize the model's parameters\n3. Loop:\n - Calculate current loss (forward propagation)\n - Calculate current gradient (backward propagation)\n - Update parameters (gradient descent)\n\nYou often build 1-3 separately and integrate them into one function we call `model()`.\n\n### 4.1 - Helper functions\n\n**Exercise**: Using your code from \"Python Basics\", implement `sigmoid()`. As you've seen in the figure above, you need to compute $sigmoid( w^T x + b) = \\frac{1}{1 + e^{-(w^T x + b)}}$ to make predictions. 
Use np.exp().", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: sigmoid\n\ndef sigmoid(z):\n \"\"\"\n Compute the sigmoid of z\n\n Arguments:\n z -- A scalar or numpy array of any size.\n\n Return:\n s -- sigmoid(z)\n \"\"\"\n\n ### START CODE HERE ### (≈ 1 line of code)\n s = None\n ### END CODE HERE ###\n \n return s", "_____no_output_____" ], [ "print (\"sigmoid([0, 2]) = \" + str(sigmoid(np.array([0,2]))))", "_____no_output_____" ] ], [ [ "**Expected Output**: \n\n<table>\n <tr>\n <td>**sigmoid([0, 2])**</td>\n <td> [ 0.5 0.88079708]</td> \n </tr>\n</table>", "_____no_output_____" ], [ "### 4.2 - Initializing parameters\n\n**Exercise:** Implement parameter initialization in the cell below. You have to initialize w as a vector of zeros. If you don't know what numpy function to use, look up np.zeros() in the Numpy library's documentation.", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: initialize_with_zeros\n\ndef initialize_with_zeros(dim):\n \"\"\"\n This function creates a vector of zeros of shape (dim, 1) for w and initializes b to 0.\n \n Argument:\n dim -- size of the w vector we want (or number of parameters in this case)\n \n Returns:\n w -- initialized vector of shape (dim, 1)\n b -- initialized scalar (corresponds to the bias)\n \"\"\"\n \n ### START CODE HERE ### (≈ 1 line of code)\n w = None\n b = None\n ### END CODE HERE ###\n\n assert(w.shape == (dim, 1))\n assert(isinstance(b, float) or isinstance(b, int))\n \n return w, b", "_____no_output_____" ], [ "dim = 2\nw, b = initialize_with_zeros(dim)\nprint (\"w = \" + str(w))\nprint (\"b = \" + str(b))", "_____no_output_____" ] ], [ [ "**Expected Output**: \n\n\n<table style=\"width:15%\">\n <tr>\n <td> ** w ** </td>\n <td> [[ 0.]\n [ 0.]] </td>\n </tr>\n <tr>\n <td> ** b ** </td>\n <td> 0 </td>\n </tr>\n</table>\n\nFor image inputs, w will be of shape (num_px $\\times$ num_px $\\times$ 3, 1).", "_____no_output_____" ], [ "### 4.3 - Forward and Backward propagation\n\nNow that your parameters are initialized, you can do the \"forward\" and \"backward\" propagation steps for learning the parameters.\n\n**Exercise:** Implement a function `propagate()` that computes the cost function and its gradient.\n\n**Hints**:\n\nForward Propagation:\n- You get X\n- You compute $A = \\sigma(w^T X + b) = (a^{(0)}, a^{(1)}, ..., a^{(m-1)}, a^{(m)})$\n- You calculate the cost function: $J = -\\frac{1}{m}\\sum_{i=1}^{m}y^{(i)}\\log(a^{(i)})+(1-y^{(i)})\\log(1-a^{(i)})$\n\nHere are the two formulas you will be using: \n\n$$ \\frac{\\partial J}{\\partial w} = \\frac{1}{m}X(A-Y)^T\\tag{7}$$\n$$ \\frac{\\partial J}{\\partial b} = \\frac{1}{m} \\sum_{i=1}^m (a^{(i)}-y^{(i)})\\tag{8}$$", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: propagate\n\ndef propagate(w, b, X, Y):\n \"\"\"\n Implement the cost function and its gradient for the propagation explained above\n\n Arguments:\n w -- weights, a numpy array of size (num_px * num_px * 3, 1)\n b -- bias, a scalar\n X -- data of size (num_px * num_px * 3, number of examples)\n Y -- true \"label\" vector (containing 0 if non-cat, 1 if cat) of size (1, number of examples)\n\n Return:\n cost -- negative log-likelihood cost for logistic regression\n dw -- gradient of the loss with respect to w, thus same shape as w\n db -- gradient of the loss with respect to b, thus same shape as b\n \n Tips:\n - Write your code step by step for the propagation. 
np.log(), np.dot()\n \"\"\"\n \n m = X.shape[1]\n \n # FORWARD PROPAGATION (FROM X TO COST)\n ### START CODE HERE ### (≈ 2 lines of code)\n A = None # compute activation\n cost = None # compute cost\n ### END CODE HERE ###\n \n # BACKWARD PROPAGATION (TO FIND GRAD)\n ### START CODE HERE ### (≈ 2 lines of code)\n dw = None\n db = None\n ### END CODE HERE ###\n\n assert(dw.shape == w.shape)\n assert(db.dtype == float)\n cost = np.squeeze(cost)\n assert(cost.shape == ())\n \n grads = {\"dw\": dw,\n \"db\": db}\n \n return grads, cost", "_____no_output_____" ], [ "w, b, X, Y = np.array([[1],[2]]), 2, np.array([[1,2],[3,4]]), np.array([[1,0]])\ngrads, cost = propagate(w, b, X, Y)\nprint (\"dw = \" + str(grads[\"dw\"]))\nprint (\"db = \" + str(grads[\"db\"]))\nprint (\"cost = \" + str(cost))", "_____no_output_____" ] ], [ [ "**Expected Output**:\n\n<table style=\"width:50%\">\n <tr>\n <td> ** dw ** </td>\n <td> [[ 0.99993216]\n [ 1.99980262]]</td>\n </tr>\n <tr>\n <td> ** db ** </td>\n <td> 0.499935230625 </td>\n </tr>\n <tr>\n <td> ** cost ** </td>\n <td> 6.000064773192205</td>\n </tr>\n\n</table>", "_____no_output_____" ], [ "### d) Optimization\n- You have initialized your parameters.\n- You are also able to compute a cost function and its gradient.\n- Now, you want to update the parameters using gradient descent.\n\n**Exercise:** Write down the optimization function. The goal is to learn $w$ and $b$ by minimizing the cost function $J$. For a parameter $\\theta$, the update rule is $ \\theta = \\theta - \\alpha \\text{ } d\\theta$, where $\\alpha$ is the learning rate.", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: optimize\n\ndef optimize(w, b, X, Y, num_iterations, learning_rate, print_cost = False):\n \"\"\"\n This function optimizes w and b by running a gradient descent algorithm\n \n Arguments:\n w -- weights, a numpy array of size (num_px * num_px * 3, 1)\n b -- bias, a scalar\n X -- data of shape (num_px * num_px * 3, number of examples)\n Y -- true \"label\" vector (containing 0 if non-cat, 1 if cat), of shape (1, number of examples)\n num_iterations -- number of iterations of the optimization loop\n learning_rate -- learning rate of the gradient descent update rule\n print_cost -- True to print the loss every 100 steps\n \n Returns:\n params -- dictionary containing the weights w and bias b\n grads -- dictionary containing the gradients of the weights and bias with respect to the cost function\n costs -- list of all the costs computed during the optimization, this will be used to plot the learning curve.\n \n Tips:\n You basically need to write down two steps and iterate through them:\n 1) Calculate the cost and the gradient for the current parameters. 
Use propagate().\n    2) Update the parameters using gradient descent rule for w and b.\n    \"\"\"\n    \n    costs = []\n    \n    for i in range(num_iterations):\n        \n        \n        # Cost and gradient calculation (≈ 1-4 lines of code)\n        ### START CODE HERE ### \n        grads, cost = None\n        ### END CODE HERE ###\n        \n        # Retrieve derivatives from grads\n        dw = grads[\"dw\"]\n        db = grads[\"db\"]\n        \n        # update rule (≈ 2 lines of code)\n        ### START CODE HERE ###\n        w = None\n        b = None\n        ### END CODE HERE ###\n        \n        # Record the costs\n        if i % 100 == 0:\n            costs.append(cost)\n        \n        # Print the cost every 100 iterations\n        if print_cost and i % 100 == 0:\n            print (\"Cost after iteration %i: %f\" %(i, cost))\n    \n    params = {\"w\": w,\n              \"b\": b}\n    \n    grads = {\"dw\": dw,\n             \"db\": db}\n    \n    return params, grads, costs", "_____no_output_____" ], [ "params, grads, costs = optimize(w, b, X, Y, num_iterations= 100, learning_rate = 0.009, print_cost = False)\n\nprint (\"w = \" + str(params[\"w\"]))\nprint (\"b = \" + str(params[\"b\"]))\nprint (\"dw = \" + str(grads[\"dw\"]))\nprint (\"db = \" + str(grads[\"db\"]))", "_____no_output_____" ] ], [ [ "**Expected Output**: \n\n<table style=\"width:40%\">\n    <tr>\n       <td> **w** </td>\n       <td>[[ 0.1124579 ]\n [ 0.23106775]] </td>\n    </tr>\n    \n    <tr>\n       <td> **b** </td>\n       <td> 1.55930492484 </td>\n    </tr>\n    <tr>\n       <td> **dw** </td>\n       <td> [[ 0.90158428]\n [ 1.76250842]] </td>\n    </tr>\n    <tr>\n       <td> **db** </td>\n       <td> 0.430462071679 </td>\n    </tr>\n\n</table>", "_____no_output_____" ], [ "**Exercise:** The previous function will output the learned w and b. We are able to use w and b to predict the labels for a dataset X. Implement the `predict()` function. There are two steps to computing predictions:\n\n1. Calculate $\\hat{Y} = A = \\sigma(w^T X + b)$\n\n2. Convert the entries of A into 0 (if activation <= 0.5) or 1 (if activation > 0.5), and store the predictions in a vector `Y_prediction`. If you wish, you can use an `if`/`else` statement in a `for` loop (though there is also a way to vectorize this). ", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: predict\n\ndef predict(w, b, X):\n    '''\n    Predict whether the label is 0 or 1 using learned logistic regression parameters (w, b)\n    \n    Arguments:\n    w -- weights, a numpy array of size (num_px * num_px * 3, 1)\n    b -- bias, a scalar\n    X -- data of size (num_px * num_px * 3, number of examples)\n    \n    Returns:\n    Y_prediction -- a numpy array (vector) containing all predictions (0/1) for the examples in X\n    '''\n    \n    m = X.shape[1]\n    Y_prediction = np.zeros((1,m))\n    w = w.reshape(X.shape[0], 1)\n    \n    # Compute vector \"A\" predicting the probabilities of a cat being present in the picture\n    ### START CODE HERE ### (≈ 1 line of code)\n    A = None\n    ### END CODE HERE ###\n    \n    for i in range(A.shape[1]):\n        \n        # Convert probabilities A[0,i] to actual predictions p[0,i]\n        ### START CODE HERE ### (≈ 4 lines of code)\n        pass\n        ### END CODE HERE ###\n    \n    assert(Y_prediction.shape == (1, m))\n    \n    return Y_prediction", "_____no_output_____" ], [ "print (\"predictions = \" + str(predict(w, b, X)))", "_____no_output_____" ] ], [ [ "**Expected Output**: \n\n<table style=\"width:30%\">\n    <tr>\n         <td>\n             **predictions**\n         </td>\n          <td>\n            [[ 1. 
1.]]\n </td> \n </tr>\n\n</table>\n", "_____no_output_____" ], [ "<font color='blue'>\n**What to remember:**\nYou've implemented several functions that:\n- Initialize (w,b)\n- Optimize the loss iteratively to learn parameters (w,b):\n - computing the cost and its gradient \n - updating the parameters using gradient descent\n- Use the learned (w,b) to predict the labels for a given set of examples", "_____no_output_____" ], [ "## 5 - Merge all functions into a model ##\n\nYou will now see how the overall model is structured by putting together all the building blocks (functions implemented in the previous parts) together, in the right order.\n\n**Exercise:** Implement the model function. Use the following notation:\n - Y_prediction for your predictions on the test set\n - Y_prediction_train for your predictions on the train set\n - w, costs, grads for the outputs of optimize()", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: model\n\ndef model(X_train, Y_train, X_test, Y_test, num_iterations = 2000, learning_rate = 0.5, print_cost = False):\n \"\"\"\n Builds the logistic regression model by calling the function you've implemented previously\n \n Arguments:\n X_train -- training set represented by a numpy array of shape (num_px * num_px * 3, m_train)\n Y_train -- training labels represented by a numpy array (vector) of shape (1, m_train)\n X_test -- test set represented by a numpy array of shape (num_px * num_px * 3, m_test)\n Y_test -- test labels represented by a numpy array (vector) of shape (1, m_test)\n num_iterations -- hyperparameter representing the number of iterations to optimize the parameters\n learning_rate -- hyperparameter representing the learning rate used in the update rule of optimize()\n print_cost -- Set to true to print the cost every 100 iterations\n \n Returns:\n d -- dictionary containing information about the model.\n \"\"\"\n \n ### START CODE HERE ###\n \n # initialize parameters with zeros (≈ 1 line of code)\n w, b = None\n\n # Gradient descent (≈ 1 line of code)\n parameters, grads, costs = None\n \n # Retrieve parameters w and b from dictionary "parameters"\n w = parameters[\"w\"]\n b = parameters[\"b\"]\n \n # Predict test/train set examples (≈ 2 lines of code)\n Y_prediction_test = None\n Y_prediction_train = None\n\n ### END CODE HERE ###\n\n # Print train/test Errors\n print(\"train accuracy: {} %\".format(100 - np.mean(np.abs(Y_prediction_train - Y_train)) * 100))\n print(\"test accuracy: {} %\".format(100 - np.mean(np.abs(Y_prediction_test - Y_test)) * 100))\n\n \n d = {\"costs\": costs,\n \"Y_prediction_test\": Y_prediction_test, \n \"Y_prediction_train\" : Y_prediction_train, \n \"w\" : w, \n \"b\" : b,\n \"learning_rate\" : learning_rate,\n \"num_iterations\": num_iterations}\n \n return d", "_____no_output_____" ] ], [ [ "Run the following cell to train your model.", "_____no_output_____" ] ], [ [ "d = model(train_set_x, train_set_y, test_set_x, test_set_y, num_iterations = 2000, learning_rate = 0.005, print_cost = True)", "_____no_output_____" ] ], [ [ "**Expected Output**: \n\n<table style=\"width:40%\"> \n \n <tr>\n <td> **Train Accuracy** </td> \n <td> 99.04306220095694 % </td>\n </tr>\n\n <tr>\n <td>**Test Accuracy** </td> \n <td> 70.0 % </td>\n </tr>\n</table> \n\n\n", "_____no_output_____" ], [ "**Comment**: Training accuracy is close to 100%. This is a good sanity check: your model is working and has high enough capacity to fit the training data. Test accuracy is 70%. 
It is actually not bad for this simple model, given the small dataset we used and that logistic regression is a linear classifier. But no worries, you'll build an even better classifier next week!\n\nAlso, you see that the model is clearly overfitting the training data. Later in this specialization you will learn how to reduce overfitting, for example by using regularization. Using the code below (and changing the `index` variable) you can look at predictions on pictures of the test set.", "_____no_output_____" ] ], [ [ "# Example of a picture that was wrongly classified.\nindex = 1\nplt.imshow(test_set_x[:,index].reshape((num_px, num_px, 3)))\nprint (\"y = \" + str(test_set_y[0,index]) + \", you predicted that it is a \\\"\" + classes[d[\"Y_prediction_test\"][0,index]].decode(\"utf-8\") + \"\\\" picture.\")", "_____no_output_____" ] ], [ [ "Let's also plot the cost function and the gradients.", "_____no_output_____" ] ], [ [ "# Plot learning curve (with costs)\ncosts = np.squeeze(d['costs'])\nplt.plot(costs)\nplt.ylabel('cost')\nplt.xlabel('iterations (per hundreds)')\nplt.title(\"Learning rate =\" + str(d[\"learning_rate\"]))\nplt.show()", "_____no_output_____" ] ], [ [ "**Interpretation**:\nYou can see the cost decreasing. It shows that the parameters are being learned. However, you see that you could train the model even more on the training set. Try to increase the number of iterations in the cell above and rerun the cells. You might see that the training set accuracy goes up, but the test set accuracy goes down. This is called overfitting. ", "_____no_output_____" ], [ "## 6 - Further analysis (optional/ungraded exercise) ##\n\nCongratulations on building your first image classification model. Let's analyze it further, and examine possible choices for the learning rate $\\alpha$. ", "_____no_output_____" ], [ "#### Choice of learning rate ####\n\n**Reminder**:\nIn order for Gradient Descent to work you must choose the learning rate wisely. The learning rate $\\alpha$ determines how rapidly we update the parameters. If the learning rate is too large we may \"overshoot\" the optimal value. Similarly, if it is too small we will need too many iterations to converge to the best values. That's why it is crucial to use a well-tuned learning rate.\n\nLet's compare the learning curve of our model with several choices of learning rates. Run the cell below. This should take about 1 minute. Feel free also to try different values than the three we have initialized the `learning_rates` variable to contain, and see what happens. ", "_____no_output_____" ] ], [ [ "learning_rates = [0.01, 0.001, 0.0001]\nmodels = {}\nfor i in learning_rates:\n print (\"learning rate is: \" + str(i))\n models[str(i)] = model(train_set_x, train_set_y, test_set_x, test_set_y, num_iterations = 1500, learning_rate = i, print_cost = False)\n print ('\\n' + \"-------------------------------------------------------\" + '\\n')\n\nfor i in learning_rates:\n plt.plot(np.squeeze(models[str(i)][\"costs\"]), label= str(models[str(i)][\"learning_rate\"]))\n\nplt.ylabel('cost')\nplt.xlabel('iterations')\n\nlegend = plt.legend(loc='upper center', shadow=True)\nframe = legend.get_frame()\nframe.set_facecolor('0.90')\nplt.show()", "_____no_output_____" ] ], [ [ "**Interpretation**: \n- Different learning rates give different costs and thus different predictions results.\n- If the learning rate is too large (0.01), the cost may oscillate up and down. 
It may even diverge (though in this example, using 0.01 still eventually ends up at a good value for the cost). \n- A lower cost doesn't mean a better model. You have to check if there is possibly overfitting. It happens when the training accuracy is a lot higher than the test accuracy.\n- In deep learning, we usually recommend that you: \n - Choose the learning rate that better minimizes the cost function.\n - If your model overfits, use other techniques to reduce overfitting. (We'll talk about this in later videos.) \n", "_____no_output_____" ], [ "## 7 - Test with your own image (optional/ungraded exercise) ##\n\nCongratulations on finishing this assignment. You can use your own image and see the output of your model. To do that:\n 1. Click on \"File\" in the upper bar of this notebook, then click \"Open\" to go on your Coursera Hub.\n 2. Add your image to this Jupyter Notebook's directory, in the \"images\" folder\n 3. Change your image's name in the following code\n 4. Run the code and check if the algorithm is right (1 = cat, 0 = non-cat)!", "_____no_output_____" ] ], [ [ "## START CODE HERE ## (PUT YOUR IMAGE NAME) \nmy_image = \"my_image.jpg\" # change this to the name of your image file \n## END CODE HERE ##\n\n# We preprocess the image to fit your algorithm.\nfname = \"images/\" + my_image\nimage = np.array(ndimage.imread(fname, flatten=False))\nmy_image = scipy.misc.imresize(image, size=(num_px,num_px)).reshape((1, num_px*num_px*3)).T\nmy_predicted_image = predict(d[\"w\"], d[\"b\"], my_image)\n\nplt.imshow(image)\nprint(\"y = \" + str(np.squeeze(my_predicted_image)) + \", your algorithm predicts a \\\"\" + classes[int(np.squeeze(my_predicted_image)),].decode(\"utf-8\") + \"\\\" picture.\")", "_____no_output_____" ] ], [ [ "<font color='blue'>\n**What to remember from this assignment:**\n1. Preprocessing the dataset is important.\n2. You implemented each function separately: initialize(), propagate(), optimize(). Then you built a model().\n3. Tuning the learning rate (which is an example of a \"hyperparameter\") can make a big difference to the algorithm. You will see more examples of this later in this course!", "_____no_output_____" ], [ "Finally, if you'd like, we invite you to try different things on this Notebook. Make sure you submit before trying anything. Once you submit, things you can play with include:\n - Play with the learning rate and the number of iterations\n - Try different initialization methods and compare the results\n - Test other preprocessings (center the data, or divide each row by its standard deviation)", "_____no_output_____" ], [ "Bibliography:\n- http://www.wildml.com/2015/09/implementing-a-neural-network-from-scratch/\n- https://stats.stackexchange.com/questions/211436/why-do-we-normalize-images-by-subtracting-the-datasets-image-mean-and-not-the-c", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ] ]
d0275d0833323c254dff3115db03772ac385813d
7,708
ipynb
Jupyter Notebook
src/5 Scrap.ipynb
galbiati/mnk-cleaning-analysis
d8e8b13b7a2c6431e453430588fa85fd694b3373
[ "MIT" ]
null
null
null
src/5 Scrap.ipynb
galbiati/mnk-cleaning-analysis
d8e8b13b7a2c6431e453430588fa85fd694b3373
[ "MIT" ]
null
null
null
src/5 Scrap.ipynb
galbiati/mnk-cleaning-analysis
d8e8b13b7a2c6431e453430588fa85fd694b3373
[ "MIT" ]
null
null
null
40.356021
147
0.495978
[ [ [ "import pandas as pd\nimport numpy as np\nimport os\n\nos.chdir('/Users/gianni/Google Drive/Bas Zahy Gianni - Games/Data')", "_____no_output_____" ], [ "oc = [\n 'index', 'subject', 'color', 'gi', 'mi', \n 'status', 'bp', 'wp', 'response', 'rt',\n 'time', 'mouse_t', 'mouse_x'\n]\n\nfc = [\n 'subject', 'is_comp', 'color', 'status',\n 'bp', 'wp', 'response', 'rt', 'gi', 'mi',\n 'computer', 'human', 'time'\n]\n\nmc = ['subject', 'color', 'bp', 'wp', 'response', 'rt', 'condition']\n\nclass Data():\n \"\"\" Data is the primary object for holding experimental data. It also contains functions\n for the loading, cleaning, augmentation, and export of the data tables. \"\"\"\n\n def __init__(self, folder):\n self.data = self.load(folder)\n\n def load_file(self, folder, file_name, mouse=False):\n \"\"\" Initial preparation of data for individual files \"\"\"\n print(file_name[:-4])\n\n # load file, drop nuissance columns, remove non-observations\n drop_cols = ['index'] if mouse else ['index', 'mouse_t', 'mouse_x']\n data = pd.read_csv(folder + file_name, names=oc).drop(drop_cols, axis=1)\n drop_status = (data.status != 'dummy') & (data.status != 'ready') & (data.status != 'draw offer')\n data = data.loc[drop_status, :].copy().reset_index(drop=True)\n\n # assign unique subject label (from filename) and create separate cols for humans and computers\n sub_filter = data.rt > 0\n comp_filter = data.rt == 0\n first_move_filter = data.bp.map(lambda x: np.array(list(x)).astype(int).sum()==1) #(data.mi == 0) & (data.gi%2 == 0)\n second_move_filter = data.bp.map(lambda x: np.array(list(x)).astype(int).sum()==2) #(data.mi == 1) & (data.gi%2 == 0)\n condition_filter = (data.rt>0)&(data.status == 'playing')\n data.loc[condition_filter, 'condition'] = data.loc[condition_filter, 'subject'].map(lambda x: x[-1])\n data.loc[:, 'condition'] = data.loc[:, 'condition'].fillna(method='ffill')\n \n data.loc[data.rt > 0, 'subject'] = file_name[:-4]\n data.loc[:, 'human'] = file_name[:-4]\n data.loc[:, 'computer'] = np.nan\n data.loc[comp_filter, 'computer'] = data.loc[comp_filter, 'subject']\n data.loc[first_move_filter, 'computer'] = data.loc[second_move_filter, 'computer']\n data.loc[:, 'computer'] = data.loc[:, 'computer'].fillna(method='ffill')\n data.loc[0, 'computer'] = data.loc[1, 'computer']\n\n return data\n\n def load(self, folder):\n \"\"\" Calls other functions to corrale data and some support information \"\"\"\n self.exp_name = folder\n files = os.listdir(folder + '/Raw/')\n files = [f for f in files if f[-3:] == 'csv']\n# files =[f for f in files if f[:-4] != 'HH']\n self.subjects = [f[:-4] for f in files]\n self.subject_dict = dict(zip(self.subjects, np.arange(len(self.subjects))))\n data = pd.concat([self.load_file(folder + '/Raw/', f) for f in files])\n data = data.reset_index(drop=True)\n data = self.clean(data)\n\n return data\n\n def clean(self, df):\n \"\"\" Performs further cleaning that can be done on all data collectively \"\"\"\n\n # anonymize subjects\n sub_filter = df.rt > 0 # filter computers out\n df.loc[sub_filter, 'subject'] = df.loc[sub_filter, 'subject'].map(self.subject_dict)\n df.loc[:, 'human'] = df.loc[:, 'human'].map(self.subject_dict)\n\n # give computers identifiable names\n comp_filter = df.rt == 0\n df.loc[comp_filter, 'subject'] = df.loc[comp_filter, 'subject'].astype(int) + 1000\n df.loc[pd.notnull(df.computer), 'computer'] = df.loc[pd.notnull(df.computer), 'computer'].astype(int) + 1000\n\n # force remove response from board\n for i in df.loc[df.status != 'EVAL', 
:].index.values:\n if df.loc[i,\"color\"] == 0:\n l = list(df.loc[i,\"bp\"])\n l[df.loc[i, \"response\"]] = '0'\n df.loc[i,\"bp\"] = ''.join(l)\n else:\n l = list(df.loc[i,\"wp\"])\n l[df.loc[i,\"response\"]] = '0'\n df.loc[i,\"wp\"] = ''.join(l)\n\n # force correct colors\n count_pieces = lambda x: np.array([np.array(list(df.loc[i, x])).astype(int).sum() for i in df.index.values])\n df.loc[:, 'color'] = count_pieces('bp') - count_pieces('wp')\n df.loc[:, 'color'] = df.loc[:, 'color'].astype(int).astype(str)\n\n # add is_comp\n is_computer = lambda x: \"0\" if x > 0 else \"1\"\n df.loc[:, 'is_comp'] = df.loc[:, 'rt'].map(is_computer)\n\n # correct move index in games\n df.loc[df.status.isin(['playing', 'win', 'draw', 'timeout']), 'mi'] = df.loc[df.status.isin(['playing', 'win', 'draw']), 'mi'] - 1\n return df\n\n def export_individuals(self, folder):\n for s, i in self.subject_dict.items():\n c = self.data.human == i\n d = self.data.loc[c, :].reset_index(drop=True)\n d = d.reindex_axis(self.full_output_columns, axis=1)\n d.to_csv(folder + '/Clean/' + s + '.csv', index=False)\n\n return None\n\n def export(self, folder):\n f = folder + 'Clean/_summaries/'\n E = self.data.loc[self.data.status.isin(['playing', 'win', 'draw', 'timeout']), :]\n E.loc[:, fc].to_csv(f + 'all_fields.csv', index=False)\n E.loc[:, mc].to_csv(f + 'model_fields.csv', index=False)\n \n return None", "_____no_output_____" ], [ "D = Data('./5_tai')\n\nD.export('./5_tai/')", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code" ] ]
d02760f9d1f93fcbe9a0c3528be0a844e2812445
4,351
ipynb
Jupyter Notebook
S01 - Bootcamp and Binary Classification/SLU10 - Metrics for Regression/Example Notebook.ipynb
LDSSA/batch4-students
c0547ee0cf10645a0244336c976b304cff2f2000
[ "MIT" ]
19
2020-06-10T09:24:18.000Z
2022-01-25T15:19:29.000Z
S01 - Bootcamp and Binary Classification/SLU10 - Metrics for Regression/Example Notebook.ipynb
LDSSA/batch4-students
c0547ee0cf10645a0244336c976b304cff2f2000
[ "MIT" ]
25
2020-05-16T14:25:41.000Z
2022-03-12T00:41:55.000Z
S01 - Bootcamp and Binary Classification/SLU10 - Metrics for Regression/Example Notebook.ipynb
LDSSA/batch4-students
c0547ee0cf10645a0244336c976b304cff2f2000
[ "MIT" ]
9
2020-08-04T22:08:14.000Z
2021-12-16T17:24:30.000Z
21.121359
191
0.505401
[ [ [ "# SLU10 - Metrics for regression: Example Notebook\n\nIn this notebook [some regression validation metrics offered by scikit-learn](http://scikit-learn.org/stable/modules/model_evaluation.html#common-cases-predefined-values) are presented.", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nfrom sklearn.datasets import load_boston\nfrom sklearn.linear_model import LinearRegression\n\n# some scikit-learn regression validation metrics\nfrom sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score\n\nnp.random.seed(60)", "_____no_output_____" ] ], [ [ "# Load Data\nLoad the Boston house-prices dataset, fit a Linear Regression, and make prediction on the dataset (used to create the model).", "_____no_output_____" ] ], [ [ "data = load_boston()\n\nx = pd.DataFrame(data['data'], columns=data['feature_names'])\ny = pd.Series(data['target'])\n\nlr = LinearRegression()\nlr.fit(x, y)\n\ny_hat = lr.predict(x)", "_____no_output_____" ] ], [ [ "# Metrics with scikitlearn\n\nBelow follows a list of metrics made available by scikitlearn and its usage:", "_____no_output_____" ], [ "## Mean Squared Error\n\n$$MSE = \\frac{1}{N} \\sum_{n=1}^N (y_n - \\hat{y}_n)^2$$", "_____no_output_____" ] ], [ [ "mean_squared_error(y, y_hat)", "_____no_output_____" ] ], [ [ "## Root Mean Squared Error\n\n$$RMSE = \\sqrt{MSE}$$", "_____no_output_____" ] ], [ [ "np.sqrt(mean_squared_error(y, y_hat))", "_____no_output_____" ] ], [ [ "## Mean Absolute Error\n\n$$MAE = \\frac{1}{N} \\sum_{n=1}^N \\left| y_n - \\hat{y}_n \\right|$$", "_____no_output_____" ] ], [ [ "mean_absolute_error(y, y_hat)", "_____no_output_____" ] ], [ [ "## R² score\n\n$$\\bar{y} = \\frac{1}{N} \\sum_{n=1}^N y_n$$\n\n$$R² = 1 - \\frac{MSE(y, \\hat{y})}{MSE(y, \\bar{y})} \n= 1 - \\frac{\\frac{1}{N} \\sum_{n=1}^N (y_n - \\hat{y}_n)^2}{\\frac{1}{N} \\sum_{n=1}^N (y_n - \\bar{y})^2}\n= 1 - \\frac{\\sum_{n=1}^N (y_n - \\hat{y}_n)^2}{\\sum_{n=1}^N (y_n - \\bar{y})^2}$$", "_____no_output_____" ] ], [ [ "r2_score(y, y_hat)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d0276af4d8d9914f47cdd3e1f7ea0e8c3c423f3c
19,757
ipynb
Jupyter Notebook
book/_build/jupyter_execute/pandas/23-Kaggle Submission.ipynb
hossainlab/dsnotes
fee64e157f45724bba1f49ad1b186dcaaf1e6c02
[ "CC0-1.0" ]
null
null
null
book/_build/jupyter_execute/pandas/23-Kaggle Submission.ipynb
hossainlab/dsnotes
fee64e157f45724bba1f49ad1b186dcaaf1e6c02
[ "CC0-1.0" ]
null
null
null
book/_build/jupyter_execute/pandas/23-Kaggle Submission.ipynb
hossainlab/dsnotes
fee64e157f45724bba1f49ad1b186dcaaf1e6c02
[ "CC0-1.0" ]
null
null
null
29.011747
415
0.373943
[ [ [ "import pandas as pd ", "_____no_output_____" ], [ "train = pd.read_csv(\"http://bit.ly/kaggletrain\")", "_____no_output_____" ], [ "train.head() ", "_____no_output_____" ], [ "feature_cols = ['Pclass', 'Parch'] \nX = train.loc[:, feature_cols] ", "_____no_output_____" ], [ "X.shape", "_____no_output_____" ], [ "y = train.Survived", "_____no_output_____" ], [ "y.shape", "_____no_output_____" ], [ "from sklearn.linear_model import LogisticRegression\nlogreg = LogisticRegression() \nlogreg.fit(X, y)", "/home/jubayer/anaconda3/lib/python3.7/site-packages/sklearn/linear_model/logistic.py:432: FutureWarning: Default solver will be changed to 'lbfgs' in 0.22. Specify a solver to silence this warning.\n FutureWarning)\n" ], [ "test = pd.read_csv(\"http://bit.ly/kaggletest\")", "_____no_output_____" ], [ "test.head() ", "_____no_output_____" ], [ "X_new = test.loc[:, feature_cols] ", "_____no_output_____" ], [ "X_new.shape", "_____no_output_____" ], [ "new_pred_class = logreg.predict(X_new)", "_____no_output_____" ], [ "test.PassengerId", "_____no_output_____" ], [ "new_pred_class", "_____no_output_____" ], [ "pd.DataFrame({'PassengerID' : test.PassengerId, 'Survived': new_pred_class}).to_csv(\"sub.csv\", index=False)", "_____no_output_____" ], [ "subdf = pd.read_csv(\"sub.csv\")", "_____no_output_____" ], [ "subdf.head() ", "_____no_output_____" ] ], [ [ "<h3>About the Author</h3>\nThis repo was created by <a href=\"https://www.linkedin.com/in/jubayer28/\" target=\"_blank\">Jubayer Hossain</a> <br>\n<a href=\"https://www.linkedin.com/in/jubayer28/\" target=\"_blank\">Jubayer Hossain</a> is a student of Microbiology at Jagannath University and the founder of <a href=\"https://github.com/hdro\" target=\"_blank\">Health Data Research Organization</a>. He is also a team member of a bioinformatics research group known as Bio-Bio-1. \n\n<a rel=\"license\" href=\"http://creativecommons.org/licenses/by-nc-sa/4.0/\"><img alt=\"Creative Commons License\" style=\"border-width:0\" src=\"https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png\" /></a><br />This work is licensed under a <a rel=\"license\" href=\"http://creativecommons.org/licenses/by-nc-sa/4.0/\">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.m", "_____no_output_____" ] ] ]
[ "code", "markdown" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ] ]
d02774b2431cb71880ef4dc59f72e8946508413f
23,902
ipynb
Jupyter Notebook
Jupyter/stockOther.ipynb
minplemon/stockThird
e32c202c95ba19fe2db97f6e5dd175ac64ee1996
[ "MIT" ]
6
2020-03-10T14:54:22.000Z
2021-11-28T11:49:06.000Z
Jupyter/stockOther.ipynb
minplemon/stockThird
e32c202c95ba19fe2db97f6e5dd175ac64ee1996
[ "MIT" ]
null
null
null
Jupyter/stockOther.ipynb
minplemon/stockThird
e32c202c95ba19fe2db97f6e5dd175ac64ee1996
[ "MIT" ]
5
2019-06-25T09:49:53.000Z
2020-03-01T11:56:32.000Z
36.772308
122
0.354322
[ [ [ "from jqdatasdk import *\nauth('18620668927', 'minpeng123')", "_____no_output_____" ], [ "#记录由上市公司年报、中报、一季报、三季报统计出的分红转增情况\nq = query(finance.STK_XR_XD).filter(finance.STK_XR_XD.report_date >= '2019-01-01').limit(10)\nfinance.run_query(q)", "_____no_output_____" ], [ "#记录沪深两市股票交易的成交情况,包括市值、成交量,市盈率等情况\nq = query(finance.STK_EXCHANGE_TRADE_INFO).filter(finance.STK_EXCHANGE_TRADE_INFO.date >= '2019-07-09').limit(10)\nfinance.run_query(q)", "_____no_output_____" ], [ "#描述:记录上海交易所和深圳交易所的融资融券汇总数据\nfinance.run_query(query(finance.STK_MT_TOTAL).filter(finance.STK_MT_TOTAL.date == '2019-04-23').limit(10))", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code" ] ]
d027851fa16810ff380fd888efc70ca8a7fcf92e
10,110
ipynb
Jupyter Notebook
Diabetes Dataset/Experiment with Features/062_BloodPressure, SkinThickness, Insulin, BMI and DiabetesPedigreeFunction .ipynb
AnkitaxPriya/Diabetes-Prediction
2a68fc067019dde8eda31ebb91436746abc4e98e
[ "MIT" ]
null
null
null
Diabetes Dataset/Experiment with Features/062_BloodPressure, SkinThickness, Insulin, BMI and DiabetesPedigreeFunction .ipynb
AnkitaxPriya/Diabetes-Prediction
2a68fc067019dde8eda31ebb91436746abc4e98e
[ "MIT" ]
null
null
null
Diabetes Dataset/Experiment with Features/062_BloodPressure, SkinThickness, Insulin, BMI and DiabetesPedigreeFunction .ipynb
AnkitaxPriya/Diabetes-Prediction
2a68fc067019dde8eda31ebb91436746abc4e98e
[ "MIT" ]
null
null
null
27.69863
93
0.357864
[ [ [ "# Import the required libraries\nimport warnings\nwarnings.filterwarnings('ignore')\n\nimport pandas as pd\nimport numpy as np\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nimport joblib\n%matplotlib inline\n\nfrom sklearn.linear_model import LogisticRegression", "_____no_output_____" ], [ "# Read the data and display\n\ndiabetesDF = pd.read_csv('diabetes.csv')\ndiabetesDF.head()", "_____no_output_____" ], [ "diabetesDF.drop(['Pregnancies', 'Glucose', 'Age'], axis=1, inplace=True)", "_____no_output_____" ], [ "diabetesDF.head()", "_____no_output_____" ], [ "# Total 768 patients record\n# Using 650 data for training\n# Using 100 data for testing\n# Using 18 data for validation\n\ndfTrain = diabetesDF[:650]\ndfTest = diabetesDF[650:750]\ndfCheck = diabetesDF[750:]", "_____no_output_____" ], [ "# Separating label and features and converting to numpy array to feed into our model\ntrainLabel = np.asarray(dfTrain['Outcome'])\ntrainData = np.asarray(dfTrain.drop('Outcome',1))\ntestLabel = np.asarray(dfTest['Outcome'])\ntestData = np.asarray(dfTest.drop('Outcome',1))", "_____no_output_____" ], [ "# Normalize the data \nmeans = np.mean(trainData, axis=0)\nstds = np.std(trainData, axis=0)\n\ntrainData = (trainData - means)/stds\ntestData = (testData - means)/stds", "_____no_output_____" ], [ "# models target t as sigmoid(w0 + w1*x1 + w2*x2 + ... + wd*xd)\ndiabetesCheck = LogisticRegression()\ndiabetesCheck.fit(trainData,trainLabel)\naccuracy = diabetesCheck.score(testData,testLabel)\nprint(\"accuracy = \",accuracy * 100,\"%\")", "accuracy = 67.0 %\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0278553588dd7f8314c321803f36496e9b6f6fd
2,300
ipynb
Jupyter Notebook
notebooks/meanshift.ipynb
JLCaraveo/sklearn-projects-Platzi
d2556dd90479a9057bd78face993fefd8ad47a5f
[ "MIT" ]
null
null
null
notebooks/meanshift.ipynb
JLCaraveo/sklearn-projects-Platzi
d2556dd90479a9057bd78face993fefd8ad47a5f
[ "MIT" ]
null
null
null
notebooks/meanshift.ipynb
JLCaraveo/sklearn-projects-Platzi
d2556dd90479a9057bd78face993fefd8ad47a5f
[ "MIT" ]
null
null
null
28.04878
85
0.574783
[ [ [ "import pandas as pd\n\nfrom sklearn.cluster import MeanShift", "_____no_output_____" ], [ "df_candies = pd.read_csv('../data/raw/candy.csv')\n\nx = df_candies.drop('competitorname', axis=1)\n\nmeanshift = MeanShift().fit(x)\nprint(meanshift.labels_)\nprint('_'*64)\nprint(meanshift.cluster_centers_)", "[2 2 0 0 1 1 1 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 1 2 1 2 0 1 2 0 0 1 2 2 0 1 2\n 2 2 1 1 1 2 2 0 0 0 2 0 0 0 2 2 2 2 0 2 0 0 0 2 1 0 0 2 2 2 1 2 0 0 0 0 1\n 1 0 0 1 1 2 0 0 0 0 1]\n________________________________________________________________\n[[2.25000000e-01 5.75000000e-01 1.00000000e-01 2.50000000e-02\n 5.00000000e-02 2.50000000e-02 3.00000000e-01 1.00000000e-01\n 5.50000000e-01 4.57599993e-01 3.67824996e-01 4.10442122e+01]\n [4.68750000e-01 5.00000000e-01 1.25000000e-01 1.56250000e-01\n 9.37500000e-02 6.25000000e-02 1.25000000e-01 3.12500000e-01\n 5.31250000e-01 4.57281243e-01 4.67874998e-01 5.21138597e+01]\n [8.26086957e-01 1.73913043e-01 3.04347826e-01 3.04347826e-01\n 1.73913043e-01 1.73913043e-01 0.00000000e+00 5.21739130e-01\n 4.34782609e-01 5.81391293e-01 6.38086963e-01 6.47120799e+01]]\n" ] ] ]
[ "code" ]
[ [ "code", "code" ] ]
d02796466cba49831e90efd9b05214df8021d6b1
515,187
ipynb
Jupyter Notebook
Neural Networks and Deep Learning/Week3/Planar_data_classification_with_one_hidden_layer.ipynb
sounok1234/Deeplearning_Projects
707cc101de6ba14c06186a829aed7ae54b21dab4
[ "MIT" ]
null
null
null
Neural Networks and Deep Learning/Week3/Planar_data_classification_with_one_hidden_layer.ipynb
sounok1234/Deeplearning_Projects
707cc101de6ba14c06186a829aed7ae54b21dab4
[ "MIT" ]
null
null
null
Neural Networks and Deep Learning/Week3/Planar_data_classification_with_one_hidden_layer.ipynb
sounok1234/Deeplearning_Projects
707cc101de6ba14c06186a829aed7ae54b21dab4
[ "MIT" ]
null
null
null
279.385575
322,660
0.915365
[ [ [ "# Planar data classification with one hidden layer\n\nWelcome to your week 3 programming assignment! It's time to build your first neural network, which will have one hidden layer. Now, you'll notice a big difference between this model and the one you implemented previously using logistic regression.\n\nBy the end of this assignment, you'll be able to:\n\n- Implement a 2-class classification neural network with a single hidden layer\n- Use units with a non-linear activation function, such as tanh\n- Compute the cross entropy loss\n- Implement forward and backward propagation\n", "_____no_output_____" ], [ "## Table of Contents\n- [1 - Packages](#1)\n- [2 - Load the Dataset](#2)\n - [Exercise 1](#ex-1)\n- [3 - Simple Logistic Regression](#3)\n- [4 - Neural Network model](#4)\n - [4.1 - Defining the neural network structure](#4-1)\n - [Exercise 2 - layer_sizes](#ex-2)\n - [4.2 - Initialize the model's parameters](#4-2)\n - [Exercise 3 - initialize_parameters](#ex-3)\n - [4.3 - The Loop](#4-3)\n - [Exercise 4 - forward_propagation](#ex-4)\n - [4.4 - Compute the Cost](#4-4)\n - [Exercise 5 - compute_cost](#ex-5)\n - [4.5 - Implement Backpropagation](#4-5)\n - [Exercise 6 - backward_propagation](#ex-6)\n - [4.6 - Update Parameters](#4-6)\n - [Exercise 7 - update_parameters](#ex-7)\n - [4.7 - Integration](#4-7)\n - [Exercise 8 - nn_model](#ex-8)\n- [5 - Test the Model](#5)\n - [5.1 - Predict](#5-1)\n - [Exercise 9 - predict](#ex-9)\n - [5.2 - Test the Model on the Planar Dataset](#5-2)\n- [6 - Tuning hidden layer size (optional/ungraded exercise)](#6)\n- [7- Performance on other datasets](#7)", "_____no_output_____" ], [ "<a name='1'></a>\n# 1 - Packages\n\nFirst import all the packages that you will need during this assignment.\n\n- [numpy](https://www.numpy.org/) is the fundamental package for scientific computing with Python.\n- [sklearn](http://scikit-learn.org/stable/) provides simple and efficient tools for data mining and data analysis. \n- [matplotlib](http://matplotlib.org) is a library for plotting graphs in Python.\n- testCases provides some test examples to assess the correctness of your functions\n- planar_utils provide various useful functions used in this assignment", "_____no_output_____" ] ], [ [ "# Package imports\nimport numpy as np\nimport copy\nimport matplotlib.pyplot as plt\nfrom testCases_v2 import *\nfrom public_tests import *\nimport sklearn\nimport sklearn.datasets\nimport sklearn.linear_model\nfrom planar_utils import plot_decision_boundary, sigmoid, load_planar_dataset, load_extra_datasets\n\n%matplotlib inline\n\nnp.random.seed(2) # set a seed so that the results are consistent\n\n%load_ext autoreload\n%autoreload 2", "_____no_output_____" ] ], [ [ "<a name='2'></a>\n# 2 - Load the Dataset \n\nNow, load the dataset you'll be working on. The following code will load a \"flower\" 2-class dataset into variables X and Y.", "_____no_output_____" ] ], [ [ "X, Y = load_planar_dataset()", "_____no_output_____" ] ], [ [ "Visualize the dataset using matplotlib. The data looks like a \"flower\" with some red (label y=0) and some blue (y=1) points. Your goal is to build a model to fit this data. 
In other words, we want the classifier to define regions as either red or blue.", "_____no_output_____" ] ], [ [ "# Visualize the data:\nplt.scatter(X[0, :], X[1, :], c=Y, s=40, cmap=plt.cm.Spectral);", "_____no_output_____" ] ], [ [ "You have:\n - a numpy-array (matrix) X that contains your features (x1, x2)\n - a numpy-array (vector) Y that contains your labels (red:0, blue:1).\n\nFirst, get a better sense of what your data is like. \n\n<a name='ex-1'></a>\n### Exercise 1 \n\nHow many training examples do you have? In addition, what is the `shape` of the variables `X` and `Y`? \n\n**Hint**: How do you get the shape of a numpy array? [(help)](https://docs.scipy.org/doc/numpy/reference/generated/numpy.ndarray.shape.html)", "_____no_output_____" ] ], [ [ "# (≈ 3 lines of code)\n# shape_X = ...\n# shape_Y = ...\n# training set size\n# m = ...\n# YOUR CODE STARTS HERE\nshape_X = X.shape\nshape_Y = Y.shape\nm = Y.shape[1]\n# YOUR CODE ENDS HERE\n\nprint ('The shape of X is: ' + str(shape_X))\nprint ('The shape of Y is: ' + str(shape_Y))\nprint ('I have m = %d training examples!' % (m))\nprint(X.shape[0])", "The shape of X is: (2, 400)\nThe shape of Y is: (1, 400)\nI have m = 400 training examples!\n2\n" ] ], [ [ "**Expected Output**:\n \n<table style=\"width:20%\">\n <tr>\n <td> shape of X </td>\n <td> (2, 400) </td> \n </tr>\n <tr>\n <td>shape of Y</td>\n <td>(1, 400) </td> \n </tr>\n <tr>\n <td>m</td>\n <td> 400 </td> \n </tr>\n</table>", "_____no_output_____" ], [ "<a name='3'></a>\n## 3 - Simple Logistic Regression\n\nBefore building a full neural network, let's check how logistic regression performs on this problem. You can use sklearn's built-in functions for this. Run the code below to train a logistic regression classifier on the dataset.", "_____no_output_____" ] ], [ [ "# Train the logistic regression classifier\nclf = sklearn.linear_model.LogisticRegressionCV();\nclf.fit(X.T, Y.T);", "_____no_output_____" ] ], [ [ "You can now plot the decision boundary of these models! Run the code below.", "_____no_output_____" ] ], [ [ "# Plot the decision boundary for logistic regression\nplot_decision_boundary(lambda x: clf.predict(x), X, Y)\nplt.title(\"Logistic Regression\")\nprint(X.shape)\n# Print accuracy\nLR_predictions = clf.predict(X.T)\nprint ('Accuracy of logistic regression: %d ' % float((np.dot(Y,LR_predictions) + np.dot(1-Y,1-LR_predictions))/float(Y.size)*100) +\n '% ' + \"(percentage of correctly labelled datapoints)\")", "(2, 400)\nAccuracy of logistic regression: 47 % (percentage of correctly labelled datapoints)\n" ] ], [ [ "**Expected Output**:\n\n<table style=\"width:20%\">\n <tr>\n <td>Accuracy</td>\n <td> 47% </td> \n </tr>\n \n</table>\n", "_____no_output_____" ], [ "**Interpretation**: The dataset is not linearly separable, so logistic regression doesn't perform well. Hopefully a neural network will do better. Let's try this now! ", "_____no_output_____" ], [ "<a name='4'></a>\n## 4 - Neural Network model\n\nLogistic regression didn't work well on the flower dataset. 
Next, you're going to train a Neural Network with a single hidden layer and see how that handles the same problem.\n\n**The model**:\n<img src=\"images/classification_kiank.png\" style=\"width:600px;height:300px;\">\n\n**Mathematically**:\n\nFor one example $x^{(i)}$:\n$$z^{[1] (i)} = W^{[1]} x^{(i)} + b^{[1]}\\tag{1}$$ \n$$a^{[1] (i)} = \\tanh(z^{[1] (i)})\\tag{2}$$\n$$z^{[2] (i)} = W^{[2]} a^{[1] (i)} + b^{[2]}\\tag{3}$$\n$$\\hat{y}^{(i)} = a^{[2] (i)} = \\sigma(z^{ [2] (i)})\\tag{4}$$\n$$y^{(i)}_{prediction} = \\begin{cases} 1 & \\mbox{if } a^{[2](i)} > 0.5 \\\\ 0 & \\mbox{otherwise } \\end{cases}\\tag{5}$$\n\nGiven the predictions on all the examples, you can also compute the cost $J$ as follows: \n$$J = - \\frac{1}{m} \\sum\\limits_{i = 0}^{m} \\large\\left(\\small y^{(i)}\\log\\left(a^{[2] (i)}\\right) + (1-y^{(i)})\\log\\left(1- a^{[2] (i)}\\right) \\large \\right) \\small \\tag{6}$$\n\n**Reminder**: The general methodology to build a Neural Network is to:\n 1. Define the neural network structure ( # of input units, # of hidden units, etc). \n 2. Initialize the model's parameters\n 3. Loop:\n - Implement forward propagation\n - Compute loss\n - Implement backward propagation to get the gradients\n - Update parameters (gradient descent)\n\nIn practice, you'll often build helper functions to compute steps 1-3, then merge them into one function called `nn_model()`. Once you've built `nn_model()` and learned the right parameters, you can make predictions on new data.", "_____no_output_____" ], [ "<a name='4-1'></a>\n### 4.1 - Defining the neural network structure ####\n\n<a name='ex-2'></a>\n### Exercise 2 - layer_sizes \n\nDefine three variables:\n - n_x: the size of the input layer\n - n_h: the size of the hidden layer (set this to 4) \n - n_y: the size of the output layer\n\n**Hint**: Use shapes of X and Y to find n_x and n_y. Also, hard code the hidden layer size to be 4.", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: layer_sizes\n\ndef layer_sizes(X, Y):\n \"\"\"\n Arguments:\n X -- input dataset of shape (input size, number of examples)\n Y -- labels of shape (output size, number of examples)\n \n Returns:\n n_x -- the size of the input layer\n n_h -- the size of the hidden layer\n n_y -- the size of the output layer\n \"\"\"\n #(≈ 3 lines of code)\n # n_x = ... \n # n_h = ...\n # n_y = ... \n # YOUR CODE STARTS HERE\n n_x = X.shape[0]\n n_h = 4\n n_y = Y.shape[0]\n \n print(Y.shape)\n # YOUR CODE ENDS HERE\n return (n_x, n_h, n_y)", "_____no_output_____" ], [ "t_X, t_Y = layer_sizes_test_case()\n(n_x, n_h, n_y) = layer_sizes(t_X, t_Y)\nprint(\"The size of the input layer is: n_x = \" + str(n_x))\nprint(\"The size of the hidden layer is: n_h = \" + str(n_h))\nprint(\"The size of the output layer is: n_y = \" + str(n_y))\n\nlayer_sizes_test(layer_sizes)", "(2, 3)\nThe size of the input layer is: n_x = 5\nThe size of the hidden layer is: n_h = 4\nThe size of the output layer is: n_y = 2\n(2, 3)\n(2, 3)\n\u001b[92m All tests passed.\n" ] ], [ [ "***Expected output***\n```\nThe size of the input layer is: n_x = 5\nThe size of the hidden layer is: n_h = 4\nThe size of the output layer is: n_y = 2\n```", "_____no_output_____" ], [ "<a name='4-2'></a>\n### 4.2 - Initialize the model's parameters ####\n\n<a name='ex-3'></a>\n### Exercise 3 - initialize_parameters\n\nImplement the function `initialize_parameters()`.\n\n**Instructions**:\n- Make sure your parameters' sizes are right. 
Refer to the neural network figure above if needed.\n- You will initialize the weights matrices with random values. \n - Use: `np.random.randn(a,b) * 0.01` to randomly initialize a matrix of shape (a,b).\n- You will initialize the bias vectors as zeros. \n - Use: `np.zeros((a,b))` to initialize a matrix of shape (a,b) with zeros.", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: initialize_parameters\n\ndef initialize_parameters(n_x, n_h, n_y):\n \"\"\"\n Argument:\n n_x -- size of the input layer\n n_h -- size of the hidden layer\n n_y -- size of the output layer\n \n Returns:\n params -- python dictionary containing your parameters:\n W1 -- weight matrix of shape (n_h, n_x)\n b1 -- bias vector of shape (n_h, 1)\n W2 -- weight matrix of shape (n_y, n_h)\n b2 -- bias vector of shape (n_y, 1)\n \"\"\"\n \n np.random.seed(2) # we set up a seed so that your output matches ours although the initialization is random.\n \n #(≈ 4 lines of code)\n # W1 = ...\n # b1 = ...\n # W2 = ...\n # b2 = ...\n # YOUR CODE STARTS HERE\n W1 = np.random.randn(n_h, n_x) * 0.01\n b1 = np.zeros((n_h, 1))\n W2 = np.random.randn(n_y, n_h) * 0.01\n b2 = np.zeros((n_y, 1))\n \n # YOUR CODE ENDS HERE\n\n parameters = {\"W1\": W1,\n \"b1\": b1,\n \"W2\": W2,\n \"b2\": b2}\n \n return parameters", "_____no_output_____" ], [ "n_x, n_h, n_y = initialize_parameters_test_case()\nparameters = initialize_parameters(n_x, n_h, n_y)\n\nprint(\"W1 = \" + str(parameters[\"W1\"]))\nprint(\"b1 = \" + str(parameters[\"b1\"]))\nprint(\"W2 = \" + str(parameters[\"W2\"]))\nprint(\"b2 = \" + str(parameters[\"b2\"]))\n\ninitialize_parameters_test(initialize_parameters)", "W1 = [[-0.00416758 -0.00056267]\n [-0.02136196 0.01640271]\n [-0.01793436 -0.00841747]\n [ 0.00502881 -0.01245288]]\nb1 = [[0.]\n [0.]\n [0.]\n [0.]]\nW2 = [[-0.01057952 -0.00909008 0.00551454 0.02292208]]\nb2 = [[0.]]\n\u001b[92m All tests passed.\n" ] ], [ [ "**Expected output**\n```\nW1 = [[-0.00416758 -0.00056267]\n [-0.02136196 0.01640271]\n [-0.01793436 -0.00841747]\n [ 0.00502881 -0.01245288]]\nb1 = [[0.]\n [0.]\n [0.]\n [0.]]\nW2 = [[-0.01057952 -0.00909008 0.00551454 0.02292208]]\nb2 = [[0.]]\n```", "_____no_output_____" ], [ "<a name='4-3'></a>\n### 4.3 - The Loop \n\n<a name='ex-4'></a>\n### Exercise 4 - forward_propagation\n\nImplement `forward_propagation()` using the following equations:\n\n$$Z^{[1]} = W^{[1]} X + b^{[1]}\\tag{1}$$ \n$$A^{[1]} = \\tanh(Z^{[1]})\\tag{2}$$\n$$Z^{[2]} = W^{[2]} A^{[1]} + b^{[2]}\\tag{3}$$\n$$\\hat{Y} = A^{[2]} = \\sigma(Z^{[2]})\\tag{4}$$\n\n\n**Instructions**:\n\n- Check the mathematical representation of your classifier in the figure above.\n- Use the function `sigmoid()`. It's built into (imported) this notebook.\n- Use the function `np.tanh()`. It's part of the numpy library.\n- Implement using these steps:\n 1. Retrieve each parameter from the dictionary \"parameters\" (which is the output of `initialize_parameters()` by using `parameters[\"..\"]`.\n 2. Implement Forward Propagation. Compute $Z^{[1]}, A^{[1]}, Z^{[2]}$ and $A^{[2]}$ (the vector of all your predictions on all the examples in the training set).\n- Values needed in the backpropagation are stored in \"cache\". 
The cache will be given as an input to the backpropagation function.", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION:forward_propagation\n\ndef forward_propagation(X, parameters):\n \"\"\"\n Argument:\n X -- input data of size (n_x, m)\n parameters -- python dictionary containing your parameters (output of initialization function)\n \n Returns:\n A2 -- The sigmoid output of the second activation\n cache -- a dictionary containing \"Z1\", \"A1\", \"Z2\" and \"A2\"\n \"\"\"\n # Retrieve each parameter from the dictionary \"parameters\"\n #(≈ 4 lines of code)\n # W1 = ...\n # b1 = ...\n # W2 = ...\n # b2 = ...\n # YOUR CODE STARTS HERE\n W1 = parameters['W1']\n b1 = parameters['b1']\n W2 = parameters['W2']\n b2 = parameters['b2']\n \n # YOUR CODE ENDS HERE\n \n # Implement Forward Propagation to calculate A2 (probabilities)\n # (≈ 4 lines of code)\n # Z1 = ...\n # A1 = ...\n # Z2 = ...\n # A2 = ...\n # YOUR CODE STARTS HERE\n Z1 = np.dot(W1,X) + b1\n A1 = np.tanh(Z1)\n Z2 = np.dot(W2,A1) + b2\n A2 = sigmoid(Z2)\n \n # YOUR CODE ENDS HERE\n \n assert(A2.shape == (1, X.shape[1]))\n \n cache = {\"Z1\": Z1,\n \"A1\": A1,\n \"Z2\": Z2,\n \"A2\": A2}\n \n return A2, cache", "_____no_output_____" ], [ "t_X, parameters = forward_propagation_test_case()\nA2, cache = forward_propagation(t_X, parameters)\nprint(\"A2 = \" + str(A2))\n\nforward_propagation_test(forward_propagation)", "A2 = [[0.21292656 0.21274673 0.21295976]]\n\u001b[92m All tests passed.\n" ] ], [ [ "***Expected output***\n```\nA2 = [[0.21292656 0.21274673 0.21295976]]\n```", "_____no_output_____" ], [ "<a name='4-4'></a>\n### 4.4 - Compute the Cost\n\nNow that you've computed $A^{[2]}$ (in the Python variable \"`A2`\"), which contains $a^{[2](i)}$ for all examples, you can compute the cost function as follows:\n\n$$J = - \\frac{1}{m} \\sum\\limits_{i = 1}^{m} \\large{(} \\small y^{(i)}\\log\\left(a^{[2] (i)}\\right) + (1-y^{(i)})\\log\\left(1- a^{[2] (i)}\\right) \\large{)} \\small\\tag{13}$$\n\n<a name='ex-5'></a>\n### Exercise 5 - compute_cost \n\nImplement `compute_cost()` to compute the value of the cost $J$.\n\n**Instructions**:\n- There are many ways to implement the cross-entropy loss. This is one way to implement one part of the equation without for loops:\n$- \\sum\\limits_{i=1}^{m} y^{(i)}\\log(a^{[2](i)})$:\n```python\nlogprobs = np.multiply(np.log(A2),Y)\ncost = - np.sum(logprobs) \n```\n\n- Use that to build the whole expression of the cost function.\n\n**Notes**: \n\n- You can use either `np.multiply()` and then `np.sum()` or directly `np.dot()`). \n- If you use `np.multiply` followed by `np.sum` the end result will be a type `float`, whereas if you use `np.dot`, the result will be a 2D numpy array. \n- You can use `np.squeeze()` to remove redundant dimensions (in the case of single float, this will be reduced to a zero-dimension array). 
\n- You can also cast the array as a type `float` using `float()`.", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: compute_cost\n\ndef compute_cost(A2, Y):\n \"\"\"\n Computes the cross-entropy cost given in equation (13)\n \n Arguments:\n A2 -- The sigmoid output of the second activation, of shape (1, number of examples)\n Y -- \"true\" labels vector of shape (1, number of examples)\n\n Returns:\n cost -- cross-entropy cost given equation (13)\n \n \"\"\"\n \n m = Y.shape[1] # number of examples\n\n # Compute the cross-entropy cost\n # (≈ 2 lines of code)\n # logprobs = ...\n # cost = ...\n # YOUR CODE STARTS HERE\n cost = (-1/m)*(np.dot(Y, np.log(A2).T) + np.dot(1-Y, np.log(1-A2).T))\n \n # YOUR CODE ENDS HERE\n \n cost = float(np.squeeze(cost)) # makes sure cost is the dimension we expect. \n # E.g., turns [[17]] into 17 \n \n return cost", "_____no_output_____" ], [ "A2, t_Y = compute_cost_test_case()\ncost = compute_cost(A2, t_Y)\nprint(\"cost = \" + str(compute_cost(A2, t_Y)))\n\ncompute_cost_test(compute_cost)", "cost = 0.6930587610394646\n\u001b[92m All tests passed.\n" ] ], [ [ "***Expected output***\n\n`cost = 0.6930587610394646`\n", "_____no_output_____" ], [ "<a name='4-5'></a>\n### 4.5 - Implement Backpropagation\n\nUsing the cache computed during forward propagation, you can now implement backward propagation.\n\n<a name='ex-6'></a>\n### Exercise 6 - backward_propagation\n\nImplement the function `backward_propagation()`.\n\n**Instructions**:\nBackpropagation is usually the hardest (most mathematical) part in deep learning. To help you, here again is the slide from the lecture on backpropagation. You'll want to use the six equations on the right of this slide, since you are building a vectorized implementation. \n\n<img src=\"images/grad_summary.png\" style=\"width:600px;height:300px;\">\n<caption><center><font color='purple'><b>Figure 1</b>: Backpropagation. Use the six equations on the right.</font></center></caption>\n\n<!--\n$\\frac{\\partial \\mathcal{J} }{ \\partial z_{2}^{(i)} } = \\frac{1}{m} (a^{[2](i)} - y^{(i)})$\n\n$\\frac{\\partial \\mathcal{J} }{ \\partial W_2 } = \\frac{\\partial \\mathcal{J} }{ \\partial z_{2}^{(i)} } a^{[1] (i) T} $\n\n$\\frac{\\partial \\mathcal{J} }{ \\partial b_2 } = \\sum_i{\\frac{\\partial \\mathcal{J} }{ \\partial z_{2}^{(i)}}}$\n\n$\\frac{\\partial \\mathcal{J} }{ \\partial z_{1}^{(i)} } = W_2^T \\frac{\\partial \\mathcal{J} }{ \\partial z_{2}^{(i)} } * ( 1 - a^{[1] (i) 2}) $\n\n$\\frac{\\partial \\mathcal{J} }{ \\partial W_1 } = \\frac{\\partial \\mathcal{J} }{ \\partial z_{1}^{(i)} } X^T $\n\n$\\frac{\\partial \\mathcal{J} _i }{ \\partial b_1 } = \\sum_i{\\frac{\\partial \\mathcal{J} }{ \\partial z_{1}^{(i)}}}$\n\n- Note that $*$ denotes elementwise multiplication.\n- The notation you will use is common in deep learning coding:\n - dW1 = $\\frac{\\partial \\mathcal{J} }{ \\partial W_1 }$\n - db1 = $\\frac{\\partial \\mathcal{J} }{ \\partial b_1 }$\n - dW2 = $\\frac{\\partial \\mathcal{J} }{ \\partial W_2 }$\n - db2 = $\\frac{\\partial \\mathcal{J} }{ \\partial b_2 }$\n \n!-->\n\n- Tips:\n - To compute dZ1 you'll need to compute $g^{[1]'}(Z^{[1]})$. Since $g^{[1]}(.)$ is the tanh activation function, if $a = g^{[1]}(z)$ then $g^{[1]'}(z) = 1-a^2$. 
So you can compute \n $g^{[1]'}(Z^{[1]})$ using `(1 - np.power(A1, 2))`.", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: backward_propagation\n\ndef backward_propagation(parameters, cache, X, Y):\n \"\"\"\n Implement the backward propagation using the instructions above.\n \n Arguments:\n parameters -- python dictionary containing our parameters \n cache -- a dictionary containing \"Z1\", \"A1\", \"Z2\" and \"A2\".\n X -- input data of shape (2, number of examples)\n Y -- \"true\" labels vector of shape (1, number of examples)\n \n Returns:\n grads -- python dictionary containing your gradients with respect to different parameters\n \"\"\"\n m = X.shape[1]\n \n # First, retrieve W1 and W2 from the dictionary \"parameters\".\n #(≈ 2 lines of code)\n # W1 = ...\n # W2 = ...\n # YOUR CODE STARTS HERE\n W1 = parameters['W1']\n W2 = parameters['W2']\n \n # YOUR CODE ENDS HERE\n \n # Retrieve also A1 and A2 from dictionary \"cache\".\n #(≈ 2 lines of code)\n # A1 = ...\n # A2 = ...\n # YOUR CODE STARTS HERE\n A1 = cache['A1']\n A2 = cache['A2']\n \n # YOUR CODE ENDS HERE\n \n # Backward propagation: calculate dW1, db1, dW2, db2. \n #(≈ 6 lines of code, corresponding to 6 equations on slide above)\n # dZ2 = ...\n # dW2 = ...\n # db2 = ...\n # dZ1 = ...\n # dW1 = ...\n # db1 = ...\n # YOUR CODE STARTS HERE\n dZ2 = A2 - Y\n dW2 = (1/m)*(np.dot(dZ2, A1.T))\n db2 = (1/m)*(np.sum(dZ2, axis = 1, keepdims=True))\n dZ1 = np.dot(W2.T, dZ2)*(1 - np.power(A1, 2))\n dW1 = (1/m)*(np.dot(dZ1, X.T))\n db1 = (1/m)*(np.sum(dZ1, axis = 1, keepdims=True))\n \n # YOUR CODE ENDS HERE\n \n grads = {\"dW1\": dW1,\n \"db1\": db1,\n \"dW2\": dW2,\n \"db2\": db2}\n \n return grads", "_____no_output_____" ], [ "parameters, cache, t_X, t_Y = backward_propagation_test_case()\n\ngrads = backward_propagation(parameters, cache, t_X, t_Y)\nprint (\"dW1 = \"+ str(grads[\"dW1\"]))\nprint (\"db1 = \"+ str(grads[\"db1\"]))\nprint (\"dW2 = \"+ str(grads[\"dW2\"]))\nprint (\"db2 = \"+ str(grads[\"db2\"]))\n\nbackward_propagation_test(backward_propagation)", "dW1 = [[ 0.00301023 -0.00747267]\n [ 0.00257968 -0.00641288]\n [-0.00156892 0.003893 ]\n [-0.00652037 0.01618243]]\ndb1 = [[ 0.00176201]\n [ 0.00150995]\n [-0.00091736]\n [-0.00381422]]\ndW2 = [[ 0.00078841 0.01765429 -0.00084166 -0.01022527]]\ndb2 = [[-0.16655712]]\n\u001b[92m All tests passed.\n" ] ], [ [ "***Expected output***\n```\ndW1 = [[ 0.00301023 -0.00747267]\n [ 0.00257968 -0.00641288]\n [-0.00156892 0.003893 ]\n [-0.00652037 0.01618243]]\ndb1 = [[ 0.00176201]\n [ 0.00150995]\n [-0.00091736]\n [-0.00381422]]\ndW2 = [[ 0.00078841 0.01765429 -0.00084166 -0.01022527]]\ndb2 = [[-0.16655712]]\n```", "_____no_output_____" ], [ "<a name='4-6'></a>\n### 4.6 - Update Parameters \n\n<a name='ex-7'></a>\n### Exercise 7 - update_parameters\n\nImplement the update rule. Use gradient descent. You have to use (dW1, db1, dW2, db2) in order to update (W1, b1, W2, b2).\n\n**General gradient descent rule**: $\\theta = \\theta - \\alpha \\frac{\\partial J }{ \\partial \\theta }$ where $\\alpha$ is the learning rate and $\\theta$ represents a parameter.\n\n<img src=\"images/sgd.gif\" style=\"width:400;height:400;\"> <img src=\"images/sgd_bad.gif\" style=\"width:400;height:400;\">\n<caption><center><font color='purple'><b>Figure 2</b>: The gradient descent algorithm with a good learning rate (converging) and a bad learning rate (diverging). 
Images courtesy of Adam Harley.</font></center></caption>\n\n**Hint**\n\n- Use `copy.deepcopy(...)` when copying lists or dictionaries that are passed as parameters to functions. It avoids input parameters being modified within the function. In some scenarios, this could be inefficient, but it is required for grading purposes.\n", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: update_parameters\n\ndef update_parameters(parameters, grads, learning_rate = 1.2):\n \"\"\"\n Updates parameters using the gradient descent update rule given above\n \n Arguments:\n parameters -- python dictionary containing your parameters \n grads -- python dictionary containing your gradients \n \n Returns:\n parameters -- python dictionary containing your updated parameters \n \"\"\"\n # Retrieve a copy of each parameter from the dictionary \"parameters\". Use copy.deepcopy(...) for W1 and W2\n #(≈ 4 lines of code)\n # W1 = ...\n # b1 = ...\n # W2 = ...\n # b2 = ...\n # YOUR CODE STARTS HERE\n W1 = parameters[\"W1\"]\n b1 = parameters[\"b1\"]\n W2 = parameters[\"W2\"]\n b2 = parameters[\"b2\"]\n \n # YOUR CODE ENDS HERE\n \n # Retrieve each gradient from the dictionary \"grads\"\n #(≈ 4 lines of code)\n # dW1 = ...\n # db1 = ...\n # dW2 = ...\n # db2 = ...\n # YOUR CODE STARTS HERE\n dW1 = grads[\"dW1\"]\n db1 = grads[\"db1\"]\n dW2 = grads[\"dW2\"]\n db2 = grads[\"db2\"]\n \n # YOUR CODE ENDS HERE\n \n # Update rule for each parameter\n #(≈ 4 lines of code)\n # W1 = ...\n # b1 = ...\n # W2 = ...\n # b2 = ...\n # YOUR CODE STARTS HERE\n W1 = W1 - learning_rate*dW1\n b1 = b1 - learning_rate*db1\n W2 = W2 - learning_rate*dW2\n b2 = b2 - learning_rate*db2\n \n # YOUR CODE ENDS HERE\n \n parameters = {\"W1\": W1,\n \"b1\": b1,\n \"W2\": W2,\n \"b2\": b2}\n \n return parameters", "_____no_output_____" ], [ "parameters, grads = update_parameters_test_case()\nparameters = update_parameters(parameters, grads)\n\nprint(\"W1 = \" + str(parameters[\"W1\"]))\nprint(\"b1 = \" + str(parameters[\"b1\"]))\nprint(\"W2 = \" + str(parameters[\"W2\"]))\nprint(\"b2 = \" + str(parameters[\"b2\"]))\n\nupdate_parameters_test(update_parameters)", "W1 = [[-0.00643025 0.01936718]\n [-0.02410458 0.03978052]\n [-0.01653973 -0.02096177]\n [ 0.01046864 -0.05990141]]\nb1 = [[-1.02420756e-06]\n [ 1.27373948e-05]\n [ 8.32996807e-07]\n [-3.20136836e-06]]\nW2 = [[-0.01041081 -0.04463285 0.01758031 0.04747113]]\nb2 = [[0.00010457]]\n\u001b[92m All tests passed.\n" ] ], [ [ "***Expected output***\n```\nW1 = [[-0.00643025 0.01936718]\n [-0.02410458 0.03978052]\n [-0.01653973 -0.02096177]\n [ 0.01046864 -0.05990141]]\nb1 = [[-1.02420756e-06]\n [ 1.27373948e-05]\n [ 8.32996807e-07]\n [-3.20136836e-06]]\nW2 = [[-0.01041081 -0.04463285 0.01758031 0.04747113]]\nb2 = [[0.00010457]]\n```", "_____no_output_____" ], [ "<a name='4-7'></a>\n### 4.7 - Integration\n\nIntegrate your functions in `nn_model()` \n\n<a name='ex-8'></a>\n### Exercise 8 - nn_model\n\nBuild your neural network model in `nn_model()`.\n\n**Instructions**: The neural network model has to use the previous functions in the right order.", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: nn_model\n\ndef nn_model(X, Y, n_h, num_iterations = 10000, print_cost=False):\n \"\"\"\n Arguments:\n X -- dataset of shape (2, number of examples)\n Y -- labels of shape (1, number of examples)\n n_h -- size of the hidden layer\n num_iterations -- Number of iterations in gradient descent loop\n print_cost -- if True, print the cost every 1000 iterations\n \n Returns:\n parameters -- parameters learnt by 
the model. They can then be used to predict.\n \"\"\"\n \n np.random.seed(3)\n n_x = layer_sizes(X, Y)[0]\n n_y = layer_sizes(X, Y)[2]\n \n # Initialize parameters\n #(≈ 1 line of code)\n # parameters = ...\n # YOUR CODE STARTS HERE\n parameters = initialize_parameters(n_x, n_h, n_y)\n \n # YOUR CODE ENDS HERE\n \n # Loop (gradient descent)\n\n for i in range(0, num_iterations):\n \n #(≈ 4 lines of code)\n # Forward propagation. Inputs: \"X, parameters\". Outputs: \"A2, cache\".\n # A2, cache = ...\n \n # Cost function. Inputs: \"A2, Y\". Outputs: \"cost\".\n # cost = ...\n \n # Backpropagation. Inputs: \"parameters, cache, X, Y\". Outputs: \"grads\".\n # grads = ...\n \n # Gradient descent parameter update. Inputs: \"parameters, grads\". Outputs: \"parameters\".\n # parameters = ...\n \n # YOUR CODE STARTS HERE\n A2, cache = forward_propagation(X, parameters)\n cost = compute_cost(A2, Y)\n grads = backward_propagation(parameters, cache, X, Y)\n parameters = update_parameters(parameters, grads)\n # YOUR CODE ENDS HERE\n \n # Print the cost every 1000 iterations\n if print_cost and i % 1000 == 0:\n print (\"Cost after iteration %i: %f\" %(i, cost))\n\n return parameters", "_____no_output_____" ], [ "t_X, t_Y = nn_model_test_case()\nparameters = nn_model(t_X, t_Y, 4, num_iterations=10000, print_cost=True)\n\nprint(\"W1 = \" + str(parameters[\"W1\"]))\nprint(\"b1 = \" + str(parameters[\"b1\"]))\nprint(\"W2 = \" + str(parameters[\"W2\"]))\nprint(\"b2 = \" + str(parameters[\"b2\"]))\n\nnn_model_test(nn_model)", "(1, 3)\n(1, 3)\nCost after iteration 0: 0.692739\nCost after iteration 1000: 0.000218\nCost after iteration 2000: 0.000107\nCost after iteration 3000: 0.000071\nCost after iteration 4000: 0.000053\nCost after iteration 5000: 0.000042\nCost after iteration 6000: 0.000035\nCost after iteration 7000: 0.000030\nCost after iteration 8000: 0.000026\nCost after iteration 9000: 0.000023\nW1 = [[-0.65848169 1.21866811]\n [-0.76204273 1.39377573]\n [ 0.5792005 -1.10397703]\n [ 0.76773391 -1.41477129]]\nb1 = [[ 0.287592 ]\n [ 0.3511264 ]\n [-0.2431246 ]\n [-0.35772805]]\nW2 = [[-2.45566237 -3.27042274 2.00784958 3.36773273]]\nb2 = [[0.20459656]]\n(1, 3)\n(1, 3)\n(1, 3)\n(1, 3)\n(1, 3)\n(1, 3)\n\u001b[92m All tests passed.\n" ] ], [ [ "***Expected output***\n```\nCost after iteration 0: 0.692739\nCost after iteration 1000: 0.000218\nCost after iteration 2000: 0.000107\n...\nCost after iteration 8000: 0.000026\nCost after iteration 9000: 0.000023\nW1 = [[-0.65848169 1.21866811]\n [-0.76204273 1.39377573]\n [ 0.5792005 -1.10397703]\n [ 0.76773391 -1.41477129]]\nb1 = [[ 0.287592 ]\n [ 0.3511264 ]\n [-0.2431246 ]\n [-0.35772805]]\nW2 = [[-2.45566237 -3.27042274 2.00784958 3.36773273]]\nb2 = [[0.20459656]]\n```", "_____no_output_____" ], [ "<a name='5'></a>\n## 5 - Test the Model\n\n<a name='5-1'></a>\n### 5.1 - Predict\n\n<a name='ex-9'></a>\n### Exercise 9 - predict\n\nPredict with your model by building `predict()`.\nUse forward propagation to predict results.\n\n**Reminder**: predictions = $y_{prediction} = \\mathbb 1 \\text{{activation > 0.5}} = \\begin{cases}\n 1 & \\text{if}\\ activation > 0.5 \\\\\n 0 & \\text{otherwise}\n \\end{cases}$ \n \nAs an example, if you would like to set the entries of a matrix X to 0 and 1 based on a threshold you would do: ```X_new = (X > threshold)```", "_____no_output_____" ] ], [ [ "# GRADED FUNCTION: predict\n\ndef predict(parameters, X):\n \"\"\"\n Using the learned parameters, predicts a class for each example in X\n \n Arguments:\n parameters -- python 
dictionary containing your parameters \n X -- input data of size (n_x, m)\n \n Returns\n predictions -- vector of predictions of our model (red: 0 / blue: 1)\n \"\"\"\n \n # Computes probabilities using forward propagation, and classifies to 0/1 using 0.5 as the threshold.\n #(≈ 2 lines of code)\n # A2, cache = ...\n # predictions = ...\n # YOUR CODE STARTS HERE\n A2, cache = forward_propagation(X, parameters)\n predictions = (A2 > 0.5)\n # YOUR CODE ENDS HERE\n \n return predictions", "_____no_output_____" ], [ "parameters, t_X = predict_test_case()\n\npredictions = predict(parameters, t_X)\nprint(\"Predictions: \" + str(predictions))\n\npredict_test(predict)", "Predictions: [[ True False True]]\n\u001b[92m All tests passed.\n" ] ], [ [ "***Expected output***\n```\nPredictions: [[ True False True]]\n```", "_____no_output_____" ], [ "<a name='5-2'></a>\n### 5.2 - Test the Model on the Planar Dataset\n\nIt's time to run the model and see how it performs on a planar dataset. Run the following code to test your model with a single hidden layer of $n_h$ hidden units!", "_____no_output_____" ] ], [ [ "# Build a model with a n_h-dimensional hidden layer\nparameters = nn_model(X, Y, n_h = 4, num_iterations = 10000, print_cost=True)\n\n# Plot the decision boundary\nplot_decision_boundary(lambda x: predict(parameters, x.T), X, Y)\nplt.title(\"Decision Boundary for hidden layer size \" + str(4))", "(1, 400)\n(1, 400)\nCost after iteration 0: 0.693048\nCost after iteration 1000: 0.288083\nCost after iteration 2000: 0.254385\nCost after iteration 3000: 0.233864\nCost after iteration 4000: 0.226792\nCost after iteration 5000: 0.222644\nCost after iteration 6000: 0.219731\nCost after iteration 7000: 0.217504\nCost after iteration 8000: 0.219430\nCost after iteration 9000: 0.218551\n" ], [ "# Print accuracy\npredictions = predict(parameters, X)\nprint ('Accuracy: %d' % float((np.dot(Y, predictions.T) + np.dot(1 - Y, 1 - predictions.T)) / float(Y.size) * 100) + '%')", "Accuracy: 90%\n" ] ], [ [ "**Expected Output**: \n\n<table style=\"width:30%\">\n <tr>\n <td><b>Accuracy</b></td>\n <td> 90% </td> \n </tr>\n</table>", "_____no_output_____" ], [ "Accuracy is really high compared to Logistic Regression. The model has learned the patterns of the flower's petals! Unlike logistic regression, neural networks are able to learn even highly non-linear decision boundaries. ", "_____no_output_____" ], [ "### Congrats on finishing this Programming Assignment! \n\nHere's a quick recap of all you just accomplished: \n\n- Built a complete 2-class classification neural network with a hidden layer\n- Made good use of a non-linear unit\n- Computed the cross entropy loss\n- Implemented forward and backward propagation\n- Seen the impact of varying the hidden layer size, including overfitting.\n\nYou've created a neural network that can learn patterns! Excellent work. Below, there are some optional exercises to try out some other hidden layer sizes, and other datasets. ", "_____no_output_____" ], [ "<a name='6'></a>\n## 6 - Tuning hidden layer size (optional/ungraded exercise)\n\nRun the following code(it may take 1-2 minutes). 
Then, observe different behaviors of the model for various hidden layer sizes.", "_____no_output_____" ] ], [ [ "# This may take about 2 minutes to run\n\nplt.figure(figsize=(16, 32))\nhidden_layer_sizes = [1, 2, 3, 4, 5, 20, 50]\nfor i, n_h in enumerate(hidden_layer_sizes):\n plt.subplot(5, 2, i+1)\n plt.title('Hidden Layer of size %d' % n_h)\n parameters = nn_model(X, Y, n_h, num_iterations = 5000)\n plot_decision_boundary(lambda x: predict(parameters, x.T), X, Y)\n predictions = predict(parameters, X)\n accuracy = float((np.dot(Y,predictions.T) + np.dot(1 - Y, 1 - predictions.T)) / float(Y.size)*100)\n print (\"Accuracy for {} hidden units: {} %\".format(n_h, accuracy))", "(1, 400)\n(1, 400)\nAccuracy for 1 hidden units: 67.5 %\n(1, 400)\n(1, 400)\nAccuracy for 2 hidden units: 67.25 %\n(1, 400)\n(1, 400)\nAccuracy for 3 hidden units: 90.75 %\n(1, 400)\n(1, 400)\nAccuracy for 4 hidden units: 90.5 %\n(1, 400)\n(1, 400)\nAccuracy for 5 hidden units: 91.25 %\n(1, 400)\n(1, 400)\nAccuracy for 20 hidden units: 90.0 %\n(1, 400)\n(1, 400)\nAccuracy for 50 hidden units: 90.25 %\n" ] ], [ [ "**Interpretation**:\n- The larger models (with more hidden units) are able to fit the training set better, until eventually the largest models overfit the data. \n- The best hidden layer size seems to be around n_h = 5. Indeed, a value around here seems to fits the data well without also incurring noticeable overfitting.\n- Later, you'll become familiar with regularization, which lets you use very large models (such as n_h = 50) without much overfitting. ", "_____no_output_____" ], [ "**Note**: Remember to submit the assignment by clicking the blue \"Submit Assignment\" button at the upper-right. \n\n**Some optional/ungraded questions that you can explore if you wish**: \n- What happens when you change the tanh activation for a sigmoid activation or a ReLU activation?\n- Play with the learning_rate. What happens?\n- What if we change the dataset? (See part 5 below!)", "_____no_output_____" ], [ "<a name='7'></a>\n## 7- Performance on other datasets", "_____no_output_____" ], [ "If you want, you can rerun the whole notebook (minus the dataset part) for each of the following datasets.", "_____no_output_____" ] ], [ [ "# Datasets\nnoisy_circles, noisy_moons, blobs, gaussian_quantiles, no_structure = load_extra_datasets()\n\ndatasets = {\"noisy_circles\": noisy_circles,\n \"noisy_moons\": noisy_moons,\n \"blobs\": blobs,\n \"gaussian_quantiles\": gaussian_quantiles}\n\n### START CODE HERE ### (choose your dataset)\ndataset = \"noisy_moons\"\n### END CODE HERE ###\n\nX, Y = datasets[dataset]\nX, Y = X.T, Y.reshape(1, Y.shape[0])\n\n# make blobs binary\nif dataset == \"blobs\":\n Y = Y%2\n\n# Visualize the data\nplt.scatter(X[0, :], X[1, :], c=Y, s=40, cmap=plt.cm.Spectral);", "_____no_output_____" ] ], [ [ "**References**:\n\n- http://scs.ryerson.ca/~aharley/neural-networks/\n- http://cs231n.github.io/neural-networks-case-study/", "_____no_output_____" ] ] ]
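The graded `update_parameters` cell earlier in this notebook retrieves `W1` and `W2` directly from the dictionary even though its own comment asks for `copy.deepcopy(...)`. As a hedged, self-contained sketch of the same gradient descent update with the deep copy applied as the hint intends (the parameter shapes and learning rate below are illustrative assumptions, not taken from the grader):

```python
import copy
import numpy as np

def update_parameters_sketch(parameters, grads, learning_rate=1.2):
    # Deep-copy so the caller's dictionary is not mutated in place
    params = copy.deepcopy(parameters)
    for key in ("W1", "b1", "W2", "b2"):
        # Standard gradient descent step: theta = theta - learning_rate * dtheta
        params[key] = params[key] - learning_rate * grads["d" + key]
    return params

# Toy example with illustrative shapes (n_x=2, n_h=4, n_y=1)
rng = np.random.default_rng(0)
parameters = {"W1": rng.standard_normal((4, 2)), "b1": np.zeros((4, 1)),
              "W2": rng.standard_normal((1, 4)), "b2": np.zeros((1, 1))}
grads = {"dW1": np.ones((4, 2)), "db1": np.ones((4, 1)),
         "dW2": np.ones((1, 4)), "db2": np.ones((1, 1))}
updated = update_parameters_sketch(parameters, grads, learning_rate=0.1)
# The update changed the returned copy, not the caller's arrays
assert not np.allclose(updated["W1"], parameters["W1"])
```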
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ] ]
d0279dda138d7ae3deec64a3a5334dcc2e319c3d
36,096
ipynb
Jupyter Notebook
loc_clust_stripe_segmentation.ipynb
YuTian8328/flow-based-clustering
da293edbfac058f5908fc0ab057d3097f0becc47
[ "MIT" ]
null
null
null
loc_clust_stripe_segmentation.ipynb
YuTian8328/flow-based-clustering
da293edbfac058f5908fc0ab057d3097f0becc47
[ "MIT" ]
null
null
null
loc_clust_stripe_segmentation.ipynb
YuTian8328/flow-based-clustering
da293edbfac058f5908fc0ab057d3097f0becc47
[ "MIT" ]
null
null
null
53.475556
7,084
0.752438
[ [ [ "import numpy as np\nimport datetime\n\n\nimport matplotlib.pyplot as plt\nfrom PIL import Image\n\nfrom scipy.sparse import csr_matrix\nimport matplotlib.pyplot as plt\n\nfrom sklearn.cluster import KMeans\n\nfrom numpy.linalg import norm\nfrom sklearn.feature_extraction import image\nimport warnings\nwarnings.filterwarnings(\"ignore\")\n", "_____no_output_____" ], [ "def get_B_and_weight_vec(n_nodes,threshold,sigma=1):\n '''\n Generate graph structure from the image to be segmented.\n Inputs:\n n_nodes: number of nodes, i.e. number of pixels\n threshold: threshold to drop edges with small weights (weak similarities)\n sigma: parameter to scale edge weights\n Outputs:\n B: Incidence matrix\n Weight_vec: edge_wise weights\n '''\n N = n_nodes\n row = []\n col = []\n data = []\n weight_vec = []\n cnt = 0\n# \n for i in range(N):\n for j in [i+1,i+100]:\n if j>=2900:\n continue\n if np.exp(-norm(img[i]-img[j])/(2*sigma**2)) > threshold:\n row.append(cnt)\n col.append(i)\n data.append(1)\n\n row.append(cnt)\n col.append(j)\n data.append(-1)\n cnt += 1\n weight_vec.append(np.exp(-norm(img[i]-img[j])/(2*sigma**2)))\n\n B = csr_matrix((data, (row, col)), shape=(cnt, N))\n weight_vec = np.array(weight_vec)\n return B, weight_vec", "_____no_output_____" ], [ "def algorithm(B, weight_vec, seeds,K=15000,alpha=0.02, lambda_nLasso=None, check_s=False):\n E, N = B.shape\n# weight_vec = np.ones(E)\n\n Gamma_vec = np.array(1./(np.sum(abs(B), 0)))[0] # \\in [0, 1]\n Gamma = np.diag(Gamma_vec)\n\n Sigma = 0.5\n\n seednodesindicator= np.zeros(N)\n seednodesindicator[seeds] = 1 \n\n\n \n noseednodeindicator = np.ones(N)\n noseednodeindicator[seeds] = 0\n \n if lambda_nLasso == None:\n lambda_nLasso = 2 / math.sqrt(np.sum(weight_vec))\n \n if check_s:\n s = 0.0\n for item in range(len(weight_vec)):\n x = B[item].toarray()[0]\n i = np.where(x == -1)[0][0]\n j = np.where(x == 1)[0][0]\n if i < N1 <= j:\n s += weight_vec[item]\n elif i >= N1 > j:\n s += weight_vec[item]\n\n if lambda_nLasso * s >= alpha * N2 / 2:\n print ('eq(24)', lambda_nLasso * s, alpha * N2 / 2)\n \n fac_alpha = 1./(Gamma_vec*alpha+1) # \\in [0, 1]\n\n hatx = np.zeros(N)\n newx = np.zeros(N)\n prevx = np.zeros(N)\n haty = np.array([x/(E-1) for x in range(0, E)])\n history = []\n for iterk in range(K):\n # if 0 < np.max(abs(newx - prevx)) < 1e-4:\n # print(iterk)\n # break\n tildex = 2 * hatx - prevx\n newy = haty + Sigma * B.dot(tildex) # chould be negative\n haty = newy / np.maximum(abs(newy) / (lambda_nLasso * weight_vec), np.ones(E)) # could be negative\n\n newx = hatx - Gamma_vec * B.T.dot(haty) # could be negative\n newx[seeds] = (newx[seeds] + Gamma_vec[seeds]) / (1 + Gamma_vec[seeds])\n\n newx = seednodesindicator * newx + noseednodeindicator * (newx * fac_alpha)\n prevx = np.copy(hatx)\n hatx = newx # could be negative\n history.append(newx)\n \n history = np.array(history)\n\n return history\n ", "_____no_output_____" ], [ "#load the image\nimg=Image.open(\"stripes.png\")", "_____no_output_____" ] ], [ [ "# Preprocess the image", "_____no_output_____" ] ], [ [ "# resize the image\nbasewidth = 100\nwpercent = (basewidth / float(img.size[0]))\nhsize = int((float(img.size[1]) * float(wpercent)))\nimg = img.resize((basewidth, hsize), Image.ANTIALIAS)\nimg = np.array(img)[:,:,:3]\nprint(img.shape)\nplt.imshow(img)", "(29, 100, 3)\n" ], [ "img = img.reshape(-1,3)\nimg.shape", "_____no_output_____" ] ], [ [ "# Perform the segmentation task via Kmeans", "_____no_output_____" ] ], [ [ "kmeans = 
KMeans(n_clusters=2).fit(img)\nplt.imshow(kmeans.labels_.reshape(29,100))", "_____no_output_____" ] ], [ [ "# Perform the task via our algorithm", "_____no_output_____" ] ], [ [ "# generate graph from image\nimg = img.reshape(-1,3)/255\nn_nodes=img.shape[0]\nprint(\"number of nodes:\",n_nodes )\n\nB,weight=get_B_and_weight_vec(n_nodes,0.2,1)\n# plt.hist(weight,bins=30) #distribution of similarity measure", "number of nodes: 2900\n" ], [ "def run_seg(n_nodes,seeds,threshold, K=30, alpha=0.1, lambda_nLasso=0.1):\n B, weight_vec = get_B_and_weight_vec(n_nodes,threshold)\n \n start = datetime.datetime.now()\n history = algorithm(B, weight_vec, seeds=seeds, K=K, alpha=alpha, lambda_nLasso=lambda_nLasso)\n print('our method time: ', datetime.datetime.now() - start) \n return history", "_____no_output_____" ], [ "# generate seeds according to the labels assigned by kmeans\nseeds = np.random.choice(np.where(kmeans.labels_==0)[0],20)\n\n# run our algorithm and visulize the result before feed it to kmeans\nhistory = run_seg(n_nodes=n_nodes,seeds=seeds,threshold = 0.95, K=1000,alpha=0.01, lambda_nLasso=1)\nplt.imshow(history[-1].reshape(29,100))", "our method time: 0:00:00.431303\n" ], [ "# Feed the node signal from our algorithm to kmeans to complete clustering (2 clusters)\nhistory=np.nan_to_num(history)\nkmeans = KMeans(n_clusters=2).fit(history[-1].reshape(len(history[-1]), 1))\n\n#visulize the segmentation result\nsegmented = kmeans.labels_\nplt.imshow(segmented.reshape((29,100)))", "_____no_output_____" ] ], [ [ "# Perform the segmentation task via spectral clustering", "_____no_output_____" ] ], [ [ "from sklearn.cluster import SpectralClustering\n\ns=SpectralClustering(2).fit(img)\n\nplt.imshow(s.labels_.reshape(29,100))", "_____no_output_____" ], [ "# Python3 Program to print BFS traversal\n# from a given source vertex. BFS(int s)\n# traverses vertices reachable from s.\nfrom collections import defaultdict\n\n# This class represents a directed graph\n# using adjacency list representation\nclass Graph:\n\n\t# Constructor\n\tdef __init__(self):\n\n\t\t# default dictionary to store graph\n\t\tself.graph = defaultdict(list)\n\n\t# function to add an edge to graph\n\tdef addEdge(self,u,v):\n\t\tself.graph[u].append(v)\n\n\t# Function to print a BFS of graph\n\tdef BFS(self, s):\n\n\t\t# Mark all the vertices as not visited\n\t\tvisited = [False] * (max(self.graph) + 1)\n\n\t\t# Create a queue for BFS\n\t\tqueue = []\n\n\t\t# Mark the source node as\n\t\t# visited and enqueue it\n\t\tqueue.append(s)\n\t\tvisited[s] = True\n\n\t\twhile queue:\n\n\t\t\t# Dequeue a vertex from\n\t\t\t# queue and print it\n\t\t\ts = queue.pop(0)\n\t\t\tprint (s, end = \" \")\n\n\t\t\t# Get all adjacent vertices of the\n\t\t\t# dequeued vertex s. If a adjacent\n\t\t\t# has not been visited, then mark it\n\t\t\t# visited and enqueue it\n\t\t\tfor i in self.graph[s]:\n\t\t\t\tif visited[i] == False:\n\t\t\t\t\tqueue.append(i)\n\t\t\t\t\tvisited[i] = True\n\n# Driver code\n\n# Create a graph given in\n# the above diagram\ng = Graph()\ng.addEdge(0, 1)\ng.addEdge(0, 2)\ng.addEdge(1, 2)\ng.addEdge(2, 0)\ng.addEdge(2, 3)\ng.addEdge(3, 3)\n\nprint (\"Following is Breadth First Traversal\"\n\t\t\t\t\" (starting from vertex 2)\")\ng.BFS(2)\n\n# This code is contributed by Neelam Yadav\n\n", "Following is Breadth First Traversal (starting from vertex 2)\n2 0 3 1 " ] ] ]
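`get_B_and_weight_vec` in the segmentation notebook builds a sparse incidence matrix B (one row per edge, with +1/-1 at the two endpoints) and Gaussian similarity weights exp(-||img_i - img_j|| / (2*sigma^2)). A toy three-pixel sketch of the same construction, with made-up pixel values and sigma = 1:

```python
import numpy as np
from numpy.linalg import norm
from scipy.sparse import csr_matrix

img = np.array([[0.10, 0.10, 0.10],
                [0.12, 0.10, 0.11],
                [0.90, 0.85, 0.90]])   # 3 toy RGB pixels
edges = [(0, 1), (1, 2)]               # a path graph over the pixels
rows, cols, data, weights = [], [], [], []
for e, (i, j) in enumerate(edges):
    rows += [e, e]; cols += [i, j]; data += [1, -1]        # +1/-1 incidence entries
    weights.append(np.exp(-norm(img[i] - img[j]) / (2 * 1.0**2)))
B = csr_matrix((data, (rows, cols)), shape=(len(edges), len(img)))
print(B.toarray())   # [[ 1 -1  0], [ 0  1 -1]]
print(weights)       # edge (0,1) is similar -> weight near 1; edge (1,2) is dissimilar -> smaller
```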
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ] ]
d027b1c892d498010316ee8523329537452ac06a
1,648
ipynb
Jupyter Notebook
Chapter01/.ipynb_checkpoints/13 Jumping between frames in video file-checkpoint.ipynb
PCJimmmy/OpenCV-3-Computer-Vision-with-Python-Cookbook
08be606384e3439183599c147291901d80fc8310
[ "MIT" ]
1
2019-08-18T03:53:01.000Z
2019-08-18T03:53:01.000Z
Chapter01/.ipynb_checkpoints/13 Jumping between frames in video file-checkpoint.ipynb
HardToMake/OpenCV-3-Computer-Vision-with-Python-Cookbook
325dc921cb89bcfa029241e8ca5644b343be53b0
[ "MIT" ]
1
2020-06-29T06:25:37.000Z
2020-06-29T06:25:37.000Z
Chapter01/.ipynb_checkpoints/13 Jumping between frames in video file-checkpoint.ipynb
HardToMake/OpenCV-3-Computer-Vision-with-Python-Cookbook
325dc921cb89bcfa029241e8ca5644b343be53b0
[ "MIT" ]
2
2019-08-12T01:02:07.000Z
2021-02-18T15:02:45.000Z
21.973333
65
0.534587
[ [ [ "import cv2\n\ncapture = cv2.VideoCapture('../data/drop.avi')\nframe_count = capture.get(cv2.CAP_PROP_FRAME_COUNT)\nprint('Frame count:', frame_count)\n\nprint('Position:', capture.get(cv2.CAP_PROP_POS_FRAMES))\n_, frame = capture.read()\ncv2.imshow('frame0', frame)\n\nprint('Position:', capture.get(cv2.CAP_PROP_POS_FRAMES))\n_, frame = capture.read()\ncv2.imshow('frame1', frame)\n\ncapture.set(cv2.CAP_PROP_POS_FRAMES, 100)\nprint('Position:', capture.get(cv2.CAP_PROP_POS_FRAMES))\n_, frame = capture.read()\ncv2.imshow('frame100', frame)\n \ncv2.waitKey()\ncv2.destroyAllWindows()", "Frame count: 182.0\nPosition: 0.0\nPosition: 1.0\nPosition: 100.0\n" ] ] ]
[ "code" ]
[ [ "code" ] ]
d027c6708e27ee5d934bde790c253e3e8b02fd52
490,784
ipynb
Jupyter Notebook
project1/.ipynb_checkpoints/Ex2_Eve_Rahbe_235549-checkpoint.ipynb
antoine-alleon/Biological_Modelling_Neural_Network_python_exercises
b266654975a37033ac22e29197e7930ccc3ccfc6
[ "MIT" ]
null
null
null
project1/.ipynb_checkpoints/Ex2_Eve_Rahbe_235549-checkpoint.ipynb
antoine-alleon/Biological_Modelling_Neural_Network_python_exercises
b266654975a37033ac22e29197e7930ccc3ccfc6
[ "MIT" ]
null
null
null
project1/.ipynb_checkpoints/Ex2_Eve_Rahbe_235549-checkpoint.ipynb
antoine-alleon/Biological_Modelling_Neural_Network_python_exercises
b266654975a37033ac22e29197e7930ccc3ccfc6
[ "MIT" ]
null
null
null
360.605437
42,872
0.924924
[ [ [ "# Solution Graded Exercise 1: Leaky-integrate-and-fire model", "_____no_output_____" ], [ "first name: Eve\n\nlast name: Rahbe\n\nsciper: 235549\n\ndate: 21.03.2018\n\n*Your teammate*\n\nfirst name of your teammate: Antoine\n\nlast name of your teammate: Alleon\n\nsciper of your teammate: 223333\n\n\nNote: You are allowed to discuss the concepts with your class mates. You are not allowed to share code. You have to understand every line of code you write in this notebook. We will ask you questions about your submission during a fraud detection session during the last week of the semester.\n\nIf you are asked for plots: The appearance of the plots (labelled axes, useful scaling etc.) is important!\n\nIf you are asked for discussions: Answer in a precise way and try to be concise. \n\n\n** Submission **\n\nRename this notebook to Ex2_FirstName_LastName_Sciper.ipynb and upload that single file on moodle before the deadline.\n\n** Link to the exercise **\n\nhttp://neuronaldynamics-exercises.readthedocs.io/en/stable/exercises/leaky-integrate-and-fire.html", "_____no_output_____" ], [ "# Exercise 2, getting started", "_____no_output_____" ] ], [ [ "%matplotlib inline\nimport brian2 as b2\nimport matplotlib.pyplot as plt\nimport numpy as np\nfrom neurodynex.leaky_integrate_and_fire import LIF\nfrom neurodynex.tools import input_factory, plot_tools\n\n\nLIF.getting_started()\nLIF.print_default_parameters()", "nr of spikes: 0\n" ] ], [ [ "# 2.1 Exercise: minimal current\n## 2.1.1. Question: minimal current (calculation) \n#### [2 points]", "_____no_output_____" ] ], [ [ "from neurodynex.leaky_integrate_and_fire import LIF\nprint(\"resting potential: {}\".format(LIF.V_REST))\ni_min = (LIF.FIRING_THRESHOLD-LIF.V_REST)/LIF.MEMBRANE_RESISTANCE\nprint(\"minimal current i_min: {}\".format(i_min))", "resting potential: -0.07\nminimal current i_min: 2e-09\n" ] ], [ [ "The minimal current is :\n\n$i_{min} = \\frac{\\theta-u_{rest}}{R} = \\frac{-50-(-70) [mV]}{10 [Mohm]} = 2 [nA]$\n\n$\\theta$ is the firing threshold\n\n$u_{rest}$ is the resting potential\n\n$R$ is the membrane resistance", "_____no_output_____" ], [ "## 2.1.2. Question: minimal current (simulation)\n#### [2 points]", "_____no_output_____" ] ], [ [ "# create a step current with amplitude= i_min\nstep_current = input_factory.get_step_current(\n t_start=5, t_end=100, unit_time=b2.ms,\n amplitude= i_min) # set i_min to your value\n\n# run the LIF model.\n# Note: As we do not specify any model parameters, the simulation runs with the default values\n(state_monitor,spike_monitor) = LIF.simulate_LIF_neuron(input_current=step_current, simulation_time = 100 * b2.ms)\n\n# plot I and vm\nplot_tools.plot_voltage_and_current_traces(\nstate_monitor, step_current, title=\"min input\", firing_threshold=LIF.FIRING_THRESHOLD)\nprint(\"nr of spikes: {}\".format(spike_monitor.count[0])) # should be 0", "nr of spikes: 0\n" ] ], [ [ "# 2.2. Exercise: f-I Curve \n## 2.2.1. Question: f-I Curve and refractoryness\n", "_____no_output_____" ], [ "1 - Sketch or plot the curve with some program. 
You don't have to include it here, it is just for your understanding and will not be graded.\n\n2 - What is the maximum rate at which this neuron can fire?\n#### [3 points]", "_____no_output_____" ] ], [ [ "# create a step current with amplitude i_max\ni_max = 125 * b2.namp\nstep_current = input_factory.get_step_current(\n t_start=5, t_end=100, unit_time=b2.ms,\n amplitude=i_max)\n\n# run the LIF model and set the absolute refractory period to 3ms\n(state_monitor,spike_monitor) = LIF.simulate_LIF_neuron(input_current=step_current, simulation_time = 500 * b2.ms, \n abs_refractory_period=3 * b2.ms)\n\n# number of spikes\nprint(\"nr of spikes: {}\".format(spike_monitor.count[0]))\n\n# firing frequency\nT = 95e-03/spike_monitor.count[0]\nprint(\"T : {}\".format(T))\nprint(\"firing frequency : {}\".format(1/T))", "nr of spikes: 32\nT : 0.00296875\nfiring frequency : 336.842105263\n" ] ], [ [ "The maximum rate at which this neuron can fire is $f = 336.84 [Hz]$. \n", "_____no_output_____" ], [ "3 - Inject currents of different amplitudes (from 0nA to 100nA) into a LIF neuron. \nFor each current, run the simulation for 500ms and determine the firing frequency in Hz. Then plot the f-I curve. \n#### [4 points]", "_____no_output_____" ] ], [ [ "import numpy as np\nimport matplotlib.pyplot as plt\n\nfiring_frequency = []\n\n# create a step current with amplitude i from 0 to 100 nA\nfor i in range(0,100,1) :\n step_current = input_factory.get_step_current(\n t_start=5, t_end=100, unit_time=b2.ms,\n amplitude=i * b2.namp) # stock amplitude i from 0 to 100 nA\n \n # run the LIF model\n (state_monitor,spike_monitor) = LIF.simulate_LIF_neuron(input_current=step_current, simulation_time = 500 * b2.ms, \n abs_refractory_period= 3 * b2.ms)\n\n if (spike_monitor.count[0] == 0) :\n firing_frequency.append(0)\n else :\n # firing frequency\n T = 95e-03/spike_monitor.count[0]\n firing_frequency.append(1/T)", "_____no_output_____" ], [ "plt.xlabel(\"step current amplitude [nA]\")\nplt.ylabel(\"firing frequency [Hz]\")\nplt.title(\"f-I curve for step current\")\nplt.plot(range(0,100,1), firing_frequency)", "_____no_output_____" ] ], [ [ "# 2.3. Exercise: “Experimentally” estimate the parameters of a LIF neuron\n## 2.3.1. Question: “Read” the LIF parameters out of the vm plot\n#### [6 points]", "_____no_output_____" ], [ "My estimates for the parameters :\n\\begin{itemize}\n\\item Resting potential : $u_{rest} = -66 [mV]$.\n\\item Reset potential : $u_{reset} = -63 [mV]$ is the membrane potential between spikes.\n\\item Firing threshold : $\\theta = -38 [mV]$ using $\\theta = u(t_{inf})$ with step current amplitude $i_{min}$.\n\\item Membrane resitance : $R = \\frac{\\theta-u_{rest}}{i_{min}} = 12.7 [Mohm]$ with $i_{min}=2.2[nA]$ at the firing threshold.\n\\item Membrane time scale : $\\tau = 11.5 [ms]$ time to reach $63\\%$ of $\\theta$ with step curent amplitude $i_{min}$.\n\\item Absolute refractory period : $ t = 5 [ms]$ time before new spike after reset of the potential.\n\\end{itemize} \n", "_____no_output_____" ] ], [ [ "# get a random parameter. provide a random seed to have a reproducible experiment\nrandom_parameters = LIF.get_random_param_set(random_seed=432)\n\n# define your test current\ntest_current = input_factory.get_step_current(\n t_start=5, t_end=100, unit_time=b2.ms, amplitude= 10 * b2.namp)\n\n# probe the neuron. 
pass the test current AND the random params to the function\nstate_monitor, spike_monitor = LIF.simulate_random_neuron(test_current, random_parameters)\n\n# plot\nplot_tools.plot_voltage_and_current_traces(state_monitor, test_current, title=\"experiment\")\n\n# print the parameters to the console and compare with your estimates\nLIF.print_obfuscated_parameters(random_parameters)", "Resting potential: -0.066\nReset voltage: -0.063\nFiring threshold: -0.038\nMembrane resistance: 13000000.0\nMembrane time-scale: 0.013\nAbsolute refractory period: 0.005\n" ] ], [ [ "# 2.4. Exercise: Sinusoidal input current and subthreshold response\n## 2.4.1. Question\n#### [5 points]", "_____no_output_____" ] ], [ [ "# note the higher resolution when discretizing the sine wave: we specify unit_time=0.1 * b2.ms\nsinusoidal_current = input_factory.get_sinusoidal_current(200, 1000, unit_time=0.1 * b2.ms,\n amplitude= 2.5 * b2.namp, frequency=250*b2.Hz,\n direct_current=0. * b2.namp)\n\n# run the LIF model. By setting the firing threshold to to a high value, we make sure to stay in the linear (non spiking) regime.\n(state_monitor, spike_monitor) = LIF.simulate_LIF_neuron(input_current=sinusoidal_current, simulation_time = 120 * b2.ms, \n firing_threshold=0*b2.mV)\n\n# plot the membrane voltage\nplot_tools.plot_voltage_and_current_traces(state_monitor, sinusoidal_current, title = \"Sinusoidal input current\")\nprint(\"nr of spikes: {}\".format(spike_monitor.count[0]))\n\n# Calculate the amplitude of the membrane voltage\n# get the difference betwwen min value of the voltage and the resting potential\nprint(\"Amplitude of the membrane voltage : {} V\" .format(abs(np.min(np.asarray(state_monitor.v))-(-0.07))))\n\nimport scipy.signal\n\n# Calculate the phase of the membrane voltage\n# interpolation of the signals\nxx = np.interp(np.linspace(1,1002,1002),np.linspace(1,1200,1200),np.transpose(np.asarray(state_monitor.v))[:,0])\n# correlation\ncorr = scipy.signal.correlate(xx,sinusoidal_current.values[:,0])\n\ndt = np.arange(-1001,1002)\n# find the max correlation\nrecovered_time_shift = dt[corr.argmax()]\n# convert timeshift in phase between 0 and 2pi\nperiod = 1/0.250\nrecovered_phase_shift = np.pi*(((0.5 + recovered_time_shift/period) %1.0)-0.5)\nprint(\"Phase shift : {}\" .format(recovered_phase_shift))", "nr of spikes: 0\nAmplitude of the membrane voltage : 0.001985117111761442 V\nPhase shift : -1.5707963267948966\n" ] ], [ [ "The results are : \n\n$ A = 2 [mV]$ (computationnally and visually) and $phase = -\\pi/2$ (computationnally and visually).", "_____no_output_____" ], [ "## 2.4.2. Question\n#### [5 points]", "_____no_output_____" ] ], [ [ "# For input frequencies between 10Hz and 1kHz plot the resulting amplitude of subthreshold oscillations of the \n# membrane potential vs. input frequency.\n\namplitude = []\nfor i in range(15) :\n\n sinusoidal_current = input_factory.get_sinusoidal_current(200, 1000, unit_time=0.1 * b2.ms,\n amplitude= 2.5 * b2.namp, frequency=10.**(1.+i/7.)*b2.Hz,\n direct_current=0. * b2.namp)\n\n # run the LIF model.\n (state_monitor, spike_monitor) = LIF.simulate_LIF_neuron(input_current=sinusoidal_current, simulation_time = 120 * b2.ms, \n firing_threshold=0*b2.mV)\n\n amplitude.append(abs(np.min(np.asarray(state_monitor.v))-(-0.07)))", "_____no_output_____" ], [ "plt.xlabel(\"sinusoidal current frequency [Hz]\")\nplt.ylabel(\"amplitude of the membrane potential [mV]\")\nplt.title(\"Amplitude vs Input frequency\")\nplt.plot([10.**(1.+i/7.) 
for i in range(15)], amplitude)", "_____no_output_____" ] ], [ [ "## 2.4.3. Question\n#### [5 points]", "_____no_output_____" ] ], [ [ "# For input frequencies between 10Hz and 1kHz\n# plot the resulting phase shift of subthreshold oscillations of the membrane potential vs. input frequency.\nphase = []\nfor f in [10,50,100,250,500,750,1000] :\n \n sinusoidal_current = input_factory.get_sinusoidal_current(200, 1000, unit_time=0.1 * b2.ms,\n amplitude= 2.5 * b2.namp, frequency = f*b2.Hz,\n direct_current=0. * b2.namp)\n\n # run the LIF model.\n (state_monitor, spike_monitor) = LIF.simulate_LIF_neuron(input_current=sinusoidal_current, simulation_time = 120 * b2.ms, \n firing_threshold=0*b2.mV)\n\n xx = np.interp(np.linspace(1,1002,1002),np.linspace(1,1200,1200),np.transpose(np.asarray(state_monitor.v))[:,0])\n corr = scipy.signal.correlate(xx,sinusoidal_current.values[:,0])\n\n dt = np.arange(-1001,1002)\n recovered_time_shift = dt[corr.argmax()]\n period = 1000/f\n recovered_phase_shift = np.pi*(((0.5 + recovered_time_shift/period) %1.0)-0.5)\n \n phase.append(recovered_phase_shift)", "_____no_output_____" ], [ "plt.xlabel(\"sinusoidal current frequency [Hz]\")\nplt.ylabel(\"phase shift between membrane potential and input current\")\nplt.title(\"Phase shift of membrane potential vs Input frequency\")\nplt.plot([10,50,100,250,500,750,1000], phase)", "_____no_output_____" ] ], [ [ "## 2.4.4. Question\n#### [3 points]", "_____no_output_____" ], [ "It is a \\textbf{Low-Pass} filter because it amplifies low frequencies and attenuates high frequencies.", "_____no_output_____" ], [ "# 2.5 Leaky integrate-and-fire neuron with noisy input\nThis exercise is not available online. All information is given here.\nSo far you have explored the leaky integrate-and-fire model with step and sinusoidal input currents. We will now investigate the same neuron model with noisy input.\nThe voltage equation now is:\n\\begin{eqnarray}\n\\tau \\frac{du}{dt} = -u(t) + u_{rest} + RI(t) + RI_{noise}(t)\n\\end{eqnarray}\nwhere the noise is simply an additional term.\n\nTo implement the noise term in the above equation we will consider it as 'white noise', $I_{noise}(t) = \\sigma \\xi(t)$. White noise $\\xi$ is a stochastic process with expectation value $<\\xi(t)>=0$ and autocorrelation $<\\xi(t)\\xi(t+\\Delta)>=\\delta(\\Delta)$ Note that, as we saw in the Exercise set of Week 1, the $\\delta$-function has units of $1/time$, so $\\xi$ has units of $1/\\sqrt{time}$.\n\nIt can be shown that the discrete time implementation of a noisy voltage trajectory is:\n\\begin{eqnarray}\ndu = (-u(t) + u_{rest} + RI(t))\\frac{dt}{\\tau} + \\frac{R}{\\tau}\\sigma \\sqrt{dt}\\ y,\n\\end{eqnarray}\nwhere $y \\sim \\mathcal{N}(0, 1)$\n\nWe can then write, again for implementational purposes:\n\\begin{eqnarray}\ndu = \\big[-u(t) + u_{rest} + R(I(t) + \\sigma \\frac{1}{\\sqrt{dt}} y) \\big]\\frac{dt}{\\tau}\n\\end{eqnarray}\n\n\nNote that for the physical units to be consistent $\\sigma$ in our formulation has units of $current * \\sqrt{time}$. \n\nDetails of the above are beyond the scope of this exercise. 
If you would like to get more insights we refer to the paragraph 8.1 of the book (http://neuronaldynamics.epfl.ch/online/Ch8.S1.html), to http://www.scholarpedia.org/article/Stochastic_dynamical_systems#Ornstein-Uhlenbeck_process and regarding the implementational scaling of the noise to http://brian2.readthedocs.io/en/stable/user/models.html#time-scaling-of-noise.", "_____no_output_____" ], [ "### 2.5.1 Noisy step input current\n\n#### [7 points]\n1 - Implement the noisy current $I_0 + I_{noise}$ as described above. In order to do this edit the function get_noisy_step_current provided below. This is simply a copy of the code of the function get_step_current that you used earlier, and you just need to add the noisy part of the current at the indicated line (indicated by \"???\").\n\nThen create a noisy step current with amplitude $I_0 = 1.5nA$ and $\\sigma = 1 nA* \\sqrt{\\text{your time unit}}$ (e.g.: time_unit = 1 ms), run the LIF model and plot the input current and the membrane potential, as you did in the previous exercises.", "_____no_output_____" ] ], [ [ "def get_noisy_step_current(t_start, t_end, unit_time, amplitude, sigma, append_zero=True):\n\n \"\"\"Creates a step current with added noise. If t_start == t_end, then a single\n entry in the values array is set to amplitude.\n\n Args:\n t_start (int): start of the step\n t_end (int): end of the step\n unit_time (Quantity, Time): unit of t_start and t_end. e.g. 0.1*brian2.ms\n amplitude (Quantity): amplitude of the step. e.g. 3.5*brian2.uamp\n sigma (float): amplitude (std) of the noise. e.g. 0.1*b2.uamp\n append_zero (bool, optional): if true, 0Amp is appended at t_end+1.\n Without that trailing 0, Brian reads out the last value in the array (=amplitude) for all indices > t_end.\n\n Returns:\n TimedArray: Brian2.TimedArray\n \"\"\"\n\n assert isinstance(t_start, int), \"t_start_ms must be of type int\"\n assert isinstance(t_end, int), \"t_end must be of type int\"\n assert b2.units.fundamentalunits.have_same_dimensions(amplitude, b2.amp), \\\n \"amplitude must have the dimension of current e.g. brian2.uamp\"\n\n tmp_size = 1 + t_end # +1 for t=0\n if append_zero:\n tmp_size += 1\n tmp = np.zeros((tmp_size, 1)) * b2.amp\n \n tmp[t_start] = amplitude\n for i in range(t_start+1, t_end) :\n tmp[i] = amplitude + sigma*(time_step**(-0.5))*np.random.randn()\n \n curr = b2.TimedArray(tmp, dt= unit_time)\n return curr\n\n# -------------------\namplitude = 1.5*b2.nA\ntime_unit = 1.*b2.ms\ntime_step = 1.*b2.ms\nsigma = 1*b2.nA*time_unit**(0.5)\n\n# Create a noisy step current\nnoisy_step_current = get_noisy_step_current(t_start=50, t_end=500, unit_time = time_step,\n amplitude= amplitude, sigma = sigma)\n\n# Run the LIF model\n(state_monitor,spike_monitor) = LIF.simulate_LIF_neuron(input_current=noisy_step_current, \\\n simulation_time = 500*b2.ms)\n\n# plot I and vm\nplot_tools.plot_voltage_and_current_traces(state_monitor, noisy_step_current, title=\"min input\", \\\n firing_threshold=LIF.FIRING_THRESHOLD)\nprint(\"nr of spikes: {}\".format(spike_monitor.count[0]))", "nr of spikes: 3\n" ] ], [ [ "2 - How does the neuron behave? Discuss your result. Your answer should be max 3 lines long.", "_____no_output_____" ], [ "The current behaves randomly increasing or decreasing at each time step (like in a Markov process). The membrane voltage reacts to this input current by following the increasing or decreasing pattern. When the current is large enough, the membrane potential reaches the firing threshold and spikes are generated. 
", "_____no_output_____" ], [ "### 2.5.2 Subthreshold vs. superthreshold regime\n#### [7 + 5 = 12 points]\n1 - A time-dependent input current $I(t)$ is called subthreshold if it does not lead to spiking, i.e. if it leads to a membrane potential that stays - in the absence of noise - below the firing threshold. When noise is added, however, even subthreshold stimuli can induce spikes. Input stimuli that lead to spiking even in a noise-free neuron are called superthreshold. Sub- and superthreshold inputs, in the presence and absence of noise give rise to different spiking behaviour. These 4 different regimes (sub, super, noiseless, noisy) are what we will explore in this exercise.\n\nCreate a function that takes the amplitudes of a step current and the noise as arguments. It should simulate the LIF-model with this input, calculate the interspike intervals (ISI) and plot a histogram of the ISI (the interspike interval is the time interval between two consecutive spikes).\n\nIn order to do so edit the function test_effect_of_noise provided below. A few more details:\n* Use the function spike_tools.get_spike_train_stats (http://neuronaldynamics-exercises.readthedocs.io/en/latest/_modules/neurodynex/tools/spike_tools.html#get_spike_train_stats) to get the ISI. Have a look at its source code to understand how to use it and what it returns. You may need to use other parts of the documentation as well.\n* You will need to simulate the neuron model for long enough to get some statistics.\n* Optional and recommended: What would you expect the resulting histograms to look like?\n\n2 - Run your function and create the ISI histograms for the following four regimes:\n* No noise, subthreshold: $I_0 = 1.9nA$, $\\sigma = 0 nA* \\sqrt{\\text{your time unit}}$\n* Noise, subthreshold regime: $I_0 = 1.9nA$, $\\sigma = 1 nA* \\sqrt{\\text{your time unit}}$\n* No noise, superthreshold regime: $I_0 = 2.5nA$, $\\sigma = 0 nA* \\sqrt{\\text{your time unit}}$\n* Noise, superthreshold regime: $I_0 = 2.5nA$, $\\sigma = 1 nA* \\sqrt{\\text{your time unit}}$", "_____no_output_____" ] ], [ [ "from neurodynex.tools import spike_tools, plot_tools\n\n# time unit. e.g.\ntime_unit = 1.*b2.ms\ntime_step = time_unit\n\ndef test_effect_of_noise(amplitude, sigma, bins = np.linspace(0,1,50)):\n \n # Create a noisy step current \n noisy_step_current = get_noisy_step_current(t_start=50, t_end=5000, unit_time = time_step,\n amplitude= amplitude, sigma = sigma)\n \n # Run the LIF model\n (state_monitor,spike_monitor) = LIF.simulate_LIF_neuron(input_current=noisy_step_current, \\\n simulation_time = 5000 * b2.ms)\n\n plt.figure()\n plot_tools.plot_voltage_and_current_traces(state_monitor, noisy_step_current, title=\"\", \\\n firing_threshold=LIF.FIRING_THRESHOLD)\n plt.show()\n \n print(\"nr of spikes: {}\".format(spike_monitor.count[0]))\n \n # Use the function spike_tools.get_spike_train_stats\n spike_stats = spike_tools.get_spike_train_stats(spike_monitor) \n\n # Make the ISI histogram\n if len(spike_stats._all_ISI) != 0:\n plt.hist(np.asarray(spike_stats._all_ISI), bins)\n # choose an appropriate window size for the x-axis (ISI-axis)!\n plt.xlabel(\"ISI [s]\")\n plt.ylabel(\"Number of spikes\")\n plt.show()\n \n return spike_stats\n\n\n# 1. No noise, subthreshold\nstats1 = test_effect_of_noise(amplitude = 1.9 *b2.nA, sigma = 0*b2.nA*time_unit**(0.5))\n\n# 2. Noise, subthreshold regime\nstats2 = test_effect_of_noise(amplitude = 1.9*b2.nA, sigma = 1*b2.nA*time_unit**(0.5), bins = np.linspace(0, 0.1, 100))\n\n# 3. 
No noise, superthreshold regime\nstats3 = test_effect_of_noise(amplitude = 2.5*b2.nA, sigma = 0*b2.nA*time_unit**(0.5),bins = np.linspace(0, 0.1, 100))\n\n# 4. Noise, superthreshold regime\nstats4 = test_effect_of_noise(amplitude = 2.5*b2.nA, sigma = 1*b2.nA*time_unit**(0.5), bins = np.linspace(0, 0.1, 100))", "_____no_output_____" ] ], [ [ "2 - Discuss yout results (ISI histograms) for the four regimes regimes. For help and inspiration, as well as for verification of your results, have a look at the book chapter 8.3 (http://neuronaldynamics.epfl.ch/online/Ch8.S3.html). Your answer should be max 5 lines long.\n#### [5 points]", "_____no_output_____" ], [ "\\begin{itemize}\n\\item The first regime (no noise, subthreshold) does not generate any spikes. The histogram does not exist.\n\\item The second regime (noise, subthreshold) generates spikes, but less than the superthreshold regimes, as the voltage is generally under the firing threshold. The interspike interval is concentrated around 0.02 [s].\n\\item The third regime (no noise, superthreshold) generates regularly separated spikes. The histogram only has one column as the time between each spike is always the same (0.012 [s]).\n\\item The fourth regime (noise, superthreshold) generates as many spikes as the noiseless superthreshold regime. The interspike interval is concentrated around 0.012 [s] which is faster than the noisy subthreshold regime as the current is generally higher.\n\\end{itemize}", "_____no_output_____" ], [ "3 - For the ISI histograms you needed to simulate the neuron for a long time to gather enought statistics for the ISI. If you wanted to parallelize this procedure in order to reduce the computation time (e.g. you have multiple CPU cores on your machine), what would be a simple method to do that? Your answer should be max 3 lines long.\n\nHint: Temporal vs. ensemble average...\n#### [2 points]", "_____no_output_____" ], [ "You can simulate the neuron separately on 4 cores to obtain 4 times more ISI than if running on only 1 core. You can then aggregate the data of the 4 cores to compute the ISI histograms.", "_____no_output_____" ], [ "### 2.5.3 Noisy sinusoidal input current\nImplement the noisy sinusoidal input current $I(t) + I_{noise}$. As before, edit the function provided below; you only have to add the noisy part ot the current.\n\nThen create a noisy sinusoidal current with amplitude = $2.5nA$, frequency = $100Hz$, $\\sigma = 1 nA* \\sqrt{\\text{your time unit}}$ and direct_current = $1.5nA$, run the LIF model and plot the input current and the membrane potential, as you did in the previous exercises. What do you observe when compared to the noiseless case ($\\sigma = 0 nA*\\sqrt{\\text{your time unit}}$)?\n#### [5 points]", "_____no_output_____" ] ], [ [ "import math\n\ndef get_noisy_sinusoidal_current(t_start, t_end, unit_time,\n amplitude, frequency, direct_current, sigma, phase_offset=0.,\n append_zero=True):\n \"\"\"Creates a noisy sinusoidal current. If t_start == t_end, then ALL entries are 0.\n\n Args:\n t_start (int): start of the sine wave\n t_end (int): end of the sine wave\n unit_time (Quantity, Time): unit of t_start and t_end. e.g. 0.1*brian2.ms\n amplitude (Quantity, Current): maximum amplitude of the sinus e.g. 3.5*brian2.uamp\n frequency (Quantity, Hz): Frequency of the sine. e.g. 0.5*brian2.kHz\n direct_current(Quantity, Current): DC-component (=offset) of the current\n sigma (float): amplitude (std) of the noise. e.g. 0.1*b2.uamp\n phase_offset (float, Optional): phase at t_start. 
Default = 0.\n append_zero (bool, optional): if true, 0Amp is appended at t_end+1. Without that\n trailing 0, Brian reads out the last value in the array for all indices > t_end.\n\n\n Returns:\n TimedArray: Brian2.TimedArray\n \"\"\"\n assert isinstance(t_start, int), \"t_start_ms must be of type int\"\n assert isinstance(t_end, int), \"t_end must be of type int\"\n assert b2.units.fundamentalunits.have_same_dimensions(amplitude, b2.amp), \\\n \"amplitude must have the dimension of current. e.g. brian2.uamp\"\n assert b2.units.fundamentalunits.have_same_dimensions(direct_current, b2.amp), \\\n \"direct_current must have the dimension of current. e.g. brian2.uamp\"\n assert b2.units.fundamentalunits.have_same_dimensions(frequency, b2.Hz), \\\n \"frequency must have the dimension of 1/Time. e.g. brian2.Hz\"\n \n tmp_size = 1 + t_end # +1 for t=0\n if append_zero:\n tmp_size += 1\n tmp = np.zeros((tmp_size, 1)) * b2.amp\n if t_end > t_start: # if deltaT is zero, we return a zero current\n phi = range(0, (t_end - t_start) + 1)\n phi = phi * unit_time * frequency\n phi = phi * 2. * math.pi + phase_offset\n c = np.sin(phi)\n c = (direct_current + c * amplitude) # add direct current and scale by amplitude\n tmp[t_start: t_end + 1, 0] = c # add sinusoidal part of current\n \n for i in range(t_start, t_end) :\n # Add noisy part of current here\n # Pay attention to correct scaling with respect to the unit_time (time_step)\n tmp[i] += sigma*(time_step**(-0.5))*np.random.randn()\n \n curr = b2.TimedArray(tmp, dt= unit_time)\n return curr\n \n# ------------------\namplitude = 2.5 *b2.nA\nfrequency = 100 *b2.Hz \ntime_unit = 1.*b2.ms\ntime_step = 0.1*b2.ms # This is needed for higher temporal resolution \nsigma = 1*b2.nA*time_unit**(0.5)\ndirect_current = 1.5 *b2.nA\n\n# Create a noiseless sinusoidal current\nnoisy_sinusoidal_current = get_noisy_sinusoidal_current(200, 800, unit_time = time_step,\n amplitude= amplitude, frequency=frequency,\n direct_current=direct_current, sigma = 0 *b2.nA*time_unit**(0.5))\n\n# Run the LIF model\n(state_monitor,spike_monitor) = LIF.simulate_LIF_neuron(input_current=noisy_sinusoidal_current, \\\n simulation_time = 100 * b2.ms)\n\n# plot I and vm\nplot_tools.plot_voltage_and_current_traces(state_monitor, noisy_sinusoidal_current, title=\"\", \\\n firing_threshold=LIF.FIRING_THRESHOLD)\nprint(\"nr of spikes: {}\".format(spike_monitor.count[0]))", "nr of spikes: 0\n" ], [ "# Create a noisy sinusoidal current\nnoisy_sinusoidal_current = get_noisy_sinusoidal_current(200, 800, unit_time = time_step,\n amplitude= amplitude, frequency=frequency,\n direct_current=direct_current, sigma = sigma)\n\n# Run the LIF model\n(state_monitor,spike_monitor) = LIF.simulate_LIF_neuron(input_current=noisy_sinusoidal_current, \\\n simulation_time = 100 * b2.ms)\n\n# plot I and vm\nplot_tools.plot_voltage_and_current_traces(state_monitor, noisy_sinusoidal_current, title=\"\", \\\n firing_threshold=LIF.FIRING_THRESHOLD)\nprint(\"nr of spikes: {}\".format(spike_monitor.count[0]))", "nr of spikes: 2\n" ] ], [ [ "In the noiseles case, the voltage reaches the firing threshold but does not generate any spike. In the noisy case we observe some spikes when the firing threshold is exceeded around the maxima of the sinusoid.", "_____no_output_____" ], [ "### 2.5.4 Stochastic resonance (Bonus, not graded)\nContrary to what one may expect, some amount of noise under certain circumstances can improve the signal transmission properties of neurons. 
In the subthreshold regime, a neuron cannot transmit any information about the temporal structure of its input since it does not spike. With some noise, there is some probability to spike, with the probability depending on the time dependent input (inhomogeneous Poisson process). However to much noise covers the signal completely and thus, there is usually an optimal value for the amplitude of the noise. This phenomenon is called \"stochastic resonance\" and we will briefly touch upon it in this exercise. To get an idea of the effect we suggest reading section 9.4.2 in the book: http://neuronaldynamics.epfl.ch/online/Ch9.S4.html.\n\n1 - Simulate several (e.g. n_inits = 5) trials of a LIF neuron with noisy sinusoidal current. For each trial calculate the power spectrum of the resulting spike train (using the function spike_tools.get_averaged_single_neuron_power_spectrum). Finally calculate the average power spectrum and plot it. With appropriate noise amplitudes, you should see a pronounced peak at the driving frequency, while without noise we don't see anything in the power spectrum since no spike was elicited in the subthreshold regime we are in. \n\nIn order to do that use the provided parameters and edit the code provided below. Complete the function _run_sim() which creates the input current, runs a simulation and computes the power spectrum. Call it in a loop to execute several trials. Then average over the spectra to obtain a smooth spectrum to plot.", "_____no_output_____" ] ], [ [ "amplitude = 1.*b2.nA\nfrequency = 20*b2.Hz\ntime_unit = 1.*b2.ms\ntime_step = .1*b2.ms\ndirect_current = 1. * b2.nA\nsampling_frequency = .01/time_step\nnoise_amplitude = 2.\nn_inits = 5\n\n# run simulation and calculate power spectrum\ndef _run_sim(amplitude, noise_amplitude):\n noisy_sinusoidal_current = get_noisy_sinusoidal_current(50, 100000, unit_time = time_step,\n amplitude= ???, frequency= ???,\n direct_current= ???,\n sigma = noise_amplitude*b2.nA*np.sqrt(time_unit))\n # run the LIF model\n (state_monitor,spike_monitor) = LIF.simulate_LIF_neuron(input_current=noisy_sinusoidal_current, \\\n simulation_time = 10000 * b2.ms)\n # get power spectrum\n freq, mean_ps, all_ps_dict, mean_firing_rate, mean_firing_freqs_per_neuron = \\\n spike_tools.get_averaged_single_neuron_power_spectrum(spike_monitor, sampling_frequency,\n window_t_min = 1000*b2.ms, window_t_max = 9000*b2.ms,\n nr_neurons_average=1, subtract_mean=True)\n \n return freq, all_ps_dict, mean_firing_rate\n\n\n# initialize array\nspectra = []\n# run a few simulations, calculate the power spectrum and append it to the spectra array\nfor i in ???:\n freq, spectrum, mfr = ???\n spectra.append(spectrum[0])\n\n# average spectra over trials\nspectrum = ??? # hint: us np.mean with axis=0\n\n# plotting, frequencies vs the obtained spectrum:\nplt.figure()\nplt.plot(???,???)\nplt.xlabel(\"???\")\nplt.ylabel(\"???\")", "_____no_output_____" ] ], [ [ "2 - We now apply different noise levels to investigate the optimal noise level of stochastic resonance. \n\nThe quantity to optimize is the signal-to-noise ratio (SNR). Here, the SNR is defined as the intensity of the power spectrum at the driving frequency (the peak from above), divided by the value of the background noise (power spectrum averaged around the peak).\n\nIn order to do that edit the code provided below. You can re-use the function _run_sim() to obtain the power spectrum of on trial. 
The calculation of the SNR is already implemented and doesn't need to be changed.\n\nWhen you are done with completing the code, run the simulation with the proposed parameters (This could take several minutes...). The result should be a plot showing an optimal noise amplitude, i.e. a $\\sigma$ where the SNR is maximal.", "_____no_output_____" ] ], [ [ "def get_snr(amplitude, noise_amplitude, n_inits):\n \n spectra = []\n snr = 0.\n for i in range(0,n_inits):\n # run model with noisy sinusoidal \n freq_signal, spectrum, mfr = ???\n spectra.append(spectrum[0])\n\n # Average over trials to get power spectrum\n spectrum = ???\n\n if mfr != 0.*b2.Hz:\n peak = np.amax(spectrum)\n index_of_peak = np.argmax(spectrum)\n # snr: divide peak value by average of surrounding values\n snr = peak/np.mean(np.concatenate((spectrum[index_of_peak-100:index_of_peak-1],\\\n spectrum[index_of_peak+1:index_of_peak+100])))\n else:\n snr = 0.\n \n return snr\n\nnoise_amplitudes = np.arange(0.,5.,.5)\nsnr = np.zeros(len(noise_amplitudes))\nfor j in np.arange(0,len(noise_amplitudes)):\n snr[j] = get_snr(amplitude, noise_amplitudes[j], n_inits = 8)\n\n\nplt.figure()\nplt.plot(noise_amplitudes,snr)\nplt.xlabel(\"???\")\nplt.ylabel(\"???\")\nplt.show()", "_____no_output_____" ] ], [ [ "3 - For further reading on this topic, consult the book chapter 9.4.2 (http://neuronaldynamics.epfl.ch/online/Ch9.S4.html#Ch9.F10).", "_____no_output_____" ] ] ]
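Section 2.5 of the exercise above gives the discrete-time update du = (-u + u_rest + R*I)*dt/tau + (R/tau)*sigma*sqrt(dt)*y with y ~ N(0, 1). A plain-NumPy sketch of that Euler scheme, independent of Brian2; all parameter values here are illustrative assumptions, not the exercise's defaults:

```python
import numpy as np

u_rest, theta, u_reset = -70e-3, -50e-3, -65e-3   # volts (illustrative LIF parameters)
R, tau, dt = 10e6, 8e-3, 1e-4                     # ohm, s, s
sigma = 1e-9 * np.sqrt(1e-3)                      # A * sqrt(s), matching the exercise's noise scaling
I0 = 1.9e-9                                       # subthreshold step current, A

rng = np.random.default_rng(42)
u, spikes = u_rest, []
for step in range(int(0.5 / dt)):                 # 500 ms of simulated time
    noise = sigma * np.sqrt(dt) * rng.standard_normal()
    u += (-(u - u_rest) + R * I0) * dt / tau + (R / tau) * noise
    if u >= theta:                                # threshold crossing -> spike and reset
        spikes.append(step * dt)
        u = u_reset
print('nr of spikes:', len(spikes))
```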
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ] ]
d027cb77007abd480fcf50ca872329564434b924
9,298
ipynb
Jupyter Notebook
azure/train_pipeline.ipynb
fashourr/covid-cxr
0cf126b047ec47e0e94151741f5a8267bded9f14
[ "MIT" ]
122
2020-03-27T14:56:07.000Z
2022-03-09T09:50:02.000Z
azure/train_pipeline.ipynb
fashourr/covid-cxr
0cf126b047ec47e0e94151741f5a8267bded9f14
[ "MIT" ]
9
2020-03-30T19:35:49.000Z
2021-08-07T15:15:37.000Z
azure/train_pipeline.ipynb
fashourr/covid-cxr
0cf126b047ec47e0e94151741f5a8267bded9f14
[ "MIT" ]
63
2020-04-01T17:00:17.000Z
2022-03-25T07:03:42.000Z
39.565957
169
0.595827
[ [ [ "# Azure ML Training Pipeline for COVID-CXR\nThis notebook defines an Azure machine learning pipeline for a single training run and submits the pipeline as an experiment to be run on an Azure virtual machine.", "_____no_output_____" ] ], [ [ "# Import statements\nimport azureml.core\nfrom azureml.core import Experiment\nfrom azureml.core import Workspace, Datastore\nfrom azureml.data.data_reference import DataReference\nfrom azureml.pipeline.core import PipelineData\nfrom azureml.pipeline.core import Pipeline\nfrom azureml.pipeline.steps import PythonScriptStep, EstimatorStep\nfrom azureml.train.dnn import TensorFlow\nfrom azureml.train.estimator import Estimator\nfrom azureml.core.compute import ComputeTarget, AmlCompute\nfrom azureml.core.compute_target import ComputeTargetException\nfrom azureml.core.environment import Environment\nfrom azureml.core.runconfig import RunConfiguration\nimport shutil", "_____no_output_____" ] ], [ [ "### Register the workspace and configure its Python environment.", "_____no_output_____" ] ], [ [ "# Get reference to the workspace\nws = Workspace.from_config(\"./ws_config.json\")\n\n# Set workspace's environment\nenv = Environment.from_pip_requirements(name = \"covid-cxr_env\", file_path = \"./../requirements.txt\")\nenv.register(workspace=ws)\nrunconfig = RunConfiguration(conda_dependencies=env.python.conda_dependencies)\nprint(env.python.conda_dependencies.serialize_to_string())\n\n# Move AML ignore file to root folder\naml_ignore_path = shutil.copy('./.amlignore', './../.amlignore') ", "_____no_output_____" ] ], [ [ "### Create references to persistent and intermediate data\nCreate DataReference objects that point to our raw data on the blob. Configure a PipelineData object to point to preprocessed images stored on the blob.", "_____no_output_____" ] ], [ [ "# Get the blob datastore associated with this workspace\nblob_store = Datastore(ws, name='covid_cxr_ds')\n\n# Create data references to folders on the blob\nraw_data_dr = DataReference(\n datastore=blob_store,\n data_reference_name=\"raw_data\",\n path_on_datastore=\"data/\")\nmila_data_dr = DataReference(\n datastore=blob_store,\n data_reference_name=\"mila_data\",\n path_on_datastore=\"data/covid-chestxray-dataset/\")\nfig1_data_dr = DataReference(\n datastore=blob_store,\n data_reference_name=\"fig1_data\",\n path_on_datastore=\"data/Figure1-COVID-chestxray-dataset/\")\nrsna_data_dr = DataReference(\n datastore=blob_store,\n data_reference_name=\"rsna_data\",\n path_on_datastore=\"data/rsna/\")\ntraining_logs_dr = DataReference(\n datastore=blob_store,\n data_reference_name=\"training_logs_data\",\n path_on_datastore=\"logs/training/\")\nmodels_dr = DataReference(\n datastore=blob_store,\n data_reference_name=\"models_data\",\n path_on_datastore=\"models/\")\n\n# Set up references to pipeline data (intermediate pipeline storage).\nprocessed_pd = PipelineData(\n \"processed_data\",\n datastore=blob_store,\n output_name=\"processed_data\",\n output_mode=\"mount\")", "_____no_output_____" ] ], [ [ "### Compute Target\nSpecify and configure the compute target for this workspace. 
If a compute cluster by the name we specified does not exist, create a new compute cluster.", "_____no_output_____" ] ], [ [ "CT_NAME = \"nd12s-clust-hp\" # Name of our compute cluster\nVM_SIZE = \"STANDARD_ND12S\" # Specify the Azure VM for execution of our pipeline\n#CT_NAME = \"d2-cluster\" # Name of our compute cluster\n#VM_SIZE = \"STANDARD_D2\" # Specify the Azure VM for execution of our pipeline\n\n# Set up the compute target for this experiment\ntry:\n compute_target = AmlCompute(ws, CT_NAME)\n print(\"Found existing compute target.\")\nexcept ComputeTargetException:\n print(\"Creating new compute target\")\n provisioning_config = AmlCompute.provisioning_configuration(vm_size=VM_SIZE, min_nodes=1, max_nodes=4) \n compute_target = ComputeTarget.create(ws, CT_NAME, provisioning_config) # Create the compute cluster\n \n # Wait for cluster to be provisioned\n compute_target.wait_for_completion(show_output=True, min_node_count=None, timeout_in_minutes=20) \n \nprint(\"Azure Machine Learning Compute attached\")\nprint(\"Compute targets: \", ws.compute_targets)\ncompute_target = ws.compute_targets[CT_NAME]", "_____no_output_____" ] ], [ [ "### Define pipeline and submit experiment.\nDefine the steps of an Azure machine learning pipeline. Create an Azure Experiment that will run our pipeline. Submit the experiment to the execution environment.", "_____no_output_____" ] ], [ [ "# Define preprocessing step the ML pipeline\nstep1 = PythonScriptStep(name=\"preprocess_step\",\n script_name=\"azure/preprocess_step/preprocess_step.py\",\n arguments=[\"--miladatadir\", mila_data_dr, \"--fig1datadir\", fig1_data_dr, \n \"--rsnadatadir\", rsna_data_dr, \"--preprocesseddir\", processed_pd],\n inputs=[mila_data_dr, fig1_data_dr, rsna_data_dr],\n outputs=[processed_pd],\n compute_target=compute_target, \n source_directory=\"./../\",\n runconfig=runconfig,\n allow_reuse=True)\n\n# Define training step in the ML pipeline\nest = TensorFlow(source_directory='./../',\n script_params=None,\n compute_target=compute_target,\n entry_script='azure/train_step/train_step.py',\n pip_packages=['tensorboard', 'pandas', 'dill', 'numpy', 'imblearn', 'matplotlib', 'scikit-image', 'matplotlib',\n 'pydicom', 'opencv-python', 'tqdm', 'scikit-learn'],\n use_gpu=True,\n framework_version='2.0')\nstep2 = EstimatorStep(name=\"estimator_train_step\", \n estimator=est, \n estimator_entry_script_arguments=[\"--rawdatadir\", raw_data_dr, \"--preprocesseddir\", processed_pd, \n \"--traininglogsdir\", training_logs_dr, \"--modelsdir\", models_dr],\n runconfig_pipeline_params=None, \n inputs=[raw_data_dr, processed_pd, training_logs_dr, models_dr], \n outputs=[], \n compute_target=compute_target)\n\n# Construct the ML pipeline from the steps\nsteps = [step1, step2]\nsingle_train_pipeline = Pipeline(workspace=ws, steps=steps)\nsingle_train_pipeline.validate()\n\n# Define a new experiment and submit a new pipeline run to the compute target.\nexperiment = Experiment(workspace=ws, name='SingleTrainExperiment_v3')\nexperiment.submit(single_train_pipeline, regenerate_outputs=False)\nprint(\"Pipeline is submitted for execution\")\n\n# Move AML ignore file back to original folder\naml_ignore_path = shutil.move(aml_ignore_path, './.amlignore') ", "_____no_output_____" ] ] ]
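The pipeline above chains the preprocessing and training steps by using the same `PipelineData` object as an output of one step and an input of the next. A stripped-down sketch of that dependency pattern; the script names are placeholders, and `ws`, `blob_store`, `compute_target` and `runconfig` are assumed to exist as defined earlier in the notebook:

```python
from azureml.pipeline.core import Pipeline, PipelineData
from azureml.pipeline.steps import PythonScriptStep

# Intermediate storage shared between the two steps
intermediate = PipelineData("intermediate", datastore=blob_store)

step_a = PythonScriptStep(name="produce", script_name="produce.py",
                          arguments=["--out", intermediate], outputs=[intermediate],
                          compute_target=compute_target, source_directory=".", runconfig=runconfig)
step_b = PythonScriptStep(name="consume", script_name="consume.py",
                          arguments=["--in", intermediate], inputs=[intermediate],
                          compute_target=compute_target, source_directory=".", runconfig=runconfig)

# Because step_b consumes step_a's output, Azure ML schedules them in order
pipeline = Pipeline(workspace=ws, steps=[step_a, step_b])
pipeline.validate()
```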
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d027f22288e1ecb488a2bf9021eb82a99b2ec57e
46,425
ipynb
Jupyter Notebook
Jupyter/SIT742P01A-Python.ipynb
jilliant/sit742
71d996c5735ebc4e084e55d21f775c3612eca9a5
[ "MIT" ]
1
2019-03-08T05:58:59.000Z
2019-03-08T05:58:59.000Z
Jupyter/SIT742P01A-Python.ipynb
jilliant/sit742
71d996c5735ebc4e084e55d21f775c3612eca9a5
[ "MIT" ]
null
null
null
Jupyter/SIT742P01A-Python.ipynb
jilliant/sit742
71d996c5735ebc4e084e55d21f775c3612eca9a5
[ "MIT" ]
1
2022-03-09T14:44:13.000Z
2022-03-09T14:44:13.000Z
57.670807
4,125
0.550415
[ [ [ "# SIT742: Modern Data Science \n**(Week 01: Programming Python)**\n\n---\n- Materials in this module include resources collected from various open-source online repositories.\n- You are free to use, change and distribute this package.\n- If you found any issue/bug for this document, please submit an issue at [tulip-lab/sit742](https://github.com/tulip-lab/sit742/issues)\n\n\n\nPrepared by **SIT742 Teaching Team**\n\n---\n\n\n# Session 1A - IPython notebook and basic data types\n\n\nIn this session, \nyou will learn how to run *Python* code under **IPython notebook**. You have two options for the environment:\n\n1. Install the [Anaconda](https://www.anaconda.com/distribution/), and run it locally; **OR**\n1. Use one cloud data science platform such as:\n - [Google Colab](https://colab.research.google.com): SIT742 lab session will use Google Colab. \n - [IBM Cloud](https://www.ibm.com/cloud)\n - [DataBricks](https://community.cloud.databricks.com)\n\n\n\nIn IPython notebook, you will be able to execute and modify your *Python* code more efficiently. \n\n- **If you are using Google Colab for SIT742 lab session practicals, you can ignore this Part 1 of this Session 1A, and start with Part 2.**\n\n\n\nIn addition, you will be given an introduction on *Python*'s basic data types, \ngetting familiar with **string**, **number**, data conversion, data comparison and \ndata input/output. \n\nHopefully, by using **Python** and the powerful **IPython Notebook** environment, \nyou will find writing programs both fun and easy. \n", "_____no_output_____" ], [ "## Content\n\n### Part 1 Create your own IPython notebook\n\n1.1 [Start a notebook server](#cell_start)\n\n1.2 [A tour of IPython notebook](#cell_tour)\n\n1.3 [IPython notebook infterface](#cell_interface)\n\n1.4 [Open and close notebooks](#cell_close)\n\n### Part 2 Basic data types\n\n2.1 [String](#cell_string)\n\n2.2 [Number](#cell_number)\n\n2.3 [Data conversion and comparison](#cell_conversion)\n\n2.4 [Input and output](#cell_input)\n", "_____no_output_____" ], [ "# Part 1. Create your own IPython notebook\n\n- **If you are using Google Colab for SIT742 lab session practicals, you can ignore this Part 1, and start with Part 2.**\n\n\nThis notebook will show you how to start an IPython notebook session. It guides you through the process of creating your own notebook. It provides you details on the notebook interface and show you how to nevigate with a notebook and manipulate its components. ", "_____no_output_____" ], [ "<a id = \"cell_start\"></a>", "_____no_output_____" ], [ " \n## 1. 1 Start a notebook server\n\nAs described in Part 1, you start the IPython notebnook server by keying in the command in a terminal window/command line window.\n\nHowever, before you do this, make sure you have created a folder **p01** under **H:/sit742**, download the file **SIT742P01A-Python.ipynb** notebook, and saved it under **H:/sit742/p01**.\n\nIf you are using [Google Colab](https://colab.research.google.com), you can upload this notebook to Google Colab and run it from there. 
If any difficulty, please ask your tutor, or check the CloudDeakin discussions.\n\n\nAfter you complete this, you can now switch working directory to **H:/sit742**, and start the IPython notebook server by the following commands:", "_____no_output_____" ] ], [ [ "ipython notebook %don't run this in notebook, run it on command line to activate the server", "_____no_output_____" ] ], [ [ "You can see the message in the terminal windows as follows:\n\n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/start-workspace.jpg\">\n\nThis will open a new browser window(or a new tab in your browser window). In the browser, there is an **dashboard** page which shows you all the folders and files under **sit742** folder\n\n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/start-index.jpg\">\n ", "_____no_output_____" ], [ "<a id = \"cell_tour\"></a>", "_____no_output_____" ], [ " \n ## 1.2 A tour of iPython notebook \n \n ### Create a new ipython notebook\n \n \n To create a new notebook, go to the menu bar and select **File -> New Notebook -> Python 3**\n\nBy default, the new notebook is named **Untitled**. To give your notebook a meaningful name, click on the notebook name and rename it. We would like to call our new notebook **hello.ipynb**. Therefore, key in the name **hello**. \n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/emptyNotebook.jpg\">\n\n\n\n### Run script in code cells\n\nAfter a new notebook is created, there is an empty box in the notebook, called a **cell**. If you double click on the cell, you enter the **edit** mode of the notebook. Now we can enter the following code in the cell", "_____no_output_____" ] ], [ [ "text = \"Hello World\"\nprint(text)", "_____no_output_____" ] ], [ [ "After this, press **CTRL + ENTER**, and execute the cell. The result will be shown after the cell. \n\n\n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/hello-world.jpg\">\n\n\n\n\nAfter a cell is executed , the notebook is switched to the **Commmand** mode. In this mode, you can manipulte the notebook and its commponent. Alternatively, you can use **ESC** key to switch from **Edit** mode to **Command** mode without executing code. \n\nTo modify the code you entered in the cell, **double click** the cell again and modify its content. For example, try to change the first line of previouse cell into the following code:", "_____no_output_____" ] ], [ [ "text = \"Good morning World!\"", "_____no_output_____" ] ], [ [ "Afterwards, press **CTRL + ENTER**, and the new output is displayed. \n\nAs you can see, you are switching bewteen two modes, **Command** and **Edit**, when editing a notebook. We will in later section look into these two operation modes of closely. Now practise switching between the two modes until you are comfortable with them. ", "_____no_output_____" ], [ "### Add new cells\n\nTo add a new cell to a notebook, you have to ensure the notebook is in **Command** mode. If not, refer to previous section to switch to **Command** mode. \n\n\nTo add cell below the currrent cell, go to menubar and click **Insert-> Insert Cell Below**. Alternatively, you can use shortcut i.e. pressing **b** (or **a** to create a cell above).\n\n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/new-cell.jpg\">\n\n### Add markdown cells\n\nBy default, a code cell is created when adding a new cell. 
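[ [ "Note (an editorial addition): on recent Anaconda installations the notebook server ships as part of **Jupyter**, so if the command above is not found, the equivalent command is shown below. The **jupyter** command name is the only assumption here; everything else works exactly as described above.", "_____no_output_____" ] ], [ [ "jupyter notebook # run this on the command line, not inside the notebook", "_____no_output_____" ] ], 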
[ [ "You can see the message in the terminal window as follows:\n\n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/start-workspace.jpg\">\n\nThis will open a new browser window (or a new tab in your browser window). In the browser, there is a **dashboard** page which shows you all the folders and files under the **sit742** folder.\n\n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/start-index.jpg\">\n ", "_____no_output_____" ], [ "<a id = \"cell_tour\"></a>", "_____no_output_____" ], [ " \n ## 1.2 A tour of IPython notebook \n \n ### Create a new IPython notebook\n \n \n To create a new notebook, go to the menu bar and select **File -> New Notebook -> Python 3**.\n\nBy default, the new notebook is named **Untitled**. To give your notebook a meaningful name, click on the notebook name and rename it. We would like to call our new notebook **hello.ipynb**. Therefore, key in the name **hello**. \n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/emptyNotebook.jpg\">\n\n\n\n### Run script in code cells\n\nAfter a new notebook is created, there is an empty box in the notebook, called a **cell**. If you double click on the cell, you enter the **Edit** mode of the notebook. Now we can enter the following code in the cell", "_____no_output_____" ] ], [ [ "text = \"Hello World\"\nprint(text)", "_____no_output_____" ] ], [ [ "After this, press **CTRL + ENTER** to execute the cell. The result will be shown after the cell. \n\n\n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/hello-world.jpg\">\n\n\n\n\nAfter a cell is executed, the notebook is switched to the **Command** mode. In this mode, you can manipulate the notebook and its components. Alternatively, you can use the **ESC** key to switch from **Edit** mode to **Command** mode without executing code. \n\nTo modify the code you entered in the cell, **double click** the cell again and modify its content. For example, try to change the first line of the previous cell into the following code:", "_____no_output_____" ] ], [ [ "text = \"Good morning World!\"", "_____no_output_____" ] ], [ [ "Afterwards, press **CTRL + ENTER**, and the new output is displayed. \n\nAs you can see, you are switching between two modes, **Command** and **Edit**, when editing a notebook. We will look into these two operation modes more closely in a later section. Now practise switching between the two modes until you are comfortable with them. ", "_____no_output_____" ], [ "### Add new cells\n\nTo add a new cell to a notebook, you have to ensure the notebook is in **Command** mode. If not, refer to the previous section to switch to **Command** mode. \n\n\nTo add a cell below the current cell, go to the menu bar and click **Insert -> Insert Cell Below**. Alternatively, you can use a shortcut, i.e. pressing **b** (or **a** to create a cell above).\n\n\n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/new-cell.jpg\">\n\n### Add markdown cells\n\nBy default, a code cell is created when adding a new cell. However, IPython notebook also uses **Markdown** cells for entering normal text. We use markdown cells to display text in a specific format and to provide structure for a notebook. \n\nTry to copy the text in the cell below and paste it into your new notebook. Then, from the toolbar (**Cell -> Cell Type**), change the cell type from **Code** to **Markdown**. \n\nPlease note that in the following cell there is a space between the leading **-, #, 0** and the text that follows. ", "_____no_output_____" ] ], [ [ "## Heading 2\nNormal text here!\n\n### Heading 3\nordered list here\n\n0. Fruits\n 0. Banana\n 0. Grapes\n0. Veggies\n 0. Tomato\n 0. Broccoli\n \nUnordered list here\n- Fruits\n - Banana\n - Grapes\n- Veggies\n - Tomato\n - Broccoli", "_____no_output_____" ] ], [ [ "Now execute the cell by pressing **CTRL + ENTER**. Your notebook should look like this:\n \n<img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/new-markdown.jpg\">\n", "_____no_output_____" ], [ "Here is what the formatted Markdown cell looks like:", "_____no_output_____" ], [ "### Exercise:\nClick this cell, and practise writing markdown language here....", "_____no_output_____" ], [ "<a id = \"cell_interface\"></a>", "_____no_output_____" ], [ " \n### 1.3 IPython notebook interface\n\nNow that you have created your first notebook, let us have a close look at the user interface of IPython notebook.\n\n\n\n### Notebook components\nWhen you create a new notebook document, you will be presented with the notebook name, a menu bar, a toolbar and an empty code cell.\n\nWe can see the following components in a notebook:\n\n- **Title bar** is at the top of the page and contains the name of the notebook. Clicking on the notebook name brings up a dialog which allows you to rename it. Please rename your notebook from “Untitled0” to “hello”. This changes the file name from **Untitled0.ipynb** to **hello.ipynb**.\n\n- **Menu bar** presents different options that can be used to manipulate the way the notebook functions.\n\n- **Toolbar** gives a quick way of performing the most-used operations within the notebook.\n\n- An empty computational cell is shown in a new notebook where you can key in your code.\n\nThe notebook has two modes of operation:\n\n- **Edit**: In this mode, a single cell comes into focus and you can enter text or execute code. You activate the **Edit mode** by **clicking on a cell** or **selecting a cell and then pressing the Enter key**.\n\n- **Command**: In this mode, you can perform tasks that are related to the whole notebook structure. For example, you can move, copy, cut and paste cells. A series of keyboard shortcuts are also available to enable you to perform these tasks more efficiently. The easiest way of activating the Command mode is by pressing the **Esc** key to exit editing mode. \n\n\n\n\n\n### Get help and interrupting\n\n\nTo get help on the use of different commands and shortcuts, you can go to the **Help** menu, which provides links to relevant documentation.\n\nIt is also easy to get help on any object (including functions and methods). For example, to access help on the sum() function, enter the following line in a cell:", "_____no_output_____" ] ], [ [ "sum?", "_____no_output_____" ] ], [ [ "The other important thing to know is how to interrupt a computation. This can be done through the menu **Kernel -> Interrupt** or **Kernel -> Restart**, depending on what works in the situation. We will have a chance to try this in a later session. ", "_____no_output_____" ], [ "\n### Notebook cell types\n\n\nThere are basically three types of cells in an IPython notebook: Code cells, Markdown cells, and Raw cells.\n\n\n**Code cells**: Code cells are used to enter code, which will be executed by the Python interpreter. Although we will not use other languages in this unit, it is good to know that Jupyter Notebooks also support JavaScript, HTML, and Bash commands.\n\n**Markdown cells**: You created a markdown cell in the previous section. Markdown cells are the easiest way to write and format text. They also give structure to the notebook. Markdown language is used in this type of cell. Follow this link https://daringfireball.net/projects/markdown/basics for the basics of the syntax. \n\nThis is a Markdown cells example notebook sourced from: https://ipython.org/ipython-doc/3/notebook/notebook.html\nThis markdown cheat sheet can also be a good reference to the main markdown you might need to use in our pracs: http://nestacms.com/docs/creating-content/markdown-cheat-sheet\n\n\n**Raw cells**: Raw cells, unlike all other Jupyter Notebook cells, have no input-output distinction. This means that raw cells cannot be rendered into anything other than what they already are. They are mainly used to create examples.\n\n\nAs you have seen, you can use the toolbar to choose between different cell types. In addition, the shortcuts **Y** and **M** can be used to quickly change a cell to a Code cell or a Markdown cell under Command mode. \n\n\n### Operation modes of IPython notebook\n\n**Edit mode**\n\n\n\nThe Edit mode is used to enter text in cells and to execute code. As you have seen, after typing some code in the notebook and pressing **CTRL + Enter**, the notebook executes the cell and displays its output. The other two shortcuts used to run code in a cell are **Shift + Enter** and **Alt + Enter**. \n\nThese three ways to run the code in a cell are summarized as follows:\n\n\n- Pressing Shift + Enter: This runs the cell and selects the next cell (a new cell is created if at the end of the notebook). This is the most usual way to execute a cell.\n\n- Pressing Ctrl + Enter: This runs the cell and keeps the same cell selected. \n\n- Pressing Alt + Enter: This runs the cell and inserts a new cell below it. \n\n\n**Command mode**\n\nIn Command mode, you can edit the notebook as a whole, but not type into individual cells.\n\nYou can use keyboard shortcuts in this mode to perform notebook and cell actions efficiently. For example, if you are in Command mode and press **c**, you will copy the current cell. \n\n\n\n There are a large number of shortcuts available in the Command mode. However, you do not have to remember all of them, since most actions in the Command mode are available in the menu. \n \n Here is a list of the most useful shortcuts. They are arranged in the order we recommend you learn them so that you can edit cells efficiently.\n\n\n1. Basic navigation: \n\n - Enter: switch to Edit mode\n \n - Esc: switch to Command mode\n \n - Shift+Enter: execute a cell\n \n - Up, down: Move to the cell above or below\n\n2. Cell types: \n - y: switch to code cell\n - m: switch to markdown cell \n\n3. Cell creation: \n - a: insert new cell above\n - b: insert new cell below\n\n4. Cell deleting:\n - press D twice.\n\nNote that one of the most common (and frustrating) mistakes when using the\nnotebook is to type something in the wrong mode. Remember to use **Esc**\nto switch to the Command mode and **Enter** to switch to the Edit mode.\nAlso, remember that **clicking** on a cell automatically places it in the Edit\nmode, so it will be necessary to press **Esc** to go to the Command mode.\n\n### Exercise\nPlease go ahead and try these shortcuts. For example, try to insert a new cell, and modify and delete an existing cell. You can also switch cells between code type and markdown type, and practise different kinds of formatting in a markdown cell. \n \nFor a complete list of shortcuts in **Command** mode, go to the menu bar **Help -> Keyboard Shortcuts**. Feel free to explore the other shortcuts. \n \n ", "_____no_output_____" ], [ "<a id = \"cell_close\"></a>", "_____no_output_____" ], [ " \n ## 1.4 Open and close notebooks\n \n You can open multiple notebooks in a browser window. Simply go to the menu bar and choose **File -> Open...**, and select one **.ipynb** file. The second notebook will be opened in a separate tab. \n \nNow make sure you still have your **hello.ipynb** open. Also please download **ControlAdvData.ipynb** from CloudDeakin, and save it under **H:/sit742/prac01**. Now go to the menu bar, click on **File -> Open...**, locate the file **ControlAdvData.ipynb**, and open this file. \n \n When you finish your work, you will need to close your notebooks and shut down the IPython notebook server. Instead of simply closing all the tabs in the browser, you need to shut down each notebook first. To do this, switch to the **Home** tab (the **Dashboard** page) and the **Running** section (see below). Click on the **Shutdown** button to close each notebook. In case the **Dashboard** page is not open, click on the **Jupyter** icon to reopen it. \n \n\n <img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/close-index.jpg\">\n \n After each notebook is shut down, it is time to shut down the IPython notebook server. To do this, go to the terminal window and press **CTRL + C**, and then enter **Y**. After the notebook server is shut down, the terminal window is ready for you to enter any new command. \n \n\n <img src=\"https://raw.githubusercontent.com/tuliplab/mds/master/Jupyter/image/close-terminal.jpg\">\n \n ", "_____no_output_____" ], [ "\n# Part 2 Basic Data Types\n", "_____no_output_____" ], [ "\n\nIn this part, you will get a better understanding of Python's basic data types. We will \nlook at the **string** and **number** data types in this section. Also covered are:\n\n- Data conversion\n- Data comparison\n- Receiving input from users and displaying results effectively \n\nYou will be guided through completing a simple program which receives input from a user,\n processes the information, and displays results in a specific format. ", "_____no_output_____" ], [ "<a id = \"cell_string\"></a>", "_____no_output_____" ], [ "## 2.1 String\n\nA string is a *sequence of characters*. We use strings in almost every Python\nprogram. As we have seen in the **\"Hello, World!\"** example, strings can be specified\nusing single quotes **'**. The **print()** function can be used to display a string.", "_____no_output_____" ] ], [ [ "print('Hello, World!')", "_____no_output_____" ] ], [ [ "We can also use a variable to store the string value, and use the variable in the\n**print()** function.", "_____no_output_____" ] ], [ [ "# Assign a string to a variable \ntext = 'Hello, World!'\nprint(text)", "_____no_output_____" ] ], [ [ "A *variable* is basically a name that represents (or refers to) some value. We use **=**\nto assign a value to a variable before we use it. Variable names are given by a programmer\nin a way that makes the program easy to understand. Variable names are *case sensitive*.\nThey can consist of letters, digits and underscores. However, they cannot begin with a digit.\nFor example, **plan9** and **plan_9** are valid names, whereas **9plan** is not.", "_____no_output_____" ] ], [ [ "text = 'Hello, World!'", "_____no_output_____" ], [ "# with print() function, content is displayed without quotation marks\nprint(text)", "_____no_output_____" ] ], [ [ "With variables, we can also display their values without the **print()** function. Note that\nyou cannot display a variable without the **print()** function in a Python script (i.e. in a **.py** file). This method only works under interactive mode (i.e. in the notebook). ", "_____no_output_____" ] ], [ [ "# without print() function, quotation marks are displayed together with content\ntext ", "_____no_output_____" ] ], [ [ "Back to the representation of strings: there will be issues if you need to include a quotation\nmark in the text. The following cell raises a syntax error:", "_____no_output_____" ] ], [ [ "text = 'What's your name?'", "_____no_output_____" ] ], [ [ "Strings in double quotes **\"** work exactly the same way as strings in single quotes.\nBy mixing the two types, it is easy to include a quotation mark itself in the text.", "_____no_output_____" ] ], [ [ "text = \"What's your name?\"\nprint(text)", "_____no_output_____" ] ], [ [ "Alternatively, you can use:", "_____no_output_____" ] ], [ [ "text = '\"What is the problem?\", he asked.'\nprint(text)", "_____no_output_____" ] ], [ [ "You can specify multi-line strings using triple quotes (**\"\"\"** or **'''**). In this way, single\nquotes and double quotes can be used freely in the text.\nHere is one example:", "_____no_output_____" ] ], [ [ "multiline = '''This is a test for multiline. This is the first line. \nThis is the second line. \nI asked, \"What's your name?\"'''\nprint(multiline)", "_____no_output_____" ] ], [ [ "Notice the difference when the variable is displayed without the **print()** function in this case.", "_____no_output_____" ] ], [ [ "multiline = '''This is a test for multiline. This is the first line. \nThis is the second line. \nI asked, \"What's your name?\"'''\nmultiline", "_____no_output_____" ] ], [ [ "Another way of including special characters, such as single quotes, is with the help of\nescape sequences **\\\\**. For example, you can specify the single quote using **\\\\'** as follows.", "_____no_output_____" ] ], [ [ "string = 'What\\'s your name?'\nprint(string)", "_____no_output_____" ] ], [ [ "There are many other escape sequences (see Section 2.4.1 in the [Python 3.0 official document](https://docs.python.org/3.1/reference/lexical_analysis.html)). But I am going to mention the two most useful examples here. \n\nFirst, use escape sequences to indicate the backslash itself, e.g. **\\\\\\\\**", "_____no_output_____" ] ], [ [ "path = 'c:\\\\windows\\\\temp'\nprint(path)", "_____no_output_____" ] ], 
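[ [ "A related trick (an added aside, not in the original text): Python *raw strings*, written with an **r** prefix, turn off escape processing, so each backslash only needs to be written once. This is often handy for Windows paths.", "_____no_output_____" ] ], [ [ "# a raw string: backslashes are taken literally, so no doubling is needed\npath = r'c:\windows\temp'\nprint(path)", "_____no_output_____" ] ], 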
[ [ "Second, use escape sequences to specify a two-line string. Apart from using a triple-quoted\nstring as shown previously, you can use **\\n** to indicate the start of a new line.", "_____no_output_____" ] ], [ [ "multiline = 'This is a test for multiline. This is the first line.\\nThis is the second line.'\nprint(multiline)", "_____no_output_____" ] ], [ [ "To manipulate strings, the following two operators are most useful: \n* **+** is used to concatenate\ntwo strings or string variables; \n* ***** is used for concatenating several copies of the same\nstring.", "_____no_output_____" ] ], [ [ "print('Hello, ' + 'World' * 3)", "_____no_output_____" ] ], [ [ "Below is another example of string concatenation based on variables that store strings.", "_____no_output_____" ] ], [ [ "name = 'World'\ngreeting = 'Hello'\nprint(greeting + ', ' + name + '!')", "_____no_output_____" ] ], [ [ "Using variables, changing part of the text is very easy. ", "_____no_output_____" ] ], [ [ "name", "_____no_output_____" ], [ "greeting", "_____no_output_____" ], [ "# Changing part of the text is easy\ngreeting = 'Good morning' \nprint(greeting + ', ' + name + '!')", "_____no_output_____" ] ], [ [ "<a id = \"cell_number\"></a>", "_____no_output_____" ], [ " ## 2.2 Number", "_____no_output_____" ], [ "There are two types of numbers that are used most frequently: integers and floats. As we\nexpect, the standard mathematical operations can be applied to these two types. Please\ntry the following expressions. Note that **\\*\\*** is the exponent operator, which indicates\nan exponentiation (power) calculation.", "_____no_output_____" ] ], [ [ "2 + 3", "_____no_output_____" ], [ "3 * 5", "_____no_output_____" ], [ "#3 to the power of 4\n3 ** 4 ", "_____no_output_____" ] ], [ [ "Among the number operations, we need to look at division closely. In Python 3.0, classic division is performed using **/**. ", "_____no_output_____" ] ], [ [ "15 / 5", "_____no_output_____" ], [ "14 / 5", "_____no_output_____" ] ], [ [ "*//* is used to perform floor division. It truncates the fraction and rounds it to the next smallest whole number toward the left on the number line.", "_____no_output_____" ] ], [ [ "14 // 5", "_____no_output_____" ], [ "# Negatives move left on the number line. The result is -3 instead of -2\n-14 // 5 ", "_____no_output_____" ] ], [ [ "The modulus operator **%** can be used to obtain the remainder. Pay attention when negative numbers are involved.", "_____no_output_____" ] ], [ [ "14 % 5", "_____no_output_____" ], [ "# Hint: -14 // 5 equals -3\n# (-3) * 5 + ? = -14\n\n-14 % 5 ", "_____no_output_____" ] ], 
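[ [ "The floor-division and modulus results always fit together (an added aside): for any integers **a** and **b** (with **b** nonzero), **a == (a // b) * b + a % b** holds, and the built-in **divmod()** function returns both parts at once.", "_____no_output_____" ] ], [ [ "# divmod() returns the floor-division result and the remainder in one call\nq, r = divmod(-14, 5)\nprint(q, r)\n\n# check the identity a == q*b + r\nprint(q * 5 + r == -14)", "_____no_output_____" ] ], 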
[ [ "*Operator precedence* is a rule that affects how an expression is evaluated. As we learned in high school, the multiplication is done before the addition, e.g. in **2 + 3 * 4**. This means the multiplication operator has higher precedence than the addition operator.\n\nFor your reference, a precedence table from the Python reference manual is used to indicate the evaluation order in Python. For a complete precedence table, check the heading \"Python Operators Precedence\" in this [Python tutorial](http://www.tutorialspoint.com/python/python_basic_operators.htm)\n\n\nHowever, when things get confusing, it is far better to use parentheses **()** to explicitly\nspecify the precedence. This makes the program more readable.\n\nHere are some examples of operator precedence:", "_____no_output_____" ] ], [ [ "2 + 3 * 4", "_____no_output_____" ], [ "(2 + 3) * 4", "_____no_output_____" ], [ "2 + 3 ** 2", "_____no_output_____" ], [ "(2 + 3) ** 2", "_____no_output_____" ], [ "-(4+3)+2", "_____no_output_____" ] ], [ [ "Similarly to strings, variables can be used to store a number so that it is easy to manipulate them.", "_____no_output_____" ] ], [ [ "x = 3\ny = 2\nx + 2", "_____no_output_____" ], [ "# note: this name shadows the built-in sum() function\nsum = x + y\nsum", "_____no_output_____" ], [ "x * y", "_____no_output_____" ] ], [ [ "One common expression is to run a math operation on a variable and then assign the result of the operation back to the variable. Therefore, there is a shortcut for such an expression. ", "_____no_output_____" ] ], [ [ "x = 2\nx = x * 3\nx", "_____no_output_____" ] ], [ [ "This is equivalent to:", "_____no_output_____" ] ], [ [ "x = 2\n# Note there is no space between '*' and '='\nx *= 3\nx", "_____no_output_____" ] ], [ [ "<a id = \"cell_conversion\"></a>", "_____no_output_____" ], [ "## 2.3 Data conversion and comparison", "_____no_output_____" ], [ "So far, we have seen three types of data: integer, float, and string. For each data type, Python defines the operations possible on it and the storage method for it. In the later pracs, we will further introduce more data types, such as tuple, list and dictionary. \n\nTo obtain the data type of a variable or a value, we can use the built-in function **type()**;\nwhereas functions such as **str()**, **int()** and **float()** are used to convert data from one type to another. Check the following examples on the usage of these functions:", "_____no_output_____" ] ], [ [ "type('Hello, world!')", "_____no_output_____" ], [ "input_Value = '45.6'\ntype(input_Value)", "_____no_output_____" ], [ "weight = float(input_Value)\nweight\ntype(weight)", "_____no_output_____" ] ], [ [ "Note the system will report an error message when the conversion function is not compatible with the data.", "_____no_output_____" ] ], [ [ "input_Value = 'David'\nweight = float(input_Value)", "_____no_output_____" ] ], 
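[ [ "When the input might not be a valid number, the failure can be caught instead of stopping the program (an added sketch using **try/except**, which is covered in more depth in a later session).", "_____no_output_____" ] ], [ [ "input_Value = 'David'\ntry:\n weight = float(input_Value) # raises ValueError for non-numeric text\nexcept ValueError:\n print('Cannot convert %s to a float' % input_Value)", "_____no_output_____" ] ], 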
[ [ "Comparison between two values can help make decisions in a program. The result of the comparison is either **True** or **False**. They are the two values of the *Boolean* type.", "_____no_output_____" ] ], [ [ "5 > 10", "_____no_output_____" ], [ "type(5 > 10)", "_____no_output_____" ], [ "# Double equal sign is also used for comparison\n10.0 == 10", "_____no_output_____" ] ], [ [ "Check the following examples on the comparison of two strings.", "_____no_output_____" ] ], [ [ "'cat' < 'dog'", "_____no_output_____" ], [ "# All uppercase letters come before lowercase letters. \n'cat' < 'Dog'", "_____no_output_____" ], [ "'apple' < 'apricot'", "_____no_output_____" ] ], [ [ "There are three logical operators, *not*, *and* and *or*, which can be applied to Boolean values.", "_____no_output_____" ] ], [ [ "# Both condition #1 and condition #2 are True?\n3 < 4 and 7 < 8", "_____no_output_____" ], [ "# Either condition 1 or condition 2 is True?\n3 < 4 or 7 > 8", "_____no_output_____" ], [ "# Both condition #1 and condition #2 are False?\nnot ((3 > 4) or (7 > 8))", "_____no_output_____" ] ], [ [ "<a id = \"cell_input\"></a>", "_____no_output_____" ], [ "## 2.4 Input and output", "_____no_output_____" ], [ "All programming languages provide features to interact with the user. Python provides the *input()* function to get input. It waits for the user to type some input and press return. We can add some information for the user by putting a message inside the function's brackets. It must be a string or a string variable. The text that was typed can be saved in a variable. Here is one example:", "_____no_output_____" ] ], [ [ "nInput = input('Enter your number here:\\n')", "_____no_output_____" ] ], [ [ "However, be aware that the input received from the user is treated as a string, even\nthough the user entered a number. The following **print()** call therefore triggers an error message.", "_____no_output_____" ] ], [ [ "print(nInput + 3)", "_____no_output_____" ] ], [ [ "The input needs to be converted to an integer before the math operation can be performed as follows:", "_____no_output_____" ] ], [ [ "print(int(nInput) + 3)", "_____no_output_____" ] ], [ [ "After the user's input is accepted, messages need to be displayed to the user accordingly. String concatenation is one way to display messages which incorporate variable values. ", "_____no_output_____" ] ], [ [ "name = 'David'\nprint('Hello, ' + name)", "_____no_output_____" ] ], [ [ "Another way of achieving this is using the **print()** function with *string formatting*. We need to use the *string formatting operator*, the percent (**%**) sign. ", "_____no_output_____" ] ], [ [ "name = 'David'\nprint('Hello, %s' % name)", "_____no_output_____" ] ], [ [ "Here is another example with two variables:", "_____no_output_____" ] ], [ [ "name = 'David'\nage = 23\nprint('%s is %d years old.' % (name, age))", "_____no_output_____" ] ], [ [ "Notice that the two variables, **name** and **age**, that specify the values are included at the end of the statement, and enclosed in a bracket. \n\nWithin the quotation marks, **%s** and **%d** are used to specify the formatting for strings and integers respectively. \nThe following table shows a selected set of symbols which can be used along with %. ", "_____no_output_____" ], [ "<table width=\"304\" border=\"1\">\n <tr>\n <th width=\"112\" scope=\"col\">Format symbol</th>\n <th width=\"176\" scope=\"col\">Conversion</th>\n </tr>\n <tr>\n <td>%s</td>\n <td>String</td>\n </tr>\n <tr>\n <td>%d</td>\n <td>Signed decimal integer</td>\n </tr>\n <tr>\n <td>%f</td>\n <td>Floating point real number</td>\n </tr>\n</table>", "_____no_output_____" ], [ "There are extra characters that are used together with the above symbols:", "_____no_output_____" ], [ "<table width=\"400\" border=\"1\">\n <tr>\n <th width=\"100\" scope=\"col\">Symbol</th>\n <th width=\"3000\" scope=\"col\">Functionality</th>\n </tr>\n <tr>\n <td>-</td>\n <td>Left justification</td>\n </tr>\n <tr>\n <td>+</td>\n <td>Display the sign</td>\n </tr>\n <tr>\n <td>m.n</td>\n <td>m is the minimum total width; n is the number of digits to display after the decimal point</td>\n </tr>\n</table>", "_____no_output_____" ], [ "Here are more examples that use the above specifiers:", "_____no_output_____" ] ], [ [ "# With %f, the format is right justification by default. \n# As a result, white spaces are added to the left of the number\n# 10.4 means minimal width 10 with 4 decimal points\nprint('Output a float number: %10.4f' % (3.5))", "_____no_output_____" ], [ "# plus sign after % means to show the positive sign\n# Zero after the plus sign means using leading zeros to fill the width of 5\nprint('Output an integer: %+05d' % (23))", "_____no_output_____" ] ], 
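[ [ "For reference (an added note): since Python 3.6 the same formatting can be written with **str.format()** or *f-strings*, which are often easier to read than the **%** operator and accept the same format specifiers.", "_____no_output_____" ] ], [ [ "name = 'David'\nage = 23\n\n# str.format() with positional placeholders\nprint('{} is {} years old.'.format(name, age))\n\n# f-string: expressions are embedded directly, with format specifiers after a colon\nprint(f'Output a float number: {3.5:10.4f}')", "_____no_output_____" ] ], 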
[ [ "### 2.5 Notes on *Python 2*\n\nYou need to pay attention if you test examples in this prac under *Python* 2. \n\n1. In *Python 3*, **/** is float division, and **//** is integer division; while in Python 2, \n\tboth **/** and **//**\n\tperform *integer division*. \n\tHowever, if you stick to using **float(3)/2** for *float division*, \n\tand **3/2** for *integer division*, \n\tyou will have no problem in both versions. \n \n2. Instead of the function **input()**, \n\t**raw_input()** is used in Python 2. \n\tBoth functions have the same functionality,\n\ti.e. they take what the user typed and pass it back as a string.\n \n3. Although both versions support the **print()** function with the same format, \n\tPython 2 also allows the print statement (e.g. **print \"Hello, World!\"**), \n\twhich is not valid in Python 3. \n\tHowever, if you stick to our examples and use the **print()** function with parentheses, \n\tyour programs should work fine in both versions. ", "_____no_output_____" ] ] ]
[ "markdown", "raw", "markdown", "raw", "markdown", "raw", "markdown", "raw", "markdown", "raw", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "raw" ], [ "markdown", "markdown", "markdown" ], [ "raw" ], [ "markdown" ], [ "raw" ], [ "markdown", "markdown" ], [ "raw" ], [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "raw" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ] ]
d0280b6471f2285f6be59dcd2e8bc7a438cd2fe5
527,755
ipynb
Jupyter Notebook
00. DynamicProgramming/05. General Equilibrium.ipynb
JMSundram/ConsumptionSavingNotebooks
338a8cecbe0043ebb4983c2fe0164599cd2a4fc0
[ "MIT" ]
20
2019-03-09T02:08:49.000Z
2022-03-28T15:56:04.000Z
00. DynamicProgramming/05. General Equilibrium.ipynb
JMSundram/ConsumptionSavingNotebooks
338a8cecbe0043ebb4983c2fe0164599cd2a4fc0
[ "MIT" ]
1
2019-06-03T18:33:44.000Z
2019-07-02T13:51:21.000Z
00. DynamicProgramming/05. General Equilibrium.ipynb
JMSundram/ConsumptionSavingNotebooks
338a8cecbe0043ebb4983c2fe0164599cd2a4fc0
[ "MIT" ]
34
2019-02-26T19:27:37.000Z
2021-12-27T09:34:04.000Z
277.619674
131,424
0.917566
[ [ [ "# General Equilibrium", "_____no_output_____" ], [ "This notebook illustrates **how to solve GE equilibrium models**. The example is a simple one-asset model without nominal rigidities.\n\nThe notebook shows how to:\n\n1. Solve for the **stationary equilibrium**.\n2. Solve for (non-linear) **transition paths** using a relaxtion algorithm.\n3. Solve for **transition paths** (linear vs. non-linear) and **impulse-responses** using the **sequence-space method** of **Auclert et. al. (2020)**.", "_____no_output_____" ] ], [ [ "LOAD = False # load stationary equilibrium\nDO_VARY_SIGMA_E = True # effect of uncertainty on stationary equilibrium\nDO_TP_RELAX = True # do transition path with relaxtion", "_____no_output_____" ] ], [ [ "# Setup", "_____no_output_____" ] ], [ [ "%load_ext autoreload\n%autoreload 2\n\nimport time\nimport numpy as np\nimport numba as nb\nfrom scipy import optimize\n\nimport matplotlib.pyplot as plt\nplt.style.use('seaborn-whitegrid')\nprop_cycle = plt.rcParams['axes.prop_cycle']\ncolors = prop_cycle.by_key()['color']\n\nfrom consav.misc import elapsed\n\nfrom GEModel import GEModelClass\nfrom GEModel import solve_backwards, simulate_forwards, simulate_forwards_transpose", "_____no_output_____" ] ], [ [ "## Choose number of threads in numba", "_____no_output_____" ] ], [ [ "import numba as nb\nnb.set_num_threads(8)", "_____no_output_____" ] ], [ [ "# Model", "_____no_output_____" ] ], [ [ "model = GEModelClass('baseline',load=LOAD)\nprint(model)", "Modelclass: GEModelClass\nName: baseline\n\nnamespaces: ['sol', 'sim', 'par']\nother_attrs: []\nsavefolder: saved\nnot_floats: ['Ne', 'Na', 'max_iter_solve', 'max_iter_simulate', 'path_T']\n\nsol:\n a = ndarray with shape = (7, 500) [dtype: float64]\n m = ndarray with shape = (7, 500) [dtype: float64]\n c = ndarray with shape = (7, 500) [dtype: float64]\n Va = ndarray with shape = (7, 500) [dtype: float64]\n i = ndarray with shape = (7, 500) [dtype: int32]\n w = ndarray with shape = (7, 500) [dtype: float64]\n path_a = ndarray with shape = (500, 7, 500) [dtype: float64]\n path_m = ndarray with shape = (500, 7, 500) [dtype: float64]\n path_c = ndarray with shape = (500, 7, 500) [dtype: float64]\n path_Va = ndarray with shape = (500, 7, 500) [dtype: float64]\n path_i = ndarray with shape = (500, 7, 500) [dtype: int32]\n path_w = ndarray with shape = (500, 7, 500) [dtype: float64]\n jac_K = ndarray with shape = (500, 500) [dtype: float64]\n jac_C = ndarray with shape = (500, 500) [dtype: float64]\n jac_curlyK_r = ndarray with shape = (500, 500) [dtype: float64]\n jac_curlyK_w = ndarray with shape = (500, 500) [dtype: float64]\n jac_C_r = ndarray with shape = (500, 500) [dtype: float64]\n jac_C_w = ndarray with shape = (500, 500) [dtype: float64]\n jac_r_K = ndarray with shape = (500, 500) [dtype: float64]\n jac_w_K = ndarray with shape = (500, 500) [dtype: float64]\n jac_r_Z = ndarray with shape = (500, 500) [dtype: float64]\n jac_w_Z = ndarray with shape = (500, 500) [dtype: float64]\n H_K = ndarray with shape = (500, 500) [dtype: float64]\n H_Z = ndarray with shape = (500, 500) [dtype: float64]\n G = ndarray with shape = (500, 500) [dtype: float64]\n memory, gb: 0.1\n\nsim:\n D = ndarray with shape = (7, 500) [dtype: float64]\n path_D = ndarray with shape = (500, 7, 500) [dtype: float64]\n path_K = ndarray with shape = (500,) [dtype: float64]\n path_C = ndarray with shape = (500,) [dtype: float64]\n path_Klag = ndarray with shape = (500,) [dtype: float64]\n memory, gb: 0.0\n\npar:\n r_ss = nan [float]\n w_ss = nan 
[float]\n K_ss = nan [float]\n Y_ss = nan [float]\n C_ss = nan [float]\n kd_ss = nan [float]\n ks_ss = nan [float]\n sigma = 1.0 [float]\n beta = 0.982 [float]\n Z = 1.0 [float]\n Z_sigma = 0.01 [float]\n Z_rho = 0.9 [float]\n alpha = 0.11 [float]\n delta = 0.025 [float]\n rho = 0.966 [float]\n sigma_e = 0.1 [float]\n Ne = 7 [int]\n a_max = 200.0 [float]\n Na = 500 [int]\n path_T = 500 [int]\n max_iter_solve = 5000 [int]\n max_iter_simulate = 5000 [int]\n solve_tol = 1e-10 [float]\n simulate_tol = 1e-10 [float]\n a_grid = ndarray with shape = (500,) [dtype: float64]\n e_grid = ndarray with shape = (7,) [dtype: float64]\n e_trans = ndarray with shape = (7, 7) [dtype: float64]\n e_ergodic = ndarray with shape = (7,) [dtype: float64]\n e_trans_cumsum = ndarray with shape = (7, 7) [dtype: float64]\n e_ergodic_cumsum = ndarray with shape = (7,) [dtype: float64]\n memory, gb: 0.0\n\n" ] ], [ [ "For easy access", "_____no_output_____" ] ], [ [ "par = model.par\nsim = model.sim\nsol = model.sol", "_____no_output_____" ] ], [ [ "**Productivity states:**", "_____no_output_____" ] ], [ [ "for e,pr_e in zip(par.e_grid,par.e_ergodic):\n print(f'Pr[e = {e:7.4f}] = {pr_e:.4f}')\n \nassert np.isclose(np.sum(par.e_grid*par.e_ergodic),1.0)", "Pr[e = 0.3599] = 0.0156\nPr[e = 0.4936] = 0.0938\nPr[e = 0.6769] = 0.2344\nPr[e = 0.9282] = 0.3125\nPr[e = 1.2729] = 0.2344\nPr[e = 1.7456] = 0.0937\nPr[e = 2.3939] = 0.0156\n" ] ], [ [ "# Find Stationary Equilibrium", "_____no_output_____" ], [ "**Step 1:** Find demand and supply of capital for a grid of interest rates.", "_____no_output_____" ] ], [ [ "if not LOAD:\n \n t0 = time.time()\n \n par = model.par\n\n # a. interest rate trial values\n Nr = 20\n r_vec = np.linspace(0.005,1.0/par.beta-1-0.002,Nr) # 1+r > beta not possible\n\n # b. allocate\n Ks = np.zeros(Nr) \n Kd = np.zeros(Nr)\n\n # c. loop\n r_min = r_vec[0]\n r_max = r_vec[Nr-1]\n for i_r in range(Nr):\n\n # i. firm side\n k = model.firm_demand(r_vec[i_r],par.Z)\n Kd[i_r] = k*1 # aggregate labor = 1.0\n \n # ii. household side\n success = model.solve_household_ss(r=r_vec[i_r])\n if success: \n success = model.simulate_household_ss()\n\n if success:\n\n # total demand\n Ks[i_r] = np.sum(model.sim.D*model.sol.a)\n\n # bounds on r\n diff = Ks[i_r]-Kd[i_r]\n if diff < 0: r_min = np.fmax(r_min,r_vec[i_r])\n if diff > 0: r_max = np.fmin(r_max,r_vec[i_r])\n\n else:\n\n Ks[i_r] = np.nan \n \n # d. 
save\n model.save()\n \n print(f'grid search done in {elapsed(t0)}')", "grid search done in 10.8 secs\n" ] ], [ [ "**Step 2:** Plot supply and demand.", "_____no_output_____" ] ], [ [ "if not LOAD:\n \n par = model.par\n\n fig = plt.figure(figsize=(6,4))\n ax = fig.add_subplot(1,1,1)\n\n ax.plot(r_vec,Ks,label='supply of capital')\n ax.plot(r_vec,Kd,label='demand for capital')\n\n ax.axvline(r_min,lw=0.5,ls='--',color='black')\n ax.axvline(r_max,lw=0.5,ls='--',color='black')\n\n ax.legend(frameon=True)\n ax.set_xlabel('interest rate, $r$')\n ax.set_ylabel('capital, $K_t$')\n \n fig.tight_layout()\n fig.savefig('figs/stationary_equilibrium.pdf')\n ", "_____no_output_____" ] ], [ [ "**Step 3:** Solve the root-finding problem.", "_____no_output_____" ] ], [ [ "def obj(r,model):\n \n model.solve_household_ss(r=r)\n model.simulate_household_ss()\n return np.sum(model.sim.D*model.sol.a)-model.firm_demand(r,model.par.Z)\n\nif not LOAD:\n \n t0 = time.time()\n \n opt = optimize.root_scalar(obj,bracket=[r_min,r_max],method='bisect',args=(model,))\n model.par.r_ss = opt.root\n assert opt.converged\n \n print(f'search done in {elapsed(t0)}')\n ", "search done in 7.1 secs\n" ] ], [ [ "**Step 4:** Check market clearing conditions.", "_____no_output_____" ] ], [ [ "model.steady_state()", "household problem solved in 0.1 secs [652 iterations]\nhousehold problem simulated in 0.1 secs [747 iterations]\n\nr: 0.0127\nw: 1.0160\nY: 1.1415\nK/Y: 2.9186\n\ncapital market clearing: -0.00000000\ngoods market clearing: -0.00000755\n" ] ], [ [ "## Timings", "_____no_output_____" ] ], [ [ "%timeit model.solve_household_ss(r=par.r_ss)", "119 ms ± 8.83 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)\n" ], [ "%timeit model.simulate_household_ss()", "66.2 ms ± 1.42 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)\n" ] ], [ [ "## Income uncertainty and the equilibrium interest rate", "_____no_output_____" ], [ "The equilibrium interest rate decreases when income uncertainty is increased.", "_____no_output_____" ] ], [ [ "if DO_VARY_SIGMA_E:\n \n par = model.par\n\n # a. settings\n sigma_e_vec = [0.20]\n\n # b. find equilibrium rates\n model_ = model.copy()\n for sigma_e in sigma_e_vec:\n\n # i. set new parameter\n model_.par.sigma_e = sigma_e\n model_.create_grids()\n \n # ii. solve\n print(f'sigma_e = {sigma_e:.4f}',end='')\n opt = optimize.root_scalar(\n obj,\n bracket=[0.00,model.par.r_ss],\n method='bisect',\n args=(model_,)\n )\n print(f' -> r_ss = {opt.root:.4f}')\n model_.par.r_ss = opt.root\n model_.steady_state() \n print('\\n')\n ", "sigma_e = 0.2000 -> r_ss = 0.0029\nhousehold problem solved in 0.1 secs [430 iterations]\nhousehold problem simulated in 0.0 secs [427 iterations]\n\nr: 0.0029\nw: 1.0546\nY: 1.1849\nK/Y: 3.9462\n\ncapital market clearing: -0.00000000\ngoods market clearing: -0.00000587\n\n\n" ] ], 
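[ [ "To trace out the whole relationship rather than a single point, one could sweep over several values of $\sigma_e$ (an added sketch: it re-uses the **obj** root-finding function defined above, widens the bracket to the full admissible range, and may take a while to run).", "_____no_output_____" ] ], [ [ "# sketch: equilibrium interest rate as a function of income risk\nif DO_VARY_SIGMA_E:\n for sigma_e in [0.10,0.15,0.20]:\n model_ = model.copy()\n model_.par.sigma_e = sigma_e\n model_.create_grids()\n opt = optimize.root_scalar(obj,bracket=[0.00,1.0/model.par.beta-1-0.002],method='bisect',args=(model_,))\n print(f'sigma_e = {sigma_e:.2f} -> r_ss = {opt.root:.4f}')", "_____no_output_____" ] ], 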
[ [ "## Test matrix formulation", "_____no_output_____" ], [ "**Step 1:** Construct $\boldsymbol{Q}_{ss}$", "_____no_output_____" ] ], [ [ "# a. allocate Q\nQ = np.zeros((par.Ne*par.Na,par.Ne*par.Na))\n\n# b. fill\nfor i_e in range(par.Ne):\n \n # get view of current block\n q = Q[i_e*par.Na:(i_e+1)*par.Na,i_e*par.Na:(i_e+1)*par.Na]\n \n for i_a in range(par.Na):\n \n # i. optimal choice\n a_opt = sol.a[i_e,i_a]\n \n # ii. above -> all weight on last node\n if a_opt >= par.a_grid[-1]:\n \n q[i_a,-1] = 1.0 \n \n # iii. below -> all weight on first node\n elif a_opt <= par.a_grid[0]:\n \n q[i_a,0] = 1.0\n \n # iv. standard -> distribute weights on neighboring nodes\n else: \n \n i_a_low = np.searchsorted(par.a_grid,a_opt,side='right')-1 \n \n assert a_opt >= par.a_grid[i_a_low], f'{a_opt} < {par.a_grid[i_a_low]}'\n assert a_opt < par.a_grid[i_a_low+1], f'{a_opt} >= {par.a_grid[i_a_low+1]}'\n \n q[i_a,i_a_low] = (par.a_grid[i_a_low+1]-a_opt)/(par.a_grid[i_a_low+1]-par.a_grid[i_a_low])\n q[i_a,i_a_low+1] = 1-q[i_a,i_a_low]\n ", "_____no_output_____" ] ], [ [ "**Step 2:** Construct $\tilde{\Pi}^e=\Pi^e \otimes \boldsymbol{I}_{\#_{a}\times\#_{a}}$", "_____no_output_____" ] ], [ [ "Pit = np.kron(par.e_trans,np.identity(par.Na))", "_____no_output_____" ] ], [ [ "**Step 3:** Test $\overrightarrow{D}_{t+1}=\tilde{\Pi}^{e\prime}\boldsymbol{Q}_{ss}^{\prime}\overrightarrow{D}_{t}$", "_____no_output_____" ] ], [ [ "D = np.zeros(sim.D.shape)\nD[:,0] = par.e_ergodic \n\n# a. standard\nD_plus = np.zeros(D.shape)\nsimulate_forwards(D,sol.i,sol.w,par.e_trans.T.copy(),D_plus)\n\n# b. matrix product\nD_plus_alt = ((Pit.T@Q.T)@D.ravel()).reshape((par.Ne,par.Na))\n\n# c. test equality\nassert np.allclose(D_plus,D_plus_alt)", "_____no_output_____" ] ], [ [ "# Find transition path", "_____no_output_____" ], [ "**MIT shock:** Transition path for an arbitrary exogenous path of $Z_t$ starting from the stationary equilibrium, i.e. $D_{-1} = D_{ss}$ and in particular $K_{-1} = K_{ss}$.", "_____no_output_____" ], [ "**Step 1:** Construct $\{Z_t\}_{t=0}^{T-1}$ where $Z_t = (1-\rho_Z)Z_{ss} + \rho_Z Z_{t-1}$ and $Z_0 = (1+\sigma_Z) Z_{ss}$", "_____no_output_____" ] ], [ [ "path_Z = model.get_path_Z()", "_____no_output_____" ] ], [ [ "**Step 2:** Apply the relaxation algorithm.", "_____no_output_____" ] ], [ [ "if DO_TP_RELAX:\n \n t0 = time.time()\n\n # a. allocate\n path_r = np.repeat(model.par.r_ss,par.path_T) # use steady state as initial guess\n path_r_ = np.zeros(par.path_T)\n path_w = np.zeros(par.path_T)\n\n # b. settings\n nu = 0.90 # relaxation parameter\n max_iter = 5000 # maximum number of iterations\n\n # c. iterate\n it = 0\n while True:\n\n # i. find wage \n for t in range(par.path_T):\n path_w[t] = model.implied_w(path_r[t],path_Z[t])\n\n # ii. solve and simulate\n model.solve_household_path(path_r,path_w)\n model.simulate_household_path(model.sim.D)\n\n # iii. implied prices\n for t in range(par.path_T):\n path_r_[t] = model.implied_r(sim.path_Klag[t],path_Z[t])\n\n # iv. difference\n max_abs_diff = np.max(np.abs(path_r-path_r_))\n if it%10 == 0: print(f'{it:4d}: {max_abs_diff:.8f}')\n if max_abs_diff < 1e-8: break\n\n # v. update\n path_r = nu*path_r + (1-nu)*path_r_\n\n # vi. 
increment\n it += 1\n if it > max_iter: raise Exception('too many iterations') \n\n print(f'\\n transtion path found in {elapsed(t0)}')", " 0: 0.00038764\n 10: 0.00013139\n 20: 0.00004581\n 30: 0.00001597\n 40: 0.00000557\n 50: 0.00000194\n 60: 0.00000068\n 70: 0.00000024\n 80: 0.00000008\n 90: 0.00000003\n 100: 0.00000001\n\n transtion path found in 23.7 secs\n" ] ], [ [ "**Plot transition-paths:**", "_____no_output_____" ] ], [ [ "if DO_TP_RELAX:\n \n fig = plt.figure(figsize=(10,6))\n\n ax = fig.add_subplot(2,2,1)\n ax.plot(np.arange(par.path_T),path_Z,'-o',ms=2)\n ax.set_title('technology, $Z_t$');\n\n ax = fig.add_subplot(2,2,2)\n ax.plot(np.arange(par.path_T),sim.path_K,'-o',ms=2)\n ax.set_title('capital, $k_t$');\n\n ax = fig.add_subplot(2,2,3)\n ax.plot(np.arange(par.path_T),path_r,'-o',ms=2)\n ax.set_title('interest rate, $r_t$');\n\n ax = fig.add_subplot(2,2,4)\n ax.plot(np.arange(par.path_T),path_w,'-o',ms=2)\n ax.set_title('wage, $w_t$')\n\n fig.tight_layout()\n fig.savefig('figs/transition_path.pdf')\n ", "_____no_output_____" ] ], [ [ "**Remember:**", "_____no_output_____" ] ], [ [ "if DO_TP_RELAX:\n \n path_Z_relax = path_Z\n path_K_relax = sim.path_K\n path_r_relax = path_r\n path_w_relax = path_w\n ", "_____no_output_____" ] ], [ [ "# Find impulse-responses using sequence-space method", "_____no_output_____" ], [ "**Paper:** Auclert, A., Bardóczy, B., Rognlie, M., and Straub, L. (2020). *Using the Sequence-Space Jacobian to Solve and Estimate Heterogeneous-Agent Models*. \n\n**Original code:** [shade-econ](https://github.com/shade-econ/sequence-jacobian/#sequence-space-jacobian)\n\n**This code:** Illustrates the sequence-space method. The original paper shows how to do it computationally efficient and for a general class of models.", "_____no_output_____" ], [ "**Step 1:** Compute the Jacobian for the household block around the stationary equilibrium", "_____no_output_____" ] ], [ [ "def jac(model,price,dprice=1e-4,do_print=True):\n \n t0_all = time.time()\n \n if do_print: print(f'price is {price}')\n \n par = model.par\n sol = model.sol\n sim = model.sim\n \n # a. step 1: solve backwards\n t0 = time.time()\n\n path_r = np.repeat(par.r_ss,par.path_T)\n path_w = np.repeat(par.w_ss,par.path_T)\n \n if price == 'r': path_r[-1] += dprice\n elif price == 'w': path_w[-1] += dprice\n \n model.solve_household_path(path_r,path_w,do_print=False)\n if do_print: print(f'solved backwards in {elapsed(t0)}')\n\n # b. step 2: derivatives\n t0 = time.time()\n \n diff_Ds = np.zeros((par.path_T,*sim.D.shape))\n diff_as = np.zeros(par.path_T)\n diff_cs = np.zeros(par.path_T)\n \n for s in range(par.path_T):\n \n t_ =(par.path_T-1)-s\n simulate_forwards(sim.D,sol.path_i[t_],sol.path_w[t_],par.e_trans.T,diff_Ds[s])\n \n diff_Ds[s] = (diff_Ds[s]-sim.D)/dprice\n diff_as[s] = (np.sum(sol.path_a[t_]*sim.D)-np.sum(sol.a*sim.D))/dprice \n diff_cs[s] = (np.sum(sol.path_c[t_]*sim.D)-np.sum(sol.c*sim.D))/dprice \n \n if do_print: print(f'derivatives calculated in {elapsed(t0)}')\n \n # c. 
step 3: expectation factors\n t0 = time.time()\n \n # demeaning improves numerical stability\n def demean(x):\n return x - x.sum()/x.size\n\n exp_as = np.zeros((par.path_T-1,*sol.a.shape))\n exp_as[0] = demean(sol.a)\n\n exp_cs = np.zeros((par.path_T-1,*sol.c.shape))\n exp_cs[0] = demean(sol.c)\n \n for t in range(1,par.path_T-1):\n \n simulate_forwards_transpose(exp_as[t-1],sol.i,sol.w,par.e_trans,exp_as[t])\n exp_as[t] = demean(exp_as[t])\n \n simulate_forwards_transpose(exp_cs[t-1],sol.i,sol.w,par.e_trans,exp_cs[t])\n exp_cs[t] = demean(exp_cs[t])\n \n if do_print: print(f'expecation factors calculated in {elapsed(t0)}')\n \n # d. step 4: F \n t0 = time.time()\n \n Fa = np.zeros((par.path_T,par.path_T))\n Fa[0,:] = diff_as\n \n Fc = np.zeros((par.path_T,par.path_T))\n Fc[0,:] = diff_cs \n \n Fa[1:, :] = exp_as.reshape((par.path_T-1, -1)) @ diff_Ds.reshape((par.path_T, -1)).T\n Fc[1:, :] = exp_cs.reshape((par.path_T-1, -1)) @ diff_Ds.reshape((par.path_T, -1)).T \n\n if do_print: print(f'f calculated in {elapsed(t0)}')\n \n t0 = time.time()\n \n # e. step 5: J\n Ja = Fa.copy()\n for t in range(1, Ja.shape[1]): Ja[1:, t] += Ja[:-1, t - 1]\n\n Jc = Fc.copy()\n for t in range(1, Jc.shape[1]): Jc[1:, t] += Jc[:-1, t - 1]\n\n if do_print: print(f'J calculated in {elapsed(t0)}')\n \n # f. save\n setattr(model.sol,f'jac_curlyK_{price}',Ja)\n setattr(model.sol,f'jac_C_{price}',Jc)\n \n if do_print: print(f'full Jacobian calculated in {elapsed(t0_all)}\\n')", "_____no_output_____" ], [ "jac(model,'r')\njac(model,'w')", "price is r\nsolved backwards in 0.2 secs\nderivatives calculated in 2.0 secs\nexpecation factors calculated in 0.8 secs\nf calculated in 0.1 secs\nJ calculated in 0.0 secs\nfull Jacobian calculated in 3.1 secs\n\nprice is w\nsolved backwards in 0.1 secs\nderivatives calculated in 0.2 secs\nexpecation factors calculated in 0.1 secs\nf calculated in 0.1 secs\nJ calculated in 0.0 secs\nfull Jacobian calculated in 0.5 secs\n\n" ] ], [ [ "**Inspect Jacobians:**", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(12,8))\nT_fig = 200\n\n# curlyK_r\nax = fig.add_subplot(2,2,1)\nfor s in [0,25,50,75,100]:\n ax.plot(np.arange(T_fig),sol.jac_curlyK_r[s,:T_fig],'-o',ms=2,label=f'$s={s}$')\n \nax.legend(frameon=True)\nax.set_title(r'$\\mathcal{J}^{\\mathcal{K},r}$')\nax.set_xlim([0,T_fig])\n\n# curlyK_w\nax = fig.add_subplot(2,2,2)\nfor s in [0,25,50,75,100]:\n ax.plot(np.arange(T_fig),sol.jac_curlyK_w[s,:T_fig],'-o',ms=2)\n \nax.set_title(r'$\\mathcal{J}^{\\mathcal{K},w}$')\nax.set_xlim([0,T_fig])\n\n# C_r\nax = fig.add_subplot(2,2,3)\nfor s in [0,25,50,75,100]:\n ax.plot(np.arange(T_fig),sol.jac_C_r[s,:T_fig],'-o',ms=2,label=f'$s={s}$')\n \nax.legend(frameon=True)\nax.set_title(r'$\\mathcal{J}^{C,r}$')\nax.set_xlim([0,T_fig])\n\n# curlyK_w\nax = fig.add_subplot(2,2,4)\nfor s in [0,25,50,75,100]:\n ax.plot(np.arange(T_fig),sol.jac_C_w[s,:T_fig],'-o',ms=2)\n \nax.set_title(r'$\\mathcal{J}^{C,w}$')\nax.set_xlim([0,T_fig])\n\nfig.tight_layout()\nfig.savefig('figs/jacobians.pdf')", "_____no_output_____" ] ], [ [ "**Step 2:** Compute the Jacobians for the firm block around the stationary equilibrium (analytical).", "_____no_output_____" ] ], [ [ "sol.jac_r_K[:] = 0\nsol.jac_w_K[:] = 0\nsol.jac_r_Z[:] = 0\nsol.jac_w_Z[:] = 0\n\nfor s in range(par.path_T):\n for t in range(par.path_T):\n\n if t == s+1:\n\n sol.jac_r_K[t,s] = par.alpha*(par.alpha-1)*par.Z*par.K_ss**(par.alpha-2)\n sol.jac_w_K[t,s] = (1-par.alpha)*par.alpha*par.Z*par.K_ss**(par.alpha-1)\n\n if t == s:\n sol.jac_r_Z[t,s] = 
par.alpha*par.Z*par.K_ss**(par.alpha-1)\n sol.jac_w_Z[t,s] = (1-par.alpha)*par.Z*par.K_ss**par.alpha\n", "_____no_output_____" ] ], [ [ "**Step 3:** Use the chain rule and solve for $G$.", "_____no_output_____" ] ], [ [ "H_K = sol.jac_curlyK_r @ sol.jac_r_K + sol.jac_curlyK_w @ sol.jac_w_K - np.eye(par.path_T)\nH_Z = sol.jac_curlyK_r @ sol.jac_r_Z + sol.jac_curlyK_w @ sol.jac_w_Z \nG_K_Z = -np.linalg.solve(H_K, H_Z) # H_K^(-1)H_Z", "_____no_output_____" ] ], [ [ "**Step 4:** Find effect on prices and other outcomes than $K$.", "_____no_output_____" ] ], [ [ "G_r_Z = sol.jac_r_Z + sol.jac_r_K@G_K_Z\nG_w_Z = sol.jac_w_Z + sol.jac_w_K@G_K_Z\nG_C_Z = sol.jac_C_r@G_r_Z + sol.jac_C_w@G_w_Z", "_____no_output_____" ] ], [ [ "**Step 5:** Plot impulse-responses.", "_____no_output_____" ], [ "**Example I:** News shock (i.e. in a single period) vs. persistent shock where $ dZ_t = \\rho dZ_{t-1} $ and $dZ_0$ is the initial shock.", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(12,4))\nT_fig = 50\n\n# left: news shock\nax = fig.add_subplot(1,2,1)\nfor s in [5,10,15,20,25]:\n dZ = (1+par.Z_sigma)*par.Z*(np.arange(par.path_T) == s)\n dK = G_K_Z@dZ \n ax.plot(np.arange(T_fig),dK[:T_fig],'-o',ms=2,label=f'$s={s}$')\n\nax.legend(frameon=True)\nax.set_title(r'1% TFP news shock in period $s$')\nax.set_ylabel('$K_t-K_{ss}$')\nax.set_xlim([0,T_fig])\n\n# right: persistent shock\nax = fig.add_subplot(1,2,2)\ndZ = model.get_path_Z()-par.Z\ndK = G_K_Z@dZ\nax.plot(np.arange(T_fig),dK[:T_fig],'-o',ms=2)\n \nax.set_title(r'1% TFP shock with persistence $\\rho=0.90$')\nax.set_ylabel('$K_t-K_{ss}$')\nax.set_xlim([0,T_fig])\n\nfig.tight_layout()\nfig.savefig('figs/news_vs_persistent_shock.pdf')", "_____no_output_____" ] ], [ [ "**Example II:** Further effects of persistent shock.", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(12,8))\nT_fig = 50\n\nax_K = fig.add_subplot(2,2,1)\nax_r = fig.add_subplot(2,2,2)\nax_w = fig.add_subplot(2,2,3)\nax_C = fig.add_subplot(2,2,4)\n\nax_K.set_title('$K_t-K_{ss}$ after 1% TFP shock')\nax_K.set_xlim([0,T_fig])\n \nax_r.set_title('$r_t-r_{ss}$ after 1% TFP shock')\nax_r.set_xlim([0,T_fig])\n\nax_w.set_title('$w_t-w_{ss}$ after 1% TFP shock')\nax_w.set_xlim([0,T_fig])\n\nax_C.set_title('$C_t-C_{ss}$ after 1% TFP shock')\nax_C.set_xlim([0,T_fig])\n\ndZ = model.get_path_Z()-par.Z\n \ndK = G_K_Z@dZ\nax_K.plot(np.arange(T_fig),dK[:T_fig],'-o',ms=2)\n\ndr = G_r_Z@dZ \nax_r.plot(np.arange(T_fig),dr[:T_fig],'-o',ms=2)\n\ndw = G_w_Z@dZ \nax_w.plot(np.arange(T_fig),dw[:T_fig],'-o',ms=2)\n\ndC = G_C_Z@dZ\nax_C.plot(np.arange(T_fig),dC[:T_fig],'-o',ms=2)\n \nfig.tight_layout()\nfig.savefig('figs/irfs.pdf')", "_____no_output_____" ] ], [ [ "## Non-linear transition path", "_____no_output_____" ], [ "Use the Jacobian to speed-up solving for the non-linear transition path using a quasi-Newton method.", "_____no_output_____" ], [ "**1. Solver**", "_____no_output_____" ] ], [ [ "def broyden_solver(f,x0,jac,tol=1e-8,max_iter=100,backtrack_fac=0.5,max_backtrack=30,do_print=False):\n \"\"\" numerical solver using the broyden method \"\"\"\n\n # a. initial\n x = x0.ravel()\n y = f(x)\n\n # b. iterate\n for it in range(max_iter):\n \n # i. current difference\n abs_diff = np.max(np.abs(y))\n if do_print: print(f' it = {it:3d} -> max. abs. error = {abs_diff:12.8f}')\n\n if abs_diff < tol: return x\n \n # ii. new x\n dx = np.linalg.solve(jac,-y)\n \n # iii. 
evaluate with backtracking\n        for _ in range(max_backtrack):\n\n            try: # evaluate\n                ynew = f(x+dx)\n            except ValueError: # backtrack\n                dx *= backtrack_fac\n            else: # update jac and break from backtracking\n                dy = ynew-y\n                jac = jac + np.outer(((dy - jac @ dx) / np.linalg.norm(dx) ** 2), dx)\n                y = ynew\n                x += dx\n                break\n\n        else:\n\n            raise ValueError('too many backtracks, maybe bad initial guess?')\n        \n    else:\n\n        raise ValueError(f'no convergence after {max_iter} iterations') ", "_____no_output_____" ] ], [ [ "**2. Target function**", "_____no_output_____" ], [ "$$\\boldsymbol{H}(\\boldsymbol{K},\\boldsymbol{Z},D_{ss}) = \\mathcal{K}_{t}(\\{r(Z_{s},K_{s-1}),w(Z_{s},K_{s-1})\\}_{s\\geq0},D_{ss})-K_{t}=0$$", "_____no_output_____" ] ], [ [ "def target(path_K,path_Z,model,D0,full_output=False):\n\n    par = model.par\n    sim = model.sim\n    \n    path_r = np.zeros(path_K.size)\n    path_w = np.zeros(path_K.size)\n\n    # a. implied prices\n    K0lag = np.sum(par.a_grid[np.newaxis,:]*D0)\n    path_Klag = np.insert(path_K,0,K0lag)\n    \n    for t in range(par.path_T):\n        path_r[t] = model.implied_r(path_Klag[t],path_Z[t])\n        path_w[t] = model.implied_w(path_r[t],path_Z[t])\n    \n    # b. solve and simulate\n    model.solve_household_path(path_r,path_w)\n    model.simulate_household_path(D0)\n    \n    # c. market clearing\n    if full_output:\n        return path_r,path_w\n    else:\n        return sim.path_K-path_K\n", "_____no_output_____" ] ], [ [ "**3. Solve**", "_____no_output_____" ] ], [ [ "path_Z = model.get_path_Z()\nf = lambda x: target(x,path_Z,model,sim.D)", "_____no_output_____" ], [ "t0 = time.time()\npath_K = broyden_solver(f,x0=np.repeat(par.K_ss,par.path_T),jac=H_K,do_print=True)\npath_r,path_w = target(path_K,path_Z,model,sim.D,full_output=True)\nprint(f'\\nIRF found in {elapsed(t0)}')", " it =   0 -> max. abs. error =   0.05898269\n it =   1 -> max. abs. error =   0.00026075\n it =   2 -> max. abs. error =   0.00000163\n it =   3 -> max. abs. error =   0.00000000\n\nIRF found in 2.1 secs\n" ] ], [ [ "**4. Plot**", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(12,4))\n\nax = fig.add_subplot(1,2,1)\nax.set_title('capital, $K_t$')\ndK = G_K_Z@(path_Z-par.Z)\nax.plot(np.arange(T_fig),dK[:T_fig] + par.K_ss,'-o',ms=2,label=f'linear')\nax.plot(np.arange(T_fig),path_K[:T_fig],'-o',ms=2,label=f'non-linear')\nif DO_TP_RELAX:\n    ax.plot(np.arange(T_fig),path_K_relax[:T_fig],'--o',ms=2,label=f'non-linear (relaxation)')\n\nax.legend(frameon=True) \n    \nax = fig.add_subplot(1,2,2)\nax.set_title('interest rate, $r_t$')\ndr = G_r_Z@(path_Z-par.Z)\nax.plot(np.arange(T_fig),dr[:T_fig] + par.r_ss,'-o',ms=2,label=f'linear')\nax.plot(np.arange(T_fig),path_r[:T_fig],'-o',ms=2,label=f'non-linear')\nif DO_TP_RELAX:\n    ax.plot(np.arange(T_fig),path_r_relax[:T_fig],'--o',ms=2,label=f'non-linear (relaxation)')\n    \nfig.tight_layout()\nfig.savefig('figs/non_linear.pdf')", "_____no_output_____" ] ], [ [ "## Covariances", "_____no_output_____" ], [ "Assume that $Z_t$ is stochastic and follows\n\n$$ d\\tilde{Z}_t = \\rho d\\tilde{Z}_{t-1} + \\sigma\\epsilon_t,\\,\\,\\, \\epsilon_t \\sim \\mathcal{N}(0,1) $$\n\nThe covariances between all outcomes can be calculated as follows.", "_____no_output_____" ] ], [ [ "# a. choose parameters\nrho = 0.90\nsigma = 0.10\n\n# b. find change in outputs\ndZ = rho**(np.arange(par.path_T))\ndC = G_C_Z@dZ\ndK = G_K_Z@dZ\n\n# c. 
covariance of consumption\nprint('auto-covariance of consumption:\\n')\nfor k in range(5):\n    if k == 0:\n        autocov_C = sigma**2*np.sum(dC*dC)\n    else:\n        autocov_C = sigma**2*np.sum(dC[:-k]*dC[k:])\n    print(f' k = {k}: {autocov_C:.4f}')\n    \n# d. covariance of consumption and capital\ncov_C_K = sigma**2*np.sum(dC*dK)\nprint(f'\\ncovariance of consumption and capital: {cov_C_K:.4f}')", "auto-covariance of consumption:\n\n k = 0: 0.0445\n k = 1: 0.0431\n k = 2: 0.0415\n k = 3: 0.0399\n k = 4: 0.0382\n\ncovariance of consumption and capital: 0.2117\n" ] ], 
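[ [ "As a quick sanity check (a sketch added here, not part of the original derivation), the analytical moments can be compared with sample moments from a simulated shock sequence. In the linearized model each outcome is a moving average of past shocks with its impulse response as weights, which `np.convolve` computes directly; the comparison is approximate because the impulse responses are truncated at `path_T`.", "_____no_output_____" ] ], [ [ "# Monte Carlo check of the analytical covariances (approximate: IRFs truncated at path_T)\nT_sim = 200_000\neps = np.random.normal(size=T_sim)\n\n# linearized deviations from steady state as moving averages of past shocks\ndC_path = sigma*np.convolve(eps, dC)[:T_sim]\ndK_path = sigma*np.convolve(eps, dK)[:T_sim]\n\nprint(f'simulated var(C): {np.var(dC_path):.4f} (analytical: {sigma**2*np.sum(dC*dC):.4f})')\nprint(f'simulated cov(C,K): {np.cov(dC_path, dK_path)[0,1]:.4f} (analytical: {sigma**2*np.sum(dC*dK):.4f})')", "_____no_output_____" ] ], 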
[ [ "# Extra: No idiosyncratic uncertainty", "_____no_output_____" ], [ "This section solves for the transition path in the case without idiosyncratic uncertainty.", "_____no_output_____" ], [ "**Analytical solution for steady state:**", "_____no_output_____" ] ], [ [ "r_ss_pf = (1/par.beta-1) # from the Euler equation\nw_ss_pf = model.implied_w(r_ss_pf,par.Z)\nK_ss_pf = model.firm_demand(r_ss_pf,par.Z)\nY_ss_pf = model.firm_production(K_ss_pf,par.Z)\nC_ss_pf = Y_ss_pf-par.delta*K_ss_pf\n\nprint(f'r: {r_ss_pf:.6f}')\nprint(f'w: {w_ss_pf:.6f}')\nprint(f'Y: {Y_ss_pf:.6f}')\nprint(f'C: {C_ss_pf:.6f}')\nprint(f'K/Y: {K_ss_pf/Y_ss_pf:.6f}')", "r: 0.018330\nw: 0.998613\nY: 1.122037\nC: 1.050826\nK/Y: 2.538660\n" ] ], [ [ "**Function for finding consumption and capital paths given paths of interest rates and wages:**\n\nIt can be shown that\n\n$$ C_{0}=\\frac{(1+r_{0})a_{-1}+\\sum_{t=0}^{\\infty}\\frac{1}{\\mathcal{R}_{t}}w_{t}}{\\sum_{t=0}^{\\infty}\\beta^{t/\\sigma}\\mathcal{R}_{t}^{\\frac{1-\\sigma}{\\sigma}}} $$\n\nwhere \n\n$$ \\mathcal{R}_{t} =\\begin{cases} 1 & \\text{if }t=0\\\\ (1+r_{t})\\mathcal{R}_{t-1} & \\text{else} \\end{cases} $$\n\nOtherwise the **Euler equation** holds\n\n$$ C_t = (\\beta (1+r_{t}))^{\\frac{1}{\\sigma}}C_{t-1} $$", "_____no_output_____" ] ], [ [ "def path_CK_func(K0,path_r,path_w,r_ss,w_ss,model):\n    \n    par = model.par\n    \n    # a. initialize\n    wealth = (1+path_r[0])*K0\n    inv_MPC = 0\n    \n    # b. solve\n    RT = 1\n    max_iter = 5000\n    t = 0\n    while t < max_iter: \n        \n        # i. prices padded with steady state\n        r = path_r[t] if t < par.path_T else r_ss\n        w = path_w[t] if t < par.path_T else w_ss\n        \n        # ii. interest rate factor \n        if t == 0:\n            fac = 1\n        else:\n            fac *= (1+r)\n        \n        # iii. accumulate\n        add_wealth = w/fac\n        add_inv_MPC = par.beta**(t/par.sigma)*fac**((1-par.sigma)/par.sigma)\n        if np.fmax(add_wealth,add_inv_MPC) < 1e-12:\n            break\n        else:\n            wealth += add_wealth\n            inv_MPC += add_inv_MPC\n        \n        # iv. increment\n        t += 1\n    \n    # c. simulate\n    path_C = np.empty(par.path_T)\n    path_K = np.empty(par.path_T)\n    \n    for t in range(par.path_T):\n        \n        if t == 0:\n            path_C[t] = wealth/inv_MPC\n            K_lag = K0\n        else:\n            path_C[t] = (par.beta*(1+path_r[t]))**(1/par.sigma)*path_C[t-1]\n            K_lag = path_K[t-1]\n        \n        path_K[t] = (1+path_r[t])*K_lag + path_w[t] - path_C[t]\n        \n    return path_K,path_C", "_____no_output_____" ] ], [ [ "**Test with steady state prices:**", "_____no_output_____" ] ], [ [ "path_r_pf = np.repeat(r_ss_pf,par.path_T)\npath_w_pf = np.repeat(w_ss_pf,par.path_T)\npath_K_pf,path_C_pf = path_CK_func(K_ss_pf,path_r_pf,path_w_pf,r_ss_pf,w_ss_pf,model)", "_____no_output_____" ], [ "print(f'C_ss: {C_ss_pf:.6f}')\nprint(f'C[0]: {path_C_pf[0]:.6f}')\nprint(f'C[-1]: {path_C_pf[-1]:.6f}')\nassert np.isclose(C_ss_pf,path_C_pf[0])", "C_ss: 1.050826\nC[0]: 1.050826\nC[-1]: 1.050826\n" ] ], [ [ "**Shock paths** where the interest rate deviates in one period:", "_____no_output_____" ] ], [ [ "dr = 1e-4\nts = np.array([0,20,40])\npath_C_pf_shock = np.empty((ts.size,par.path_T))\npath_K_pf_shock = np.empty((ts.size,par.path_T))\n\nfor i,t in enumerate(ts):\n    \n    path_r_pf_shock = path_r_pf.copy()\n    path_r_pf_shock[t] += dr\n    \n    K,C = path_CK_func(K_ss_pf,path_r_pf_shock,path_w_pf,r_ss_pf,w_ss_pf,model)\n    \n    path_K_pf_shock[i,:] = K\n    path_C_pf_shock[i,:] = C", "_____no_output_____" ] ], [ [ "**Plot paths:**", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(12,4))\n\nax = fig.add_subplot(1,2,1)\nax.plot(np.arange(par.path_T),path_C_pf,'-o',ms=2,label=f'$r_t = r^{{\\\\ast}}$')\nfor i,t in enumerate(ts):\n    ax.plot(np.arange(par.path_T),path_C_pf_shock[i],'-o',ms=2,label=f'shock to $r_{{{t}}}$')\n    \nax.set_xlim([0,50])\nax.set_xlabel('periods')\nax.set_ylabel('consumption, $C_t$');\n\nax = fig.add_subplot(1,2,2)\nax.plot(np.arange(par.path_T),path_K_pf,'-o',ms=2,label=f'$r_t = r^{{\\\\ast}}$')\nfor i,t in enumerate(ts):\n    ax.plot(np.arange(par.path_T),path_K_pf_shock[i],'-o',ms=2,label=f'shock to $r_{{{t}}}$')\n    \nax.legend(frameon=True)\nax.set_xlim([0,50])\nax.set_xlabel('$t$')\nax.set_ylabel('capital, $K_t$');\n\nfig.tight_layout()", "_____no_output_____" ] ], [ [ "**Find transition path with shooting algorithm:**", "_____no_output_____" ] ], [ [ "# a. allocate\ndT = 200\npath_C_pf = np.empty(par.path_T)\npath_K_pf = np.empty(par.path_T)\npath_r_pf = np.empty(par.path_T)\npath_w_pf = np.empty(par.path_T)\n\n# b. settings\nC_min = C_ss_pf\nC_max = C_ss_pf + K_ss_pf\n\nK_min = 1.5 # guess on lower consumption if below this\nK_max = 3 # guess on higher consumption if above this\n\ntol_pf = 1e-6\nmax_iter_pf = 5000\n\npath_K_pf[0] = K_ss_pf # capital is pre-determined\n\n# c. iterate\nt = 0\nit = 0\nwhile True:\n    \n    # i. update prices\n    path_r_pf[t] = model.implied_r(path_K_pf[t],path_Z[t])\n    path_w_pf[t] = model.implied_w(path_r_pf[t],path_Z[t])\n\n    # ii. consumption \n    if t == 0:\n        C0 = (C_min+C_max)/2\n        path_C_pf[t] = C0\n    else:\n        path_C_pf[t] = (1+path_r_pf[t])*par.beta*path_C_pf[t-1]\n    \n    # iii. check for steady state\n    if path_K_pf[t] < K_min:\n        t = 0\n        C_max = C0\n        continue\n    elif path_K_pf[t] > K_max:\n        t = 0\n        C_min = C0\n        continue\n    elif t > 10 and np.sqrt((path_C_pf[t]-C_ss_pf)**2+(path_K_pf[t]-K_ss_pf)**2) < tol_pf:\n        path_C_pf[t:] = path_C_pf[t]\n        path_K_pf[t:] = path_K_pf[t]\n        for k in range(par.path_T):\n            path_r_pf[k] = model.implied_r(path_K_pf[k],path_Z[k])\n            path_w_pf[k] = model.implied_w(path_r_pf[k],path_Z[k])\n        break\n    \n    # iv. 
update capital\n path_K_pf[t+1] = (1+path_r_pf[t])*path_K_pf[t] + path_w_pf[t] - path_C_pf[t] \n \n # v. increment\n t += 1\n it += 1\n if it > max_iter_pf: break ", "_____no_output_____" ] ], [ [ "**Plot deviations from steady state:**", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(12,8))\n\nax = fig.add_subplot(2,2,1)\nax.plot(np.arange(par.path_T),path_Z,'-o',ms=2)\nax.set_xlim([0,200])\nax.set_title('technology, $Z_t$')\n\nax = fig.add_subplot(2,2,2)\nax.plot(np.arange(par.path_T),path_K-model.par.kd_ss,'-o',ms=2,label='$\\sigma_e = 0.5$')\nax.plot(np.arange(par.path_T),path_K_pf-K_ss_pf,'-o',ms=2,label='$\\sigma_e = 0$')\nax.legend(frameon=True)\nax.set_title('capital, $k_t$')\nax.set_xlim([0,200])\n\nax = fig.add_subplot(2,2,3)\nax.plot(np.arange(par.path_T),path_r-model.par.r_ss,'-o',ms=2,label='$\\sigma_e = 0.5$')\nax.plot(np.arange(par.path_T),path_r_pf-r_ss_pf,'-o',ms=2,label='$\\sigma_e = 0$')\nax.legend(frameon=True)\nax.set_title('interest rate, $r_t$')\nax.set_xlim([0,200])\n\nax = fig.add_subplot(2,2,4)\nax.plot(np.arange(par.path_T),path_w-model.par.w_ss,'-o',ms=2,label='$\\sigma_e = 0.5$')\nax.plot(np.arange(par.path_T),path_w_pf-w_ss_pf,'-o',ms=2,label='$\\sigma_e = 0$')\nax.legend(frameon=True)\nax.set_title('wage, $w_t$')\nax.set_xlim([0,200])\n\nfig.tight_layout()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d0280bb7ebf666cc2c79aedcf989ddce331103b2
18,683
ipynb
Jupyter Notebook
LeNet-Lab.ipynb
dumebi/-Udasity-CarND-LeNet-Lab
a02a97373b10efe1ee18ae3115f5e4ed9934512f
[ "MIT" ]
null
null
null
LeNet-Lab.ipynb
dumebi/-Udasity-CarND-LeNet-Lab
a02a97373b10efe1ee18ae3115f5e4ed9934512f
[ "MIT" ]
null
null
null
LeNet-Lab.ipynb
dumebi/-Udasity-CarND-LeNet-Lab
a02a97373b10efe1ee18ae3115f5e4ed9934512f
[ "MIT" ]
null
null
null
34.406998
2,100
0.599101
[ [ [ "# LeNet Lab\n![LeNet Architecture](lenet.png)\nSource: Yan LeCun", "_____no_output_____" ], [ "## Load Data\n\nLoad the MNIST data, which comes pre-loaded with TensorFlow.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "from tensorflow.examples.tutorials.mnist import input_data\n\nmnist = input_data.read_data_sets(\"MNIST_data/\", reshape=False)\nX_train, y_train           = mnist.train.images, mnist.train.labels\nX_validation, y_validation = mnist.validation.images, mnist.validation.labels\nX_test, y_test             = mnist.test.images, mnist.test.labels\n\nassert(len(X_train) == len(y_train))\nassert(len(X_validation) == len(y_validation))\nassert(len(X_test) == len(y_test))\n\nprint()\nprint(\"Image Shape: {}\".format(X_train[0].shape))\nprint()\nprint(\"Training Set:   {} samples\".format(len(X_train)))\nprint(\"Validation Set: {} samples\".format(len(X_validation)))\nprint(\"Test Set:       {} samples\".format(len(X_test)))", "/Users/Kornet-Mac-1/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n  from ._conv import register_converters as _register_converters\n" ] ], [ [ "The MNIST data that TensorFlow pre-loads comes as 28x28x1 images.\n\nHowever, the LeNet architecture only accepts 32x32xC images, where C is the number of color channels.\n\nIn order to reformat the MNIST data into a shape that LeNet will accept, we pad the data with two rows of zeros on the top and bottom, and two columns of zeros on the left and right (28+2+2 = 32).\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "import numpy as np\n\n# Pad images with 0s\nX_train      = np.pad(X_train, ((0,0),(2,2),(2,2),(0,0)), 'constant')\nX_validation = np.pad(X_validation, ((0,0),(2,2),(2,2),(0,0)), 'constant')\nX_test       = np.pad(X_test, ((0,0),(2,2),(2,2),(0,0)), 'constant')\n    \nprint(\"Updated Image Shape: {}\".format(X_train[0].shape))", "Updated Image Shape: (32, 32, 1)\n" ] ], [ [ "## Visualize Data\n\nView a sample from the dataset.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "import random\nimport numpy as np\nimport matplotlib.pyplot as plt\n%matplotlib inline\n\nindex = random.randint(0, len(X_train) - 1)\nimage = X_train[index].squeeze()\n\nplt.figure(figsize=(1,1))\nplt.imshow(image, cmap=\"gray\")\nprint(y_train[index])", "6\n" ] ], [ [ "## Preprocess Data\n\nShuffle the training data.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "from sklearn.utils import shuffle\n\nX_train, y_train = shuffle(X_train, y_train)", "_____no_output_____" ] ], 
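[ [ "An optional extra step (a sketch, not part of the original lab and not applied when the recorded results below were produced): centering and scaling the pixel values often speeds up convergence. The statistics come from the training set only, so no information leaks in from the validation or test data.", "_____no_output_____" ] ], [ [ "# Optional (illustrative) normalization: zero mean, unit variance, train-set statistics only\n# X_mean = np.mean(X_train)\n# X_std = np.std(X_train)\n#\n# X_train = (X_train - X_mean) / X_std\n# X_validation = (X_validation - X_mean) / X_std\n# X_test = (X_test - X_mean) / X_std", "_____no_output_____" ] ], 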
[ [ "## Setup TensorFlow\nThe `EPOCH` and `BATCH_SIZE` values affect the training speed and model accuracy.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "import tensorflow as tf\n\nEPOCHS = 10\nBATCH_SIZE = 128", "_____no_output_____" ] ], [ [ "## TODO: Implement LeNet-5\nImplement the [LeNet-5](http://yann.lecun.com/exdb/lenet/) neural network architecture.\n\nThis is the only cell you need to edit.\n### Input\nThe LeNet architecture accepts a 32x32xC image as input, where C is the number of color channels. Since MNIST images are grayscale, C is 1 in this case.\n\n### Architecture\n**Layer 1: Convolutional.** The output shape should be 28x28x6.\n\n**Activation.** Your choice of activation function.\n\n**Pooling.** The output shape should be 14x14x6.\n\n**Layer 2: Convolutional.** The output shape should be 10x10x16.\n\n**Activation.** Your choice of activation function.\n\n**Pooling.** The output shape should be 5x5x16.\n\n**Flatten.** Flatten the output shape of the final pooling layer such that it's 1D instead of 3D. The easiest way to do this is by using `tf.contrib.layers.flatten`, which is already imported for you.\n\n**Layer 3: Fully Connected.** This should have 120 outputs.\n\n**Activation.** Your choice of activation function.\n\n**Layer 4: Fully Connected.** This should have 84 outputs.\n\n**Activation.** Your choice of activation function.\n\n**Layer 5: Fully Connected (Logits).** This should have 10 outputs.\n\n### Output\nReturn the result of the final fully connected layer (the logits).", "_____no_output_____" ] ], [ [ "from tensorflow.contrib.layers import flatten\n\ndef LeNet(x):    \n    # Arguments used for tf.truncated_normal, randomly defines variables for the weights and biases for each layer\n    mu = 0\n    sigma = 0.1\n    \n    # TODO: Layer 1: Convolutional. Input = 32x32x1. Output = 28x28x6.\n    conv1_W = tf.Variable(tf.truncated_normal(shape=(5, 5, 1, 6), mean = mu, stddev = sigma))\n    conv1_b = tf.Variable(tf.zeros(6))\n    conv1   = tf.nn.conv2d(x, conv1_W, strides=[1, 1, 1, 1], padding='VALID') + conv1_b\n\n    # TODO: Activation.\n    conv1 = tf.nn.relu(conv1)\n\n    # TODO: Pooling. Input = 28x28x6. Output = 14x14x6.\n    conv1 = tf.nn.max_pool(conv1, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='VALID')\n\n    # TODO: Layer 2: Convolutional. Output = 10x10x16.\n    conv2_W = tf.Variable(tf.truncated_normal(shape=(5, 5, 6, 16), mean = mu, stddev = sigma))\n    conv2_b = tf.Variable(tf.zeros(16))\n    conv2   = tf.nn.conv2d(conv1, conv2_W, strides=[1, 1, 1, 1], padding='VALID') + conv2_b\n    \n    # TODO: Activation.\n    conv2 = tf.nn.relu(conv2)\n\n    # TODO: Pooling. Input = 10x10x16. Output = 5x5x16.\n    conv2 = tf.nn.max_pool(conv2, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='VALID')\n\n    # TODO: Flatten. Input = 5x5x16. Output = 400.\n    fc0   = flatten(conv2)\n    \n    # TODO: Layer 3: Fully Connected. Input = 400. Output = 120.\n    fc1_W = tf.Variable(tf.truncated_normal(shape=(400, 120), mean = mu, stddev = sigma))\n    fc1_b = tf.Variable(tf.zeros(120))\n    fc1   = tf.matmul(fc0, fc1_W) + fc1_b\n    \n    # TODO: Activation.\n    fc1    = tf.nn.relu(fc1)\n\n    # TODO: Layer 4: Fully Connected. Input = 120. Output = 84.\n    fc2_W  = tf.Variable(tf.truncated_normal(shape=(120, 84), mean = mu, stddev = sigma))\n    fc2_b  = tf.Variable(tf.zeros(84))\n    fc2    = tf.matmul(fc1, fc2_W) + fc2_b\n    \n    # TODO: Activation.\n    fc2    = tf.nn.relu(fc2)\n\n    # TODO: Layer 5: Fully Connected. Input = 84. Output = 10.\n    fc3_W  = tf.Variable(tf.truncated_normal(shape=(84, 10), mean = mu, stddev = sigma))\n    fc3_b  = tf.Variable(tf.zeros(10))\n    logits = tf.matmul(fc2, fc3_W) + fc3_b\n    \n    return logits", "_____no_output_____" ] ], [ [ "## Features and Labels\nTrain LeNet to classify [MNIST](http://yann.lecun.com/exdb/mnist/) data.\n\n`x` is a placeholder for a batch of input images.\n`y` is a placeholder for a batch of output labels.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "x = tf.placeholder(tf.float32, (None, 32, 32, 1))\ny = tf.placeholder(tf.int32, (None))\none_hot_y = tf.one_hot(y, 10)", "_____no_output_____" ] ], [ [ "## Training Pipeline\nCreate a training pipeline that uses the model to classify MNIST data.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "rate = 0.001\n\nlogits = LeNet(x)\ncross_entropy = tf.nn.softmax_cross_entropy_with_logits(labels=one_hot_y, logits=logits)\nloss_operation = tf.reduce_mean(cross_entropy)\noptimizer = tf.train.AdamOptimizer(learning_rate = rate)\ntraining_operation = optimizer.minimize(loss_operation)", "_____no_output_____" ] ], [ [ "## Model Evaluation\nEvaluate the loss and accuracy of the model on a given dataset.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "correct_prediction = tf.equal(tf.argmax(logits, 1), tf.argmax(one_hot_y, 1))\naccuracy_operation = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))\nsaver = tf.train.Saver()\n\ndef evaluate(X_data, y_data):\n    num_examples = len(X_data)\n    total_accuracy = 0\n    sess = tf.get_default_session()\n    for offset in range(0, num_examples, BATCH_SIZE):\n        batch_x, batch_y = X_data[offset:offset+BATCH_SIZE], y_data[offset:offset+BATCH_SIZE]\n        accuracy = sess.run(accuracy_operation, feed_dict={x: batch_x, y: batch_y})\n        total_accuracy += (accuracy * len(batch_x))\n    return total_accuracy / num_examples", "_____no_output_____" ] ], [ [ "## Train the Model\nRun the training data through the training pipeline to train the model.\n\nBefore each epoch, shuffle the training set.\n\nAfter each epoch, measure the loss and accuracy of the validation set.\n\nSave the model after training.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "with tf.Session() as sess:\n    sess.run(tf.global_variables_initializer())\n    num_examples = len(X_train)\n    \n    print(\"Training...\")\n    print()\n    for i in range(EPOCHS):\n        X_train, y_train = shuffle(X_train, y_train)\n        for offset in range(0, num_examples, BATCH_SIZE):\n            end = offset + BATCH_SIZE\n            batch_x, batch_y = X_train[offset:end], y_train[offset:end]\n            sess.run(training_operation, feed_dict={x: batch_x, y: batch_y})\n        \n        validation_accuracy = evaluate(X_validation, y_validation)\n        print(\"EPOCH {} ...\".format(i+1))\n        print(\"Validation Accuracy = {:.3f}\".format(validation_accuracy))\n        print()\n        \n    saver.save(sess, './lenet')\n    print(\"Model saved\")", "Training...\n\nEPOCH 1 ...\nValidation Accuracy = 0.972\n\nEPOCH 2 ...\nValidation Accuracy = 0.978\n\nEPOCH 3 ...\nValidation Accuracy = 0.984\n\nEPOCH 4 ...\nValidation Accuracy = 0.986\n\nEPOCH 5 ...\nValidation Accuracy = 0.989\n\nEPOCH 6 ...\nValidation Accuracy = 0.987\n\nEPOCH 7 ...\nValidation Accuracy = 0.990\n\nEPOCH 8 ...\nValidation Accuracy = 0.987\n\nEPOCH 9 ...\nValidation Accuracy = 0.989\n\nEPOCH 10 ...\nValidation Accuracy = 0.989\n\nModel saved\n" ] ], [ [ "## Evaluate the Model\nOnce you are completely satisfied with your model, evaluate the performance of the model on the test 
set.\n\nBe sure to only do this once!\n\nIf you were to measure the performance of your trained model on the test set, then improve your model, and then measure the performance of your model on the test set again, that would invalidate your test results. You wouldn't get a true measure of how well your model would perform against real data.\n\nYou do not need to modify this section.", "_____no_output_____" ] ], [ [ "with tf.Session() as sess:\n saver.restore(sess, tf.train.latest_checkpoint('.'))\n\n test_accuracy = evaluate(X_test, y_test)\n print(\"Test Accuracy = {:.3f}\".format(test_accuracy))", "INFO:tensorflow:Restoring parameters from ./lenet\nTest Accuracy = 0.990\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d0280f2f545220cf9bd8028e67c6265c7ed56436
322,995
ipynb
Jupyter Notebook
Threshold Investigation.ipynb
mwcotton/DAGmetrics
974a1d6781e041e59edb081659184dc34153f3d8
[ "MIT" ]
1
2020-11-07T21:11:56.000Z
2020-11-07T21:11:56.000Z
Threshold Investigation.ipynb
mwcotton/DAGmetrics
974a1d6781e041e59edb081659184dc34153f3d8
[ "MIT" ]
null
null
null
Threshold Investigation.ipynb
mwcotton/DAGmetrics
974a1d6781e041e59edb081659184dc34153f3d8
[ "MIT" ]
null
null
null
146.019439
37,416
0.839954
[ [ [ "import numpy as np\nimport scipy as sp\nimport scipy.interpolate\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport scipy.stats\nimport scipy.optimize\n\nfrom scipy.optimize import curve_fit \n\nimport minkowskitools as mt", "_____no_output_____" ], [ "import importlib\nimportlib.reload(mt)", "_____no_output_____" ], [ "n=4000\nrand_points = np.random.uniform(size=(2, n-2))\nedge_points = np.array([[0.0, 1.0],[0.0, 1.0]])\npoints = np.concatenate((rand_points, edge_points), axis=1)", "_____no_output_____" ], [ "connections = mt.get_connections(points, pval=2, radius=0.05)", "_____no_output_____" ], [ "quick_data = []\n\nfor i in range(1000):\n    n=1000\n    rand_points = np.random.uniform(size=(2, n-2))\n    edge_points = np.array([[0.0, 1.0],[0.0, 1.0]])\n    points = np.concatenate((rand_points, edge_points), axis=1)\n    connections = mt.get_connections(points, pval=2, radius=0.1)\n    no_points = mt.perc_thresh_n(connections)\n    quick_data.append(no_points)", "_____no_output_____" ], [ "plt.hist(quick_data, cumulative=True, bins=100)\nplt.gca().set(xlim=(0, 1000), xlabel='Number of Points', ylabel='Cumulative Density', title='Connection Threshold')\n# plt.savefig('img/pval2r05.pdf')\nplt.gca().set(xlim=(0, np.max(quick_data)))", "_____no_output_____" ], [ "plt.hist(quick_data, bins=100);", "_____no_output_____" ], [ "n=1000\nrand_points = np.random.uniform(size=(2, n-2))\nedge_points = np.array([[0.0, 1.0],[0.0, 1.0]])\npoints = np.concatenate((rand_points, edge_points), axis=1)\n\nmt.smallest_r(points, pval=2)", "_____no_output_____" ], [ "n=1000\ntrials = 100\nall_results = {}\nresults = []\nfor i in range(trials):\n    \n    rand_points = np.random.uniform(size=(2, n-2))\n    edge_points = np.array([[0.0, 1.0],[0.0, 1.0]])\n    points = np.concatenate((rand_points, edge_points), axis=1)\n\n    results.append(mt.smallest_r(points, pval=2)[1])\n    ", "_____no_output_____" ], [ "plt.hist(results, cumulative=True, bins=100);", "_____no_output_____" ], [ "mt.r1_area2D(2)*(.05**2)*n", "_____no_output_____" ], [ "ns = [1000]\nps = [2]\n\nmt.separate_perc_r(ns, ps, 'outputs/test_perc.txt', repeats=10)", "_____no_output_____" ], [ "import importlib\nimportlib.reload(mt)", "_____no_output_____" ], [ "data_dict = {}\nfor pval in [0.8, 1, 1.2]:\n    data_dict[pval] = []\n    n = 1000\n    r = 0.1\n    for i in range(1000):\n        rand_points = np.random.uniform(size=(2, n-2))\n        edge_points = np.array([[0.0, 1.0],[0.0, 1.0]])\n        points = np.concatenate((rand_points, edge_points), axis=1)\n        connections = mt.get_connections(points, pval=pval, radius=r)\n        no_points = mt.perc_thresh_n(connections)\n        data_dict[pval].append(no_points)", "_____no_output_____" ], [ "for pval in [0.8, 1, 1.2]:\n    plt.hist(data_dict[pval], cumulative=True, bins=100, label=pval, alpha=.3);\n    \nplt.legend()\nplt.gca().set(title='Number of Points for Connectedness', xlabel='Points', ylabel='Cumulative Frequency');\n# plt.savefig('img/PointsCumul.pdf')", "_____no_output_____" ], [ "data_dict_r = {}\nfor pval in [0.8, 1, 1.2]:\n    data_dict_r[pval] = []\n    n = 1000\n    r = 0.1\n    for i in range(1000):\n        print(i, end=',')\n        rand_points = np.random.uniform(size=(2, n-2))\n        edge_points = np.array([[0.0, 1.0],[0.0, 1.0]])\n        points = np.concatenate((rand_points, edge_points), axis=1)\n        r_min = mt.smallest_r(points, pval)\n        data_dict_r[pval].append(r_min[1])", "0," ], [ "fig, [ax1, ax2] = plt.subplots(ncols=2, figsize=(14, 5))\n\nfor pval in [0.8, 1, 1.2]:\n    ax1.hist(data_dict_r[pval], cumulative=True, bins=100, label=pval, alpha=.3);\n\nax1.legend()\nax1.set(xlabel='r', 
ylabel='Cumulative Frequency')\n# plt.savefig('img/RadCumul.pdf')\n# suptitle='Minimum r for Connectedness'", "_____no_output_____" ], [ "apprx_thresh = [0.065, 0.068, 0.08]\nps = [1.2, 1, 0.8]\n\nfor p, thresh, col in zip(ps, apprx_thresh, ['k', 'g', 'b']):\n    rs = np.arange(0.05, 0.14, 0.01)\n    ys = 1000*(mt.r1_area2D(p)*rs*rs)\n    plt.scatter(thresh, 1000*(mt.r1_area2D(p)*thresh*thresh), c=col)\n    plt.plot(rs, ys, c=col, alpha=0.6)\n    plt.axvline(x=thresh, c=col, ls='--', label=p, alpha=0.6)", "_____no_output_____" ], [ "n=10\nrand_points = np.random.uniform(size=(2, n-2))\nedge_points = np.array([[0.0, 1.0],[0.0, 1.0]])\npoints = np.concatenate((rand_points, edge_points), axis=1)", "_____no_output_____" ], [ "fig, (ax1, ax2) = plt.subplots(ncols=2, figsize=(10, 5))\nfor pval, col in zip([0.8, 1, 1.2], ['k', 'g', 'b']):\n    ax1.hist(data_dict_r[pval], bins=np.arange(0.05, 0.14, 0.0005), label=pval, alpha=.3, color=col, cumulative=1, histtype='step', lw=5)\n    hist_out = ax2.hist(data_dict_r[pval], bins=50, color=col, alpha=0.3, label=pval)\n    ys = hist_out[0]\n    xs = (hist_out[1][1:]+hist_out[1][:-1])/2\n    pt = thresh_calc(xs, ys, sig_fract=.8, n_av=5)[0]\n\n    ax1.axvline(x=pt, ls='--', alpha=0.6, c=col)\n    ax2.axvline(x=pt, ls='--', alpha=0.6, c=col)\n    \nax1.axhline(y=500, alpha=0.2, c='r')\n    \n# popt, pcov = curve_fit(skewed, xs, ys)\n# plt.plot(xs, skewed(xs, *popt))\nax1.set(xlim=(0.05, 0.12), xlabel='r', ylabel='Cumulative Frequency')\nax2.set(xlim=(0.05, 0.12), xlabel='r', ylabel='Frequency')\nax1.legend(loc='lower right')\nax2.legend()\nplt.savefig('img/r_perc.pdf')\n# plt.gca().set(title='Minimum r for Connectedness', xlabel='r', ylabel='Cumulative Frequency', xlim=(0.05, .1))", "_____no_output_____" ], [ "for pval in [0.8, 1, 1.2]:\n    hist_out = np.histogram(data_dict_r[pval], bins=50);\n    ys = hist_out[0]\n    xs = (hist_out[1][1:]+hist_out[1][:-1])/2\n#     popt, pcov = curve_fit(skewed, xs, ys)\n    plt.scatter(np.log(xs), np.log(ys))", "/anaconda3/lib/python3.7/site-packages/ipykernel_launcher.py:6: RuntimeWarning: divide by zero encountered in log\n  \n" ], [ "ys = hist_out[0]\nxs = (hist_out[1][1:]+hist_out[1][:-1])/2\npopt, pcov = curve_fit(skewed, xs, ys)\nplt.plot(xs, skewed(xs, *popt))", "_____no_output_____" ], [ "def skewed(x, a, b, c, d):\n#     (100*(xs-.06), 4, 50)\n    return d*sp.stats.skewnorm.pdf(a*x-b, c)\n    \npopt, pcov = curve_fit(skewed, xs, ys)\n\nhist_out = plt.hist(data_dict_r[pval], bins=50, label=pval, alpha=.3)\nplt.plot(xs, skewed(xs, *popt))\n# plt.plot(xs, skewed(xs, 100, 6, 4, 50))\n# plt.plot(xs, ys, label='Fit')\nplt.legend()\npopt", "_____no_output_____" ], [ "def moving_average(a, n=3):\n    ret = np.cumsum(np.array(a))\n    ret[n:] = ret[n:] - ret[:-n]\n    return ret[n - 1:] / n\n\ndef thresh_calc(data, sig_fract=.8, n_av=5, bins=50):\n    \n    hist_data = np.histogram(data, bins=bins)\n    \n    # bin centres, not bin edges, must be paired with the bin counts\n    xs, ys = (hist_data[1][1:]+hist_data[1][:-1])/2, hist_data[0]\n    smoothxs = (moving_average(xs, n=n_av))\n    smoothys = (moving_average(ys, n=n_av))\n    inds = np.where(smoothys > max(smoothys)*sig_fract)\n\n    vals, err = np.polyfit(smoothxs[inds], smoothys[inds], 2, cov=True)\n    \n    stat_point = -.5*vals[1]/vals[0]\n    fract_err = np.sqrt(err[0, 0]/(vals[0]**2) + err[1, 1]/(vals[1]**2))\n\n    return stat_point, fract_err*stat_point", "_____no_output_____" ], [ "apprx_thresh = [450, 500, 600]\nps = [1.2, 1, 0.8]\n\nfor p, thresh, col in zip(ps, apprx_thresh, ['k', 'g', 'b']):\n    xs = np.arange(1000)\n    ys = xs*(mt.r1_area2D(p)*.1*.1)\n    plt.scatter(thresh, thresh*(mt.r1_area2D(p)*.1*.1), c=col)\n    plt.plot(xs, ys, c=col, 
alpha=0.6)\n plt.axvline(x=thresh, c=col, ls='--', label=p, alpha=0.6)", "_____no_output_____" ], [ "def separate_perc_n(p, r, n_max=None):\n\n if n_max==None:\n n_max=int(4/(mt.r1_area2D(p)*r*r))\n print(n_max)\n\n rand_points = np.random.uniform(size=(2, n_max-2))\n edge_points = np.array([[0.0, 1.0],[0.0, 1.0]])\n points = np.concatenate((rand_points, edge_points), axis=1)\n\n connections = mt.get_connections(points, radius=r, pval=p)\n\n return mt.perc_thresh_n(connections)\n\ndef ensemble_perc_n(fileName, ps, rs, repeats=1, verbose=True):\n\n for p, r in zip(ps, rs):\n \n if verbose:\n print(f'p:{p}, r:{r}')\n \n for i in range(repeats):\n if verbose:\n print(i, end=' ')\n \n thresh = separate_perc_n(p, r)\n file1 = open(\"{}\".format(fileName),\"a\") \n file1.writelines(f'{p} - {r} - {thresh}\\n')\n file1.close()\n\n if verbose:\n print()\n \n return fileName", "_____no_output_____" ], [ "ensemble_perc_n('new_test.txt', [.8, 1.2, 2], [0.2, 0.1, 0.05], repeats=10)", "p:0.8, r:0.2\n0 258\n1 258\n2 258\n3 258\n4 258\n5 258\n6 258\n7 258\n8 258\n9 258\n\np:1.2, r:0.1\n0 680\n1 680\n2 680\n3 680\n4 680\n5 680\n6 680\n7 680\n8 680\n9 680\n\np:2, r:0.05\n0 2037\n1 2037\n2 2037\n3 2037\n4 2037\n5 2037\n6 2037\n7 2037\n8 2037\n9 2037\n\n" ], [ "pd.read_csv('new_test.txt', header=None, delimiter=\" - \")", "/anaconda3/lib/python3.7/site-packages/ipykernel_launcher.py:1: ParserWarning: Falling back to the 'python' engine because the 'c' engine does not support regex separators (separators > 1 char and different from '\\s+' are interpreted as regex); you can avoid this warning by specifying engine='python'.\n \"\"\"Entry point for launching an IPython kernel.\n" ], [ "p=.8\nr=0.05\n\n4/(mt.r1_area2D(p)*r*r)", "_____no_output_____" ], [ "pn1 = pd.read_csv('outputs/perc_n.txt', names=['p', 'r', 'n'], delimiter=\" - \")\npn1.tail()", "/anaconda3/lib/python3.7/site-packages/ipykernel_launcher.py:1: ParserWarning: Falling back to the 'python' engine because the 'c' engine does not support regex separators (separators > 1 char and different from '\\s+' are interpreted as regex); you can avoid this warning by specifying engine='python'.\n \"\"\"Entry point for launching an IPython kernel.\n" ], [ "pn1['edges'] = pn1['n']*pn1['r']*pn1['r']*mt.kernel_area2D(pn1['p'])", "_____no_output_____" ], [ "plt.hist(pn1[pn1['edges'] < 2.95]['edges'], bins=50, cumulative=1);\n# plt.hist(pn1['edges'], bins=50, cumulative=1);\nplt.gca().set(xlabel='Average Number Edges from Node', ylabel='Cumulative Frequency', );", "_____no_output_____" ], [ "plt.hist(pn1[pn1['edges'] < 2.95]['edges'], bins=50, cumulative=0)\nplt.gca().set(xlabel='Average Number Edges from Node', ylabel='Frequency', )", "_____no_output_____" ], [ "for bins in [50, 75, 100]:\n plt.plot(np.arange(0.5, 0.95, 0.01), [thresh_calc(pn1[pn1['edges'] < 2.95]['edges'], sig_fract=elem, bins=bins)[0] for elem in np.arange(0.5, 0.95, 0.01)], label=f'{bins} bins')\nplt.legend()\nplt.gca().set(xlabel='Fraction for bars to be considered', ylabel='Percolation Threshold', );", "_____no_output_____" ], [ "# #input file\n# fin = open('outputs/perc_r5000clean.txt', \"rt\")\n# #output file to write the result to\n# fout = open(\"outputs/perc_r5000clean2.txt\", \"wt\")\n# #for each line in the input file\n# for line in fin:\n# \t#read replace the string and write to output file\n# \tfout.write(line.replace('-[[', '- [['))\n# #close input and output files\n# fin.close()\n# fout.close()", "_____no_output_____" ], [ "pr1 = pd.read_csv('outputs/perc_r5000clean2.txt', 
names=['p', 'n', 'r', 'path'], delimiter=\" - \")\npr1['edges'] = pr1['n']*pr1['r']*pr1['r']*mt.kernel_area2D(pr1['p'])", "/anaconda3/lib/python3.7/site-packages/ipykernel_launcher.py:1: ParserWarning: Falling back to the 'python' engine because the 'c' engine does not support regex separators (separators > 1 char and different from '\\s+' are interpreted as regex); you can avoid this warning by specifying engine='python'.\n  \"\"\"Entry point for launching an IPython kernel.\n" ], [ "fig, ax = plt.subplots(figsize=(7, 7))\n# axins = ax.inset_axes([5, 8, 150, 250])\naxins = ax.inset_axes([0.5, 0.57, 0.5, 0.43])\n\nhist_data = axins.hist(pr1['edges'], bins=100, label='Raw Data')\n\naxins.legend(loc='upper right')\n\n\nn_av = 5\nsig_fract = .7\nplot_fract = 0.1\n\n# bin centres, not bin edges, must be paired with the bin counts\nxs, ys = (hist_data[1][1:]+hist_data[1][:-1])/2, hist_data[0]\nsmoothxs = (moving_average(xs, n=n_av))\nsmoothys = (moving_average(ys, n=n_av))\ninds = np.where(smoothys > max(smoothys)*sig_fract)\nnotinds = np.where(smoothys <= max(smoothys)*sig_fract)\n\n[a, b, c], err = np.polyfit(smoothxs[inds], smoothys[inds], 2, cov=True)\n# plt.plot(xs, vals[0]*xs*xs + vals[1]*xs + vals[2])\n# plotx = xs[inds]\n\nax.scatter(smoothxs[inds], smoothys[inds], c='b', alpha=0.5, label='Points in Fit')\nax.scatter(smoothxs[notinds], smoothys[notinds], c='k', alpha=0.2, label='Smoothed Points')\n\nplotx = smoothxs[inds]\nlowerlim = max(smoothys)*plot_fract\nquadx = np.arange((-b+np.sqrt(b*b - 4*a*(c-lowerlim)))/(2*a), (-b-np.sqrt(b*b - 4*a*(c-lowerlim)))/(2*a), 0.001)\nquady = a*quadx*quadx + b*quadx + c\nplotinds = np.where(quady > 0)\n\nax.axhline(max(smoothys)*sig_fract, color='r', alpha=0.5, ls='--', label=f'Fraction={sig_fract}')\nax.axvline(thresh_calc(pr1['edges'])[0], color='g', alpha=0.5, ls='--', label=f'Threshold')\nax.plot(quadx, quady, c='b', alpha=0.6, label='Quadratic Fit')\n\nax.legend(loc='best', bbox_to_anchor=(0.5, 0., 0.5, 0.45))\nax.set(xlabel='Average number of edges per node', ylabel='Frequency', title='Determining the Percolation Threshold');\nplt.savefig('img/percthreshn5000.pdf')", "_____no_output_____" ], [ "ss = np.arange(0.2, 0.85, 0.01)\nplt.plot(ss, [thresh_calc(pr1['edges'], sig_fract=s)[0] for s in ss])", "_____no_output_____" ], [ "r5000 = pd.read_csv('outputs/perc_r5000clean2.txt', names=['p', 'n', 'r', 'path'], delimiter=\" - \")\nr5000['e'] = mt.kernel_area2D(r5000['p'])*r5000['r']*r5000['r']*r5000['n']", "/anaconda3/lib/python3.7/site-packages/ipykernel_launcher.py:1: ParserWarning: Falling back to the 'python' engine because the 'c' engine does not support regex separators (separators > 1 char and different from '\\s+' are interpreted as regex); you can avoid this warning by specifying engine='python'.\n  \"\"\"Entry point for launching an IPython kernel.\n" ], [ "ps = [0.4, 0.6, 0.8, 1.0]\nthreshs = [thresh_calc(r5000[np.abs(r5000['p']-p) < 0.01]['e'], sig_fract=0.6)[0] for p in ps]", "_____no_output_____" ], [ "plt.plot(ps, threshs)", "_____no_output_____" ], [ "r5000[np.abs(r5000['p']-1) < .01]", "_____no_output_____" ], [ "thresh_calc(r5000[np.abs(r5000['p']-p) < 0.01]['e'])[0]", "_____no_output_____" ], [ "thresh_calc(r5000[np.abs(r5000['p']-0.6) < .1]['e'], sig_fract=.6)", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d02816856d1dda0c5693e22c70f12a2f228d42ca
174,448
ipynb
Jupyter Notebook
Prace_domowe/Praca_domowa3/Grupa1/EljasiakBartlomiej/pd3.ipynb
niladrem/2020L-WUM
ddccedd900e41de196612c517227e1348c7195df
[ "Apache-2.0" ]
null
null
null
Prace_domowe/Praca_domowa3/Grupa1/EljasiakBartlomiej/pd3.ipynb
niladrem/2020L-WUM
ddccedd900e41de196612c517227e1348c7195df
[ "Apache-2.0" ]
null
null
null
Prace_domowe/Praca_domowa3/Grupa1/EljasiakBartlomiej/pd3.ipynb
niladrem/2020L-WUM
ddccedd900e41de196612c517227e1348c7195df
[ "Apache-2.0" ]
1
2020-06-01T23:23:16.000Z
2020-06-01T23:23:16.000Z
57.36534
64,072
0.686187
[ [ [ "# Homework 3", "_____no_output_____" ], [ "## Loading the basic packages", "_____no_output_____" ] ], [ [ "import pandas as pd\nimport numpy as np\nimport sklearn\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import f1_score\nfrom sklearn.model_selection import StratifiedKFold # used in crossvalidation\nfrom sklearn.model_selection import KFold\n\nimport IPython\nfrom time import time", "_____no_output_____" ] ], [ [ "## A short introduction", "_____no_output_____" ], [ "The goal of this task is to inspect and train 3 different models on Australian meteorological data. An equally important goal is to review and change the so-called hyperparameters of each of them. ", "_____no_output_____" ], [ "### Loading the data", "_____no_output_____" ] ], [ [ "data = pd.read_csv(\"../../australia.csv\")", "_____no_output_____" ] ], [ [ "### A first look at the data", "_____no_output_____" ] ], [ [ "data.info()", "<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 56420 entries, 0 to 56419\nData columns (total 18 columns):\nMinTemp          56420 non-null float64\nMaxTemp          56420 non-null float64\nRainfall         56420 non-null float64\nEvaporation      56420 non-null float64\nSunshine         56420 non-null float64\nWindGustSpeed    56420 non-null float64\nWindSpeed9am     56420 non-null float64\nWindSpeed3pm     56420 non-null float64\nHumidity9am      56420 non-null float64\nHumidity3pm      56420 non-null float64\nPressure9am      56420 non-null float64\nPressure3pm      56420 non-null float64\nCloud9am         56420 non-null float64\nCloud3pm         56420 non-null float64\nTemp9am          56420 non-null float64\nTemp3pm          56420 non-null float64\nRainToday        56420 non-null int64\nRainTomorrow     56420 non-null int64\ndtypes: float64(16), int64(2)\nmemory usage: 7.7 MB\n" ] ], [ [ "There are no missing values in the data, and it is prepared perfectly for machine learning. Let us still take a look at what the frame looks like.", "_____no_output_____" ] ], [ [ "data.head()", "_____no_output_____" ] ], [ [ "## Random Forest", "_____no_output_____" ], [ "**Loading the required libraries** ", "_____no_output_____" ] ], [ [ "from sklearn.ensemble import RandomForestClassifier", "_____no_output_____" ] ], [ [ "**Initializing the model**", "_____no_output_____" ] ], [ [ "rf_default = RandomForestClassifier()", "_____no_output_____" ] ], [ [ "**Hyperparameters**", "_____no_output_____" ] ], [ [ "params = rf_default.get_params()\nparams", "_____no_output_____" ] ], [ [ "**Changing a few hyperparameters**", "_____no_output_____" ] ], [ [ "params['n_estimators']=150\nparams['max_depth']=6\nparams['min_samples_leaf']=4\nparams['n_jobs']=4\nparams['random_state']=0", "_____no_output_____" ], [ "rf_modified = RandomForestClassifier()\nrf_modified.set_params(**params)", "_____no_output_____" ] ], [ [ "## Extreme Gradient Boosting", "_____no_output_____" ], [ "**Loading the required libraries** ", "_____no_output_____" ] ], [ [ "from xgboost import XGBClassifier", "_____no_output_____" ] ], [ [ "**Initializing the model**", "_____no_output_____" ] ], [ [ "xgb_default = XGBClassifier()", "_____no_output_____" ] ], [ [ "**Hyperparameters**", "_____no_output_____" ] ], [ [ "params = xgb_default.get_params()\nparams", "_____no_output_____" ] ], [ [ "**Changing a few hyperparameters**", "_____no_output_____" ] ], [ [ "params['n_estimators']=150\nparams['max_depth']=6\nparams['n_jobs']=4\nparams['random_state']=0", "_____no_output_____" ], [ "xgb_modified = XGBClassifier()\nxgb_modified.set_params(**params)", "_____no_output_____" ] ], [ [ "## Support Vector Machines", "_____no_output_____" ], [ "**Loading the required libraries** ", "_____no_output_____" ] ], [ [ "from sklearn.svm import SVC", "_____no_output_____" ] ], [ [ "**Initializing the model**", "_____no_output_____" ] ], [ [ "svc_default = SVC()", "_____no_output_____" ] ], [ [ "**Hyperparameters**", "_____no_output_____" ] ], [ [ "params = svc_default.get_params()\nparams", "_____no_output_____" ] ], [ [ "**Changing a few hyperparameters**", "_____no_output_____" ] ], [ [ "params['degree']=3\nparams['tol']=0.001\nparams['random_state']=0", "_____no_output_____" ], [ "svc_modified = SVC()\nsvc_modified.set_params(**params)", "_____no_output_____" ] ], [ [ "## Comment\nAt this point we have obtained 3 models with modified hyperparameters, together with their default counterparts. Let us now see how the results achieved by these models have changed and, although it was not the goal of this task, whether we perhaps managed to improve any of them.", "_____no_output_____" ], [ "## Comparison", "_____no_output_____" ], [ "**Loading the required libraries** ", "_____no_output_____" ] ], [ [ "from sklearn.metrics import f1_score\nfrom sklearn.metrics import accuracy_score\nfrom sklearn.metrics import balanced_accuracy_score\nfrom sklearn.metrics import precision_score\nfrom sklearn.metrics import average_precision_score\nfrom sklearn.metrics import roc_auc_score", "_____no_output_____" ] ], [ [ "### Helper functions", "_____no_output_____" ] ], [ [ "def cv_classifier(classifier,kfolds = 10, X = data.drop(\"RainTomorrow\", axis = 1), y = data.RainTomorrow):\n    start_time = time()\n    \n    scores ={}\n    scores[\"f1\"]=[]\n    scores[\"accuracy\"]=[]\n    scores[\"balanced_accuracy\"]=[]\n    scores[\"precision\"]=[]\n    scores[\"average_precision\"]=[]\n    scores[\"roc_auc\"]=[]\n    \n    # Hardcoded crossvalidation method, could be parameterized\n    cv= StratifiedKFold(n_splits=kfolds,shuffle=True,random_state=0)\n    \n    for i, (train, test) in enumerate(cv.split(X, y)):\n        \n        IPython.display.clear_output()\n        print(f\"Model {i+1}/{kfolds}\")\n        \n        # Training model\n        classifier.fit(X.iloc[train, ], y.iloc[train], )\n        \n        # Testing model\n        prediction = classifier.predict(X.iloc[test,])\n        \n        # calculating and saving scores\n        scores[\"f1\"].append( f1_score(y.iloc[test],prediction))\n        scores[\"accuracy\"].append( accuracy_score(y.iloc[test],prediction))\n        scores[\"balanced_accuracy\"].append( balanced_accuracy_score(y.iloc[test],prediction))\n        scores[\"precision\"].append( precision_score(y.iloc[test],prediction))\n        scores[\"average_precision\"].append( average_precision_score(y.iloc[test],prediction))\n        scores[\"roc_auc\"].append( roc_auc_score(y.iloc[test],prediction))\n    \n    IPython.display.clear_output()\n    print(f\"Crossvalidation on {kfolds} folds done in {round((time()-start_time),2)}s\")\n    \n    return scores", "_____no_output_____" ], [ "def get_mean_scores(scores_dict):\n    means={}\n    for score_name in scores_dict:\n        means[score_name] = np.mean(scores_dict[score_name])\n    return means", "_____no_output_____" ], [ "def print_mean_scores(mean_scores_dict,precision=4):\n    for score_name in mean_scores_dict:\n        print(f\"Mean {score_name} score is {round(mean_scores_dict[score_name]*100,precision)}%\")", "_____no_output_____" ] ], 
[ [ "### Results\n\nBelow I present the prediction results of the models shown earlier. For contrast, I trained both the modified versions of the classifiers and the default ones. I must sadly admit that I am not the best at guessing, because the parameters I picked noticeably worsen the performance of every model. Nevertheless, to establish that I had to rely on certain metrics. These are:\n* F1 \n* Accuracy\n* Balanced Accuracy\n* Precision\n* Average Precision\n* ROC AUC\n\nAll models were subjected to 10-fold cross-validation, so the reported results are means. Cross-validation allows a more accurate assessment of a model's performance and lets us extract information such as the standard deviation of the scores, which makes it possible to discuss how the model behaves in extreme cases. ", "_____no_output_____" ], [ "### Random Forest", "_____no_output_____" ], [ "### Cross-validation of the models", "_____no_output_____" ] ], [ [ "scores_rf_default = cv_classifier(rf_default)", "Crossvalidation on 10 folds done in 169.77s\n" ], [ "scores_rf_modified = cv_classifier(rf_modified)", "Crossvalidation on 10 folds done in 35.2s\n" ], [ "mean_scores_rf_default = get_mean_scores(scores_rf_default)\nmean_scores_rf_modified = get_mean_scores(scores_rf_modified)", "_____no_output_____" ] ], [ [ "**Random forest default**", "_____no_output_____" ] ], [ [ "print_mean_scores(mean_scores_rf_default,precision=2)", "Mean f1 score is 62.85%\nMean accuracy score is 86.09%\nMean balanced_accuracy score is 74.38%\nMean precision score is 76.32%\nMean average_precision score is 51.05%\nMean roc_auc score is 74.38%\n" ] ], [ [ "**Random forest modified**", "_____no_output_____" ] ], [ [ "print_mean_scores(mean_scores_rf_modified,precision=2)", "Mean f1 score is 56.74%\nMean accuracy score is 84.97%\nMean balanced_accuracy score is 70.54%\nMean precision score is 77.55%\nMean average_precision score is 46.88%\nMean roc_auc score is 70.54%\n" ] ], [ [ "## Extreme Gradient Boosting", "_____no_output_____" ], [ "### Cross-validation of the models", "_____no_output_____" ] ], [ [ "scores_xgb_default = cv_classifier(xgb_default)", "Crossvalidation on 10 folds done in 41.98s\n" ], [ "scores_xgb_modified = cv_classifier(xgb_modified)", "Crossvalidation on 10 folds done in 67.69s\n" ], [ "mean_scores_xgb_default = get_mean_scores(scores_xgb_default)\nmean_scores_xgb_modified = get_mean_scores(scores_xgb_modified)", "_____no_output_____" ] ], [ [ "**XGBoost default**", "_____no_output_____" ] ], [ [ "print_mean_scores(mean_scores_xgb_default,precision=2)", "Mean f1 score is 63.79%\nMean accuracy score is 85.92%\nMean balanced_accuracy score is 75.3%\nMean precision score is 73.56%\nMean average_precision score is 51.06%\nMean roc_auc score is 75.3%\n" ] ], [ [ "**XGBoost modified**", "_____no_output_____" ] ], [ [ "print_mean_scores(mean_scores_xgb_modified,precision=2)", "Mean f1 score is 63.93%\nMean accuracy score is 85.89%\nMean balanced_accuracy score is 75.44%\nMean precision score is 73.18%\nMean average_precision score is 51.07%\nMean roc_auc score is 75.44%\n" ] ], [ [ "## Support Vector Machines", "_____no_output_____" ], [ "### Cross-validation of the models", "_____no_output_____" ], [ "**Warning: this takes a while**", "_____no_output_____" ] ], [ [ "scores_svc_default = cv_classifier(svc_default)", "Crossvalidation on 10 folds done in 923.06s\n" ], [ "scores_svc_modified = cv_classifier(svc_modified)", "Crossvalidation on 10 folds done in 864.31s\n" ], [ "mean_scores_svc_default = get_mean_scores(scores_svc_default)\nmean_scores_svc_modified = get_mean_scores(scores_svc_modified)", "_____no_output_____" ] ], [ [ "**SVM default**", "_____no_output_____" ] ], [ [ "print_mean_scores(mean_scores_svc_default,precision=2)", "Mean f1 score is 51.52%\nMean accuracy score is 84.38%\nMean balanced_accuracy score is 67.63%\nMean precision score is 81.47%\nMean average_precision score is 44.44%\nMean roc_auc score is 67.63%\n" ] ], [ [ "**SVM modified**", "_____no_output_____" ] ], [ [ "print_mean_scores(mean_scores_svc_modified,precision=2)", "Mean f1 score is 51.52%\nMean accuracy score is 84.38%\nMean balanced_accuracy score is 67.63%\nMean precision score is 81.47%\nMean average_precision score is 44.44%\nMean roc_auc score is 67.63%\n" ] ], [ [ "## Summary", "_____no_output_____" ], [ "The results of random forest and xgboost were fairly similar and, frankly speaking, fairly weak. SVM did even worse, which will probably surprise no one: it has a terribly long training time, over a minute per model, and fares much worse than the other algorithms, where 10 xgboost models were trained in 41s. If I had to pick one of these three models for further tuning, I would certainly choose xgboost. Among other reasons, because training and testing would take much less time than with random forest, and with suitable parameters xgboost would probably outperform random forest. \n\nChoosing the best metric, however, is not so simple; I would even dare to say that I did not find one that deserves that title. Most metrics represent some property of the model in a non-trivial way. If I had to restrict myself to a single one, I would probably pick ROC AUC: by using the True Positive Rate and the False Positive Rate it is quite intuitive (unlike many others), and at the same time it explains model performance well.", "_____no_output_____" ] ], 
[ [ "# Bonus part - Regression", "_____no_output_____" ], [ "### Preparing the data", "_____no_output_____" ] ], [ [ "data2 = pd.read_csv('allegro-api-transactions.csv')\ndata2 = data2.drop(['lp','date'], axis = 1)\ndata2.head()", "_____no_output_____" ] ], [ [ "The data is almost ready for the training process; we only need to fix `it_location`, where near-duplicates such as *Warszawa* and *warszawa* may appear, and then encode the categorical variables.", "_____no_output_____" ] ], [ [ "data2.it_location = data2.it_location.str.lower()\ndata2.head()", "_____no_output_____" ], [ "encoding_columns = ['categories','seller','it_location','main_category']", "_____no_output_____" ] ], [ [ "## Encoding the categorical variables ", "_____no_output_____" ] ], [ [ "import category_encoders\nfrom sklearn.preprocessing import OneHotEncoder", "_____no_output_____" ] ], [ [ "### Data split \nI will not perform the standard train/test split, because later in this document I will use cross-validation to assess the effectiveness of the encodings. I want to point out that the best method here would probably be to expand the `categories` column into 26 binary columns, but that would significantly increase the size of the data. For exactly the same reason I will not use one-hot encoding, and will instead rely on encodings that do not increase the size of the data.", "_____no_output_____" ] ], 
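[ [ "To make the size argument concrete, the number of binary columns that one-hot encoding would add can be checked directly (each unique value of a categorical feature becomes one column):", "_____no_output_____" ] ], [ [ "# number of one-hot columns each categorical feature would produce\ndata2[encoding_columns].nunique()", "_____no_output_____" ] ], 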
[ [ "X = data2.drop('price', axis = 1)\ny = data2.price", "_____no_output_____" ] ], [ [ "## Target encoding", "_____no_output_____" ] ], [ [ "te = category_encoders.target_encoder.TargetEncoder(data2, cols = encoding_columns)\ntarget_encoded = te.fit_transform(X,y)\ntarget_encoded", "_____no_output_____" ] ], [ [ "## James-Stein Encoding", "_____no_output_____" ] ], [ [ "js = category_encoders.james_stein.JamesSteinEncoder(cols = encoding_columns)\nencoded_js = js.fit_transform(X,y)\nencoded_js", "_____no_output_____" ] ], [ [ "## Cat Boost Encoding", "_____no_output_____" ] ], [ [ "cb = category_encoders.cat_boost.CatBoostEncoder(cols = encoding_columns)\nencoded_cb = cb.fit_transform(X,y)\nencoded_cb", "_____no_output_____" ] ], [ [ "## Testing", "_____no_output_____" ] ], [ [ "from sklearn.metrics import r2_score, mean_squared_error\nfrom sklearn import linear_model", "_____no_output_____" ], [ "def cv_encoding(model,kfolds = 10, X = data.drop(\"RainTomorrow\", axis = 1), y = data.RainTomorrow):\n    start_time = time()\n    \n    scores ={}\n    scores[\"r2_score\"] = []\n    scores['RMSE'] = []\n    \n    # Standard k-fold\n    cv = KFold(n_splits=kfolds,shuffle=False,random_state=0)\n    \n    for i, (train, test) in enumerate(cv.split(X, y)):\n        \n        IPython.display.clear_output()\n        print(f\"Model {i+1}/{kfolds}\")\n        \n        # Training model\n        model.fit(X.iloc[train, ], y.iloc[train], )\n        \n        # Testing model\n        prediction = model.predict(X.iloc[test,])\n        \n        # calculating and saving scores (RMSE is the square root of the MSE)\n        scores['r2_score'].append( r2_score(y.iloc[test],prediction))\n        scores['RMSE'].append( np.sqrt(mean_squared_error(y.iloc[test],prediction)))\n    \n    \n    IPython.display.clear_output()\n    print(f\"Crossvalidation on {kfolds} folds done in {round((time()-start_time),2)}s\")\n    \n    return scores", "_____no_output_____" ] ], [ [ "## Measuring the effectiveness of the encodings\nI decided to use the `Lasso` linear regression model. Initially I wanted to use `Elastic Net`, but as it turned out the variables are not particularly related to one another, and that was meant to be the main reason for using it.", "_____no_output_____" ] ], [ [ "corr=data2.corr()\nfig, ax=plt.subplots(figsize=(9,6)) \nax=sns.heatmap(corr, xticklabels=corr.columns, yticklabels=corr.columns, annot=True, cmap=\"PiYG\", center=0, vmin=-1, vmax=1)\nax.set_title('Variable correlations')\nplt.show();", "_____no_output_____" ] ], [ [ "### Choice of the linear model\nI define it here because I will use it repeatedly during cross-validation in the later parts of this document.", "_____no_output_____" ] ], [ [ "lasso = linear_model.Lasso()", "_____no_output_____" ] ], [ [ "## Target encoding results", "_____no_output_____" ] ], [ [ "target_encoding_scores = cv_encoding(model = lasso,kfolds=20, X = target_encoded, y = y)", "Crossvalidation on 20 folds done in 8.57s\n" ], [ "target_encoding_scores_mean = get_mean_scores(target_encoding_scores)\ntarget_encoding_scores_mean", "_____no_output_____" ] ], [ [ "## James-Stein Encoding results", "_____no_output_____" ] ], [ [ "js_encoding_scores = cv_encoding(lasso, 20, encoded_js, y)", "Crossvalidation on 20 folds done in 24.17s\n" ], [ "js_encoding_scores_mean = get_mean_scores(js_encoding_scores)\njs_encoding_scores_mean", "_____no_output_____" ] ], [ [ "## Cat Boost Encoding results", "_____no_output_____" ] ], [ [ "cb_encoding_scores = cv_encoding(lasso, 20, encoded_cb, y)", "Crossvalidation on 20 folds done in 11.04s\n" ], [ "cb_encoding_scores_mean = get_mean_scores(cb_encoding_scores)\ncb_encoding_scores_mean", "_____no_output_____" ] ], [ [ "## Comparison", "_____no_output_____" ], [ "## R2 metric results", "_____no_output_____" ] ], [ [ "r2_data = [target_encoding_scores[\"r2_score\"], js_encoding_scores[\"r2_score\"], cb_encoding_scores[\"r2_score\"]]\nlabels = [\"Target\", \"James-Stein\", \"Cat Boost\"]\nfig, ax = plt.subplots(figsize = (12,9))\nax.set_title('R2 scores')\nax.boxplot(r2_data, labels = labels)\nplt.show()", "_____no_output_____" ] ], [ [ "**Comment** \n\nWe can see that using the James-Stein encoding allowed the model to fit the data much better; however, this raises a potential problem of overfitting. It would be worth checking whether this encoding does not lead to much stronger overfitting. ", "_____no_output_____" ], [ "## RMSE metric results", "_____no_output_____" ] ], [ [ "rmse_data = [target_encoding_scores[\"RMSE\"], js_encoding_scores[\"RMSE\"], cb_encoding_scores[\"RMSE\"]]\nlabels = [\"Target\", \"James-Stein\", \"Cat Boost\"]\nfig, ax = plt.subplots(figsize = (12,9))\nax.set_title('RMSE scores on a logarithmic scale')\nax.set_yscale('log')\nax.boxplot(rmse_data, labels = labels)\nplt.show()", "_____no_output_____" ] ], [ [ "**Comment**\n\nThe James-Stein encoding performed best, which is no surprise since R2 had already indicated a better model fit. ", "_____no_output_____" ], [ "## Summary", "_____no_output_____" ], [ "The James-Stein encoding achieves much better results than the other two encodings I selected. Provided that there is no overfitting in this case, it is certainly the encoding I would pick from this group. It is worth considering one-hot encoding as well, which seems very natural here but entails a severalfold increase in the size of the data.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ] ]
d0282d05845ae8f98d6d4a69a5f04f48d963139f
8,309
ipynb
Jupyter Notebook
graphs_trees/tree_dfs/dfs_challenge.ipynb
hanbf/interactive-coding-challenges
1676ac16c987e35eeb4be6ab57a3c10ed9b71b8b
[ "Apache-2.0" ]
null
null
null
graphs_trees/tree_dfs/dfs_challenge.ipynb
hanbf/interactive-coding-challenges
1676ac16c987e35eeb4be6ab57a3c10ed9b71b8b
[ "Apache-2.0" ]
null
null
null
graphs_trees/tree_dfs/dfs_challenge.ipynb
hanbf/interactive-coding-challenges
1676ac16c987e35eeb4be6ab57a3c10ed9b71b8b
[ "Apache-2.0" ]
null
null
null
27.976431
282
0.48297
[ [ [ "This notebook was prepared by [Donne Martin](https://github.com/donnemartin). Source and license info is on [GitHub](https://github.com/donnemartin/interactive-coding-challenges).", "_____no_output_____" ], [ "# Challenge Notebook", "_____no_output_____" ], [ "## Problem: Implement depth-first traversals (in-order, pre-order, post-order) on a binary tree.\n\n* [Constraints](#Constraints)\n* [Test Cases](#Test-Cases)\n* [Algorithm](#Algorithm)\n* [Code](#Code)\n* [Unit Test](#Unit-Test)", "_____no_output_____" ], [ "## Constraints\n\n* Can we assume we already have a Node class with an insert method?\n * Yes\n* What should we do with each node when we process it?\n * Call an input method `visit_func` on the node\n* Can we assume this fits in memory?\n * Yes", "_____no_output_____" ], [ "## Test Cases\n\n### In-Order Traversal\n\n* 5, 2, 8, 1, 3 -> 1, 2, 3, 5, 8\n* 1, 2, 3, 4, 5 -> 1, 2, 3, 4, 5\n\n### Pre-Order Traversal\n\n* 5, 2, 8, 1, 3 -> 5, 2, 1, 3, 8\n* 1, 2, 3, 4, 5 -> 1, 2, 3, 4, 5\n\n### Post-Order Traversal\n\n* 5, 2, 8, 1, 3 -> 1, 3, 2, 8, 5\n* 1, 2, 3, 4, 5 -> 5, 4, 3, 2, 1", "_____no_output_____" ], [ "## Algorithm\n\nRefer to the [Solution Notebook](http://nbviewer.ipython.org/github/donnemartin/interactive-coding-challenges/blob/master/graphs_trees/tree_dfs/dfs_solution.ipynb). If you are stuck and need a hint, the solution notebook's algorithm discussion might be a good place to start.", "_____no_output_____" ], [ "## Code", "_____no_output_____" ] ], [ [ "# %load ../bst/bst.py\nclass Node(object):\n\n def __init__(self, data):\n self.data = data\n self.left = None\n self.right = None\n self.parent = None\n\n def __repr__(self):\n return str(self.data)\n\n\nclass Bst(object):\n\n def __init__(self, root=None):\n self.root = root\n\n def insert(self, data):\n if data is None:\n raise TypeError('data cannot be None')\n if self.root is None:\n self.root = Node(data)\n return self.root\n else:\n return self._insert(self.root, data)\n\n def _insert(self, node, data):\n if node is None:\n return Node(data)\n if data <= node.data:\n if node.left is None:\n node.left = self._insert(node.left, data)\n node.left.parent = node\n return node.left\n else:\n return self._insert(node.left, data)\n else:\n if node.right is None:\n node.right = self._insert(node.right, data)\n node.right.parent = node\n return node.right\n else:\n return self._insert(node.right, data)", "_____no_output_____" ], [ "class BstDfs(Bst):\n\n def in_order_traversal(self, node, visit_func):\n if node is None:\n return\n \n self.in_order_traversal(node.left, visit_func)\n \n visit_func(node)\n \n self.in_order_traversal(node.right, visit_func)\n\n def pre_order_traversal(self, node, visit_func):\n if node is None:\n return\n \n visit_func(node)\n\n self.pre_order_traversal(node.left, visit_func)\n \n self.pre_order_traversal(node.right, visit_func)\n\n def post_order_traversal(self,node, visit_func):\n if node is None:\n return\n \n self.post_order_traversal(node.left, visit_func)\n \n self.post_order_traversal(node.right, visit_func)\n\n visit_func(node)\n ", "_____no_output_____" ] ], [ [ "## Unit Test", "_____no_output_____" ] ], [ [ "%run ../utils/results.py", "_____no_output_____" ], [ "# %load test_dfs.py\nfrom nose.tools import assert_equal\n\n\nclass TestDfs(object):\n\n def __init__(self):\n self.results = Results()\n\n def test_dfs(self):\n bst = BstDfs(Node(5))\n bst.insert(2)\n bst.insert(8)\n bst.insert(1)\n bst.insert(3)\n\n bst.in_order_traversal(bst.root, self.results.add_result)\n 
assert_equal(str(self.results), \"[1, 2, 3, 5, 8]\")\n self.results.clear_results()\n\n bst.pre_order_traversal(bst.root, self.results.add_result)\n assert_equal(str(self.results), \"[5, 2, 1, 3, 8]\")\n self.results.clear_results()\n\n bst.post_order_traversal(bst.root, self.results.add_result)\n assert_equal(str(self.results), \"[1, 3, 2, 8, 5]\")\n self.results.clear_results()\n\n bst = BstDfs(Node(1))\n bst.insert(2)\n bst.insert(3)\n bst.insert(4)\n bst.insert(5)\n\n bst.in_order_traversal(bst.root, self.results.add_result)\n assert_equal(str(self.results), \"[1, 2, 3, 4, 5]\")\n self.results.clear_results()\n\n bst.pre_order_traversal(bst.root, self.results.add_result)\n assert_equal(str(self.results), \"[1, 2, 3, 4, 5]\")\n self.results.clear_results()\n\n bst.post_order_traversal(bst.root, self.results.add_result)\n assert_equal(str(self.results), \"[5, 4, 3, 2, 1]\")\n\n print('Success: test_dfs')\n\n\ndef main():\n test = TestDfs()\n test.test_dfs()\n\n\nif __name__ == '__main__':\n main()", "Success: test_dfs\n" ] ], [ [ "## Solution Notebook\n\nReview the [Solution Notebook](http://nbviewer.ipython.org/github/donnemartin/interactive-coding-challenges/blob/master/graphs_trees/tree_dfs/dfs_solution.ipynb) for a discussion on algorithms and code solutions.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ] ]
d0282f93ebb8e50de46e6e988c967cc9bc3c6d35
220,983
ipynb
Jupyter Notebook
chapter14_generative-adversarial-networks/gan-intro.ipynb
vishaalkapoor/mxnet-the-straight-dope
8b69042bae8bb9ab2fa73a357ab4cc0111a9b92e
[ "Apache-2.0" ]
2,796
2017-07-12T06:23:19.000Z
2022-02-19T16:38:09.000Z
chapter14_generative-adversarial-networks/gan-intro.ipynb
m2rik/mxnet-the-straight-dope
b524c70401e9fb62cb2af411cee3abe2e344bace
[ "Apache-2.0" ]
337
2017-07-12T17:07:41.000Z
2020-10-15T20:19:17.000Z
chapter14_generative-adversarial-networks/gan-intro.ipynb
m2rik/mxnet-the-straight-dope
b524c70401e9fb62cb2af411cee3abe2e344bace
[ "Apache-2.0" ]
867
2017-07-13T03:59:31.000Z
2022-03-18T15:01:55.000Z
390.429329
18,106
0.922981
[ [ [ "# Generative Adversarial Networks\n\n\nThroughout most of this book, we've talked about how to make predictions.\nIn some form or another, we used deep neural networks learned mappings from data points to labels.\nThis kind of learning is called discriminative learning,\nas in, we'd like to be able to discriminate between photos cats and photos of dogs. \nClassifiers and regressors are both examples of discriminative learning. \nAnd neural networks trained by backpropagation \nhave upended everything we thought we knew about discriminative learning \non large complicated datasets. \nClassification accuracies on high-res images has gone from useless \nto human-level (with some caveats) in just 5-6 years. \nWe'll spare you another spiel about all the other discriminative tasks \nwhere deep neural networks do astoundingly well.\n\nBut there's more to machine learning than just solving discriminative tasks.\nFor example, given a large dataset, without any labels,\nwe might want to learn a model that concisely captures the characteristics of this data.\nGiven such a model, we could sample synthetic data points that resemble the distribution of the training data.\nFor example, given a large corpus of photographs of faces,\nwe might want to be able to generate a *new* photorealistic image \nthat looks like it might plausibly have come from the same dataset. \nThis kind of learning is called *generative modeling*. \n\nUntil recently, we had no method that could synthesize novel photorealistic images. \nBut the success of deep neural networks for discriminative learning opened up new possiblities.\nOne big trend over the last three years has been the application of discriminative deep nets\nto overcome challenges in problems that we don't generally think of as supervised learning problems.\nThe recurrent neural network language models are one example of using a discriminative network (trained to predict the next character)\nthat once trained can act as a generative model. \n\n\nIn 2014, a young researcher named Ian Goodfellow introduced [Generative Adversarial Networks (GANs)](https://arxiv.org/abs/1406.2661) a clever new way to leverage the power of discriminative models to get good generative models. \nGANs made quite a splash so it's quite likely you've seen the images before. \nFor instance, using a GAN you can create fake images of bedrooms, as done by [Radford et al. in 2015](https://arxiv.org/pdf/1511.06434.pdf) and depicted below. \n\n![](../img/fake_bedrooms.png)\n\nAt their heart, GANs rely on the idea that a data generator is good\nif we cannot tell fake data apart from real data. \nIn statistics, this is called a two-sample test - a test to answer the question whether datasets $X = \\{x_1, \\ldots x_n\\}$ and $X' = \\{x_1', \\ldots x_n'\\}$ were drawn from the same distribution. \nThe main difference between most statistics papers and GANs is that the latter use this idea in a constructive way.\nIn other words, rather than just training a model to say 'hey, these two datasets don't look like they came from the same distribution', they use the two-sample test to provide training signal to a generative model.\nThis allows us to improve the data generator until it generates something that resembles the real data. \nAt the very least, it needs to fool the classifier. 
This holds even if our classifier is a state-of-the-art deep neural network.\n\nAs you can see, there are two pieces to GANs - first off, we need a device (say, a deep network but it really could be anything, such as a game rendering engine) that might potentially be able to generate data that looks just like the real thing. \nIf we are dealing with images, this needs to generate images. \nIf we're dealing with speech, it needs to generate audio sequences, and so on. \nWe call this the *generator network*. The second component is the *discriminator network*. \nIt attempts to distinguish fake and real data from each other. \nBoth networks are in competition with each other. \nThe generator network attempts to fool the discriminator network. At that point, the discriminator network adapts to the new fake data. This information, in turn, is used to improve the generator network, and so on. \n\n**Generator**\n* Draw some latent variable $z$ from a source of randomness, e.g. a normal distribution $z \\sim \\mathcal{N}(0,1)$.\n* Apply a function $G$ such that we get $x' = G(z,w)$\n* Compute the gradient with respect to $w$ to minimize $\\log p(y = \\mathrm{fake}|x')$ \n\n**Discriminator**\n* Improve the accuracy of a binary classifier $f$, i.e. maximize $\\log p(y=\\mathrm{fake}|x')$ and $\\log p(y=\\mathrm{true}|x)$ for fake and real data respectively.\n\n\n![](../img/simple-gan.png)\n\nIn short, there are two optimization problems running simultaneously, and the optimization terminates if a stalemate has been reached. There are lots of further tricks and details on how to modify this basic setting. For instance, we could try solving this problem in the presence of side information. This leads to cGAN, i.e. conditional Generative Adversarial Networks. We can also change how we detect whether real and fake data look the same. This leads to wGAN (Wasserstein GAN), kernel-inspired GANs and lots of other settings, or we could change how closely we look at the objects. E.g. fake images might look real at the texture level but not so at the larger level, or vice versa. \n\nMany of the applications are in the context of images. Since this takes too much time to solve in a Jupyter notebook on a laptop, we're going to content ourselves with fitting a much simpler distribution. We will illustrate what happens if we use GANs to build the world's most inefficient estimator of parameters for a Gaussian. Let's get started.\n", "_____no_output_____" ] ], [ [ "from __future__ import print_function\nimport matplotlib as mpl\nfrom matplotlib import pyplot as plt\nimport mxnet as mx\nfrom mxnet import gluon, autograd, nd\nfrom mxnet.gluon import nn\nimport numpy as np\n\nctx = mx.cpu()", "_____no_output_____" ] ], [ [ "## Generate some 'real' data\n\nSince this is going to be the world's lamest example, we simply generate data drawn from a Gaussian. And let's also set a context where we'll do most of the computation.", "_____no_output_____" ] ], [ [ "X = nd.random_normal(shape=(1000, 2))\nA = nd.array([[1, 2], [-0.1, 0.5]])\nb = nd.array([1, 2])\nX = nd.dot(X, A) + b\nY = nd.ones(shape=(1000, 1))\n\n# and stick them into an iterator\nbatch_size = 4\ntrain_data = mx.io.NDArrayIter(X, Y, batch_size, shuffle=True)", "_____no_output_____" ] ], [ [ "Let's see what we got. 
This should be a Gaussian shifted in some rather arbitrary way with mean $b$ and covariance matrix $A^\\top A$.", "_____no_output_____" ] ], [ [ "plt.scatter(X[:,0].asnumpy(), X[:,1].asnumpy())\nplt.show()\nprint(\"The covariance matrix is\")\nprint(nd.dot(A.T, A))", "_____no_output_____" ] ], [ [ "## Defining the networks\n\nNext we need to define how to fake data. Our generator network will be the simplest network possible - a single layer linear model. This is since we'll be driving that linear network with a Gaussian data generator. Hence, it literally only needs to learn the parameters to fake things perfectly. For the discriminator we will be a bit more discriminating: we will use an MLP with 3 layers to make things a bit more interesting. \n\nThe cool thing here is that we have *two* different networks, each of them with their own gradients, optimizers, losses, etc. that we can optimize as we please. ", "_____no_output_____" ] ], [ [ "# build the generator\nnetG = nn.Sequential()\nwith netG.name_scope():\n netG.add(nn.Dense(2))\n\n# build the discriminator (with 5 and 3 hidden units respectively)\nnetD = nn.Sequential()\nwith netD.name_scope():\n netD.add(nn.Dense(5, activation='tanh'))\n netD.add(nn.Dense(3, activation='tanh'))\n netD.add(nn.Dense(2))\n\n# loss\nloss = gluon.loss.SoftmaxCrossEntropyLoss()\n\n# initialize the generator and the discriminator\nnetG.initialize(mx.init.Normal(0.02), ctx=ctx)\nnetD.initialize(mx.init.Normal(0.02), ctx=ctx)\n\n# trainer for the generator and the discriminator\ntrainerG = gluon.Trainer(netG.collect_params(), 'adam', {'learning_rate': 0.01})\ntrainerD = gluon.Trainer(netD.collect_params(), 'adam', {'learning_rate': 0.05})", "_____no_output_____" ] ], [ [ "## Setting up the training loop\n\nWe are going to iterate over the data a few times. 
To make life simpler, we need a few variables:", "_____no_output_____" ] ], [ [ "real_label = mx.nd.ones((batch_size,), ctx=ctx)\nfake_label = mx.nd.zeros((batch_size,), ctx=ctx)\nmetric = mx.metric.Accuracy()\n\n# set up logging\nfrom datetime import datetime\nimport os\nimport time", "_____no_output_____" ] ], [ [ "## Training loop\n\n", "_____no_output_____" ] ], [ [ "stamp = datetime.now().strftime('%Y_%m_%d-%H_%M')\nfor epoch in range(10):\n    tic = time.time()\n    train_data.reset()\n    for i, batch in enumerate(train_data):\n        ############################\n        # (1) Update D network: maximize log(D(x)) + log(1 - D(G(z)))\n        ###########################\n        # train with real data\n        data = batch.data[0].as_in_context(ctx)\n        noise = nd.random_normal(shape=(batch_size, 2), ctx=ctx)\n\n        with autograd.record():\n            real_output = netD(data)\n            errD_real = loss(real_output, real_label)\n            \n            fake = netG(noise)\n            fake_output = netD(fake.detach())\n            errD_fake = loss(fake_output, fake_label)\n            errD = errD_real + errD_fake\n            errD.backward()\n\n        trainerD.step(batch_size)\n        metric.update([real_label,], [real_output,])\n        metric.update([fake_label,], [fake_output,])\n\n        ############################\n        # (2) Update G network: maximize log(D(G(z)))\n        ###########################\n        with autograd.record():\n            output = netD(fake)\n            errG = loss(output, real_label)\n            errG.backward()\n\n        trainerG.step(batch_size)\n\n    name, acc = metric.get()\n    metric.reset()\n    print('\\nbinary training acc at epoch %d: %s=%f' % (epoch, name, acc))\n    print('time: %f' % (time.time() - tic))\n    noise = nd.random_normal(shape=(100, 2), ctx=ctx)\n    fake = netG(noise)\n    plt.scatter(X[:,0].asnumpy(), X[:,1].asnumpy())\n    plt.scatter(fake[:,0].asnumpy(), fake[:,1].asnumpy())\n    plt.show()", "\nbinary training acc at epoch 0: accuracy=0.764500\ntime: 5.838877\n" ] ], [ [ "## Checking the outcome\n\nLet's now generate some fake data and check whether it looks real.", "_____no_output_____" ] ], [ [ "noise = mx.nd.random_normal(shape=(100, 2), ctx=ctx)\nfake = netG(noise)\n\nplt.scatter(X[:,0].asnumpy(), X[:,1].asnumpy())\nplt.scatter(fake[:,0].asnumpy(), fake[:,1].asnumpy())\nplt.show()", "_____no_output_____" ] ], [ [ "## Conclusion \n\nA word of caution here - to get this to converge properly, we needed to adjust the learning rates *very carefully*. And for Gaussians, the result is rather mediocre - a simple mean and covariance estimator would have worked *much better*. However, whenever we don't have a really good idea of what the distribution should be, this is a very good way of faking it to the best of our abilities. Note that a lot depends on the power of the discriminating network. If it is weak, the fake can be very different from the truth. E.g. in our case it had trouble picking up anything along the axis of reduced variance. \nIn summary, this isn't exactly easy to set and forget. One nice resource for dirty practitioner's knowledge is [Soumith Chintala's handy list of tricks](https://github.com/soumith/ganhacks) for how to babysit GANs. ", "_____no_output_____" ], [ "For whinges or inquiries, [open an issue on GitHub.](https://github.com/zackchase/mxnet-the-straight-dope)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ] ]
d0284e8bc5b94550d3a2f08bfd196c58361b0982
55,684
ipynb
Jupyter Notebook
notebooks/module05_01_cross_validation.ipynb
lottieandrews/CS345
0316f5a72c7b1c616f3ff692a38ad48044b50746
[ "MIT" ]
9
2020-08-26T20:24:25.000Z
2022-02-06T21:17:04.000Z
notebooks/module05_01_cross_validation.ipynb
lottieandrews/CS345
0316f5a72c7b1c616f3ff692a38ad48044b50746
[ "MIT" ]
null
null
null
notebooks/module05_01_cross_validation.ipynb
lottieandrews/CS345
0316f5a72c7b1c616f3ff692a38ad48044b50746
[ "MIT" ]
17
2020-08-25T19:13:26.000Z
2021-05-06T21:59:16.000Z
70.308081
8,248
0.794824
[ [ [ "*This notebook is part of course materials for CS 345: Machine Learning Foundations and Practice at Colorado State University.\nOriginal versions were created by Asa Ben-Hur.\nThe content is availabe [on GitHub](https://github.com/asabenhur/CS345).*\n\n*The text is released under the [CC BY-SA license](https://creativecommons.org/licenses/by-sa/4.0/), and code is released under the [MIT license](https://opensource.org/licenses/MIT).*\n\n<img style=\"padding: 10px; float:right;\" alt=\"CC-BY-SA icon.svg in public domain\" src=\"https://upload.wikimedia.org/wikipedia/commons/d/d0/CC-BY-SA_icon.svg\" width=\"125\">\n", "_____no_output_____" ], [ "<a href=\"https://colab.research.google.com/github//asabenhur/CS345/blob/master/notebooks/module05_01_cross_validation.ipynb\">\n <img align=\"left\" src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n</a>", "_____no_output_____" ] ], [ [ "import numpy as np\nimport matplotlib.pyplot as plt\n%matplotlib inline\n%autosave 0", "_____no_output_____" ] ], [ [ "# Evaluating classifiers: cross validation \n\n### Learning curves\n\nIntuitively, the more data we have available, the more accurate our classifiers become. To demonstrate this, let's read in some data and evaluate a k-nearest neighbor classifier on a fixed test set with increasing number of training examples. The resulting curve of accuracy as a function of number of examples is called a **learning curve**.\n\n", "_____no_output_____" ] ], [ [ "from sklearn.datasets import load_digits\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.neighbors import KNeighborsClassifier\n\nX, y = load_digits(return_X_y=True)\n\ntraining_sizes = [20, 40, 100, 200, 400, 600, 800, 1000, 1200]\n\n# note the use of the stratify keyword: it makes it so that each \n# class is equally represented in both train and test set\nX_full_train, X_test, y_full_train, y_test = train_test_split(\n X, y, test_size = len(y)-max(training_sizes), \n stratify=y, random_state=1)\n\naccuracy = []\nfor training_size in training_sizes :\n X_train,_ , y_train,_ = train_test_split(\n X_full_train, y_full_train, test_size = \n len(y_full_train)-training_size+10, stratify=y_full_train)\n knn = KNeighborsClassifier(n_neighbors=1)\n knn.fit(X_train, y_train)\n y_pred = knn.predict(X_test)\n accuracy.append(np.sum((y_pred==y_test))/len(y_test))\n", "_____no_output_____" ], [ "plt.figure(figsize=(6,4))\nplt.plot(training_sizes, accuracy, 'ob')\nplt.xlabel('training set size')\nplt.ylabel('accuracy')\nplt.ylim((0.5,1));", "_____no_output_____" ] ], [ [ "It's also instructive to look at the numbers themselves:", "_____no_output_____" ] ], [ [ "print (\"# training examples\\t accuracy\")\nfor i in range(len(accuracy)) :\n print (\"\\t{:d}\\t\\t {:f}\".format(training_sizes[i], accuracy[i]))\n", "# training examples\t accuracy\n\t20\t\t 0.636516\n\t40\t\t 0.889447\n\t100\t\t 0.914573\n\t200\t\t 0.953099\n\t400\t\t 0.966499\n\t600\t\t 0.979899\n\t800\t\t 0.983250\n\t1000\t\t 0.983250\n\t1200\t\t 0.983250\n" ] ], [ [ "### Exercise\n\n* What can you conclude from this plot? \n* Why would you want to compute a learning curve on your data?\n", "_____no_output_____" ], [ "### Making better use of our data with cross validation\n\nThe discussion above demonstrates that it is best to have as large of a training set as possible. We also need to have a large enough test set, so that the accuracy estimates are accurate. How do we balance these two contradictory requirements? 
Cross-validation provides us with a more effective way to make use of our data. Here it is:\n\n**Cross validation**\n\n* Randomly partition the data into $k$ subsets (\"folds\").\n* Set one fold aside for evaluation, train a model on the remaining $k-1$ folds, and evaluate it on the held-out fold.\n* Repeat until each fold has been used for evaluation.\n* Compute accuracy by averaging over the accuracy estimates generated for each fold.\n\nHere is an illustration of 8-fold cross validation:\n\n<img style=\"padding: 10px; float:left;\" alt=\"cross-validation by MBanuelos22 CC BY-SA 4.0\" src=\"https://upload.wikimedia.org/wikipedia/commons/c/c7/LOOCV.gif\" width=\"600\">\n\nAs you can see, this procedure is more expensive than dividing your data into train and test sets. When dealing with relatively small datasets, which is when you want to use this procedure, this won't be an issue.\n\nTypically cross-validation is used with the number of folds being in the range of 5-10. An extreme case is when the number of folds equals the number of training examples. This special case is called *leave-one-out cross-validation*.", "_____no_output_____" ] ], [ [ "from sklearn.linear_model import LogisticRegression\nfrom sklearn.model_selection import cross_validate\nfrom sklearn.model_selection import cross_val_score\n\nfrom sklearn import metrics", "_____no_output_____" ] ], [ [ "Let's use the scikit-learn breast cancer dataset to demonstrate the use of cross-validation.", "_____no_output_____" ] ], [ [ "from sklearn.datasets import load_breast_cancer\ndata = load_breast_cancer()", "_____no_output_____" ] ], [ [ "A scikit-learn data object is a container object whose interesting attributes are: \n * ‘data’, the data to learn, \n * ‘target’, the classification labels, \n * ‘target_names’, the meaning of the labels,\n * ‘feature_names’, the meaning of the features, and \n * ‘DESCR’, the full description of the dataset.\n\n", "_____no_output_____" ] ], [ [ "X = data.data\ny = data.target\nprint('number of examples ', len(y))\nprint('number of features ', len(X[0]))\n\nprint(data.target_names)\nprint(data.feature_names)", "number of examples  569\nnumber of features  30\n['malignant' 'benign']\n['mean radius' 'mean texture' 'mean perimeter' 'mean area'\n 'mean smoothness' 'mean compactness' 'mean concavity'\n 'mean concave points' 'mean symmetry' 'mean fractal dimension'\n 'radius error' 'texture error' 'perimeter error' 'area error'\n 'smoothness error' 'compactness error' 'concavity error'\n 'concave points error' 'symmetry error' 'fractal dimension error'\n 'worst radius' 'worst texture' 'worst perimeter' 'worst area'\n 'worst smoothness' 'worst compactness' 'worst concavity'\n 'worst concave points' 'worst symmetry' 'worst fractal dimension']\n" ], [ "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, \n                                                    random_state=0)\nclassifier = KNeighborsClassifier(n_neighbors=3)\n#classifier = LogisticRegression()\n\n_ = classifier.fit(X_train, y_train)\ny_pred = classifier.predict(X_test)\n", "_____no_output_____" ] ], [ [ "Let's compute the accuracy of our predictions:", "_____no_output_____" ] ], [ [ "np.mean(y_pred==y_test)", "_____no_output_____" ] ], [ [ "We can do the same using scikit-learn:", "_____no_output_____" ] ], [ [ "metrics.accuracy_score(y_test, y_pred)", "_____no_output_____" ] ], [ [ "Now let's compute accuracy using [cross_val_score](https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.cross_val_score.html) instead:", "_____no_output_____" ] ], [ [ 
"accuracy = cross_val_score(classifier, X, y, cv=5, \n scoring='accuracy')\nprint(accuracy)", "[0.87719298 0.92105263 0.94736842 0.93859649 0.91150442]\n" ] ], [ [ "This yields an array containing the accuracy values for each fold.\nWhen reporting your results, you will typically show the mean:", "_____no_output_____" ] ], [ [ "np.mean(accuracy)", "_____no_output_____" ] ], [ [ "The arguments of `cross_val_score`:\n* A classifier (anything that satisfies the scikit-learn classifier API)\n* data (features/labels)\n* `cv` : an integer that specifies the number of folds (can be used in more sophisticated ways as we will see below).\n* `scoring`: this determines which accuracy measure is evaluated for each fold. Here's a link to the [list of available measures](https://scikit-learn.org/stable/modules/model_evaluation.html#scoring-parameter) in scikit-learn.\n\nYou can obtain accuracy for other metrics. *Balanced accuracy* for example, is appropriate when the data is unbalanced (e.g. when one class contains a much larger number of examples than other classes in the data).", "_____no_output_____" ] ], [ [ "accuracy = cross_val_score(classifier, X, y, cv=5, \n scoring='balanced_accuracy')\nnp.mean(accuracy)", "_____no_output_____" ] ], [ [ "`cross_val_score` is somewhat limited, in that it simply returns a list of accuracy scores. In practice, we often want to have more information about what happened during training, and also to compute multiple accuracy measures.\n`cross_validate` will provide you with that information:", "_____no_output_____" ] ], [ [ "results = cross_validate(classifier, X, y, cv=5, \n scoring='accuracy', return_estimator=True)\nprint(results)", "{'fit_time': array([0.0010581 , 0.00099397, 0.00085902, 0.00093985, 0.00105977]), 'score_time': array([0.00650811, 0.00734997, 0.01043916, 0.00643301, 0.00529218]), 'estimator': (KNeighborsClassifier(algorithm='auto', leaf_size=30, metric='minkowski',\n metric_params=None, n_jobs=None, n_neighbors=3, p=2,\n weights='uniform'), KNeighborsClassifier(algorithm='auto', leaf_size=30, metric='minkowski',\n metric_params=None, n_jobs=None, n_neighbors=3, p=2,\n weights='uniform'), KNeighborsClassifier(algorithm='auto', leaf_size=30, metric='minkowski',\n metric_params=None, n_jobs=None, n_neighbors=3, p=2,\n weights='uniform'), KNeighborsClassifier(algorithm='auto', leaf_size=30, metric='minkowski',\n metric_params=None, n_jobs=None, n_neighbors=3, p=2,\n weights='uniform'), KNeighborsClassifier(algorithm='auto', leaf_size=30, metric='minkowski',\n metric_params=None, n_jobs=None, n_neighbors=3, p=2,\n weights='uniform')), 'test_score': array([0.87719298, 0.92105263, 0.94736842, 0.93859649, 0.91150442])}\n" ] ], [ [ "The object returned by `cross_validate` is a Python dictionary as the output suggests. 
To extract a specific piece of data from this object, simply access the dictionary with the appropriate key:", "_____no_output_____" ] ], [ [ "results['test_score']", "_____no_output_____" ] ], [ [ "If you would like to know the predictions made for each training example during cross-validation use [cross_val_predict](https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.cross_val_predict.html) instead:", "_____no_output_____" ] ], [ [ "from sklearn.model_selection import cross_val_predict\n\ny_pred = cross_val_predict(classifier, X, y, cv=5)\nmetrics.accuracy_score(y, y_pred)", "_____no_output_____" ] ], [ [ "The above way of performing cross-validation doesn't always give us enough control on the process: we usually want our machine learning experiments be reproducible, and to be able to use the same cross-validation splits with multiple algorithms. The scikit-learn `KFold` and `StratifiedKFold` cross-validation generators are the way to achieve that. \n\n`KFold` simply chooses a random subset of examples for each fold. This strategy can lead to cross-validation folds in which the classes are not well-represented as the following toy example demonstrates:", "_____no_output_____" ] ], [ [ "from sklearn.model_selection import StratifiedKFold, KFold\n\nX_toy = np.array([[1, 2], [3, 4], [5, 6], [7, 8], [9,10], [11, 12]])\ny_toy = np.array([0, 0, 1, 1, 1, 1])\ncv = KFold(n_splits=2, random_state=3, shuffle=True)\nfor train_idx, test_idx in cv.split(X_toy, y_toy):\n print(\"train:\", train_idx, \"test:\", test_idx)\n X_train, X_test = X_toy[train_idx], X_toy[test_idx]\n y_train, y_test = y_toy[train_idx], y_toy[test_idx]\n print(y_train)", "train: [0 1 2] test: [3 4 5]\n[0 0 1]\ntrain: [3 4 5] test: [0 1 2]\n[1 1 1]\n" ] ], [ [ "`StratifiedKFold` addresses this issue by making sure that each class is represented in each fold in proportion to its overall fraction in the data. This is particularly important when one or more of the classes have few examples.\n\n`StratifiedKFold` and `KFold` generate folds that can be used in conjunction with the cross-validation methods we saw above.\nAs an example, we will demonstrate the use of `StratifiedKFold` with `cross_val_score` on the breast cancer datast:", "_____no_output_____" ] ], [ [ "cv = StratifiedKFold(n_splits=5, random_state=1, shuffle=True)\naccuracy = cross_val_score(classifier, X, y, cv=cv, \n scoring='accuracy')\nnp.mean(accuracy)", "_____no_output_____" ] ], [ [ "For classification problems, `StratifiedKFold` is the preferred strategy. 
However, for regression problems `KFold` is the way to go.\n\n#### Question\n\nWhy is `KFold` used in regression problems rather than `StratifiedKFold`?", "_____no_output_____" ], [ "To clarify the distinction between the different methods of generating cross-validation folds and their different parameters, let's look at the following figures:", "_____no_output_____" ] ], [ [ "# the code for the figure is adapted from\n# https://scikit-learn.org/stable/auto_examples/model_selection/plot_cv_indices.html\n\nnp.random.seed(42)\ncmap_data = plt.cm.Paired\ncmap_cv = plt.cm.coolwarm\nn_folds = 4\n\n# Generate the data\nX = np.random.randn(100, 10)\n\n# generate labels - classes 0,1,2 and 10,30,60 examples, respectively\ny = np.array([0] * 10 + [1] * 30 + [2] * 60)\n\ndef plot_cv_indices(cv, X, y, ax, n_folds):\n    \"\"\"plot the indices of a cross-validation object.\"\"\"\n\n    # Generate the training/testing visualizations for each CV split\n    for ii, (tr, tt) in enumerate(cv.split(X=X, y=y)):\n        # Fill in indices with the training/test groups\n        indices = np.zeros(len(X))\n        indices[tt] = 1\n        # Visualize the results\n        ax.scatter(range(len(indices)), [ii + .5] * len(indices),\n                   c=indices, marker='_', lw=15, cmap=cmap_cv,\n                   vmin=-.2, vmax=1.2)\n\n    # Plot the data classes and groups at the end\n    ax.scatter(range(len(X)), [ii + 1.5] * len(X), c=y, marker='_', lw=15, cmap=cmap_data)\n\n    # Formatting\n    yticklabels = list(range(n_folds)) + ['class']\n    ax.set(yticks=np.arange(n_folds+2) + .5, yticklabels=yticklabels, \n           xlabel='index', ylabel=\"CV fold\",\n           ylim=[n_folds+1.2, -.2], xlim=[0, 100])\n    ax.set_title('{}'.format(type(cv).__name__), fontsize=15)\n    return ax\n", "_____no_output_____" ] ], [ [ "Let's visualize the results of using `KFold` for fold generation:", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots()\ncv = KFold(n_folds)\nplot_cv_indices(cv, X, y, ax, n_folds);", "_____no_output_____" ] ], [ [ "As you can see, this naive way of using `KFold` can lead to highly undesirable splits into cross-validation folds.\nUsing `StratifiedKFold` addresses this to some extent:", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots()\ncv = StratifiedKFold(n_folds)\nplot_cv_indices(cv, X, y, ax, n_folds);", "_____no_output_____" ] ], [ [ "Using `StratifiedKFold` with shuffling of the examples is the preferred way of splitting the data into folds:", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots()\ncv = StratifiedKFold(n_folds, shuffle=True)\nplot_cv_indices(cv, X, y, ax, n_folds);", "_____no_output_____" ] ], [ [ "### Question\n\nConsider the task of digitizing handwritten text (aka optical character recognition, or OCR). For each letter in the alphabet you have multiple labeled examples generated by the same writer. How would this setup affect the way you divide your examples into training and test sets, or when performing cross-validation?\n", "_____no_output_____" ], [ "### Summary and Discussion\n\nIn this notebook we discussed cross-validation as a more effective way to make use of limited amounts of data compared to the strategy of splitting data into train and test sets. For very large datasets where training is time consuming you might still opt for evaluation on a single test set.\n", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ] ]
d028530fe628db56c8f7d65ca7865b4389f247ce
18,255
ipynb
Jupyter Notebook
data_analysis/3.3_anomaly_detection/anomaly_detection.ipynb
camille-vanhoffelen/modern-ML-engineer
1ee62260beac1e0eeca1fd77a1eeadb48a0065b9
[ "CC-BY-4.0" ]
10
2020-07-24T17:33:09.000Z
2022-01-29T13:47:06.000Z
data_analysis/3.3_anomaly_detection/anomaly_detection.ipynb
pandeyankit83/modern-ML-engineer
1ee62260beac1e0eeca1fd77a1eeadb48a0065b9
[ "CC-BY-4.0" ]
null
null
null
data_analysis/3.3_anomaly_detection/anomaly_detection.ipynb
pandeyankit83/modern-ML-engineer
1ee62260beac1e0eeca1fd77a1eeadb48a0065b9
[ "CC-BY-4.0" ]
2
2020-08-07T13:15:36.000Z
2021-12-14T17:57:59.000Z
43.882212
821
0.646289
[ [ [ "# Lecture 3.3: Anomaly Detection\n\n[**Lecture Slides**](https://docs.google.com/presentation/d/1_0Z5Pc5yHA8MyEBE8Fedq44a-DcNPoQM1WhJN93p-TI/edit?usp=sharing)\n\nThis lecture, we are going to use gaussian distributions to detect anomalies in our emoji faces dataset\n\n**Learning goals:**\n\n- Introduce an anomaly detection problem\n- Implement Gaussian distribution anomaly detection for images\n- Debug the optimisation of a learning algorithm\n- Discuss the imperfection of learning algorithms\n- Acknowledge other outlier detection methods", "_____no_output_____" ], [ "\n## 1. Introduction\n\nWe have an `emoji_faces` dataset of all our favourite emojis. However, Skynet hates their friendly expressiveness, and wants to destroy emojis forever! 🙀 It sent _terminator robots_ from the future to invade our dataset. We must act fast, and detect them amongst the emojis to prevent the catastrophy. \n\nOur challenge here, is that we don't watch many movies, so we don't have a clear idea of what those _terminators_ look like. 🤖 All we know, is that they look very different compared to emojis, and that only a handful managed to infiltrate our dataset.\n\nThis is a typical scenario of _anomaly detection_. We would like to identify rare examples that differ from our \"normal\" data points. We choose to use a Gaussian Distribution to model this \"normality\" and detect the killer robots.", "_____no_output_____" ], [ "\n## 2. Data Munging\n\nFirst let's load the images using [pillow](https://pillow.readthedocs.io/en/stable/), like in lecture 2.5:", "_____no_output_____" ] ], [ [ "from PIL import Image\nimport glob\n\npaths = glob.glob('emoji_faces/*.png')\n\nimages = [Image.open(path) for path in paths]\nlen(images)", "_____no_output_____" ] ], [ [ "We have 134 emoji faces, including a few terminator robots. We'll again be using the [sklearn](https://scikit-learn.org/) library to create our model. The interface is usually the same, and for gaussian anomaly detection, sklearn again expect a NumPy matrix where the rows are our images and the columns are the pixels. So we can apply the same transformations as notebook 3.2:", "_____no_output_____" ] ], [ [ "import numpy as np\n\narrays = [np.asarray(im) for im in images]\n# 64 * 64 = 4096\nvectors = [arr.reshape((4096,)) for arr in arrays]\ndata = np.stack(vectors)", "_____no_output_____" ] ], [ [ "## 3. Training \n\nNext, we will create an [`EllipticEnvelope`](https://scikit-learn.org/stable/modules/generated/sklearn.covariance.EllipticEnvelope.html) object. This will fit a multi-variate gaussian distribution to our data. It then allows us to pick a threshold to define an _ellipsoid_ decision boundary , and detect outliers. \n\nRemember that we are using a _learning_ algorithm, which must therefore be _trained_ before it can be used. This is why we'll use the `.fit()` method first, before calling `.predict()`:", "_____no_output_____" ] ], [ [ "from sklearn.covariance import EllipticEnvelope\n\ncov = EllipticEnvelope(random_state=0).fit(data)", "_____no_output_____" ] ], [ [ "😰 What's happening? Why is it stuck? Have the killer robots already taken over? \n\nNo need to panic, this kind of hiccup is very common when dealing with machine learning algorithms. We can kill the process (before it fries our laptop fan) by clicking the `stop` button ⬛️ in the notebook toolbar.\n\nMost learning algorithms are based around an _optimisation_ procedure. 
This step is often iterative and stochastic, i.e. it tries its statistical best to maximise the learning in incremental steps. \n\nThis process isn't fail-proof:\n* it can dramatically stop because of out of memory errors, or overflow errors 💥\n* it can get stuck, e.g. when the optimisation is too slow 🐌\n* it can fail silently, and return wrong results 💩\n\nℹ️ We will encounter many of these failures throughout our ML experiments, so knowing how to overcome them is a part of the data scientist skillset. \n\nLet's go back to our killer robot detection: the model fitting got _stuck_, which suggests that something about our data was too much to handle. We find the following \"notes\" in the [official documentation](https://scikit-learn.org/stable/modules/generated/sklearn.covariance.EllipticEnvelope.html#sklearn.covariance.EllipticEnvelope):\n\n> Outlier detection from covariance estimation may break or not perform well in high-dimensional settings.\n\nWe recall that our images are $64 \\times 64$ pixels, so $4096$ dimensions... that's a lot. It seems a good candidate to explain why our multivariate gaussian distribution failed to fit our dataset. If only there were a way to reduce the dimensions of our data... 😏\n\nLet's apply PCA to reduce the number of dimensions of our dataset. Our emoji faces dataset is smaller than the full emoji dataset, so 40 dimensions should suffice to explain its variance: \n", "_____no_output_____" ] ], [ [ "from sklearn.decomposition import PCA\n\npca = PCA(n_components=40)\npca.fit(data)\ncomponents = pca.transform(data)\ncomponents.shape", "_____no_output_____" ] ], [ [ "💪 Visualise the eigenvector images of our PCA model. You can use the code from lecture 3.2!\n\n🧠 Can you explain what those eigenvector images represent? Why are they different from the ones for the full emoji dataset?\n\nFantastic, we've managed to reduce the number of dimensions by 99%! Hopefully that should be enough to make our gaussian distribution fitting happy. Let's try again with the _principal components_ instead of the original data:", "_____no_output_____" ] ], [ [ "cov = EllipticEnvelope(random_state=0).fit(components)", "_____no_output_____" ] ], [ [ "😅 that was fast!\n\n## 4. Prediction\n\nWe can now use our fitted gaussian distribution to detect the outliers in our `data`. For this, we use the `.predict()` method:", "_____no_output_____" ] ], [ [ "y = cov.predict(components)\ny", "_____no_output_____" ] ], [ [ "`y` is our vector of predictions, where $1$ is a normal data point, and $-1$ is an anomaly. We can therefore iterate through our original `arrays` to find outliers:", "_____no_output_____" ] ], [ [ "outliers = []\n\nfor i in range(0, len(arrays)):\n    if y[i] == -1:\n        outliers.append(arrays[i])\nlen(outliers)", "_____no_output_____" ], [ "import matplotlib.pyplot as plt\n\nfig, axs = plt.subplots(dpi=150, nrows=2, ncols=7)\nfor outlier, ax in zip(outliers, axs.flatten()):\n    ax.imshow(outlier, cmap='gray', vmin=0, vmax=255)\n    ax.get_xaxis().set_visible(False)\n    ax.get_yaxis().set_visible(False)", "_____no_output_____" ] ], [ [ "THERE'S OUR TERMINATORS! 🤖 We can count 5 of them in total. Notice how some real emoji faces were also detected as outliers. This is perhaps a sign that we should change our _threshold_, to make the ellipsoid decision boundary smaller. 
\n\nIn fact, we didn't even specify a threshold before, we just used the default value of `contamination=0.1` in the [`EllipticEnvelope`](https://scikit-learn.org/stable/modules/generated/sklearn.covariance.EllipticEnvelope.html) class. This represents our estimation of the proportion of data points which are outliers. Since it looks like we detected double the number of actual anomalies, let's try again with `contamination=0.05`:", "_____no_output_____" ] ], [ [ "cov = EllipticEnvelope(random_state=0, contamination=0.05).fit(components)\ny = cov.predict(components)\n\noutliers = []\n\nfor i in range(0, len(arrays)):\n    if y[i] == -1:\n        outliers.append(arrays[i])\n        \nfig, axs = plt.subplots(dpi=150, nrows=1, ncols=7)\nfor outlier, ax in zip(outliers, axs.flatten()):\n    ax.imshow(outlier, cmap='gray', vmin=0, vmax=255)\n    ax.get_xaxis().set_visible(False)\n    ax.get_yaxis().set_visible(False)", "_____no_output_____" ] ], [ [ "Better! `contamination=0.05` was a better choice of threshold, and we assessed this through _manual inspection_. This means we went through the results and used our human judgement to change the value of this _hyperparameter_.\n\nℹ️ Notice how our outlier detection is not _perfect_. Some emojis were also erroneously detected as anomalous killer robots. This can seem like a problem, or a sign that our model was malfunctioning. But, quite the contrary, _imperfection_ is a core aspect of all _learning_ algorithms. Instead of seeing the glass half-empty and looking at the outlier detector's mistakes, we should reflect on the task itself. It would have been almost impossible to detect those killer robot images using rule-based algorithms, and our model _accuracy_ was good _enough_ to save the emojis from Skynet. As data scientists, our goal is to make models which are accurate _enough_ to be useful, not to aim for perfect scores. We will revisit these topics later in the course when discussing Machine Learning Engineering 🛠", "_____no_output_____" ], [ "## 5. Analysis", "_____no_output_____" ], [ "We have detected the robot intruders and saved the emojis from a jealous AI from the future, all is good! We still want to better understand how anomaly detection defeated Skynet. For this, we would like to leverage our shiny new data visualization skills. Representing our dataset in space would allow us to identify its structures and hopefully understand how our gaussian distribution model identified terminators as \"abnormal\".\n\nOur data is high dimensional, so we can use our trusted PCA once again to project it down to 2 dimensions. We understand that this will lose a lot of the variance of our data, but the results were still somewhat interpretable with the full emoji dataset, so let's go!", "_____no_output_____" ] ], [ [ "# Dimensionality reduction to 2\npca_model = PCA(n_components=2)\npca_model.fit(data) # fit the model\nT = pca_model.transform(data) # project the data onto the two principal components\n\nplt.scatter(T[:, 0], T[:, 1],\n            # use the predictions as color\n            c=y, \n            marker='o',\n            alpha=0.4\n        )\nplt.title('Anomaly detection of the emoji faces dataset with PCA dimensionality reduction');", "_____no_output_____" ] ], [ [ "We can notice that most of the outliers are clearly _separable_ from the bulk of the dataset, even with only 2 principal components. One outlier is very much within the main cluster however. 
This could be explained by the dimensionality reduction, i.e. that this point is separated from the cluster in other dimensions, or by the fact that our threshold might be too permissive.\n\nWe can check this by displaying the images directly on the scatter plot:", "_____no_output_____" ] ], [ [ "from matplotlib import offsetbox\n\ndef plot_components(data, model, images=None, ax=None,\n                    thumb_frac=0.05, cmap='gray'):\n    ax = ax or plt.gca()\n    \n    proj = model.fit_transform(data)\n    ax.plot(proj[:, 0], proj[:, 1], '.k')\n    \n    if images is not None:\n        min_dist_2 = (thumb_frac * max(proj.max(0) - proj.min(0))) ** 2\n        shown_images = np.array([2 * proj.max(0)])\n        for i in range(data.shape[0]):\n            dist = np.sum((proj[i] - shown_images) ** 2, 1)\n            if np.min(dist) < min_dist_2:\n                # don't show points that are too close\n                continue\n            shown_images = np.vstack([shown_images, proj[i]])\n            imagebox = offsetbox.AnnotationBbox(\n                offsetbox.OffsetImage(images[i], cmap=cmap),\n                                      proj[i])\n            ax.add_artist(imagebox)", "_____no_output_____" ], [ "small_images = [im[::2, ::2] for im in arrays]\nfig, ax = plt.subplots(figsize=(10, 10))\nplot_components(data,\n                model=PCA(n_components=2),\n                images=small_images, thumb_frac=0.02)\nplt.title('Anomaly detection of the emoji faces dataset with PCA dimensionality reduction');", "_____no_output_____" ] ], [ [ "We could probably have reduced the value of `contamination` further, since we can see how the killer robots are clearly \"abnormal\" with this visualisation. We also have a \"feel\" of how our gaussian distribution model could successfully detect them as outliers. Remember, though, that all of the modeling magic happens in 40-dimensional space!\n\n🧠🧠 Can you explain why it is not very useful to display the ellipsoid decision boundary of our anomaly detection model on this graph?", "_____no_output_____" ], [ "## 6. More Anomaly Detection\n\nAnomaly detection is an active field in ML research, which combines supervised, unsupervised, non-linear, Bayesian, ... a whole bunch of methods! Each solution will have its pros and cons, and developing a production level outlier detection system will require empirically evaluating and comparing them. For a breakdown of the methods available in sklearn, check out this excellent [blogpost](https://sdsawtelle.github.io/blog/output/week9-anomaly-andrew-ng-machine-learning-with-python.html), or the [official documentation](https://scikit-learn.org/stable/modules/outlier_detection.html). For an in-depth view of modern anomaly detection, watch this [video](https://youtu.be/LRqX5uO5StA). And for everything else, feel free to experiment with this dataset or any other. Good luck on finding all the killer robots!", "_____no_output_____" ], [ "## 7. Summary\n\nToday, we defined **anomaly detection**, and listed some of its common applications including fraud detection and data cleaning. We then described how to use **fitted Gaussian distributions** to identify outliers. This led us to a discussion about the choice of **thresholds** and **hyperparameters**, where we went over a few different realistic scenarios. We then used a Gaussian distribution to remove terminator images from an emoji faces dataset. We learned how learning algorithms **fail** and that data scientists must know how to **debug** them. Finally, we used **PCA** to visualize our killer robot detection. 
\n\n\n# Resources\n\n## Core Resources\n\n- [Anomaly detection algorithm](https://www.coursera.org/lecture/machine-learning/algorithm-C8IJp) \nAndrew Ng's limpid breakdown of anomaly detection\n\n## Additional Resources\n\n- [A review of ML techniques for anomaly detection](https://youtu.be/LRqX5uO5StA) \nMore in depth review of modern techniques for anomaly detection\n- [Anomaly Detection in sklearn](https://sdsawtelle.github.io/blog/output/week9-anomaly-andrew-ng-machine-learning-with-python.html) \nVisual blogpost experimenting with the various outlier detection algorithms available in sklearn\n- [sklearn official documentation - outlier detection](https://scikit-learn.org/stable/modules/outlier_detection.html) ", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown" ] ]
d02864a9715a23a804ee382bba43e9187727f856
804,768
ipynb
Jupyter Notebook
models/densenet121_144_128_hlcp_no_img_aug.ipynb
pujandave25/chexpert
87bf49b7a2500c1edbaf27e224042f1c271f6941
[ "Apache-2.0" ]
1
2021-05-14T11:13:55.000Z
2021-05-14T11:13:55.000Z
models/densenet121_144_128_hlcp_no_img_aug.ipynb
pujandave25/chexpert
87bf49b7a2500c1edbaf27e224042f1c271f6941
[ "Apache-2.0" ]
null
null
null
models/densenet121_144_128_hlcp_no_img_aug.ipynb
pujandave25/chexpert
87bf49b7a2500c1edbaf27e224042f1c271f6941
[ "Apache-2.0" ]
2
2021-05-09T19:01:40.000Z
2021-05-12T09:37:17.000Z
1,188.726736
435,472
0.949855
[ [ [ "from fastai.vision.all import *\nimport pandas as pd\nimport cam\nimport util", "_____no_output_____" ], [ "dls, labels = util.chexpert_data_loader()", "_____no_output_____" ], [ "dls.show_batch(max_n=9, figsize=(20,9))", "_____no_output_____" ], [ "# First train on conditional probabilities\nchexpert_learner_conditional = util.ChexpertLearner(dls, densenet121, n_out=len(labels), y_range=(0, 1),\n loss_func=util.BCEFlatHLCP(hierarchy_map=util.hierarchy_map),\n metrics=[RocAucMulti(average=None),\n RocAucMulti(average='weighted')])\nchexpert_learner_conditional.learn_model(use_saved=False , epochs=10, freeze_epochs=1)", "_____no_output_____" ], [ "# Next train unconditionally for only transfer learning\nprint('-------Running unconditional-------')\nchexpert_learner_unconditional = util.ChexpertLearner(dls, densenet121, n_out=len(labels), y_range=(0, 1),\n loss_func=BCELossFlat(),\n metrics=[RocAucMulti(average=None),\n RocAucMulti(average='weighted')])\nchexpert_learner_unconditional.learn_model(use_saved=True, train_saved=True, epochs=10, freeze_epochs=10)", "-------Running unconditional-------\n" ], [ "chexpert_learner = chexpert_learner_unconditional\ncam.plot_cam(chexpert_learner.learn)", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code" ] ]
d0286a664c8681937df2ace0b975d61bde05c4bf
226,774
ipynb
Jupyter Notebook
Titanic.ipynb
hashmat3525/Titanic
1a15bbf8afeff7aa522a916653984da4e92bfc8a
[ "MIT" ]
3
2019-10-30T08:56:35.000Z
2019-10-31T08:50:52.000Z
Titanic.ipynb
hashmat3525/Titanic
1a15bbf8afeff7aa522a916653984da4e92bfc8a
[ "MIT" ]
null
null
null
Titanic.ipynb
hashmat3525/Titanic
1a15bbf8afeff7aa522a916653984da4e92bfc8a
[ "MIT" ]
1
2019-10-31T08:51:10.000Z
2019-10-31T08:51:10.000Z
113.217174
60,121
0.739318
[ [ [ "# Import Necessary Libraries", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nfrom sklearn.model_selection import train_test_split\nfrom sklearn import preprocessing\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn import svm\nfrom sklearn.metrics import precision_score, recall_score\n# display images \nfrom IPython.display import Image\n# linear algebra\nimport numpy as np \n# data processing\nimport pandas as pd \n# data visualization\nimport seaborn as sns \n%matplotlib inline\nfrom matplotlib import pyplot as plt\nfrom matplotlib import style\n# Algorithms\nfrom sklearn import linear_model\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.linear_model import Perceptron\nfrom sklearn.linear_model import SGDClassifier\nfrom sklearn.tree import DecisionTreeClassifier\nfrom sklearn.neighbors import KNeighborsClassifier\nfrom sklearn.svm import SVC, LinearSVC\nfrom sklearn.naive_bayes import GaussianNB", "_____no_output_____" ] ], [ [ "# Titanic \nTitanic was a British passenger liner that sank in the North Atlantic Ocean in the early morning hours of 15 April 1912, after it collided with an iceberg during its maiden voyage from Southampton to New York City. There were an estimated 2,224 passengers and crew aboard the ship, and more than 1,500 died, making it one of the deadliest commercial peacetime maritime disasters in modern history. The RMS Titanic was the largest ship afloat at the time it entered service and was the second of three Olympic-class ocean liners operated by the White Star Line. The Titanic was built by the Harland and Wolff shipyard in Belfast. Thomas Andrews, her architect, died in the disaster.", "_____no_output_____" ] ], [ [ "# Image of Titanic ship\nImage(filename='C:/Users/Nemgeree Armanonah/Documents/GitHub/Titanic/images/ship.jpeg')", "_____no_output_____" ] ], [ [ "# Getting the Data", "_____no_output_____" ] ], [ [ "#reading train.csv\ndata = pd.read_csv('./titanic datasets/train.csv')\ndata", "_____no_output_____" ] ], [ [ "## Exploring Data", "_____no_output_____" ] ], [ [ "data.info()", "<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 891 entries, 0 to 890\nData columns (total 12 columns):\nPassengerId 891 non-null int64\nSurvived 891 non-null int64\nPclass 891 non-null int64\nName 891 non-null object\nSex 891 non-null object\nAge 714 non-null float64\nSibSp 891 non-null int64\nParch 891 non-null int64\nTicket 891 non-null object\nFare 891 non-null float64\nCabin 204 non-null object\nEmbarked 889 non-null object\ndtypes: float64(2), int64(5), object(5)\nmemory usage: 83.7+ KB\n" ] ], [ [ "### Describe Statistics \nDescribe method is used to view some basic statistical details like PassengerId,Servived,Age etc.", "_____no_output_____" ] ], [ [ "data.describe()", "_____no_output_____" ] ], [ [ "### View All Features ", "_____no_output_____" ] ], [ [ "data.columns.values", "_____no_output_____" ] ], [ [ "### What features could contribute to a high survival rate ?", "_____no_output_____" ], [ "To Us it would make sense if everything except ‘PassengerId’, ‘Ticket’ and ‘Name’ would be correlated with a high survival rate.", "_____no_output_____" ] ], [ [ "# defining variables\nsurvived = 'survived'\nnot_survived = 'not survived'\n# data to be plotted \nfig, axes = plt.subplots(nrows=1, ncols=2,figsize=(10, 4))\nwomen = data[data['Sex']=='female']\nmen = data[data['Sex']=='male']\n# plot the data\nax = 
sns.distplot(women[women['Survived']==1].Age.dropna(), bins=18, label = survived, ax = axes[0], kde =False)\nax = sns.distplot(women[women['Survived']==0].Age.dropna(), bins=40, label = not_survived, ax = axes[0], kde =False)\nax.legend()\nax.set_title('Female')\nax = sns.distplot(men[men['Survived']==1].Age.dropna(), bins=18, label = survived, ax = axes[1], kde = False)\nax = sns.distplot(men[men['Survived']==0].Age.dropna(), bins=40, label = not_survived, ax = axes[1], kde = False)\nax.legend()\n_ = ax.set_title('Male')", "_____no_output_____" ], [ "# count the null values \nnull_values = data.isnull().sum()\nnull_values", "_____no_output_____" ], [ "plt.plot(null_values)\nplt.grid()\nplt.show()", "_____no_output_____" ] ], [ [ "## Data Processing", "_____no_output_____" ] ], [ [ "\ndef handle_non_numerical_data(df):\n    \n    columns = df.columns.values\n\n    for column in columns:\n        text_digit_vals = {}\n        # map a value to the integer assigned to it below\n        def convert_to_int(val):\n            return text_digit_vals[val]\n\n        #print(column,df[column].dtype)\n        if df[column].dtype != np.int64 and df[column].dtype != np.float64:\n            \n            column_contents = df[column].values.tolist()\n            #finding just the uniques\n            unique_elements = set(column_contents)\n            # great, found them. \n            x = 0\n            for unique in unique_elements:\n                if unique not in text_digit_vals:\n                    text_digit_vals[unique] = x\n                    x+=1\n            df[column] = list(map(convert_to_int,df[column]))\n\n    return df", "_____no_output_____" ], [ "y_target = data['Survived']\n# Y_target.reshape(len(Y_target),1)\nx_train = data[['Pclass', 'Age', 'Sex', 'SibSp', 'Parch', 'Fare','Embarked', 'Ticket']]\n\nx_train = handle_non_numerical_data(x_train)\nx_train.head()", "_____no_output_____" ], [ "fare = pd.DataFrame(x_train['Fare'])\n# Normalizing\nmin_max_scaler = preprocessing.MinMaxScaler()\nnewfare = min_max_scaler.fit_transform(fare)\nx_train['Fare'] = newfare\nx_train", "c:\\users\\nemgeree armanonah\\appdata\\local\\programs\\python\\python36\\lib\\site-packages\\ipykernel_launcher.py:5: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  \"\"\"\n" ], [ "null_values = x_train.isnull().sum()\nnull_values", "_____no_output_____" ], [ "plt.plot(null_values)\nplt.show()", "_____no_output_____" ], [ "# Fill the NAN values with the median values in the datasets\nx_train['Age'] = x_train['Age'].fillna(x_train['Age'].median())\nprint(\"Number of NULL values\" , x_train['Age'].isnull().sum())\nx_train.head()", "Number of NULL values 0\n" ], [ "x_train['Sex'] = x_train['Sex'].replace('male', 0)\nx_train['Sex'] = x_train['Sex'].replace('female', 1)\n# print(type(x_train))\ncorr = x_train.corr()\ncorr.style.background_gradient()", "c:\\users\\nemgeree armanonah\\appdata\\local\\programs\\python\\python36\\lib\\site-packages\\ipykernel_launcher.py:1: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  \"\"\"Entry point for launching an IPython kernel.\nc:\\users\\nemgeree armanonah\\appdata\\local\\programs\\python\\python36\\lib\\site-packages\\ipykernel_launcher.py:2: SettingWithCopyWarning: \nA value is trying to be set on a copy of a slice from a DataFrame.\nTry 
using .loc[row_indexer,col_indexer] = value instead\n\nSee the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n  \n" ], [ "def plot_corr(df,size=10):\n    corr = df.corr()\n    fig, ax = plt.subplots(figsize=(size, size))\n    ax.matshow(corr)\n    plt.xticks(range(len(corr.columns)), corr.columns);\n    plt.yticks(range(len(corr.columns)), corr.columns);\n# plot_corr(x_train)\nx_train.corr()\ncorr.style.background_gradient()", "_____no_output_____" ], [ "# Dividing the data into train and test data set\nX_train, X_test, Y_train, Y_test = train_test_split(x_train, y_target, test_size = 0.4, random_state = 40)", "_____no_output_____" ], [ "clf = RandomForestClassifier()\nclf.fit(X_train, Y_train)", "c:\\users\\nemgeree armanonah\\appdata\\local\\programs\\python\\python36\\lib\\site-packages\\sklearn\\ensemble\\forest.py:245: FutureWarning: The default value of n_estimators will change from 10 in version 0.20 to 100 in 0.22.\n  \"10 in version 0.20 to 100 in 0.22.\", FutureWarning)\n" ], [ "print(clf.predict(X_test))\nprint(\"Accuracy: \",clf.score(X_test, Y_test))", "_____no_output_____" ], [ "## Testing the model.\ntest_data = pd.read_csv('./titanic datasets/test.csv')\ntest_data.head(3)\n# test_data.isnull().sum()", "_____no_output_____" ], [ "### Preprocessing on the test data\n# keep the columns in the same order as the training data\ntest_data = test_data[['Pclass', 'Age', 'Sex', 'SibSp', 'Parch', 'Fare', 'Embarked', 'Ticket']]\ntest_data = handle_non_numerical_data(test_data)\n\nfare = pd.DataFrame(test_data['Fare'])\nmin_max_scaler = preprocessing.MinMaxScaler()\nnewfare = min_max_scaler.fit_transform(fare)\ntest_data['Fare'] = newfare\ntest_data['Fare'] = test_data['Fare'].fillna(test_data['Fare'].median())\ntest_data['Age'] = test_data['Age'].fillna(test_data['Age'].median())\ntest_data['Sex'] = test_data['Sex'].replace('male', 0)\ntest_data['Sex'] = test_data['Sex'].replace('female', 1)\nprint(test_data.head())\n", "_____no_output_____" ], [ "print(clf.predict(test_data))", "_____no_output_____" ], [ "from sklearn.model_selection import cross_val_predict\npredictions = cross_val_predict(clf, X_train, Y_train, cv=3)\nprint(\"Precision:\", precision_score(Y_train, predictions))\nprint(\"Recall:\",recall_score(Y_train, predictions))", "_____no_output_____" ], [ "from sklearn.metrics import precision_recall_curve\n\n# getting the probabilities of our predictions\ny_scores = clf.predict_proba(X_train)\ny_scores = y_scores[:,1]\n\nprecision, recall, threshold = precision_recall_curve(Y_train, y_scores)\ndef plot_precision_and_recall(precision, recall, threshold):\n    plt.plot(threshold, precision[:-1], \"r-\", label=\"precision\", linewidth=5)\n    plt.plot(threshold, recall[:-1], \"b\", label=\"recall\", linewidth=5)\n    plt.xlabel(\"threshold\", fontsize=19)\n    plt.legend(loc=\"upper right\", fontsize=19)\n    plt.ylim([0, 1])\n\nplt.figure(figsize=(14, 7))\nplot_precision_and_recall(precision, recall, threshold)\nplt.axis([0.3,0.8,0.8,1])\nplt.show()", "_____no_output_____" ], [ "def plot_precision_vs_recall(precision, recall):\n    plt.plot(recall, precision, \"g--\", linewidth=2.5)\n    # recall is plotted on the x-axis and precision on the y-axis\n    plt.ylabel(\"precision\", fontsize=19)\n    plt.xlabel(\"recall\", fontsize=19)\n    plt.axis([0, 1.5, 0, 1.5])\n\nplt.figure(figsize=(14, 7))\nplot_precision_vs_recall(precision, recall)\nplt.show()", "_____no_output_____" ], [ "from sklearn.model_selection import cross_val_predict\nfrom sklearn.metrics import confusion_matrix\npredictions = cross_val_predict(clf, X_train, Y_train, 
cv=3)\nconfusion_matrix(Y_train, predictions)", "_____no_output_____" ] ], [ [ "With sklearn's confusion_matrix layout [[TN, FP], [FN, TP]]:\n\nTrue negative: 293 (We predicted a negative result and it was negative)\nTrue positive: 143 (We predicted a positive result and it was positive)\nFalse positive: 34 (We predicted a positive result and it was negative)\nFalse negative: 64 (We predicted a negative result and it was positive)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ] ]
d0287996d84ee72a3e442149d0f9b2efaea53c0c
449,053
ipynb
Jupyter Notebook
07.02-Gaussian-transformation-sklearn.ipynb
sri-spirited/feature-engineering-for-ml
607c376cf92efd0ca9cc0f4f4959f639f793dedc
[ "BSD-3-Clause" ]
null
null
null
07.02-Gaussian-transformation-sklearn.ipynb
sri-spirited/feature-engineering-for-ml
607c376cf92efd0ca9cc0f4f4959f639f793dedc
[ "BSD-3-Clause" ]
null
null
null
07.02-Gaussian-transformation-sklearn.ipynb
sri-spirited/feature-engineering-for-ml
607c376cf92efd0ca9cc0f4f4959f639f793dedc
[ "BSD-3-Clause" ]
null
null
null
487.042299
77,357
0.941145
[ [ [ "## Gaussian Transformation with Scikit-learn\n\nScikit-learn has recently released transformers to do Gaussian mappings as they call the variable transformations. The PowerTransformer allows to do Box-Cox and Yeo-Johnson transformation. With the FunctionTransformer, we can specify any function we want.\n\nThe transformers per se, do not allow to select columns, but we can do so using a third transformer, the ColumnTransformer\n\nAnother thing to keep in mind is that Scikit-learn transformers return NumPy arrays, and not dataframes, so we need to be mindful of the order of the columns not to mess up with our features.\n\n## Important\n\nBox-Cox and Yeo-Johnson transformations need to learn their parameters from the data. Therefore, as always, before attempting any transformation it is important to divide the dataset into train and test set.\n\nIn this demo, I will not do so for simplicity, but when using this transformation in your pipelines, please make sure you do so.\n\n\n## In this demo\n\nWe will see how to implement variable transformations using Scikit-learn and the House Prices dataset.", "_____no_output_____" ] ], [ [ "import pandas as pd\nimport numpy as np\n\nimport matplotlib.pyplot as plt\n\nimport scipy.stats as stats\n\nfrom sklearn.preprocessing import FunctionTransformer, PowerTransformer", "_____no_output_____" ], [ "# load the data\n\ndata = pd.read_csv('../houseprice.csv')\n\ndata.head()", "_____no_output_____" ] ], [ [ "Let's select the numerical and positive variables in the dataset for this demonstration. As most of the transformations require the variables to be positive.", "_____no_output_____" ] ], [ [ "cols = []\n\nfor col in data.columns:\n\n if data[col].dtypes != 'O' and col != 'Id': # if the variable is numerical\n \n if np.sum(np.where(data[col] <= 0, 1, 0)) == 0: # if the variable is positive\n \n cols.append(col) # append variable to the list\n\ncols", "_____no_output_____" ], [ "# let's explore the distribution of the numerical variables\n\ndata[cols].hist(figsize=(20,20))\nplt.show()", "_____no_output_____" ] ], [ [ "## Plots to assess normality\n\nTo visualise the distribution of the variables, we plot a histogram and a Q-Q plot. In the Q-Q pLots, if the variable is normally distributed, the values of the variable should fall in a 45 degree line when plotted against the theoretical quantiles. 
We discussed this extensively in Section 3 of this course.", "_____no_output_____" ] ], [ [ "# plot the histograms to have a quick look at the variable distribution\n# histogram and Q-Q plots\n\ndef diagnostic_plots(df, variable):\n \n # function to plot a histogram and a Q-Q plot\n # side by side, for a certain variable\n \n plt.figure(figsize=(15,6))\n plt.subplot(1, 2, 1)\n df[variable].hist(bins=30)\n\n plt.subplot(1, 2, 2)\n stats.probplot(df[variable], dist=\"norm\", plot=plt)\n\n plt.show()", "_____no_output_____" ] ], [ [ "### Logarithmic transformation", "_____no_output_____" ] ], [ [ "# create a log transformer\n\ntransformer = FunctionTransformer(np.log, validate=True)", "_____no_output_____" ], [ "# transform all the numerical and positive variables\n\ndata_t = transformer.transform(data[cols].fillna(1))", "_____no_output_____" ], [ "# Scikit-learn returns NumPy arrays, so capture in dataframe\n# note that Scikit-learn will return an array with\n# only the columns indicated in cols\n\ndata_t = pd.DataFrame(data_t, columns = cols)", "_____no_output_____" ], [ "# original distribution\n\ndiagnostic_plots(data, 'GrLivArea')", "_____no_output_____" ], [ "# transformed distribution\n\ndiagnostic_plots(data_t, 'GrLivArea')", "_____no_output_____" ], [ "# original distribution\n\ndiagnostic_plots(data, 'MSSubClass')", "_____no_output_____" ], [ "# transformed distribution\n\ndiagnostic_plots(data_t, 'MSSubClass')", "_____no_output_____" ] ], [ [ "### Reciprocal transformation", "_____no_output_____" ] ], [ [ "# create the transformer\ntransformer = FunctionTransformer(lambda x: 1/x, validate=True)\n\n# also\n# transformer = FunctionTransformer(np.reciprocal, validate=True)\n\n# transform the positive variables\ndata_t = transformer.transform(data[cols].fillna(1))\n\n# re-capture in a dataframe\ndata_t = pd.DataFrame(data_t, columns = cols)", "_____no_output_____" ], [ "# transformed variable\n\ndiagnostic_plots(data_t, 'GrLivArea')", "_____no_output_____" ], [ "# transformed variable\n\ndiagnostic_plots(data_t, 'MSSubClass')", "_____no_output_____" ] ], [ [ "### Square root transformation", "_____no_output_____" ] ], [ [ "transformer = FunctionTransformer(lambda x: x**(1/2), validate=True)\n\n# also\n# transformer = FunctionTransformer(np.sqrt, validate=True)\n\ndata_t = transformer.transform(data[cols].fillna(1))\n\ndata_t = pd.DataFrame(data_t, columns = cols)", "_____no_output_____" ], [ "diagnostic_plots(data_t, 'GrLivArea')", "_____no_output_____" ], [ "diagnostic_plots(data_t, 'MSSubClass')", "_____no_output_____" ] ], [ [ "### Exponential", "_____no_output_____" ] ], [ [ "transformer = FunctionTransformer(lambda x: x**(1/1.2), validate=True)\n\ndata_t = transformer.transform(data[cols].fillna(1))\n\ndata_t = pd.DataFrame(data_t, columns = cols)", "_____no_output_____" ], [ "diagnostic_plots(data_t, 'GrLivArea')", "_____no_output_____" ], [ "diagnostic_plots(data_t, 'MSSubClass')", "_____no_output_____" ] ], [ [ "### Box-Cox transformation", "_____no_output_____" ] ], [ [ "# create the transformer\ntransformer = PowerTransformer(method='box-cox', standardize=False)\n\n# find the optimal lambda using the train set\ntransformer.fit(data[cols].fillna(1))\n\n# transform the data\ndata_t = transformer.transform(data[cols].fillna(1))\n\n# capture data in a dataframe\ndata_t = pd.DataFrame(data_t, columns = cols)", "C:\\Users\\Sole\\Documents\\Repositories\\envs\\fe_test\\lib\\site-packages\\numpy\\core\\_methods.py:195: RuntimeWarning: overflow encountered in multiply\n x = 
um.multiply(x, x, out=x)\nC:\\Users\\Sole\\Documents\\Repositories\\envs\\fe_test\\lib\\site-packages\\scipy\\stats\\morestats.py:910: RuntimeWarning: divide by zero encountered in log\n  return (lmb - 1) * np.sum(logdata, axis=0) - N/2 * np.log(variance)\n" ], [ "diagnostic_plots(data_t, 'GrLivArea')", "_____no_output_____" ], [ "diagnostic_plots(data_t, 'MSSubClass')", "_____no_output_____" ] ], [ [ "### Yeo-Johnson\n\nYeo-Johnson is an adaptation of Box-Cox that can also be used on variables with negative values. So let's expand the list of variables for the demo to include those that contain zero and negative values as well. ", "_____no_output_____" ] ], [ [ "cols = [\n    'MSSubClass', 'LotFrontage', 'LotArea', 'OverallQual',\n    'OverallCond', 'MasVnrArea', 'BsmtFinSF1',\n    'BsmtFinSF2', 'BsmtUnfSF', 'TotalBsmtSF', '1stFlrSF', '2ndFlrSF',\n    'LowQualFinSF', 'GrLivArea', 'BsmtFullBath', 'BsmtHalfBath', 'FullBath',\n    'HalfBath', 'BedroomAbvGr', 'KitchenAbvGr', 'TotRmsAbvGrd',\n    'Fireplaces', 'GarageYrBlt', 'GarageCars', 'GarageArea', 'WoodDeckSF',\n    'OpenPorchSF', 'EnclosedPorch', '3SsnPorch', 'ScreenPorch', 'PoolArea',\n    'MiscVal', 'SalePrice'\n]", "_____no_output_____" ], [ "# call the transformer\ntransformer = PowerTransformer(method='yeo-johnson', standardize=False)\n\n# learn the lambda from the train set\ntransformer.fit(data[cols].fillna(1))\n\n# transform the data\ndata_t = transformer.transform(data[cols].fillna(1))\n\n# capture data in a dataframe\ndata_t = pd.DataFrame(data_t, columns = cols)", "_____no_output_____" ], [ "diagnostic_plots(data_t, 'GrLivArea')", "_____no_output_____" ], [ "diagnostic_plots(data_t, 'MSSubClass')", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ] ]
d0288eacfb4079d91fa8e3187893fe6983ec98a0
33,476
ipynb
Jupyter Notebook
EDA/news_stat.ipynb
pcrete/stock_prediction_using_contextual_information
de77fa09bf25b184821093539debb5aa8056a862
[ "Apache-2.0" ]
null
null
null
EDA/news_stat.ipynb
pcrete/stock_prediction_using_contextual_information
de77fa09bf25b184821093539debb5aa8056a862
[ "Apache-2.0" ]
null
null
null
EDA/news_stat.ipynb
pcrete/stock_prediction_using_contextual_information
de77fa09bf25b184821093539debb5aa8056a862
[ "Apache-2.0" ]
null
null
null
34.36961
2,398
0.453638
[ [ [ "from IPython.core.display import display, HTML\ndisplay(HTML(\"<style>.container { width:95% !important; }</style>\"))\n\nfrom jupyterthemes import jtplot\njtplot.style()\n\nfrom plotly.offline import download_plotlyjs, init_notebook_mode, plot, iplot\ninit_notebook_mode(connected=True)\n\nimport os\nimport json\nimport numpy as np\nimport pandas as pd\nfrom tqdm import tqdm_notebook\nfrom datetime import datetime\nfrom matplotlib import pyplot as plt\n%matplotlib inline\n\ntarget_stocks = ['BANPU','IRPC','PTT','BBL','KBANK','SCB','AOT','THAI','CPF','MINT',\n 'TU','SCC','CPN','CK','CPALL','HMPRO','BDMS','BH','ADVANC','JAS','TRUE']", "_____no_output_____" ], [ "with open('../data/kaohoon.json') as json_data:\n data = json.load(json_data)\nlen(data)", "_____no_output_____" ], [ "data[20]", "_____no_output_____" ], [ "entry = []\nfor i, row in tqdm_notebook(enumerate(data)):\n if len(row['Stock Include']) > 1: continue\n for stock in row['Stock Include']:\n if stock in target_stocks:\n entry.append([\n row['Date'],\n stock,\n# row['Stock Include'],\n row['Content']\n ])", "_____no_output_____" ], [ "df = pd.DataFrame.from_records(entry)\ndf[0] = pd.to_datetime(df[0], format='%Y-%m-%d')\ndf.columns = ['Date','Ticker','Text']", "_____no_output_____" ], [ "df.to_csv('../data/kaohoon.csv', index=False)", "_____no_output_____" ], [ "df.head()", "_____no_output_____" ] ], [ [ "## Money Channel", "_____no_output_____" ] ], [ [ "with open('../data/moneychanel.json') as json_data:\n data = json.load(json_data)\nlen(data)", "_____no_output_____" ], [ "data[13]", "_____no_output_____" ], [ "entry = []\nfor i, row in tqdm_notebook(enumerate(data)):\n if len(row['Stock Include']) > 1: continue\n for stock in row['Stock Include']:\n if stock in target_stocks:\n entry.append([\n row['Date'],\n stock,\n row['Content']\n ])", "_____no_output_____" ], [ "df = pd.DataFrame.from_records(entry)\ndf[0] = pd.to_datetime(df[0], format='%Y-%m-%d')\ndf.columns = ['Date','Ticker','Text']\ndf.head()", "_____no_output_____" ], [ "df.to_csv('../data/moneychanel.csv', index=False)", "_____no_output_____" ] ], [ [ "## Pantip", "_____no_output_____" ] ], [ [ "with open('../data/pantip.json') as json_data:\n data = json.load(json_data)\nlen(data)", "_____no_output_____" ], [ "data[3]", "_____no_output_____" ], [ "data[3]['date']\ndata[3]['stock']\ntext = data[3]['head']+' '+data[3]['content']\ntext", "_____no_output_____" ], [ "for x in data[3]['comments']:\n text += x['message']\ntext", "_____no_output_____" ], [ "entry = []\nfor i, row in tqdm_notebook(enumerate(data)):\n if len(row['stock']) > 1: continue\n for stock in row['stock']:\n if stock in target_stocks:\n \n text = row['head']+' '+row['content']\n for comment in row['comments']:\n text += comment['message']\n \n entry.append([\n row['date'],\n stock,\n text\n ])", "_____no_output_____" ], [ "df = pd.DataFrame.from_records(entry)\ndf[0] = pd.to_datetime(df[0], format='%Y-%m-%d')\ndf.columns = ['Date','Ticker','Text']\ndf.head()", "_____no_output_____" ], [ "df.to_csv('../data/pantip.csv', index=False)", "_____no_output_____" ] ], [ [ "## Twitter", "_____no_output_____" ] ], [ [ "with open('../data/twitter.json') as json_data:\n data = json.load(json_data)\nlen(data)", "_____no_output_____" ], [ "data[0]", "_____no_output_____" ], [ "entry = []\nfor i, row in tqdm_notebook(enumerate(data)):\n if len(row['Stock Include']) > 1: continue\n for stock in row['Stock Include']:\n if stock in target_stocks:\n entry.append([\n row['date'],\n stock,\n row['text']\n 
])", "_____no_output_____" ], [ "df = pd.DataFrame.from_records(entry)\ndf[0] = pd.to_datetime(df[0], format='%Y-%m-%d')\ndf.columns = ['Date','Ticker','Text']\ndf.head()", "_____no_output_____" ], [ "df.to_csv('../data/twitter.csv', index=False)", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d028905b3c3af922b2af68abdecdef151ddd49a5
10,880
ipynb
Jupyter Notebook
content/lessons/05/Class-Coding-Lab/CCL-Iterations.ipynb
MahopacHS/spring2019-rizzenM
2860b3338c8f452e45aac04f04388a417b2cf506
[ "MIT" ]
null
null
null
content/lessons/05/Class-Coding-Lab/CCL-Iterations.ipynb
MahopacHS/spring2019-rizzenM
2860b3338c8f452e45aac04f04388a417b2cf506
[ "MIT" ]
null
null
null
content/lessons/05/Class-Coding-Lab/CCL-Iterations.ipynb
MahopacHS/spring2019-rizzenM
2860b3338c8f452e45aac04f04388a417b2cf506
[ "MIT" ]
null
null
null
52.307692
1,199
0.583732
[ [ [ "# In-Class Coding Lab: Iterations\n\nThe goals of this lab are to help you to understand:\n\n- How loops work.\n- The difference between definite and indefinite loops, and when to use each.\n- How to build an indefinite loop with complex exit conditions.\n- How to create a program from a complex idea.\n", "_____no_output_____" ], [ "# Understanding Iterations\n\nIterations permit us to repeat code until a Boolean expression is `False`. Iterations or **loops** allow us to write succint, compact code. Here's an example, which counts to 3 before [Blitzing the Quarterback in backyard American Football](https://www.quora.com/What-is-the-significance-of-counting-one-Mississippi-two-Mississippi-and-so-on):", "_____no_output_____" ] ], [ [ "i = 1\nwhile i <= 3:\n print(i,\"Mississippi...\")\n i=i+1\nprint(\"Blitz!\")", "1 Mississippi...\n2 Mississippi...\n3 Mississippi...\nBlitz!\n" ] ], [ [ "## Breaking it down...\n\nThe `while` statement on line 2 starts the loop. The code indented beneath it (lines 3-4) will repeat, in a linear fashion until the Boolean expression on line 2 `i <= 3` is `False`, at which time the program continues with line 5.\n\n### Some Terminology\n\nWe call `i <=3` the loop's **exit condition**. The variable `i` inside the exit condition is the only thing that we can change to make the exit condition `False`, therefore it is the **loop control variable**. On line 4 we change the loop control variable by adding one to it, this is called an **increment**.\n\nFurthermore, we know how many times this loop will execute before it actually runs: 3. Even if we allowed the user to enter a number, and looped that many times, we would still know. We call this a **definite loop**. Whenever we iterate over a fixed number of values, regardless of whether those values are determined at run-time or not, we're using a definite loop.\n\nIf the loop control variable never forces the exit condition to be `False`, we have an **infinite loop**. As the name implies, an Infinite loop never ends and typically causes our computer to crash or lock up. ", "_____no_output_____" ] ], [ [ "## WARNING!!! INFINITE LOOP AHEAD\n## IF YOU RUN THIS CODE YOU WILL NEED TO KILL YOUR BROWSER AND SHUT DOWN JUPYTER NOTEBOOK\n\ni = 1\nwhile i <= 3:\n print(i,\"Mississippi...\")\n# i=i+1\nprint(\"Blitz!\")", "_____no_output_____" ] ], [ [ "### For loops\n\nTo prevent an infinite loop when the loop is definite, we use the `for` statement. Here's the same program using `for`:", "_____no_output_____" ] ], [ [ "for i in range(1,4):\n print(i,\"Mississippi...\")\nprint(\"Blitz!\")", "1 Mississippi...\n2 Mississippi...\n3 Mississippi...\nBlitz!\n" ] ], [ [ "One confusing aspect of this loop is `range(1,4)` why does this loop from 1 to 3? Why not 1 to 4? Well it has to do with the fact that computers start counting at zero. The easier way to understand it is if you subtract the two numbers you get the number of times it will loop. So for example, 4-1 == 3.\n\n### Now Try It\n\nIn the space below, Re-Write the above program to count from 10 to 15. Note: How many times will that loop?", "_____no_output_____" ] ], [ [ "# TODO Write code here\n", "10 Mississippi...\n11 Mississippi...\n12 Mississippi...\n13 Mississippi...\n14 Mississippi...\n15 Mississippi...\nBlitz!\n" ] ], [ [ "## Indefinite loops\n\nWith **indefinite loops** we do not know how many times the program will execute. This is typically based on user action, and therefore our loop is subject to the whims of whoever interacts with it. 
Most applications like spreadsheets, photo editors, and games use indefinite loops. They'll run on your computer, seemingly forever, until you choose to quit the application. \n\nThe classic indefinite loop pattern involves getting input from the user inside the loop. We then inspect the input and based on that input we might exit the loop. Here's an example:", "_____no_output_____" ] ], [ [ "name = \"\"\nwhile name != 'mike':\n    name = input(\"Say my name! : \")\n    print(\"Nope, my name is not %s! \" %(name))", "Say my name! : rizzen\nNope, my name is not rizzen! \nSay my name! : mike\nNope, my name is not mike! \n" ] ], [ [ "The classic problem with indefinite loops is that it's really difficult to get the application's logic to line up with the exit condition. For example we need to set `name = \"\"` in line 1 so that line 2 starts out as `True`. Also we have this wonky logic where when we say `'mike'` it still prints `Nope, my name is not mike!` before exiting.\n\n### Break statement\n\nThe solution to this problem is to use the break statement. **break** tells Python to exit the loop immediately. We then re-structure all of our indefinite loops to look like this:\n\n```\nwhile True:\n    if exit-condition:\n        break\n```\n\nHere's our program re-written with the break statement. This is the recommended way to write indefinite loops in this course.", "_____no_output_____" ] ], [ [ "while True:\n    name = input(\"Say my name!: \")\n    if name == 'mike':\n        break\n    print(\"Nope, my name is not %s!\" %(name))", "Say my name!: bill\nNope, my name is not bill!\nSay my name!: dave\nNope, my name is not dave!\nSay my name!: mike\n" ] ], [ [ "### Multiple exit conditions\n\nThis indefinite loop pattern makes it easy to add additional exit conditions. For example, here's the program again, but it now stops when you say my name or type in 3 wrong names. Make sure to run this program a couple of times. First enter mike to exit the program, next enter the wrong name 3 times.", "_____no_output_____" ] ], [ [ "times = 0\nwhile True:\n    name = input(\"Say my name!: \")\n    times = times + 1\n    if name == 'mike':\n        print(\"You got it!\")\n        break\n    if times == 3:\n        print(\"Game over. Too many tries!\")\n        break\n    print(\"Nope, my name is not %s!\" %(name))", "Say my name!: mike\nYou got it!\n" ] ], [ [ "# Number sums\n\nLet's conclude the lab with you writing your own program which\n\nuses an indefinite loop. We'll provide the to-do list, you write the code. This program should ask for floating point numbers as input and stop looping when **the total of the numbers entered is over 100**, or **more than 5 numbers have been entered**. Those are your two exit conditions. After the loop stops print out the total of the numbers entered and the count of numbers entered. ", "_____no_output_____" ] ], [ [ "## TO-DO List\n\n#1 count = 0\n#2 total = 0\n#3 loop Indefinitely\n#4. input a number\n#5 increment count\n#6 add number to total\n#7 if count equals 5 stop looping\n#8 if total greater than 100 stop looping\n#9 print total and count", "_____no_output_____" ], [ "# Write Code here:\ncount = 0\ntotal = 0\nwhile True:\n    number = float(input(\"enter a number:\"))\n    count = count + 1\n    total = total + number\n    if count == 5:\n        break\n    if total > 100:\n        break\nprint(\"total:\", total, \"count:\", count)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d028a007b4fd192febb7bfbba4e944a70b272b83
1,919
ipynb
Jupyter Notebook
notebooks/data_processing_numeric.ipynb
hirogen317/chamomile
6a185cc74ae2c54832f9bdc04f77b22d70bc9dc0
[ "Apache-2.0" ]
null
null
null
notebooks/data_processing_numeric.ipynb
hirogen317/chamomile
6a185cc74ae2c54832f9bdc04f77b22d70bc9dc0
[ "Apache-2.0" ]
8
2020-03-31T11:24:43.000Z
2022-03-12T00:24:08.000Z
notebooks/data_processing_numeric.ipynb
hirogen317/chamomile
6a185cc74ae2c54832f9bdc04f77b22d70bc9dc0
[ "Apache-2.0" ]
null
null
null
22.845238
132
0.503908
[ [ [ "# default_exp data_processing.numeric", "_____no_output_____" ] ], [ [ "# data_processing.numeric\n\n> Numeric related data processing\n\n- toc: True", "_____no_output_____" ] ], [ [ "#export\nimport pandas as pd", "_____no_output_____" ], [ "# export\ndef moving_average(data_frame: pd.DataFrame = None,\n window: int=7,\n group_col: str = None,\n value_col: str = None,\n shift=0)->pd.DataFrame:\n df = data_frame.copy()\n ma_col = '{value_col}_ma{window}'.format(value_col=value_col, window=window)\n if group_col is None:\n df[ma_col] = df[value_col].rolling(window=window).mean().shift(periods=shift)\n else:\n df[ma_col] = df.groupby(group_col)[value_col].apply(lambda x: x.rolling(window=window).mean().shift(periods=shift))\n return df[ma_col]", "_____no_output_____" ] ] ]
[ "code", "markdown", "code" ]
[ [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d028a6e9608ebdfd0601eba4f484b95f82d659e6
3,509
ipynb
Jupyter Notebook
debug/pytorch/max_mem_allocated.ipynb
stas00/fastai-misc
e7e8c18ed798f91b2e026c667f795f45992608b8
[ "Apache-2.0" ]
1
2018-06-01T17:39:59.000Z
2018-06-01T17:39:59.000Z
debug/pytorch/max_mem_allocated.ipynb
stas00/fastai-misc
e7e8c18ed798f91b2e026c667f795f45992608b8
[ "Apache-2.0" ]
null
null
null
debug/pytorch/max_mem_allocated.ipynb
stas00/fastai-misc
e7e8c18ed798f91b2e026c667f795f45992608b8
[ "Apache-2.0" ]
null
null
null
27.414063
118
0.54403
[ [ [ "import torch \nimport pynvml\npynvml = pynvml\npynvml.nvmlInit()\nnvml_preload = 0\nnvml_prev = 0\npytorch_prev = 0\ndef nvml_used():\n handle = pynvml.nvmlDeviceGetHandleByIndex(torch.cuda.current_device())\n info = pynvml.nvmlDeviceGetMemoryInfo(handle)\n return b2mb(info.used)\ndef b2mb(x): return int(x/2**20)\ndef consume_gpu_ram(n): return torch.ones((n, n)).cuda()\ndef consume_gpu_ram_64mb(): return consume_gpu_ram(2**12)\ndef consume_gpu_ram_256mb(): return consume_gpu_ram(2**13)\ndef mem(): \n global nvml_preload, nvml_prev, pytorch_preload, pytorch_prev\n nvml_this = nvml_used()\n nvml_delta_cached = nvml_this - nvml_preload\n nvml_delta_used = nvml_this - nvml_prev\n nvml_prev = nvml_this\n \n pytorch_this = torch.cuda.memory_allocated()\n pytorch_delta_used = pytorch_this - pytorch_prev\n pytorch_prev = pytorch_this\n \n print(f\" nvml used: {nvml_delta_used:4d}, allocated: {nvml_delta_cached:4d}\")\n print(f\"pytorch used: {b2mb(pytorch_delta_used):4d}, allocated: {b2mb(torch.cuda.memory_cached()):4d}\\n\")", "_____no_output_____" ], [ "print(\"preloading:\")\nmem()\n_ = torch.ones((1, 1)).cuda()\nmem()\npreload = nvml_used()\npytorch_preload = torch.cuda.memory_allocated()\n\nprint(\"\\nrunning:\")\nx1 = consume_gpu_ram_64mb()\nmem()\nx2 = consume_gpu_ram_256mb()\nmem()\ndel x2\nmem()\nx3 = consume_gpu_ram_64mb()\nmem()", "preloading:\n nvml used: 4779, allocated: 4779\npytorch used: 0, allocated: 0\n\n nvml used: 495, allocated: 5274\npytorch used: 0, allocated: 1\n\n\nrunning:\n nvml used: 64, allocated: 5338\npytorch used: 64, allocated: 65\n\n nvml used: 256, allocated: 5594\npytorch used: 256, allocated: 321\n\n nvml used: 0, allocated: 5594\npytorch used: -256, allocated: 321\n\n nvml used: 0, allocated: 5594\npytorch used: 64, allocated: 321\n\n" ] ] ]
[ "code" ]
[ [ "code", "code" ] ]
d028af63c6518bd29539551388ee5ca3c12722c1
160,721
ipynb
Jupyter Notebook
chapter08.ipynb
fKVzGecnXYhM/nlp100
8ea3b8e0e2904ce4ccdfdbecb11ea1b28596a791
[ "MIT" ]
26
2020-06-08T02:12:42.000Z
2022-02-21T01:41:01.000Z
chapter08.ipynb
fKVzGecnXYhM/nlp100
8ea3b8e0e2904ce4ccdfdbecb11ea1b28596a791
[ "MIT" ]
null
null
null
chapter08.ipynb
fKVzGecnXYhM/nlp100
8ea3b8e0e2904ce4ccdfdbecb11ea1b28596a791
[ "MIT" ]
8
2020-05-23T05:49:50.000Z
2021-11-17T08:43:50.000Z
90.905543
42,722
0.750941
[ [ [ "# 第8章: ニューラルネット\n第6章で取り組んだニュース記事のカテゴリ分類を題材として,ニューラルネットワークでカテゴリ分類モデルを実装する.なお,この章ではPyTorch, TensorFlow, Chainerなどの機械学習プラットフォームを活用せよ.", "_____no_output_____" ], [ "## 70. 単語ベクトルの和による特徴量\n***\n問題50で構築した学習データ,検証データ,評価データを行列・ベクトルに変換したい.例えば,学習データについて,すべての事例$x_i$の特徴ベクトル$\\boldsymbol{x}_i$を並べた行列$X$と正解ラベルを並べた行列(ベクトル)$Y$を作成したい.\n\n$$\nX = \\begin{pmatrix} \n \\boldsymbol{x}_1 \\\\ \n \\boldsymbol{x}_2 \\\\ \n \\dots \\\\ \n \\boldsymbol{x}_n \\\\ \n\\end{pmatrix} \\in \\mathbb{R}^{n \\times d},\nY = \\begin{pmatrix} \n y_1 \\\\ \n y_2 \\\\ \n \\dots \\\\ \n y_n \\\\ \n\\end{pmatrix} \\in \\mathbb{N}^{n}\n$$\n\nここで,$n$は学習データの事例数であり,$\\boldsymbol x_i \\in \\mathbb{R}^d$と$y_i \\in \\mathbb N$はそれぞれ,$i \\in \\{1, \\dots, n\\}$番目の事例の特徴量ベクトルと正解ラベルを表す.\nなお,今回は「ビジネス」「科学技術」「エンターテイメント」「健康」の4カテゴリ分類である.$\\mathbb N_{<4}$で$4$未満の自然数($0$を含む)を表すことにすれば,任意の事例の正解ラベル$y_i$は$y_i \\in \\mathbb N_{<4}$で表現できる.\n以降では,ラベルの種類数を$L$で表す(今回の分類タスクでは$L=4$である).\n\n$i$番目の事例の特徴ベクトル$\\boldsymbol x_i$は,次式で求める.\n\n$$\\boldsymbol x_i = \\frac{1}{T_i} \\sum_{t=1}^{T_i} \\mathrm{emb}(w_{i,t})$$\n\nここで,$i$番目の事例は$T_i$個の(記事見出しの)単語列$(w_{i,1}, w_{i,2}, \\dots, w_{i,T_i})$から構成され,$\\mathrm{emb}(w) \\in \\mathbb{R}^d$は単語$w$に対応する単語ベクトル(次元数は$d$)である.すなわち,$i$番目の事例の記事見出しを,その見出しに含まれる単語のベクトルの平均で表現したものが$\\boldsymbol x_i$である.今回は単語ベクトルとして,問題60でダウンロードしたものを用いればよい.$300$次元の単語ベクトルを用いたので,$d=300$である.\n$i$番目の事例のラベル$y_i$は,次のように定義する.\n\n$$\ny_i = \\begin{cases}\n0 & (\\mbox{記事}\\boldsymbol x_i\\mbox{が「ビジネス」カテゴリの場合}) \\\\\n1 & (\\mbox{記事}\\boldsymbol x_i\\mbox{が「科学技術」カテゴリの場合}) \\\\\n2 & (\\mbox{記事}\\boldsymbol x_i\\mbox{が「エンターテイメント」カテゴリの場合}) \\\\\n3 & (\\mbox{記事}\\boldsymbol x_i\\mbox{が「健康」カテゴリの場合}) \\\\\n\\end{cases}\n$$\n\nなお,カテゴリ名とラベルの番号が一対一で対応付いていれば,上式の通りの対応付けでなくてもよい.\n\n以上の仕様に基づき,以下の行列・ベクトルを作成し,ファイルに保存せよ.\n\n+ 学習データの特徴量行列: $X_{\\rm train} \\in \\mathbb{R}^{N_t \\times d}$\n+ 学習データのラベルベクトル: $Y_{\\rm train} \\in \\mathbb{N}^{N_t}$\n+ 検証データの特徴量行列: $X_{\\rm valid} \\in \\mathbb{R}^{N_v \\times d}$\n+ 検証データのラベルベクトル: $Y_{\\rm valid} \\in \\mathbb{N}^{N_v}$\n+ 評価データの特徴量行列: $X_{\\rm test} \\in \\mathbb{R}^{N_e \\times d}$\n+ 評価データのラベルベクトル: $Y_{\\rm test} \\in \\mathbb{N}^{N_e}$\n\nなお,$N_t, N_v, N_e$はそれぞれ,学習データの事例数,検証データの事例数,評価データの事例数である.\n", "_____no_output_____" ] ], [ [ "!wget https://archive.ics.uci.edu/ml/machine-learning-databases/00359/NewsAggregatorDataset.zip\n!unzip NewsAggregatorDataset.zip", "_____no_output_____" ], [ "!wc -l ./newsCorpora.csv", "422937 ./newsCorpora.csv\n" ], [ "!head -10 ./newsCorpora.csv", "1\tFed official says weak data caused by weather, should not slow taper\thttp://www.latimes.com/business/money/la-fi-mo-federal-reserve-plosser-stimulus-economy-20140310,0,1312750.story\\?track=rss\tLos Angeles Times\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\twww.latimes.com\t1394470370698\n2\tFed's Charles Plosser sees high bar for change in pace of tapering\thttp://www.livemint.com/Politics/H2EvwJSK2VE6OF7iK1g3PP/Feds-Charles-Plosser-sees-high-bar-for-change-in-pace-of-ta.html\tLivemint\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\twww.livemint.com\t1394470371207\n3\tUS open: Stocks fall after Fed official hints at accelerated tapering\thttp://www.ifamagazine.com/news/us-open-stocks-fall-after-fed-official-hints-at-accelerated-tapering-294436\tIFA Magazine\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\twww.ifamagazine.com\t1394470371550\n4\tFed risks falling 'behind the curve', Charles Plosser says\thttp://www.ifamagazine.com/news/fed-risks-falling-behind-the-curve-charles-plosser-says-294430\tIFA 
Magazine\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\twww.ifamagazine.com\t1394470371793\n5\tFed's Plosser: Nasty Weather Has Curbed Job Growth\thttp://www.moneynews.com/Economy/federal-reserve-charles-plosser-weather-job-growth/2014/03/10/id/557011\tMoneynews\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\twww.moneynews.com\t1394470372027\n6\tPlosser: Fed May Have to Accelerate Tapering Pace\thttp://www.nasdaq.com/article/plosser-fed-may-have-to-accelerate-tapering-pace-20140310-00371\tNASDAQ\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\twww.nasdaq.com\t1394470372212\n7\tFed's Plosser: Taper pace may be too slow\thttp://www.marketwatch.com/story/feds-plosser-taper-pace-may-be-too-slow-2014-03-10\\?reflink=MW_news_stmp\tMarketWatch\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\twww.marketwatch.com\t1394470372405\n8\tFed's Plosser expects US unemployment to fall to 6.2% by the end of 2014\thttp://www.fxstreet.com/news/forex-news/article.aspx\\?storyid=23285020-b1b5-47ed-a8c4-96124bb91a39\tFXstreet.com\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\twww.fxstreet.com\t1394470372615\n9\tUS jobs growth last month hit by weather:Fed President Charles Plosser\thttp://economictimes.indiatimes.com/news/international/business/us-jobs-growth-last-month-hit-by-weatherfed-president-charles-plosser/articleshow/31788000.cms\tEconomic Times\tb\tddUyU0VZz0BRneMioxUPQVP6sIxvM\teconomictimes.indiatimes.com\t1394470372792\n10\tECB unlikely to end sterilisation of SMP purchases - traders\thttp://www.iii.co.uk/news-opinion/reuters/news/152615\tInteractive Investor\tb\tdPhGU51DcrolUIMxbRm0InaHGA2XM\twww.iii.co.uk\t1394470501265\n" ], [ "# 読込時のエラー回避のためダブルクォーテーションをシングルクォーテーションに置換\n!sed -e 's/\"/'\\''/g' ./newsCorpora.csv > ./newsCorpora_re.csv", "_____no_output_____" ], [ "import pandas as pd\nfrom sklearn.model_selection import train_test_split\n\n# データの読込\ndf = pd.read_csv('./newsCorpora_re.csv', header=None, sep='\\t', names=['ID', 'TITLE', 'URL', 'PUBLISHER', 'CATEGORY', 'STORY', 'HOSTNAME', 'TIMESTAMP'])\n\n# データの抽出\ndf = df.loc[df['PUBLISHER'].isin(['Reuters', 'Huffington Post', 'Businessweek', 'Contactmusic.com', 'Daily Mail']), ['TITLE', 'CATEGORY']]\n\n# データの分割\ntrain, valid_test = train_test_split(df, test_size=0.2, shuffle=True, random_state=123, stratify=df['CATEGORY'])\nvalid, test = train_test_split(valid_test, test_size=0.5, shuffle=True, random_state=123, stratify=valid_test['CATEGORY'])\n\n# 事例数の確認\nprint('【学習データ】')\nprint(train['CATEGORY'].value_counts())\nprint('【検証データ】')\nprint(valid['CATEGORY'].value_counts())\nprint('【評価データ】')\nprint(test['CATEGORY'].value_counts())", "【学習データ】\nb 4501\ne 4235\nt 1220\nm 728\nName: CATEGORY, dtype: int64\n【検証データ】\nb 563\ne 529\nt 153\nm 91\nName: CATEGORY, dtype: int64\n【評価データ】\nb 563\ne 530\nt 152\nm 91\nName: CATEGORY, dtype: int64\n" ], [ "train.to_csv('drive/My Drive/nlp100/data/train.tsv', index=False, sep='\\t', header=False)\nvalid.to_csv('drive/My Drive/nlp100/data/valid.tsv', index=False, sep='\\t', header=False)\ntest.to_csv('drive/My Drive/nlp100/data/test.tsv', index=False, sep='\\t', header=False)", "_____no_output_____" ], [ "import gdown\nfrom gensim.models import KeyedVectors\n\n# 学習済み単語ベクトルのダウンロード\nurl = \"https://drive.google.com/uc?id=0B7XkCwpI5KDYNlNUTTlSS21pQmM\"\noutput = 'GoogleNews-vectors-negative300.bin.gz'\ngdown.download(url, output, quiet=True)\n \n# ダウンロードファイルのロード\nmodel = KeyedVectors.load_word2vec_format('GoogleNews-vectors-negative300.bin.gz', binary=True)", "_____no_output_____" ], [ "import string\nimport torch\n\ndef transform_w2v(text):\n table = str.maketrans(string.punctuation, ' 
'*len(string.punctuation))\n words = text.translate(table).split() # 記号をスペースに置換後、スペースで分割してリスト化\n vec = [model[word] for word in words if word in model] # 1語ずつベクトル化\n\n return torch.tensor(sum(vec) / len(vec)) # 平均ベクトルをTensor型に変換して出力", "_____no_output_____" ], [ "# 特徴ベクトルの作成\nX_train = torch.stack([transform_w2v(text) for text in train['TITLE']])\nX_valid = torch.stack([transform_w2v(text) for text in valid['TITLE']])\nX_test = torch.stack([transform_w2v(text) for text in test['TITLE']])\n\nprint(X_train.size())\nprint(X_train)", "torch.Size([10684, 300])\ntensor([[ 0.0837, 0.0056, 0.0068, ..., 0.0751, 0.0433, -0.0868],\n [ 0.0272, 0.0266, -0.0947, ..., -0.1046, -0.0489, -0.0092],\n [ 0.0577, -0.0159, -0.0780, ..., -0.0421, 0.1229, 0.0876],\n ...,\n [ 0.0392, -0.0052, 0.0686, ..., -0.0175, 0.0061, -0.0224],\n [ 0.0798, 0.1017, 0.1066, ..., -0.0752, 0.0623, 0.1138],\n [ 0.1664, 0.0451, 0.0508, ..., -0.0531, -0.0183, -0.0039]])\n" ], [ "# ラベルベクトルの作成\ncategory_dict = {'b': 0, 't': 1, 'e':2, 'm':3}\ny_train = torch.LongTensor(train['CATEGORY'].map(lambda x: category_dict[x]).values)\ny_valid = torch.LongTensor(valid['CATEGORY'].map(lambda x: category_dict[x]).values)\ny_test = torch.LongTensor(test['CATEGORY'].map(lambda x: category_dict[x]).values)\n\nprint(y_train.size())\nprint(y_train)", "torch.Size([10684])\ntensor([0, 1, 3, ..., 0, 3, 2])\n" ], [ "# 保存\ntorch.save(X_train, 'X_train.pt')\ntorch.save(X_valid, 'X_valid.pt')\ntorch.save(X_test, 'X_test.pt')\ntorch.save(y_train, 'y_train.pt')\ntorch.save(y_valid, 'y_valid.pt')\ntorch.save(y_test, 'y_test.pt')", "_____no_output_____" ] ], [ [ "## 71. 単層ニューラルネットワークによる予測\n***\n問題70で保存した行列を読み込み,学習データについて以下の計算を実行せよ.\n\n$$\n\\hat{y}_1=softmax(x_1W),\\\\\\hat{Y}=softmax(X_{[1:4]}W)\n$$\n\nただし,$softmax$はソフトマックス関数,$X_{[1:4]}∈\\mathbb{R}^{4×d}$は特徴ベクトル$x_1$,$x_2$,$x_3$,$x_4$を縦に並べた行列である.\n\n$$\nX_{[1:4]}=\\begin{pmatrix}x_1\\\\x_2\\\\x_3\\\\x_4\\end{pmatrix}\n$$\n\n行列$W \\in \\mathbb{R}^{d \\times L}$は単層ニューラルネットワークの重み行列で,ここではランダムな値で初期化すればよい(問題73以降で学習して求める).なお,$\\hat{\\boldsymbol y_1} \\in \\mathbb{R}^L$は未学習の行列$W$で事例$x_1$を分類したときに,各カテゴリに属する確率を表すベクトルである.\n同様に,$\\hat{Y} \\in \\mathbb{R}^{n \\times L}$は,学習データの事例$x_1, x_2, x_3, x_4$について,各カテゴリに属する確率を行列として表現している.\n\n", "_____no_output_____" ] ], [ [ "from torch import nn\ntorch.manual_seed(0)\n\nclass SLPNet(nn.Module):\n def __init__(self, input_size, output_size):\n super().__init__()\n self.fc = nn.Linear(input_size, output_size, bias=False) # Linear(入力次元数, 出力次元数)\n nn.init.normal_(self.fc.weight, 0.0, 1.0) # 正規乱数で重みを初期化\n\n def forward(self, x):\n x = self.fc(x)\n return x", "_____no_output_____" ], [ "model = SLPNet(300, 4)\ny_hat_1 = torch.softmax(model.forward(X_train[:1]), dim=-1)\nprint(y_hat_1)", "tensor([[0.4273, 0.0958, 0.2492, 0.2277]], grad_fn=<SoftmaxBackward>)\n" ], [ "Y_hat = torch.softmax(model.forward(X_train[:4]), dim=-1)\nprint(Y_hat)", "tensor([[0.4273, 0.0958, 0.2492, 0.2277],\n [0.2445, 0.2431, 0.0197, 0.4927],\n [0.7853, 0.1132, 0.0291, 0.0724],\n [0.5279, 0.2319, 0.0873, 0.1529]], grad_fn=<SoftmaxBackward>)\n" ] ], [ [ "## 72. 
損失と勾配の計算\n***\n学習データの事例$x_1$と事例集合$x_1$,$x_2$,$x_3$,$x_4$に対して,クロスエントロピー損失と,行列$W$に対する勾配を計算せよ.なお,ある事例$x_i$に対して損失は次式で計算される.\n\n$$l_i=−log[事例x_iがy_iに分類される確率]$$\n\nただし,事例集合に対するクロスエントロピー損失は,その集合に含まれる各事例の損失の平均とする.", "_____no_output_____" ] ], [ [ "criterion = nn.CrossEntropyLoss()", "_____no_output_____" ], [ "l_1 = criterion(model.forward(X_train[:1]), y_train[:1]) # 入力ベクトルはsoftmax前の値\nmodel.zero_grad() # 勾配をゼロで初期化\nl_1.backward() # 勾配を計算\nprint(f'損失: {l_1:.4f}')\nprint(f'勾配:\\n{model.fc.weight.grad}')", "損失: 0.8503\n勾配:\ntensor([[-0.0479, -0.0032, -0.0039, ..., -0.0430, -0.0248, 0.0497],\n [ 0.0080, 0.0005, 0.0007, ..., 0.0072, 0.0041, -0.0083],\n [ 0.0208, 0.0014, 0.0017, ..., 0.0187, 0.0108, -0.0216],\n [ 0.0190, 0.0013, 0.0016, ..., 0.0171, 0.0099, -0.0198]])\n" ], [ "l = criterion(model.forward(X_train[:4]), y_train[:4])\nmodel.zero_grad()\nl.backward()\nprint(f'損失: {l:.4f}')\nprint(f'勾配:\\n{model.fc.weight.grad}')", "損失: 1.8321\n勾配:\ntensor([[-0.0063, 0.0042, -0.0139, ..., -0.0272, 0.0201, 0.0263],\n [-0.0047, -0.0025, 0.0195, ..., 0.0196, 0.0160, 0.0009],\n [ 0.0184, -0.0110, -0.0148, ..., 0.0070, -0.0055, -0.0001],\n [-0.0074, 0.0092, 0.0092, ..., 0.0006, -0.0306, -0.0272]])\n" ] ], [ [ "## 73. 確率的勾配降下法による学習\n***\n確率的勾配降下法(SGD: Stochastic Gradient Descent)を用いて,行列$W$を学習せよ.なお,学習は適当な基準で終了させればよい(例えば「100エポックで終了」など).", "_____no_output_____" ] ], [ [ "from torch.utils.data import Dataset\n\nclass CreateDataset(Dataset):\n def __init__(self, X, y): # datasetの構成要素を指定\n self.X = X\n self.y = y\n\n def __len__(self): # len(dataset)で返す値を指定\n return len(self.y)\n\n def __getitem__(self, idx): # dataset[idx]で返す値を指定\n if isinstance(idx, torch.Tensor):\n idx = idx.tolist()\n return [self.X[idx], self.y[idx]]", "_____no_output_____" ], [ "from torch.utils.data import DataLoader\n\ndataset_train = CreateDataset(X_train, y_train)\ndataset_valid = CreateDataset(X_valid, y_valid)\ndataset_test = CreateDataset(X_test, y_test)\ndataloader_train = DataLoader(dataset_train, batch_size=1, shuffle=True)\ndataloader_valid = DataLoader(dataset_valid, batch_size=len(dataset_valid), shuffle=False)\ndataloader_test = DataLoader(dataset_test, batch_size=len(dataset_test), shuffle=False)\n\nprint(len(dataset_train))\nprint(next(iter(dataloader_train)))", "10684\n[tensor([[-2.2791e-02, -1.6650e-02, 1.2573e-02, 1.1694e-02, -2.0509e-02,\n -4.1607e-02, -1.1132e-01, -4.7964e-02, 7.6147e-02, 9.4415e-02,\n -3.7549e-02, 7.2437e-02, -3.8168e-02, 7.9443e-02, -6.0207e-02,\n -5.1074e-02, -1.8954e-02, 7.6978e-02, 8.1055e-02, -8.3789e-02,\n -1.3208e-02, 2.0891e-01, 9.6887e-02, -2.3356e-02, -7.3456e-02,\n 5.9668e-02, -4.8009e-02, 8.0090e-02, 2.8123e-02, -1.6791e-02,\n 2.0227e-02, -9.6387e-02, 1.6510e-02, -1.6281e-02, -4.0601e-02,\n -8.2489e-02, 9.8975e-02, 1.4099e-03, 1.4362e-02, 3.9368e-02,\n 7.6392e-02, -1.3135e-01, 1.3572e-01, -1.4496e-03, -8.1097e-02,\n -6.5753e-02, -9.6622e-02, 2.0679e-02, 4.8145e-02, 5.0012e-02,\n 7.2842e-02, 4.8761e-02, 4.9164e-02, 1.1853e-01, 2.7307e-02,\n -6.8723e-02, 4.0675e-02, -2.6984e-02, -1.6510e-02, -1.6882e-01,\n 5.8417e-02, -2.1912e-02, -4.8096e-02, -9.4360e-02, -6.9186e-02,\n -1.2361e-02, -7.6489e-02, 5.1843e-02, 1.5080e-01, 5.7861e-03,\n -6.3660e-02, -9.0894e-02, 1.1075e-01, 3.5229e-02, -1.0220e-01,\n -2.4133e-02, 1.8951e-02, 1.0651e-01, 1.5167e-02, 8.7891e-03,\n -5.8649e-02, -4.8902e-02, -1.7447e-02, -5.0873e-03, 2.0083e-02,\n -3.3643e-02, -1.1077e-01, 9.6948e-02, -4.5068e-02, -4.8102e-02,\n 1.9116e-02, -1.7224e-02, -1.0402e-01, -4.5465e-02, -3.8379e-02,\n -1.8384e-02, 6.0464e-02, -1.7932e-02, 
8.9215e-02, -6.3168e-02,\n 1.6414e-02, -4.4244e-02, 6.6852e-02, 8.0658e-03, -7.7148e-02,\n -1.0146e-01, -1.1623e-01, 1.1017e-02, -2.3859e-02, -5.6921e-02,\n -8.0200e-03, 3.2812e-02, -2.6733e-02, -6.9550e-03, 8.5193e-02,\n -1.1182e-02, -1.3623e-02, -4.4067e-02, 1.0166e-01, 9.5972e-02,\n 1.8344e-02, 5.8070e-02, 4.4479e-04, 5.7736e-02, 8.2104e-02,\n -4.7461e-02, 1.7114e-02, -2.7600e-02, -8.2092e-03, 6.6895e-02,\n 3.8300e-02, -1.7280e-01, 1.5320e-02, -9.0527e-02, -5.0513e-02,\n -9.0625e-02, -3.4372e-02, 4.9023e-02, -1.0402e-01, 5.3085e-02,\n 1.6299e-01, -2.0200e-01, 6.9128e-02, -3.4766e-02, -1.2520e-01,\n 4.7406e-02, 3.7939e-02, -2.7258e-02, 4.7699e-03, -4.1489e-02,\n 1.5836e-01, -3.4470e-02, 7.9187e-02, 7.0186e-02, -9.9365e-03,\n 5.1636e-03, -7.1176e-02, -6.1713e-02, -1.4331e-02, -7.9578e-02,\n 2.8979e-02, 4.8320e-02, 6.1710e-02, 6.6895e-03, 3.5571e-02,\n -2.9993e-02, 1.0642e-01, 7.7972e-03, 3.1024e-02, -3.0566e-02,\n -1.3335e-01, 4.5648e-02, 2.7258e-02, -1.3228e-01, -3.9993e-03,\n -4.0796e-02, -1.0620e-02, 2.3071e-02, 1.1028e-01, -7.1956e-02,\n -1.2183e-01, -1.0478e-01, -5.9741e-02, -1.5576e-02, -5.0122e-02,\n -1.0491e-01, 6.6578e-02, 8.7109e-02, 7.7832e-02, 7.1664e-02,\n -1.4240e-02, 1.6156e-02, 9.9219e-02, 3.7555e-02, -1.4541e-01,\n 7.7380e-02, -7.4424e-02, -1.5622e-02, -7.6538e-02, -1.1675e-01,\n -7.0117e-02, -2.4982e-02, -2.1033e-02, -1.0903e-01, -7.1951e-02,\n -7.8564e-02, -3.1067e-02, -1.0606e-02, 9.1476e-02, -1.1772e-01,\n -7.9239e-02, 1.0401e-01, -5.2237e-02, 5.0278e-02, -9.8535e-02,\n 1.4659e-02, 5.7626e-02, -3.9384e-02, -2.1252e-01, 4.3774e-02,\n 3.9404e-02, 9.2145e-02, -9.3225e-02, -4.1455e-02, -1.4404e-02,\n 5.6091e-03, 8.7646e-03, -3.4631e-02, -1.5869e-03, 3.8266e-02,\n -7.5806e-03, 2.2644e-02, 9.0625e-02, -8.6884e-03, -2.5506e-02,\n 1.8097e-03, 1.1675e-01, -5.1315e-02, 4.1077e-02, -5.5695e-03,\n 2.9001e-02, 3.6060e-02, -8.0099e-02, -8.0518e-02, -9.3018e-03,\n -3.3798e-02, 1.8066e-02, 9.7656e-03, 5.8420e-02, -1.0171e-01,\n -5.9680e-02, 7.2296e-02, -4.7632e-02, 4.5984e-02, -6.7035e-02,\n -5.9448e-02, 6.5326e-02, 9.1699e-02, 3.5828e-02, -3.7921e-02,\n -1.2726e-03, 4.9103e-02, 2.4626e-02, 8.0011e-02, -1.2207e-05,\n 1.6797e-01, -6.0141e-02, -1.0533e-01, 1.0718e-02, 4.2593e-02,\n -3.4094e-02, 8.3630e-02, 3.6023e-02, -4.1527e-02, 4.7495e-02,\n 2.9892e-03, -9.2068e-03, 2.6147e-02, -3.7276e-02, 8.0615e-02,\n -8.7317e-02, -1.7491e-02, 2.1057e-02, 1.0767e-01, 4.3916e-02,\n -3.1616e-03, -1.0800e-01, 2.4817e-02, 1.6101e-02, 5.0292e-02,\n -8.6255e-02, -5.5252e-03, -4.6820e-02, 5.6238e-02, 8.2507e-02,\n 5.3406e-03, 8.2825e-03, 2.1946e-02, 2.3987e-03, -4.1766e-02]]), tensor([0])]\n" ], [ "# モデルの定義\nmodel = SLPNet(300, 4)\n\n# 損失関数の定義\ncriterion = nn.CrossEntropyLoss()\n\n# オプティマイザの定義\noptimizer = torch.optim.SGD(model.parameters(), lr=1e-1)\n\n# 学習\nnum_epochs = 10\nfor epoch in range(num_epochs):\n # 訓練モードに設定\n model.train()\n loss_train = 0.0\n for i, (inputs, labels) in enumerate(dataloader_train):\n # 勾配をゼロで初期化\n optimizer.zero_grad()\n\n # 順伝播 + 誤差逆伝播 + 重み更新\n outputs = model.forward(inputs)\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n\n # 損失を記録\n loss_train += loss.item()\n \n # バッチ単位の平均損失計算\n loss_train = loss_train / i\n\n # 検証データの損失計算\n model.eval() \n with torch.no_grad():\n inputs, labels = next(iter(dataloader_valid))\n outputs = model.forward(inputs)\n loss_valid = criterion(outputs, labels)\n\n # ログを出力\n print(f'epoch: {epoch + 1}, loss_train: {loss_train:.4f}, loss_valid: {loss_valid:.4f}') ", "epoch: 1, loss_train: 0.4686, loss_valid: 
0.3738\nepoch: 2, loss_train: 0.3159, loss_valid: 0.3349\nepoch: 3, loss_train: 0.2846, loss_valid: 0.3248\nepoch: 4, loss_train: 0.2689, loss_valid: 0.3194\nepoch: 5, loss_train: 0.2580, loss_valid: 0.3094\nepoch: 6, loss_train: 0.2503, loss_valid: 0.3089\nepoch: 7, loss_train: 0.2437, loss_valid: 0.3068\nepoch: 8, loss_train: 0.2401, loss_valid: 0.3083\nepoch: 9, loss_train: 0.2358, loss_valid: 0.3077\nepoch: 10, loss_train: 0.2338, loss_valid: 0.3052\n" ] ], [ [ "## 74. 正解率の計測\n***\n問題73で求めた行列を用いて学習データおよび評価データの事例を分類したとき,その正解率をそれぞれ求めよ.", "_____no_output_____" ] ], [ [ "def calculate_accuracy(model, X, y):\n model.eval()\n with torch.no_grad():\n outputs = model(X)\n pred = torch.argmax(outputs, dim=-1)\n\n return (pred == y).sum().item() / len(y)", "_____no_output_____" ], [ "# 正解率の確認\nacc_train = calculate_accuracy(model, X_train, y_train)\nacc_test = calculate_accuracy(model, X_test, y_test)\nprint(f'正解率(学習データ):{acc_train:.3f}')\nprint(f'正解率(評価データ):{acc_test:.3f}')", "正解率(学習データ):0.925\n正解率(評価データ):0.902\n" ] ], [ [ "## 75. 損失と正解率のプロット\n***\n問題73のコードを改変し,各エポックのパラメータ更新が完了するたびに,訓練データでの損失,正解率,検証データでの損失,正解率をグラフにプロットし,学習の進捗状況を確認できるようにせよ.", "_____no_output_____" ] ], [ [ "def calculate_loss_and_accuracy(model, criterion, loader):\n model.eval()\n loss = 0.0\n total = 0\n correct = 0\n with torch.no_grad():\n for inputs, labels in loader:\n outputs = model(inputs)\n loss += criterion(outputs, labels).item()\n pred = torch.argmax(outputs, dim=-1)\n total += len(inputs)\n correct += (pred == labels).sum().item()\n \n return loss / len(loader), correct / total", "_____no_output_____" ], [ "# モデルの定義\nmodel = SLPNet(300, 4)\n\n# 損失関数の定義\ncriterion = nn.CrossEntropyLoss()\n\n# オプティマイザの定義\noptimizer = torch.optim.SGD(model.parameters(), lr=1e-1)\n\n# 学習\nnum_epochs = 30\nlog_train = []\nlog_valid = []\nfor epoch in range(num_epochs):\n # 訓練モードに設定\n model.train()\n for i, (inputs, labels) in enumerate(dataloader_train):\n # 勾配をゼロで初期化\n optimizer.zero_grad()\n\n # 順伝播 + 誤差逆伝播 + 重み更新\n outputs = model.forward(inputs)\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n \n # 損失と正解率の算出\n loss_train, acc_train = calculate_loss_and_accuracy(model, criterion, dataloader_train)\n loss_valid, acc_valid = calculate_loss_and_accuracy(model, criterion, dataloader_valid)\n log_train.append([loss_train, acc_train])\n log_valid.append([loss_valid, acc_valid])\n\n # ログを出力\n print(f'epoch: {epoch + 1}, loss_train: {loss_train:.4f}, accuracy_train: {acc_train:.4f}, loss_valid: {loss_valid:.4f}, accuracy_valid: {acc_valid:.4f}') ", "epoch: 1, loss_train: 0.3308, accuracy_train: 0.8853, loss_valid: 0.3681, accuracy_valid: 0.8705\nepoch: 2, loss_train: 0.2887, accuracy_train: 0.8999, loss_valid: 0.3364, accuracy_valid: 0.8802\nepoch: 3, loss_train: 0.2660, accuracy_train: 0.9109, loss_valid: 0.3182, accuracy_valid: 0.8907\nepoch: 4, loss_train: 0.2582, accuracy_train: 0.9138, loss_valid: 0.3143, accuracy_valid: 0.8855\nepoch: 5, loss_train: 0.2447, accuracy_train: 0.9181, loss_valid: 0.3056, accuracy_valid: 0.8952\nepoch: 6, loss_train: 0.2390, accuracy_train: 0.9190, loss_valid: 0.3047, accuracy_valid: 0.8967\nepoch: 7, loss_train: 0.2362, accuracy_train: 0.9205, loss_valid: 0.3079, accuracy_valid: 0.8937\nepoch: 8, loss_train: 0.2350, accuracy_train: 0.9186, loss_valid: 0.3126, accuracy_valid: 0.8922\nepoch: 9, loss_train: 0.2276, accuracy_train: 0.9234, loss_valid: 0.3049, accuracy_valid: 0.8952\nepoch: 10, loss_train: 0.2272, accuracy_train: 0.9218, loss_valid: 0.3094, accuracy_valid: 
0.8967\nepoch: 11, loss_train: 0.2246, accuracy_train: 0.9240, loss_valid: 0.3080, accuracy_valid: 0.8922\nepoch: 12, loss_train: 0.2225, accuracy_train: 0.9238, loss_valid: 0.3104, accuracy_valid: 0.8967\nepoch: 13, loss_train: 0.2194, accuracy_train: 0.9248, loss_valid: 0.3090, accuracy_valid: 0.8945\nepoch: 14, loss_train: 0.2194, accuracy_train: 0.9249, loss_valid: 0.3099, accuracy_valid: 0.8930\nepoch: 15, loss_train: 0.2165, accuracy_train: 0.9275, loss_valid: 0.3092, accuracy_valid: 0.8930\nepoch: 16, loss_train: 0.2162, accuracy_train: 0.9286, loss_valid: 0.3100, accuracy_valid: 0.8922\nepoch: 17, loss_train: 0.2140, accuracy_train: 0.9255, loss_valid: 0.3124, accuracy_valid: 0.8937\nepoch: 18, loss_train: 0.2179, accuracy_train: 0.9257, loss_valid: 0.3151, accuracy_valid: 0.8907\nepoch: 19, loss_train: 0.2146, accuracy_train: 0.9258, loss_valid: 0.3161, accuracy_valid: 0.8967\nepoch: 20, loss_train: 0.2124, accuracy_train: 0.9281, loss_valid: 0.3151, accuracy_valid: 0.8960\nepoch: 21, loss_train: 0.2125, accuracy_train: 0.9277, loss_valid: 0.3180, accuracy_valid: 0.8945\nepoch: 22, loss_train: 0.2136, accuracy_train: 0.9276, loss_valid: 0.3205, accuracy_valid: 0.8892\nepoch: 23, loss_train: 0.2171, accuracy_train: 0.9262, loss_valid: 0.3259, accuracy_valid: 0.8900\nepoch: 24, loss_train: 0.2095, accuracy_train: 0.9296, loss_valid: 0.3198, accuracy_valid: 0.8915\nepoch: 25, loss_train: 0.2092, accuracy_train: 0.9296, loss_valid: 0.3195, accuracy_valid: 0.8915\nepoch: 26, loss_train: 0.2093, accuracy_train: 0.9291, loss_valid: 0.3204, accuracy_valid: 0.8892\nepoch: 27, loss_train: 0.2086, accuracy_train: 0.9297, loss_valid: 0.3226, accuracy_valid: 0.8915\nepoch: 28, loss_train: 0.2090, accuracy_train: 0.9297, loss_valid: 0.3242, accuracy_valid: 0.8922\nepoch: 29, loss_train: 0.2076, accuracy_train: 0.9300, loss_valid: 0.3228, accuracy_valid: 0.8915\nepoch: 30, loss_train: 0.2066, accuracy_train: 0.9303, loss_valid: 0.3248, accuracy_valid: 0.8915\n" ], [ "import numpy as np\nfrom matplotlib import pyplot as plt\n\n# 可視化\nfig, ax = plt.subplots(1, 2, figsize=(15, 5))\nax[0].plot(np.array(log_train).T[0], label='train')\nax[0].plot(np.array(log_valid).T[0], label='valid')\nax[0].set_xlabel('epoch')\nax[0].set_ylabel('loss')\nax[0].legend()\nax[1].plot(np.array(log_train).T[1], label='train')\nax[1].plot(np.array(log_valid).T[1], label='valid')\nax[1].set_xlabel('epoch')\nax[1].set_ylabel('accuracy')\nax[1].legend()\nplt.show()", "_____no_output_____" ] ], [ [ "## 76. 
Checkpoints\n***\nModify the code from problem 75 so that, each time the parameter updates for an epoch finish, a checkpoint (the values of the parameters being learned, such as the weight matrix, and the internal state of the optimization algorithm) is written to a file.", "_____no_output_____" ] ], [ [ "# define the model\nmodel = SLPNet(300, 4)\n\n# define the loss function\ncriterion = nn.CrossEntropyLoss()\n\n# define the optimizer\noptimizer = torch.optim.SGD(model.parameters(), lr=1e-1)\n\n# training\nnum_epochs = 10\nlog_train = []\nlog_valid = []\nfor epoch in range(num_epochs):\n    # set training mode\n    model.train()\n    for inputs, labels in dataloader_train:\n        # zero out the gradients\n        optimizer.zero_grad()\n\n        # forward pass + backpropagation + weight update\n        outputs = model.forward(inputs)\n        loss = criterion(outputs, labels)\n        loss.backward()\n        optimizer.step()\n    \n    # compute loss and accuracy\n    loss_train, acc_train = calculate_loss_and_accuracy(model, criterion, dataloader_train)\n    loss_valid, acc_valid = calculate_loss_and_accuracy(model, criterion, dataloader_valid)\n    log_train.append([loss_train, acc_train])\n    log_valid.append([loss_valid, acc_valid])\n\n    # save a checkpoint\n    torch.save({'epoch': epoch, 'model_state_dict': model.state_dict(), 'optimizer_state_dict': optimizer.state_dict()}, f'checkpoint{epoch + 1}.pt')\n\n    # print the log\n    print(f'epoch: {epoch + 1}, loss_train: {loss_train:.4f}, accuracy_train: {acc_train:.4f}, loss_valid: {loss_valid:.4f}, accuracy_valid: {acc_valid:.4f}') ", "epoch: 1, loss_train: 0.3281, accuracy_train: 0.8886, loss_valid: 0.3622, accuracy_valid: 0.8698\nepoch: 2, loss_train: 0.2928, accuracy_train: 0.9040, loss_valid: 0.3351, accuracy_valid: 0.8832\nepoch: 3, loss_train: 0.2638, accuracy_train: 0.9125, loss_valid: 0.3138, accuracy_valid: 0.8870\nepoch: 4, loss_train: 0.2571, accuracy_train: 0.9131, loss_valid: 0.3097, accuracy_valid: 0.8892\nepoch: 5, loss_train: 0.2450, accuracy_train: 0.9185, loss_valid: 0.3049, accuracy_valid: 0.8915\nepoch: 6, loss_train: 0.2428, accuracy_train: 0.9194, loss_valid: 0.3054, accuracy_valid: 0.8952\nepoch: 7, loss_train: 0.2400, accuracy_train: 0.9220, loss_valid: 0.3083, accuracy_valid: 0.8960\nepoch: 8, loss_train: 0.2306, accuracy_train: 0.9232, loss_valid: 0.3035, accuracy_valid: 0.8967\nepoch: 9, loss_train: 0.2293, accuracy_train: 0.9243, loss_valid: 0.3058, accuracy_valid: 0.8930\nepoch: 10, loss_train: 0.2270, accuracy_train: 0.9254, loss_valid: 0.3054, accuracy_valid: 0.8952\n" ] ], [ [ "## 77. 
ミニバッチ化\n***\n問題76のコードを改変し,$B$事例ごとに損失・勾配を計算し,行列$W$の値を更新せよ(ミニバッチ化).$B$の値を$1,2,4,8,…$と変化させながら,1エポックの学習に要する時間を比較せよ.", "_____no_output_____" ] ], [ [ "import time\n\ndef train_model(dataset_train, dataset_valid, batch_size, model, criterion, optimizer, num_epochs):\n # dataloaderの作成\n dataloader_train = DataLoader(dataset_train, batch_size=batch_size, shuffle=True)\n dataloader_valid = DataLoader(dataset_valid, batch_size=len(dataset_valid), shuffle=False)\n\n # 学習\n log_train = []\n log_valid = []\n for epoch in range(num_epochs):\n # 開始時刻の記録\n s_time = time.time()\n\n # 訓練モードに設定\n model.train()\n for inputs, labels in dataloader_train:\n # 勾配をゼロで初期化\n optimizer.zero_grad()\n\n # 順伝播 + 誤差逆伝播 + 重み更新\n outputs = model.forward(inputs)\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n \n # 損失と正解率の算出\n loss_train, acc_train = calculate_loss_and_accuracy(model, criterion, dataloader_train)\n loss_valid, acc_valid = calculate_loss_and_accuracy(model, criterion, dataloader_valid)\n log_train.append([loss_train, acc_train])\n log_valid.append([loss_valid, acc_valid])\n\n # チェックポイントの保存\n torch.save({'epoch': epoch, 'model_state_dict': model.state_dict(), 'optimizer_state_dict': optimizer.state_dict()}, f'checkpoint{epoch + 1}.pt')\n\n # 終了時刻の記録\n e_time = time.time()\n\n # ログを出力\n print(f'epoch: {epoch + 1}, loss_train: {loss_train:.4f}, accuracy_train: {acc_train:.4f}, loss_valid: {loss_valid:.4f}, accuracy_valid: {acc_valid:.4f}, {(e_time - s_time):.4f}sec') \n\n return {'train': log_train, 'valid': log_valid}", "_____no_output_____" ], [ "# datasetの作成\ndataset_train = CreateDataset(X_train, y_train)\ndataset_valid = CreateDataset(X_valid, y_valid)\n\n# モデルの定義\nmodel = SLPNet(300, 4)\n\n# 損失関数の定義\ncriterion = nn.CrossEntropyLoss()\n\n# オプティマイザの定義\noptimizer = torch.optim.SGD(model.parameters(), lr=1e-1)\n\n# モデルの学習\nfor batch_size in [2 ** i for i in range(11)]:\n print(f'バッチサイズ: {batch_size}')\n log = train_model(dataset_train, dataset_valid, batch_size, model, criterion, optimizer, 1)", "バッチサイズ: 1\nepoch: 1, loss_train: 0.3310, accuracy_train: 0.8858, loss_valid: 0.3579, accuracy_valid: 0.8795, 4.1217sec\nバッチサイズ: 2\nepoch: 1, loss_train: 0.2985, accuracy_train: 0.8967, loss_valid: 0.3289, accuracy_valid: 0.8907, 2.3251sec\nバッチサイズ: 4\nepoch: 1, loss_train: 0.2895, accuracy_train: 0.9000, loss_valid: 0.3226, accuracy_valid: 0.8900, 1.2911sec\nバッチサイズ: 8\nepoch: 1, loss_train: 0.2870, accuracy_train: 0.9003, loss_valid: 0.3213, accuracy_valid: 0.8870, 0.7291sec\nバッチサイズ: 16\nepoch: 1, loss_train: 0.2843, accuracy_train: 0.9027, loss_valid: 0.3189, accuracy_valid: 0.8915, 0.4637sec\nバッチサイズ: 32\nepoch: 1, loss_train: 0.2833, accuracy_train: 0.9029, loss_valid: 0.3182, accuracy_valid: 0.8937, 0.3330sec\nバッチサイズ: 64\nepoch: 1, loss_train: 0.2829, accuracy_train: 0.9028, loss_valid: 0.3180, accuracy_valid: 0.8930, 0.2453sec\nバッチサイズ: 128\nepoch: 1, loss_train: 0.2822, accuracy_train: 0.9029, loss_valid: 0.3179, accuracy_valid: 0.8930, 0.2005sec\nバッチサイズ: 256\nepoch: 1, loss_train: 0.2837, accuracy_train: 0.9028, loss_valid: 0.3178, accuracy_valid: 0.8930, 0.1747sec\nバッチサイズ: 512\nepoch: 1, loss_train: 0.2823, accuracy_train: 0.9028, loss_valid: 0.3178, accuracy_valid: 0.8930, 0.1724sec\nバッチサイズ: 1024\nepoch: 1, loss_train: 0.2869, accuracy_train: 0.9028, loss_valid: 0.3178, accuracy_valid: 0.8930, 0.1432sec\n" ] ], [ [ "## 78. 
GPU上での学習\n***\n問題77のコードを改変し,GPU上で学習を実行せよ.", "_____no_output_____" ] ], [ [ "def calculate_loss_and_accuracy(model, criterion, loader, device):\n model.eval()\n loss = 0.0\n total = 0\n correct = 0\n with torch.no_grad():\n for inputs, labels in loader:\n inputs = inputs.to(device)\n labels = labels.to(device)\n outputs = model(inputs)\n loss += criterion(outputs, labels).item()\n pred = torch.argmax(outputs, dim=-1)\n total += len(inputs)\n correct += (pred == labels).sum().item()\n \n return loss / len(loader), correct / total\n \n\ndef train_model(dataset_train, dataset_valid, batch_size, model, criterion, optimizer, num_epochs, device=None):\n # GPUに送る\n model.to(device)\n\n # dataloaderの作成\n dataloader_train = DataLoader(dataset_train, batch_size=batch_size, shuffle=True)\n dataloader_valid = DataLoader(dataset_valid, batch_size=len(dataset_valid), shuffle=False)\n\n # 学習\n log_train = []\n log_valid = []\n for epoch in range(num_epochs):\n # 開始時刻の記録\n s_time = time.time()\n\n # 訓練モードに設定\n model.train()\n for inputs, labels in dataloader_train:\n # 勾配をゼロで初期化\n optimizer.zero_grad()\n\n # 順伝播 + 誤差逆伝播 + 重み更新\n inputs = inputs.to(device)\n labels = labels.to(device)\n outputs = model.forward(inputs)\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n \n # 損失と正解率の算出\n loss_train, acc_train = calculate_loss_and_accuracy(model, criterion, dataloader_train, device)\n loss_valid, acc_valid = calculate_loss_and_accuracy(model, criterion, dataloader_valid, device)\n log_train.append([loss_train, acc_train])\n log_valid.append([loss_valid, acc_valid])\n\n # チェックポイントの保存\n torch.save({'epoch': epoch, 'model_state_dict': model.state_dict(), 'optimizer_state_dict': optimizer.state_dict()}, f'checkpoint{epoch + 1}.pt')\n\n # 終了時刻の記録\n e_time = time.time()\n\n # ログを出力\n print(f'epoch: {epoch + 1}, loss_train: {loss_train:.4f}, accuracy_train: {acc_train:.4f}, loss_valid: {loss_valid:.4f}, accuracy_valid: {acc_valid:.4f}, {(e_time - s_time):.4f}sec') \n\n return {'train': log_train, 'valid': log_valid}", "_____no_output_____" ], [ "# datasetの作成\ndataset_train = CreateDataset(X_train, y_train)\ndataset_valid = CreateDataset(X_valid, y_valid)\n\n# モデルの定義\nmodel = SLPNet(300, 4)\n\n# 損失関数の定義\ncriterion = nn.CrossEntropyLoss()\n\n# オプティマイザの定義\noptimizer = torch.optim.SGD(model.parameters(), lr=1e-1)\n\n# デバイスの指定\ndevice = torch.device('cuda')\nfor batch_size in [2 ** i for i in range(11)]:\n print(f'バッチサイズ: {batch_size}')\n log = train_model(dataset_train, dataset_valid, batch_size, model, criterion, optimizer, 1, device=device)", "バッチサイズ: 1\nepoch: 1, loss_train: 0.3322, accuracy_train: 0.8842, loss_valid: 0.3676, accuracy_valid: 0.8780, 10.2910sec\nバッチサイズ: 2\nepoch: 1, loss_train: 0.3038, accuracy_train: 0.8983, loss_valid: 0.3469, accuracy_valid: 0.8840, 5.0635sec\nバッチサイズ: 4\nepoch: 1, loss_train: 0.2929, accuracy_train: 0.9013, loss_valid: 0.3390, accuracy_valid: 0.8832, 2.5709sec\nバッチサイズ: 8\nepoch: 1, loss_train: 0.2885, accuracy_train: 0.9024, loss_valid: 0.3352, accuracy_valid: 0.8877, 1.3670sec\nバッチサイズ: 16\nepoch: 1, loss_train: 0.2865, accuracy_train: 0.9038, loss_valid: 0.3334, accuracy_valid: 0.8855, 0.7702sec\nバッチサイズ: 32\nepoch: 1, loss_train: 0.2857, accuracy_train: 0.9039, loss_valid: 0.3329, accuracy_valid: 0.8855, 0.4686sec\nバッチサイズ: 64\nepoch: 1, loss_train: 0.2851, accuracy_train: 0.9041, loss_valid: 0.3327, accuracy_valid: 0.8855, 0.3011sec\nバッチサイズ: 128\nepoch: 1, loss_train: 0.2845, accuracy_train: 0.9041, loss_valid: 0.3325, accuracy_valid: 0.8855, 
0.2226sec\nバッチサイズ: 256\nepoch: 1, loss_train: 0.2850, accuracy_train: 0.9041, loss_valid: 0.3325, accuracy_valid: 0.8855, 0.1862sec\nバッチサイズ: 512\nepoch: 1, loss_train: 0.2849, accuracy_train: 0.9041, loss_valid: 0.3324, accuracy_valid: 0.8855, 0.1551sec\nバッチサイズ: 1024\nepoch: 1, loss_train: 0.2847, accuracy_train: 0.9041, loss_valid: 0.3324, accuracy_valid: 0.8855, 0.1477sec\n" ] ], [ [ "## 79. 多層ニューラルネットワーク\n***\n問題78のコードを改変し,バイアス項の導入や多層化など,ニューラルネットワークの形状を変更しながら,高性能なカテゴリ分類器を構築せよ.", "_____no_output_____" ] ], [ [ "from torch.nn import functional as F\n\nclass MLPNet(nn.Module):\n def __init__(self, input_size, mid_size, output_size, mid_layers):\n super().__init__()\n self.mid_layers = mid_layers\n self.fc = nn.Linear(input_size, mid_size)\n self.fc_mid = nn.Linear(mid_size, mid_size)\n self.fc_out = nn.Linear(mid_size, output_size) \n self.bn = nn.BatchNorm1d(mid_size)\n\n def forward(self, x):\n x = F.relu(self.fc(x))\n for _ in range(self.mid_layers):\n x = F.relu(self.bn(self.fc_mid(x)))\n x = F.relu(self.fc_out(x))\n \n return x", "_____no_output_____" ], [ "from torch import optim\n\ndef calculate_loss_and_accuracy(model, criterion, loader, device):\n model.eval()\n loss = 0.0\n total = 0\n correct = 0\n with torch.no_grad():\n for inputs, labels in loader:\n inputs = inputs.to(device)\n labels = labels.to(device)\n outputs = model(inputs)\n loss += criterion(outputs, labels).item()\n pred = torch.argmax(outputs, dim=-1)\n total += len(inputs)\n correct += (pred == labels).sum().item()\n \n return loss / len(loader), correct / total\n \n\ndef train_model(dataset_train, dataset_valid, batch_size, model, criterion, optimizer, num_epochs, device=None):\n # GPUに送る\n model.to(device)\n\n # dataloaderの作成\n dataloader_train = DataLoader(dataset_train, batch_size=batch_size, shuffle=True)\n dataloader_valid = DataLoader(dataset_valid, batch_size=len(dataset_valid), shuffle=False)\n\n # スケジューラの設定\n scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, num_epochs, eta_min=1e-5, last_epoch=-1)\n\n # 学習\n log_train = []\n log_valid = []\n for epoch in range(num_epochs):\n # 開始時刻の記録\n s_time = time.time()\n\n # 訓練モードに設定\n model.train()\n for inputs, labels in dataloader_train:\n # 勾配をゼロで初期化\n optimizer.zero_grad()\n\n # 順伝播 + 誤差逆伝播 + 重み更新\n inputs = inputs.to(device)\n labels = labels.to(device)\n outputs = model.forward(inputs)\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n \n # 損失と正解率の算出\n loss_train, acc_train = calculate_loss_and_accuracy(model, criterion, dataloader_train, device)\n loss_valid, acc_valid = calculate_loss_and_accuracy(model, criterion, dataloader_valid, device)\n log_train.append([loss_train, acc_train])\n log_valid.append([loss_valid, acc_valid])\n\n # チェックポイントの保存\n torch.save({'epoch': epoch, 'model_state_dict': model.state_dict(), 'optimizer_state_dict': optimizer.state_dict()}, f'checkpoint{epoch + 1}.pt')\n\n # 終了時刻の記録\n e_time = time.time()\n\n # ログを出力\n print(f'epoch: {epoch + 1}, loss_train: {loss_train:.4f}, accuracy_train: {acc_train:.4f}, loss_valid: {loss_valid:.4f}, accuracy_valid: {acc_valid:.4f}, {(e_time - s_time):.4f}sec') \n\n # 検証データの損失が3エポック連続で低下しなかった場合は学習終了\n if epoch > 2 and log_valid[epoch - 3][0] <= log_valid[epoch - 2][0] <= log_valid[epoch - 1][0] <= log_valid[epoch][0]:\n break\n\n # スケジューラを1ステップ進める\n scheduler.step()\n\n return {'train': log_train, 'valid': log_valid}", "_____no_output_____" ], [ "# datasetの作成\ndataset_train = CreateDataset(X_train, y_train)\ndataset_valid = CreateDataset(X_valid, y_valid)\n\n# 
モデルの定義\nmodel = MLPNet(300, 200, 4, 1)\n\n# 損失関数の定義\ncriterion = nn.CrossEntropyLoss()\n\n# オプティマイザの定義\noptimizer = torch.optim.SGD(model.parameters(), lr=1e-3)\n\n# デバイスの指定\ndevice = torch.device('cuda')\nlog = train_model(dataset_train, dataset_valid, 64, model, criterion, optimizer, 1000, device)", "epoch: 1, loss_train: 0.9282, accuracy_train: 0.7431, loss_valid: 0.9309, accuracy_valid: 0.7485, 0.4775sec\nepoch: 2, loss_train: 0.7333, accuracy_train: 0.7686, loss_valid: 0.7376, accuracy_valid: 0.7650, 0.4934sec\nepoch: 3, loss_train: 0.6493, accuracy_train: 0.7748, loss_valid: 0.6565, accuracy_valid: 0.7747, 0.4844sec\nepoch: 4, loss_train: 0.6009, accuracy_train: 0.7845, loss_valid: 0.6096, accuracy_valid: 0.7844, 0.4872sec\nepoch: 5, loss_train: 0.5634, accuracy_train: 0.7930, loss_valid: 0.5751, accuracy_valid: 0.7934, 0.4708sec\nepoch: 6, loss_train: 0.5342, accuracy_train: 0.8095, loss_valid: 0.5474, accuracy_valid: 0.8106, 0.5088sec\nepoch: 7, loss_train: 0.5039, accuracy_train: 0.8236, loss_valid: 0.5186, accuracy_valid: 0.8286, 0.4774sec\nepoch: 8, loss_train: 0.4821, accuracy_train: 0.8366, loss_valid: 0.4980, accuracy_valid: 0.8323, 0.4826sec\nepoch: 9, loss_train: 0.4572, accuracy_train: 0.8505, loss_valid: 0.4740, accuracy_valid: 0.8398, 0.4772sec\nepoch: 10, loss_train: 0.4339, accuracy_train: 0.8586, loss_valid: 0.4523, accuracy_valid: 0.8458, 0.4982sec\nepoch: 11, loss_train: 0.4146, accuracy_train: 0.8689, loss_valid: 0.4339, accuracy_valid: 0.8525, 0.4773sec\nepoch: 12, loss_train: 0.4014, accuracy_train: 0.8747, loss_valid: 0.4217, accuracy_valid: 0.8593, 0.4854sec\nepoch: 13, loss_train: 0.3852, accuracy_train: 0.8808, loss_valid: 0.4067, accuracy_valid: 0.8683, 0.4861sec\nepoch: 14, loss_train: 0.3711, accuracy_train: 0.8853, loss_valid: 0.3938, accuracy_valid: 0.8690, 0.4838sec\nepoch: 15, loss_train: 0.3600, accuracy_train: 0.8868, loss_valid: 0.3840, accuracy_valid: 0.8720, 0.4851sec\nepoch: 16, loss_train: 0.3495, accuracy_train: 0.8892, loss_valid: 0.3745, accuracy_valid: 0.8750, 0.4820sec\nepoch: 17, loss_train: 0.3404, accuracy_train: 0.8944, loss_valid: 0.3664, accuracy_valid: 0.8750, 0.4785sec\nepoch: 18, loss_train: 0.3315, accuracy_train: 0.8971, loss_valid: 0.3582, accuracy_valid: 0.8817, 0.4858sec\nepoch: 19, loss_train: 0.3264, accuracy_train: 0.8965, loss_valid: 0.3536, accuracy_valid: 0.8847, 0.4797sec\nepoch: 20, loss_train: 0.3179, accuracy_train: 0.8992, loss_valid: 0.3467, accuracy_valid: 0.8825, 0.4690sec\nepoch: 21, loss_train: 0.3110, accuracy_train: 0.9008, loss_valid: 0.3404, accuracy_valid: 0.8870, 0.5023sec\nepoch: 22, loss_train: 0.3065, accuracy_train: 0.9037, loss_valid: 0.3368, accuracy_valid: 0.8945, 0.4727sec\nepoch: 23, loss_train: 0.3004, accuracy_train: 0.9036, loss_valid: 0.3326, accuracy_valid: 0.8900, 0.4938sec\nepoch: 24, loss_train: 0.2952, accuracy_train: 0.9049, loss_valid: 0.3282, accuracy_valid: 0.8937, 0.4866sec\nepoch: 25, loss_train: 0.2891, accuracy_train: 0.9057, loss_valid: 0.3238, accuracy_valid: 0.8915, 0.4758sec\nepoch: 26, loss_train: 0.2905, accuracy_train: 0.9065, loss_valid: 0.3258, accuracy_valid: 0.8885, 0.4759sec\nepoch: 27, loss_train: 0.2833, accuracy_train: 0.9073, loss_valid: 0.3196, accuracy_valid: 0.8960, 0.4889sec\nepoch: 28, loss_train: 0.2789, accuracy_train: 0.9086, loss_valid: 0.3165, accuracy_valid: 0.8952, 0.4782sec\nepoch: 29, loss_train: 0.2755, accuracy_train: 0.9113, loss_valid: 0.3139, accuracy_valid: 0.8937, 0.4820sec\nepoch: 30, loss_train: 0.2713, accuracy_train: 0.9117, loss_valid: 
0.3117, accuracy_valid: 0.8937, 0.4950sec\nepoch: 31, loss_train: 0.2682, accuracy_train: 0.9135, loss_valid: 0.3093, accuracy_valid: 0.8945, 0.4862sec\nepoch: 32, loss_train: 0.2643, accuracy_train: 0.9137, loss_valid: 0.3073, accuracy_valid: 0.8967, 0.4850sec\nepoch: 33, loss_train: 0.2618, accuracy_train: 0.9150, loss_valid: 0.3057, accuracy_valid: 0.8967, 0.4811sec\nepoch: 34, loss_train: 0.2597, accuracy_train: 0.9160, loss_valid: 0.3045, accuracy_valid: 0.8967, 0.4927sec\nepoch: 35, loss_train: 0.2581, accuracy_train: 0.9163, loss_valid: 0.3042, accuracy_valid: 0.8990, 0.4921sec\nepoch: 36, loss_train: 0.2539, accuracy_train: 0.9188, loss_valid: 0.3013, accuracy_valid: 0.8997, 0.4833sec\nepoch: 37, loss_train: 0.2511, accuracy_train: 0.9181, loss_valid: 0.2999, accuracy_valid: 0.8990, 0.4838sec\nepoch: 38, loss_train: 0.2482, accuracy_train: 0.9191, loss_valid: 0.2986, accuracy_valid: 0.9012, 0.4892sec\nepoch: 39, loss_train: 0.2457, accuracy_train: 0.9218, loss_valid: 0.2970, accuracy_valid: 0.8982, 0.4948sec\nepoch: 40, loss_train: 0.2440, accuracy_train: 0.9219, loss_valid: 0.2970, accuracy_valid: 0.8997, 0.4939sec\nepoch: 41, loss_train: 0.2413, accuracy_train: 0.9217, loss_valid: 0.2954, accuracy_valid: 0.9019, 0.4853sec\nepoch: 42, loss_train: 0.2401, accuracy_train: 0.9219, loss_valid: 0.2950, accuracy_valid: 0.8997, 0.4851sec\nepoch: 43, loss_train: 0.2378, accuracy_train: 0.9251, loss_valid: 0.2940, accuracy_valid: 0.8997, 0.4834sec\nepoch: 44, loss_train: 0.2345, accuracy_train: 0.9242, loss_valid: 0.2928, accuracy_valid: 0.9004, 0.5006sec\nepoch: 45, loss_train: 0.2334, accuracy_train: 0.9269, loss_valid: 0.2919, accuracy_valid: 0.8990, 0.4869sec\nepoch: 46, loss_train: 0.2309, accuracy_train: 0.9261, loss_valid: 0.2912, accuracy_valid: 0.8997, 0.4804sec\nepoch: 47, loss_train: 0.2295, accuracy_train: 0.9282, loss_valid: 0.2907, accuracy_valid: 0.9027, 0.4832sec\nepoch: 48, loss_train: 0.2270, accuracy_train: 0.9280, loss_valid: 0.2899, accuracy_valid: 0.8997, 0.4867sec\nepoch: 49, loss_train: 0.2250, accuracy_train: 0.9298, loss_valid: 0.2892, accuracy_valid: 0.9034, 0.5102sec\nepoch: 50, loss_train: 0.2238, accuracy_train: 0.9292, loss_valid: 0.2893, accuracy_valid: 0.9034, 0.4798sec\nepoch: 51, loss_train: 0.2220, accuracy_train: 0.9300, loss_valid: 0.2885, accuracy_valid: 0.9034, 0.4805sec\nepoch: 52, loss_train: 0.2197, accuracy_train: 0.9313, loss_valid: 0.2875, accuracy_valid: 0.9042, 0.4750sec\nepoch: 53, loss_train: 0.2186, accuracy_train: 0.9318, loss_valid: 0.2871, accuracy_valid: 0.9034, 0.4846sec\nepoch: 54, loss_train: 0.2163, accuracy_train: 0.9313, loss_valid: 0.2871, accuracy_valid: 0.9012, 0.4905sec\nepoch: 55, loss_train: 0.2152, accuracy_train: 0.9318, loss_valid: 0.2870, accuracy_valid: 0.9042, 0.4972sec\nepoch: 56, loss_train: 0.2125, accuracy_train: 0.9330, loss_valid: 0.2857, accuracy_valid: 0.9019, 0.4787sec\nepoch: 57, loss_train: 0.2116, accuracy_train: 0.9319, loss_valid: 0.2858, accuracy_valid: 0.9042, 0.4794sec\nepoch: 58, loss_train: 0.2104, accuracy_train: 0.9334, loss_valid: 0.2857, accuracy_valid: 0.9034, 0.4769sec\nepoch: 59, loss_train: 0.2094, accuracy_train: 0.9347, loss_valid: 0.2854, accuracy_valid: 0.9004, 0.4834sec\nepoch: 60, loss_train: 0.2063, accuracy_train: 0.9351, loss_valid: 0.2843, accuracy_valid: 0.9012, 0.4869sec\nepoch: 61, loss_train: 0.2060, accuracy_train: 0.9346, loss_valid: 0.2851, accuracy_valid: 0.9027, 0.4856sec\nepoch: 62, loss_train: 0.2033, accuracy_train: 0.9360, loss_valid: 0.2841, accuracy_valid: 0.9027, 
0.4784sec\nepoch: 63, loss_train: 0.2032, accuracy_train: 0.9351, loss_valid: 0.2847, accuracy_valid: 0.9042, 0.4896sec\nepoch: 64, loss_train: 0.2018, accuracy_train: 0.9374, loss_valid: 0.2842, accuracy_valid: 0.9049, 0.4809sec\nepoch: 65, loss_train: 0.1999, accuracy_train: 0.9373, loss_valid: 0.2839, accuracy_valid: 0.9027, 0.4869sec\nepoch: 66, loss_train: 0.1980, accuracy_train: 0.9374, loss_valid: 0.2839, accuracy_valid: 0.9042, 0.4919sec\nepoch: 67, loss_train: 0.1980, accuracy_train: 0.9396, loss_valid: 0.2837, accuracy_valid: 0.9034, 0.4725sec\nepoch: 68, loss_train: 0.1953, accuracy_train: 0.9395, loss_valid: 0.2833, accuracy_valid: 0.9034, 0.4791sec\nepoch: 69, loss_train: 0.1941, accuracy_train: 0.9392, loss_valid: 0.2833, accuracy_valid: 0.9034, 0.4753sec\nepoch: 70, loss_train: 0.1922, accuracy_train: 0.9396, loss_valid: 0.2826, accuracy_valid: 0.9049, 0.4933sec\nepoch: 71, loss_train: 0.1920, accuracy_train: 0.9398, loss_valid: 0.2824, accuracy_valid: 0.9049, 0.4952sec\nepoch: 72, loss_train: 0.1908, accuracy_train: 0.9400, loss_valid: 0.2836, accuracy_valid: 0.9042, 0.4832sec\nepoch: 73, loss_train: 0.1887, accuracy_train: 0.9410, loss_valid: 0.2825, accuracy_valid: 0.9027, 0.4889sec\nepoch: 74, loss_train: 0.1870, accuracy_train: 0.9418, loss_valid: 0.2821, accuracy_valid: 0.9027, 0.4898sec\nepoch: 75, loss_train: 0.1864, accuracy_train: 0.9423, loss_valid: 0.2820, accuracy_valid: 0.9042, 0.4891sec\nepoch: 76, loss_train: 0.1843, accuracy_train: 0.9428, loss_valid: 0.2823, accuracy_valid: 0.9042, 0.4904sec\nepoch: 77, loss_train: 0.1829, accuracy_train: 0.9431, loss_valid: 0.2825, accuracy_valid: 0.9042, 0.4853sec\nepoch: 78, loss_train: 0.1810, accuracy_train: 0.9432, loss_valid: 0.2821, accuracy_valid: 0.9034, 0.4963sec\nepoch: 79, loss_train: 0.1804, accuracy_train: 0.9436, loss_valid: 0.2820, accuracy_valid: 0.9042, 0.4861sec\nepoch: 80, loss_train: 0.1800, accuracy_train: 0.9444, loss_valid: 0.2826, accuracy_valid: 0.9012, 0.4809sec\nepoch: 81, loss_train: 0.1784, accuracy_train: 0.9441, loss_valid: 0.2823, accuracy_valid: 0.9064, 0.4777sec\nepoch: 82, loss_train: 0.1782, accuracy_train: 0.9447, loss_valid: 0.2836, accuracy_valid: 0.9027, 0.4794sec\nepoch: 83, loss_train: 0.1762, accuracy_train: 0.9449, loss_valid: 0.2823, accuracy_valid: 0.9034, 0.4850sec\nepoch: 84, loss_train: 0.1750, accuracy_train: 0.9462, loss_valid: 0.2833, accuracy_valid: 0.9057, 0.5037sec\nepoch: 85, loss_train: 0.1753, accuracy_train: 0.9461, loss_valid: 0.2840, accuracy_valid: 0.9064, 0.4919sec\nepoch: 86, loss_train: 0.1723, accuracy_train: 0.9469, loss_valid: 0.2823, accuracy_valid: 0.9042, 0.4919sec\nepoch: 87, loss_train: 0.1709, accuracy_train: 0.9476, loss_valid: 0.2823, accuracy_valid: 0.9072, 0.4874sec\nepoch: 88, loss_train: 0.1702, accuracy_train: 0.9473, loss_valid: 0.2824, accuracy_valid: 0.9049, 0.4909sec\nepoch: 89, loss_train: 0.1683, accuracy_train: 0.9480, loss_valid: 0.2823, accuracy_valid: 0.9027, 0.4870sec\nepoch: 90, loss_train: 0.1676, accuracy_train: 0.9484, loss_valid: 0.2826, accuracy_valid: 0.9042, 0.4772sec\nepoch: 91, loss_train: 0.1657, accuracy_train: 0.9485, loss_valid: 0.2825, accuracy_valid: 0.9049, 0.4935sec\nepoch: 92, loss_train: 0.1652, accuracy_train: 0.9491, loss_valid: 0.2834, accuracy_valid: 0.9049, 0.4998sec\nepoch: 93, loss_train: 0.1638, accuracy_train: 0.9496, loss_valid: 0.2833, accuracy_valid: 0.9064, 0.4804sec\nepoch: 94, loss_train: 0.1640, accuracy_train: 0.9499, loss_valid: 0.2838, accuracy_valid: 0.9064, 0.4907sec\nepoch: 95, loss_train: 
0.1619, accuracy_train: 0.9501, loss_valid: 0.2829, accuracy_valid: 0.9034, 0.4808sec\nepoch: 96, loss_train: 0.1604, accuracy_train: 0.9506, loss_valid: 0.2832, accuracy_valid: 0.9057, 0.4832sec\nepoch: 97, loss_train: 0.1600, accuracy_train: 0.9514, loss_valid: 0.2830, accuracy_valid: 0.9012, 0.4855sec\nepoch: 98, loss_train: 0.1591, accuracy_train: 0.9516, loss_valid: 0.2831, accuracy_valid: 0.8997, 0.4824sec\nepoch: 99, loss_train: 0.1581, accuracy_train: 0.9518, loss_valid: 0.2840, accuracy_valid: 0.9019, 0.4816sec\nepoch: 100, loss_train: 0.1559, accuracy_train: 0.9520, loss_valid: 0.2837, accuracy_valid: 0.9057, 0.4907sec\nepoch: 101, loss_train: 0.1561, accuracy_train: 0.9521, loss_valid: 0.2847, accuracy_valid: 0.9057, 0.4827sec\nepoch: 102, loss_train: 0.1554, accuracy_train: 0.9527, loss_valid: 0.2845, accuracy_valid: 0.9004, 0.4884sec\nepoch: 103, loss_train: 0.1527, accuracy_train: 0.9528, loss_valid: 0.2845, accuracy_valid: 0.9049, 0.4875sec\nepoch: 104, loss_train: 0.1519, accuracy_train: 0.9542, loss_valid: 0.2833, accuracy_valid: 0.9012, 0.4773sec\nepoch: 105, loss_train: 0.1521, accuracy_train: 0.9542, loss_valid: 0.2846, accuracy_valid: 0.9042, 0.4895sec\nepoch: 106, loss_train: 0.1491, accuracy_train: 0.9542, loss_valid: 0.2838, accuracy_valid: 0.9027, 0.4813sec\nepoch: 107, loss_train: 0.1484, accuracy_train: 0.9551, loss_valid: 0.2834, accuracy_valid: 0.9049, 0.4903sec\nepoch: 108, loss_train: 0.1472, accuracy_train: 0.9546, loss_valid: 0.2836, accuracy_valid: 0.9057, 0.4866sec\nepoch: 109, loss_train: 0.1466, accuracy_train: 0.9554, loss_valid: 0.2849, accuracy_valid: 0.9042, 0.4862sec\nepoch: 110, loss_train: 0.1446, accuracy_train: 0.9563, loss_valid: 0.2841, accuracy_valid: 0.9064, 0.4882sec\nepoch: 111, loss_train: 0.1440, accuracy_train: 0.9561, loss_valid: 0.2841, accuracy_valid: 0.9057, 0.4883sec\nepoch: 112, loss_train: 0.1419, accuracy_train: 0.9572, loss_valid: 0.2846, accuracy_valid: 0.9064, 0.4904sec\nepoch: 113, loss_train: 0.1409, accuracy_train: 0.9585, loss_valid: 0.2838, accuracy_valid: 0.9057, 0.4929sec\nepoch: 114, loss_train: 0.1410, accuracy_train: 0.9571, loss_valid: 0.2851, accuracy_valid: 0.9027, 0.4888sec\nepoch: 115, loss_train: 0.1399, accuracy_train: 0.9584, loss_valid: 0.2841, accuracy_valid: 0.9034, 0.4802sec\nepoch: 116, loss_train: 0.1383, accuracy_train: 0.9584, loss_valid: 0.2841, accuracy_valid: 0.9072, 0.4962sec\nepoch: 117, loss_train: 0.1385, accuracy_train: 0.9587, loss_valid: 0.2853, accuracy_valid: 0.9049, 0.5010sec\nepoch: 118, loss_train: 0.1364, accuracy_train: 0.9597, loss_valid: 0.2854, accuracy_valid: 0.9049, 0.4854sec\nepoch: 119, loss_train: 0.1353, accuracy_train: 0.9598, loss_valid: 0.2856, accuracy_valid: 0.9064, 0.4734sec\n" ], [ "# 可視化\nfig, ax = plt.subplots(1, 2, figsize=(15, 5))\nax[0].plot(np.array(log['train']).T[0], label='train')\nax[0].plot(np.array(log['valid']).T[0], label='valid')\nax[0].set_xlabel('epoch')\nax[0].set_ylabel('loss')\nax[0].legend()\nax[1].plot(np.array(log['train']).T[1], label='train')\nax[1].plot(np.array(log['valid']).T[1], label='valid')\nax[1].set_xlabel('epoch')\nax[1].set_ylabel('accuracy')\nax[1].legend()\nplt.show()", "_____no_output_____" ], [ "def calculate_accuracy(model, X, y, device):\n model.eval()\n with torch.no_grad():\n inputs = X.to(device)\n outputs = model(inputs)\n pred = torch.argmax(outputs, dim=-1).cpu()\n\n return (pred == y).sum().item() / len(y)", "_____no_output_____" ], [ "# 正解率の確認\nacc_train = calculate_accuracy(model, X_train, y_train, device)\nacc_test 
= calculate_accuracy(model, X_test, y_test, device)\nprint(f'正解率(学習データ):{acc_train:.3f}')\nprint(f'正解率(評価データ):{acc_test:.3f}')", "正解率(学習データ):0.960\n正解率(評価データ):0.913\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ] ]
d028b24ee4e8fea5c9702ca2aee3a468bcda3385
1,267
ipynb
Jupyter Notebook
.ipynb_checkpoints/images-checkpoint.ipynb
Ke-Chi-Chang/prerequisite_python
de64c6f31c3bcbf33cb814c7a69755c8112ede64
[ "MIT" ]
null
null
null
.ipynb_checkpoints/images-checkpoint.ipynb
Ke-Chi-Chang/prerequisite_python
de64c6f31c3bcbf33cb814c7a69755c8112ede64
[ "MIT" ]
null
null
null
.ipynb_checkpoints/images-checkpoint.ipynb
Ke-Chi-Chang/prerequisite_python
de64c6f31c3bcbf33cb814c7a69755c8112ede64
[ "MIT" ]
null
null
null
17.121622
54
0.494081
[ [ [ "import os\nimport cv2\nimport sys\nimport pathlib\nimport numpy as np", "_____no_output_____" ], [ "img_path = pathlib.Path('./archi_1920x1080.jpg')", "_____no_output_____" ], [ "img = cv2.imread(str(img_path))\nprint(img.shape)\ncv2.imshow()", "(1080, 1920, 3)\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code" ] ]
d028b549147fa4e141085ab0f1632b4bd59d9570
14,726
ipynb
Jupyter Notebook
recordsearch/2-Analyse-a-series.ipynb
GLAM-Workbench/glam-workbench-presentations
036c7380a41a7dc34b5b523af8e83514a2247e6d
[ "MIT" ]
8
2018-04-16T06:48:24.000Z
2018-07-04T23:45:44.000Z
RecordSearch/2-Analyse-a-series.ipynb
GLAM-Workbench/ozglam-workbench
3406d098f74e941a0533d860a98492ffe9bc5476
[ "MIT" ]
7
2020-11-18T21:24:35.000Z
2022-03-11T23:27:57.000Z
recordsearch/2-Analyse-a-series.ipynb
GLAM-Workbench/glam-workbench-presentations
036c7380a41a7dc34b5b523af8e83514a2247e6d
[ "MIT" ]
3
2018-10-18T09:35:14.000Z
2019-11-20T01:50:34.000Z
28.931238
387
0.577142
[ [ [ "# Analyse a series", "_____no_output_____" ], [ "<div class=\"alert alert-block alert-warning\">\n <b>Under construction</b>\n</div>", "_____no_output_____" ] ], [ [ "import os\nimport pandas as pd\nfrom IPython.display import Image as DImage\nfrom IPython.core.display import display, HTML\nimport series_details\n\n# Plotly helps us make pretty charts\nimport plotly.offline as py\nimport plotly.graph_objs as go\n\n# Make sure data directory exists\nos.makedirs('../../data/RecordSearch/images', exist_ok=True)\n\n# This lets Plotly draw charts in cells\npy.init_notebook_mode()", "_____no_output_____" ] ], [ [ "This notebook is for analysing a series that you've already harvested. If you haven't harvested any data yet, then you need to go back to the ['Harvesting a series' notebook](Harvesting series.ipynb).", "_____no_output_____" ] ], [ [ "# What series do you want to analyse?\n# Insert the series id between the quotes.\nseries = 'J2483'", "_____no_output_____" ], [ "# Load the CSV data for the specified series into a dataframe. Parse the dates as dates!\ndf = pd.read_csv('../data/RecordSearch/{}.csv'.format(series.replace('/', '-')), parse_dates=['start_date', 'end_date'])", "_____no_output_____" ] ], [ [ "Remember that you can download harvested data from the workbench [data directory](../data/RecordSearch).", "_____no_output_____" ], [ "## Get some summary data\n\nWe're going to create a simple summary of some of the main characteristics of the series, as reflected in the harvested files.", "_____no_output_____" ] ], [ [ "# We're going to assemble some summary data about the series in a 'summary' dictionary\n# Let's create the dictionary and add the series identifier\nsummary = {'series': series}", "_____no_output_____" ], [ "# The 'shape' property returns the number of rows and columns. 
So 'shape[0]' gives us the number of items harvested.\nsummary['total_items'] = df.shape[0]\nprint(summary['total_items'])", "_____no_output_____" ], [ "# Get the frequency of the different access status categories\nsummary['access_counts'] = df['access_status'].value_counts().to_dict()\nprint(summary['access_counts'])", "_____no_output_____" ], [ "# Get the number of files that have been digitised\nsummary['digitised_files'] = len(df.loc[df['digitised_status'] == True])\nprint(summary['digitised_files'])", "_____no_output_____" ], [ "# Get the number of individual pages that have been digitised\nsummary['digitised_pages'] = df['digitised_pages'].sum()\nprint(summary['digitised_pages'])", "_____no_output_____" ], [ "# Get the earliest start date\nstart = df['start_date'].min()\ntry:\n summary['date_from'] = start.year\nexcept AttributeError:\n summary['date_from'] = None\nprint(summary['date_from'])", "_____no_output_____" ], [ "# Get the latest end date\nend = df['end_date'].max()\ntry:\n summary['date_to'] = end.year\nexcept AttributeError:\n summary['date_to'] = None\nprint(summary['date_to'])", "_____no_output_____" ], [ "# Let's display all the summary data\nprint('SERIES: {}'.format(summary['series']))\nprint('Number of items: {:,}'.format(summary['total_items']))\nprint('Access status:')\nfor status, total in summary['access_counts'].items():\n print(' {}: {:,}'.format(status, total))\nprint('Contents dates: {} to {}'.format(summary['date_from'], summary['date_to']))\nprint('Digitised files: {:,}'.format(summary['digitised_files']))\nprint('Digitised pages: {:,}'.format(summary['digitised_pages']))", "_____no_output_____" ] ], [ [ "Note that a slightly enhanced version of the code above is available in the `series_details` module that you can import into any notebook. So to create a summary of a series you can just:", "_____no_output_____" ] ], [ [ "# Import the module\nimport series_details\n\n# Call display_series() providing the series name and the dataframe\nseries_details.display_summary(series, df)", "_____no_output_____" ] ], [ [ "## Plot the contents dates\n\nPlotting the dates is a bit tricky. Each file can have both a start date and an end date. So if we want to plot the years covered by a file, we need to include all the years between the start and end dates. Also dates can be recorded at different levels of granularity, for specific days to just years. And sometimes there are no end dates recorded at all – what does this mean?\n\nThe code in the cell below does a few things:\n\n* It fills any empty end dates with the start date from the same item. This probably means some content years will be missed, but it's the only date we can be certain of.\n* It loops through all the rows in the dataframe, then for each row it extracts the years between the start and end date. Currently this looks to see if the 1 January is covered by the date range, so if there's an exact start date after 1 January I don't think it will be captured. 
I need to investigate this further.\n* It combines all of the years into one big series and then totals up the frequency of each year.\n\nI'm sure this is not perfect, but it seems to produce useful results.\n", "_____no_output_____" ] ], [ [ "# Fill any blank end dates with start dates\ndf['end_date'] = df[['end_date']].apply(lambda x: x.fillna(value=df['start_date']))\n\n# This is a bit tricky.\n# For each item we want to find the years that it has content from -- ie start_year <= year <= end_year.\n# Then we want to put all the years from all the items together and look at their frequency\nyears = []\nfor row in df.itertuples(index=False):\n    try:\n        years_in_range = pd.date_range(start=row.start_date, end=row.end_date, freq='AS').year.to_series()\n    except ValueError:\n        # No start date\n        pass\n    else:\n        years.append(years_in_range)\nyear_counts = pd.concat(years).value_counts()", "_____no_output_____" ], [ "# Put the resulting series in a dataframe so it looks pretty.\nyear_totals = pd.DataFrame(year_counts)\n\n# Sort results by year\nyear_totals.sort_index(inplace=True)", "_____no_output_____" ], [ "# Display the results\nyear_totals.style.format({0: '{:,}'})", "_____no_output_____" ], [ "# Let's graph the frequency of content years\nplotly_data = [go.Bar(\n            x=year_totals.index.values, # The years are the index\n            y=year_totals[0]\n    )]\n\n# Add some labels\nlayout = go.Layout(\n    title='Content dates',\n    xaxis=dict(\n        title='Year'\n    ),\n    yaxis=dict(\n        title='Number of items'\n    )\n)\n\n# Create a chart \nfig = go.Figure(data=plotly_data, layout=layout)\npy.iplot(fig, filename='series-dates-bar')", "_____no_output_____" ] ], [ [ "Note that a slightly enhanced version of the code above is available in the series_details module that you can import into any notebook. 
So to plot the dates for a series you can just:", "_____no_output_____" ] ], [ [ "# Import the module\nimport series_details\n\n# Call plot_dates() providing the dataframe\nfig = series_details.plot_dates(df)\npy.iplot(fig)", "_____no_output_____" ] ], [ [ "## Filter by words in file titles", "_____no_output_____" ] ], [ [ "# Find titles containing a particular phrase -- in this case 'wife'\n# This creates a new dataframe\n# Try changing this to filter for other words\n\nsearch_term = 'wife'\ndf_filtered = df.loc[df['title'].str.contains(search_term, case=False)].copy()\ndf_filtered", "_____no_output_____" ], [ "# We can plot this filtered dataframe just like the series\nfig = series_details.plot_dates(df_filtered)\npy.iplot(fig)", "_____no_output_____" ], [ "# Save the new dataframe as a csv\ndf_filtered.to_csv('../data/RecordSearch/{}-{}.csv'.format(series.replace('/', '-'), search_term))", "_____no_output_____" ], [ "# Find titles containing one of two words -- ie an OR statement\n# Try changing this to filter for other words\n\ndf_filtered = df.loc[df['title'].str.contains('chinese', case=False) | df['title'].str.contains(r'\\bah\\b', case=False)].copy()\ndf_filtered", "_____no_output_____" ] ], [ [ "## Filter by date range", "_____no_output_____" ] ], [ [ "start_year = '1920'\nend_year = '1930'\ndf_filtered = df[(df['start_date'] >= start_year) & (df['end_date'] <= end_year)]\ndf_filtered", "_____no_output_____" ] ], [ [ "## N-gram frequencies in file titles", "_____no_output_____" ] ], [ [ "# Import TextBlob for text analysis\nfrom textblob import TextBlob\nimport nltk\nstopwords = nltk.corpus.stopwords.words('english')", "_____no_output_____" ], [ "# Combine all of the file titles into a single string\ntitle_text = df['title'].str.lower().str.cat(sep=' ')", "_____no_output_____" ], [ "blob = TextBlob(title_text)\nwords = [[word, count] for word, count in blob.lower().word_counts.items() if word not in stopwords]\nword_counts = pd.DataFrame(words).rename({0: 'word', 1: 'count'}, axis=1).sort_values(by='count', ascending=False)\nword_counts[:25].style.format({'count': '{:,}'}).bar(subset=['count'], color='#d65f5f').set_properties(subset=['count'], **{'width': '300px'})", "_____no_output_____" ], [ "def get_ngram_counts(text, size):\n    blob = TextBlob(text)\n    # Extract n-grams as WordLists, then convert to a list of strings\n    ngrams = [' '.join(ngram).lower() for ngram in blob.lower().ngrams(size)]\n    # Convert to dataframe then count values and rename columns\n    ngram_counts = pd.DataFrame(ngrams)[0].value_counts().rename_axis('ngram').reset_index(name='count')\n    return ngram_counts\n    \ndef display_top_ngrams(text, size):\n    ngram_counts = get_ngram_counts(text, size)\n    # Display top 25 results as a bar chart\n    display(ngram_counts[:25].style.format({'count': '{:,}'}).bar(subset=['count'], color='#d65f5f').set_properties(subset=['count'], **{'width': '300px'}))", "_____no_output_____" ], [ "display_top_ngrams(title_text, 2)", "_____no_output_____" ], [ "display_top_ngrams(title_text, 4)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ] ]
d028c00837702eb9ac32592e9df14867d99ade9a
5,671
ipynb
Jupyter Notebook
python/modules/jupyter/Pyopenssl.ipynb
HHW-zhou/snippets
f1e5c1fd361e1b86e072a1c01ca8707ac0b9d68b
[ "Apache-2.0" ]
null
null
null
python/modules/jupyter/Pyopenssl.ipynb
HHW-zhou/snippets
f1e5c1fd361e1b86e072a1c01ca8707ac0b9d68b
[ "Apache-2.0" ]
null
null
null
python/modules/jupyter/Pyopenssl.ipynb
HHW-zhou/snippets
f1e5c1fd361e1b86e072a1c01ca8707ac0b9d68b
[ "Apache-2.0" ]
null
null
null
31.159341
946
0.54858
[ [ [ "## Pyopenssl\n[官方文档](https://www.pyopenssl.org/)", "_____no_output_____" ], [ "### 使用openssl生成私钥和公钥\n[参考资料](https://blog.csdn.net/huanhuanq1209/article/details/80899017)\n> openssl \n> genrsa -out private.pem 1024 \n> rsa -in public.pem -pubout -out rsa_public_key.pem", "_____no_output_____" ], [ "### 签名实例", "_____no_output_____" ] ], [ [ "import OpenSSL\nfrom OpenSSL._util import lib as _lib\nFILETYPE_PEM = _lib.SSL_FILETYPE_PEM\nimport base64\n\ndef makeSign(message):\n order = sorted(message)\n \n sign_str = \"\"\n \n for key in order:\n sign_str = sign_str + \"&{0}={1}\".format(key,message[key])\n \n sign_str = sign_str[1:]\n \n print(\"待签名的串 == 》 %s\"%sign_str)\n \n with open('private.pem','rb') as f:\n pkey = OpenSSL.crypto.load_privatekey(FILETYPE_PEM, buffer=f.read())\n \n sign = OpenSSL.crypto.sign(pkey, bytes(sign_str,encoding=\"utf-8\"), \"sha256\")\n \n print(\"签名结果 ==》 %s\"%sign)\n \n sign = base64.b64encode(sign)\n \n print(\"签名结果(base64编码) ==》 %s\"%sign)\n \n return sign,sign_str\n\nsign,sign_str = makeSign({\n \"method\":\"any\",\n \"name\":\"Baird\",\n \"sex\":\"male\",\n \"mobile\":\"18300010001\"\n})", "待签名的串 == 》 method=any&mobile=18300010001&name=Baird&sex=male\n签名结果 ==》 b'\\x84\\xb9\\xae\\xc3{\\xfb\"\\xb5\\x9fA\\x02\\x9bZ\\x16g\\xd5\\x90`\\x1e\\xc6\\x87\\xef\\xb1\\xef\\xb3\\x8a\\xb7\\xbc\\xc3\\x0e\\xab45T\\xfaK\\x02\\xc25\\x82\\xbag\\xb9\\x94\\t\\x8c\\xc8\\x0f\\xe9\\x81\\xd7U\\x80\\xd6\\xf9\\x871q>V\\xdfn\\x0b\\x8e\\xac\\x8a\\xab#B\\xab\\xf3\\xc6\\xfaM\\xc4\\x95$\\xa7\\xef*J\\xd1~\\x803\\x14G\\x80\\x8d\\x16\\xbd4\\xa5w\\xf5\\x03E\\xb1\\xffb\\x99\\x97#U3\\x17\\xd0\\x98n\\x89\\xe9\\xe5\\x7f\\x9f\\x97\\xde\\x04\\xc6\\xa3p\\xc3\\x0f{\\x01XZ\\xc6\\xcd\\x84\\x8be\\xb3\\xdd\\x1cI\\x87$\\r\\xfb\\xe4\\x85\\x18\\xc6\\xbc\\xfb\\xed\\xc3tl\\xfe\\xab{\\x87\\xd4|p0\\x95\\xd2!\\x94\\x80\\x00\\x8e0\\xfdy\\xb3\\x1e({+\\xb6\\xd9D3\\xd3W\\xc0\\xbe3\\x05\\xc6Y\\x13\\x84\",\\xef0\\xdf\\xdb\\x15\\x8b\\xb1g\\xe8\\xc9\\xa1\\xbfQ\\xd9\\x12#\\x92 S\\xbe\\xcbK\\xc2\\x17\\xc5\\xb9\\x08\\xbbp\\xec\\xedk\\xf6\\x82\\xd4 \\xa8\\x91\\ry\\xc6A\\xedK\\\\\\x03\\xafx\\xaf7\\xf8z%\\xbcV1\\xedu\\xea$\\xfeq\\x84fDtUz'\n签名结果(base64编码) ==》 b'hLmuw3v7IrWfQQKbWhZn1ZBgHsaH77Hvs4q3vMMOqzQ1VPpLAsI1grpnuZQJjMgP6YHXVYDW+YcxcT5W324LjqyKqyNCq/PG+k3ElSSn7ypK0X6AMxRHgI0WvTSld/UDRbH/YpmXI1UzF9CYbonp5X+fl94ExqNwww97AVhaxs2Ei2Wz3RxJhyQN++SFGMa8++3DdGz+q3uH1HxwMJXSIZSAAI4w/XmzHih7K7bZRDPTV8C+MwXGWROEIizvMN/bFYuxZ+jJob9R2RIjkiBTvstLwhfFuQi7cOzta/aC1CCokQ15xkHtS1wDr3ivN/h6JbxWMe116iT+cYRmRHRVeg=='\n" ] ], [ [ "### 验签实例", "_____no_output_____" ] ], [ [ "def makeVerify(sign, sign_str):\n sign = base64.b64decode(sign)\n with open(\"public.pem\",\"rb\") as f:\n pubkey = OpenSSL.crypto.load_publickey(FILETYPE_PEM, buffer=f.read())\n \n x509 = OpenSSL.crypto.X509()\n x509.set_pubkey(pubkey)\n \n #验证通过返回None,否则抛出错误\n try:\n OpenSSL.crypto.verify(x509, sign, bytes(sign_str, encoding=\"utf-8\"), 'sha256')\n except Exception as e:\n return e\n return True\n\nresult = makeVerify(sign,sign_str) \nresult2 = makeVerify(sign,\"hello world\")\nresult,result2 ", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d028cfa6b15cf11867cfd85841fa6a640b545b34
18,185
ipynb
Jupyter Notebook
starter_code/.ipynb_checkpoints/VacationPy-checkpoint.ipynb
jackaloppy/python-api-challenge
541e25e01dbf76ee42f14f0f4a529d9d51be30bc
[ "ADSL" ]
null
null
null
starter_code/.ipynb_checkpoints/VacationPy-checkpoint.ipynb
jackaloppy/python-api-challenge
541e25e01dbf76ee42f14f0f4a529d9d51be30bc
[ "ADSL" ]
null
null
null
starter_code/.ipynb_checkpoints/VacationPy-checkpoint.ipynb
jackaloppy/python-api-challenge
541e25e01dbf76ee42f14f0f4a529d9d51be30bc
[ "ADSL" ]
null
null
null
30.460637
160
0.396426
[ [ [ "# VacationPy\n----\n\n#### Note\n* Keep an eye on your API usage. Use https://developers.google.com/maps/reporting/gmp-reporting as reference for how to monitor your usage and billing.\n\n* Instructions have been included for each segment. You do not have to follow them exactly, but they are included to help you think through the steps.", "_____no_output_____" ] ], [ [ "# Dependencies and Setup\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport numpy as np\nimport requests\nimport gmaps\nimport os\n\n# Import API key\nfrom api_keys import g_key", "_____no_output_____" ] ], [ [ "### Store Part I results into DataFrame\n* Load the csv exported in Part I to a DataFrame", "_____no_output_____" ] ], [ [ "cities_df = pd.read_csv('../output_data/cities.csv')\ncities_df.dropna(inplace = True) \ncities_df.head()", "_____no_output_____" ] ], [ [ "### Humidity Heatmap\n* Configure gmaps.\n* Use the Lat and Lng as locations and Humidity as the weight.\n* Add Heatmap layer to map.", "_____no_output_____" ] ], [ [ "gmaps.configure(api_key=g_key)", "_____no_output_____" ], [ "locations = cities_df[[\"Lat\", \"Lng\"]]\nhumidity = cities_df[\"Humidity\"]\nfig = gmaps.figure()\nheat_layer = gmaps.heatmap_layer(locations, weights=humidity, \n dissipating=False, max_intensity=150,\n point_radius=3)\nfig.add_layer(heat_layer)\nfig", "_____no_output_____" ] ], [ [ "### Create new DataFrame fitting weather criteria\n* Narrow down the cities to fit weather conditions.\n* Drop any rows will null values.", "_____no_output_____" ] ], [ [ "ideal_df = cities_df[cities_df[\"Max Temp\"].lt(80) &\n cities_df[\"Max Temp\"].gt(70) &\n cities_df[\"Wind Speed\"].lt(10) & \n cities_df[\"Cloudiness\"].eq(0) &\n cities_df[\"Humidity\"].lt(80) &\n cities_df[\"Humidity\"].gt(30)]\nideal_df", "_____no_output_____" ] ], [ [ "### Hotel Map\n* Store into variable named `hotel_df`.\n* Add a \"Hotel Name\" column to the DataFrame.\n* Set parameters to search for hotels with 5000 meters.\n* Hit the Google Places API for each city's coordinates.\n* Store the first Hotel result into the DataFrame.\n* Plot markers on top of the heatmap.", "_____no_output_____" ] ], [ [ "hotel_df = ideal_df[[\"City\", \"Lat\", \"Lng\", \"Country\"]].reset_index(drop=True)\nhotel_df[\"Hotel Name\"] = \"\"", "_____no_output_____" ], [ "params = {\n \"radius\": 5000,\n \"types\": \"lodging\",\n \"keyword\": \"hotel\",\n \"key\": g_key\n}\nfor index, row in hotel_df.iterrows():\n lat = row[\"Lat\"]\n lng = row[\"Lng\"]\n params[\"location\"] = f\"{lat},{lng}\"\n base_url = \"https://maps.googleapis.com/maps/api/place/nearbysearch/json\"\n name_address = requests.get(base_url, params=params).json()\n try:\n hotel_df.loc[index, \"Hotel Name\"] = name_address[\"results\"][0][\"name\"]\n except (KeyError, IndexError):\n hotel_df.loc[index, \"Hotel Name\"] = \"NA\"\n print(\"Couldn't find a hotel here at \" + row[\"City\"] + \", \" + row[\"Country\"])", "Couldn't find a hotel here at Nioro, GM\nCouldn't find a hotel here at Barentu, ER\nCouldn't find a hotel here at Beloha, MG\n" ], [ "# NOTE: Do not change any of the code in this cell\n\n# Using the template add the hotel marks to the heatmap\ninfo_box_template = \"\"\"\n<dl>\n<dt>Name</dt><dd>{Hotel Name}</dd>\n<dt>City</dt><dd>{City}</dd>\n<dt>Country</dt><dd>{Country}</dd>\n</dl>\n\"\"\"\n# Store the DataFrame Row\n# NOTE: be sure to update with your DataFrame name\nhotel_info = [info_box_template.format(**row) for index, row in hotel_df.iterrows()]\nhover_info = [f\"{row['City']}, 
{row['Country']}\" for index,row in hotel_df.iterrows()]\nlocations = hotel_df[[\"Lat\", \"Lng\"]]\n\nmarkers = gmaps.marker_layer(\n locations, \n hover_text=hover_info,\n info_box_content=hotel_info)", "_____no_output_____" ], [ "# Add marker layer ontop of heat map\nfig.add_layer(markers)\nfig.add_layer(heat_layer)\n# Display figure\nfig", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ] ]
d028d4782d1449d8e0563ac0576089c298ca3281
73,242
ipynb
Jupyter Notebook
S01 - Bootcamp and Binary Classification/SLU07 - Regression with Linear Regression/Example notebook.ipynb
claury/sidecar-academy-batch2
874e5a31f739be9e9f328eb2a7b043976453a0f9
[ "MIT" ]
2
2022-02-04T17:40:04.000Z
2022-03-26T18:03:12.000Z
S01 - Bootcamp and Binary Classification/SLU07 - Regression with Linear Regression/Example notebook.ipynb
claury/sidecar-academy-batch2
874e5a31f739be9e9f328eb2a7b043976453a0f9
[ "MIT" ]
null
null
null
S01 - Bootcamp and Binary Classification/SLU07 - Regression with Linear Regression/Example notebook.ipynb
claury/sidecar-academy-batch2
874e5a31f739be9e9f328eb2a7b043976453a0f9
[ "MIT" ]
2
2021-10-30T16:20:13.000Z
2021-11-25T12:09:31.000Z
30.240297
9,968
0.530188
[ [ [ "# SLU07 - Regression with Linear Regression: Example notebook", "_____no_output_____" ], [ "# 1 - Writing linear models\n\nIn this section you have a few examples on how to implement simple and multiple linear models.\n\nLet's start by implementing the following:\n\n$$y = 1.25 + 5x$$", "_____no_output_____" ] ], [ [ "def first_linear_model(x):\n \"\"\"\n Implements y = 1.25 + 5*x\n \n Args: \n x : float - input of model\n\n Returns:\n y : float - output of linear model\n \"\"\"\n\n y = 1.25 + 5 * x\n \n return y\n\nfirst_linear_model(1)", "_____no_output_____" ] ], [ [ "You should be thinking that this is too easy. So let's generalize it a bit. We'll write the code for the next equation:\n\n$$ y = a + bx $$", "_____no_output_____" ] ], [ [ "def second_linear_model(x, a, b):\n \"\"\"\n Implements y = a + b * x\n\n Args: \n x : float - input of model\n a : float - intercept of model\n b : float - coefficient of model\n\n Returns:\n y : float - output of linear model\n \"\"\"\n\n y = a + b * x\n return y\n\nsecond_linear_model(1, 1.25, 5)", "_____no_output_____" ] ], [ [ "Still very simple, right? Now what if we want to have a linear model with multiple variables, such as this one:\n\n$$ y = a + bx_1 + cx_2 + dx_3 $$\n\nYou can follow the same logic and just write the following:", "_____no_output_____" ] ], [ [ "def first_multiple_linear_model(x_1, x_2, x_3, a, b, c, d):\n \"\"\"\n Implements y = a + b * x_1 + c * x_2 + d * x_3 \n \n Args: \n x_1 : float - first input of model\n x_2 : float - second input of model\n x_3 : float - third input of model\n a : float - intercept of model\n b : float - first coefficient of model\n c : float - second coefficient of model\n d : float - third coefficient of model\n\n Returns:\n y : float - output of linear model\n \"\"\"\n\n y = a + b * x_1 + c * x_2 + d * x_3\n return y\n\nfirst_multiple_linear_model(1.0, 1.0, 1.0, .5, .2, .1, .4)", "_____no_output_____" ] ], [ [ "However, you should already be seeing the problem. The bigger our model gets, the more variables we need to consider, so this is clearly not efficient. Now let's write the generic form for a linear model:\n\n$$ y = w_0 + \\sum_{i=1}^{N} w_i x_i$$\n\nAnd we will implement the inputs and outputs of the model as vectors:", "_____no_output_____" ] ], [ [ "def second_multiple_linear_model(x, w):\n \"\"\"\n Implements y = w_0 + sum(x_i*w_i) (where i=1...N)\n\n Args: \n x : vector of input features with size N-1\n w : vector of model weights with size N\n \n Returns:\n y : float - output of linear model\n \"\"\"\n \n w_0 = w[0]\n y = w_0\n \n for i in range(1, len(x)+1):\n y += x[i-1]*w[i]\n \n return y\n\nsecond_multiple_linear_model([1.0, 1.0, 1.0], [.5, .2, .1, .4])", "_____no_output_____" ] ], [ [ "You could go even one step further and use numpy to vectorize these computations. You can represent both vectors as numpy arrays and just do the same calculation:", "_____no_output_____" ] ], [ [ "import numpy as np\n\ndef vectorized_multiple_linear_model(x, w):\n \"\"\"\n Implements y = w_0 + sum(x_i*w_i) (where i=1...N)\n\n Args: \n x : numpy array with shape (N-1, ) of inputs\n w : numpy array with shape (N, ) of model weights \n \n Returns:\n y : float - output of linear model\n \"\"\"\n\n y = w[0] + x*w[1:]\n \nvectorized_multiple_linear_model(np.array([1.0, 1.0, 1.0]), np.array([.5, .2, .1, .4]))", "_____no_output_____" ] ], [ [ "Read more about numpy array and its manipulation at the end of this example notebook. 
This will be necessary as you will be requested to implement these types of models in a way that they can compute several samples with many features at once.", "_____no_output_____" ], [ "<br>\n<br>", "_____no_output_____" ], [ "# 2 - Using sklearn's LinearRegression\n\nThe following cells show you how to use the LinearRegression solver of the scikitlearn library. We'll start by creating some fake data to use in these examples: ", "_____no_output_____" ] ], [ [ "import numpy as np\nimport matplotlib.pyplot as plt\n\nnp.random.seed(42)\n\nX = np.arange(-10, 10) + np.random.rand(20)\ny = 1.12 + .75 * X + 2. * np.random.rand(20)\n\nplt.xlim((-10, 10))\nplt.ylim((-20, 20))\nplt.plot(X, y, 'b.')", "_____no_output_____" ] ], [ [ "## 2.1 Training the model\n\nWe will now use the base data created and show you how to fit the scikitlearn LinearRegression model with the data:", "_____no_output_____" ] ], [ [ "from sklearn.linear_model import LinearRegression\n\n# Since our numpy array has only 1 dimension, we need reshape \n# it to become a column vector - which corresponds to 1 feature\n# and N samples\nX = X.reshape(-1, 1)\n\nlr = LinearRegression()\nlr.fit(X, y)", "_____no_output_____" ] ], [ [ "## 2.2 Coefficients and Intercept\n\nYou can get both the coefficients and the intercept from this model:", "_____no_output_____" ] ], [ [ "print('Coefficients: {}'.format(lr.coef_))\nprint('Intercept: {}'.format(lr.intercept_))", "Coefficients: [0.76238153]\nIntercept: 2.030181639054948\n" ] ], [ [ "## 2.3 Making predictions\n\nWe can then make prediction with our model and see how they compare with the actual samples:", "_____no_output_____" ] ], [ [ "y_pred = lr.predict(X)\n\nplt.xlim((-10, 10))\nplt.ylim((-20, 20))\nplt.plot(X, y, 'b.')\nplt.plot(X, y_pred, 'r-')", "_____no_output_____" ] ], [ [ "## 2.4 Evaluating the model\n\nWe can also extract the $R^2$ score of this model:", "_____no_output_____" ] ], [ [ "print('R² score: %f' % lr.score(X, y))", "R² score: 0.983519\n" ] ], [ [ "<br>\n<br>\n", "_____no_output_____" ], [ "# Bonus examples: Numpy utilities", "_____no_output_____" ], [ "With linear models, we normally have data that can be represented by either vectors or matrices. Even though you don't need advanced algebra knowledge to implement and understand the models presented, it is useful to understand its basics, since most of the computational part is typically implemented from these concepts.\n\nNumpy is a powerful library that allows us to represent our data easily in this format, and already implements a lot of functions to then manipulate or do calculations over our data. In this section we present the basic functions that you should know and will use the most to implement the basic models:", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd", "_____no_output_____" ] ], [ [ "## a) Pandas to numpy and back\n\nPandas stores our data in dataframes and series, which are very useful for visualization and even for some specific data operations we want to perform. However, for many algorithms that involve combination of numeric data, the standard form of implementing is by using numpy. Start by seeing how to convert from pandas to numpy and back:", "_____no_output_____" ] ], [ [ "df = pd.read_csv('data/polynomial.csv')\ndf.head()", "_____no_output_____" ] ], [ [ "### a.1) Pandas to numpy\n\nLet's transform our first column into a numpy vector. 
There are two ways of doing this, either by using the `.values` attribute:", "_____no_output_____" ] ], [ [ "np_array = df['x'].values\nprint(np_array[:10])", "[-0.97468167 1.04349486 1.67141609 -0.05145155 1.98901715 1.69483221\n 2.3605217 3.69166478 1.80589394 1.55395614]\n" ] ], [ [ "Or by calling the method `.to_numpy()` :", "_____no_output_____" ] ], [ [ "np_array = df['x'].to_numpy()\nprint(np_array[:10])", "[-0.97468167 1.04349486 1.67141609 -0.05145155 1.98901715 1.69483221\n 2.3605217 3.69166478 1.80589394 1.55395614]\n" ] ], [ [ "You can also apply this to the full table:", "_____no_output_____" ] ], [ [ "np_array = df.values\nprint(np_array[:5, :])", "[[-9.74681670e-01 9.50004358e-01 -9.25951835e-01 -1.13819408e+00]\n [ 1.04349486e+00 1.08888152e+00 1.13624227e+00 1.11665074e+00]\n [ 1.67141609e+00 2.79363175e+00 4.66932106e+00 1.59111751e+00]\n [-5.14515491e-02 2.64726191e-03 -1.36205726e-04 1.00102006e+00]\n [ 1.98901715e+00 3.95618924e+00 7.86892827e+00 -9.73300421e-01]]\n" ], [ "np_array = df.to_numpy()\nprint(np_array[:5, :])", "[[-9.74681670e-01 9.50004358e-01 -9.25951835e-01 -1.13819408e+00]\n [ 1.04349486e+00 1.08888152e+00 1.13624227e+00 1.11665074e+00]\n [ 1.67141609e+00 2.79363175e+00 4.66932106e+00 1.59111751e+00]\n [-5.14515491e-02 2.64726191e-03 -1.36205726e-04 1.00102006e+00]\n [ 1.98901715e+00 3.95618924e+00 7.86892827e+00 -9.73300421e-01]]\n" ] ], [ [ "### a.2) Numpy to pandas\n\nLet's start by defining an array and converting it to a pandas series:", "_____no_output_____" ] ], [ [ "np_array = np.array([4., .1, 1., .23, 3.])\npd_series = pd.Series(np_array)\nprint(pd_series)", "0 4.00\n1 0.10\n2 1.00\n3 0.23\n4 3.00\ndtype: float64\n" ] ], [ [ "We can also create several series and concatenate them to create a dataframe:", "_____no_output_____" ] ], [ [ "np_array = np.array([4., .1, 1., .23, 3.])\npd_series_1 = pd.Series(np_array, name='A')\npd_series_2 = pd.Series(2 * np_array, name='B')\npd_dataframe = pd.concat((pd_series_1, pd_series_2), axis=1)\npd_dataframe.head()", "_____no_output_____" ] ], [ [ "We can also directly convert to a dataframe:", "_____no_output_____" ] ], [ [ "np_array = np.array([[1, 2, 3], [4, 5, 6]])\npd_dataframe = pd.DataFrame(np_array)\npd_dataframe.head()", "_____no_output_____" ] ], [ [ "However, we might want more detailed names and specific indices. Some ways of achieving this follows:", "_____no_output_____" ] ], [ [ "data = np.array([['','Col1','Col2'],\n ['Row1',1,2],\n ['Row2',3,4]])\n \npd_dataframe = pd.DataFrame(data=data[1:,1:], index=data[1:,0], columns=data[0,1:])\npd_dataframe.head()", "_____no_output_____" ], [ "pd_dataframe = pd.DataFrame(np.array([[4,5,6,7], [1,2,3,4]]), index=range(0, 2), columns=['A', 'B', 'C', 'D'])\npd_dataframe.head()", "_____no_output_____" ], [ "my_dict = {'A': np.array(['1', '3']), 'B': np.array(['1', '2']), 'C': np.array(['2', '4'])}\npd_dataframe = pd.DataFrame(my_dict)\npd_dataframe.head()", "_____no_output_____" ] ], [ [ "## b) Vector and Matrix initialization and shaping\n\nWhen working with vectors and matrices, we need to be aware of the dimensions of these objects, and how they affect the possible operations perform over them. 
Numpy allows you to access these dimensions through the shape of the object:", "_____no_output_____" ] ], [ [ "v1 = np.array([ .1, 1., 2.])\n\nprint('1-d Array: {}'.format(v1))\nprint('Shape: {}'.format(v1.shape))\n\nv2 = np.array([[ .1, 1., 2.]])\n\nprint('\\n')\nprint('2-d Row Array: {}'.format(v2))\nprint('Shape: {}'.format(v2.shape))\n\nv3 = np.array([[ .1], [1.], [2.]])\n\nprint('\\n')\nprint('2-d Column Array:\\n {}'.format(v3))\nprint('Shape: {}'.format(v3.shape))\n\nm1 = np.array([[ .1, 3., 4., 1.], [1., .3, .1, .5], [2.,.7, 3.8, .1]])\n\nprint('\\n')\nprint('2-d matrix:\\n {}'.format(m1))\nprint('Shape: {}'.format(m1.shape))\n", "1-d Array: [0.1 1. 2. ]\nShape: (3,)\n\n\n2-d Row Array: [[0.1 1. 2. ]]\nShape: (1, 3)\n\n\n2-d Column Array:\n [[0.1]\n [1. ]\n [2. ]]\nShape: (3, 1)\n\n\n2-d matrix:\n [[0.1 3. 4. 1. ]\n [1. 0.3 0.1 0.5]\n [2. 0.7 3.8 0.1]]\nShape: (3, 4)\n" ] ], [ [ "Another important functionality provided is the possibility of reshaping these objects. For example, we can turn a 1-d array into a row vector:", "_____no_output_____" ] ], [ [ "v1 = np.array([ .1, 1., 2.])\nv1_reshaped = v1.reshape((1, -1))\n \nprint('Old 1-d Array reshaped to row: {}'.format(v1_reshaped))\nprint('Shape: {}'.format(v1_reshaped.shape))\n", "Old 1-d Array reshaped to row: [[0.1 1. 2. ]]\nShape: (1, 3)\n" ] ], [ [ "Or we can reshape it into a column vector:", "_____no_output_____" ] ], [ [ "v1 = np.array([ .1, 1., 2.])\nv1_reshaped = v2.reshape((-1, 1))\n \nprint('Old 1-d Array reshaped to column: \\n{}'.format(v1_reshaped))\nprint('Shape: {}'.format(v1_reshaped.shape))\n", "Old 1-d Array reshaped to column: \n[[0.1]\n [1. ]\n [2. ]]\nShape: (3, 1)\n" ] ], [ [ "We can also create specific vectors of 1s, 0s or random numbers with specific shapes from the start. See how to use each in the cells that follow:", "_____no_output_____" ] ], [ [ "custom_shape = (3, )\nv1_ones = np.ones(custom_shape)\n \nprint('1-D Vector of ones: \\n{}'.format(v1_ones))\nprint('Shape: {}'.format(v1_ones.shape))", "1-D Vector of ones: \n[1. 1. 1.]\nShape: (3,)\n" ], [ "custom_shape = (5, 1)\nv1_zeros = np.zeros(custom_shape)\n \nprint('2-D vector of zeros: \\n{}'.format(v1_zeros))\nprint('Shape: {}'.format(v1_zeros.shape))", "2-D vector of zeros: \n[[0.]\n [0.]\n [0.]\n [0.]\n [0.]]\nShape: (5, 1)\n" ], [ "custom_shape = (5, 3)\nv1_rand = np.random.rand(custom_shape[0], custom_shape[1])\n \nprint('2-D Matrix of random numbers: \\n{}'.format(v1_rand))\nprint('Shape: {}'.format(v1_rand.shape))", "2-D Matrix of random numbers: \n[[0.12203823 0.49517691 0.03438852]\n [0.9093204 0.25877998 0.66252228]\n [0.31171108 0.52006802 0.54671028]\n [0.18485446 0.96958463 0.77513282]\n [0.93949894 0.89482735 0.59789998]]\nShape: (5, 3)\n" ] ], [ [ "## c) Vector and Matrix Concatenation \n\nIn this section, you will learn how to concatenate 2 vectors, a matrix and a vector, or 2 matrices. \n\n### c.1) Vector - Vector\n\nLet's start by defining 2 vectors:", "_____no_output_____" ] ], [ [ "v1 = np.array([ .1, 1., 2.])\nv2 = np.array([5.1, .3, .41, 3. ])\n\nprint('1st array: {}'.format(v1))\nprint('Shape: {}'.format(v1.shape))\n\nprint('2nd array: {}'.format(v2))\nprint('Shape: {}'.format(v2.shape))", "1st array: [0.1 1. 2. ]\nShape: (3,)\n2nd array: [5.1 0.3 0.41 3. 
]\nShape: (4,)\n" ] ], [ [ "Since vectors only have one dimension with a given size (notice the shape with only one element) we can only concatenate in this dimension, leading to a longer vector:", "_____no_output_____" ] ], [ [ "vconcat = np.concatenate((v1, v2))\nprint('Concatenated vector: {}'.format(vconcat))\nprint('Shape: {}'.format(vconcat.shape))", "Concatenated vector: [0.1 1. 2. 5.1 0.3 0.41 3. ]\nShape: (7,)\n" ] ], [ [ "Concatenating vectors is very easy, and since we can only concatenate them in their one dimension, the sizes do not have to match. Now let's move on to a more complex case.\n\n### c.2) Matrix - row vector\n\nWhen concatenating matrices and vectors we have to take into account their dimensions.", "_____no_output_____" ] ], [ [ "v1 = np.array([ .1, 1., 2., 3.])\nm1 = np.array([[5.1, .3, .41, 3. ], [5.1, .3, .41, 3. ]])\n\nprint('Array: {}'.format(v1))\nprint('Shape: {}'.format(v1.shape))\n\nprint('Matrix: \\n{}'.format(m1))\nprint('Shape: {}'.format(m1.shape))", "Array: [0.1 1. 2. 3. ]\nShape: (4,)\nMatrix: \n[[5.1 0.3 0.41 3. ]\n [5.1 0.3 0.41 3. ]]\nShape: (2, 4)\n" ] ], [ [ "The first thing you need to know is that whatever numpy objects you are trying to concatenate need to have the same dimensions. Run the code below to verify that you can not concatenate directly the vector and matrix:", "_____no_output_____" ] ], [ [ "try:\n vconcat = np.concatenate((v1, m1))\nexcept Exception as e:\n print('Concatenation raised the following error: {}'.format(e))\n ", "Concatenation raised the following error: all the input arrays must have same number of dimensions, but the array at index 0 has 1 dimension(s) and the array at index 1 has 2 dimension(s)\n" ] ], [ [ "So how can we do matrix-vector concatenation? \n\nIt is actually quite simple. We'll use the reshape functionality you seen before to add a dimension to the vector.", "_____no_output_____" ] ], [ [ "v1_reshaped = v1.reshape((1, v1.shape[0]))\nm1 = np.array([[5.1, .3, .41, 3. ], [5.1, .3, .41, 3. ]])\n\nprint('Array: {}'.format(v1_reshaped))\nprint('Shape: {}'.format(v1_reshaped.shape))\n\nprint('Matrix: \\n{}'.format(m1))\nprint('Shape: {}'.format(m1.shape))", "Array: [[0.1 1. 2. 3. ]]\nShape: (1, 4)\nMatrix: \n[[5.1 0.3 0.41 3. ]\n [5.1 0.3 0.41 3. ]]\nShape: (2, 4)\n" ] ], [ [ "We've reshaped our vector into a 1-row matrix. Now we can try to perform the same concatenation:", "_____no_output_____" ] ], [ [ "vconcat = np.concatenate((v1_reshaped, m1))\nprint('Concatenated vector: {}'.format(vconcat))\nprint('Shape: {}'.format(vconcat.shape))", "Concatenated vector: [[0.1 1. 2. 3. ]\n [5.1 0.3 0.41 3. ]\n [5.1 0.3 0.41 3. ]]\nShape: (3, 4)\n" ] ], [ [ "### c.3) Matrix - column vector", "_____no_output_____" ], [ "We can also do this procedure with a column vector:", "_____no_output_____" ] ], [ [ "v1 = np.array([ .1, 1.])\nv1_reshaped = v1.reshape((v1.shape[0], 1))\nm1 = np.array([[5.1, .3, .41, 3. ], [5.1, .3, .41, 3. ]])\n\nprint('Array: \\n{}'.format(v1_reshaped))\nprint('Shape: {}'.format(v1_reshaped.shape))\n\nprint('Matrix: \\n{}'.format(m1))\nprint('Shape: {}'.format(m1.shape))\n\nvconcat = np.concatenate((v1_reshaped, m1), axis=1)\nprint('Concatenated vector: {}'.format(vconcat))\nprint('Shape: {}'.format(vconcat.shape))", "Array: \n[[0.1]\n [1. ]]\nShape: (2, 1)\nMatrix: \n[[5.1 0.3 0.41 3. ]\n [5.1 0.3 0.41 3. ]]\nShape: (2, 4)\nConcatenated vector: [[0.1 5.1 0.3 0.41 3. ]\n [1. 5.1 0.3 0.41 3. ]]\nShape: (2, 5)\n" ] ], [ [ "There's yet another restriction when concatenating vectors and matrices. 
The dimension where we want to concatenate has to share the same size.\n\nSee what would happen if we tried to concatenate a smaller vector with the same matrix:", "_____no_output_____" ] ], [ [ "v2 = np.array([ .1, 1.])\nv2_reshaped = v2.reshape((1, v2.shape[0])) # Row vector as matrix\n\ntry:\n vconcat = np.concatenate((v2, m1))\nexcept Exception as e:\n print('Concatenation raised the following error: {}'.format(e))", "Concatenation raised the following error: all the input arrays must have same number of dimensions, but the array at index 0 has 1 dimension(s) and the array at index 1 has 2 dimension(s)\n" ] ], [ [ "### c.4) Matrix - Matrix\n\nThis is just an extension of the previous case, since what we did before was transforming the vector into a matrix where the size of one of the dimensions is 1. So all the same restrictions apply, the arrays must have compatible dimensions. Run the following examples to see this:\n", "_____no_output_____" ] ], [ [ "m1 = np.array([[5.1, .3, .41, 3. ], [5.1, .3, .41, 3. ]])\nm2 = np.array([[1., 2., 0., 3. ], [.1, .13, 1., 3. ], [.1, 2., .5, .3 ]])\nm3 = np.array([[1., 0. ], [0., 1. ]])\n\nprint('Matrix 1: \\n{}'.format(m1))\nprint('Shape: {}'.format(m1.shape))\n\nprint('Matrix 2: \\n{}'.format(m2))\nprint('Shape: {}'.format(m2.shape))\n\nprint('Matrix 3: \\n{}'.format(m3))\nprint('Shape: {}'.format(m3.shape))", "Matrix 1: \n[[5.1 0.3 0.41 3. ]\n [5.1 0.3 0.41 3. ]]\nShape: (2, 4)\nMatrix 2: \n[[1. 2. 0. 3. ]\n [0.1 0.13 1. 3. ]\n [0.1 2. 0.5 0.3 ]]\nShape: (3, 4)\nMatrix 3: \n[[1. 0.]\n [0. 1.]]\nShape: (2, 2)\n" ] ], [ [ "Concatenate m1 and m2 at row level (stack the two matrices):", "_____no_output_____" ] ], [ [ "mconcat = np.concatenate((m1, m2))\nprint('Concatenated matrix:\\n {}'.format(mconcat))\nprint('Shape: {}'.format(mconcat.shape))", "Concatenated matrix:\n [[5.1 0.3 0.41 3. ]\n [5.1 0.3 0.41 3. ]\n [1. 2. 0. 3. ]\n [0.1 0.13 1. 3. ]\n [0.1 2. 0.5 0.3 ]]\nShape: (5, 4)\n" ] ], [ [ "Concatenate m1 and m2 at column level (joining the two matrices side by side) should produce an error:", "_____no_output_____" ] ], [ [ "try:\n vconcat = np.concatenate((m1, m2), axis=1)\nexcept Exception as e:\n print('Concatenation raised the following error: {}'.format(e))\n ", "Concatenation raised the following error: all the input array dimensions for the concatenation axis must match exactly, but along dimension 0, the array at index 0 has size 2 and the array at index 1 has size 3\n" ] ], [ [ "Concatenate m1 and m3 at column level (joining the two matrices side by side):", "_____no_output_____" ] ], [ [ "mconcat = np.concatenate((m1, m3), axis=1)\nprint('Concatenated matrix:\\n {}'.format(mconcat))\nprint('Shape: {}'.format(mconcat.shape))", "Concatenated matrix:\n [[5.1 0.3 0.41 3. 1. 0. ]\n [5.1 0.3 0.41 3. 0. 1. ]]\nShape: (2, 6)\n" ] ], [ [ "Concatenate m1 and m3 at row level (stack the two matrices) should produce an error:", "_____no_output_____" ] ], [ [ "try:\n vconcat = np.concatenate((m1, m3))\nexcept Exception as e:\n print('Concatenation raised the following error: {}'.format(e))\n ", "Concatenation raised the following error: all the input array dimensions for the concatenation axis must match exactly, but along dimension 1, the array at index 0 has size 4 and the array at index 1 has size 2\n" ] ], [ [ "## d) Single matrix operations \n\nIn this section we describe a few operations that can be done over matrices:", "_____no_output_____" ], [ "### d.1) Transpose\n\nA very common operation is the transpose. 
If you are used to see matrix notation, you should know what this operation is. Take a matrix with 2 dimensions:\n\n$$ X = \\begin{bmatrix} a & b \\\\ c & d \\\\ \\end{bmatrix} $$\n\nTransposing the matrix is inverting its data with respect to its diagonal:\n\n$$ X^T = \\begin{bmatrix} a & c \\\\ b & d \\\\ \\end{bmatrix} $$\n\nThis means that the rows of X will become its columns and vice-versa. You can attain the transpose of a matrix by using either `.T` on a matrix or calling `numpy.transpose`:", "_____no_output_____" ] ], [ [ "m1 = np.array([[ .1, 1., 2.], [ 3., .24, 4.], [ 6., 2., 5.]]) \nprint('Initial matrix: \\n{}'.format(m1))", "Initial matrix: \n[[0.1 1. 2. ]\n [3. 0.24 4. ]\n [6. 2. 5. ]]\n" ], [ "m1_transposed = m1.transpose()\nprint('Transposed matrix with `transpose` \\n{}'.format(m1_transposed))", "Transposed matrix with `transpose` \n[[0.1 3. 6. ]\n [1. 0.24 2. ]\n [2. 4. 5. ]]\n" ], [ "m1_transposed = m1.T\nprint('Transposed matrix with `T` \\n{}'.format(m1_transposed))", "Transposed matrix with `T` \n[[0.1 3. 6. ]\n [1. 0.24 2. ]\n [2. 4. 5. ]]\n" ] ], [ [ "A few examples of non-squared matrices. In these, you'll see that the shape (a, b) gets inverted to (b, a):", "_____no_output_____" ] ], [ [ "m1 = np.array([[ .1, 1., 2., 5.], [ 3., .24, 4., .6]]) \nprint('Initial matrix: \\n{}'.format(m1))\n\nm1_transposed = m1.T\nprint('Transposed matrix: \\n{}'.format(m1_transposed))\n", "Initial matrix: \n[[0.1 1. 2. 5. ]\n [3. 0.24 4. 0.6 ]]\nTransposed matrix: \n[[0.1 3. ]\n [1. 0.24]\n [2. 4. ]\n [5. 0.6 ]]\n" ], [ "m1 = np.array([[ .1, 1.], [2., 5.], [ 3., .24], [4., .6]]) \nprint('Initial matrix: \\n{}'.format(m1))\n\nm1_transposed = m1.T\nprint('Transposed matrix: \\n{}'.format(m1_transposed))\n", "Initial matrix: \n[[0.1 1. ]\n [2. 5. ]\n [3. 0.24]\n [4. 0.6 ]]\nTransposed matrix: \n[[0.1 2. 3. 4. ]\n [1. 5. 0.24 0.6 ]]\n" ] ], [ [ "For vectors represented as matrices, this means transforming from a row vector (1, N) to a column vector (N, 1) or vice-versa:", "_____no_output_____" ] ], [ [ "v1 = np.array([ .1, 1., 2.])\nv1_reshaped = v1.reshape((1, -1))\n \nprint('Row vector as 2-d array: {}'.format(v1_reshaped))\nprint('Shape: {}'.format(v1_reshaped.shape))\n\nv1_transposed = v1_reshaped.T\n\nprint('Transposed (column vector as 2-d array): \\n{}'.format(v1_transposed))\nprint('Shape: {}'.format(v1_transposed.shape))\n", "Row vector as 2-d array: [[0.1 1. 2. ]]\nShape: (1, 3)\nTransposed (column vector as 2-d array): \n[[0.1]\n [1. ]\n [2. ]]\nShape: (3, 1)\n" ], [ "v1 = np.array([ 3., .23, 2., .6])\nv1_reshaped = v1.reshape((-1, 1))\n \nprint('Column vector as 2-d array: \\n{}'.format(v1_reshaped))\nprint('Shape: {}'.format(v1_reshaped.shape))\n\nv1_transposed = v1_reshaped.T\n\nprint('Transposed (row vector as 2-d array): {}'.format(v1_transposed))\nprint('Shape: {}'.format(v1_transposed.shape))\n", "Column vector as 2-d array: \n[[3. ]\n [0.23]\n [2. ]\n [0.6 ]]\nShape: (4, 1)\nTransposed (row vector as 2-d array): [[3. 0.23 2. 0.6 ]]\nShape: (1, 4)\n" ] ], [ [ "### d.2) Statistics operators\n\nNumpy also allows us to perform several operations over the rows and columns of a matrix, such as: \n \n* Sum\n* Mean\n* Max\n* Min\n* ...\n\nThe most important thing to take into account when using these is to know exactly in which direction we are performing the operations. We can perform, for example, a `max` operation over the whole matrix, obtaining the max value in all of the matrix values. Or we might want this value for each row, or for each column. 
Check the following examples:", "_____no_output_____" ] ], [ [ "m1 = np.array([[ .1, 1.], [2., 5.], [ 3., .24], [4., .6]]) \nprint('Initial matrix: \\n{}'.format(m1))", "Initial matrix: \n[[0.1  1.  ]\n [2.   5.  ]\n [3.   0.24]\n [4.   0.6 ]]\n" ] ], [ [ "Operating over all of the matrix's values:", "_____no_output_____" ] ], [ [ "print('Total sum of matrix elements: {}'.format(m1.sum()))\nprint('Maximum of all matrix elements: {}'.format(m1.max()))\nprint('Minimum of all matrix elements: {}'.format(m1.min()))\nprint('Mean of all matrix elements: {}'.format(m1.mean()))", "Total sum of matrix elements: 15.94\nMaximum of all matrix elements: 5.0\nMinimum of all matrix elements: 0.1\nMean of all matrix elements: 1.9925\n" ] ], [ [ "Operating across rows - produces a row with the sum/max/min/mean for each column:", "_____no_output_____" ] ], [ [ "print('Total sum of matrix elements: {}'.format(m1.sum(axis=0)))\nprint('Maximum of all matrix elements: {}'.format(m1.max(axis=0)))\nprint('Minimum of all matrix elements: {}'.format(m1.min(axis=0)))\nprint('Mean of all matrix elements: {}'.format(m1.mean(axis=0)))", "Total sum of matrix elements: [9.1  6.84]\nMaximum of all matrix elements: [4. 5.]\nMinimum of all matrix elements: [0.1  0.24]\nMean of all matrix elements: [2.275 1.71 ]\n" ] ], [ [ "Operating across columns - produces a column with the sum/max/min/mean for each row:", "_____no_output_____" ] ], [ [ "print('Total sum of matrix elements: {}'.format(m1.sum(axis=1)))\nprint('Maximum of all matrix elements: {}'.format(m1.max(axis=1)))\nprint('Minimum of all matrix elements: {}'.format(m1.min(axis=1)))\nprint('Mean of all matrix elements: {}'.format(m1.mean(axis=1)))", "Total sum of matrix elements: [1.1  7.   3.24 4.6 ]\nMaximum of all matrix elements: [1. 5. 3. 4.]\nMinimum of all matrix elements: [0.1  2.   0.24 0.6 ]\nMean of all matrix elements: [0.55 3.5  1.62 2.3 ]\n" ] ], [ [ "As an example, imagine that you have a matrix of shape (n_samples, n_features), where each row represents all the features for one sample. Then, to average over the samples, we do:", "_____no_output_____" ] ], [ [ "m1 = np.array([[ .1, 1.], [2., 5.], [ 3., .24], [4., .6]]) \nprint('Initial matrix: \\n{}'.format(m1))\n\nprint('\\n')\nprint('Sample 1: {}'.format(m1[0, :]))\nprint('Sample 2: {}'.format(m1[1, :]))\nprint('Sample 3: {}'.format(m1[2, :]))\nprint('Sample 4: {}'.format(m1[3, :]))\nprint('\\n')\n\nprint('Average over samples: \\n{}'.format(m1.mean(axis=0)))", "Initial matrix: \n[[0.1  1.  ]\n [2.   5.  ]\n [3.   0.24]\n [4.   0.6 ]]\n\n\nSample 1: [0.1 1. ]\nSample 2: [2. 5.]\nSample 3: [3.   0.24]\nSample 4: [4.  0.6]\n\n\nAverage over samples: \n[2.275 1.71 ]\n" ] ], [ [ "Other statistical functions behave in a similar manner, so it is important to know how to work with the axes of these objects.", "_____no_output_____" ], [ "## e) Multiple matrix operations ", "_____no_output_____" ], [ "### e.1) Element-wise operations\n\nSeveral of the available operations work at the element level, that is, if we have two matrices A and B:\n\n$$ A = \\begin{bmatrix} a & b \\\\ c & d \\\\ \\end{bmatrix} $$\n\nand \n\n$$ B = \\begin{bmatrix} e & f \\\\ g & h \\\\ \\end{bmatrix} $$\n\nan element-wise operation produces a matrix:\n\n$$ Op(A, B) = \\begin{bmatrix} Op(a,e) & Op(b,f) \\\\ Op(c,g) & Op(d,h) \\\\ \\end{bmatrix} $$\n\nYou can perform sum and difference, but also element-wise multiplication and division. These are implemented with the regular operators `+`, `-`, `*`, `/`. 
Check out the examples below:", "_____no_output_____" ] ], [ [ "m1 = np.array([[ .1, 1., 2., 5.], [ 3., .24, 4., .6]]) \nm2 = np.array([[ .1, 4., .25, .1], [ 2., 1.5, .42, -1.]]) \n\nprint('Matrix 1: \\n{}'.format(m1))\nprint('Matrix 2: \\n{}'.format(m2))\n\nprint('\\n')\nprint('Sum: \\n{}'.format(m1 + m2))\n\nprint('\\n')\nprint('Difference: \\n{}'.format(m1 - m2))\n\nprint('\\n')\nprint('Multiplication: \\n{}'.format(m1*m2))\n\nprint('\\n')\nprint('Division: \\n{}'.format(m1/m2))", "Matrix 1: \n[[0.1  1.   2.   5.  ]\n [3.   0.24 4.   0.6 ]]\nMatrix 2: \n[[ 0.1   4.    0.25  0.1 ]\n [ 2.    1.5   0.42 -1.  ]]\n\n\nSum: \n[[ 0.2   5.    2.25  5.1 ]\n [ 5.    1.74  4.42 -0.4 ]]\n\n\nDifference: \n[[ 0.   -3.    1.75  4.9 ]\n [ 1.   -1.26  3.58  1.6 ]]\n\n\nMultiplication: \n[[ 0.01  4.    0.5   0.5 ]\n [ 6.    0.36  1.68 -0.6 ]]\n\n\nDivision: \n[[ 1.          0.25        8.         50.        ]\n [ 1.5         0.16        9.52380952 -0.6       ]]\n" ] ], [ [ "For these operations, ideally your matrices should have the same dimensions. An exception to this is when you have one of the elements that can be [broadcasted](https://numpy.org/doc/stable/user/basics.broadcasting.html) over the other. However, we won't cover that in these examples.", "_____no_output_____" ], [ "### e.2) Matrix multiplication\n\nAlthough you've seen how to perform element-wise multiplication with the basic operation, one of the most common matrix operations is matrix multiplication, where the output is not the result of an element-wise combination of its elements, but actually a linear combination between rows of the first matrix and columns of the second.\n\nIn other words, element (i, j) of the resulting matrix is the dot product between row i of the first matrix and column j of the second:\n\n![matrix-multiply](assets/matrix-multiply-a.svg)\n\nWhere the dot product shown breaks down to:\n\n$$ 58 = 1 \\times 7 + 2 \\times 9 + 3 \\times 11 $$\n\nNumpy already provides this function, so check out the following examples:", "_____no_output_____" ] ], [ [ "m1 = np.array([[ .1, 1., 2., 5.], [ 3., .24, 4., .6]]) \nm2 = np.array([[ .1, 4.], [.25, .1], [ 2., 1.5], [.42, -1.]]) \n\nprint('Matrix 1: \\n{}'.format(m1))\nprint('Matrix 2: \\n{}'.format(m2))\n\nprint('\\n')\nprint('Matrix multiplication: \\n{}'.format(np.matmul(m1, m2)))\n", "Matrix 1: \n[[0.1  1.   2.   5.  ]\n [3.   0.24 4.   0.6 ]]\nMatrix 2: \n[[ 0.1   4.  ]\n [ 0.25  0.1 ]\n [ 2.    1.5 ]\n [ 0.42 -1.  ]]\n\n\nMatrix multiplication: \n[[ 6.36  -1.5  ]\n [ 8.612 17.424]]\n" ], [ "m1 = np.array([[ .1, 4.], [.25, .1], [ 2., 1.5], [.42, -1.]]) \nm2 = np.array([[ .1, 1., 2.], [ 3., .24, 4.]]) \n\nprint('Matrix 1: \\n{}'.format(m1))\nprint('Matrix 2: \\n{}'.format(m2))\n\nprint('\\n')\nprint('Matrix multiplication: \\n{}'.format(np.matmul(m1, m2)))\n", "Matrix 1: \n[[ 0.1   4.  ]\n [ 0.25  0.1 ]\n [ 2.    1.5 ]\n [ 0.42 -1.  ]]\nMatrix 2: \n[[0.1  1.   2.  ]\n [3.   0.24 4.  ]]\n\n\nMatrix multiplication: \n[[12.01   1.06  16.2  ]\n [ 0.325  0.274  0.9  ]\n [ 4.7    2.36  10.   ]\n [-2.958  0.18  -3.16 ]]\n" ] ], [ [ "Notice that in both operations the matrix multiplication of shapes `(k, l)` and `(m, n)` yields a matrix of dimensions `(k, n)`. Additionally, for this operation to be possible, the inner dimensions need to match, that is, `l == m`. 
See what happens if we try to multiply matrices with incompatible dimensions:", "_____no_output_____" ] ], [ [ "m1 = np.array([[ .1, 4., 3.], [.25, .1, 1.], [ 2., 1.5, .5], [.42, -1., 4.3]]) \nm2 = np.array([[ .1, 1., 2.], [ 3., .24, 4.]]) \n\nprint('Matrix 1: \\n{}'.format(m1))\nprint('Shape: {}'.format(m1.shape))\n\nprint('Matrix 2: \\n{}'.format(m2))\nprint('Shape: {}'.format(m2.shape))\nprint('\\n')\n\ntry:\n    m3 = np.matmul(m1, m2)\nexcept Exception as e:\n    print('Matrix multiplication raised the following error: {}'.format(e))\n", "Matrix 1: \n[[ 0.1   4.    3.  ]\n [ 0.25  0.1   1.  ]\n [ 2.    1.5   0.5 ]\n [ 0.42 -1.    4.3 ]]\nShape: (4, 3)\nMatrix 2: \n[[0.1  1.   2.  ]\n [3.   0.24 4.  ]]\nShape: (2, 3)\n\n\nMatrix multiplication raised the following error: matmul: Input operand 1 has a mismatch in its core dimension 0, with gufunc signature (n?,k),(k,m?)->(n?,m?) (size 2 is different from 3)\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ] ]
d028d5160fb78d32a85cbadcbceccf3cbb5b3810
59,645
ipynb
Jupyter Notebook
tensorflow/practice/recurrentnets/SimpleRNNECG5000.ipynb
lisuizhe/ml-algorithm
af0755869657b4085de44d4ec95b8c5269d9ac1a
[ "Apache-2.0" ]
3
2019-04-21T06:04:20.000Z
2019-04-26T00:03:14.000Z
tensorflow/practice/recurrentnets/SimpleRNNECG5000.ipynb
lisuizhe/ml-algorithm
af0755869657b4085de44d4ec95b8c5269d9ac1a
[ "Apache-2.0" ]
null
null
null
tensorflow/practice/recurrentnets/SimpleRNNECG5000.ipynb
lisuizhe/ml-algorithm
af0755869657b4085de44d4ec95b8c5269d9ac1a
[ "Apache-2.0" ]
null
null
null
139.357477
44,804
0.857557
[ [ [ "%matplotlib inline\n\nfrom scipy.io import arff\nimport numpy as np\n\n# download from http://timeseriesclassification.com/description.php?Dataset=ECG5000\ndataset_train, meta = arff.loadarff('./data/ECG5000/ECG5000_TRAIN.arff')\n\nds_train = np.asarray(dataset_train.tolist(), dtype=np.float32)\nx_dataset = ds_train[:, :140]\ny_dataset = np.asarray(ds_train[:,-1].tolist(), dtype=np.int8)-1", "_____no_output_____" ], [ "import matplotlib.pyplot as plt\n\nN = 10\nprint(y_dataset[:N])\nobj = plt.plot(x_dataset[:N].T)\nplt.legend(obj, [str(n) for n in range(N)])", "[0 0 0 0 0 0 0 0 0 0]\n" ], [ "from tensorflow.keras.utils import to_categorical\n\nx_train = x_dataset[:,:,np.newaxis]\ny_train = to_categorical(y_dataset)\n\nprint(x_train.shape)\nprint(y_train.shape)", "(500, 140, 1)\n(500, 5)\n" ], [ "dataset_test, meta = arff.loadarff('./data/ECG5000/ECG5000_TEST.arff')\nds_test = np.asarray(dataset_test.tolist(), dtype=np.float32)\n\nprint(ds_test.shape)", "(4500, 141)\n" ], [ "x_test = ds_test[:, :140][:,:,np.newaxis]\ny_test = to_categorical(np.asarray(ds_test[:,-1].tolist(), dtype=np.int8)-1)\n\nprint(x_test.shape)\nprint(y_test.shape)", "(4500, 140, 1)\n(4500, 5)\n" ], [ "from tensorflow.keras.models import Sequential\nfrom tensorflow.keras.layers import Dense, Activation, SimpleRNN\n\nhid_dim = 10\n\nmodel = Sequential()\nmodel.add(SimpleRNN(hid_dim, input_shape=x_train.shape[1:]))\nmodel.add(Dense(y_train.shape[1], activation='softmax'))\nmodel.compile(loss='categorical_crossentropy',\n optimizer='adam',\n metrics=['accuracy'])", "WARNING:tensorflow:From C:\\Users\\suizh\\Anaconda3\\lib\\site-packages\\tensorflow\\python\\ops\\resource_variable_ops.py:435: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.\nInstructions for updating:\nColocations handled automatically by placer.\n" ], [ "model.fit(x_train, y_train,\n epochs=50, batch_size=100, verbose=2,\n validation_split=0.2)", "Train on 400 samples, validate on 100 samples\nWARNING:tensorflow:From C:\\Users\\suizh\\Anaconda3\\lib\\site-packages\\tensorflow\\python\\ops\\math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\nInstructions for updating:\nUse tf.cast instead.\nEpoch 1/50\n - 1s - loss: 1.9762 - acc: 0.2000 - val_loss: 1.2430 - val_acc: 0.4800\nEpoch 2/50\n - 0s - loss: 1.9121 - acc: 0.2325 - val_loss: 1.2100 - val_acc: 0.5000\nEpoch 3/50\n - 1s - loss: 1.8505 - acc: 0.2500 - val_loss: 1.1801 - val_acc: 0.6000\nEpoch 4/50\n - 0s - loss: 1.7908 - acc: 0.2675 - val_loss: 1.1534 - val_acc: 0.6400\nEpoch 5/50\n - 1s - loss: 1.7336 - acc: 0.2825 - val_loss: 1.1295 - val_acc: 0.6600\nEpoch 6/50\n - 1s - loss: 1.6769 - acc: 0.2900 - val_loss: 1.1086 - val_acc: 0.6700\nEpoch 7/50\n - 0s - loss: 1.6237 - acc: 0.3050 - val_loss: 1.0901 - val_acc: 0.6700\nEpoch 8/50\n - 0s - loss: 1.5708 - acc: 0.3150 - val_loss: 1.0742 - val_acc: 0.6700\nEpoch 9/50\n - 0s - loss: 1.5209 - acc: 0.3250 - val_loss: 1.0604 - val_acc: 0.6700\nEpoch 10/50\n - 0s - loss: 1.4704 - acc: 0.3300 - val_loss: 1.0486 - val_acc: 0.6800\nEpoch 11/50\n - 0s - loss: 1.4223 - acc: 0.3400 - val_loss: 1.0387 - val_acc: 0.6800\nEpoch 12/50\n - 0s - loss: 1.3749 - acc: 0.3475 - val_loss: 1.0304 - val_acc: 0.6800\nEpoch 13/50\n - 0s - loss: 1.3281 - acc: 0.3600 - val_loss: 1.0232 - val_acc: 0.6800\nEpoch 14/50\n - 0s - loss: 1.2829 - acc: 0.3725 - val_loss: 1.0171 - val_acc: 0.6900\nEpoch 15/50\n - 0s - loss: 1.2363 - acc: 0.3875 - val_loss: 
1.0121 - val_acc: 0.6800\nEpoch 16/50\n - 0s - loss: 1.1914 - acc: 0.4175 - val_loss: 1.0081 - val_acc: 0.6800\nEpoch 17/50\n - 0s - loss: 1.1448 - acc: 0.4525 - val_loss: 1.0049 - val_acc: 0.6800\nEpoch 18/50\n - 0s - loss: 1.0989 - acc: 0.5000 - val_loss: 1.0027 - val_acc: 0.6800\nEpoch 19/50\n - 0s - loss: 1.0511 - acc: 0.5575 - val_loss: 1.0014 - val_acc: 0.6700\nEpoch 20/50\n - 0s - loss: 1.0049 - acc: 0.6425 - val_loss: 1.0011 - val_acc: 0.6700\nEpoch 21/50\n - 0s - loss: 0.9575 - acc: 0.7475 - val_loss: 1.0018 - val_acc: 0.6700\nEpoch 22/50\n - 0s - loss: 0.9103 - acc: 0.8050 - val_loss: 1.0035 - val_acc: 0.6700\nEpoch 23/50\n - 0s - loss: 0.8649 - acc: 0.8700 - val_loss: 1.0064 - val_acc: 0.6700\nEpoch 24/50\n - 0s - loss: 0.8196 - acc: 0.8900 - val_loss: 1.0101 - val_acc: 0.6700\nEpoch 25/50\n - 0s - loss: 0.7753 - acc: 0.9275 - val_loss: 1.0149 - val_acc: 0.6700\nEpoch 26/50\n - 0s - loss: 0.7349 - acc: 0.9300 - val_loss: 1.0204 - val_acc: 0.6700\nEpoch 27/50\n - 0s - loss: 0.6947 - acc: 0.9300 - val_loss: 1.0263 - val_acc: 0.6700\nEpoch 28/50\n - 0s - loss: 0.6569 - acc: 0.9400 - val_loss: 1.0336 - val_acc: 0.6700\nEpoch 29/50\n - 0s - loss: 0.6203 - acc: 0.9475 - val_loss: 1.0414 - val_acc: 0.6700\nEpoch 30/50\n - 0s - loss: 0.5863 - acc: 0.9525 - val_loss: 1.0529 - val_acc: 0.6700\nEpoch 31/50\n - 1s - loss: 0.5511 - acc: 0.9625 - val_loss: 1.0690 - val_acc: 0.6600\nEpoch 32/50\n - 1s - loss: 0.5238 - acc: 0.9600 - val_loss: 1.0860 - val_acc: 0.6500\nEpoch 33/50\n - 1s - loss: 0.5011 - acc: 0.9600 - val_loss: 1.0926 - val_acc: 0.6500\nEpoch 34/50\n - 1s - loss: 0.4789 - acc: 0.9675 - val_loss: 1.0943 - val_acc: 0.6500\nEpoch 35/50\n - 1s - loss: 0.4577 - acc: 0.9675 - val_loss: 1.0919 - val_acc: 0.6600\nEpoch 36/50\n - 0s - loss: 0.4367 - acc: 0.9675 - val_loss: 1.0895 - val_acc: 0.6600\nEpoch 37/50\n - 1s - loss: 0.4192 - acc: 0.9675 - val_loss: 1.0900 - val_acc: 0.6600\nEpoch 38/50\n - 1s - loss: 0.4033 - acc: 0.9675 - val_loss: 1.0949 - val_acc: 0.6600\nEpoch 39/50\n - 0s - loss: 0.3884 - acc: 0.9675 - val_loss: 1.1006 - val_acc: 0.6600\nEpoch 40/50\n - 0s - loss: 0.3739 - acc: 0.9675 - val_loss: 1.1071 - val_acc: 0.6600\nEpoch 41/50\n - 0s - loss: 0.3604 - acc: 0.9675 - val_loss: 1.1145 - val_acc: 0.6600\nEpoch 42/50\n - 0s - loss: 0.3476 - acc: 0.9675 - val_loss: 1.1208 - val_acc: 0.6600\nEpoch 43/50\n - 0s - loss: 0.3354 - acc: 0.9675 - val_loss: 1.1254 - val_acc: 0.6600\nEpoch 44/50\n - 0s - loss: 0.3234 - acc: 0.9675 - val_loss: 1.1303 - val_acc: 0.6600\nEpoch 45/50\n - 0s - loss: 0.3123 - acc: 0.9700 - val_loss: 1.1339 - val_acc: 0.6600\nEpoch 46/50\n - 1s - loss: 0.3019 - acc: 0.9725 - val_loss: 1.1399 - val_acc: 0.6600\nEpoch 47/50\n - 0s - loss: 0.2930 - acc: 0.9725 - val_loss: 1.1456 - val_acc: 0.6600\nEpoch 48/50\n - 0s - loss: 0.2835 - acc: 0.9725 - val_loss: 1.1515 - val_acc: 0.6600\nEpoch 49/50\n - 1s - loss: 0.2759 - acc: 0.9775 - val_loss: 1.1578 - val_acc: 0.6600\nEpoch 50/50\n - 0s - loss: 0.2682 - acc: 0.9800 - val_loss: 1.1622 - val_acc: 0.6600\n" ], [ "score = model.evaluate(x_test, y_test, verbose=0)\nprint('Test loss:', score[0])\nprint('Test accuracy,', score[1])", "Test loss: 0.5085278123219807\nTest accuracy, 0.896\n" ], [ "from IPython.display import SVG\nfrom tensorflow.python.keras.utils.vis_utils import model_to_dot\n\nSVG(model_to_dot(model).create(prog='dot', format='svg'))", "_____no_output_____" ], [ "model.summary()", "_________________________________________________________________\nLayer (type) Output Shape Param # 
\n=================================================================\nsimple_rnn (SimpleRNN) (None, 10) 120 \n_________________________________________________________________\ndense (Dense) (None, 5) 55 \n=================================================================\nTotal params: 175\nTrainable params: 175\nNon-trainable params: 0\n_________________________________________________________________\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d028dc3ead25183e65430330b4ba6d81548920f7
12,218
ipynb
Jupyter Notebook
Cloud_Data_warehouse/.ipynb_checkpoints/create_infrastructure-checkpoint.ipynb
ManaliSharma/Data_Engineering_Projects
4a6eb1be9bc3de36ca44bfd7b635b92a2adb76ae
[ "MIT" ]
1
2021-07-28T02:33:13.000Z
2021-07-28T02:33:13.000Z
Cloud_Data_warehouse/.ipynb_checkpoints/create_infrastructure-checkpoint.ipynb
ManaliSharma/Data_Engineering_Projects
4a6eb1be9bc3de36ca44bfd7b635b92a2adb76ae
[ "MIT" ]
null
null
null
Cloud_Data_warehouse/.ipynb_checkpoints/create_infrastructure-checkpoint.ipynb
ManaliSharma/Data_Engineering_Projects
4a6eb1be9bc3de36ca44bfd7b635b92a2adb76ae
[ "MIT" ]
null
null
null
37.136778
177
0.534048
[ [ [ "#importing libraries\nimport pandas as pd\nimport boto3\nimport json\nimport configparser\nfrom botocore.exceptions import ClientError\nimport psycopg2", "_____no_output_____" ], [ "\ndef config_parse_file():\n \"\"\"\n Parse the dwh.cfg configuration file\n :return:\n \"\"\"\n global KEY, SECRET, DWH_CLUSTER_TYPE, DWH_NUM_NODES, \\\n DWH_NODE_TYPE, DWH_CLUSTER_IDENTIFIER, DWH_DB, \\\n DWH_DB_USER, DWH_DB_PASSWORD, DWH_PORT, DWH_IAM_ROLE_NAME\n\n print(\"Parsing the config file...\")\n config = configparser.ConfigParser()\n with open('dwh.cfg') as configfile:\n config = configparser.ConfigParser()\n config.read_file(open('dwh.cfg'))\n\n KEY = config.get('AWS','KEY')\n SECRET = config.get('AWS','SECRET')\n\n DWH_CLUSTER_TYPE = config.get(\"DWH\",\"DWH_CLUSTER_TYPE\")\n DWH_NUM_NODES = config.get(\"DWH\",\"DWH_NUM_NODES\")\n DWH_NODE_TYPE = config.get(\"DWH\",\"DWH_NODE_TYPE\")\n\n DWH_CLUSTER_IDENTIFIER = config.get(\"DWH\",\"DWH_CLUSTER_IDENTIFIER\")\n DWH_DB = config.get(\"CLUSTER\",\"DWH_DB\")\n DWH_DB_USER = config.get(\"CLUSTER\",\"DWH_DB_USER\")\n DWH_DB_PASSWORD = config.get(\"CLUSTER\",\"DWH_DB_PASSWORD\")\n DWH_PORT = config.get(\"CLUSTER\",\"DWH_PORT\")\n\n DWH_IAM_ROLE_NAME = config.get(\"DWH\", \"DWH_IAM_ROLE_NAME\")\n\n#Function for creating iam_role\ndef create_iam_role(iam):\n \"\"\"\n Create the AWS IAM role\n :param iam:\n :return:\n \"\"\"\n global DWH_IAM_ROLE_NAME\n dwhRole = None\n try:\n print('1.1 Creating a new IAM Role')\n dwhRole = iam.create_role(\n Path='/',\n RoleName=DWH_IAM_ROLE_NAME,\n Description=\"Allows Redshift clusters to call AWS services on your behalf.\",\n AssumeRolePolicyDocument=json.dumps(\n {'Statement': [{'Action': 'sts:AssumeRole',\n 'Effect': 'Allow',\n 'Principal': {'Service': 'redshift.amazonaws.com'}}],\n 'Version': '2012-10-17'})\n )\n except Exception as e:\n print(e)\n dwhRole = iam.get_role(RoleName=DWH_IAM_ROLE_NAME)\n return dwhRole\n\n \ndef attach_iam_role_policy(iam):\n \"\"\"\n Attach the AmazonS3ReadOnlyAccess role policy to the created IAM\n :param iam:\n :return:\n \"\"\"\n global DWH_IAM_ROLE_NAME\n print('1.2 Attaching Policy')\n return iam.attach_role_policy(RoleName=DWH_IAM_ROLE_NAME, PolicyArn=\"arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess\")['ResponseMetadata']['HTTPStatusCode'] == 200\n\n\ndef get_iam_role_arn(iam):\n \"\"\"\n Get the IAM role ARN string\n :param iam: The IAM resource client\n :return:string\n \"\"\"\n global DWH_IAM_ROLE_NAME\n return iam.get_role(RoleName=DWH_IAM_ROLE_NAME)['Role']['Arn']\n \n \n#Function to create cluster\n\ndef create_cluster(redshift, roleArn):\n \"\"\"\n Start the Redshift cluster creation\n :param redshift: The redshift resource client\n :param roleArn: The created role ARN\n :return:\n \"\"\"\n global DWH_CLUSTER_TYPE, DWH_NODE_TYPE, DWH_NUM_NODES, DWH_DB, DWH_CLUSTER_IDENTIFIER, DWH_DB_USER, DWH_DB_PASSWORD\n try:\n response = redshift.create_cluster( \n #HW\n ClusterType=DWH_CLUSTER_TYPE,\n NodeType=DWH_NODE_TYPE,\n NumberOfNodes=int(DWH_NUM_NODES),\n\n #Identifiers & Credentials\n DBName=DWH_DB,\n ClusterIdentifier=DWH_CLUSTER_IDENTIFIER,\n MasterUsername=DWH_DB_USER,\n MasterUserPassword=DWH_DB_PASSWORD,\n\n #Roles (for s3 access)\n IamRoles=[roleArn] \n )\n print(\"Redshift cluster creation http response status code: \")\n print(response['ResponseMetadata']['HTTPStatusCode'])\n return response['ResponseMetadata']['HTTPStatusCode'] == 200\n except Exception as e:\n print(e)\n return False\n\n\n#Adding details to config file\ndef 
config_persist_cluster_infos(redshift):\n \"\"\"\n Write back to the dwh.cfg configuration file the cluster endpoint and IAM ARN\n :param redshift: The redshift resource client\n :return:\n \"\"\"\n global DWH_CLUSTER_IDENTIFIER\n print(\"Writing the cluster address and IamRoleArn to the config file...\")\n\n cluster_props = redshift.describe_clusters(ClusterIdentifier=DWH_CLUSTER_IDENTIFIER)['Clusters'][0]\n\n config = configparser.ConfigParser()\n\n with open('dwh.cfg') as configfile:\n config.read_file(configfile)\n\n config.set(\"CLUSTER\", \"HOST\", cluster_props['Endpoint']['Address'])\n config.set(\"IAM_ROLE\", \"ARN\", cluster_props['IamRoles'][0]['IamRoleArn'])\n\n with open('dwh.cfg', 'w+') as configfile:\n config.write(configfile)\n\n config_parse_file()\n \n\n \n#Function to retrive redshift cluster properties\ndef prettyRedshiftProps(props):\n \n '''\n Retrieve Redshift clusters properties\n '''\n \n pd.set_option('display.max_colwidth', -1)\n keysToShow = [\"ClusterIdentifier\", \"NodeType\", \"ClusterStatus\", \"MasterUsername\", \"DBName\", \"Endpoint\", \"NumberOfNodes\", 'VpcId']\n x = [(k, v) for k,v in props.items() if k in keysToShow]\n return pd.DataFrame(data=x, columns=[\"Key\", \"Value\"])\n\n#Function to get cluster properties\ndef get_cluster_props(redshift):\n \"\"\"\n Retrieves the Redshift cluster status\n :param redshift: The Redshift resource client\n :return: The cluster status\n \"\"\"\n global DWH_CLUSTER_IDENTIFIER\n myClusterProps = redshift.describe_clusters(ClusterIdentifier=DWH_CLUSTER_IDENTIFIER)['Clusters'][0]\n cluster_status = myClusterProps['ClusterStatus']\n return cluster_status.lower()\n\n#to check if cluster became available or not\ndef check_cluster_creation(redshift):\n \"\"\"\n Check if the cluster status is available, if it is returns True. 
Otherwise, returns False.\n    :param redshift: The Redshift client resource\n    :return:bool\n    \"\"\"\n    if get_cluster_props(redshift) == 'available':\n        return True\n    return False\n\n#Function to open an incoming TCP port to access the cluster endpoint\ndef aws_open_redshift_port(ec2, redshift):\n    \"\"\"\n    Opens the Redshift port on the VPC security group.\n    :param ec2: The EC2 client resource\n    :param redshift: The Redshift client resource\n    :return:None\n    \"\"\"\n    global DWH_CLUSTER_IDENTIFIER, DWH_PORT\n    cluster_props = redshift.describe_clusters(ClusterIdentifier=DWH_CLUSTER_IDENTIFIER)['Clusters'][0]\n    try:\n        vpc = ec2.Vpc(id=cluster_props['VpcId'])\n        all_security_groups = list(vpc.security_groups.all())\n        print(all_security_groups)\n        defaultSg = all_security_groups[1]\n        print(defaultSg)\n\n        defaultSg.authorize_ingress(\n            GroupName=defaultSg.group_name,\n            CidrIp='0.0.0.0/0',\n            IpProtocol='TCP',\n            FromPort=int(DWH_PORT),\n            ToPort=int(DWH_PORT)\n        )\n    except Exception as e:\n        print(e) \n\n##Create clients for IAM, EC2, S3 and Redshift\ndef aws_resource(name, region):\n    \"\"\"\n    Creates an AWS client resource\n    :param name: The name of the resource\n    :param region: The region of the resource\n    :return:\n    \"\"\"\n    global KEY, SECRET\n    return boto3.resource(name, region_name=region, aws_access_key_id=KEY, aws_secret_access_key=SECRET)\n\n\ndef aws_client(service, region):\n    \"\"\"\n    Creates an AWS client\n    :param service: The service\n    :param region: The region of the service\n    :return:\n    \"\"\"\n    global KEY, SECRET\n    return boto3.client(service, aws_access_key_id=KEY, aws_secret_access_key=SECRET, region_name=region)\n\n#delete resources\ndef delete_cluster_resources(redshift):\n    \"\"\"\n    Destroy the Redshift cluster (request deletion)\n    :param redshift: The Redshift client resource\n    :return:None\n    \"\"\"\n    global DWH_CLUSTER_IDENTIFIER\n    redshift.delete_cluster( ClusterIdentifier=DWH_CLUSTER_IDENTIFIER, SkipFinalClusterSnapshot=True)\n\ndef delete_iam_resource(iam):\n    iam.detach_role_policy(RoleName=DWH_IAM_ROLE_NAME, PolicyArn=\"arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess\")\n    iam.delete_role(RoleName=DWH_IAM_ROLE_NAME)\n    \n", "_____no_output_____" ], [ "#Main Function to start the process\ndef main():\n    \n    config_parse_file()\n    # ec2 = aws_resource('ec2', 'us-east-2')\n    # s3 = aws_resource('s3', 'us-west-2')\n    iam = aws_client('iam', \"us-east-1\")\n    redshift = aws_client('redshift', \"us-east-1\")\n    \n    create_iam_role(iam)\n    attach_iam_role_policy(iam)\n    roleArn = get_iam_role_arn(iam)\n\n    clusterCreationStarted = create_cluster(redshift, roleArn)\n    \n    if clusterCreationStarted:\n        print(\"The cluster is being created.\")", "_____no_output_____" ], [ "# if __name__ == '__main__':\n#     main()", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code" ] ]
d028ddb9794ee5de75a48950f470954e5dd010bf
25,286
ipynb
Jupyter Notebook
notebooks/test_raw_gan_check.ipynb
Steve-Tod/3D_generation
75817f91f4a69da06375b4401d4182e932102cd1
[ "Apache-2.0" ]
null
null
null
notebooks/test_raw_gan_check.ipynb
Steve-Tod/3D_generation
75817f91f4a69da06375b4401d4182e932102cd1
[ "Apache-2.0" ]
null
null
null
notebooks/test_raw_gan_check.ipynb
Steve-Tod/3D_generation
75817f91f4a69da06375b4401d4182e932102cd1
[ "Apache-2.0" ]
1
2019-07-24T03:35:27.000Z
2019-07-24T03:35:27.000Z
35.464236
158
0.484853
[ [ [ "import os, sys\nos.environ['CUDA_VISIBLE_DEVICES'] = '2'\nsys.path.append('../')", "_____no_output_____" ], [ "import argparse, json\nfrom tqdm import tqdm_notebook as tqdm", "_____no_output_____" ], [ "import os.path as osp\nfrom data.pointcloud_dataset import load_one_class_under_folder\nfrom utils.dirs import mkdir_and_rename\nfrom utils.tf import reset_tf_graph", "_____no_output_____" ], [ "opt = {\n 'data': {\n 'data_root':\n '/orion/u/jiangthu/projects/latent_3d_points/data/shape_net_core_uniform_samples_2048',\n 'class_name': 'airplane',\n 'n_thread': 20\n },\n 'model': {\n 'type': 'wgan',\n 'num_points': 2048,\n 'noise_dim': 128,\n 'noise_params': {\n 'mu': 0,\n 'sigma': 0.2\n }\n },\n 'train': {\n 'batch_size': 50,\n 'learning_rate': 0.0001,\n 'beta': 0.5,\n 'z_rotate': False,\n 'saver_step': 100\n },\n 'path': {\n 'train_root': './experiments',\n 'experiment_name': 'single_class_gan_chair_noise128'\n }\n}\ntrain_dir = osp.join(opt['path']['train_root'], opt['path']['experiment_name'])\ntrain_opt = opt['train']", "_____no_output_____" ], [ "import numpy as np\nimport tensorflow as tf\nfrom utils.tf import leaky_relu\nfrom utils.tf import expand_scope_by_name\nfrom tflearn.layers.normalization import batch_normalization\nfrom tflearn.layers.core import fully_connected, dropout\nfrom tflearn.layers.conv import conv_1d\nfrom utils.tf import expand_scope_by_name, replicate_parameter_for_all_layers\nimport tflearn", "_____no_output_____" ], [ "def encoder_with_convs_and_symmetry(in_signal,\n init_list,\n n_filters=[64, 128, 256, 1024],\n filter_sizes=[1],\n strides=[1],\n non_linearity=tf.nn.relu,\n weight_decay=0.001,\n symmetry=tf.reduce_max,\n regularizer=None,\n scope=None,\n reuse=False,\n padding='same',\n verbose=False,\n conv_op=conv_1d):\n '''An Encoder (recognition network), which maps inputs onto a latent space.\n '''\n\n if verbose:\n print('Building Encoder')\n\n n_layers = len(n_filters)\n filter_sizes = replicate_parameter_for_all_layers(filter_sizes, n_layers)\n strides = replicate_parameter_for_all_layers(strides, n_layers)\n\n if n_layers < 2:\n raise ValueError('More than 1 layers are expected.')\n\n for i in range(n_layers):\n if i == 0:\n layer = in_signal\n\n name = 'encoder_conv_layer_' + str(i)\n scope_i = expand_scope_by_name(scope, name)\n layer = conv_op(layer,\n nb_filter=n_filters[i],\n filter_size=filter_sizes[i],\n strides=strides[i],\n regularizer=regularizer,\n weight_decay=weight_decay,\n name=name,\n reuse=reuse,\n scope=scope_i,\n padding=padding,\n weights_init=tf.constant_initializer(init_list[i][0]),\n bias_init=tf.constant_initializer(init_list[i][1]))\n\n if non_linearity is not None:\n layer = non_linearity(layer)\n\n if verbose:\n print(layer)\n print('output size:', np.prod(layer.get_shape().as_list()[1:]),\n '\\n')\n\n if symmetry is not None:\n layer = symmetry(layer, axis=1)\n if verbose:\n print(layer)\n\n return layer\n\n\ndef decoder_with_fc_only(latent_signal,\n init_list,\n layer_sizes=[],\n non_linearity=tf.nn.relu,\n regularizer=None,\n weight_decay=0.001,\n reuse=False,\n scope=None,\n verbose=False):\n '''A decoding network which maps points from the latent space back onto the data space.\n '''\n if verbose:\n print('Building Decoder')\n\n n_layers = len(layer_sizes)\n\n if n_layers < 2:\n raise ValueError(\n 'For an FC decoder with single a layer use simpler code.')\n\n for i in range(0, n_layers - 1):\n name = 'decoder_fc_' + str(i)\n scope_i = expand_scope_by_name(scope, name)\n\n if i == 0:\n layer = latent_signal\n\n 
layer = fully_connected(\n layer,\n layer_sizes[i],\n activation='linear',\n weights_init=tf.constant_initializer(init_list[i][0]),\n bias_init=tf.constant_initializer(init_list[i][1]),\n name=name,\n regularizer=regularizer,\n weight_decay=weight_decay,\n reuse=reuse,\n scope=scope_i)\n\n if verbose:\n print(name,\n 'FC params = ',\n np.prod(layer.W.get_shape().as_list()) +\n np.prod(layer.b.get_shape().as_list()),\n end=' ')\n\n if non_linearity is not None:\n layer = non_linearity(layer)\n\n if verbose:\n print(layer)\n print('output size:', np.prod(layer.get_shape().as_list()[1:]),\n '\\n')\n\n # Last decoding layer never has a non-linearity.\n name = 'decoder_fc_' + str(n_layers - 1)\n scope_i = expand_scope_by_name(scope, name)\n layer = fully_connected(layer,\n layer_sizes[n_layers - 1],\n activation='linear',\n weights_init=tf.constant_initializer(init_list[-1][0]),\n bias_init=tf.constant_initializer(init_list[-1][1]),\n name=name,\n regularizer=regularizer,\n weight_decay=weight_decay,\n reuse=reuse,\n scope=scope_i)\n if verbose:\n print(name,\n 'FC params = ',\n np.prod(layer.W.get_shape().as_list()) +\n np.prod(layer.b.get_shape().as_list()),\n end=' ')\n\n if verbose:\n print(layer)\n print('output size:', np.prod(layer.get_shape().as_list()[1:]), '\\n')\n\n return layer\n\n\ndef mlp_discriminator(in_signal,\n cov_init_list,\n fc_init_list,\n non_linearity=tf.nn.relu,\n reuse=False,\n scope=None):\n ''' used in nips submission.\n '''\n encoder_args = {\n 'n_filters': [64, 128, 256, 256, 512],\n 'filter_sizes': [1, 1, 1, 1, 1],\n 'strides': [1, 1, 1, 1, 1]\n }\n encoder_args['reuse'] = reuse\n encoder_args['scope'] = scope\n encoder_args['non_linearity'] = non_linearity\n layer = encoder_with_convs_and_symmetry(in_signal, cov_init_list, weight_decay=0.0,\n **encoder_args)\n\n name = 'decoding_logits'\n scope_e = expand_scope_by_name(scope, name)\n d_logit = decoder_with_fc_only(layer,\n fc_init_list,\n layer_sizes=[128, 64, 1],\n reuse=reuse,\n scope=scope_e,\n weight_decay=0.0)\n d_prob = tf.nn.sigmoid(d_logit)\n return d_prob, d_logit\n\n\ndef point_cloud_generator(z,\n pc_dims,\n init_list,\n layer_sizes=[64, 128, 512, 1024],\n non_linearity=tf.nn.relu):\n ''' used in nips submission.\n '''\n\n n_points, dummy = pc_dims\n if (dummy != 3):\n raise ValueError()\n\n out_signal = decoder_with_fc_only(z,\n init_list[:-1],\n layer_sizes=layer_sizes,\n non_linearity=non_linearity, weight_decay=0.0)\n out_signal = non_linearity(out_signal)\n\n\n out_signal = fully_connected(out_signal,\n np.prod([n_points, 3]),\n activation='linear',\n weights_init=tf.constant_initializer(init_list[-1][0]),\n bias_init=tf.constant_initializer(init_list[-1][1]),\n weight_decay=0.0)\n out_signal = tf.reshape(out_signal, [-1, n_points, 3])\n return out_signal", "_____no_output_____" ], [ "from trainers.gan import GAN\nfrom tflearn import is_training\nclass PGAN(GAN):\n '''Gradient Penalty.\n https://arxiv.org/abs/1704.00028\n '''\n\n def __init__(self, name, learning_rate, lam, n_output, noise_dim, discriminator, generator, beta=0.5, gen_kwargs={}, disc_kwargs={}, graph=None):\n\n GAN.__init__(self, name, graph)\n \n self.noise_dim = noise_dim\n self.n_output = n_output\n self.discriminator = discriminator\n self.generator = generator\n \n with tf.variable_scope(name):\n self.noise = tf.placeholder(tf.float32, shape=[None, noise_dim]) # Noise vector.\n self.real_pc = tf.placeholder(tf.float32, shape=[None] + self.n_output) # Ground-truth.\n\n with tf.variable_scope('generator'):\n self.generator_out = 
self.generator(self.noise, self.n_output, **gen_kwargs)\n \n with tf.variable_scope('discriminator') as scope:\n self.real_prob, self.real_logit = self.discriminator(self.real_pc, scope=scope, **disc_kwargs)\n self.synthetic_prob, self.synthetic_logit = self.discriminator(self.generator_out, reuse=True, scope=scope, **disc_kwargs)\n \n \n # Compute WGAN losses\n self.loss_d_logit = tf.reduce_mean(self.synthetic_logit) - tf.reduce_mean(self.real_logit)\n self.loss_g = -tf.reduce_mean(self.synthetic_logit)\n\n# # Compute gradient penalty at interpolated points\n# ndims = self.real_pc.get_shape().ndims\n# batch_size = tf.shape(self.real_pc)[0]\n# alpha = 0.5\n# differences = self.generator_out - self.real_pc\n# interpolates = self.real_pc + (alpha * differences)\n\n# with tf.variable_scope('discriminator') as scope:\n# gradients = tf.gradients(self.discriminator(interpolates, reuse=True, scope=scope, **disc_kwargs)[1], [interpolates])[0]\n\n# # Reduce over all but the first dimension\n# slopes = tf.sqrt(tf.reduce_sum(tf.square(gradients), reduction_indices=list(range(1, ndims))))\n# self.gradient_penalty = tf.reduce_mean((slopes - 1.) ** 2)\n# self.loss_d = self.loss_d_logit + lam * self.gradient_penalty\n self.loss_d = self.loss_d_logit\n\n train_vars = tf.trainable_variables()\n d_params = [v for v in train_vars if v.name.startswith(name + '/discriminator/')]\n g_params = [v for v in train_vars if v.name.startswith(name + '/generator/')]\n \n self.opt_d = self.optimizer(learning_rate, beta, self.loss_d, d_params)\n self.opt_g = self.optimizer(learning_rate, beta, self.loss_g, g_params)\n# self.optimizer_d = tf.train.AdamOptimizer(learning_rate, beta1=beta)\n# self.opt_d = self.optimizer_d.minimize(self.loss_d, var_list=d_params)\n# self.optimizer_g = tf.train.AdamOptimizer(learning_rate, beta1=beta)\n# self.opt_g = self.optimizer_g.minimize(self.loss_g, var_list=g_params)\n\n self.saver = tf.train.Saver(tf.global_variables(), max_to_keep=None)\n self.init = tf.global_variables_initializer()\n\n # Launch the session\n config = tf.ConfigProto()\n config.gpu_options.allow_growth = True\n self.sess = tf.Session(config=config)\n self.sess.run(self.init)", "_____no_output_____" ], [ "# model\ndiscriminator = mlp_discriminator\ngenerator = point_cloud_generator", "_____no_output_____" ], [ "np.random.seed(0)\ng_fc_channel = [128, 64, 128, 512, 1024, 6144]\nd_cov_channel = [3, 64, 128, 256, 256, 512]\nd_fc_channel = [512, 128, 64, 1]\ng_fc_weight = []\nfor i in range(len(g_fc_channel) - 1):\n in_c = g_fc_channel[i]\n out_c = g_fc_channel[i + 1]\n g_fc_weight.append(\n (np.random.rand(in_c, out_c).astype(np.float32) * 0.1 - 0.05,\n np.random.rand(out_c).astype(np.float32) * 0.1 - 0.05))\n\nd_cov_weight = []\nfor i in range(len(d_cov_channel) - 1):\n in_c = d_cov_channel[i]\n out_c = d_cov_channel[i + 1]\n d_cov_weight.append((np.random.rand(in_c, out_c).astype(np.float32) * 0.1 - 0.05,\n np.random.rand(out_c).astype(np.float32) * 0.1 - 0.05))\n\nd_fc_weight = []\nfor i in range(len(d_fc_channel) - 1):\n in_c = d_fc_channel[i]\n out_c = d_fc_channel[i + 1]\n d_fc_weight.append((np.random.rand(in_c, out_c).astype(np.float32) * 0.1 - 0.05,\n np.random.rand(out_c).astype(np.float32) * 0.1 - 0.05))\n\ninput_noise = [np.random.rand(4, 128).astype(np.float32) * 0.1 - 0.05 for _ in range(10)]\ntarget_points = [\n np.random.rand(4, 2048, 3).astype(np.float32) * 0.1 - 0.05 for _ in range(10)\n]", "_____no_output_____" ], [ "reset_tf_graph()\ntf.random.set_random_seed(0)\nmodel_opt = opt['model']\nif 
model_opt['type'] == 'wgan':\n lam = 10\n disc_kwargs = {'cov_init_list': d_cov_weight, 'fc_init_list': d_fc_weight}\n gen_kwargs = {'init_list': g_fc_weight}\n gan = PGAN(model_opt['type'],\n train_opt['learning_rate'],\n lam, [model_opt['num_points'], 3],\n model_opt['noise_dim'],\n discriminator,\n generator,\n disc_kwargs=disc_kwargs,\n gen_kwargs=gen_kwargs,\n beta=train_opt['beta'])", "_____no_output_____" ], [ "for i in range(10):\n feed_dict = {gan.real_pc: target_points[i], gan.noise: input_noise[i]}\n _, loss_d = gan.sess.run([gan.opt_d, gan.loss_d], feed_dict=feed_dict)\n feed_dict = {gan.noise: input_noise[i]}\n _, loss_g = gan.sess.run([gan.opt_g, gan.loss_g], feed_dict=feed_dict)\n print(loss_d, loss_g)", "-2.1327287e-07 -0.00828593\n-2.4586916e-07 -0.008225871\n-2.933666e-07 -0.008159315\n-3.6507845e-07 -0.008088458\n-4.3399632e-07 -0.008014243\n-4.833564e-07 -0.007936596\n-5.3923577e-07 -0.007855793\n-5.9977174e-07 -0.00777228\n-6.495975e-07 -0.007685996\n-7.21775e-07 -0.007597178\n" ], [ "for i in range(10):\n feed_dict = {gan.real_pc: target_points[i], gan.noise: input_noise[i]}\n _, loss_d = gan.sess.run([gan.opt_d, gan.loss_d], feed_dict=feed_dict)\n feed_dict = {gan.noise: input_noise[i]}\n _, loss_g = gan.sess.run([gan.opt_g, gan.loss_g], feed_dict=feed_dict)\n print(loss_d, loss_g)", "-2.1327287e-07 -0.00828593\n-2.4586916e-07 -0.008225888\n-2.924353e-07 -0.008159323\n-3.6600977e-07 -0.008088484\n-4.3120235e-07 -0.00801428\n-4.833564e-07 -0.007936636\n-5.3830445e-07 -0.0078557655\n-5.9977174e-07 -0.0077722715\n-6.421469e-07 -0.007686084\n-7.2224066e-07 -0.007597371\n" ], [ "i = 0\nfeed_dict = {gan.real_pc: target_points[i], gan.noise: input_noise[i]}\n_, loss_d, loss_d_logit, gradient_penalty = gan.sess.run(\n [gan.opt_d, gan.loss_d, gan.loss_d_logit, gan.gradient_penalty],\n feed_dict=feed_dict)", "_____no_output_____" ], [ "float(loss_d), float(gradient_penalty), float(loss_d_logit)", "_____no_output_____" ], [ "gen_var = gan.sess.run(tf.trainable_variables('wgan/dis'))\nfor v in gen_var:\n print(v.reshape(-1)[0])", "0.041308507\n0.013890814\n0.0013969338\n0.041102726\n-0.040633775\n-0.045540597\n0.023586925\n0.03344834\n-0.009558369\n0.0005664937\n0.038841397\n0.020789754\n0.008475412\n0.032563414\n0.037454013\n0.0066774487\n" ], [ "# reset_tf_graph()\n# np.random.seed(0)\n# w = np.random.rand(3, 4).astype(np.float32)\n# b = np.random.rand(4).astype(np.float32)\n# in_f = np.random.rand(2, 3).astype(np.float32)\n\n# in_feat = tf.placeholder(tf.float32, [None, 3])\n# out = fully_connected(in_feat,\n# 4,\n# weights_init=tf.constant_initializer(w),\n# bias_init=tf.constant_initializer(b))\n\n# with tf.Session() as sess:\n# sess.run(tf.global_variables_initializer())\n# res = sess.run([out], feed_dict = {in_feat: in_f})\n# print(res[0])", "[[1.6817647 1.7762184 1.063653 1.252217 ]\n [2.2302346 2.4863346 1.6563923 1.8565607]]\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d028fe22fa804a1bdcacb353d3af87090abf9241
17,457
ipynb
Jupyter Notebook
util/imutil.ipynb
shoulderhu/azure-image-ipy
ed9bf5c5e4293b2516518e2f0434d35e97a27f11
[ "MIT" ]
null
null
null
util/imutil.ipynb
shoulderhu/azure-image-ipy
ed9bf5c5e4293b2516518e2f0434d35e97a27f11
[ "MIT" ]
null
null
null
util/imutil.ipynb
shoulderhu/azure-image-ipy
ed9bf5c5e4293b2516518e2f0434d35e97a27f11
[ "MIT" ]
null
null
null
41.075294
1,149
0.46583
[ [ [ "import numpy as np\nimport matplotlib.pyplot as plt\nimport os\nimport scipy.ndimage as ndi\nimport skimage.filters as fl\nimport warnings", "_____no_output_____" ], [ "from numpy import uint8, int64, float64, array, arange, zeros, zeros_like, ones, mean\nfrom numpy.fft import fft, fft2, ifft, ifft2, fftshift\nfrom math import log2\nfrom scipy.ndimage import convolve, correlate, uniform_filter, gaussian_laplace, gaussian_filter, generic_filter, minimum_filter, maximum_filter, median_filter, rank_filter, \\\nbinary_fill_holes, binary_dilation, binary_erosion, binary_opening, binary_closing\nfrom scipy.signal import wiener\nfrom skimage import io, data\nfrom skimage.color import rgb2gray\nfrom skimage.draw import polygon\nfrom skimage.exposure import adjust_gamma, equalize_hist, rescale_intensity\nfrom skimage.feature import canny\nfrom skimage.filters import threshold_otsu, threshold_isodata, prewitt_h, prewitt_v, prewitt, roberts, sobel_h, sobel_v, sobel, laplace\nfrom skimage.io import imshow\nfrom skimage.measure import label\nfrom skimage.morphology import dilation, erosion, opening, closing, square\nfrom skimage.transform import rescale\nfrom skimage.util import img_as_ubyte, img_as_float, img_as_bool, random_noise\nfrom IPython.core.interactiveshell import InteractiveShell", "_____no_output_____" ], [ "warnings.filterwarnings('ignore')\nInteractiveShell.ast_node_interactivity = \"all\"", "_____no_output_____" ] ], [ [ "## numpy", "_____no_output_____" ] ], [ [ "def add(image, c):\n return uint8(np.clip(float64(image) + c, 0, 255))", "_____no_output_____" ] ], [ [ "## matplotlib", "_____no_output_____" ] ], [ [ "def matplot(img, title=None, cmap=None, figsize=None):\n col = len(img)\n \n if figsize is None:\n plt.figure(figsize=(col * 4, col * 4))\n else:\n plt.figure(figsize=figsize)\n \n for i, j in enumerate(img):\n plt.subplot(1, col, i + 1)\n plt.axis(\"off\")\n \n if title != None:\n plt.title(title[i])\n if cmap != None and cmap[i] != \"\":\n plt.imshow(j, cmap=cmap[i])\n else:\n imshow(j)", "_____no_output_____" ] ], [ [ "## Chapter 2", "_____no_output_____" ] ], [ [ "def imread(fname):\n return io.imread(os.path.join(\"/home/nbuser/library/\", \"Image\", \"read\", fname))", "_____no_output_____" ], [ "def imsave(fname, image):\n io.imsave(os.path.join(\"/home/nbuser/library/\", \"Image\", \"save\", fname), image)", "_____no_output_____" ] ], [ [ "## Chapter 3", "_____no_output_____" ] ], [ [ "def spatial_resolution(image, scale):\n return rescale(rescale(image, 1 / scale), scale, order=0)", "_____no_output_____" ], [ "def grayslice(image, n):\n image = img_as_ubyte(image)\n v = 256 // n\n return image // v * v ", "_____no_output_____" ] ], [ [ "## Chapter 4", "_____no_output_____" ] ], [ [ "def imhist(image, equal=False):\n if equal:\n image = img_as_ubyte(equalize_hist(image))\n f = plt.figure()\n f.show(plt.hist(image.flatten(), bins=256))", "_____no_output_____" ] ], [ [ "## Chapter 5", "_____no_output_____" ] ], [ [ "def unsharp(alpha=0.2):\n A1 = array([[-1, 1, -1], \n [1, 1, 1], \n [-1, 1, -1]], dtype=float64)\n A2 = array([[0, -1, 0], \n [-1, 5, -1], \n [0, -1, 0]], dtype=float64)\n return (alpha * A1 + A2) / (alpha + 1)", "_____no_output_____" ] ], [ [ "## Chapter 6", "_____no_output_____" ] ], [ [ "ne = array([[1, 1, 0], [1, 1, 0], [0, 0, 0]])\nbi = array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4\nbc = array([[1, 4, 6, 4, 1], \n [4, 16, 24, 16, 4], \n [6, 24, 35, 24, 6], \n [4, 16, 24, 16, 4], \n [1, 4, 6, 4, 1]]) / 64", "_____no_output_____" ], [ "def 
zeroint(img):\n r, c = img.shape\n res = zeros((r*2, c*2))\n res[::2, ::2] = img\n return res", "_____no_output_____" ], [ "def spatial_filtering(img, p, filt):\n for i in range(int(log2(p))):\n img_zi = zeroint(img)\n img_sf = correlate(img_zi, filt, mode=\"reflect\")\n return img_sf", "_____no_output_____" ] ], [ [ "## Chapter 7", "_____no_output_____" ] ], [ [ "def fftformat(F):\n for f in F:\n print(\"%8.4f %+.4fi\" % (f.real, f.imag))", "_____no_output_____" ], [ "def fftshow(f, type=\"log\"):\n if type == \"log\":\n return rescale_intensity(np.log(1 + abs(f)), out_range=(0, 1))\n elif type == \"abs\":\n return rescale_intensity(abs(f), out_range=(0, 1))", "_____no_output_____" ], [ "def circle_mask(img, type, lh, D=15, n=2, sigma=10):\n r, c = img.shape\n arr = arange(-r / 2, r / 2)\n arc = arange(-c / 2, c / 2)\n x, y = np.meshgrid(arr, arc)\n \n if type == \"ideal\": \n if lh == \"low\":\n return x**2 + y**2 < D**2\n elif lh == \"high\":\n return x**2 + y**2 > D**2\n elif type == \"butterworth\":\n if lh == \"low\":\n return 1 / (1 + (np.sqrt(2) - 1) * ((x**2 + y**2) / D**2)**n)\n elif lh == \"high\":\n return 1 / (1 + (D**2 / (x**2 + y**2))**n)\n elif type == \"gaussian\":\n g = np.exp(-(x**2 + y**2) / sigma**2)\n if lh == \"low\":\n return g / g.max()\n elif lh == \"high\":\n return 1 - g / g.max()", "_____no_output_____" ], [ "def fft_filter(img, type, lh, D=15, n=2, sigma=10):\n f = fftshift(fft2(img))\n c = circle_mask(img, type, lh, D, n, sigma)\n fc = f * c\n return fftshow(f), c, fftshow(fc), fftshow(ifft2(fc), \"abs\")", "_____no_output_____" ] ], [ [ "## Chapter 8", "_____no_output_____" ] ], [ [ "def periodic_noise(img, s=None): \n if \"numpy\" not in str(type(s)):\n r, c = img.shape\n x, y = np.mgrid[0:r, 0:c].astype(float64)\n s = np.sin(x / 3 + y / 3) + 1\n return (2 * img_as_float(img) + s / 2) / 3", "_____no_output_____" ], [ "def outlier_filter(img, D=0.5):\n av = array([[1, 1, 1], \n [1, 0, 1], \n [1, 1, 1]]) / 8\n img_av = convolve(img, av)\n r = abs(img - img_av) > D\n return r * img_av + (1 - r) * img", "_____no_output_____" ], [ "def image_average(img, n):\n x, y = img.shape\n t = zeros((x, y, n))\n for i in range(n):\n t[:, :, i] = random_noise(img, \"gaussian\")\n return np.mean(t, 2)", "_____no_output_____" ], [ "def pseudo_median(x):\n MAXMIN = 0\n MINMAX = 255\n for i in range(len(x) - 2):\n MAXMIN = max(MAXMIN, min(x[i:i+3]))\n MINMAX = min(MINMAX, max(x[i:i+3]))\n return 0.5 * (MAXMIN + MINMAX)", "_____no_output_____" ], [ "def periodic_filter(img, type=\"band\", k=1):\n r, c = img.shape\n x_mid, y_mid = r // 2, c // 2\n \n f = fftshift(fft2(img))\n f2 = img_as_ubyte(fftshow(f, \"abs\"))\n f2[x_mid, y_mid] = 0\n x, y = np.where(f2 == f2.max())\n d = np.sqrt((x[0] - x_mid)**2 + (y[0] - y_mid)**2)\n \n if type == \"band\":\n x, y = np.meshgrid(arange(0, r), arange(0, c))\n z = np.sqrt((x - x_mid)**2 + (y - y_mid)**2)\n br = (z < np.floor(d - k)) | (z > np.ceil(d + k))\n fc = f * br\n elif type == \"criss\":\n fc = np.copy(f)\n fc[x, :] = 0\n fc[:, y] = 0 \n \n fci = ifft2(fc)\n return fftshow(f), fftshow(fc), fftshow(fci, \"abs\") ", "_____no_output_____" ], [ "def fft_inverse(img, c, type=\"low\", D2=15, n2=2, d=0.01):\n f = fftshift(fft2(img_as_ubyte(img)))\n if type == \"low\":\n c2 = circle_mask(img, \"butterworth\", \"low\", D2, n2, 10)\n fb = f / c * c2\n elif type == \"con\":\n c2 = np.copy(c)\n c2[np.where(c2 < d)] = 1\n fb = f / c2\n return c2, fftshow(ifft2(fb), \"abs\")", "_____no_output_____" ], [ "def deblur(img, m, type=\"con\",d=0.02):\n m2 
= zeros_like(img, dtype=float64)\n r, c = m.shape\n m2[0:r, 0:c] = m\n mf = fft2(m2)\n \n if type == \"div\":\n bmi = ifft2(fft2(img) / mf)\n bmu = fftshow(bmi, \"abs\")\n elif type == \"con\":\n mf[np.where(abs(mf) < d)] = 1\n bmi = abs(ifft2(fft2(img) / mf))\n bmu = img_as_ubyte(bmi / bmi.max())\n bmu = rescale_intensity(bmu, in_range=(0, 128))\n return bmu", "_____no_output_____" ] ], [ [ "## Chapter 9", "_____no_output_____" ] ], [ [ "def threshold_adaptive(img, cut):\n r, c = img.shape\n w = c // cut\n starts = range(0, c - 1, w)\n ends = range(w, c + 1, w)\n z = zeros((r, c))\n for i in range(cut):\n tmp = img[:, starts[i]:ends[i]]\n z[:, starts[i]:ends[i]] = tmp > threshold_otsu(tmp)\n return z", "_____no_output_____" ], [ "def zerocross(img):\n r, c = img.shape\n z = np.zeros_like(img)\n for i in range(1, r - 1):\n for j in range(1, c - 1):\n if (img[i][j] < 0 and (img[i - 1][j] > 0 or img[i + 1][j] > 0 or img[i][j - 1] > 0 or img[i][j + 1] > 0)) or \\\n (img[i][j] == 0 and (img[i - 1][j] * img[i + 1][j] < 0 or img[i][j - 1] * img[i][j + 1] < 0)):\n z[i][j] = 1\n return z", "_____no_output_____" ], [ "def laplace_zerocross(img):\n return zerocross(ndi.laplace(float64(img), mode=\"constant\"))", "_____no_output_____" ], [ "def marr_hildreth(img, sigma=0.5):\n return zerocross(ndi.gaussian_laplace(float64(img), sigma=sigma))", "_____no_output_____" ] ], [ [ "## Chapter 10", "_____no_output_____" ] ], [ [ "sq = square(3)\ncr = array([[0, 1, 0],\n [1, 1, 1],\n [0, 1, 0]])\nsq\ncr", "_____no_output_____" ], [ "def internal_boundary(a, b):\n '''\n A - (A erosion B)\n '''\n # \"* 1\" casts the boolean mask to int so the arrays can be subtracted.\n return a * 1 - binary_erosion(a, b)", "_____no_output_____" ], [ "def external_boundary(a, b):\n '''\n (A dilation B) - A\n '''\n return binary_dilation(a, b) * 1 - a", "_____no_output_____" ], [ "def morphological_gradient(a, b):\n '''\n (A dilation B) - (A erosion B)\n '''\n return binary_dilation(a, b) * 1 - binary_erosion(a, b)", "_____no_output_____" ], [ "def hit_or_miss(t, b1):\n '''\n (A erosion B1) and (not A erosion B2)\n '''\n r, c = b1.shape\n b2 = ones((r + 2, c + 2))\n b2[1:r+1, 1:c+1] = 1 - b1\n t = img_as_bool(t)\n tb1 = binary_erosion(t, b1)\n tb2 = binary_erosion(1 - t, b2)\n x, y = np.where((tb1 & tb2) == 1)\n tb3 = np.zeros_like(tb1)\n tb3[x, y] = 1\n return x, y, tb1, tb2, tb3", "_____no_output_____" ], [ "def bwskel(img, kernel=sq):\n skel = zeros_like(img, dtype=bool)\n e = (np.copy(img) > 0) * 1\n while e.max() > 0:\n o = binary_opening(e, kernel) * 1\n skel = skel | (e & (1 - o))\n e = binary_erosion(e, kernel) * 1\n return skel", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ] ]
d02914a2b75b59fde6aac11d2913d767cc6f597f
28,705
ipynb
Jupyter Notebook
Dataset Analysis.ipynb
nicolascarva/Nico-Carvajal-Winter-2022-Data-Science-Intern-Challenge
58f3738a9821b51ef5a1a595633b5e46b0475ba4
[ "MIT" ]
null
null
null
Dataset Analysis.ipynb
nicolascarva/Nico-Carvajal-Winter-2022-Data-Science-Intern-Challenge
58f3738a9821b51ef5a1a595633b5e46b0475ba4
[ "MIT" ]
null
null
null
Dataset Analysis.ipynb
nicolascarva/Nico-Carvajal-Winter-2022-Data-Science-Intern-Challenge
58f3738a9821b51ef5a1a595633b5e46b0475ba4
[ "MIT" ]
null
null
null
30.96548
129
0.363595
[ [ [ "import pandas as pd", "_____no_output_____" ], [ "df = pd.read_csv('dataset.csv', index_col='order_id', parse_dates=True)", "_____no_output_____" ], [ "df.head(10)", "_____no_output_____" ], [ "df.describe(include='all')\n# Here we can see that the $3145.13 is the mean of all order amounts. However the data is heavily skewed by a few orders\n# of over 100 sneakers (probably by distributors). There are a few options that can be taken depending on the situation:\n# one of them is using the median, another is filtering out the outliers (ie. orders with over 100 sneakers) and \n# recalculating the mean", "_____no_output_____" ], [ "# Median AOV is $284 dollars, this is a much more reasonable amount:\n\ndf['order_amount'].median()", "_____no_output_____" ], [ "# Filtering out suspected distributor orders we can find the new average is $754. This is a much more reasonable number\n# but still seems a little high. (We'll call the dataframe with the distributors filtered out df_no_dist)\ndf_no_dist = df[df['total_items']<=100]\ndf_no_dist.describe()", "_____no_output_____" ], [ "# Looking at the maximum, we find there's an order for only 8 sneakers valued at $154,350. This is either a novelty\n# item or a mistake. One option would be to filter out this order and recalculate the new mean.\n\ndf_no_dist.max()", "_____no_output_____" ], [ "# But first we'll find the information for that transaction\n\ndf_no_dist.loc[df['order_amount'] == 154350]", "_____no_output_____" ], [ "# Then we can see the statistics for that specific store to see if it their item is priced this high in fact or if it\n# was a mistake. We find that it is not a mistake, the store must be selling a novelty sneakers.\n# Their sneaker cost is $25,725.\n\ndf_no_dist.loc[df['shop_id'] == 78].describe()", "_____no_output_____" ], [ "# Depending on the purpose of our analysis, we may filter out this store to get a more realistic AOV\n# across our stores. We will filter this out and call the new dataframe df_no_dist_or_nov\n\ndf_no_dist_or_nov = df_no_dist.loc[df['shop_id'] != 78]\ndf_no_dist_or_nov.describe()", "_____no_output_____" ], [ "# The AOV with these transactions filtered out is now $302, which, unsurprisingly is close to the median calculated\n# above ($284)", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0291d1f2e0abff28b8ebe2da656f3ec7b60d570
420,384
ipynb
Jupyter Notebook
style_transfer.ipynb
sbrml/style-transfer
b6fe8864f85d8d96a3f02283599a39ecd8dc1a66
[ "MIT" ]
null
null
null
style_transfer.ipynb
sbrml/style-transfer
b6fe8864f85d8d96a3f02283599a39ecd8dc1a66
[ "MIT" ]
7
2021-06-08T22:41:57.000Z
2022-03-12T00:51:20.000Z
style_transfer.ipynb
sbrml/style-transfer
b6fe8864f85d8d96a3f02283599a39ecd8dc1a66
[ "MIT" ]
null
null
null
1,387.405941
410,944
0.958367
[ [ [ "import numpy as np\nimport matplotlib.pyplot as plt\nimport matplotlib.image as mpimg\n\nimport tensorflow as tf\n\nimport os\nfrom imageio import imwrite\n\nfrom tqdm import tqdm", "_____no_output_____" ], [ "!ls imgs", "Odo_bayeux_tapestry.png josh_is_bae.jpg\r\ngirl_over_soul.png josh_is_molly.png\r\ngirl_with_pearl_earring.jpg kandinsky.jpg\r\ngmunden_middleevil_nuns.png kandinsky_johns.png\r\ngmunden_nuns.jpeg middleevil.jpg\r\ngmunden_pieta.jpeg over_soul.jpg\r\ngmunden_pieta2.jpeg starry_johns.png\r\ngmunden_starry_nuns.png starry_night.png\r\ngrundlsee_jesus.jpeg the_scream.jpg\r\njohns.jpg\r\n" ], [ "# no need to resize yet\nraw_img = tf.io.read_file('imgs/grundlsee_jesus.jpeg')\ncontent_img = tf.image.decode_image(raw_img)[None, ...]\ncontent_img = tf.cast(content_img, tf.float32) / 255.\ngen_img = content_img[:]\ncontent_img = tf.image.resize(content_img, size=tuple([v // 2 for v in content_img.shape[1:3]]))\ncontent_resized_shape = content_img.shape[1:3]\n\nprint(content_resized_shape)\n\nraw_img = tf.io.read_file('imgs/middleevil.jpg')\nstyle_img = tf.image.decode_image(raw_img)[None, :, :, :]\nstyle_img = tf.image.resize(style_img, size=content_img.shape[1:3])#[..., :-1]\nstyle_img = tf.cast(style_img, tf.float32) / 255.\n\nplt.figure(figsize=(17, 6))\nplt.subplot(121)\nplt.imshow(content_img[0])\nplt.axis('off')\nplt.subplot(122)\nplt.imshow(style_img[0])\nplt.axis('off')\nplt.show()", "(266, 200)\n" ], [ "def content_loss(g_acts, c_acts, weights):\n \n return 0.5 * sum([w * tf.reduce_sum(tf.math.squared_difference(g_act, c_act)) \n for w, g_act, c_act in zip(weights, g_acts, c_acts)])\n\ndef style_loss(g_acts, s_acts, weights):\n \n loss = 0\n \n for w, g_act, s_act in zip(weights, g_acts, s_acts):\n \n g_gram = tf.einsum('bijk, bijl -> kl', g_act, g_act)\n s_gram = tf.einsum('bijk, bijl -> kl', s_act, s_act)\n \n NM = tf.cast(tf.reduce_prod(g_act.shape), tf.float32)\n \n loss += w * 0.25 / NM**2 * tf.reduce_sum(tf.math.squared_difference(g_gram, s_gram))\n \n return loss\n\ndef total_variation_loss(gen_img):\n \n x_deltas = gen_img[:, 1:, :, :] - gen_img[:, :-1, :, :]\n y_deltas = gen_img[:, :, 1:, :] - gen_img[:, :, :-1, :]\n \n return tf.reduce_mean(x_deltas**2) + tf.reduce_mean(y_deltas**2)", "_____no_output_____" ], [ "# Names of layers and weight values for content and style losses\ncontent_layers = ['block4_conv2']\ncontent_weights = [1] * len(content_layers)\n\nstyle_layers = ['block1_conv1', 'block2_conv1', 'block3_conv1', 'block4_conv1', 'block5_conv1']\nstyle_weights = [1] * len(style_layers)\n\n# Loading in VGG-19\nvgg19 = tf.keras.applications.vgg19.VGG19(include_top=False,\n weights='imagenet',\n input_tensor=None,\n input_shape=content_img.shape[1:], \n pooling='avg', \n classes=1000)\nvgg19.trainable = False\n\n# Define new model to get activations\noutputs = [vgg19.get_layer(layer_name).output for layer_name in (content_layers + style_layers)]\nvgg19_activations = tf.keras.Model([vgg19.input], outputs)", "_____no_output_____" ], [ "gen_image_name = \"grundlsee_middleevil_jesus.png\"\n\nnum_epochs = 11\nlog_freq = 10\nlearn_rate = 2e-2\ncontent_weight = 1\nstyle_weight = 1e3\nreg_weight = 1e8\noptimizer = tf.optimizers.Adam(learning_rate=learn_rate, beta_1=0.99, epsilon=1e-1)\n\n# gen_img = tf.random.uniform(minval=0., maxval=1., shape=content_img.shape)\ngen_img = tf.Variable(gen_img)\n\n# Precompute content and style activatiosn\nc_input = tf.keras.applications.vgg19.preprocess_input(content_img * 255.)\ns_input = 
tf.keras.applications.vgg19.preprocess_input(style_img * 255.)\n\nc_acts = vgg19_activations(c_input)[:len(content_layers)]\ns_acts = vgg19_activations(s_input)[len(content_layers):]\n\nfor epoch in tqdm(range(num_epochs)):\n with tf.GradientTape() as tape:\n \n g_input = tf.image.resize(gen_img, size=content_resized_shape)\n g_input = tf.keras.applications.vgg19.preprocess_input(g_input * 255.)\n g_acts = vgg19_activations(g_input)\n\n c_loss = content_loss(g_acts[:len(content_layers)], c_acts, content_weights)\n\n s_loss = style_loss(g_acts[len(content_layers):], s_acts, style_weights)\n\n loss = content_weight * c_loss + style_weight * s_loss \n loss = loss + reg_weight * total_variation_loss(gen_img)\n\n grads = tape.gradient(loss, gen_img)\n optimizer.apply_gradients([(grads, gen_img)])\n \n # Bring image back to a valid range\n gen_img.assign(tf.clip_by_value(gen_img, clip_value_min=0., clip_value_max=1.))\n\n if epoch % log_freq == 0:\n plt.figure(figsize=(17, 6))\n plt.subplot(131)\n plt.imshow(tf.squeeze(content_img).numpy())\n plt.axis('off')\n plt.subplot(132)\n plt.imshow(tf.squeeze(style_img).numpy())\n plt.axis('off')\n plt.subplot(133)\n plt.imshow(tf.squeeze(gen_img).numpy())\n plt.axis('off')\n plt.show()\n\nresult = tf.squeeze(gen_img).numpy()\n\n# Save resulting image\n# fig = plt.figure()\n# ax = plt.Axes(fig, [0., 0., 1., 1.])\n# ax.set_axis_off()\n# fig.add_axes(ax)\n\n# ax.imshow(result)\n# plt.savefig('girl_over_soul.png', dpi=800) \n# plt.close()\nif not os.path.exists(\"imgs/\" + gen_image_name):\n result = (result * 255).astype(np.uint8)\n imwrite(\"imgs/\" + gen_image_name, result)\nelse:\n print(gen_image_name + \" already exists!\")", "\r 0%| | 0/11 [00:00<?, ?it/s]" ], [ "result = tf.squeeze(gen_img).numpy()\n\n# Save resulting image\n# fig = plt.figure()\n# ax = plt.Axes(fig, [0., 0., 1., 1.])\n# ax.set_axis_off()\n# fig.add_axes(ax)\n\n# ax.imshow(result)\n# plt.savefig('girl_over_soul.png', dpi=800) \n# plt.close()\nif not os.path.exists(\"imgs/\" + gen_image_name):\n result = (result * 255).astype(np.uint8)\n imwrite(\"imgs/\" + gen_image_name, result)\nelse:\n print(gen_image_name + \" already exists!\")", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code" ] ]
d0292ee47523490972937f1dde9999a6e6b09378
4,480
ipynb
Jupyter Notebook
notebooks/Complete-Python-Bootcamp-master/Methods.ipynb
sheldon-cheah/cppkernel
212c81f34c2f144d605fc0be4a90327989ab7625
[ "BSD-3-Clause" ]
6
2017-09-28T12:38:00.000Z
2020-07-15T04:41:07.000Z
notebooks/Complete-Python-Bootcamp-master/Methods.ipynb
sheldon-cheah/cppkernel
212c81f34c2f144d605fc0be4a90327989ab7625
[ "BSD-3-Clause" ]
5
2016-08-25T06:06:12.000Z
2016-11-26T18:57:20.000Z
notebooks/Complete-Python-Bootcamp-master/Methods.ipynb
sheldon-cheah/cppkernel
212c81f34c2f144d605fc0be4a90327989ab7625
[ "BSD-3-Clause" ]
1
2019-11-05T05:29:25.000Z
2019-11-05T05:29:25.000Z
23.703704
307
0.560268
[ [ [ "#Methods\n\nWe've already seen a few example of methods when learning about Object and Data Structure Types in Python. Methods are essentially functions built into objects. Later on in the course we will learn about how to create our own objects and methods using Object Oriented Programming (OOP) and classes.\n\nMethods will perform specific actions on the object and can also take arguments, just like a function. This lecture will serve as just a bried introduction to methods and get you thinking about overall design methods that we will touch back upon when we reach OOP in the course.\n\nMethods are in the form:\n\n object.method(arg1,arg2,etc...)\n \nYou'll later see that we can think of methods as having an argument 'self' referring to the object itself. You can't see this argument but we will be using it later on in the course during the OOP lectures.\n\nLets take a quick look at what an example of the various methods a list has:", "_____no_output_____" ] ], [ [ "# Create a simple list\nl = [1,2,3,4,5]", "_____no_output_____" ] ], [ [ "Fortunately, with iPython and the Jupyter Notebook we can quickly see all the possible methods using the tab key. The methods for a list are:\n\n* append\n* count\n* extend\n* insert\n* pop\n* remove\n* reverse\n* sort\n\nLet's try out a few of them:", "_____no_output_____" ], [ "append() allows us to add elements to the end of a list:", "_____no_output_____" ] ], [ [ "l.append(6)", "_____no_output_____" ], [ "l", "_____no_output_____" ] ], [ [ "Great! Now how about count()? The count() method will count the number of occurences of an element in a list.", "_____no_output_____" ] ], [ [ "# Check how many times 2 shows up in the list\nl.count(2)", "_____no_output_____" ] ], [ [ "You can always use Shift+Tab in the Jupyter Notebook to get more help about the method. In general Python you can use the help() function: ", "_____no_output_____" ] ], [ [ "help(l.count)", "Help on built-in function count:\n\ncount(...)\n L.count(value) -> integer -- return number of occurrences of value\n\n" ] ], [ [ "Feel free to play around with the rest of the methods for a list. Later on in this section your quiz will involve using help and google searching for methods of different types of objects!", "_____no_output_____" ], [ "Great! By this lecture you should feel comfortable calling methods of objects in Python!", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ] ]
d0295719c535d4e3d2a750dc7c0f9f93255adcfb
4,502
ipynb
Jupyter Notebook
tadsilweny_airplane_finder.ipynb
CHesseling/tadsilweny
d49acddc8358ac21171336c91e4ae9770b73cc19
[ "MIT" ]
null
null
null
tadsilweny_airplane_finder.ipynb
CHesseling/tadsilweny
d49acddc8358ac21171336c91e4ae9770b73cc19
[ "MIT" ]
null
null
null
tadsilweny_airplane_finder.ipynb
CHesseling/tadsilweny
d49acddc8358ac21171336c91e4ae9770b73cc19
[ "MIT" ]
1
2019-03-12T00:48:14.000Z
2019-03-12T00:48:14.000Z
22.17734
150
0.528654
[ [ [ "# You need to install the OpenSkyAPI library\n# Get it here: https://github.com/openskynetwork/opensky-api", "_____no_output_____" ], [ "# Install it\n#!pip install -e \\opensky-api-master\\python", "_____no_output_____" ], [ "from opensky_api import OpenSkyApi\nimport geocoder\nimport pandas as pd\nimport json\nfrom pandas.io.json import json_normalize\nimport requests", "_____no_output_____" ], [ "# Get your own Lat/Long position\ng = geocoder.ip('me')\n#print(g.latlng)", "_____no_output_____" ], [ "# Add a bounding box of 1 degree (I know that this is the lazy solution)\n\nlat1 = g.latlng[0]-0.5\nlat2 = g.latlng[0]+0.5\n# print (lat1, lat2)", "_____no_output_____" ], [ "lng1 = g.latlng[1]-0.5\nlng2 = g.latlng[1]+0.5\n# print (lng1, lng2)", "_____no_output_____" ], [ "# API request\napi = OpenSkyApi()\nstates = api.get_states(bbox=(lat1,lat2,lng1,lng2))\n", "_____no_output_____" ], [ "# Delete your pre-existing dataframe\ntry:\n del df\nexcept: Exception\n \n \n\n# Create your new Dataframe\ndf = pd.DataFrame(columns=['longitude', 'latitude', 'velocity', 'callsign', 'origin_country', 'on_ground', 'squawk', 'vertical_rate', 'icao24'])", "_____no_output_____" ], [ "# Fill your dataframe - again, lazy solution, 'cause the read_json didn't work\n\nfor s in states.states:\n df.loc[s] = [ s.longitude, s.latitude, s.velocity, s.callsign, s.origin_country, s.on_ground, s.squawk, s.vertical_rate, s.icao24 ]\ndf.reset_index(inplace=True)", "_____no_output_____" ], [ "df", "_____no_output_____" ], [ "url = \"https://ae.roplan.es/api/hex-type.php?hex=\"\nheaders = {\n 'User-Agent': 'Mozilla/5.0 (iPad; U; CPU OS 3_2_1 like Mac OS X; en-us'\n}", "_____no_output_____" ], [ "# Function to get the aircraft type from ae.roplan.es\n# I know, you wouldn't need it here but I used a different website with BS4 etc\ndef get_type(callsign):\n r = requests.get(url+callsign, headers=headers)\n return (r.text)", "_____no_output_____" ], [ "# Test it\n# get_type(\"49d027\")", "_____no_output_____" ], [ "# Add aircraft type to dataframe\nfor index,row in df.iterrows():\n df.loc[index,'type'] = get_type(row['icao24'])", "_____no_output_____" ], [ "# Filter dataframe by aircraft type\ndf[df['type'].astype(str).str.contains('Boeing 737')]", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0295acaf88102e4362944791723fd452b4bcb0e
1,964
ipynb
Jupyter Notebook
Chapter 1 - Machine Learning Toolkit/Exercise 3 - Order of Execution.ipynb
doc-E-brown/Applied-Supervised-Learning-with-Python
f125cecde1af4f77017302c3393acf9c2415ce9a
[ "MIT" ]
2
2021-06-08T18:00:07.000Z
2021-10-08T06:31:38.000Z
Chapter 1 - Machine Learning Toolkit/Exercise 3 - Order of Execution.ipynb
TrainingByPackt/Applied-Supervised-Learning-with-Python
f125cecde1af4f77017302c3393acf9c2415ce9a
[ "MIT" ]
null
null
null
Chapter 1 - Machine Learning Toolkit/Exercise 3 - Order of Execution.ipynb
TrainingByPackt/Applied-Supervised-Learning-with-Python
f125cecde1af4f77017302c3393acf9c2415ce9a
[ "MIT" ]
16
2019-06-04T22:22:17.000Z
2022-01-02T06:43:44.000Z
23.95122
297
0.523422
[ [ [ "# Exercise 3: Order of Execution\n\nThis Jupyter notebook has been written to partner with Lesson 1 - Machine Learning Toolkit", "_____no_output_____" ] ], [ [ "print('Hello World!!')", "Hello World!!\n" ], [ "print(hello_world)", "_____no_output_____" ], [ "hello_world = 'Hello World!!!!!!!!!!!!'", "_____no_output_____" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code" ] ]
d029788ebf15bd92da1e891d543437553ce22c78
17,357
ipynb
Jupyter Notebook
task_7_SVMs.ipynb
knutzk/handson-ml
3b80038e85e6ea0ac1dc1c4e068f563adca1d760
[ "Apache-2.0" ]
1
2020-05-01T11:20:53.000Z
2020-05-01T11:20:53.000Z
task_7_SVMs.ipynb
knutzk/handson-ml
3b80038e85e6ea0ac1dc1c4e068f563adca1d760
[ "Apache-2.0" ]
null
null
null
task_7_SVMs.ipynb
knutzk/handson-ml
3b80038e85e6ea0ac1dc1c4e068f563adca1d760
[ "Apache-2.0" ]
null
null
null
39.447727
523
0.586046
[ [ [ "# Task 4: Support Vector Machines\n\n_All credit for the code examples of this notebook goes to the book \"Hands-On Machine Learning with Scikit-Learn & TensorFlow\" by A. Geron. Modifications were made and text was added by K. Zoch in preparation for the hands-on sessions._", "_____no_output_____" ], [ "# Setup", "_____no_output_____" ], [ "First, import a few common modules, ensure MatplotLib plots figures inline and prepare a function to save the figures:", "_____no_output_____" ] ], [ [ "# Common imports\nimport numpy as np\nimport os\n\n# to make this notebook's output stable across runs\nnp.random.seed(42)\n\n# To plot pretty figures\n%matplotlib inline\nimport matplotlib as mpl\nimport matplotlib.pyplot as plt\nmpl.rc('axes', labelsize=14)\nmpl.rc('xtick', labelsize=12)\nmpl.rc('ytick', labelsize=12)\n\n# Function to save a figure. This also decides that all output files \n# should stored in the subdirectorz 'classification'.\nPROJECT_ROOT_DIR = \".\"\nEXERCISE = \"SVMs\"\n\ndef save_fig(fig_id, tight_layout=True):\n path = os.path.join(PROJECT_ROOT_DIR, \"output\", EXERCISE, fig_id + \".png\")\n print(\"Saving figure\", fig_id)\n if tight_layout:\n plt.tight_layout()\n plt.savefig(path, format='png', dpi=300)", "_____no_output_____" ] ], [ [ "# Large margin *vs* margin violations", "_____no_output_____" ], [ "This code example contains two linear support vector machine classifiers ([LinearSVC](https://scikit-learn.org/stable/modules/generated/sklearn.svm.LinearSVC.html), which are initialised with different hyperparameter C. The used dataset is the iris dataset also shown in the lecture (iris verginica vcs. iris versicolor). Try a few different values for C and compare the results! What effect do different values of C have on: (1) the width of the street, (2) the number of outliers, (3) the number of support vectors?", "_____no_output_____" ] ], [ [ "import numpy as np\nfrom sklearn import datasets\nfrom sklearn.pipeline import Pipeline\nfrom sklearn.preprocessing import StandardScaler\nfrom sklearn.svm import LinearSVC\n\n# Load the dataset and store the necessary features/labels in X/y.\niris = datasets.load_iris()\nX = iris[\"data\"][:, (2, 3)] # petal length, petal width\ny = (iris[\"target\"] == 2).astype(np.float64) # Iris-Virginica\n\n# Initialise a scaler and the two SVC instances.\nscaler = StandardScaler()\nsvm_clf1 = LinearSVC(C=1, loss=\"hinge\", max_iter=10000, random_state=42)\nsvm_clf2 = LinearSVC(C=100, loss=\"hinge\", max_iter=10000, random_state=42)\n\n# Create pipelines to automatically scale the input.\nscaled_svm_clf1 = Pipeline([\n (\"scaler\", scaler),\n (\"linear_svc\", svm_clf1),\n ])\nscaled_svm_clf2 = Pipeline([\n (\"scaler\", scaler),\n (\"linear_svc\", svm_clf2),\n ])\n\n# Perform the actual fit of the two models.\nscaled_svm_clf1.fit(X, y)\nscaled_svm_clf2.fit(X, y)\n\n# Convert to unscaled parameters\nb1 = svm_clf1.decision_function([-scaler.mean_ / scaler.scale_])\nb2 = svm_clf2.decision_function([-scaler.mean_ / scaler.scale_])\nw1 = svm_clf1.coef_[0] / scaler.scale_\nw2 = svm_clf2.coef_[0] / scaler.scale_\nsvm_clf1.intercept_ = np.array([b1])\nsvm_clf2.intercept_ = np.array([b2])\nsvm_clf1.coef_ = np.array([w1])\nsvm_clf2.coef_ = np.array([w2])\n\n# Find support vectors (LinearSVC does not do this automatically)\nt = y * 2 - 1\nsupport_vectors_idx1 = (t * (X.dot(w1) + b1) < 1).ravel()\nsupport_vectors_idx2 = (t * (X.dot(w2) + b2) < 1).ravel()\nsvm_clf1.support_vectors_ = X[support_vectors_idx1]\nsvm_clf2.support_vectors_ = 
X[support_vectors_idx2]\n\n# Now do the plotting.\ndef plot_svc_decision_boundary(svm_clf, xmin, xmax):\n w = svm_clf.coef_[0]\n b = svm_clf.intercept_[0]\n\n # At the decision boundary, w0*x0 + w1*x1 + b = 0\n # => x1 = -w0/w1 * x0 - b/w1\n x0 = np.linspace(xmin, xmax, 200)\n decision_boundary = -w[0]/w[1] * x0 - b/w[1]\n\n margin = 1/w[1]\n gutter_up = decision_boundary + margin\n gutter_down = decision_boundary - margin\n\n svs = svm_clf.support_vectors_\n plt.scatter(svs[:, 0], svs[:, 1], s=180, facecolors='#FFAAAA')\n plt.plot(x0, decision_boundary, \"k-\", linewidth=2)\n plt.plot(x0, gutter_up, \"k--\", linewidth=2)\n plt.plot(x0, gutter_down, \"k--\", linewidth=2)\n \nplt.figure(figsize=(12,3.2))\nplt.subplot(121)\nplt.plot(X[:, 0][y==1], X[:, 1][y==1], \"g^\", label=\"Iris-Virginica\")\nplt.plot(X[:, 0][y==0], X[:, 1][y==0], \"bs\", label=\"Iris-Versicolor\")\nplot_svc_decision_boundary(svm_clf1, 4, 6)\nplt.xlabel(\"Petal length\", fontsize=14)\nplt.ylabel(\"Petal width\", fontsize=14)\nplt.legend(loc=\"upper left\", fontsize=14)\nplt.title(\"$C = {}$\".format(svm_clf1.C), fontsize=16)\nplt.axis([4, 6, 0.8, 2.8])\nplt.subplot(122)\nplt.plot(X[:, 0][y==1], X[:, 1][y==1], \"g^\")\nplt.plot(X[:, 0][y==0], X[:, 1][y==0], \"bs\")\nplot_svc_decision_boundary(svm_clf2, 4, 6)\nplt.xlabel(\"Petal length\", fontsize=14)\nplt.title(\"$C = {}$\".format(svm_clf2.C), fontsize=16)\nplt.axis([4, 6, 0.8, 2.8])\n\nsave_fig(\"regularization_plot\")", "_____no_output_____" ] ], [ [ "# Polynomial features vs. polynomial kernels\n\nLet's create a non-linear dataset, for which we can compare two approaches: (1) adding polynomial features to the model, (2) using a polynomial kernel (see exercise sheet). First, create some random data.", "_____no_output_____" ] ], [ [ "from sklearn.datasets import make_moons\nX, y = make_moons(n_samples=100, noise=0.15, random_state=42)\n\ndef plot_dataset(X, y, axes):\n plt.plot(X[:, 0][y==0], X[:, 1][y==0], \"bs\")\n plt.plot(X[:, 0][y==1], X[:, 1][y==1], \"g^\")\n plt.axis(axes)\n plt.grid(True, which='both')\n plt.xlabel(r\"$x_1$\", fontsize=20)\n plt.ylabel(r\"$x_2$\", fontsize=20, rotation=0)\n\nplot_dataset(X, y, [-1.5, 2.5, -1, 1.5])\nplt.show()", "_____no_output_____" ] ], [ [ "Now let's first look at a linear SVM classifier that uses polynomial features. We will implement them through a pipeline including scaling of the inputs. What happens if you increase the degrees of polynomial features? Does the model get better? How is the computing time affected? 
Hint: you might have to increase the `max_iter` parameter for higher degrees.", "_____no_output_____" ] ], [ [ "from sklearn.pipeline import Pipeline\nfrom sklearn.preprocessing import PolynomialFeatures\n\npolynomial_svm_clf = Pipeline([\n (\"poly_features\", PolynomialFeatures(degree=3)),\n (\"scaler\", StandardScaler()),\n (\"svm_clf\", LinearSVC(C=10, loss=\"hinge\", max_iter=1000, random_state=42))\n ])\n\npolynomial_svm_clf.fit(X, y)\n\ndef plot_predictions(clf, axes):\n x0s = np.linspace(axes[0], axes[1], 100)\n x1s = np.linspace(axes[2], axes[3], 100)\n x0, x1 = np.meshgrid(x0s, x1s)\n X = np.c_[x0.ravel(), x1.ravel()]\n y_pred = clf.predict(X).reshape(x0.shape)\n y_decision = clf.decision_function(X).reshape(x0.shape)\n plt.contourf(x0, x1, y_pred, cmap=plt.cm.brg, alpha=0.2)\n plt.contourf(x0, x1, y_decision, cmap=plt.cm.brg, alpha=0.1)\n\nplot_predictions(polynomial_svm_clf, [-1.5, 2.5, -1, 1.5])\nplot_dataset(X, y, [-1.5, 2.5, -1, 1.5])\n\nsave_fig(\"moons_polynomial_svc_plot\")\nplt.show()", "_____no_output_____" ] ], [ [ "Now let's try the same without polynomial features, but a polynomial kernel instead. What is the fundamental difference between these two approaches? How do they scale in terms of computing time: (1) as a function of the number of features, (2) as a function of the number of instances?\n\n1. Try out different degrees for the polynomial kernel. Do you expect any changes in the computing time? How does the model itself change in the plot?\n2. Try different values for the `coef0` parameter. Can you guess what it controls? You should be able to see different behaviour for different degrees in the kernel.\n3. Try different values for the hyperparameter C, which controls margin violations.", "_____no_output_____" ] ], [ [ "from sklearn.svm import SVC\n\n# Let's make one pipeline with polynomial kernel degree 3.\npoly_kernel_svm_clf = Pipeline([\n (\"scaler\", StandardScaler()),\n (\"svm_clf\", SVC(kernel=\"poly\", degree=3, coef0=1, C=5))\n ])\npoly_kernel_svm_clf.fit(X, y)\n\n# And another pipeline with polynomial kernel degree 10.\npoly100_kernel_svm_clf = Pipeline([\n (\"scaler\", StandardScaler()),\n (\"svm_clf\", SVC(kernel=\"poly\", degree=10, coef0=100, C=5))\n ])\npoly100_kernel_svm_clf.fit(X, y)\n\n# Now start the plotting.\nplt.figure(figsize=(11, 4))\nplt.subplot(121)\nplot_predictions(poly_kernel_svm_clf, [-1.5, 2.5, -1, 1.5])\nplot_dataset(X, y, [-1.5, 2.5, -1, 1.5])\nplt.title(r\"$d=3, r=1, C=5$\", fontsize=18)\nplt.subplot(122)\nplot_predictions(poly100_kernel_svm_clf, [-1.5, 2.5, -1, 1.5])\nplot_dataset(X, y, [-1.5, 2.5, -1, 1.5])\nplt.title(r\"$d=10, r=100, C=5$\", fontsize=18)\n\nsave_fig(\"moons_kernelized_polynomial_svc_plot\")\nplt.show()", "_____no_output_____" ] ], [ [ "# Gaussian kernels", "_____no_output_____" ], [ "Before trying the following piece of code which implements Gaussian RBF (Radial Basis Function) kernels, remember _similarity features_ that were discussed in the lecture:\n1. What are similarity features? What is the idea of adding a \"landmark\"?\n2. If similarity features help to increase the power of the model, why should we be careful to just add a similarity feature for _each_ instance of the dataset?\n3. How does the kernel trick (once again) save the day in this case?\n4. What does the `gamma` parameter control?\n\nBelow you find a code implementation which creates a set of four plots with different values for gamma and hyperparameter C. Try different values for both. 
Which direction _increases_ regularisation of the model? In which direction would you go to avoid underfitting? In which to avoid overfitting?", "_____no_output_____" ] ], [ [ "from sklearn.svm import SVC\n\n# Set up multiple values for gamma and hyperparameter C\n# and create a list of value pairs.\ngamma1, gamma2 = 0.1, 5\nC1, C2 = 0.001, 1000\nhyperparams = (gamma1, C1), (gamma1, C2), (gamma2, C1), (gamma2, C2)\n\n# Store multiple SVM classifiers in a list with these sets of \n# hyperparameters. For all of them, use a pipeline to allow\n# scaling of the inputs.\nsvm_clfs = []\nfor gamma, C in hyperparams:\n rbf_kernel_svm_clf = Pipeline([\n (\"scaler\", StandardScaler()),\n (\"svm_clf\", SVC(kernel=\"rbf\", gamma=gamma, C=C))\n ])\n rbf_kernel_svm_clf.fit(X, y)\n svm_clfs.append(rbf_kernel_svm_clf)\n\n# Now do the plotting.\nplt.figure(figsize=(11, 7))\nfor i, svm_clf in enumerate(svm_clfs):\n plt.subplot(221 + i)\n plot_predictions(svm_clf, [-1.5, 2.5, -1, 1.5])\n plot_dataset(X, y, [-1.5, 2.5, -1, 1.5])\n gamma, C = hyperparams[i]\n plt.title(r\"$\\gamma = {}, C = {}$\".format(gamma, C), fontsize=16)\n\nsave_fig(\"moons_rbf_svc_plot\")\nplt.show()", "_____no_output_____" ] ], [ [ "# Regression\n\nThe following code implements the support vector regression class from Scikit-Learn ([SVR](https://scikit-learn.org/stable/modules/generated/sklearn.svm.SVR.html)). Here are a couple of questions (some of which require changes to the code, others are just conceptual:\n1. Quick recap: whereas the SVC class tries to make a classification decision, what is the job of this regression class? How is the output different?\n2. Try different values for the hyperparameter C. What does it control?\n3. How should the margin of a 'good' SVR model look like? Should it be broad? Should it be narrow? How does the parameter epsilon affect this?", "_____no_output_____" ] ], [ [ "# Generate some random data (degree = 2).\nnp.random.seed(42)\nm = 100\nX = 2 * np.random.rand(m, 1) - 1\ny = (0.2 + 0.1 * X + 0.5 * X**2 + np.random.randn(m, 1)/10).ravel()\n\n# Import the support vector regression class and create two \n# instances with different hyperparameters.\nfrom sklearn.svm import SVR\nsvm_poly_reg1 = SVR(kernel=\"poly\", degree=2, C=100, epsilon=0.1, gamma=\"auto\")\nsvm_poly_reg2 = SVR(kernel=\"poly\", degree=2, C=0.01, epsilon=0.1, gamma=\"auto\")\nsvm_poly_reg1.fit(X, y)\nsvm_poly_reg2.fit(X, y)\n\n# Now do the plotting.\ndef plot_svm_regression(svm_reg, X, y, axes):\n x1s = np.linspace(axes[0], axes[1], 100).reshape(100, 1)\n y_pred = svm_reg.predict(x1s)\n plt.plot(x1s, y_pred, \"k-\", linewidth=2, label=r\"$\\hat{y}$\")\n plt.plot(x1s, y_pred + svm_reg.epsilon, \"k--\")\n plt.plot(x1s, y_pred - svm_reg.epsilon, \"k--\")\n plt.scatter(X[svm_reg.support_], y[svm_reg.support_], s=180, facecolors='#FFAAAA')\n plt.plot(X, y, \"bo\")\n plt.xlabel(r\"$x_1$\", fontsize=18)\n plt.legend(loc=\"upper left\", fontsize=18)\n plt.axis(axes)\n \nplt.figure(figsize=(9, 4))\nplt.subplot(121)\nplot_svm_regression(svm_poly_reg1, X, y, [-1, 1, 0, 1])\nplt.title(r\"$degree={}, C={}, \\epsilon = {}$\".format(svm_poly_reg1.degree, svm_poly_reg1.C, svm_poly_reg1.epsilon), fontsize=18)\nplt.ylabel(r\"$y$\", fontsize=18, rotation=0)\nplt.subplot(122)\nplot_svm_regression(svm_poly_reg2, X, y, [-1, 1, 0, 1])\nplt.title(r\"$degree={}, C={}, \\epsilon = {}$\".format(svm_poly_reg2.degree, svm_poly_reg2.C, svm_poly_reg2.epsilon), fontsize=18)\nsave_fig(\"svm_with_polynomial_kernel_plot\")\nplt.show()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d0297a8e9e5e6b2bb07a7d662643db6cd1351a8b
10,412
ipynb
Jupyter Notebook
courses/machine_learning/deepdive/05_review/3_tensorflow_dnn.ipynb
Glairly/introduction_to_tensorflow
aa0a44d9c428a6eb86d1f79d73f54c0861b6358d
[ "Apache-2.0" ]
2
2022-01-06T11:52:57.000Z
2022-01-09T01:53:56.000Z
courses/machine_learning/deepdive/05_review/3_tensorflow_dnn.ipynb
Glairly/introduction_to_tensorflow
aa0a44d9c428a6eb86d1f79d73f54c0861b6358d
[ "Apache-2.0" ]
null
null
null
courses/machine_learning/deepdive/05_review/3_tensorflow_dnn.ipynb
Glairly/introduction_to_tensorflow
aa0a44d9c428a6eb86d1f79d73f54c0861b6358d
[ "Apache-2.0" ]
null
null
null
34.363036
559
0.553688
[ [ [ "# Create TensorFlow Deep Neural Network Model\n\n**Learning Objective**\n- Create a DNN model using the high-level Estimator API \n\n## Introduction\n\nWe'll begin by modeling our data using a Deep Neural Network. To achieve this we will use the high-level Estimator API in Tensorflow. Have a look at the various models available through the Estimator API in [the documentation here](https://www.tensorflow.org/api_docs/python/tf/estimator). \n\nStart by setting the environment variables related to your project.", "_____no_output_____" ] ], [ [ "PROJECT = \"cloud-training-demos\" # Replace with your PROJECT\nBUCKET = \"cloud-training-bucket\" # Replace with your BUCKET\nREGION = \"us-central1\" # Choose an available region for Cloud MLE\nTFVERSION = \"1.14\" # TF version for CMLE to use", "_____no_output_____" ], [ "import os\nos.environ[\"BUCKET\"] = BUCKET\nos.environ[\"PROJECT\"] = PROJECT\nos.environ[\"REGION\"] = REGION\nos.environ[\"TFVERSION\"] = TFVERSION", "_____no_output_____" ], [ "%%bash\nif ! gsutil ls | grep -q gs://${BUCKET}/; then\n gsutil mb -l ${REGION} gs://${BUCKET}\nfi", "_____no_output_____" ], [ "%%bash\nls *.csv", "_____no_output_____" ] ], [ [ "## Create TensorFlow model using TensorFlow's Estimator API ##\n\nWe'll begin by writing an input function to read the data and define the csv column names and label column. We'll also set the default csv column values and set the number of training steps.", "_____no_output_____" ] ], [ [ "import shutil\nimport numpy as np\nimport tensorflow as tf\nprint(tf.__version__)", "_____no_output_____" ], [ "CSV_COLUMNS = \"weight_pounds,is_male,mother_age,plurality,gestation_weeks\".split(',')\nLABEL_COLUMN = \"weight_pounds\"\n\n# Set default values for each CSV column\nDEFAULTS = [[0.0], [\"null\"], [0.0], [\"null\"], [0.0]]\nTRAIN_STEPS = 1000", "_____no_output_____" ] ], [ [ "### Create the input function\n\nNow we are ready to create an input function using the Dataset API.", "_____no_output_____" ] ], [ [ "def read_dataset(filename_pattern, mode, batch_size = 512):\n def _input_fn():\n def decode_csv(value_column):\n columns = tf.decode_csv(records = value_column, record_defaults = DEFAULTS)\n features = dict(zip(CSV_COLUMNS, columns))\n label = features.pop(LABEL_COLUMN)\n return features, label\n \n # Create list of files that match pattern\n file_list = tf.gfile.Glob(filename = filename_pattern)\n\n # Create dataset from file list\n dataset = (tf.data.TextLineDataset(filenames = file_list) # Read text file\n .map(map_func = decode_csv)) # Transform each elem by applying decode_csv fn\n\n if mode == tf.estimator.ModeKeys.TRAIN:\n num_epochs = None # indefinitely\n dataset = dataset.shuffle(buffer_size = 10 * batch_size)\n else:\n num_epochs = 1 # end-of-input after this\n\n dataset = dataset.repeat(count = num_epochs).batch(batch_size = batch_size)\n return dataset\n return _input_fn", "_____no_output_____" ] ], [ [ "### Create the feature columns\n\nNext, we define the feature columns", "_____no_output_____" ] ], [ [ "def get_categorical(name, values):\n return tf.feature_column.indicator_column(\n categorical_column = tf.feature_column.categorical_column_with_vocabulary_list(key = name, vocabulary_list = values))\n\ndef get_cols():\n # Define column types\n return [\\\n get_categorical(\"is_male\", [\"True\", \"False\", \"Unknown\"]),\n tf.feature_column.numeric_column(key = \"mother_age\"),\n get_categorical(\"plurality\",\n [\"Single(1)\", \"Twins(2)\", \"Triplets(3)\",\n \"Quadruplets(4)\", 
\"Quintuplets(5)\",\"Multiple(2+)\"]),\n tf.feature_column.numeric_column(key = \"gestation_weeks\")\n ]", "_____no_output_____" ] ], [ [ "### Create the Serving Input function \n\nTo predict with the TensorFlow model, we also need a serving input function. This will allow us to serve prediction later using the predetermined inputs. We will want all the inputs from our user.", "_____no_output_____" ] ], [ [ "def serving_input_fn():\n feature_placeholders = {\n \"is_male\": tf.placeholder(dtype = tf.string, shape = [None]),\n \"mother_age\": tf.placeholder(dtype = tf.float32, shape = [None]),\n \"plurality\": tf.placeholder(dtype = tf.string, shape = [None]),\n \"gestation_weeks\": tf.placeholder(dtype = tf.float32, shape = [None])\n }\n \n features = {\n key: tf.expand_dims(input = tensor, axis = -1)\n for key, tensor in feature_placeholders.items()\n }\n \n return tf.estimator.export.ServingInputReceiver(features = features, receiver_tensors = feature_placeholders)", "_____no_output_____" ] ], [ [ "### Create the model and run training and evaluation\n\nLastly, we'll create the estimator to train and evaluate. In the cell below, we'll set up a `DNNRegressor` estimator and the train and evaluation operations. ", "_____no_output_____" ] ], [ [ "def train_and_evaluate(output_dir):\n EVAL_INTERVAL = 300\n \n run_config = tf.estimator.RunConfig(\n save_checkpoints_secs = EVAL_INTERVAL,\n keep_checkpoint_max = 3)\n \n estimator = tf.estimator.DNNRegressor(\n model_dir = output_dir,\n feature_columns = get_cols(),\n hidden_units = [64, 32],\n config = run_config)\n \n train_spec = tf.estimator.TrainSpec(\n input_fn = read_dataset(\"train.csv\", mode = tf.estimator.ModeKeys.TRAIN),\n max_steps = TRAIN_STEPS)\n \n exporter = tf.estimator.LatestExporter(name = \"exporter\", serving_input_receiver_fn = serving_input_fn)\n \n eval_spec = tf.estimator.EvalSpec(\n input_fn = read_dataset(\"eval.csv\", mode = tf.estimator.ModeKeys.EVAL),\n steps = None,\n start_delay_secs = 60, # start evaluating after N seconds\n throttle_secs = EVAL_INTERVAL, # evaluate every N seconds\n exporters = exporter)\n \n tf.estimator.train_and_evaluate(estimator = estimator, train_spec = train_spec, eval_spec = eval_spec)", "_____no_output_____" ] ], [ [ "Finally, we train the model!", "_____no_output_____" ] ], [ [ "# Run the model\nshutil.rmtree(path = \"babyweight_trained_dnn\", ignore_errors = True) # start fresh each time\ntrain_and_evaluate(\"babyweight_trained_dnn\")", "_____no_output_____" ] ], [ [ "When I ran it, the final RMSE (the average_loss) is about **1.16**. You can explore the contents of the `exporter` directory to see the contains final model.", "_____no_output_____" ], [ "Copyright 2017-2018 Google Inc. Licensed under the Apache License, Version 2.0 (the \"License\"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ] ]
d029873db88b1ea4359504c4a5033f53b11ce879
20,682
ipynb
Jupyter Notebook
notebooks/dem_comparison.ipynb
pat-schmitt/tutorials
060d1cf83da31ae197f43f75be0cc111ed97186e
[ "BSD-3-Clause" ]
1
2021-02-23T11:56:06.000Z
2021-02-23T11:56:06.000Z
notebooks/dem_comparison.ipynb
nchampollion/tutorials
afdef4743e537249ef6ec273a53dfbacfab6db11
[ "BSD-3-Clause" ]
null
null
null
notebooks/dem_comparison.ipynb
nchampollion/tutorials
afdef4743e537249ef6ec273a53dfbacfab6db11
[ "BSD-3-Clause" ]
null
null
null
27.213158
455
0.533652
[ [ [ "# Compare different DEMs for individual glaciers", "_____no_output_____" ], [ "For most glaciers in the world there are several digital elevation models (DEM) which cover the respective glacier. In OGGM we have currently implemented 10 different open access DEMs to choose from. Some are regional and only available in certain areas (e.g. Greenland or Antarctica) and some cover almost the entire globe. For more information, visit the [rgitools documentation about DEMs](https://rgitools.readthedocs.io/en/latest/dems.html).\n\nThis notebook allows to see which of the DEMs are available for a selected glacier and how they compare to each other. That way it is easy to spot systematic differences and also invalid points in the DEMs.", "_____no_output_____" ], [ "## Input parameters ", "_____no_output_____" ], [ "This notebook can be run as a script with parameters using [papermill](https://github.com/nteract/papermill), but it is not necessary. The following cell contains the parameters you can choose from:", "_____no_output_____" ] ], [ [ "# The RGI Id of the glaciers you want to look for\n# Use the original shapefiles or the GLIMS viewer to check for the ID: https://www.glims.org/maps/glims\nrgi_id = 'RGI60-11.00897'\n\n# The default is to test for all sources available for this glacier\n# Set to a list of source names to override this\nsources = None\n# Where to write the plots. Default is in the current working directory\nplot_dir = ''\n# The RGI version to use\n# V62 is an unofficial modification of V6 with only minor, backwards compatible modifications\nprepro_rgi_version = 62\n# Size of the map around the glacier. Currently only 10 and 40 are available\nprepro_border = 10\n# Degree of processing level. Currently only 1 is available.\nfrom_prepro_level = 1", "_____no_output_____" ] ], [ [ "## Check input and set up", "_____no_output_____" ] ], [ [ "# The sources can be given as parameters\nif sources is not None and isinstance(sources, str):\n sources = sources.split(',')", "_____no_output_____" ], [ "# Plotting directory as well\nif not plot_dir:\n plot_dir = './' + rgi_id\nimport os\nplot_dir = os.path.abspath(plot_dir)", "_____no_output_____" ], [ "import pandas as pd\nimport numpy as np\nfrom oggm import cfg, utils, workflow, tasks, graphics, GlacierDirectory\nimport xarray as xr\nimport geopandas as gpd\nimport salem\nimport matplotlib.pyplot as plt\nfrom mpl_toolkits.axes_grid1 import AxesGrid\nimport itertools\n\nfrom oggm.utils import DEM_SOURCES\nfrom oggm.workflow import init_glacier_directories", "_____no_output_____" ], [ "# Make sure the plot directory exists\nutils.mkdir(plot_dir);\n# Use OGGM to download the data\ncfg.initialize()\ncfg.PATHS['working_dir'] = utils.gettempdir(dirname='OGGM-DEMS', reset=True)\ncfg.PARAMS['use_intersects'] = False", "_____no_output_____" ] ], [ [ "## Download the data using OGGM utility functions ", "_____no_output_____" ], [ "Note that you could reach the same goal by downloading the data manually from https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.4/rgitopo/ ", "_____no_output_____" ] ], [ [ "# URL of the preprocessed GDirs\ngdir_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.4/rgitopo/'\n# We use OGGM to download the data\ngdir = init_glacier_directories([rgi_id], from_prepro_level=1, prepro_border=10, \n prepro_rgi_version='62', prepro_base_url=gdir_url)[0]", "_____no_output_____" ] ], [ [ "## Read the DEMs and store them all in a dataset ", "_____no_output_____" ] ], [ [ "if sources is None:\n sources = [src 
for src in os.listdir(gdir.dir) if src in utils.DEM_SOURCES]", "_____no_output_____" ], [ "print('RGI ID:', rgi_id)\nprint('Available DEM sources:', sources)\nprint('Plotting directory:', plot_dir)", "_____no_output_____" ], [ "# We use xarray to store the data\nods = xr.Dataset()\nfor src in sources:\n demfile = os.path.join(gdir.dir, src) + '/dem.tif'\n with xr.open_rasterio(demfile) as ds:\n data = ds.sel(band=1).load() * 1.\n ods[src] = data.where(data > -100, np.NaN)\n \n sy, sx = np.gradient(ods[src], gdir.grid.dx, gdir.grid.dx)\n ods[src + '_slope'] = ('y', 'x'), np.arctan(np.sqrt(sy**2 + sx**2))\n\nwith xr.open_rasterio(gdir.get_filepath('glacier_mask')) as ds:\n ods['mask'] = ds.sel(band=1).load()", "_____no_output_____" ], [ "# Decide on the number of plots and figure size\nns = len(sources)\nx_size = 12\nn_cols = 3\nn_rows = -(-ns // n_cols)\ny_size = x_size / n_cols * n_rows", "_____no_output_____" ] ], [ [ "## Raw topography data ", "_____no_output_____" ] ], [ [ "smap = salem.graphics.Map(gdir.grid, countries=False)\nsmap.set_shapefile(gdir.read_shapefile('outlines'))\nsmap.set_plot_params(cmap='topo')\nsmap.set_lonlat_contours(add_tick_labels=False)\nsmap.set_plot_params(vmin=np.nanquantile([ods[s].min() for s in sources], 0.25),\n vmax=np.nanquantile([ods[s].max() for s in sources], 0.75))\n\nfig = plt.figure(figsize=(x_size, y_size))\ngrid = AxesGrid(fig, 111,\n nrows_ncols=(n_rows, n_cols),\n axes_pad=0.7,\n cbar_mode='each',\n cbar_location='right',\n cbar_pad=0.1\n )\n\nfor i, s in enumerate(sources):\n data = ods[s]\n smap.set_data(data)\n ax = grid[i]\n smap.visualize(ax=ax, addcbar=False, title=s)\n if np.isnan(data).all():\n grid[i].cax.remove()\n continue\n cax = grid.cbar_axes[i]\n smap.colorbarbase(cax)\n \n# take care of uneven grids\nif ax != grid[-1]:\n grid[-1].remove()\n grid[-1].cax.remove()\n\nplt.savefig(os.path.join(plot_dir, 'dem_topo_color.png'), dpi=150, bbox_inches='tight')", "_____no_output_____" ] ], [ [ "## Shaded relief ", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(x_size, y_size))\ngrid = AxesGrid(fig, 111,\n nrows_ncols=(n_rows, n_cols),\n axes_pad=0.7,\n cbar_mode='none',\n cbar_location='right',\n cbar_pad=0.1\n )\nsmap.set_plot_params(cmap='Blues')\nsmap.set_shapefile()\nfor i, s in enumerate(sources):\n data = ods[s].copy().where(np.isfinite(ods[s]), 0)\n smap.set_data(data * 0)\n ax = grid[i]\n smap.set_topography(data)\n smap.visualize(ax=ax, addcbar=False, title=s)\n \n# take care of uneven grids\nif ax != grid[-1]:\n grid[-1].remove()\n grid[-1].cax.remove()\n\nplt.savefig(os.path.join(plot_dir, 'dem_topo_shade.png'), dpi=150, bbox_inches='tight')", "_____no_output_____" ] ], [ [ "## Slope ", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(x_size, y_size))\ngrid = AxesGrid(fig, 111,\n nrows_ncols=(n_rows, n_cols),\n axes_pad=0.7,\n cbar_mode='each',\n cbar_location='right',\n cbar_pad=0.1\n )\n\nsmap.set_topography();\nsmap.set_plot_params(vmin=0, vmax=0.7, cmap='Blues')\n\nfor i, s in enumerate(sources):\n data = ods[s + '_slope']\n smap.set_data(data)\n ax = grid[i]\n smap.visualize(ax=ax, addcbar=False, title=s + ' (slope)')\n cax = grid.cbar_axes[i]\n smap.colorbarbase(cax)\n \n# take care of uneven grids\nif ax != grid[-1]:\n grid[-1].remove()\n grid[-1].cax.remove()\n\nplt.savefig(os.path.join(plot_dir, 'dem_slope.png'), dpi=150, bbox_inches='tight')", "_____no_output_____" ] ], [ [ "## Some simple statistics about the DEMs ", "_____no_output_____" ] ], [ [ "df = pd.DataFrame()\nfor s in sources:\n df[s] = 
ods[s].data.flatten()[ods.mask.data.flatten() == 1]\n\ndfs = pd.DataFrame()\nfor s in sources:\n dfs[s] = ods[s + '_slope'].data.flatten()[ods.mask.data.flatten() == 1]", "_____no_output_____" ], [ "df.describe()", "_____no_output_____" ] ], [ [ "## Comparison matrix plot ", "_____no_output_____" ] ], [ [ "# Table of differences between DEMS\ndf_diff = pd.DataFrame()\ndone = []\nfor s1, s2 in itertools.product(sources, sources):\n if s1 == s2:\n continue\n if (s2, s1) in done:\n continue\n df_diff[s1 + '-' + s2] = df[s1] - df[s2]\n done.append((s1, s2))", "_____no_output_____" ], [ "# Decide on plot levels\nmax_diff = df_diff.quantile(0.99).max()\nbase_levels = np.array([-8, -5, -3, -1.5, -1, -0.5, -0.2, -0.1, 0, 0.1, 0.2, 0.5, 1, 1.5, 3, 5, 8])\nif max_diff < 10:\n levels = base_levels\nelif max_diff < 100:\n levels = base_levels * 10\nelif max_diff < 1000:\n levels = base_levels * 100\nelse:\n levels = base_levels * 1000\nlevels = [l for l in levels if abs(l) < max_diff]\nif max_diff > 10:\n levels = [int(l) for l in levels]\nlevels", "_____no_output_____" ], [ "smap.set_plot_params(levels=levels, cmap='PuOr', extend='both')\nsmap.set_shapefile(gdir.read_shapefile('outlines'))\n\nfig = plt.figure(figsize=(14, 14))\ngrid = AxesGrid(fig, 111,\n nrows_ncols=(ns - 1, ns - 1),\n axes_pad=0.3,\n cbar_mode='single',\n cbar_location='right',\n cbar_pad=0.1\n )\ndone = []\nfor ax in grid:\n ax.set_axis_off()\nfor s1, s2 in itertools.product(sources, sources):\n if s1 == s2:\n continue\n if (s2, s1) in done:\n continue\n data = ods[s1] - ods[s2]\n ax = grid[sources.index(s1) * (ns - 1) + sources[1:].index(s2)]\n ax.set_axis_on()\n smap.set_data(data)\n smap.visualize(ax=ax, addcbar=False)\n done.append((s1, s2))\n ax.set_title(s1 + '-' + s2, fontsize=8)\n \ncax = grid.cbar_axes[0]\nsmap.colorbarbase(cax);\n\nplt.savefig(os.path.join(plot_dir, 'dem_diffs.png'), dpi=150, bbox_inches='tight')", "_____no_output_____" ] ], [ [ "## Comparison scatter plot ", "_____no_output_____" ] ], [ [ "import seaborn as sns\nsns.set(style=\"ticks\")\n\nl1, l2 = (utils.nicenumber(df.min().min(), binsize=50, lower=True), \n utils.nicenumber(df.max().max(), binsize=50, lower=False))\n\ndef plot_unity(xdata, ydata, **kwargs):\n points = np.linspace(l1, l2, 100)\n plt.gca().plot(points, points, color='k', marker=None,\n linestyle=':', linewidth=3.0)\n\ng = sns.pairplot(df.dropna(how='all', axis=1).dropna(), plot_kws=dict(s=50, edgecolor=\"C0\", linewidth=1));\ng.map_offdiag(plot_unity)\nfor asx in g.axes:\n for ax in asx:\n ax.set_xlim((l1, l2))\n ax.set_ylim((l1, l2))\n\nplt.savefig(os.path.join(plot_dir, 'dem_scatter.png'), dpi=150, bbox_inches='tight')", "_____no_output_____" ] ], [ [ "## Table statistics ", "_____no_output_____" ] ], [ [ "df.describe()", "_____no_output_____" ], [ "df.corr()", "_____no_output_____" ], [ "df_diff.describe()", "_____no_output_____" ], [ "df_diff.abs().describe()", "_____no_output_____" ] ], [ [ "## What's next?\n\n- return to the [OGGM documentation](https://docs.oggm.org)\n- back to the [table of contents](welcome.ipynb)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ] ]
d02996c71b61bb386bfe90f27b22b75e9bf1fa52
4,654
ipynb
Jupyter Notebook
code/sagemaker_rcf.ipynb
tkeech1/aws_ml
512fd8c8770cbf5128ce9e0c03655c3d3021776b
[ "MIT" ]
1
2020-02-03T00:13:26.000Z
2020-02-03T00:13:26.000Z
code/sagemaker_rcf.ipynb
tkeech1/aws_ml
512fd8c8770cbf5128ce9e0c03655c3d3021776b
[ "MIT" ]
null
null
null
code/sagemaker_rcf.ipynb
tkeech1/aws_ml
512fd8c8770cbf5128ce9e0c03655c3d3021776b
[ "MIT" ]
null
null
null
29.455696
159
0.569618
[ [ [ "Created from https://github.com/awslabs/amazon-sagemaker-examples/blob/master/introduction_to_amazon_algorithms/random_cut_forest/random_cut_forest.ipynb", "_____no_output_____" ] ], [ [ "import boto3\nimport botocore\nimport sagemaker\nimport sys\n\n\nbucket = 'tdk-awsml-sagemaker-data.io-dev' # <--- specify a bucket you have access to\nprefix = ''\nexecution_role = sagemaker.get_execution_role()\n\n\n# check if the bucket exists\ntry:\n boto3.Session().client('s3').head_bucket(Bucket=bucket)\nexcept botocore.exceptions.ParamValidationError as e:\n print('Hey! You either forgot to specify your S3 bucket'\n ' or you gave your bucket an invalid name!')\nexcept botocore.exceptions.ClientError as e:\n if e.response['Error']['Code'] == '403':\n print(\"Hey! You don't have permission to access the bucket, {}.\".format(bucket))\n elif e.response['Error']['Code'] == '404':\n print(\"Hey! Your bucket, {}, doesn't exist!\".format(bucket))\n else:\n raise\nelse:\n print('Training input/output will be stored in: s3://{}/{}'.format(bucket, prefix))", "_____no_output_____" ], [ "%%time\n\nimport pandas as pd\nimport urllib.request\n\ndata_filename = 'nyc_taxi.csv'\ndata_source = 'https://raw.githubusercontent.com/numenta/NAB/master/data/realKnownCause/nyc_taxi.csv'\n\nurllib.request.urlretrieve(data_source, data_filename)\ntaxi_data = pd.read_csv(data_filename, delimiter=',')", "_____no_output_____" ], [ "from sagemaker import RandomCutForest\n\nsession = sagemaker.Session()\n\n# specify general training job information\nrcf = RandomCutForest(role=execution_role,\n train_instance_count=1,\n train_instance_type='ml.m5.large',\n data_location='s3://{}/{}/'.format(bucket, prefix),\n output_path='s3://{}/{}/output'.format(bucket, prefix),\n num_samples_per_tree=512,\n num_trees=50)\n\n# automatically upload the training data to S3 and run the training job\n# TK - had to modify this line to use to_numpy() instead of as_matrix()\nrcf.fit(rcf.record_set(taxi_data.value.to_numpy().reshape(-1,1)))", "_____no_output_____" ], [ "rcf_inference = rcf.deploy(\n initial_instance_count=1,\n instance_type='ml.m5.large',\n)\n\nprint('Endpoint name: {}'.format(rcf_inference.endpoint))", "_____no_output_____" ], [ "from sagemaker.predictor import csv_serializer, json_deserializer\n\nrcf_inference.content_type = 'text/csv'\nrcf_inference.serializer = csv_serializer\nrcf_inference.accept = 'application/json'\nrcf_inference.deserializer = json_deserializer", "_____no_output_____" ], [ "# TK - had to modify this line to use to_numpy() instead of as_matrix()\ntaxi_data_numpy = taxi_data.value.to_numpy().reshape(-1,1)\nprint(taxi_data_numpy[:6])\nresults = rcf_inference.predict(taxi_data_numpy[:6])", "_____no_output_____" ], [ "sagemaker.Session().delete_endpoint(rcf_inference.endpoint)", "_____no_output_____" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ] ]
d0299c6345b10b08764b6c77e22afb1ec6d3278c
5,390
ipynb
Jupyter Notebook
00_index.ipynb
vtquang194/Big-Earth-data
d57191f1bff9d5c2fee2e69a97a7bdb8583ac665
[ "CC-BY-4.0" ]
10
2020-08-18T08:46:30.000Z
2022-01-26T13:59:53.000Z
00_index.ipynb
vtquang194/Big-Earth-data
d57191f1bff9d5c2fee2e69a97a7bdb8583ac665
[ "CC-BY-4.0" ]
null
null
null
00_index.ipynb
vtquang194/Big-Earth-data
d57191f1bff9d5c2fee2e69a97a7bdb8583ac665
[ "CC-BY-4.0" ]
10
2020-08-17T13:17:00.000Z
2022-03-02T09:00:26.000Z
32.275449
241
0.620408
[ [ [ "<br>", "_____no_output_____" ], [ "# Analysis of Big Earth Data with Jupyter Notebooks", "_____no_output_____" ], [ "<img src='./img/opengeohub_logo.png' alt='OpenGeoHub Logo' align='right' width='25%'></img>\nLecture given for OpenGeoHub summer school 2020<br>\nTuesday, 18. August 2020 | 11:00-13:00 CEST \n\n#### Lecturer\n* [Julia Wagemann](https://jwagemann.com) | Independent consultant and Phd student at University of Marburg\n\n#### Access to tutorial material\nNotebooks are available on [GitHub](https://github.com/jwagemann/2020_analysis_of_big_earth_data_with_jupyter).\n", "_____no_output_____" ], [ "<hr>", "_____no_output_____" ], [ "### Access to the JupyterHub", "_____no_output_____" ], [ "You can access the lecture material on a JupyterHub instance, a pre-defined environment that gives you direct access to the data and Python packages required for following the lecture.", "_____no_output_____" ], [ "<div class=\"alert alert-block alert-success\" align=\"left\">\n1. Web address: <a href='https://opengeohub.adamplatform.eu'>https://opengeohub.adamplatform.eu</a><br>\n2. Create an account: <a href='https://meeoauth.adamplatform.eu'>https://meeoauth.adamplatform.eu</a><br>\n3. Log into the <b>JupyterHub</b> with your account created. \n</div>", "_____no_output_____" ], [ "<hr>", "_____no_output_____" ], [ "## What is this lecture about?", "_____no_output_____" ], [ "Growing volumes of `Big Earth Data` force us to change the way how we access and process large volumes of geospatial data. New (cloud-based) data systems are being developed, each offering different functionalities for users.\n\nThis lecture is split in two parts: \n* **(Cloud-based) data access systems**<br>\nThis part will highlight five data access systems that allow you to access, download or process large volumes of Copernicus data related to climate and atmosphere. 
For each data system, an example is given how data can be retrieved.\nData access systems that will be covered:\n * [Copernicus Climate Data Store (CDS)](https://cds.climate.copernicus.eu/) / [Copernicus Atmosphere Data Store (ADS)](https://ads.atmosphere.copernicus.eu/)\n * [WEkEO - Copernicus Data and Information Access System](http://wekeo.eu/)\n * [Open Data Registry on Amazon Web Services](http://registry.opendata.aws)\n * [Google Earth Engine](https://code.earthengine.google.com/)\n\n\n* **Case study: Analysis of Covid-19 with Sentinel-5P data**<br>\nThis example showcases a case study analysing daily Sentinel-5P data from 2019 and 2020 with Jupyter notebooks and the Python library [xarray](http://xarray.pydata.org/en/stable/) in order to analyse possible Covid-19 impacts in 2020.", "_____no_output_____" ], [ "## Lecture outline", "_____no_output_____" ], [ "This lecture has the following outline:\n\n* [01 - Introduction to Project Jupyter (optional)](01_Intro_to_Python_and_Jupyter.ipynb)\n* [02 - Copernicus Climate Data Store / Copernicus Atmosphere Data Store](02_copernicus_climate_atmosphere_data_store.ipynb)\n* [03 - WEkEO - Copernicus Data and Information Access Service (DIAS)](03_WEkEO_dias_service.ipynb)\n* [04 - Amazon Web Services Open Data Registry](04_aws_open_data_registry.ipynb)\n* [05 - Google Earth Engine](05_google_earth_engine.ipynb)\n\n\n* [11 - Covid-19 case study - Sentinel-5P anomaly map](11_covid19_case_study_s5p_anomaly_map.ipynb)\n* [12 - Covid-19 case study - Sentinel-5P time-series analysis](12_covid19_case_study_s5p_time_series_analysis.ipynb)", "_____no_output_____" ], [ "<br>", "_____no_output_____" ], [ "<hr>\n&copy; 2020 | Julia Wagemann\n<a rel=\"license\" href=\"http://creativecommons.org/licenses/by/4.0/\"><img style=\"float: right\" alt=\"Creative Commons Lizenzvertrag\" style=\"border-width:0\" src=\"https://i.creativecommons.org/l/by/4.0/88x31.png\" /></a>", "_____no_output_____" ] ] ]
[ "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ] ]
d029a95c1231b8124a4ac31c833022163d3a1c74
92,057
ipynb
Jupyter Notebook
notebooks/.ipynb_checkpoints/pycaret-final-checkpoint.ipynb
ChandrakanthNethi/predict-the-employee-attrition-rate-in-organizations
3ddae6d20e1ae35b4efb4f32206f75b43a42407e
[ "MIT" ]
null
null
null
notebooks/.ipynb_checkpoints/pycaret-final-checkpoint.ipynb
ChandrakanthNethi/predict-the-employee-attrition-rate-in-organizations
3ddae6d20e1ae35b4efb4f32206f75b43a42407e
[ "MIT" ]
null
null
null
notebooks/.ipynb_checkpoints/pycaret-final-checkpoint.ipynb
ChandrakanthNethi/predict-the-employee-attrition-rate-in-organizations
3ddae6d20e1ae35b4efb4f32206f75b43a42407e
[ "MIT" ]
null
null
null
57.213797
553
0.471099
[ [ [ "# Predicting employee attrition rate in organizations", "_____no_output_____" ], [ "## Using PyCaret", "_____no_output_____" ], [ "### Step 1: Importing the data ", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nfrom pycaret.regression import *", "_____no_output_____" ], [ "train_csv = '../dataset/Train.csv'\ntest_csv = '../dataset/Test.csv'\ntrain_data = pd.read_csv(train_csv)\ntest_data = pd.read_csv(test_csv)", "_____no_output_____" ] ], [ [ "### Step 2: Setup", "_____no_output_____" ] ], [ [ "reg = setup(train_data, target='Attrition_rate', ignore_features=['Employee_ID'])", " \nSetup Succesfully Completed!\n" ] ], [ [ "### Step 3: Tuning the models", "_____no_output_____" ] ], [ [ "compare_models()", "_____no_output_____" ] ], [ [ "### Step 4: Selecting a model", "_____no_output_____" ] ], [ [ "model = create_model('br')", "_____no_output_____" ], [ "print(model)", "BayesianRidge(alpha_1=1e-06, alpha_2=1e-06, alpha_init=None,\n compute_score=False, copy_X=True, fit_intercept=True,\n lambda_1=1e-06, lambda_2=1e-06, lambda_init=None, n_iter=300,\n normalize=False, tol=0.001, verbose=False)\n" ] ], [ [ "### Step 5: Predicting on test data", "_____no_output_____" ] ], [ [ "predictions = predict_model(model, data = test_data)", "_____no_output_____" ], [ "predictions", "_____no_output_____" ], [ "predictions.rename(columns={\"Label\": \"Attrition_rate\"}, inplace=True)", "_____no_output_____" ], [ "predictions[['Employee_ID', 'Attrition_rate']].to_csv('../predictions.csv', index=False)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ] ]
d029ac20974d9cb7a89daa2bb85d96c4928708ff
34,906
ipynb
Jupyter Notebook
test.ipynb
sima97/unihobby
eb70be2d1e0f85ecd67b97f07ba49071c8a3aa1d
[ "MIT" ]
null
null
null
test.ipynb
sima97/unihobby
eb70be2d1e0f85ecd67b97f07ba49071c8a3aa1d
[ "MIT" ]
null
null
null
test.ipynb
sima97/unihobby
eb70be2d1e0f85ecd67b97f07ba49071c8a3aa1d
[ "MIT" ]
null
null
null
59.668376
2,024
0.582164
[ [ [ "<a href=\"https://colab.research.google.com/github/sima97/unihobby/blob/master/test.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ] ], [ [ "from google.colab import drive\ndrive.mount('/content/drive')", "Go to this URL in a browser: https://accounts.google.com/o/oauth2/auth?client_id=947318989803-6bn6qk8qdgf4n4g3pfee6491hc0brc4i.apps.googleusercontent.com&redirect_uri=urn%3aietf%3awg%3aoauth%3a2.0%3aoob&scope=email%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdocs.test%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive.photos.readonly%20https%3a%2f%2fwww.googleapis.com%2fauth%2fpeopleapi.readonly&response_type=code\n\nEnter your authorization code:\n··········\nMounted at /content/drive\n" ], [ "pip install nilearn", "Collecting nilearn\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/b9/c2/f5f1bdd37a3da28b3b34305e4ba27cce468db6073998d62a38abd0e281da/nilearn-0.6.2-py3-none-any.whl (2.5MB)\n\u001b[K |████████████████████████████████| 2.5MB 2.8MB/s \n\u001b[?25hRequirement already satisfied: nibabel>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from nilearn) (3.0.2)\nRequirement already satisfied: scipy>=0.19 in /usr/local/lib/python3.6/dist-packages (from nilearn) (1.4.1)\nRequirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.6/dist-packages (from nilearn) (0.16.0)\nRequirement already satisfied: scikit-learn>=0.19 in /usr/local/lib/python3.6/dist-packages (from nilearn) (0.22.2.post1)\nRequirement already satisfied: sklearn in /usr/local/lib/python3.6/dist-packages (from nilearn) (0.0)\nRequirement already satisfied: numpy>=1.11 in /usr/local/lib/python3.6/dist-packages (from nilearn) (1.18.5)\nInstalling collected packages: nilearn\nSuccessfully installed nilearn-0.6.2\n" ], [ "pip install tables", "Requirement already satisfied: tables in /usr/local/lib/python3.6/dist-packages (3.4.4)\nRequirement already satisfied: numpy>=1.8.0 in /usr/local/lib/python3.6/dist-packages (from tables) (1.18.5)\nRequirement already satisfied: numexpr>=2.5.2 in /usr/local/lib/python3.6/dist-packages (from tables) (2.7.1)\nRequirement already satisfied: six>=1.9.0 in /usr/local/lib/python3.6/dist-packages (from tables) (1.15.0)\n" ], [ "pip install git+https://www.github.com/farizrahman4u/keras-contrib.git", "Collecting git+https://www.github.com/farizrahman4u/keras-contrib.git\n Cloning https://www.github.com/farizrahman4u/keras-contrib.git to /tmp/pip-req-build-yzohelfu\n Running command git clone -q https://www.github.com/farizrahman4u/keras-contrib.git /tmp/pip-req-build-yzohelfu\nRequirement already satisfied: keras in /usr/local/lib/python3.6/dist-packages (from keras-contrib==2.0.8) (2.4.3)\nRequirement already satisfied: numpy>=1.9.1 in /usr/local/lib/python3.6/dist-packages (from keras->keras-contrib==2.0.8) (1.18.5)\nRequirement already satisfied: scipy>=0.14 in /usr/local/lib/python3.6/dist-packages (from keras->keras-contrib==2.0.8) (1.4.1)\nRequirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (from keras->keras-contrib==2.0.8) (2.10.0)\nRequirement already satisfied: pyyaml in /usr/local/lib/python3.6/dist-packages (from keras->keras-contrib==2.0.8) (3.13)\nRequirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from h5py->keras->keras-contrib==2.0.8) (1.15.0)\nBuilding wheels for collected packages: keras-contrib\n Building wheel for keras-contrib 
(setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-contrib: filename=keras_contrib-2.0.8-cp36-none-any.whl size=101064 sha256=50102c03bf897ccffa1d047fe537ea0faf3e58faaaebb788a0a926f266d294f4\n Stored in directory: /tmp/pip-ephem-wheel-cache-ti7rwgth/wheels/1f/8e/ac/fb1cca9d92276d64365b204e82a9a5bec1f24a20aca28fdbec\nSuccessfully built keras-contrib\nInstalling collected packages: keras-contrib\nSuccessfully installed keras-contrib-2.0.8\n" ], [ "pip install SimpleITK ", "Collecting SimpleITK\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f8/d8/53338c34f71020725ffb3557846c80af96c29c03bc883551a2565aa68a7c/SimpleITK-1.2.4-cp36-cp36m-manylinux1_x86_64.whl (42.5MB)\n\u001b[K |████████████████████████████████| 42.5MB 98kB/s \n\u001b[?25hInstalling collected packages: SimpleITK\nSuccessfully installed SimpleITK-1.2.4\n" ], [ "#pip install tensorflow==1.4", "_____no_output_____" ], [ "import tensorflow as tf\nfrom tensorflow.python.framework import ops\nimport tensorflow.compat.v1 as tf\ntf.disable_v2_behavior() \n\ndef cross_entropy_loss_v1(y_true, y_pred, sample_weight=None, eps=1e-6):\n \"\"\"\n :param y_pred: output 5D tensor, [batch size, dim0, dim1, dim2, class]\n :param y_true: 4D GT tensor, [batch size, dim0, dim1, dim2]\n :param eps: avoid log0\n :return: cross entropy loss\n \"\"\"\n log_y = tf.log(y_pred + eps)\n num_samples = tf.cast(tf.reduce_prod(tf.shape(y_true)), \"float32\")\n label_one_hot = tf.one_hot(indices=y_true, depth=y_pred.shape[-1], axis=-1, dtype=tf.float32)\n if sample_weight is not None:\n # ce = mean(- weight * y_true * log(y_pred)).\n label_one_hot = label_one_hot * sample_weight\n cross_entropy = - tf.reduce_sum(label_one_hot * log_y) / num_samples\n return cross_entropy\n\n\ndef cross_entropy_loss(y_true, y_pred, sample_weight=None):\n # one_hot may be unnecessary when using tf.keras.losses.CategoricalCrossentropy\n y_true = tf.one_hot(indices=y_true, depth=y_pred.shape[-1], axis=-1, dtype=tf.float32)\n if sample_weight is not None:\n # ce = mean(weight * y_true * log(y_pred)).\n y_true = y_true * sample_weight\n return tf.keras.losses.BinaryCrossentropy()(y_true, y_pred)\n\n\ndef cross_entropy_loss_with_weight(y_true, y_pred, sample_weight_per_c=None, eps=1e-6):\n # for simplicity, calculate the weights from this batch.\n # if possible, compute the per-class weights over the whole epoch before training.\n num_dims, num_classes = [len(y_true.shape), y_pred.shape.as_list()[-1]]\n if sample_weight_per_c is None:\n print('use batch to calculate weight')\n num_lbls_in_ygt = tf.cast(tf.reduce_prod(tf.shape(y_true)), dtype=\"float32\")\n num_lbls_in_ygt_per_c = tf.bincount(arr=tf.cast(y_true, tf.int32), minlength=num_classes, maxlength=num_classes,\n dtype=\"float32\") # without the min/max, the length of the vector can change.\n sample_weight_per_c = (1. / (num_lbls_in_ygt_per_c + eps)) * (num_lbls_in_ygt / num_classes)\n sample_weight_per_c = tf.reshape(sample_weight_per_c, [1] * num_dims + [num_classes])\n # cross_entropy_loss can return a negative value here, although cross_entropy_loss and cross_entropy_loss_v1\n # agree when no weight is used; likely an artifact of the batch distribution differing strongly from the epoch distribution.\n return cross_entropy_loss_v1(y_true, y_pred, sample_weight=sample_weight_per_c)\n\n\ndef dice_coef(y_true, y_pred, eps=1e-6):\n # problem: when gt class-0 >> class-1, the predicted p(class-0) >> p(class-1)\n # eg. gt = [0, 0, 0, 0, 1] pred = [[1, 0], [1, 0], [1, 0], [1, 0], [1, 0]]. 2 * 4 / (5 + 5) = 0.8\n # flipping one of the four background preds drops the score to 0.6 while fixing the single foreground pred raises it to 1.0,\n # so the model tends to just predict all 0s: a class-imbalance problem.\n # computing the dice only where gt == 1 fixes this in the binary case, but a multi-class task needs class weights like the ce loss above.\n y_true = tf.one_hot(indices=y_true, depth=y_pred.shape[-1], axis=-1, dtype=tf.float32)\n abs_x_and_y = 2 * tf.reduce_sum(y_true * y_pred)\n abs_x_plus_abs_y = tf.reduce_sum(y_true) + tf.reduce_sum(y_pred)\n return (abs_x_and_y + eps) / (abs_x_plus_abs_y + eps)\n\n\ndef dice_coef_loss(y_true, y_pred):\n return 1. - dice_coef(y_true, y_pred)\n\n\n", "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/compat/v2_compat.py:96: disable_resource_variables (from tensorflow.python.ops.variable_scope) is deprecated and will be removed in a future version.\nInstructions for updating:\nnon-resource variables are not supported in the long term\n0.109087184\n0.10908617\nuse batch to calculate weight\n0.107429765\n0.10195482\n" ], [ "import numpy as np\nfrom keras import backend as K\nfrom keras.engine import Input, Model\nfrom keras.layers import Conv3D, Conv3DTranspose, MaxPooling3D, UpSampling3D, Activation, BatchNormalization, PReLU # Conv3DTranspose replaces the removed Deconvolution3D\nfrom keras.optimizers import Adam\n\n#from unet3d.metrics import dice_coefficient_loss, get_label_dice_coefficient_function, dice_coefficient\n\nK.set_image_data_format(\"channels_first\")\n\ntry:\n from keras.engine import merge\nexcept ImportError:\n from keras.layers.merge import concatenate\n\n\ndef unet_model_3d(input_shape, pool_size=(2, 2, 2), n_labels=1, initial_learning_rate=0.00001, deconvolution=False,\n depth=4, n_base_filters=32, include_label_wise_dice_coefficients=False, metrics=dice_coef,\n batch_normalization=False, activation_name=\"sigmoid\"):\n \"\"\"\n Builds the 3D UNet Keras model.\n :param metrics: List of metrics to be calculated during model training (default is dice coefficient).\n :param include_label_wise_dice_coefficients: If True and n_labels is greater than 1, model will report the dice\n coefficient for each label as a metric.\n :param n_base_filters: The number of filters that the first layer in the convolution network will have. Following\n layers will contain a multiple of this number. Lowering this number will likely reduce the amount of memory required\n to train the model.\n :param depth: indicates the depth of the U-shape for the model. The greater the depth, the more max pooling\n layers will be added to the model. Lowering the depth may reduce the amount of memory required for training.\n :param input_shape: Shape of the input data (n_channels, x_size, y_size, z_size). The x, y, and z sizes must be\n divisible by the pool size to the power of the depth of the UNet, that is pool_size^depth.\n :param pool_size: Pool size for the max pooling operations.\n :param n_labels: Number of binary labels that the model is learning.\n :param initial_learning_rate: Initial learning rate for the model. This will be decayed during training.\n :param deconvolution: If set to True, will use transpose convolution (deconvolution) instead of up-sampling. This\n increases the amount of memory required during training.\n :return: Untrained 3D UNet Model\n \"\"\"\n inputs = Input(input_shape)\n current_layer = inputs\n levels = list()\n\n # add levels with max pooling\n for layer_depth in range(depth):\n layer1 = create_convolution_block(input_layer=current_layer, n_filters=n_base_filters*(2**layer_depth),\n batch_normalization=batch_normalization)\n layer2 = create_convolution_block(input_layer=layer1, n_filters=n_base_filters*(2**layer_depth)*2,\n batch_normalization=batch_normalization)\n if layer_depth < depth - 1:\n current_layer = MaxPooling3D(pool_size=pool_size)(layer2)\n levels.append([layer1, layer2, current_layer])\n else:\n current_layer = layer2\n levels.append([layer1, layer2])\n\n # add levels with up-convolution or up-sampling\n for layer_depth in range(depth-2, -1, -1):\n up_convolution = get_up_convolution(pool_size=pool_size, deconvolution=deconvolution,\n n_filters=current_layer._keras_shape[1])(current_layer)\n concat = concatenate([up_convolution, levels[layer_depth][1]], axis=1)\n current_layer = create_convolution_block(n_filters=levels[layer_depth][1]._keras_shape[1],\n input_layer=concat, batch_normalization=batch_normalization)\n current_layer = create_convolution_block(n_filters=levels[layer_depth][1]._keras_shape[1],\n input_layer=current_layer,\n batch_normalization=batch_normalization)\n\n final_convolution = Conv3D(n_labels, (1, 1, 1))(current_layer)\n act = Activation(activation_name)(final_convolution)\n model = Model(inputs=inputs, outputs=act)\n\n if not isinstance(metrics, list):\n metrics = [metrics]\n\n if include_label_wise_dice_coefficients and n_labels > 1:\n label_wise_dice_metrics = [get_label_dice_coefficient_function(index) for index in range(n_labels)]\n if metrics:\n metrics = metrics + label_wise_dice_metrics\n else:\n metrics = label_wise_dice_metrics\n\n model.compile(optimizer=Adam(lr=initial_learning_rate), loss=dice_coef_loss, metrics=metrics) # dice_coef_loss is defined in the losses cell above (the unet3d.metrics import is commented out)\n return model\n\n\ndef create_convolution_block(input_layer, n_filters, batch_normalization=False, kernel=(3, 3, 3), activation=None,\n padding='same', strides=(1, 1, 1), instance_normalization=False):\n \"\"\"\n :param strides:\n :param input_layer:\n :param n_filters:\n :param batch_normalization:\n :param kernel:\n :param activation: Keras activation layer to use. (default is 'relu')\n :param padding:\n :return:\n \"\"\"\n layer = Conv3D(n_filters, kernel, padding=padding, strides=strides)(input_layer)\n if batch_normalization:\n layer = BatchNormalization(axis=1)(layer)\n elif instance_normalization:\n try:\n from keras_contrib.layers.normalization.instancenormalization import InstanceNormalization\n except ImportError:\n raise ImportError(\"Install keras_contrib in order to use instance normalization.\"\n \"\\nTry: pip install git+https://www.github.com/farizrahman4u/keras-contrib.git\")\n layer = InstanceNormalization(axis=1)(layer)\n if activation is None:\n return Activation('relu')(layer)\n else:\n return activation()(layer)\n\n\ndef compute_level_output_shape(n_filters, depth, pool_size, image_shape):\n \"\"\"\n Each level has a particular output shape based on the number of filters used in that level and the depth or number \n of max pooling operations that have been done on the data at that point.\n :param image_shape: shape of the 3d image.\n :param pool_size: the pool_size parameter used in the max pooling operation.\n :param n_filters: Number of filters used by the last node in a given level.\n :param depth: The number of levels down in the U-shaped model a given node is.\n :return: 5D vector of the shape of the output node \n \"\"\"\n output_image_shape = np.asarray(np.divide(image_shape, np.power(pool_size, depth)), dtype=np.int32).tolist()\n return tuple([None, n_filters] + output_image_shape)\n\n\ndef get_up_convolution(n_filters, pool_size, kernel_size=(2, 2, 2), strides=(2, 2, 2),\n deconvolution=False):\n if deconvolution:\n # learned (transposed) upsampling; Conv3DTranspose is the current Keras name for Deconvolution3D\n return Conv3DTranspose(filters=n_filters, kernel_size=kernel_size,\n strides=strides)\n else:\n return UpSampling3D(size=pool_size)", "_____no_output_____" ], [ "import os\nimport glob\n\n#from unet3d.data import write_data_to_file, open_data_file\n#from unet3d.generator import get_training_and_validation_generators\n#from unet3d.model import unet_model_3d\n#from unet3d.training import load_old_model, train_model\n\n\nconfig = dict()\nconfig[\"pool_size\"] = (2, 2, 2) # pool size for the max pooling operations\nconfig[\"image_shape\"] = (144, 144, 144) # This determines what shape the images will be cropped/resampled to.\nconfig[\"patch_shape\"] = (64, 64, 64) # switch to None to train on the whole image\nconfig[\"labels\"] = (1, 2, 4) # the label numbers on the input image\nconfig[\"n_labels\"] = len(config[\"labels\"])\nconfig[\"all_modalities\"] = [\"t1\", \"t1ce\", \"flair\", \"t2\"]\nconfig[\"training_modalities\"] = config[\"all_modalities\"] # change this if you want to only use some of the modalities\nconfig[\"nb_channels\"] = len(config[\"training_modalities\"])\nif \"patch_shape\" in config and config[\"patch_shape\"] is not None:\n config[\"input_shape\"] = tuple([config[\"nb_channels\"]] + list(config[\"patch_shape\"]))\nelse:\n config[\"input_shape\"] = tuple([config[\"nb_channels\"]] + list(config[\"image_shape\"]))\nconfig[\"truth_channel\"] = config[\"nb_channels\"]\nconfig[\"deconvolution\"] = True # if False, will use upsampling instead of deconvolution\n\nconfig[\"batch_size\"] = 6\nconfig[\"validation_batch_size\"] = 12\nconfig[\"n_epochs\"] = 500 # cutoff the training after this many epochs\nconfig[\"patience\"] = 10 # learning rate will be reduced after this many epochs if the validation loss is not improving\nconfig[\"early_stop\"] = 50 # training will be stopped after this many epochs without the validation loss improving\nconfig[\"initial_learning_rate\"] = 
0.00001\nconfig[\"learning_rate_drop\"] = 0.5 # factor by which the learning rate will be reduced\nconfig[\"validation_split\"] = 0.8 # portion of the data that will be used for training\nconfig[\"flip\"] = False # augments the data by randomly flipping an axis during\nconfig[\"permute\"] = True # data shape must be a cube. Augments the data by permuting in various directions\nconfig[\"distort\"] = None # switch to None if you want no distortion\nconfig[\"augment\"] = config[\"flip\"] or config[\"distort\"]\nconfig[\"validation_patch_overlap\"] = 0 # if > 0, during training, validation patches will be overlapping\nconfig[\"training_patch_start_offset\"] = (16, 16, 16) # randomly offset the first patch index by up to this offset\nconfig[\"skip_blank\"] = True # if True, then patches without any target will be skipped\n\nconfig[\"data_file\"] = os.path.abspath(\"/content/drive/My Drive/Brats2019/data.h5\")\nconfig[\"model_file\"] = os.path.abspath(\"/content/drive/My Drive/Brats2019/tumor_segmentation_model.h5\")\nconfig[\"training_file\"] = os.path.abspath(\"/content/drive/My Drive/Brats2019/pkl/training_ids.pkl\")\nconfig[\"validation_file\"] = os.path.abspath(\"/content/drive/My Drive/Brats2019/pkl/validation_ids.pkl\")\nconfig[\"overwrite\"] = False # If True, will previous files. If False, will use previously written files.\n\n\ndef fetch_training_data_files():\n training_data_files = list()\n for subject_dir in glob.glob(os.path.join(os.path.dirname(__file__), \"data\", \"preprocessed\", \"*\", \"*\")):\n subject_files = list()\n for modality in config[\"training_modalities\"] + [\"truth\"]:\n subject_files.append(os.path.join(subject_dir, modality + \".nii.gz\"))\n training_data_files.append(tuple(subject_files))\n return training_data_files\n\n\ndef main(overwrite=False):\n # convert input images into an hdf5 file\n if overwrite or not os.path.exists(config[\"data_file\"]):\n training_files = fetch_training_data_files()\n\n write_data_to_file(training_files, config[\"data_file\"], image_shape=config[\"image_shape\"])\n data_file_opened = open_data_file(config[\"data_file\"])\n\n if not overwrite and os.path.exists(config[\"model_file\"]):\n model = load_old_model(config[\"model_file\"])\n else:\n # instantiate new model\n model = unet_model_3d(input_shape=config[\"input_shape\"],\n pool_size=config[\"pool_size\"],\n n_labels=config[\"n_labels\"],\n initial_learning_rate=config[\"initial_learning_rate\"],\n deconvolution=config[\"deconvolution\"])\n\n # get training and testing generators\n train_generator, validation_generator, n_train_steps, n_validation_steps = get_training_and_validation_generators(\n data_file_opened,\n batch_size=config[\"batch_size\"],\n data_split=config[\"validation_split\"],\n overwrite=overwrite,\n validation_keys_file=config[\"validation_file\"],\n training_keys_file=config[\"training_file\"],\n n_labels=config[\"n_labels\"],\n labels=config[\"labels\"],\n patch_shape=config[\"patch_shape\"],\n validation_batch_size=config[\"validation_batch_size\"],\n validation_patch_overlap=config[\"validation_patch_overlap\"],\n training_patch_start_offset=config[\"training_patch_start_offset\"],\n permute=config[\"permute\"],\n augment=config[\"augment\"],\n skip_blank=config[\"skip_blank\"],\n augment_flip=config[\"flip\"],\n augment_distortion_factor=config[\"distort\"])\n\n # run training\n train_model(model=model,\n model_file=config[\"model_file\"],\n training_generator=train_generator,\n validation_generator=validation_generator,\n 
steps_per_epoch=n_train_steps,\n validation_steps=n_validation_steps,\n initial_learning_rate=config[\"initial_learning_rate\"],\n learning_rate_drop=config[\"learning_rate_drop\"],\n learning_rate_patience=config[\"patience\"],\n early_stopping_patience=config[\"early_stop\"],\n n_epochs=config[\"n_epochs\"])\n data_file_opened.close()\n\n\nif __name__ == \"__main__\":\n main(overwrite=config[\"overwrite\"])", "_____no_output_____" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d029b32958aa6b1475db0c3e8e847d794f6d4ece
9,894
ipynb
Jupyter Notebook
tests/Test_binning.ipynb
ptyshevs/int_seq
645e469ee86d0c807ae57f5bbedbab46c6da3675
[ "MIT" ]
8
2018-11-18T20:08:38.000Z
2020-09-12T08:28:35.000Z
tests/Test_binning.ipynb
ptyshevs/int_seq
645e469ee86d0c807ae57f5bbedbab46c6da3675
[ "MIT" ]
3
2020-01-28T22:32:24.000Z
2020-03-31T00:39:08.000Z
tests/Test_binning.ipynb
ptyshevs/int_seq
645e469ee86d0c807ae57f5bbedbab46c6da3675
[ "MIT" ]
1
2018-11-21T22:55:49.000Z
2018-11-21T22:55:49.000Z
24.25
238
0.537396
[ [ [ "import pandas as pd\nimport numpy as np\nfrom tools import acc_score", "_____no_output_____" ], [ "df_train = pd.read_csv(\"../data/train.csv\", index_col=0)\ndf_test = pd.read_csv(\"../data/test.csv\", index_col=0)", "_____no_output_____" ], [ "train_bins = seq_to_num(df_train.Sequence, target_split=True, pad=True, pad_adaptive=True,\n pad_maxlen=100, dtype=np.float32, drop_na_inf=True,\n nbins=5, bins_by='terms')\ntest_bins = seq_to_num(df_test.Sequence, target_split=True, pad_adaptive=True,\n dtype=np.float32, drop_na_inf=True, nbins=5, bins_by='terms')", "_____no_output_____" ], [ "train_X, train_y, _ = train_bins[4]", "_____no_output_____" ], [ "test_X, test_y, test_idx = test_bins[4]", "_____no_output_____" ], [ "from sklearn.tree import DecisionTreeRegressor, ExtraTreeRegressor", "_____no_output_____" ], [ "dt = DecisionTreeRegressor(random_state=42)\ndt.fit(train_X, train_y)", "_____no_output_____" ], [ "acc_score(dt.predict(test_X), test_y)", "_____no_output_____" ], [ "train_X2, train_y2, _ = train_bins[1]\ntest_X2, test_y2, _ = test_bins[1]", "_____no_output_____" ], [ "etr = ExtraTreeRegressor(max_depth=100, random_state=42)\netr.fit(train_X2, train_y2)", "_____no_output_____" ], [ "acc_score(etr.predict(test_X2), test_y2)", "_____no_output_____" ], [ "# too long sequence?\ntrain_X2, train_y2 = train_bins[2]\ntest_X2, test_y2 = test_bins[2]\netr = DecisionTreeRegressor(max_depth=5, random_state=42)\netr.fit(train_X2, train_y2)\nacc_score(etr.predict(test_X2), test_y2)", "_____no_output_____" ], [ "from sklearn.neural_network import MLPRegressor", "_____no_output_____" ], [ "# NNet still doesn't work\nmlp = MLPRegressor(hidden_layer_sizes=(10, 1))\nmlp.fit(train_X, train_y)\nacc_score(mlp.predict(test_X), test_y)", "/Users/ptyshevs/envs/loc_env/lib/python3.6/site-packages/sklearn/neural_network/multilayer_perceptron.py:562: ConvergenceWarning: Stochastic Optimizer: Maximum iterations (200) reached and the optimization hasn't converged yet.\n % self.max_iter, ConvergenceWarning)\n" ], [ "# Try to combine predictions for bin 3 and 4 (by terms), while\n# fallback to mode on bin 0, 1, 2\ndef mmode(arr):\n modes = []\n for row in arr:\n counts = {i: row.tolist().count(i) for i in row}\n if len(counts) > 0:\n modes.append(max(counts.items(), key=lambda x:x[1])[0])\n else:\n modes.append(0)\n return modes", "_____no_output_____" ], [ "kg_train = pd.read_csv('../data/kaggle_train.csv', index_col=0)\nkg_test = pd.read_csv('../data/kaggle_test.csv', index_col=0)", "_____no_output_____" ], [ "train_bins = seq_to_num(kg_train.Sequence, target_split=True,\n pad_adaptive=True, dtype=np.float32, drop_na_inf=True,\n nbins=5, bins_by='terms')\ntest_bins = seq_to_num(kg_test.Sequence, target_split=False, pad_adaptive=True,\n dtype=np.float32, drop_na_inf=True, nbins=5, bins_by='terms')", "_____no_output_____" ], [ "bin3_X, bin3_y, _ = train_bins[3]\nbin4_X, bin4_y, _ = train_bins[4]\ndt_bin3 = DecisionTreeRegressor(random_state=42)\ndt_bin4 = DecisionTreeRegressor(random_state=42)\ndt_bin3.fit(bin3_X, bin3_y)\ndt_bin4.fit(bin4_X, bin4_y)", "_____no_output_____" ], [ "pred_bin3 = dt_bin3.predict(test_bins[3][0])\npred_bin4 = dt_bin4.predict(test_bins[4][0])", "_____no_output_____" ], [ "test_bins[3][1].shape, pred_bin3.shape", "_____no_output_____" ], [ "pred_bin0 = mmode(test_bins[0])", "_____no_output_____" ], [ "pred_bin1 = mmode(test_bins[1])\npred_bin2 = mmode(test_bins[2])", "_____no_output_____" ], [ "pred3 = pd.Series(pred_bin3, index=test_bins[3][1], dtype=object).map(lambda x: 
int(x))\npred4 = pd.Series(pred_bin4, index=test_bins[4][1], dtype=object).map(lambda x: int(x))", "_____no_output_____" ], [ "pred_total = pd.Series(np.zeros(kg_test.shape[0]), index=kg_test.index, dtype=np.int64)\npred_total[test_bins[3][1]] = pred_bin3\npred_total[test_bins[4][1]] = pred_bin4", "_____no_output_____" ], [ "prep_submit(pred_total)", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d029ba35490782c025f348d562f60290eb42b0c6
809,309
ipynb
Jupyter Notebook
Repaso_algebra_LinealHeidy.ipynb
1966hs/MujeresDigitales
1360f017f0e18d27612ffc6c42cd2a84a2e7b925
[ "MIT" ]
null
null
null
Repaso_algebra_LinealHeidy.ipynb
1966hs/MujeresDigitales
1360f017f0e18d27612ffc6c42cd2a84a2e7b925
[ "MIT" ]
null
null
null
Repaso_algebra_LinealHeidy.ipynb
1966hs/MujeresDigitales
1360f017f0e18d27612ffc6c42cd2a84a2e7b925
[ "MIT" ]
null
null
null
620.160153
128,101
0.945292
[ [ [ "<a href=\"https://colab.research.google.com/github/1966hs/MujeresDigitales/blob/main/Repaso_algebra_LinealHeidy.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ], [ "![ejemplo.jpg](data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAugAAAGtCAIAAADLRRaFAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAP+lSURBVHhe7L0HwFxV1f1tEkroVfAFpRdfFBQ/G0VABARR9EVFpasUARVBkCJNkaYiVUEsEJqoCCoKgkgTJJAQAgGSEBLSe4P0J8VvnbvPc7JnnXYzJ/MXzP0Z51nnrHX2vTN35t498xTe8u+GhoaGhoaGhjcJTePS0NDQ0NDQ8KahaVwaGhoaGhoa3jQ0jUtDQ0NDQ0PDm4amcWloaGhoaGh409A0Lg0NDQ0NDQ1vGprGpaGhoaGhoeFNQ9O4NDQ0NDQ0NLxpaBqXhoaGhoaGhjcNTePS0NDQ0NDQ8KahaVwaGhoaGhoa3jQ0jUtDQ0NDQ0PDm4amcSlliYeepEzMIi0ii04mVpnSuQ1pLaRdTTBZuDxIMAkR1IIekpXAJU25SHHSMasmuoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6muNY+hcn2lvvEige1j7YSMRBMQgQ1IK2HNUlUiFmkRWTRSVpFlj8soWlcSlmssFMNDQ0NBdgTigdO+lZVmoZWtVqkRWRJJKt6meJa+xQm21vuEyse1D7aSsRAMAkR1IC0HtYkUSFmkRaRRSdpFVl6aJ/lBTSNS0NDQ8MbCznR+2iLkjGLtIgsiWRVL1Nca5/CZHvLfWLFg9pHW4kYCCYhghqQ1sOaJCrELNIisugkrSLLH5bQNC4NDQ0NbyzsCd5DW5SMWaRFZEkkq3qZ4lr7FCbbW+4TKx7UPtpKxEAwCRHUgLQe1iRRIWaRFpFFJ2kVWf6whKZxaWhoaHhjYU/wHtqiZMwiLSJLIlnVyxTX2qcw2d5yn1jxoPbRViIGgkmIoAak9bAmiQoxi7SILDpJq8jyhyU0jUtDQ0PDGwt7gvfQFiVjFmkRWRLJql6muNY+hcl/L/n3osWLFy7pWrJwySLz40AL8X+xiHaKV/jaR1uJGAgmIYIakNbDmiQqxCzSIrLoJK0iyx+W0DQuDQ0NDW8sqh9hDICTvlXeDzzGLNIisiSSVb1Mca19CpNLFi9ZuHDRrMWvL56H/82fs3jOooVdYhHtFK/wtY+2EjEQTEIENSCthzVJVIhZpEVk0UlaRZYe2md5AU3j0tDQ0PDGQk70PtqiZMwiLSJLIlnVyxTX2qcwab50LZy3ZL7pVubNfX3CvAWL5uOyKK6mneIVvvbRViIGgkmIoAak9bAmiQoxi7SILDpJq8jyhyU0jUtDQ0PDGwt7gvfQFiVjFmkRWRLJql6muNY+hcl/m7fuC/A2fhH+P3ZR3/3/Me/lBaG+pa3iFb720VYiBoJJiKAGpPWwJokKMYu0iCw6SavI8oclNI1LQ0NDwxuL6gP1ADjpW+V9/B6zSIvIkkhW9TLFtfYpTEIsXFx9c2jh4ld/OvThNR566dzB1Q+6MO0VD2ofbSViIJiECGpAWg9rkqgQs0iLyKKTtIosPbTP8gKaxqWhoaHhjYWc6H20RcmYRVpElkSyqpcprrVPYfLfS9DYLVloupfFQ/q88MDu/3rpihcw9mmneIWvfbSViIFgEiKoAWk9rEmiQswiLSKLTtIqsvxhCU3j0tDQ0PDGwp7gPbRFyZhFWkSWRLKqlymutU9h0nxZhPftC5YsXrBo1pJHvvbPxZPMT+z6tFO8wtc+2krEQDAJEdSAtB7WJFEhZpEWkUUnaRVZ/rCEpnFpaGhoeGNhT/Ae2qJkzCItIksiWdXLFNfapzBZfe1aZOSiJfOWPHLSI4unhdqWNosbfO2jrUQMBJMQQQ1I62FNEhViFmkRWXSSVpHlD0toGpeGhoaGNxb2BO+hLUrGLNIisiSSVb1Mca19CpPV16ZxSZGoELNIi8iik7SKLH9YQtO4NDQ0NLyxsCd4D21RMmaRFpElkazqZYpr7VOYrL42jUuKRIWYRVpEFp2kVWT5wxKaxqWhoaHhjYU9wXtoi5Ixi7SILIlkVS9TXGufwmT1tWlcUiQqxCzSIrLoJK0iyx+W0DQuDQ0NDW8s7AneQ1uUjFmkRWRJJKt6meJa+xQmq69N45IiUSFmkRaRRSdpFVn+sISmcWloaGh4YyF/7sIHJ32rvL+NEbNIi8iSSFb1MsW19ilMVl8XVH+3ZeHiuYsfPvHhRVPNbxn5tFXc4GsfbSViIJiECGpAWg9rkqgQs0iLyKKTtIosPbTP8gKaxqWhoaHhjYWc6H20RcmYRVpElkSyqpcprrVPYbL62nzikiJRIWaRFpFFJ2kVWf6whKZxaWhoaHhjYU/wHtqiZMwiLSJLIlnVyxTX2qcwWX1tGpcUiQoxi7SILDpJq8jyhyU0jct/IcPuPe4tFbteOcxONTQ0vHmwH6l74KRvlffxe8wiLSJLIlnVyxTX2qcwWX1tvlWUIlEhZpEWkUUnaRVZemif5QX8NzYu7rrdyq67HnflvSvEhXzYlbvau9w0Lg0Nb0LkRO+jLUrGLNIisiSSVb1Mca19CpPV1+YTlxSJCjGLtIgsOkmryPKHJfxXNi7d1+0Qux53r4399/KmaVzcjh63YnSUDQ31sCd4D21RMmaRFpElkazqZYpr7VOYrL42jUuKRIWYRVpEFp2kVWT5wxL+qxsXdd1Wn8L8938M8eZrXJqPhhoaFPYE76EtSsYs0iKyJJJVvUxxrX0Kk9XXpnFJkagQs0iLyKKTtIosf1jCitK4ANe6/NdfJZvGpaHhTY09wXtoi5Ixi7SILIlkVS9TXGufwmT1tWlcUiQqxCzSIrLoJK0iyx+WsAI1LrGr5LB7j9vVOmDXXY+7d5jyu1eZ7zDpZJWzGWFYZS+tFIigwpVqY9lAFcn/YE7Lfh135bD27qliaYGWb6yF6tap2ZIx98gkgj+IpHc4/1DYElhktmC12uHsvmUPR0PDfwR7gvfQFiVjFmkRWRLJql6muNY+hcnqa9O4pEhUiFmkRWTRSVpFlj8sYUX8xEVdjHFps5OtLI24xuU4e2nULN2A26aHX6oVtTfdV1+P1jvSSnzLLevy91SztKj2u0t0l61TM3qng2vdDtd6KGyJlm6xe9v1j2wret8bGv5D2BO8h7YoGbNIi8iSSFb1MsW19ilMVl+bxiVFokLMIi0ii07SKrL8YQkrSOOCt9buKhW
4dOF9Ol+K3QVQX97sO3I1p2PVJwJuiy7THeku7fZKPqJxu+M27d74q73W12vN0s3YhKlpZ/SmumOJe9qCv/tqB2WqTk29e0tDu3b7qoTdUaHmQ+Fiitr71j1e+hiZPVt6OBoa/nPYE7yHtigZs0iLyJJIVvUyxbX2KUxWX5vGJUWiQswiLSKLTtIqsvxhCf/VjYtPyzcDXE5frHhyaTF9cV06G73QuUj3uqVXSrUPDi8uZDYUXrX0um9nXSx1T1vxN0xX/jo13bD1TjnCfmSVm/b3SN1TuW3dDYEm04ejoeE/if17Fx446Vvl/W2MmEVaRJZEsqqXKa61T2Gy+tr8HZcUiQoxi7SILDpJq8jSQ/ssL2BFalxauhagrnsB7DUvchX1egMk5ScmvG27hN6e9wMbztTXWhDYkCK8yttnvWUf2qJl6aMoPl33a9WM3aluwg9ubJWb9+4Wl6+zby0h73A0NPwnkRO9j7YoGbNIi8iSSFb1MsW19ilMVl+bT1xSJCrELNIisugkrSLLH5awInyrKPx9InVlDmKT4Wurv414MbVwGDobOyvYH1SNbiZhGMp3zsCXfcvSVSbgRvy4hKlSvB8ewUB0lWfEGpc6+ya52OFoaPiPYk/wHtqiZMwiLSJLIlnVyxTX2qcwWX1tGpcUiQoxi7SILDpJq8jyhyWsKD+cu/TttbpsRS6QLURSrp5Mq2H3G/doefvJjJhAdih2EY7NC2GXN13vnnroZU53b6hOTZcJ73ysSO2HIhass2+O8OFoaPhPYj9S98BJ3yrv4/eYRVpElkSyqpcprrVPYbL62nyrKEWiQswiLSKLTtIqsvTQPssLWFEal9BF1F33Ute38MWXZt1QZYJ7ocDbfe1H4pkqwbuwdOe6Z2vdU5+lGz/uOFth6R2sUzOXCd+72g+FK68PjaGN+0uHo6HhP4mc6H20RcmYRVpElkSyqpcprrVPYbL62nzikiJRIWaRFpFFJ2kVWf6whBWmcVk6G7jwYcr9hIP8conLLF0G5IdkAr/hoi6h1YT+7pTNmELmp2y667rf/lF+94TdmcCGGH0XpMzSGbWqzj0N0HLnDbpDqFPT273qTh3nfnfH+d2PWzVb96FQq+1MN/l9yx2Ohob/IPYE76EtSsYs0iKyJJJVvUxxrX0Kk9XXpnFJkagQs0iLyKKTtIosf1jCitO46KuZu9C5yxWz9DrL127N0gvm0toe6koZQO1jdFPuOh8itOXuHxFWxfP3NEhrddqPOjXDGXUA7IzQvbDeQ+EWBx6f3L7lD0dDw38Me4L30BYlYxZpEVkSyapeprjWPoXJ6mvTuKRIVIhZpEVk0UlaRZY/LGFFalzUxUpd6apPANRFjH7BRBdb+rYcQ/sZgaOqY01T5F6kRXfvBW0p9Jss8qGADYBav+3SuuVqv3jTFZl7GkZ1FqH+qU7N1vvECf0TsnoLNR6K7n3bVf+53KVk9q3G4Who+M9gT/Ae2qJkzCItIksiWdXLFNfapzBZfW0alxSJCjGLtIgsOkmryPKHJfw3Ni7LlWgX1NDQ0NAZ7AneQ1uUjFmkRWRJJKt6meJa+xQmq69N45IiUSFmkRaRRSdpFVn+sISmccnQNC4NDQ3/j7EneA9tUTJmkRaRJZGs6mWKa+1TmKy+No1LikSFmEVaRBadpFVk+cMSmsYlQ9O4NDQ0/D/GnuA9tEXJmEVaRJZEsqqXKa61T2Gy+to0LikSFWIWaRFZdJJWkeUPS2galwzdP6nyFv6ZloaGhobOYP/ehQdO+lZ5fxsjZpEWkSWRrOplimvtU5isvjZ/xyVFokLMIi0ii07SKrL00D7LC2gal4aGhoY3FnKi99EWJWMWaRFZEsmqXqa41j6Fyepr84lLikSFmEVaRBadpFVk+cMSmsaloaGh4Y2FPcF7aIuSMYu0iCyJZFUvU1xrn8Jk9bVpXFIkKsQs0iKy6CStIssfltA0Lg0NDQ1vLOxH6h446Vvlffwes0iLyJJIVvUyxbX2KUxWX5tvFaVIVIhZpEVk0UlaRZYe2md5AU3j0tDQ0PDGQk70PtqiZMwiLSJLIlnVyxTX2qcwWX1tPnFJkagQs0iLyKKTtIosf1hC07g0NDQ0vLGwJ3gPbVEyZpEWkSWRrOplimvtU5isvjaNS4pEhZhFWkQWnaRVZPnDEprGpaGhoeGNhT3Be2iLkjGLtIgsiWRVL1Nca5/CZPW1aVxSJCrELNIisugkrSLLH5bQNC4NDQ0NbyzsCd5DW5SMWaRFZEkkq3qZ4lr7FCarr03jkiJRIWaRFpFFJ2kVWf6whKZxaWhoaHhjYX+I0QMnfau8H3iMWaRFZEkkq3qZ4lr7FCarr80P56ZIVIhZpEVk0UlaRZYe2md5AU3j0rCiIy8qjZvUAa0B6ZhVE11Ba0DaDWneqhA1k7CcS1oE0NqnjWRHiwOnIYLaR1uJGHCuKRcpTjpm1YSW09CqVou0iCyJZFUvU1xrn8Jk9bX5xCVFokLMIi0ii07SKrL8YQlN49KwomNfTAo3qQNaA9Ixqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqa41j6Fyepr07ikSFSIWaRFZNFJWkWWPyyhaVwaVnTsx5cKvLScCGpAOmbVRFfQGpB2Q5q3KkTNJCznkhYBtPZpI9nR4sBpiKD20VYiBpxrykWKk45ZNaHlNLSq1SItIksiWdXLFNfapzBZfW2+VZQiUSFmkRaRRSdpFVl6aM+8BTSNS8OKjryoNG5SB7QGpGNWTXQFrQFpN6R5q0LUTMJyLmkRQGufNpIdLQ6chghqH20lYsC5plykOOmYVRNaTkOrWi3SIrIkklW9THGtfQqT1dfmE5cUiQoxi7SILDpJq8jyhyU0jUvDio59MSncpA5oDUjHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1ocOA0R1D7aSsSAc025SHHSMasmtJyGVrVapEVkSSSrepniWvsUJquvTeOSIlEhZpEWkUUnaRVZ/rCEpnFpWNGxH18q8NJyIqgB6ZhVE11Ba0DaDWneqhA1k7CcS1oE0NqnjWRHiwOnIYLaR1uJGHCuKRcpTjpm1YSW09CqVou0iCyJZFUvU1xrn8Jk9bX5VlGKRIWYRVpEFp2kVWTpoT3zFtA0Lg0rOvKi0rhJHdAakI5ZNdEVtAak3ZDmrQpRMwnLuaRFAK192kh2tDhwGiKofbSViAHnmnKR4qRjVk1oOQ2tarVIi8iSSFb1MsW19ilMVl+bT1xSJCrELNIisugkrSLLH5bQNC4NKzr2XYACLy0nghqQjlk10RW0BqTdkOatClEzCcu5pEUArX3aSHa0OHAaIqh9tJWIAeeacpHipGNWTWg5Da1qtUiLyJJIVvUyxbX2KUxWX5tPXFIkKsQs0iKy6CStIksP7Zm3gKZxKUUOiUZPUiZmkRaRRScTq0zp3Ia0FtKuJpgsXB4kmIQIakEPyUrgkqZcpDjpmFUTXUFrQNoNad6qEDWTsJxLWg
TQ2qeNZEeLA6chgtpHW4kYcK4pFylOOmbVhJbT0KpWi7SILIlkVS9TXGufwmT1tfnEJUWiQswiLSKLTtIqsvxhCU3jUoo9FAo9SZmYRVpEFp1MrDKlcxvSWki7mmCycHmQYBIiqAU9JCuBS5pykeKkY1ZNdAWtAWk3pHmrQtRMwnIuaRFAa582kh0tDpyGCGofbSViwLmmXKQ46ZhVE1pOQ6taLdIisiSSVb1Mca19CpPV16ZxSZGoELNIi8iik7SKLH9YQtO4lGI//FLgwFjlfUQWs0iLyKKTiVWmdG5DWgtpVxNMFi4PEkxCBLWgh2QlcElTLlKcdMyqia6gNSDthjRvVYiaSVjOJS0CaO3TRrKjxYHTEEHto61EDDjXlIsUJx2zakLLaWhVq0VaRJZEsqqXKa61T2Gy+tp8qyhFokLMIi0ii07SKrL00F47C2gal1LkkGj0JGViFmkRWXQyscqUzm1IayHtaoLJwuVBgkmIoBb0kKwELmnKRYqTjlk10RW0BqTdkOatClEzCcu5pEUArX3aSHa0OHAaIqh9tJWIAeeacpHipGNWTWg5Da1qtUiLyJJIVvUyxbX2KUxWX5tPXFIkKsQs0iKy6CStIssfltA0LqXYQ6HQk5SJWaRFZNHJxCpTOrchrYW0qwkmC5cHCSYhglrQQ7ISuKQpFylOOmbVRFfQGpB2Q5q3KkTNJCznkhYBtPZpI9nR4sBpiKD20VYiBpxrykWKk45ZNaHlNLSq1SItIksiWdXLFNfapzBZfW0alxSJCjGLtIgsOkmryPKHJTSNSyn2UCj0JGViFmkRWXQyscqUzm1IayHtaoLJwuVBgkmIoBb0kKwELmnKRYqTjlk10RW0BqTdkOatClEzCcu5pEUArX3aSHa0OHAaIqh9tJWIAeeacpHipGNWTWg5Da1qtUiLyJJIVvUyxbX2KUxWX5vGJUWiQswiLSKLTtIqsvxhCU3jUoo9FAo9SZmYRVpEFp1MrDKlcxvSWki7mmCycHmQYBIiqAU9JCuBS5pykeKkY1ZNdAWtAWk3pHmrQtRMwnIuaRFAa582kh0tDpyGCGofbSViwLmmXKQ46ZhVE1pOQ6taLdIisiSSVb1Mca19CpPV16ZxSZGoELNIi8iik7SKLH9YQtO4lGIPhUJPUiZmkRaRRScTq0zp3Ia0FtKuJpgsXB4kmIQIakEPyUrgkqZcpDjpmFUTXUFrQNoNad6qEDWTsJxLWgTQ2qeNZEeLA6chgtpHW4kYcK4pFylOOmbVhJbT0KpWi7SILIlkVS9TXGufwmT1dcVtXOwPuFbIUM+LTlSIWaRFZNFJWkWWPyyhaVxKsYdCoScpE7NIi8iik4lVpnRuQ1oLaVcTTBYuDxJMQgS1oIdkJXBJUy5SnHTMqomuoDUg7YY0b1WImklYziUtAmjt00ayo8WB0xBB7aOtRAw415SLFCcds2pCy2loVatFWkSWRLKqlymutU9hsvq6IjYu6EsWLbK/PwXhNCynAYaxCiBmkRaRRSdpFVn+sISmcSnFHgqFnqRMzCItIotOJlaZ0rkNaS2kXU0wWbg8SDAJEdSCHpKVwCVNuUhx0jGrJrqC1oC0G9K8VSFqJmE5l7QIoLVPG8mOFgdOQwS1j7YSMeBcUy5SnHTMqgktp6FVrRZpEVkSyapeprjWPoXJ6usK2rhMmDDhxRdffOmll3ALBg8ePG7cuAULFkgfs3DhQknGKoCYRVpEFp2kVWT5wxKaxqUUaXI1ODBWtX6IB2IWaRFZdDKxypTObUhrIe1qgsnC5UGCSYigFvSQrAQuacpFipOOWTXRFbQGpN2Q5q0KUTMJy7mkRQCtfdpIdrQ4cBoiqH20lYgB55pykeKkY1ZNaDkNrWq1SIvIkkhW9TLFtfYpTFZfV9C/43LppZeu1k3v3r1XX331ddddd7vttsP89OnT0bhILFEhZpEWkUUnaRVZemivnQU0jUspckg0epIyMYu0iCw6mVhlSuc2pLWQdjXBZOHyIMEkRFALekhWApc05SLFScesmugKWgPSbkjzVoWomYTlXNIigNY+bSQ7Whw4DRHUPtpKxIBzTblIcdIxqya0nIZWtVqkRWRJJKt6meJa+xQmq68r6M+4XHzxxW95y1s23XTT448//thjjz3wwAPXWWedXhWnn376ggULdO8iUAXSbkhaRBadpFVk+cMSmsalFHsoFHqSMjGLtIgsOplYZUrnNqS1kHY1wWTh8iDBJERQC3pIVgKXNOUixUnHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1ocOA0R1D7aSsSAc025SHHSMasmtJyGVrVapEVkSSSrepniWvsUJquvK+i3ii666CI0Lrvvvvu4cePGjh07evToZ5555r3vfW+PHj022mijCRMmzJkzx33DSNAVAGk3JC0ii07SKrL8YQlN41KKtLcaHBirvI/IYhZpEVl0MrHKlM5tSGsh7WqCycLlQYJJiKAW9JCsBC5pykWKk45ZNdEVtAak3ZDmrQpRMwnLuaRFAK192kh2tDhwGiKofbSViAHnmnKR4qRjVk1oOQ2tarVIi8iSSFb1MsW19ilMVl9X0G8V/eAHP0Djsttuuw0bNuzll18ePnz4qFGjrrzySjQuq622Wr9+/aZPnz5v3jz6WV1dgbQbkhaRRSdpFVl6aK+dBTSNSylySDR6kjIxi7SILDqZWGVK5zaktZB2NcFk4fIgwSREUAt6SFYClzTlIsVJx6ya6ApaA9JuSPNWhaiZhOVc0iKA1j5tJDtaHDgNEdQ+2krEgHNNuUhx0jGrJrSchla1WqRFZEkkq3qZ4lr7FCarryvot4qkcdl1112HDh2K3mX06NETJ0684YYb0Lj07t27b9++48ePnzVrlv7QhSqQdkPSIrLoJK0iyx+W0DQupdhDodCTlIlZpEVk0cnEKlM6tyGthbSrCSYLlwcJJiGCWtBDshK4pCkXKU46ZtVEV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMFl9XaEbl1122eXll19+9dVXJ02aNH369COOOAKNyyabbPLCCy+glcHMggUL7ILaxUmLyKKTtIosf1hC07iUYg+FQk9SJmaRFpFFJxOrTOnchrQW0q4mmCxcHiSYhAhqQQ/JSuCSplykOOmYVRNdQWtA2g1p3qoQNZOwnEtaBNDap41kR4sDpyGC2kdbiRhwrikXKU46ZtWEltPQqlaLtIgsiWRVL1Nca5/CZPW1aVxeHjVq1MSJE6+44opVVlkFk1/72teGDBnSNC4NtbCHQqEnKROzSIvIopOJVaZ0bkNaC2lXE0wWLg8STEIEtaCHZCVwSVMuUpx0zKqJrqA1IO2GNG9ViJpJWM4lLQJo7dNGsqPFgdMQQe2jrUQMONeUixQnHbNqQstpaFWrRVpElkSyqpcprrVPYbL6ukI3LhtuuOGxxx57yCGHbLXVVj169OjZs+d+++333HPPDRs2bMyYMTNnzuzq6rILahcnL
SKLTtIqsvxhCU3jUoo9FAo9SZmYRVpEFp1MrDKlcxvSWki7mmCycHmQYBIiqAU9JCuBS5pykeKkY1ZNdAWtAWk3pHmrQtRMwnIuaRFAa582kh0tDpyGCGofbSViwLmmXKQ46ZhVE1pOQ6taLdIisiSSVb1Mca19CpPV1xW6cQG9evVCv7LuuuvuuOOOl156KVqWV155Zfjw4ZMmTZo9e3bzMy4NGeyhUOhJysQs0iKy6GRilSmd25DWQtrVBJOFy4MEkxBBLeghWQlc0pSLFCcds2qiK2gNSLshzVsVomYSlnNJiwBa+7SR7Ghx4DREUPtoKxEDzjXlIsVJx6ya0HIaWtVqkRaRJZGs6mWKa+2jXff7L7jWzpo1a8qUKZMnT547d271l2BNcyIBCLUh3KzQjctOO+1011133X333U888cSIESNGjx796quvQowbN2769Onz58+Xh06gCqTdkLSILDpJq8jyhyU0jUsp9lAo9CRlYhZpEVl0MrHKlM5tSGsh7WqCycLlQYJJiKAW9JCsBC5pykWKk45ZNdEVtAak3ZDmrQpRMwnLuaRFAK192kh2tDhwGiKofbSViAHnmnKR4qRjVk1oOQ2tarVIi8iSSFb1MsW19tEuLrGzZ8/u06fPpz71qXe+853/8z//s/HGG++8886XXnrpjBkzurq65C+qIak2hJsV/WdcXqgYMmQI+pWRI0eidxk/fjy6FvR8zd9xachTvVtoAQfGKu/312MWaRFZdDKxypTObUhrIe1qgsnC5UGCSYigFvSQrAQuacpFipOOWTXRFbQGpN2Q5q0KUTMJy7mkRQCtfdpIdrQ4cBoiqH20lYgB55pykeKkY1ZNaDkNrWq1SIvIkkhW9TLFtfYh99FHH+3Vq9dqq62G6/FBBx205557rrrqqpjZbbfdJk+ejN5FYmpDuFlB/47LhRdeKI3Liy++KF3L2LFjJ06cOHXq1Ndee026FhvthiqQdkPSIrLoJK0iSw/ttbOApnEpRQ6JRk9SJmaRFpFFJxOrTOnchrQW0q4mmCxcHiSYhAhqQQ/JSuCSplykOOmYVRNdQWtA2g1p3qoQNZOwnEtaBNDap41kR4sDpyGC2kdbiRhwrikXKU46ZtWEltPQqlaLtIgsiWRVL1XcXp2q7/6Y7/eob1tgUoZiyW3//v2//e1vDx48eErF+PHjf/vb36699to9e/a86KKL6Cc2QLWdFfETFzx67hMXdC3Dhw/HYzVjxgw8RPPmzZP/1CIyEnboCoC0G5IWkUUnaRVZ/rCEpnEpxR4KhZ6kTMwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqq4dCegq0K0WGhBBMyLgIUr7ty5c1977bVJkyaNHTt2TMXnP//5Hj16fOxjH5s6dSouzIi5S3K1nRW0cZE/+b/rrru+/PLLo0ePnjZtGloWeRjhqocoXAHELNIisugkrSLLH5bQNC6lyNNFgwNjlfcRWcwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqY4+pJ//etf55xzztlnn41LrJ2tQINyySWXnHvuuUOHDsUQV1yEcfXF/JQpU9C7gIkTJ37729/GFXq33XZDK+N+4FQqVNtZEb9VhEdAPnFB4yK/+Txjxgy0feJqYhVAzCItIotO0iqy9NBeOwtoGpdS5JBo9CRlYhZpEVl0MrHKlM5tSGsh7WqCycLlQYJJiKAW9JCsBC5pykWKk45ZNdEVtAak3ZDmrQpRMwnLuaRFAK192kh2tDhwGiKofbSViAHnmnKR4qRjVk1oOQ2tarVIi8iSSFb1MsVxcZo8efL73//+Xr167b777lOnTpVPBeTS27Nnz49//OOvv/66++4G3Llz586qmD17NqyPfexjuEIfeOCBI0eOREMjv2ckxavtrKCfuAwaNOjGG2+888475XeI0O2h7cO8uCJArAKIWaRFZNFJWkWWPyyhaVxKsYdCoScpE7NIi8iik4lVpnRuQ1oLaVcTTBYuDxJMQgS1oIdkJXBJUy5SnHTMqomuoDUg7YY0b1WImklYziUtAmjt00ayo8WB0xBB7aOtRAw415SLFCcds2pCy2loVatFWkSWRLKqlyqOy6e0KbjKvvWtb0Wbcuyxx86cOXPevHkPP/zwOuuss9FGG7300kvoTubMmSMfpQAswTVY+Otf/7raaqutvPLKv/zlL4cPHz5hwgT5bpHUr7azgv5WEVo9PJLjx48fM2bMpEmT0Ofh4XL9iiNRIWaRFpFFJ2kVWf6whKZxKcUeCoWepEzMIi0ii04mVpnSuQ1pLaRdTTBZuDxIMAkR1IIekpXAJU25SHHSMasmuoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6qOK6j0oigKbnjjjtWXXVVtCBXXXUVrrU77bQT9PXXXz+5Ytq0afp7QMKoUaO23nrrXr16HXPMMcOGDRsxYgSu0E3jIshHU9OnT586dSo6GP1BlCZRIWaRFpFFJ2kVWf6whKZxKcUeCoWepEzMIi0ii04mVpnSuQ1pLaRdTTCZXo6XH85QcsJKJzXBJERQC3pIVgKXNOUixUmD6qxr0FZNpIKvAWk3pHmrQtRMwnIuaRFAa582kh0tDpyGCGofbSViwLmmXKQ46ZhVE1pOQ6taLdIisiSSVb1UcbwE5Larq2vGjBnf/OY3e/bsuc4663zqU5/q0aPH5z73uQkTJowbN8793RG59CIPMWjQoB133BF5xNC1DB8+HH0MLtLz5s2TsqDajm1cFs9d/MiJD6NxEVBBPuwBMhTk4xw948KSd0MI0SIANIFJWSWaKlcll57fBLGE2Iy5QxXukQTVA7x0iBg2hwdtzpw58mO5eqEjUSFmkRaRRSdpFVn+sISmcSnFHgqFnqRMzCItIotOJlaZ0rkNaS2kXU0wmV6OF5u8vKHTSU0wCRHUgh6SlcAlTblIcdIAd0dOQ9qqiVTwNSDthjRvVYiaSVjOJS0CaO3TRrKjxYHTEEHto61EDDjXlIsUJx2zakLLaWhVq0VaRJZEsqqXKS4al1hcX9GjfOxjH5M/Ub/11ls/88wzY8aMGT169MSJE1977TX52VKEcfvQQw9tuummiJ199tljx44dOXLkq6++iv5m5syZiFWFDVLbNi5zFj369UcXTTEvPTmlAOkkcAstL0kIOefgFriwTCIpM26eBG4lowWQUgCZqrA9rWFGxwTMY9JlRIgFjVtzhyrokdRDs40KWWtnPRIVYhZpEVl0klaR5Q9LaBqXUuyhUOhJysQs0iKy6GRilSmd25DWQtrVBJPp5fPnz8d7KXkZp5OaYBIiqAU9JCuB
S5pykeKkwdChQ+UcpK2aSAVfA9JuSPNWhaiZhOVc0iKA1j5tJDtaHDgNEdQ+2krEgHNNuUhx0jGrJrSchla1WqRFZEkkq3q1iuPiitc4GpcPfOADPSr22GOPF198Ee0I+pJp06bp73Q8+OCDG2ywwZprrvnTn/508uTJCIwaNUr+Tgl9Q6TajmtcFj90wkMvPfpS36eeGjBgAE4saHGerMDrETuAUn379sUQAsOXX34Zw379+s2u/jwMuigMX3jhBezn9OnTn3rqKSTRV8FC2wQLM2ibMHz++edhYRPYk1mzZqECrFdeeQUL0YHBwlA2MWTIEAyxFhb2B0swfOmll7AQdxkxWNgEhngcMAQoiPOGuUMV+pGsHmB+YIFpXkKftQiJCjGLtIgsOkmryPKHJTSNSyn2UCj0JGViFmkRWXQyscqUzm1IayHtaoJJWiIvM7yqwdNPP33EEUf88Y9//C9rXE488cQLLrgApz/cR9wvfarNIhV8DUi7Ic1bFaJmEpZzSYsAWvu0kexoceA0RFD7aCsRA8415SLFScesmtByGlrVapEWkSWRrOrVKo5XAdqIk08+uWfPnu95z3vWWGONXr16ffOb3xw2bBgu9vKDpXiZgGeffXbTTTddZZVVrrrqqkmTJqHXka5Fmht5QdmiZkO46W5cZptvFd125W2XXnYZOh60I+CSCpxhUBl9EvRFF10EgZ3B5KWXXnrFFVegMspeffXVcG+++WZodCqwwBNPPIHhQw89VJW5BDuDrf/617+Gxiawz9jDn/zkJxj+5S9/Qc2BAwdefPHFWIiWBVv8wx/+AOuyyy5D14L7eO2112J4++23w0KjA40kWhkM77jjjl122WWttda6/PLLsUVzhyr0I1k9wPzAZklUiFmkRWTRSVpFlj8soWlcSsFzmsCBsar7xWYHcYu0iCw6mVhlSuc2pLWQdjXBpL8EL1S85n/1q1995CMfQe+CFyrAfPmGglrQQ7ISuKQpFylOGuCMee655+6+++54V4fTmfXqIRV8DUi7Ic1bFaJmEpZzSYsAWvu0kexoceA0RFD7aCsRA8415SLFScesmtByGlrVapEWkSWRrOplijt999139+7de4sttnjqqafOO++8lVZaabXVVuvTp89k9V9SRA/xwQ9+sEePHl/5ylfQE8jfoJswYYL8BIy8vQFqQ7jp/jsucxY9cuIjiybb77kIOLHgFguBnGeAWFgjQiZNiepjIUnKvASAaLFcwAkg1TCUGdzihS9DUEUMonErpwVos72uru9+97t4QHbYYYfvfe97mIQl0COphzVJVIhZpEVk0UlaRZYe2mtnAU3jUoocEo2epEzMIi0ii04mVpnSuQ1pLaRdTTCpJ+XJihcq3qZsu+22eOsjr1u8kuGWbyioBT0kK4FLmnKR4qQB7s68efNwMtp6660HDRqEO2jtGkgFXwPSbkjzVoWomYTlXNIigNY+bSQ7Whw4DRHUPtpKxIBzTblIcdIxqya0nIZWtVqkRWRJJKt6meKi0cG/7W1vW3311e+66y70IkOHDv3MZz7Ts2fPDTfcsH///vKJC141V1xxBbqWXr16/eUvf3n22WcHDhz43HPPIfzqq6+iv8GpQ9esBG6WfqvokZPMD+fisi/NgcN1DCKwLZRyQyfMmUj96ImbFwFEa0vCMoRGOyJagKVdQYZ6fs6cOd/4xjcOOeSQCy64AG/nYJk7VEGPpB7WJFEhZpEWkUUnaRVZ/rCEpnEpxR4KhZ6kTMwiLSKLTiZWmdK5DWktpF1NMKknqxfywltvvRVvtv70pz/hNIFJeT1DlG8oqAU9JCuBS5pykeKkgdwjvEfcd999d9xxxxEjRsgZCrc2F0cq+BqQdkOatypEzSQs55IWAbT2aSPZ0eLAaYig9tFWIgaca8pFipOOWTWh5TS0qtUiLSJLIlnVyxTHa3z69Ol77rkn2pSzzjprwoQJo0ePfuWVV/r167fzzjujTdlll11Gjhwp3wY65phjzM+/9OixyiqrrNpN7969cbrYf//95felUVNtCDfqt4pOemTx1MDbfcGOW9/0i9BDX4sAbt5N6hkRggw1Mil7LjNyMnz99ddPPvnkf/7zn3irAy2TVQ2DfiSh9bAmiQoxi7SILDpJq8jyhyU0jUsp8nTU4MBY1fpEBzGLtIgsOplYZUrnNqS1kHY1wSQtwRupjTfe+BOf+ISchuxsRfmGglrQQ7ISuKQpFylOWg8HDBiw7rrr4s0l3lPifET3N4iuoDUg7YY0b1WImklYziUtAmjt00ayo8WB0xBB7aOtRAw415SLFCcds2pCy2loVatFWkSWRLKqlyqOJzwalxtuuGH77bfHq2D8+PFjx45F44Jb6H/84x/vete7tt5664svvli+GXTmmWdiuMUWW2zWDTTYcsstDzvsMFzXXe8i9auvb9Y/+Y8TAu716aef/tBDD1XtykKcJUaNGmXtCqqshzVJVIhZpEVk0UlaRZYe2mtnAU3jUoocEo2epEzMIi0ii0viqeBuHf7QzYjWG/I3mnY1waQT2BDOOwcccADeTt1zzz14oWJGLIGKV7vWEtA4S28oqAU9FC31BRm6SRPqRg/FrWpzNYEsnGQPPfTQlVde+aabbsIZXM65aXQFrQFpN6R5q0LUTMJyLmkRQGufNpIdLQ6chghqH20lYsC5plykOOmYVRNaTkOrWi3SIrIkklW9VHG8WPC0F9x/OlF+0hZD3EKPHDkSrQwsXLbnzJmDScwMHjz4xRdffPnll0eMGCGNzoQJE6ZMmYKM/HSI1K+2Yz9xebP8AbrqFGJAH3bKKac88sgj0rUAvM+54IIL9FmCKuthTRIVYhZpEVl0klaR5Q9LaBqXUuyhUOhJysQs0iKyuCSe/Xje49a+ONS3eyFk6HDzekP+RtOuJpiEkM3h5PXHP/4RXQveP82ePVtcjSxxu408brGH4gLZW3FFY1JvKKgFPRTtNgRES03cQksSQgKCWFVtriaQhTzuco8ePfDeUX5D0hpxdAWtAWk3pHmrQtRMwnIuaRFAa582kh0tDpyGCGofbSViwLmmXKQ46ZhVE1pOQ6taLdIisiSSVb18cbxw0G2g55D/dCJaEzQomJlb/eHXiRMnSlMif/h/xowZ48aNe+WVV9C14Hb48OHoYwB6F7yC0O7gzY97eVbbeZM1LnKSwaNx3HHHya8UmdLVKmlc3L0DMi+42DKRqBCzSIvIopO0iix/WELTuJRiD4VCT1ImZpEWkcUlcbG//vrrr624ppurr776uuuuQ6+A1wzODtDWuOaaO+64Q145shz4G027mmASQi7/8+bN+8hHPtKzZ88jjjhCvzgdsgRJAXuLE5m+2GMV7iDugvshOEzqDQW1oIdOy44BCIAzqfsGlgSgsS156ARYVe1ANUAWluOE+7a3ve0tb3nLr371KwytEUdX0BqQdkOatypEzSQs55IWAbT2aSPZ0eLAaYig9tFWIgaca8pFipOOWTWh5TS0qtUiLSJLIlnVyxSHxksGLxy8uPDKBRB4FcurCRq9yMy
ZM2Uerz685NHNoL9BNzN+/HjcorMB6Fowj1ci1trSpjhu3nyfuOC+HH/88c8++yzuCzClq1W413379nUnHyDzgostE4kKMYu0iCw6SavI8oclNI1LKfZQKPQkZWIWaRFZXBLnghtvvBFXSrzLx8VS2HzzzR944AFcgBFA9/DXv/71Qx/6EALvfe9777vvPlxN9YZEJ148VkVwC13SFcT7iZVWWgmNCzonXd8hScR+/etfn3jiiXvvvfdnP/tZnNcQlvyoUaOOPfbYbbbZBre4I9IH6A0FtaCHoqUsToUPPvjg5ZdfjnZql112+eEPf4hTiVjgscce22effd75zndil7A5PLyYrGpzNYEshPGwf/jDH8ZR2H333XFStkYcXUFrQNoNad6qEDWTsJxLWgTQ2qeNZEeLA6chgtpHW4kYcK4pFylOOmbVhJbT0KpWi7SILIlkVS9TXDReCHhZ4bWAW3kRCXhNYRL9ChALt9BoYvDCnFUBAfDawUteXptSGVS13xyNi9xfiClTppx00kn9+/fHfcfdkdMvgAstyBKgq0nMDmqTqBCzSIvIopO0iix/WELTuJQiT00NDoxV3S82O4hbpEVk0Uk873/605+iP3C9y8EHH4xJWFVt8zHAueeeix7i4YcfdvPVUoPWQtolUJBqQsjMeeedh53p1asXugGxCFlyyy23oHv4+Mc/jiT40Y9+hNMZzmUjRox43/vehxnctbXWWmvGjBlyBtQbCmpBD53GjuHN3DXXXIMHZPvtt8cjtvbaa+OcgsrgL3/5y/rrry+P5P777y9nGVkerAbIkvxXvvIV3PE11ljjhRdekPkEuoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6muNY+wSReNfIC1MikBNSGcPPm+OFc7DxOIDhrHX/88dK1uKQpXemRI0finaRMCi4DXGyZSFSIWaRFZNFJWkWWHtprZwFN41KKHBKNnqRMzCItIotLypUSL4/3vOc90rjgdosttsCbFYkBZI488siDDjpIwjJfrTZoDSRgnmIVWvugIKjWLa0jAp3Hbrvthv1BzzFs2DCxCElKHezw//3f/2Hn0U9A423KXnvt9YMf/OD3v//9N77xDbQamMS5wK0SEdSCHoqWDeFW3sYNHz5cPqn6+te/jsrort797nffcccdv/jFL4477rhHHnnEvduranM1gSzZxCWXXIKy4Gc/+5k14ugKWgPSbkjzVoWomYTlXNIigNY+bSQ7Whw4DRHUPtpKxIBzTblIcdIxqya0nIZWtVqkRWRJJKt6meJa+xQmq69vjk9ccEqZNm3ascce+9xzz+GcgJOJS5rSlW5+OLcNmsalFHsoFHqSMjGLtIgsLilPetxed9116BKkcQHSyEtm6tSp22yzzV//+ldJ4lZvSLSER40ade211/70pz+VH5oBWhN9+vRBd4KC7upuylUCM5MmTdpkk02wP5tvvjl2QCxClkgF3P79739feeWVsfOPP/74KaecctVVV6E+Xv8uANwqEUEt6KFoVwSgLDjttNOwh1tuueW4ceP2339/eWMkMbiyBFS1uZpAluRvu+02OQpf/vKX3aQEfHQFrQFpN6R5q0LUTMJyLmkRQGufNpIdLQ6chghqH20lYsC5plykOOmYVRNaTkOrWi3SIrIkklW9THGtfQqT1dc3dOMiL3YwefLkE0444fnnn5dTTbBxkf9iAMIyD1wGuNgykagQs0iLyKKTtIosf1hC07iUIk9QDQ6MVd3PRTuIW6RFZNFJ0RMmTJCfCQW4ZH7+8593XcUdd9yx8847v/7668ENaX3//fdLhTpsuummc+fOtStVHRGDBg1aZZVVEENbMGPGDLEI2o2JEye+/e1vx5LPfvazRxxxxJw5c6znJZ0IakEPyXKgmevVqxceLmzx+uuvx8lFHjFQ1QsXJ+1bDz74IGrijuy+++4yn0BX0BqQdkOatypEzSQs55IWAbT2aSPZ0eLAaYig9tFWIgaca8pFipOOWTWh5TS0qtUiLSJLIlnVyxTX2qcwWX19Q3+rCCcQgFMZupYB1X+X0RqhDeH8OX369GAGuNgykagQs0iLyKKTtIosPbTXzgKaxqUUOSQaPUmZmEVaRBadFI2L7te//nW5XoLVV1/92WeflWb/4x//+E9+8hOI4IacxrOqb9++O3Xznhz77rsvXnjyqtN1IDDz8MMPy55sv/32r732mlgE7Qb2cP/998dd2GCDDV588UUMreclnQhqQQ/JcowZM2ajjTbCTu61116zqj8ZZ41kcdJ6KDz55JNyIN797nfPmzdPHp8YuoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6muNY+hcnq6xv9E5fx48cfc8wxL7zwAs4nOC1bw9sQkguqXxHHKVTmgcsAidlBbRIVYhZpEVl0klaR5Q9LaBqXUuyhUOhJysQs0iKy6KRoPPvdhxwAF0750Y3Bgwdvvvnmo0aNQiC4IdF4FTlkXtBJAsVRE8hQF0cR+dQB7LrrrvrVq6HdQKmLL74YO7/ZZpvhLUunGxfsJHZs9913l1/YlrtjvWRx0npYPX6Ln3rqKdTEHdl0003xjgqVMWkTHrqC1oC0G9K8VSFqJmE5l7QIoLVPG8mOFgdOQwS1j7YSMeBcUy5SnHTMqgktp6FVrRZpEVkSyapeprjWPoXJ6mtp4yIvUiCvd0paL/dmw6pujbw5FS5aNG7cuKOPPnr48OHyqtd11L0wQDz33HM//vGPgxngYstEokLMIi0ii07SKrL8YQlN41KKPC81ODBWdT8X7SBukRaRRSdFy8vmM5/5TNW3GDbZZJPRo0efffbZxx13nHwTJLghrefNm4clY8aMwa2gNYGXKGpildxS8Yceegj7gMZl++23f/3118UiaDdQ584778SSlVdeGdd+a1RQ0omgFvSQLAdOLkcddRSajHe9612vvfaatEpimXKR4qT1UMDOyyFA4xL7NplDV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMFl9XQ7fKsKpRoB2SRnKbRp/33DqAGPHjj3mmGOGDh0qFvCTIkQPGDDgwgsv1Ft0GeBiy0SiQswiLSKLTtIqsvTQXjsLaBqXUuSQaPQkZWIWaRFZdFK0PC3uueceXPXlqonr8fnnn7/tttvK3zjCyyO4IaeRQbexbm3e/e53x75VhFtsVD512GqrraZPny4WQbsxZ86cgw8+eNVVV0XvcvXVV6Oy9bykE0Et6CFZAnZ72LBh73znO3v16tW7d++XX34Z7Z1LmnKR4qT1UB4K3Hc5BDvttBNqyoMvAR9dQWtA2g1p3qoQNZOwnEtaBNDap41kR4sDpyGC2kdbiRhwrikXKU46ZtWEltPQqlaLtIgsiWRVL1Nca5/CZPW1/U9c8EoUhg8fft999/3rX/9Ct4GhuH
Iq69ev3/333z9y5MgqaBBX4++bdC3HHnvs4MGDqx7GflocuRcGCJx/br31Vr0JlwEutkwkKsQs0iKy6CStIssfltA0LqXI01qDA2NV93PRDuIWaRFZdFLr119//T3dvxcN1lhjjb333nu++pvZEtNLnMbr9oEHHlhppZXQ+uBW0JrYfPPNYz+ci1KjRo1aa621sA9bbLHFtGnTxCL0pnF7xx13nHjiibvuuiv2/4tf/KJMyq1Oor9xk25eayABh7YEBNBSfO1rX7vqqqvWXHNN7CfOILJKr4UGuJszZsyQ7QJdrdosF3/kkUfkEOy77752Ko6uoDUg7YY0b1WImklYziUtAmjt00ayo8WB0xBB7aOtRAw415SLFCcds2pCy2loVatFWkSWRLKqlymutU9hsvq6HD5xefTRR3EG+5//+Z8xY8agyZBJvLRnzpy53Xbb4W3MU089taD7r3WLq/GLS9fyyiuvUD5yLwwQ8kG4XuIywMWWiUSFmEVaRBadpFVk6aG9dhbQNC6lyCHR6EnKxCzSIrLoJK362c9+1rP7j9H16tWrT58+Mm82E9qQ03hW4fKMdyFPPvkkbgWtCfm7bfKq03UgMIMW6l3vehf2YcMNNxw9erRYhCyRl+6kSZP2339/vNH59re/jZ3fdtttZ1d/dx+vbRFg+vTpV155JVoxdEKyXCoI2KjE5O91oqY11L45EMP7qiOOOGLq1KnYT2zxhBNOkD1Bnyc/UQsNcdNNN+24447rrbceYj/60Y/QxNgSFf4+gNtvv10e/5NPPtkacXQFrQFpN6R5q0LUTMJyLmkRQGufNpIdLQ6chghqH20lYsC5plykOOmYVRNaTkOrWi3SIrIkklW9THGtfQqT1dfl8MO5OEXssMMO6F2uv/56nKkwxAkBr/e77roLJ89ddtlF/oCvfsunccWxBAtxfjvmmGOka5F5R+ReGFB5ypQpgwcP1qtcBkjMDmqTqBCzSIvIopO0iix/WELTuJRiD4VCT1ImZpEWkUUnadXEiRPf8Y534KqJa+cWW2zhfqPHbCa0IdHy+jRX3W6BlxNwwgevWOkPZIkuLqsOO+ww7MYqq6wyYMAAsQhZgiROHF/72tduu+026N///vc4d6yxxhrPPfecnBTOPvtszA8ZMuTiiy/edNNN5ferZblUANgimowHHngAvQU6KuyY7JXgYg48Snvuuaf8DN1Xv/pVPFbve9/75D3WPffcc+edd0KAW265ZaeddjrvvPPOOussvBXDme7cc8+lyro4LHDZZZfJ44/7Yo04uoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6muNY+hcnq63JoXPDSvuSSS/Ai3XXXXSdPnozeRU5lBxxwAN7s4b0f3kfhbdKc1v90msMVx5Jhw4Z95StfGTFiBJIYyrwjci8MEPIH6Oh8YpWKLROJCjGLtIgsOkmryPKHJTSNSylyldLgwFjV/Vy0g7hFWkQWnaRVePF861vfwlUTl3+55Mu82UxoQ1oLaVcTTELIRn/zm99gN+T6LZYGb2iuu+66e++9Fw3HjTfeeOyxx8r7m1GjRq2++upYddVVV+H0gfl+/fpJTXDooYeiG5s5cyaSmARVMaNRZNVVV8V5Z8MNN7z77rtlXoNTzI9//OOXXnoJ56mjjz4aTYmca379619jc2uvvfbAgQOxLVjyxxWwP2inxowZIxvCO6rNNttsk002mTBhgq3Yug8CFqICCm6wwQavvvqqnY2jK2gNSLshzVsVomYSlnNJiwBa+7SR7Ghx4DREUPtoKxEDzjXlIsVJx6ya0HIaWtVqkRaRJZGs6mWKa+1TmKy+Lp8fzn3hhRfwkscZ49FHH502bdrcuXNxfsAQb41efPHF8ePHT5kyRRoXu6YVVAA4J+BtD95c2VmPyL0wQDzzzDPf//739SZcBrjYMpGoELNIi8iik7SKLD20184CmsalFDkkGj1JmZhFWkQWnaRVeBkMGjQIr8D1119/yJAheHXJvNlMaENaC2lXE0xCyHMUb2U22mgj9E9nnXUWhuI6fve736200kprrLHGxz/+8T322AM9CjJyOvjQhz4E661vfevOO+9800034R654kcdddTmm28uHyNh0s1j7eGHH45tAfQu733ve9EPiQVQE4EjjjgC7tvf/va99977lFNOQV8ijcvLL7+MhwvWDjvsgHdgOKNhEu/AcOZ66qmnsFw2hMkTTzxxtdVWe/7551ENOKvaiAGT2O773/9+NC6f+9znFlT/ncs0uoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6muNY+hcnq63L4xAWvUzQl++23H16nJ598Mt6K4JRyzjnn4CRw3HHHoREB0s3gVW/XKHDeAPJZC5LBjBC5FwYIbEL+gwAyD1wGuNgykagQs0iLyKKTtIosf1hC07iUYg+FQk9SJmaRFpFFJ2kVXgbgwAMP/MIXviAvMJk3mwltSGsh7WqCSRE4NWDTp5566luqP+8ml3lN//790dasssoq2FX5oBWTiIHbbrttzTXXhAuBaz9mXPGvfe1r8okLNCbdPLjhhhtWX311FNxqq63WXXdd1LRGBer/+Mc/XnnllfE2C2eoWbNmybeT8FjBwtsmLETDhKYE85iReWislQ1h5txzz91ss82mTp1a7WagcQHY7gYbbIDm6R//+AeW2Nk4uoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6muNY+hcnq63JoXADet9x6661oXHC6QAsyatSo7bbbDieBv/3tb+hFxo8fP2PGjNjPuODMgCVHH330mDFj5HQRjIHIvTBA4PwgyDxwGeBiy0SiQswiLSKLTtIqsvxhCU3jUkp18WoBB8aq7uexHcQt0iKy6GRw1ciRI8eOHWtK5zaktZB2NcGknhw+fDj6D/QKOC/YKcWkSZNeffVVeeXThrDzU6ZMgcCrGrfOPf7444PfKgJIDhkyBFu855571l9/fZxTrNENAtjcxIkT3VCENChYOK/6mVzgLCewIbRQn/nMZ0466SRM6nnahz59+uBsiDdzrloaXUFrQNoNad6qEDWTsJxLWgTQ2qeNZEeLA6chgtpHW4kYcK4pFylOOmbVhJbT0KpWi7SILIlkVS9TXGufwmT1dfn8yX+86tGg4H1Iz549b7755ptuuglijz32wCkCJ0+cJeR3AuyCVuSzFvmWcVv3wgCBNzm/+93v3MkEuAxwsWUiUSFmkRaRRSdpFVl6aK+dBTSNSylySDR6kjIxi7SILDqZWGVK5zaktZB2NcGknsQz9eKLL+7Vq9cll1yC16c0KI42NiSNi/+tIuBe/9ddd90HP/hB/a0iHUvjklVto2W3cYuuaJtttkEHJgHBxQRs9GMf+9i666779NNPY0+wyhpxdAWtAWk3pHmrQtRMwnIuaRFAa582kh0tD
pyGCGofbSViwLmmXKQ46ZhVE1pOQ6taLdIisiSSVb1Mca19CpPV1+XziQte1HPnzsX5BP3KgQceuM8+++D9xlVXXYWuBe955PtEeAnbdIWcCl588UWscn9+s617YYAYMGDA9773PX2WcBngYstEokLMIi0ii07SKrL8YQlN41KKPRQKPUmZmEVaRBadTKwypXMb0lpIu5pgUk/ihY0m41Of+tTGG288fvx4OhG0saFY44LKAKeA6dOn/3//3/+H9zF6Wy6WxSWr2kajDs5Wc+bMOfbYY2+77Ta/95KY8Pe//3311Ve/7LLLEJP9sUYcXUFrQNoNad6qEDWTsJxLWgTQ2qeNZEeLA6chgtpHW4kYcK4pFylOOmbVhJbT0KpWi7SILIlkVS9TXGufwmT1dbk1LgsWLPjnP/+58sor9+7de4011thwww0HDRr06quv4kwlv2dko91gyfPPP49TEALQ5Y3LwIEDf/zjHzeNyzLRNC6l2EOh0JOUiVmkRWTRycQqUzq3Ia2FtKsJJvUkXpN4hY8aNerd7373V77yFfkbbtIK0Gu+5obkZ1yCn7ig5uTJkz/3uc+dc8458m0aa+SKa1yyqm00dhgnuJ///OcXXHABzmXSjuBWzmsSw6YBTnbomb785S/LezWhKpZCKvgakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMFl9XQ6Ni7h4CeNMstNOO/WoOPLII4cMGTJy5EicRuRMJWEIgBc4uha8e4ELjbW4xTxe4xCxV3rkXhggUARnKiyXeeAywMWWiUSFmEVaRBadpFVk+cMSmsalFHsoFHqSMjGLtIgsOplYZUrnNqS1kHY1waSelFc1Xp+vvPLKhz/84YsuughazgKw2thQ4hOX22+//fOf//w999wjJxQgFnCxLC5Z1TYa1R544IHzzz9//vz5sufu1sWgsUuHHXbYUUcdNXPmTLhVjVpIBV8D0m5I81aFqJmE5VzSIoDWPm0kO1ocOA0R1D7aSsSAc025SHHSMasmtJyGVrVapEVkSSSrepniWvsUJquvy61xwcsWL+rLLrsMXctqq6125513Dh06FO+ypk+frn8sFzGctfr37/+xj33skEMO+cIXvnDEEUd88YtffPzxx+VUkHi9R+6FAQJdi/y3zGQeuAxwsWUiUSFmkRaRRSdpFVn+sISmcSkFTzgCB8aq7ueiHcQt0iKy6GRilSmd25DWQtrVBJN6Ei9seW3jdsKECV/72te++93v4tKOIdw2NnTccccFfzgXYvjw4fK7QgAzsgnBxbLogqL/9a9/nXXWWahs7kl1XwYMGLBgwQJ9asO7tGOOOebiiy+W909i1UQq+BqQdkOatypEzSQs55IWAbT2aSPZ0eLAaYig9tFWIgaca8pFipOOWTWh5TS0qtUiLSJLIlnVyxTX2qcwWX1dPj+cKwIv5GnTpg0cOBAv8xdffBHvr/wfy8Wr/umnn95vv/3WX3/90047Da0G3qX88Ic/3HDDDe+//3682OWEEyRyLwwQgwYNQtskk4LLABdbJhIVYhZpEVl0klaRpYf22llA07iUIodEoycpE7NIi8iik4lVpnRuQ1oLaVcTTNIS+4StnsE4Tfzxj3+8/fbb5bxQf0PSNIAjjzxyk002mTp1KipIEQnI8mo7FpkX0sU1LokKqP/YY4995CMfueGGG26puPnmm9F4oY/BnsgmkAfnnntu3759cf6ShWLVRCr4GpB2Q5q3KkTNJCznkhYBtPZpI9nR4sBpiKD20VYiBpxrykWKk45ZNaHlNLSq1SItIksiWdXLFNfapzBZfV1un7gAvFTxZmP69Oljx47F249x48ahL8GMvMBxBoDo378/3nHhPLDRRhuNGjXq9Yrx48dvtdVW73nPeySPpBQkIvfCgPrND+e2QdO4lGIPhUJPUiZmkRaRRScTq0zp3Ia0FtKuJphMLzftRnfDkU5qsARnCnQ88rcW8Dbl2WefxfnCtQhYThX0kKwELomzCbqWDTbYoGf137gWoLH1Rx55hBoXDOVdl1teH6nga0DaDWneqhA1k7CcS1oE0NqnjWRHiwOnIYLaR1uJGHCuKRcpTjpm1YSW09CqVou0iCyJZFUvU1xrn8Jk9XV5Ni4AL945c+bMnDlz2rRpuJ07d677yBZi4MCB6Fr+9a9/rb322vJfScO7Jtyid/nCF76w0kor/f3vf3/ttddif6ouci8M2ARaJbyRcycx4DJAYnZQm0SFmEVaRBadpFVk+cMSmsalFDzhCBwYq7qfi3YQt0iLyKKTiVWmdG5DWgtpVxNMppfjXIBXOIBOJ4n58+dPmjQJb4xwK3/pEt2M1AFYThX0kKwELon9xCYmTpyIhgm3ABsVjT1xmzZbrZZU98mcuarVy4CrQBqQdkOatypEzSQs55IWAbT2aSPZ0eLAaYig9tFWIgaca8pFipOOWTWh5TS0qtUiLSJLIlnVyxTX2qcwWX1dnt8qAtB45S7oxr2twu1TTz113HHHzZo16xe/+EWvXr2OOeaYKVOm4DyAxmX27Nlf//rX8Wbme9/7nvuvHVX1WqANOSEa23W3gssAF1smEhViFmkRWXSSVpGlh/baWUDTuJQih0SjJykTs0iLyKKTiVWmdG5DWgtpVxNMppfjuWtVLhlEnvrVS2BpHYBJqqCHZCXIJmW7butmq21tyKEraA1IuyHNWxWiZhKWc0mLAFr7tJHsaHHgNERQ+2grEQPONeUixUnHrJrQchpa1WqRFpElkazqZYpr7VOYrL4u509coKvTia1j3n9UPProoyeffPL06dNhnXLKKehRzjjjDLxxkm8nzZ07F0NMyt/PRSsj32CSIo7IvTBATJ06tX///m7TwGWAiy0TiQoxi7SILDpJq8jyhyU0jUsp9lAo9CRlYhZpEVl0MrHKlM5tSGsh7WqCycLlQYJJiKAW9JCsBC5pykWKk45ZNdEVtAak3ZDmrQpRMwnLuaRFAK192kh2tDhwGiKofbSViAHnmnKR4qRjVk1oOQ2tarVIi8iSSFb1MsW19ilMVl+Xf+PiBEAb0dXV9fDDD6NZkf/YKmaOPPLIHj167LLLLqdVnH322eeee+6uu+6KxuUTn/iE/D1u/RvUDlccOF1tx+hnn332+9//ftO4LBNN41KKPRQKPUmZmEVaRBadTKwypXMb0lpIu5pgsnB5kGASIqgFPSQrgUuacpHipGNWTXQFrQFpN6R5q0LUTMJyLmkRQGufNpIdLQ6chghqH20lYsC5plykOOmYVRNaTkOrWi3SIrIkklW9THGtfQqT1deONC6ue4B47LHHvvGNb8yaNUu+c4R25KijjkKPcuCBB15zzTXXXXfdr371q5tuumm//fbD5Ec/+tFhw4aNHz8eeYSliCNyLwwQ8sO5epXLABdbJhIVYhZpEVl0klaR5Q9LaBqXUuyhUOhJysQs0iKy6GRilSmd25DWQtrVBJOFy4MEkxBBLeghWQlc0pSLFCcds2qiK2gNSLsh
zVsVomYSlnNJiwBa+7SR7Ghx4DREUPtoKxEDzjXlIsVJx6ya0HIaWtVqkRaRJZGs6mWKa+1TmKy+dqpx6ar4xz/+8c1vflP/vX+Io48+Gj3KKaec8vLLL8uf1p0+ffqJJ56IyUMOOQSNy9ixY+XHXGSJI3IvDBDTpk3DWv05jcsAF1smEhViFmkRWXSSVpHlD0toGpdS7KFQ6EnKxCzSIrLoZGKVKZ3bkNZC2tUEk4XLgwSTEEEt6CFZCVzSlIsUJx2zaqIraA1IuyHNWxWiZhKWc0mLAFr7tJHsaHHgNERQ+2grEQPONeUixUnHrJrQchpa1WqRFpElkazqZYpr7VOYrL52sHF54IEHvvOd78gP/uvG5bTTTtONy8SJE2fMmHHCCSdgEreYHDdu3LI2Lii7YMECdC1N47JMNI1LKfZQKPQkZWIWaRFZdDKxypTObUhrIe1qgsnC5UGCSYigFvSQrAQuacpFipOOWTXRFbQGpN2Q5q0KUTMJy7mkRQCtfdpIdrQ4cBoiqH20lYgB55pykeKkY1ZNaDkNrWq1SIvIkkhW9TLFtfYpTFZfl2fjIt0JbsFDDz10+umno/9A10KNy4033tijR4/DDjts+PDhY8aMmTp1Kpqbww8/HJOXX375K6+80sYnLig7dOjQm266yW0IuAyQmB3UJlEhZpEWkUUnaRVZ/rCEpnEpxR4KhZ6kTMwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqa41j6Fyerr8mxc5AMPcP/995966qnuO0QIuAxmXnjhhdVWW2233XZD4zJu3LiZM2fOmTPnIx/5yKqrrtq3b9+RI0dOmDBB/t6uLHG4IsDpqrbRzzzzzPe//31sXeaBywAXWyYSFWIWaRFZdJJWkeUPS2gal1LwhCZwYKxS3boQs0iLyKKTiVWmdG5DWgtpVxNMFi4PEkxCBLWgh2QlcElTLlKcdMyqia6gNSDthjRvVYiaSVjOJS0CaO3TRrKjxYHTEEHto61EDDjXlIsUJx2zakLLaWhVq0VaRJZEsqqXKa61T2Gy+rqc/44L+oa77777zDPPlD/ejyEmEdCZefPmffrTn15vvfX69esnP+AyatSoDTfc8MADDxwzZgz0lClT5G/Q2QXd6CJOV7WNRj90zTXXyKTgMsDFlolEhZhFWkQWnaRVZOmhvXYW0DQupcgh0ehJysQs0iKy6GRilSmd25DWQtrVBJOFy4MEkxBBLeghWQlc0pSLFCcds2qiK2gNSLshzVsVomYSlnNJiwBa+7SR7Ghx4DREUPtoKxEDzjXlIsVJx6ya0HIaWtVqkRaRJZGs6mWKa+1TmKy+LodPXIBcR9Fq3HXXXeeff760HQsWLMAkXL0cM2hoBg8evPXWW3/+858fNGjQsGHDjj766G222eaZZ54ZO3YsWpkZM2ZgrV2giNwLAwTKynZlHrgMcLFlIlEhZpEWkUUnaRVZ/rCEpnEpxR4KhZ6kTMwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqa41j6FyerrcmhcpGVB3/DHP/7x3HPPnTNnjvsuj2lnKmQoYDh//vwhQ4aceeaZX/3qV7/yla9897vffemll+QP6eJ29uzZ/g+4gMi9MKAmtit/KkbmgcsAidlBbRIVYhZpEVl0klaR5Q9LaBqXUqqndws4MFZ5H5HFLNIisuhkYpUpnduQ1kLa1QSThcuDBJMQQS3oIVkJXNKUixQnHbNqoitoDUi7Ic1bFaJmEpZzSYsAWvu0kexoceA0RFD7aCsRA8415SLFScesmtByGlrVapEWkSWRrOplimvtU5isvi6fbxWhU0HXsuOOO37nO9+5+uqroQcMGDBx4kTMo5OQH87VYGbu3LkzZ85EmzKt+q8azZgxw/13i9DW2FwrkXthgHjmmWd+8IMfyKTgMsDFlolEhZhFWkQWnaRVZOmhvXYW0DQupcgh0ehJysQs0iKy6GRilSmd25DWQtrVBJOFy4MEkxBBLeghWQlc0pSLFCcds2qiK2gNSLshzVsVomYSlnNJiwBa+7SR7Ghx4DREUPtoKxEDzjXlIsVJx6ya0HIaWtVqkRaRJZGs6mWKa+1TmKy+tv+Ji1w7Rfz+97+/4IILZs2aNX78+H79+t11111XXHHFySeffMQRRxx77LGw7rvvvgkTJsjHIbIQvcucCnQqs2fPfu2113AL0LXoT000kXthgGj+69Bt0DQupdhDodCTlIlZpEVk0cnEKlM6tyGthbSrCSYLlwcJJiGCWtBDshK4pCkXKU46ZtVEV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMFl9LWpc0HyA3/72txdeeKH8DpFLVs3JYkzOmDHj2WefvfHGG0855ZRvfvObffr0mT59OlbJJzELFixApzKvAkL+FotU8IncCwPE2LFjH3jggaZxWSaaxqUUeygUepIyMYu0iCw6mVhlSuc2pLWQdjXBZOHyIMEkRFALekhWApc05SLFScesmugKWgPSbkjzVoWomYTlXNIigNY+bSQ7Whw4DRHUPtpKxIBzTblIcdIxqya0nIZWtVqkRWRJJKt6meJa+xQmq69FjUtXV9fNN9+MrgVth/QiOim9CyZFIDBx4sRbbrnly1/+8h133IElsAjE7OIQkXthgJAdADIPXAa42DKRqBCzSIvIopO0iix/WELTuJRiD4VCT1ImZpEWkUUnE6tM6dyGtBbSriaYLFweJJiECGpBD8lK4JKmXKQ46ZhVE11Ba0DaDWneqhA1k7CcS1oE0NqnjWRHiwOnIYLaR1uJGHCuKRcpTjpm1YSW09CqVou0iCyJZFUvU1xrn8Jk9bWdxkX6A3DjjTdedtllaF+gpTtxSWlB5FYmK98wa9asq6666owzzpDexc5WuCVBIvfCgFUTJkx45JFHUFDmgcsAidlBbRIVYhZpEVl0klaR5Q9LaBqXUuyhUOhJysQs0iKy6GRilSmd25DWQtrVBJOFy4MEkxBBLeghWQlc0pSLFCcds2qiK2gNSLshzVsVomYSlnNJiwBa+7SR7Ghx4DREUPtoKxEDzjXlIsVJx6ya0HIaWtVqkRaRJZGs6mWKa+1TmKy+LnPjgs5gYcUvf/nLH/3oR/KDtNZrKW7wNcJS4Sc/+UmfPn0wDC4Pol2nTemqcXnmmWcuvPDCWDWJ2UFtEhViFmkRWXSSVpHlD0toGpdS7KFQ6EnKxCzSIrLoZGKVKZ3bkNZC2tUEk4XLgwSTEEEt6CFZCVzSlIsUJx2zaqIraA1IuyHNWxWiZhKWc0mLAFr7tJHsaHHgNERQ+2grEQPONeUixUn
HrJrQchpa1WqRFpElkazqZYpr7VOYrL4uc+OCzmDBggW/+MUv0Hmga5GPW6zXUtzgayyXxuX1118//vjj6e+1uFgQ7TptSleNS//+/Zsfzl1WmsalFHsoFHqSMjGLtIgsOplYZUrnNqS1kHY1wWTh8iDBJERQC3pIVgKXNOUixUnHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1ocOA0R1D7aSsSAc025SHHSMasmtJyGVrVapEVkSSSrepniWvsUJquvy9C4oCdAf4Ce41e/+hW6FggMgWQEVdzga1kC0LKceOKJs2bNknnBxYJo12lTutIo5X5xSXAZ4GLLRKJCzCItIotO0iqy/GEJTeNSin06K3BgrOp+bdhB3CItIotOJlaZ0rkNaS2kXU0wWbg8SDAJEdSCHpKVwCVNuUhx0jGrJrqC1oC0G9K8VSFqJmE5l7QIoLVPG8mOFgdOQwS1j7YSMeBcUy5SnHTMqgktp6FVrRZpEVkSyapeprjWPoXJ6usy/B0XdCpdXV033HDDVVddJb+0rF1BFTf42jF37txDDz103rx52vJjmmASQjT2R/5sncwD0npYk0SFmEVaRBadpFVk6aG9dhbQNC6lyCHR6EnKxCzSIrLoZGKVKZ3bkNZC2tUEk4XLgwSTEEEt6CFZCVzSlIsUJx2zaqIraA1IuyHNWxWiZhKWc0mLAFr7tJHsaHHgNERQ+2grEQPONeUixUnHrJrQchpa1WqRFpElkazqZYpr7VOYrL7W/cQFl0m0BVdfffW1114rXQtwrgigiht87Rg+fPg555yDIqhsp1pL+WjXaVO60kOGDLnuuuvcXgGXAS62TCQqxCzSIrLoJK0iyx+W0DQupdhDodCTlIlZpEVk0cnEKlM6tyGthbSrCSYLlwcJJiGCWtBDshK4pCkXKU46ZtVEV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMFl9zTcu0luga7niiiuuv/569x0iIIFIcYOvARai5o033njXXXdBaEtrn2ASQvSAAQPOP//8pnFZJprGpRR5JWhwYKzyPiKLWaRFZNHJxCpTOrchrYW0qwkmC5cHCSYhglrQQ7ISuKQpFylOOmbVRFfQGpB2Q5q3KkTNJCznkhYBtPZpI9nR4sBpiKD20VYiBpxrykWKk45ZNaHlNLSq1SItIksiWdXLFNfapzBZfc1/qwidyvz583/yk5/84he/kD8QZ41uIsUNvhZQ57DDDps8eTK0tihGBJMQoocMGfLTn/5U757LABdbJhIVYhZpEVl0klaRpYf22llA07iUIodEoycpE7NIi8iik4lVpnRuQ1oLaVcTTBYuDxJMQgS1oIdkJXBJUy5SnHTMqomuoDUg7YY0b1WImklYziUtAmjt00ayo8WB0xBB7aOtRAw415SLFCcds2pCy2loVatFWkSWRLKqlymutU9hsvqa/8QFTcYPf/jDPn36dHV1oYkB1ugmUtzga7niDh48+Bvf+AZKQcu84GJBtOu0KV1ptCzYw+YTl2WiaVxKsYdCoScpE7NIi8iik4lVpnRuQ1oLaVcTTBYuDxJMQgS1oIdkJXBJUy5SnHTMqomuoDUg7YY0b1WImklYziUtAmjt00ayo8WB0xBB7aOtRAw415SLFCcds2pCy2loVatFWkSWRLKqlymutU9hsvoabVzQVUgr8OMf//jXv/619CvUZwiR4gZfowLKXnDBBc8884x8OuIsoLVPMAkh+vXXXx87dmzTuCwTTeNSij0UCj1JmZhFWkQWnUysMqVzG9JaSLuaYLJweZBgEiKoBT0kK4FLmnKR4qRjVk10Ba0BaTekeatC1EzCci5pEUBrnzaSHS0OnIYIah9tJWLAuaZcpDjpmFUTWk5Dq1ot0iKyJJJVvUxxrX0Kk9XXcOOCfmJB9V8RQtdy0003SRMjlk+kuMHXqDxmzJijjz4aLZHMOAto7RNMQgCUHTBgQPMH6JaVpnEpxR4KhZ6kTMwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqa41j71kriQL648I/DF5f6NLqK7cVmMxuXEh7sbF5NE43LRRRfdeuutaF/QuMgnLlLBlRLRXdygiou2gX+brZkRGotLLrnkL/f8xXUY3Ut0KbG0MHQnwdJkVdzoZwcMuOCCC7rLmtvujLl1MTdT3QISOuC2YuiuAMyMslqKq9hS4ZIhYdBJV0oNjTD/MFBRHPRCmsalFDzhCBwYq7qf4nYQt0iLyKKTiVWmdG5DWgtpVxNMFi4PEkxCBLWgh2QlcElTLlKcdMyqia6gNSDthjRvVYiaSVjOJS0CaO3TRrKjxYHTEEHto61EDDjXlIsUJx2zakLLaWhVq0VaRJZEsqqXKa61Ty65yPzUbfXPtCZL9VKxAJlFXRjMnb/k8RP+vngKFnUtXrRg3rzZP/j+BbffdvMiREy4S5YE/8WKu3nziQ4KL1yEfxPGjf/qkUfNnTUbY5cUIUmn/X/BpClezU8cP/aJxx8z++9lKt0yrPkvUaF1Z5ZaEM7SmfS/1mot+1lZdmbhkoWLzNBir50FNI1LKXjhEXqSMjGLtIgsOplYZUrnNqS1kHY1wWTh8iDBJERQC3pIVgKXNOUixUnHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1ocOA0R1D7aSsSAc025SHHSMasmtJyGVrVapEVkSSSrepniWvukk11LFqMvkX+iFy0x/xb/e7EV6CYWLl4yf9HChYvnz1vy+In3LpqGC+PC2XPmnHf+BXf87rcLutDT4Eq5dEnwn3aXFsdbUaWxA/MXL8K/H/zwsj/d+9d5ixaIJW5Q+/+CSQijcTfNj+bK97Q4I1oPa/5LVIhZpEVk/+kkrTIF0ZiaBmbJvxfhLX712VUFDnohTeNSij0UCj1JmZhFWkQWnUysMqVzG9JaSLuaYLJweZBgEiKoBT0kK4FLmnKR4qRjVk10Ba0BaTekeatC1EzCci5pEUBrnzaSHS0OnIYIah9tJWLAuaZcpDjpmFUTWk5Dq1ot0iKyJJJVvUxxrX0yyepNuWAHixZWPYy5NWIRupRFi83vCi1eOPff//jmvUumLZk/d+55Z511zx//tGjB/EVdXYu7pO2xS4L/tLu0+OKFrBfOnzBh7Je+9MU5c2YtwvxiXkLa/xdMQhi9sGv0q6/e/5e/QPgZ0XpY81+iQswiLSL7TydplRkuRgfZtQjHcRGOZPUtPqPs06aEpnEppXphtaBfetULsWVoVatFWkQWnUysMqVzG9JaSLuaYLJweZBgEiKoBT0kK4FLmnKR4qRjVk10Ba0BaTekeatC1EzCci5pEUBrnzaSHS0OnIYIah9tJWLAuaZcpDjpmFUTWk5Dq1ot0iKyJJJVvUxxrX0yyapN6f52g/leQ9WGdC1YUt
0u7pq/pGv24rldi+fMR3bmkr+d8uD8MfPPPO+sO+/5w9yF80xg8QJJLl1SzeC2RSwJJI2o5iWA/mhR1/xLL7rowb//ffGiRYsXYbcW6iW6lBGtW9FJO6OKi+47oO85F54rYbmVeZdfukTVJKEDRnRXMP+qCi7pLInJ0OmlSyrhkr6QQEuS9hPWonnzu2bNXTR7/pJ5XUsWVN+5wwFtvlX0BgAvPEJPUiZmkRaRRScTq0zp3Ia0FtKuJpgsXB4kmIQIakEPyUrgkqZcpDjpmFUTXUFrQNoNad6qEDWTsJxLWgTQ2qeNZEeLA6chgtpHW4kYcK4pFylOOmbVhJbT0KpWi7SILIlkVS9TXGufdNK0B2hd7O0S3C5Eu2B6hiUizI9LVM0M3tX/e9qSu771lx+dfPEf7ry7a6H5sZfujPmnlgT+addpCK0XLFw0YdKUo48+Flsz3zdaENyfFu3/CyYhRA949rmLLrpU7znl9bDmv0SFmEVaRPafTtIqDMf3Gz/t3umLJptfT1+wZD6OmxxiHPRCmsalFDkSGj1JmZhFWkQWnUysMqVzG9JaSLuaYLJweZBgEiKoBT0kK4FLmnKR4qRjVk10Ba0BaTekeatC1EzCci5pEUBrnzaSHS0OnIYIah9tJWLAuaZcpDjpmFUTWk5Dq1ot0iKyJJJVvUxxrX1SycVLRt88pv+3+j9zyoBnTnlmwLefHXDqgIGnDHzuW889d8rz5lbEyc8P/NazfU9/5vmTn71y859fvduPB5z8DKyBJ1dJ988tCf7Tri6u9DPf7n/rZ275+Z7XDzit/8BvYXMDnnebCC4P/gsmISo94MRnn/xy3+f0nlNeD2v+S1SIWaRFZP/pJK065fn++w54cI1HHjrkoZmDZ5rP0uwxtk+bEprGpRRzPFrBgbGq+0jZQdwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqa41j7p5MKpCxeMWSD/usZ2GTF6QdfoBQvHdOFWxIKxZnLB+AVdr3Q9/OXHFgye39Ud0//ckuA/7eriWtsdcPvTnbduSPv/gkmISneZ3VYb1RnReljzX6JCzCItIvtPJ2kVhkOvH9r30/+a9NeJi2YvnLdk3oLF5vtEwF47C2gal1LwwiP0JGViFmkRWXQyscqUzm1IayHtaoLJwuVBgkmIoBb0kKwELmnKRYqTjlk10RW0BqTdkOatClEzCcu5pEUArX3aSHa0OHAaIqh9tJWIAeeacpHipGNWTWg5Da1qtUiLyJJIVvUyxbX2SSfN1a36OVH8W9ItzI+7GG3+GbFoofnGDS6Bs5Y8dNIj5tehbcwku2+XLlHzS0V3cRPQxbvnTUBlbNJbYpMi3EwoaWZcEqJbG9GdsUMRrTE7U936QgfcVkygu4KdUVZLcRVbKlwyJExAJ10pN5w/bQEOzcK5i7oWLVi8xPyUC24ADnohTeNSirSQGvPa68YcpcjbC22RFpFFJxOrTOnchrQW0q4mmCxcHiSYhAhqQQ/JSuCSplykOOmYVRNdQWtA2g1p3qoQNZOwnEtaBNDap41kR4sDpyGC2kdbiRhwrikXKU46ZtWEltPQqlaLtIgsiWRVL1Nca596SfQvi6o//FYJsMS+X4dYZH7SBP9btGT24r9//eHF0yTTnbS3asnSeSVs8QpVfOlGIcy8WN0BXlIF9H62iAq9G8HidqNAht3CxPziAgkdcEsquLhK6uIQxqpwwiUDokInaT9h4TAtnt+1aH7XYvMHAzEQz147C2gal1LwwiP0JGViFmkRWXQyscqUzm1IayHtaoLJwuVBgkmIoBb0kKwELmnKRYqTjlk10RW0BqTdkOatClEzCcu5pEUArX3aSHa0OHAaIqh9tJWIAeeacpHipGNWTWg5Da1qtUiLyJJIVvUyxbX2ySXlWmv+LcEbQ1//G7ddaFq6liz895zFD37z4SXTl8b0P73c/xcsDhHU/j9tJWL4F0xCBLUMtdbDmv8SFWIWaRHZfzpJq+zQ/G2XxThQaFcWLFnYhfnup00JTeNSirzUNHqSMjGLtIgsOplYZUrnNqS1kHY1wWTh8iDBJERQC3pIVgKXNOUixUnHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1ocOA0R1D7aSsSAc025SHHSMasmtJyGVrVapEVkSSSrepniWvvkk4vtP/cX91s0bswfU8E1sMs0Lt8wjYuL6X96uf8vWBwiqP1/2krE8C+YhAhqGWqthzX/JSrELNIisv90klZhiK/mW0PoWdDGmGZm0RIctu6nTQlN41JK9dFXC9WRspgXWevQqlaLtIgsOplYZUrnNqS1kHY1wWTh8iDBJERQC3pIVgKXNOUixUnHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1ocOA0R1D7aSsSAc025SHHSMasmtJyGVrVapEVkSSSrepniWvsUJquv8oOeXUvmLH7o6w9V3yoK0FZxg699tJWIgWASIqgBaT2sSaJCzCItIotO0iozxGFZCNllfq98yUL7V1yabxW9EcDhIfQkZWIWaRFZdDKxypTObUhrIe1qgsnC5UGCSYigFvSQrAQuacpFipOOWTXRFbQGpN2Q5q0KUTMJy7mkRQCtfdpIdrQ4cBoiqH20lYgB55pykeKkY1ZNaDkNrWq1SIvIkkhW9TLFtfYpTFZfu/AG3vwZ1rlLHjnJ/UcWmbaKG3zto61EDASTEEENSOthTRIVYhZpEVl0klbZYdXALIGsPo4RYBXSNC6l2EOh0JOUiVmkRWTRycQqUzq3Ia2FtKsJJguXBwkmIYJa0EOyErikKRcpTjpm1URX0BqQdkOatypEzSQs55IWAbT2aSPZ0eLAaYig9tFWIgaca8pFipOOWTWh5TS0qtUiLSJLIlnVyxTX2qcwWX21/3XoJfPQuDzSNC5EokLMIi0ii07SKrL8YQlN41JK9dFXCzgwVlWahla1WqRFZNHJxCpTOrchrYW0qwkmC5cHCSYhglrQQ7ISuKQpFylOOmbVRFfQGpB2Q5q3KkTNJCznkhYBtPZpI9nR4sBpiKD20VYiBpxrykWKk45ZNaHlNLSq1SItIksiWdXLFNfapzBZfV1gvgWxeOHiuYsfPvHhRVObbxW1kKgQs0iLyKKTtIosPbTXzgKaxqUUOSQaPUmZmEVaRBadTKwypXMb0lpIu5pgsnB5kGASIqgFPSQrgUuacpHipGNWTXQFrQFpN6R5q0LUTMJyLmkRQGufNpIdLQ6chghqH20lYsC5plykOOmYVRNaTkOrWi3SIrIkklW9THGtfQqT1dcV6BMX3C6qkKv+ggULXn311b/+9a9z5szBUGIEVaChVa0WaRFZdJJWkeUPS2gal1LsoVDoScrELNIisuhkYpUpnduQ1kLa1QSThcuDBJMQQS3oIVkJXNKUixQnHbNqoitoDUi7Ic1bFaJmEpZzSYsAWvu0kexoc
eA0RFD7aCsRA8415SLFScesmtByGlrVapEWkSWRrOplimvtU5isvq5YjQsalNdee+0Pf/jDqaee+oEPfGC99dZbZZVVJk6cuHCh+eBJkhqqQEOrWi3SIrLoJK0iyx+W0DQupdhDodCTlIlZpEVk0cnEKlM6tyGthbSrCSYLlwcJJiGCWtBDshK4pCkXKU46ZtVEV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMIkvixcvWojbJfMXzV306AmPLZpetTEe7RSv8LWPthIxkEguWmT23HyQ0tp/uBkRaFAee+yxnhXrr7/+W97yllVXXXXUqFHz588P9i60RRpa1WqRFpFFJ2kVWf6whKZxKcUeCoWepEzMIi0ii04mVpnSuQ1pLaRdTTBZuDxIMAkR1IIekpXAJU25SHHSMasmuoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6muNY+hUnzB0Kqv1C/eMmcRbMXPfa1xxfOqEYe7RSv8LWPthIxEEyiZZGeo/oukPk2kMy7GbgyLzMjRoy49tprn3/++X79+q2++uqrrLLKkCFDZs6cOW/ePLiy1kFbpKFVrRZpEVl0klaR5Q9LaBqXUuSJpcGBsUo9F4WYRVpEFp1MrDKlcxvSWki7mmCycHmQYBIiqAU9JCuBS5pykeKkY1ZNdAWtAWk3pHmrQtRMwnIuaRFAa582kh0tDpyGCGofbSViwLmmXKQ46ZhVE1pOQ6taLdIisiSSVb1Mca19CpNoUXA97zJywZJZix858dHFUyvDo53iFb720VYiBmLJWbNmofMArkER0LJgctq0afKBClzQ1dW1YMGC11577emnn15ttdVWWmmlAQMGTJw4EUVg2ZXd0BZpaFWrRVpEFp2kVWTpob12FtA0LqXIIdHoScrELNIisuhkYpUpnduQ1kLa1QSThcuDBJMQQS3oIVkJXNKUixQnHbNqoitoDUi7Ic1bFaJmEpZzSYsAWvu0kexoceA0hGg5dcpkELcEaO3jXFM6pAHpmFUTWk5Dq1ot0riVu1//QSCqetHigtY+hcnqE5fFi5YsRP/y71lL7j3l3iWTxWHaKV7hax9tJWIgmEQvctttt2233XbbbLPN/fff7z59we1jjz2Gyd133338+PH6cxf0MWhonnrqqd69e6NxQQczZswYzKChkZoO2iINrWq1SIvIopO0iix/WELTuJRiD4VCT1ImZpEWkUUnE6tM6dyGtBbSriaYLFweJJiECGpBD8lK4JKmXKQ46ZhVE11Ba0DaDWneqhA1k7CcS1oE0NqnjWRHiwOnIUTjMoCLgbtO+Oj5WEZwLkRQA9Ixqya0nIZWtVqk3QyQRyaIe9x83CPpaxFAa5/CZPWn5bH3C5YsWrJ41sK+t/ZdPCN8X9opXuFrH20lYiCYxDNw7ty5hx12WM+ePTfffPOhQ4fOmzcPLciUKVPe8573rLzyyrfccgtmEMMhQx73uKura/bs2QMGDEDjggA6mNGjRzeNS0M7VGeAFnBgrOo+NdhB3CItIotOJlaZ0rkNaS2kXU0wWbg8SDAJEdSCHpKVwCVNuUhx0jGrJrqC1oC0G9K8VSFqJmE5l7QIoLVPG8mOFgdOQ4jGZeD111//7ne/e9ZZZ50ZQs/HMoJzIYIakI5ZNaHlNLSq1SItAnd/3Lhx8sgEcY+bj3skfS0CaO1TmFyC/5n2a75pwRbOX9S1ePGCpd2Ypp3iFb720VYiBoJJPA/RlEyYMOF973sfepc999xz4sSJaGWOO+64Xr16HXrooXiWvvbaa+hU0Je4dhO9y6BBg1ZbbTU0Lk8//TQalxkzZsyfP19cB22Rhla1WqRFZNFJWkWWHtprZwFN41KKHBKNnqRMzCItIotOJlaZ0rkNaS2kXU0wWbg8SDAJEdSCHpKVwCVNuUhx0jGrJrqC1oC0G9K8VSFqJmE5l7QIoLVPG8mOFgdOQ4jGebNfv36bbLLJaaeddkYIXNetatU+zjW9QEgD0jGrJrSchla1WqTlFh0MLpnyyARxj5uPeyR9LQJo7VOY7P7ExfxG9KIlXV3VBVEsop3iFb720VYiBoJJ7DMaF/QcTz311DrrrINm5cQTT7zttttWWWWVrbbaaujQoVOmTJk8eTL6EvncRVahg3nppZdWX311NC54GjffKmpoE3soFHqSMjGLtIgsOplYZUrnNqS1kHY1wWTh8iDBJERQC3pIVgKXNOUixUnHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1ocOA3h9N13373rrrviYmCueB4IWNWqfZxrioY0IB2zakLLaWhVq0Uat7j4yTt4sYK4x8pHP5KkRQCtfQqT7S33iRUPah9tJWIgmITAIejq6po1a9Z1110nH6Ksv/76vXv3vuOOOyZOnDhu3Dg0l+hLdOOCJUOGDJHGpX///k3j0tAm9lAo9CRlYhZpEVl0MrHKlM5tSGsh7WqCycLlQYJJiKAW9JCsBC5pykWKk45ZNdEVtAak3ZDmrQpRMwnLuaRFAK192kh2tDhwGsLpiy+++JhjjsE1IAgCVrVqH+eaoiENSMesmtByGlrVapGW26ZxAbHiQe2jrUQMJJI4EHPnzp06deq+++7bo0ePnj17HnXUUSNHjhxdMXnyZPdLQ5KHaBoX0DQupdhDodCTlIlZpEVk0cnEKlM6tyGthbSrCSYLlwcJJiGCWtBDshK4pCkXKU46ZtVEV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGkIpw877LDLL78c14AgCFjVqoPoDkBmtAakY1ZNaDkNrVJads8h82aZEkHcY+WjH0nSIoDWPoXJ9pb7xIoHtY+2EjEQTMrjjwOEtuOuu+5addVVpXHZYYcd+vbtO3z48AkTJsjPryAjSwBWNY0LaBqXUuyhUOhJysQs0iKy6GRilSmd25DWQtrVBJOFy4MEkxBBLeghWQlc0pSLFCcds2qiK2gNSLshzVsVomYSlnNJiwBa+7SR7Ghx4DSE6IULF77vfe974IEHYldutwRoTaCO7loc1XbCFRJWTWg5Da2qNHZM9lC+IyYXP51Jk0jCci5pEUBrn8Jke8t9YsWD2kdbiRgIJnFQ5LiMHj16880379279wUXXLDZZpv16tXrk5/8JLqTSZMmzZ49232TSMCSpnEBTeNSij0UCj1JmZhFWkQWnUysMqVzG9JaSLuaYLJweZBgEiKoBT0kK4FLmnKR4qRjVk10Ba0BaTekeatC1EzCci5pEUBrnzaSHS0OnIYQPW/evLXXXnvUqFG4BohFuCVAawKXnNcq6NJSbSdcIWHVhJbT0KpK495hx9wemotk9dscNpEjkYTlXNIigNY+hcn2lvvEige1j7YSMRBM4nDgKYTW5KCDDurZs+dJJ500cuTIm266Sf5Gy9lnnz1lypS5c+dKx+nAqqZxAU3jUoo9FAo9SZmYRVpEFp1MrDKlcxvSWki7
mmASojpVGmQG4EVop7on7cD7TBtIQBPbUFALekiWj94HnOvllOFWmdKRagkrCOq7bclQhJCopi2atypEzSQs55IWAbT2aSPZ0eLAaQh5zHHexxtcnO5l3ie9ITlYuKL84Ac/2HbbbbfccsvTTjtt1qxZmJQAlsQqJKya0HIaWlWBi9lFF1203XbbbbHFFqeccgoukNhDyiRIJM1Wu13SIoDWPoXJ9pb7xIoHtY+2EjEQTOJwdHV1XXLJJehaPvzhD7/66qvoQgYPHoynU69evVZbbbU///nPeF7NmTMHz1swY8aMxx9//Iknnrj99ttXWWUVZG688UZkHnjggWnTprmnn0BbpKFVrRZpEVl0klaR5Q9LaBqXUvCMIXBgrOp+MtlB3CItIotOJlaZ0rkNaS2kXU0wSUvktSfdALCz3fMQsASZDxLbUFALekhWEOwA9kd2A6KqFy5OOmbFwNlnt912O/XUU/27nKimLZq3KkTNJCznkhYBtPZpI9nR4sBpCNF33nnnRz/60cQzzS0BWgvVE3YRipx77rlPPfXUtddeu9Zaa33/+9+3ttqQQDpm1YSW09Cqirvvvvv888/v27fv9ddfv8Yaa2BvMUmZBImk2Wq3S1oE0NqnMNnecp9Y8aD20VYiBoJJdC333nvvmmuuueGGGw4YMGD8+PEjR44cMWLEyy+/LJ/BbLLJJgMHDkTvgucqQNfSo0cP9Cu4fctb3oJbgBhAjJ7PtEUaWtVqkRaRRSdpFVl6aK+dBTSNSylySDR6kjIxi7SILDqZWGVK5zaktZB2NcEkhDxH8Yr6zne+g0vFnnvueemll1avwaUfreMagMzYsWM/8YlP7LXXXkceeaR7h2ETitiGglrQQ7J8sFGcSrCfJ510Et6wYscw6VaZ0pFqCSsIKl955ZU4++y333547y7/RRLr1d4QzVsVomYSlnNJiwBa+7SR7GhxoDUedhzi733veyeccIIc3CDpDaEC1j722GN4hkDjwB133HG77747rkMSwJJYhYRVE1pOQ6sqHnzwQfm5Ttwef/zxeHFhhymTIJE0W+12SYsAWvsUJttb7hMrHtQ+2krEQDCJZ9Hvf/97nBLvu+++yZMnjxkzZtSoURMrXnzxxfPOO++73/3un//855nVr0Pj2TV69GiEL7roIjyBzznnnDPOOEP+EBE0Oh6cRugcYlWlaWhVq0VaRBadpFVk+cMSmsalFHsoFHqSMjGLtIgsOplYZUrnNqS1kHY1wSQEXpkAp84DDjhA3h9sueWW9HPyEOBXv/qVvJP43//93+nTp8uFHGslIwK3sQ2JNhvzPgzXQx0DoivHgmGfPn3wDma33XZD/4QTPXYDYH/wvgf7qfNUmYZWKcwmK0RfccUVaFz23Xdf3N/XX3/d9UlAlruhptqOLe4E0NqnflLuY3VMjNBaAstrQ1JckE1gcnkVjyXljnzhC1+49tprZdPWaCW9IVmFW1TDcwNXlLPOOuuII46AlgCWxCokrJrQchpaVVHdPwP285JLLvnSl74ETZkEiaTZardLWgTQ2qcw2d5yn1jxoPbRViIGYkk5t7z22mvyV1smTZokZwPcTpgwAX0M3tFNnToV5yIkcRwljBiamJdffhn9zdChQxHDDFahv5HXEaAt0tCqVou0iCw6SavI8oclNI1LKfZQKPQkZWIWaRFZdDKxypTObUhrIe1qgkkRct78xCc+gW5g5ZVXRndy5513yt+ClNcYbnHq33PPPVdaaSVktt9+e7wm8SZD/1QaishtYkNAYjoDgkuQ1MgkwBaHDBly8803440OdlJ6lxEjRuyzzz4777zztGnT3C4BqkxDqxSyIbc5+cRl7733Hj9+PM5N9IYJ6B1z6A3prWjtUz+JjeI+CnKuxNGRoQSW44ZQX5CN4nZ5FY8lsRU88T7wgQ88+OCDwYdXqLkhVMCDg2fIoYce+ve//10/RLEKCasmtJyGVrVaeIQ/+9nP3nvvvdlHWJNIVrWtS1oE0NqnMNnecp9Y8aD20VYiBoJJCBwRPH/Qc0ypcB+u4BYa7QjaF5wcZs+ejUkcRzQuMo+T5KuvvvrKK6/g7ASNvofOmbRFGlrVapEWkUUnaRVZ/rCEpnEpxR4KhZ6kTMwiLSKLTiZWmdK5DWktpF1NMOkEXkvSuKy33nq4VOMcircR+jcd0Cuga1lrrbV69Oix7bbbDh48GC9XvJjxWoUrFfBKxos2sSEk/Qzwl1TbNFdNeZ0DcQE0Nop+Be94wKxZs1AQ+7P22mv37t0bZwf5DAbLEabKNLRKgVVYDiCwIWlc9tprL5x6UBnbwqYlCRfbBekN0bxVIeonsUXsxv3334/Tonloqkf+N7/5DXZbAstrQ4899tgLL7yABxObwO0f/vAHnLiXV/FYEtvCQ/0///M/aBYxlIfXp+aG5Dg++uij3/nOd+S6IvNYEquQsGpCy2loVav11FNPfetb38LTSeZlMksiWdW2LmkRQGufwmR7y31ixYPaR1uJGAgmIfAUkhOR/BAujpF7ReB1h0mcJ+UshEmE8TJEE4NJdDN4vUyePBm3eEOFrkXeZSHmiosA0DS0qtUiLSKLTtIqsvxhCU3jUoo9FAo9SZmYRVpEFp1MrDKlcxvSWki7mmBShJzc999/fzQlRx55JC7Vb33rW/v27YvXG16Z4p5//vlw8bYV7tZbbz1gwICRI0fOmDEDr09cMrFq++2333jjjTfaaKM99tgDbxzlIir1sRwv+1/84hcf+tCHEHjb2952xBFHPPfccyeccMIhhxyCSyMCAEvOO+88zGD5mDFjTj/99C222AL597///X/+85/lZCE7g7fOBx98MC5F2EOcMq6//vr99ttPPis64IAD/u///g/7g33DkoEDB6LgySefLHdEdgbtznHHHYf5UaNGSVkA96WXXjrqqKPe/va3Yw8/+MEP/vznP7/44ovRzO25557Dhg3DdRSNGk5PKHXzzTfvtttuiOH6euCBBz7yyCNSQUrhzp544omo//zzz6O+bDdL8AD5wEJB8PGPf/zuu+8WjXMidga75zIigtTcEPjyl798ww03oD407sh73vMeHPflVTyWxObw3MDdkYdUtu6T3hBWybHAbqMNOvPMM3HQ9bHAkliFhFUTWk5Dqyotu4Sn1jnnnIODaO5tBfbchpLoaoTZardLWgTQ2qcw2d5yn1jxoPbRViIGgkkROCLS9crxEjCPw4R5saBlEreYwYkC5wF0KgI0ZjDvYoC2SEOrWi3SIrLoJK0iyx+W0DQupdhDodCTlIlZpEVk0cnEKlM6tyGthbSrCSZF4IWElxMaFzQlP/3pT9G1oEf50Y9+hPOp/G0lXBH/93//F30JruXSuPTr12/EiBF4GzF48OB1K3baaadPf/rT73jHO7B2zTXXfPzxx92LHFf6k046aaWVVurdu/cHPvCBT3ziEyi1wQYbYIjwQw89JC9pNC7oP9Ao4K3njjvuiDy6lnXWWUcK4i2pOW1U3HjjjZhE64D3NNi3M844Y8MNN0TXgrUoC73NNtuMGzcOZR9
44AFMvutd78JVAfXlnuI90CabbIL5Z599FpMoiEnco8022wwbxV341Kc+teWWW66yyip4KFB2r732evnll1EQXRpOQKeddhqstdde+6CDDsLdgV5//fUffPBB1JFTGB407IC+a9iue7RjBA+Qj7PQuPzxj38UjR1DC7XcG5evfOUraDchEMND9N73vheP2PIqHkxWz5fFd9xxBx5zCGxU5n3SG8JCgGOBpyi6TzxP5OgACWBJrELCqgktp6FVlcZOovm+5JJL3CeFAJO47zaURFcjzFa7XdIigNY+hcn2lvvEige1j7YSMRBMQoj2j4ib9y2ZrJ6DS5FJYEPeFmloVatFWkQWnaRVZPnDEprGpRR7KBR6kjIxi7SILDqZWGVK5zaktZB2NcGkCLyW8LqST1z69Onzuc99DuIjH/nIq6++Kj+li6syLvOHHnro7373O1hbbbVV//79pXFB5pprrhk6dKh8NDpkyBB0MMiccsop8vkEKuM6hJnVVlvtl7/8JTLoG1588cWjjz4akyh73333YS2aG5y45T8FAtAxoEkaO3Ys3uJvt9126B7QzeCNi1x7br75ZvRPH/7wh1EKWwFoO9Zaay30EP/85z8HDRo0fPhw+TAGxZFE14ULA/oJbALLJ02ahMs8tvL000/LpmGhDUISvdeYMWOwFm3K9ddfj93AHu6xxx64g/K9M/QK6LewHFvEm3iU/fa3v41SuNDKx8W48KDgDTfccPnll6Orc58hu0c7RvAA+TirZuNC50pBZtytUDkt/L9vXAA2dN5556HTtbsV2jGQ3hCK4EDg+Ymj8/DDD6PvxDPz1ltvfe655ySAJekKjoQVgyrTELfufuGQnXnmmY888siwittvv/2FF15wbhZdmag2a13SIoDWPoXJ9pb7xIoHtY+2EjEQTEIENRGz5FA67Gw3eglViFmkRWTRSVpFlj8soWlcSrGHQqEnKROzSIvIopOJVaZ0bkNaC2lXE0yKwCsKJ3r5xOWWW2658cYb0SigD3jooYdwjcdF/YgjjsC1+Z577kHjIp+4yLeKcOXGRXru3Lk4/6KHmDBhAq7uJ5xwAjJYgrW4bMNFT4DL/+GHH45uQH6fEDH0FpttthmSd911Fy7/SKIUGhf5oOXee+8d3c2xxx6L2FFHHYVNSO+C7gozH/rQh1AHFyesxf5I4/L444+jK0Lx8ePHz5w5E3WQ3H777bG3aEewHNczaVww/+ijj2ISO/+3v/1t5ZVXRkciP7uDnQQofv755yOGHg7zGKLgRz/6UdyXCy+8EDuDu4Pdw+bWXXfdtddeG/sgvREeTNyiLPZNfnwYG3WPdozgAfJxVp3GRY6sexMvQLtJEQCPKixZ5fiPfOKCDR188MHoGjGpM4S2/BiK4GE/6KCD5OPAddZZZ7311kP/irspgap2tII8LKLJqgNVpiFucTfxDMHz9lOf+tT6FdhJ3H7wgx/EodT5NImk2Wq3S1oE0NqnMNnecp9Y8aD20VYiBoJJiKAGpPWwJokKMYu0iCw6SavI8oclNI1LKfZQKPQkZWIWaRFZdDKxypTObUhrIe1qgkkRcrWQxgXvSnEl3nzzzdGpnH322bg2463qxhtvvN122+Ei7RqXgQMHohXAhRl9CRoOnIJxvcfCvn37fulLX8Ja3EoAPQSW42J/2223YUb+9CS6ihEjRkjjcscdd6CyXOD32Wcf9Eynnnrq0KFDX3nlFbQ4sM444wwsP/TQQ7EQrcP8+fNvuukmLPzABz4gPzOLCzb2RxqXJ554Ak2GtCmYl09csPMoKD+kgguSa1wefPBBtCPYNN7iY5933XVXbAJrsV3ccVT+/ve/j3nXuABsBTtzzjnnYLfx/vg3v/kN7hcuiui38OBM7f7NI2lc0Nhhh13jkqbmoXRWzcYFm8bBBdByoKVHwTxuocUFGMoqx3/kExfs2Lvf/e6HH34Yk3qe0JYfw97ieYLDgWcggBDt7mNVO1oBy3EEATRZdaDKNMQt6uNuYg/xLEUHjFsgu4o91Pk0iaTZardLWgTQ2qcw2d5yn1jxoPbRViIGgkmIoAak9bAmiQoxi7SILDpJq8jyhyU0jUsp9lAo9CRlYhZpEVl0MrHKlM5tSGsh7WqCSRE4V+JMKo0LLsPDhg076aSTcHnG9QOX8KuuugoX79NPPx3Xclytkdlmm22ee+45WLgY4Krcr18/XOHQzWy00UZ444hLOJoPNC7oPHDtHzBgwOqrr45qf/vb39BnoDPAORqXczQ6aI9QDZd/lMI1HqU+9rGPYebcc8996aWX5FMTnNbRJWDysMMOQ0uBLaIzuPHGGzGDxgWbkF9uev7559FSrLzyyu4/2YpJdFTYKJJoXNBUoe3ABR4XDDQub3vb2zD/wAMPoDHCplEcwwMOOAB7gtYK28UkwpdeeinmXePywgsvyM/lAMwLMgS4xktvhAueXJZmz56NO4UhHl55wBPUPJTOqt+4oHtDqyfgoevTpw+aUbQsONDoU39dgc6vjcbFX0J7joCfEXRS7zCeZniC4amCST0v4JjiyQMBS2bwIMuMIHnpDHArM2Lpo6CLA60BngPoYq+++mposnykOI4ydgMbhZZJ7CoaETr0Ug2u20PBlOjuJrNbdCSSsJxLWgTQ2qcw2d5yn1jxoPbRViIGgkmIoAak9bAmiQoxi7SILDpJq8jyhyU0jUspcl7Q4MBY1X2ys4O4RVpEFp1MrDKlcxvSWki7mmBSBM6bOGPiQojLMBoXXPj/8pe/oNtA/3HPPffstttuq6222j//+U9c46VxQY8yaNAgXONxXsYMOgYkP//5z99yyy3333//Zz7zGVzFv/jFL7788su4kPfv3x/LMfPnP/8Z1wO0FLiWo/mYMWOGNC6oIJ/N4Aq09957Y+a8887DWmkpZs2a9b3vfQ+Thx56KHYMfQxiuNBi5v3vfz8uvSiIyzb2Z80110TPhM3JjqFvwJ1yjQt6DvnABhcYLNl0000xj72V9kt+rOeTn/wkNoHtSkOGCj/60Y/QcsnPuGC5NC5oj9DGXXHFFddee+111113/fXX/+xnP7vmmmsef/xxrMUm5CMNPKoQcikS2jhAPs7ab7/96LeK8DjI0GXkyKLbkB88Er71rW/h8cE8btHE4Ogcc8wxaMv0rgrB3ypK7BvQbhtJiKeffhqNCx58aJnHpvFIoss8/vjjP/jBD26//fbom2X+oYce2nnnneV7l/7+C1JERFAD0nj64elxyimnyFDmY6AnRnKvvfbC8xlduDzmeGaiD8ZBwXPV5ipoQ25IWkSWRLKqlymutU9hsr3lPrHiQe2jrUQMBJMQQQ1I62FNEhViFmkRWXSSVpGlh/baWUDTuJQih0SjJykTs0iLyKKTiVWmdG5DWgtpVxNMisBzFGdb+cQFZ175hg7aAlzk9tlnH3Qw7nsov//975HBdQWXcFyk0QFsueWW6Fpwgce1E13FxIkTTzzxRGS+8IUvyHdncA3YZJNNUOonP/kJyqInwDtRdA+wNtpoIyR/+9vf0icu559/vnxqghlcsX
7wgx9g8rDDDsO7cGlc9Ccu8q0i7I984vLMM89IxyPdA6522PS2227rPnHBFRG7tP7666OCfOKC6/dRRx2FGFo0FERM+hs8JpdffjnunfxWEeaxSxtuuCFaGdwXbBEXJzwmqID7hQDW4lHCJrBpeXixAyKENg6Qj7PqfOKCuyCgRcMdwV3Gzst/0gGTuB04cOAOO+yAYyFXfVnlSHzignAQ3zKFQuj7qPWtt96Kngybw6TMowiGOCI4+hdeeOEqq6xyyCGHYPK+++7DocHjIB0VMsHNueKuIGlAGkcWz4dTTz1VhjIfBFvEjuHxR1++9tprv/Od78TTCc/efffd96CDDkK/iOcqdsymvQ25IWkRWRLJql6muNY+hcn2lvvEige1j7YSMRBMQgQ1IK2HNUlUiFmkRWTRSVpFlj8soWlcSrGHQqEnKROzSIvIopOJVaZ0bkNaC2lXE0yKkAuDblxwkb7oooswxOkboC+R7/L84Q9/wCQaF/QB6DZwu+qqq66zzjr9+/fHRR3Xv3Hjxh177LFY8sUvfhGNC4boVHASx1UTndCgQYOmVP8VeJzZ8X4UpYB84qIblwsuuAAzSKLFwQXVNS7oD+SHhV3j4j5xcY1Lv379sJ+oJp+4PProo0ii28AVVzoS7M8nP/lJTAL3icvVV1+NfV5jjTWeeOIJ6W/QfGBbe+yxB+ZxdZSGZsaMGZ/73OdwX/AuH5d83EEsx+ZgoX3B3ceVdVb1n1uTKyhu0wdFEzxAPs6q/60igIvrhz70IdwX3Gv0H3gMMYmDfsUVV5x77rmigaxyJBqXJ598El0s+F0rbubOO+/EAcUOgKpYC27SD5x11lnf/OY3MYkNybagsXXZSez59ttvjwP6/PPPH3nkkXiC4UhhEpbrFwkpIiKoaQcw7zcuZi8jdwSbxi12D/uz0kor3XvvvccccwxeEbJjtFduo8DsQWTfRGRJJKt6meJa+xQm21vuEyse1D7aSsRAMAkR1IC0HtYkUSFmkRaRRSdpFVn+sISmcSnFHgqFnqRMzCItIotOJlaZ0rkNaS2kXU0w6QROvtJJSOMyceJEnHnXXXddnL432mgjXP7l4wRcKZFB4/LSSy/hmo3+Q35+5aSTTkITg2sJOgD5FeLdd999yJAhuK7gYn/33XfjXTImceXDxeBrX/vaO97xjk033XTjjTdG/d/85jfoSHTj8r3vfQ/F8c5VNy6HH344YvSJizQuuGBjT+RvrmCtfDCDSzXWYpfQWmHTF198Mfbw8ccf33vvvVdbbbXevXujAt6yoyY2hJj8JRi4uLOo9qc//emDH/wg1mLP99xzT/nEBTsJd7311kPB//u//8PFG3cQ/Q329u9//zuKoGvBdnFHrrrqKrRf2BPsQ/CC51PzUDqrTuOi6dOnDxo7PODgwQcfxBFHb4fODHfNJjwSjQv6SNTBYxgDjxsOjVzUq2JLkYu9xhoVn/70p2+44QZMYkOyLQGrBOwV+gN0k3iC2ankI+yK6IJao7eQ3bC1Fi8ePHgwngx4rkIjgFscR0GW+CCDpyUeEzS1aNqkGub1hgBpNyQtIksiWdXLFNfapzDZ3nKfWPGg9tFWIgaCSYigBqT1sCaJCjGLtIgsOkmryPKHJTSNSyn2UCj0JGViFmkRWXQyscqUzm1IayHtaoJJJ3CedY0LLsNoXHA5P/PMMw8++ODTTz990KBB8n2fe+65Bxn5xAV9DPoDXMBwFcf5esstt8S1E1eU/fffHxctNAf77bffM888M3XqVFzvr7zyyjXXXFOSuEUY3cxWW22FauWNCy7bWPvZz34WlbFdXDkOPPBATKKHwLUcF3hsFNfsTTbZBI0IApdddpn8cO5f//pXaVxmzpyJjcqPFWPn0ZpgCVorvHXGJK7uuEbi/qIJQ0FsfZ111sG21l9//V0qNt98c9T/y1/+gh3GVRDty9Zbb40KDz30kFwU3TVMHu0gwQPk46xlbVyw89ttt508/ui68OA88cQTX/jCF+gjAU2icfnb3/720yTXX389HgfccbnvGsxgo3hwfvvb30JgiOLOwrMLOwaBDcm2BMwAJG+55RbchWOPPVYeWMGGQrgiuqDWeI6hK8Vj8qmKgw46CM9DbAIHUYYy/8tf/tLtpw/2AS8K9PHvete78GyUfcO83hAg7YakRWRJJKt6meJa+xQm21vuEyse1D7aSsRAMAkR1IC0HtYkUSFmkRaRRSdpFVn+sISmcSnFHgqFnqRMzCItIotOJlaZ0rkNaS2kXU0w6QTOs9/97nf32WcfXJBGjx49adIkXMsBTuh4R47LtjQNTz/99L777nvEEUdgBjE0Jehm8Mb03e9+9zvf+U4sv+OOOzBz0kkn4fKz00474eKEtgCXVawdOHDgVVddhaYBF/6RI0e+8sor6B7QFqAZct8qQpOEawauEPKjJ9K43Hrrraj8/e9/3zUu2EnsxvHHHy+Ni3zOgV39xje+scMOO2y77bZowrBv8qva6IGOPPJIXLOxS2iq7rvvPqxCl4N2BLuHTSOJTeO6jh1DF4LL1f/+7/+iT+rfv//NN9+899574+5I44KdxNZx+9hjjx111FG4kCO84447fvSjHz3jjDPwQMHFnmC7X/3qV7EtVMC+zQ/994x8ggfIx1nL1LiYa/vixe7bf2ussUbfvn2PPvpo+Y8V2JBHonHBtqQlxWHSyAws3MrFG1TFloIj8s1vfhOtXu/evbEbLgbwlNtoo41w9KGxIdmWA0k0Os899xy6TzzmeKhlFbCJ6p5a1Y0rogtqjecYjh16DvSsQD5mw6MkXawMcXvaaacFHyvZAYBnEZ4PG2ywAV4+dN9F03K9D6RFZEkkq3qZ4lr7FCbbW+4TKx7UPtpKxEAwCRHUgLQe1iRRIWaRFpFFJ2kVWf6whKZxKcUeCoWepEzMIi0ii04mVpnSuQ1pLaRdTTDpBM6quL7iqoO2Ax0ALjy4BgNcfjDEhUR+BhaXZFyJoTGDjgF5XGmmTJmCVmDQoEG4xTxAh/HCCy/gDaj8qAouq2gL5FsqIyog/vSnP+HCsPHGGz/55JNYglJSXGLjxo3DDK5MuKhgHkXQNwBpMjCDHcMQSWllkMQ8YkOHDpVfpUYd2WHcQqPzwC5hHruEVWibkAQogn1Df4PlKI59w26jAmLYK9lh3C8sxL3G7iGJzSGJhVIEdxZ7i0cDe4J7ii3i4or9QRh3AcVRGfdCP9pBggfIx1nL2rjgqom93XDDDXFJRvty8MEH77ffftjP4MVYSDQusNZee+111lkHtxo389a3vhWPWHW9XtpJ4HG4/fbbcXVH7yufapxyyinuGg+eeuopNMF46BDGhmRbDgRQ4cILL8S9WG+99fAcwBDLgQjcIiMzdo16NHRBrbEKT34cPhxWgLJop7BvX//612UI5GnsfzpV7bUBW7z77rvRH6PLkf8MtYDigmhUcEX0PpAWkSWRrOplimvtU5hsb7lPrHhQ+2grEQPBJERQA9J6WJNEhZhFWkQWnaRVZPnDEprGpRR7KBR6kjIxi7SILDqZWGVK5
zaktZB2NcGkEzjz4nyKKx9O0HKOxqkcVzVcpNGa4OqLawlmkMFlGNdjnOURQx5DhHHZxjVeLvbS+pguo/pDc3Lq/93vfocLPBoIzGD+n//8584774zL56c+9Sn3R2lRX4ojhuLYopzusRvYBCYBYmgdkMSmsQ+YkbYDSYlJU4VNSBhtBIDAvOwhbrE5t4eI4T5WVxPzo5TYLizsKpJwXRKtCe6FbkpwvzApLm5xv9wWkcEe6sYFM/rRDhI8QD7OaqNxwW4cddRReNjBSiut9NOf/hSTsGzII9G4XHrppZ/85Cc/8YlPHHjggbh1uOGnP/1pHCCsAlUx+xx76KGHHnnkETyGv/3tb9G5brXVVnjQkJE9vOmmmw444ADZJWxItuVAoF+/fuedd96hhx6KxgLPIlO9Gzzyjz/++Msvv4yt6DvliuiCWruuAqtkITo87Nu3v/1tVweuxGToQEDW4vlwzDHHXH755dixH/zgB2aHqgdcbvG8+vWvf33ZZZehTcd9l7J6H0iLyJJIVvUyxbX2KUy2t9wnVjyofbSViIFgEiKoAWk9rEmiQswiLSKLTtIqsvxhCU3jUoo5IbWCA2NV97nJDuIWaRFZdDKxypTObUhrIe1qgkk9iTOstA64fqMVkHMuzsg4zwIZAmhcIZBBUublIo3rOi5UAAJXa1yN5LINC00ALpNve9vbdtlll0MOOWSPPfZYf/31cX5/xzve8eCDDw6v/tAcLroohYIoC436KCtbxD5gfzAJpEcBslGAncFQ9l9iuIQA2TRqwsU8Lu2YxO7h1u0hcBtCBYSxdVi4zFQ1zLdCMARYLv1H9VCZ9/0YYhKlEEMeAtXkMYGLJDaKHdD12zhAPs6q83dcCOzY008/veqqq6Jxefvb3+4+EYmR+DsuGJpjoz4sERJ7DmSJPIw4UhtttBGeGH/9618xREHcfqdCyqIUVcPDi11Cg3jNNdf06NFD+oOBFThSP/vZz3DX7r//fpvuxhXRBbWudmrpHcE8Ghc8P13jIvM645DHAQf9/PPP/9e//iUPL3pKzOPJJv9Vczx/PvOZzxx88MG77bbb2muvjV5Q1la7EN43EVkSyapeprjWPoXJ9pb7xIoHtY+2EjEQTEIENSCthzVJVIhZpEVk0UlaRZYe2mtnAU3jUoocEo2epEzMIi0ii04mVpnSuQ1pLaRdTTCpJ/FMxSkYFxUg52iZh6YhzsXIyClbhjhx4/KMHgLggo2hzAAIXCCPPfbYD3zgA7hY4sSNy9X73ve+b3zjG88888yIESNGjx6NC7/0H6643gfcuk1AyH66mMxIDGAGO4ArnOyG2b/uPcSM7J7sFZDlCLgKGMJFzFUQZNNIyiOGJDTmZVsAS5BBQakmpRDQe9jGAfJx1rL+cK6AXULHg6u+XJVlx2IkPnGJod1YEhtFNXDMMcegPzj88MOhMYkH6pOf/ORNN90ke4XlANaTTz6JtmDYsGGnnnrqP/7xD8w8//zzvaq/r/Pcc8/hjuAoYBL9AZ5gDzzwgGzF4XZDCvoakEbjgo4KlWUo8wS2OGbMmDPOOKN///433njjD3/4Q+w/+tett94affmgQYPOPvvsV155BZNoX3AXILCfRx999BZbbIGjgApmDyL7JiJLIlnVyxTX2qcw2d5yn1jxoPbRViIGgkmIoAak9bAmiQoxi7SILDpJq8jyhyU0jUsp9lAo9CRlYhZpEVl0MrHKlM5tSGsh7WqCSX/SXMpCFzMqrmOigVx+9K0IaQVwTkePglP58Oqv044dO1a+dzO++q8h6ku+CCkuyLzgZnDrknr3MCnoMLCz1bzcAgkIMgPg4jLjMg5k/McBSQo7yyEztJbQbiLprPYaF3DHHXesu+66gwcPxj7LblvDo6ONC24ffvjhVVZZBb0snhuYmT179g477PDUU0+Zh6z7DR8E+hX0N5tsssnPf/5zaQRnzZqF3nfllVfed9998YySangQttlmm/LGBeDBQfHTTz8dmiwBm8NG//SnP6G/WWeddU466STskkyiR8fevvOd70SPhb3FJJ7h7umNJgZ3BPcURapdCO+biCyJZFUvU1xrn8Jke8t9YsWD2kdbiRgIJiGCGpDWw5okKsQs0iKy6CStIssfltA0LqXglEHgwFjlXWliFmkRWXQyscqUzm1IayHtaoLJwuVB/CRO2ehd0J1Mqn7GxfzYSPcPjkyYMAFvlOXjFr2kDi4JEdSAdMyqia6gNSDthjRvVYiaSWe18a0iAY92nz595JqapqN/8h/MmTPnXe96Fy7z8nfx8fRYa6218HwQF8ulAp4kP/vZz/r27Sv7jKcTeO6557BqcvVjNBJD37b11lu38a0iQHrevHkvvvgititDmffB/t9yyy333HMPhMxgZ/AMv/7660eMGFHtpmlWcCuNC8Aj/+lPf1rCZg8i+yYiSyJZ1csU19qnMNnecp9Y8aD20VYiBoJJiKAGpPWwJokKMYu0iCw6SavI0kN77SygaVxKkUOi0ZOUiVmkRWTRycQqUzq3Ia2FtKsJJguXB/GTeA0sqH7OF9ekKdV/DheXKPlxE1xrcZGQ07peUgeXhAhqQDpm1URX0BqQdkOatypEzaSz2v7EpeaGQIc+cRHg4olx7rnn9ujR44ADDsDT4Mknn9x2223R42JeAokK2hI9a9as5fKJS8KqCS3H3amaFtO+H3744f369fPvIGkRWRLJql6muNY+hcn2lvvEige1j7YSMRBMQgQ1IK2HNUlUiFmkRWTRSVpFlj8soWlcSrGHQqEnKROzSIvIopOJVaZ0bkNaC2lXE0wWLg8STOKsrX8iRMAQk/JZi8RAurjGJSGCGpCOWTXRFbQGpN2Q5q0KUTPprP+CxgW3zz333Kqrrtq7d+/hw4ffcMMNBx10UHWJN10sAokK2hL9Bm9c8Dz/zW9+06dPH/eEN5vpTpIWkSWRrOplimvtU5hsb7lPrHhQ+2grEQPBJERQA9J6WJNEhZhFWkQWnaRVZPnDEprGpRR7KBR6kjIxi7SILDqZWGVK5zaktZB2NcFk4fIgiaR76ykCt4K1K9LFNS4JEdSAdMyqia6gNSDthjRvVYj6SQGNi3yrCI/kjBkz0LigfRFreW0IjYv89X3Zys4777zcGxc0r7vvvnuPHj0uvfTSb3/722eeeWb1jDAgkKigLcmjb9t2223vv/9+7Ko1KlyyqhfQgHTMqokswS65hw48/fTT119//fzun/KudrmlX3cbciJLIlnVW1pQaxFAa5/CZHvLfWLFg9pHW4kYCCYhghqQ1sOaJCrELNIisugkrSLLH5bQNC6l2EOh0JOUiVmkRWTRycQqUzq3Ia2FtKsJJguXBwkmIYJa0EOyErikKRcpTjpm1URX0BqQdkOatypE/SSufGCfffb54x//KHr69Olve9vbOvGJC661somurq73vve9AwcOXF7FxcXF+5prrunZs+f73//+j370o7fccotbBeG0j7ZQBK0A7v522233t7/9TdoC66lkVS9cnHTMqoksMd1KBR66F1544eqrr55d/cWg
OXPmyE/AYCddcbNJpUVkSSSrepniWvsUJttb7hMrHtQ+2krEQDAJEdSAtB7WJFEhZpEWkUUnaRVZ/rCEpnEpBecIAgfGqu5znB3ELdIisuhkYpUpnduQ1kLa1QSThcuDBJMQQS3oIVkJXNKUixQnHbNqoitoDUi7Ic1bFaJmEpZcDi+//HK8iZfr36xZs0455RRcF11GRJCaGwK/+MUv/vGPf0jjsmDBgnPOOefVV19dXsWdO378+N69e6+66qprrbXWk08+6eYhnPYhCw3BpEmT3vGOd/zmN7/BrtrZCpes6oWLk45ZNXFLcHTw0D3++OM777zzl770pSOPPPLwww9Hf/bAAw/IgXNJs0mlRWRJJKt6meJa+xQm21vuEyse1D7aSsRAMAkR1IC0HtYkUSFmkRaRRSdpFVl6aK+dBTSNSylySDR6kjIxi7SILDqZWGVK5zaktZB2NcFk4fIgwSREUAt6SFYClzTlIsVJx6ya6ApaA9JuSPNWhaiZFEvOLDIDgaugaGG5bEiQ4tXWDJhZXsXhSk1c2g8++OAePXqss846kydPdqsgnPbRForMnTv3zjvvvOSSS6655pphw4ZZo8Ilq3rh4qRjVk1kibt3t9xyC3bs4osvxu2ll156xRVXzJkzR5KOapt2Q05kSSSreksLai0CaO1TmGxvuU+seFD7aCsRA8EkRFAD0npYk0SFmEVaRBadpFVk+cMSmsalFHsoFHqSMjGLtIgsOplYZUrnNqS1kHY1wWTh8iDBJERQC3pIVgKXNOUixUnHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1dcuiL0HCuvvPJmm20G7VZBOO2jLaxCHQ1mrKeSVb1wcdIxqyZuCXYjCPZQYi5JWkSWRLKqlymutU9hsr3lPrHiQe2jrUQMBJMQQQ1I62FNEhViFmkRWXSSVpHlD0toGpdS7AlDgQNjlfcRWcwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHc+fOvfLKKx9++OGu6j/qJJMQOkOQJc2KvnW4ZFUvXJx0zKpJYonet2o7NklaRJZEsqqXKa61T2GyveU+seJB7aOtRAwEkxBBDUjrYU0SFWIWaRFZdJJWkaWH9tpZQNO4lCKHRKMnKROzSIvIopOJVaZ0bkNaC2lXE0wWLg8STEIEtaCHZCVwSVMuUpx0zKqJrqA1IO2GNG9ViJpJWM4lLQJo7dNGsnPFcX6UIRDtVkE47eNbupTGJat64eKkY1ZNaIkbyh66/TSb6bZIi8iSSFb1MsW19ilMtrfcJ1Y8qH20lYiBYBIiqAFpPaxJokLMIi0ii07SKrL8YQlN41KKPRQKPUmZmEVaRBadTKwypXMb0lpIu5pgsnB5kGASIqgFPSQrgUuacpHipGNWTXQFrQFpN6R5q0LUTMJyLmkRQGufNpIdLQ6chghqH20lYsC5plykOOmYVRNaTkOrWi3SIrIkklW9THGtfQqT7S33iRUPah9tJWIgmIQIakBaD2uSqBCzSIvIopO0iix/WELTuJSCdzkEDoxV3W+A7CBukRaRRScTq0zp3Ia0FtKuJpgsXB4kmIQIakEPyUrgkqZcpDjpmFUTXUFrQNoNad6qEDWTsJxLWgTQ2qeNZEeLA6chgtpHW4kYcK4pFylOOmbVhJbT0KpWi7SILIlkVS9TXGufwmR7y31ixYPaR1uJGAgmIYIakNbDmiQqxCzSIrLoJK0iSw/ttbOApnEpRQ6JRk9SJmaRFpFFJxOrTOnchrQW0q4mmCxcHiSYhAhqQQ/JSuCSplykOOmYVRNdQWtA2g1p3qoQNZOwnEtaBNDap41kR4sDpyGC2kdbiRhwrikXKU46ZtWEltPQqlaLtIgsiWRVL1Nca5/CZHvLfWLFg9pHW4kYCCYhghqQ1sOaJCrELNIisugkrSLLH5bQNC6l2B5SgQNjlddpxizSIrLoZGKVKZ3bkNZC2tUEk4XLgwSTEEEt6CFZCVzSlIsUJx2zaqIraA1IuyHNWxWiZhKWc0mLAFr7tJHsaHHgNERQ+2grEQPONeUixUnHrJrQchpa1WqRFpElkazqZYpr7VOYbG+5T6x4UPtoKxEDwSREUAPSeliTRIWYRVpEFp2kVWTpob12FtA0LqXIIdHoScrELNIisuhkYpUpnduQ1kLa1QSThcuDBJMQQS3oIVkJXNKUixQnHbNqoitoDUi7Ic1bFaJmEpZzSYsAWvu0kexoceA0RFD7aCsRA8415SLFScesmtByGlrVapEWkSWRrOplimvtU5hsb7lPrHhQ+2grEQPBJERQA9J6WJNEhZhFWkQWnaRVZPnDEprGpRR7KBR6kjIxi7SILDqZWGVK5zaktZB2NcFk4fIgwSREUAt6SFYClzTlIsVJx6ya6ApaA9JuSPNWhaiZhOVc0iKA1j5tJDtaHDgNEdQ+2krEgHNNuUhx0jGrJrSchla1WqRFZEkkq3qZ4lr7FCbbW+4TKx7UPtpKxEAwCRHUgLQe1iRRIWaRFpFFJ2kVWf6whKZxKcV++KXAgbHK+4gsZpEWkUUnE6tM6dyGtBbSriaYLFweJJiECGpBD8lK4JKmXKQ46ZhVE11Ba0DaDWneqhA1k7CcS1oE0NqnjWRHiwOnIYLaR1uJGHCuKRcpTjpm1YSW09CqVou0iCyJZFUvU1xrn8Jke8t9YsWD2kdbiRgIJiGCGpDWw5okKsQs0iKy6CStIksP7bWzgKZxKUUOiUZPUiZmkRaRRScTq0zp3Ia0FtKuJpgsXB4kmIQIakEPyUrgkqZcpDjpmFUTXUFrQNoNad6qEDWTsJxLWgTQ2qeNZEeLA6chgtpHW4kYcK4pFylOOmbVhJbT0KpWi7SILIlkVS9TXGufwmR7y31ixYPaR1uJGAgmIYIakNbDmiQqxCzSIrLoJK0iyx+W0DQupdhDodCTlIlZpEVk0cnEKlM6tyGthbSrCSYLlwcJJiGCWtBDshK4pCkXKU46ZtVEV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMNnecp9Y8aD20VYiBoJJiKAGpPWwJokKMYu0iCw6SavI8oclNI1LKfbDLwUOjFXeR2Qxi7SILDqZWGVK5zaktZB2NcFk4fIgwSREUAt6SFYClzTlIsVJx6ya6ApaA9JuSPNWhaiZhOVc0iKA1j5tJDtaHDgNEdQ+2krEgHNNuUhx0jGrJrSchla1WqRFZEkkq3qZ4lr7FCbbW+4TKx7UPtpKxEAwCRHUgLQe1iRRIWaRFpFFJ2kVWXpor50FNI1LKXJINHqSMjGLtIgsOplYZUrnNqS1kHY1wWTh8iDBJERQC3pIVgKXNOUixUnHrJroCloD0m5I81aFqJmE5VzSIoDWPm0kO1ocOA0R1D7aSsSAc025SHHSMasmtJyGVrVapEVkSSSrepniWvsUJttb7hMrHtQ+2krEQDAJEdSAtB7WJFE
hZpEWkUUnaRVZ/rCEpnEpxR4KhZ6kTMwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqa41j6FyfaW+8SKB7WPthIxEExCBDUgrYc1SVSIWaRFZNFJWkWWPyyhaVxKsYdCoScpE7NIi8iik4lVpnRuQ1oLaVcTTBYuDxJMQgS1oIdkJXBJUy5SnHTMqomuoDUg7YY0b1WImklYziUtAmjt00ayo8WB0xBB7aOtRAw415SLFCcds2pCy2loVatFWkSWRLKqlymutU9hsr3lPrHiQe2jrUQMBJMQQQ1I62FNEhViFmkRWXSSVpHlD0toGpdS7KFQ6EnKxCzSIrLoZGKVKZ3bkNZC2tUEk4XLgwSTEEEt6CFZCVzSlIsUJx2zaqIraA1IuyHNWxWiZhKWc0mLAFr7tJHsaHHgNERQ+2grEQPONeUixUnHrJrQchpa1WqRFpElkazqZYpr7VOYbG+5T6x4UPtoKxEDwSREUAPSeliTRIWYRVpEFp2kVWT5wxKaxqUU++NGChwYq7wfSopZpEVk0cnEKlM6tyGthbSrCSYLlwcJJiGCWtBDshK4pCkXKU46ZtVEV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMNnecp9Y8aD20VYiBoJJiKAGpPWwJokKMYu0iCw6SavI0kN77SygaVxKkUOi0ZOUiVmkRWTRycQqUzq3Ia2FtKsJJguXBwkmIYJa0EOyErikKRcpTjpm1URX0BqQdkOatypEzSQs55IWAbT2aSPZ0eLAaYig9tFWIgaca8pFipOOWTWh5TS0qtUiLSJLIlnVyxTX2qcw2d5yn1jxoPbRViIGgkmIoAak9bAmiQoxi7SILDpJq8jyhyU0jUsp9lAo9CRlYhZpEVl0MrHKlM5tSGsh7WqCycLlQYJJiKAW9JCsBC5pykWKk45ZNdEVtAak3ZDmrQpRMwnLuaRFAK192kh2tDhwGiKofbSViAHnmnKR4qRjVk1oOQ2tarVIi8iSSFb1MsW19ilMtrfcJ1Y8qH20lYiBYBIiqAFpPaxJokLMIi0ii07SKrL8YQlN41KK/fBLgQNjlfcRWcwiLSKLTiZWmdK5DWktpF1NMFm4PEgwCRHUgh6SlcAlTblIcdIxqya6gtaAtBvSvFUhaiZhOZe0CKC1TxvJjhYHTkMEtY+2EjHgXFMuUpx0zKoJLaehVa0WaRFZEsmqXqa41j6FyfaW+8SKB7WPthIxEExCBDUgrYc1SVSIWaRFZNFJWkWWHtprZwFN41KKHBKNnqRMzCItIotOJlaZ0rkNaS2kXU0wWbg8SDAJEdSCHpKVwCVNuUhx0jGrJrqC1oC0G9K8VSFqJmE5l7QIoLVPG8mOFgdOQwS1j7YSMeBcUy5SnHTMqgktp6FVrRZpEVkSyapeprjWPoXJ9pb7xIoHtY+2EjEQTEIENSCthzVJVIhZpEVk0UlaRZY/LKFpXEqxh0KhJykTs0iLyKKTiVWmdG5DWgtpVxNMFi4PEkxCBLWgh2QlcElTLlKcdMyqia6gNSDthjRvVYiaSVjOJS0CaO3TRrKjxYHTEEHto61EDDjXlIsUJx2zakLLaWhVq0VaRJZEsqqXKa61T2GyveU+seJB7aOtRAwEkxBBDUjrYU0SFWIWaRFZdJJWkeUPS2gal1LsoVDoScrELNIisuhkYpUpnduQ1kLa1QSThcuDBJMQQS3oIVkJXNKUixQnHbNqoitoDUi7Ic1bFaJmEpZzSYsAWvu0kexoceA0RFD7aCsRA8415SLFScesmtByGlrVapEWkSWRrOplimvtU5hsb7lPrHhQ+2grEQPBJERQA9J6WJNEhZhFWkQWnaRVZPnDEprGpRR7KBR6kjIxi7SILDqZWGVK5zaktZB2NcFk4fIgwSREUAt6SFYClzTlIsVJx6ya6ApaA9JuSPNWhaiZhOVc0iKA1j5tJDtaHDgNEdQ+2krEgHNNuUhx0jGrJrSchla1WqRFZEkkq3qZ4lr7FCbbW+4TKx7UPtpKxEAwCRHUgLQe1iRRIWaRFpFFJ2kVWf6whKZxKcUeCoWepEzMIi0ii04mVpnSuQ1pLaRdTTBZuDxIMAkR1IIekpXAJU25SHHSMasmuoLWgLQb0rxVIWomYTmXtAigtU8byY4WB05DBLWPthIx4FxTLlKcdMyqCS2noVWtFmkRWRLJql6muNY+hcn2lvvEige1j7YSMRBMQgQ1IK2HNUlUiFmkRWTRSVpFlj8soWlcSrGHQqEnKROzSIvIopOJVaZ0bkNaC2lXE0wWLg8STEIEtaCHZCVwSVMuUpx0zKqJrqA1IO2GNG9ViJpJWM4lLQJo7dNGsqPFgdMQQe2jrUQMONeUixQnHbNqQstpaFWrRVpElkSyqpcprrVPYbK95T6x4kHto61EDASTEEENSOthTRIVYhZpEVl0klaR5Q9LaBqXUuyhUOhJysQs0iKy6GRilSmd25DWQtrVBJOFy4MEkxBBLeghWQlc0pSLFCcds2qiK2gNSLshzVsVomYSlnNJiwBa+7SR7Ghx4DREUPtoKxEDzjXlIsVJx6ya0HIaWtVqkRaRJZGs6mWKa+1TmGxvuU+seFD7aCsRA8EkRFAD0npYk0SFmEVaRBadpFVk+cMSmsalFPub6QocGKu831+PWaRFZNHJxCpTOrchrYW0qwkmC5cHCSYhglrQQ7ISuKQpFylOOmbVRFfQGpB2Q5q3KkTNJCznkhYBtPZpI9nR4sBpiKD20VYiBpxrykWKk45ZNaHlNLSq1SItIksiWdXLFNfapzDZ3nKfWPGg9tFWIgaCSYigBqT1sCaJCjGLtIgsOkmryNJDe+0soGlcSpFDotGTlIlZpEVk0cnEKlM6tyGthbSrCSYLlwcJJiGCWtBDshK4pCkXKU46ZtVEV9AakHZDmrcqRM0kLOeSFgG09mkj2dHiwGmIoPbRViIGnGvKRYqTjlk1oeU0tKrVIi0iSyJZ1csU19qnMNnecp9Y8aD20VYiBoJJiKAGpPWwJokKMYu0iCw6SavI8oclNI1LKfZQKPQkZWIWaRFZdDKxypTObUhrIe1qgsnC5UGCSYigFvSQrAQuacpFipOOWTXRFbQGpN2Q5q0KUTMJy7mkRQCtfdpIdrQ4cBoiqH20lYgB55pykeKkY1ZNaDkNrWq1SIvIkkhW9TLFtfYpTLa33CdWPKh9tJWIgWASIqgBaT2sSaJCzCItIotO0iqy/GEJTeNSiv3wS4EDY5X3EVnMIi0ii04mVpnSuQ1pLaRdTTCZWN7V1bVw4cJFixbJUFwM3bzcApmvUobYhoJacENXZ/jw4bNnzxY9b948DPUmBLeqqhcuThqYPV60aMSIESiLSWiI0aNHQ0gsgVTwteAeLl1KZyhP1EzCci5pEUBrnzaSHS0OnIYIah9taY
0HXz/+gIroZ7UIfyZWvCa0nIZWtVqkRWRJJKt6meJa+xQm21vuEyse1D7aSsRAMAkR1IC0HtYkUSFmkRaRRSdpFVl6aK+dBTSNSylySDR6kjIxi7SILDqZWGVK5zaktZB2NcFkcBJPWZy+AU7oonErFmYmTZp0ySWXHHLIIZ///OePO+64J554Aq2MC4DYhoJacEO36T333PPxxx/HEFscMGDA+9//frQXknG4VVW9cHHSADUXLFiA+s8//zw0wIY+/OEPz5kzx+biSAXSUgT7LA/Oj3/84759+2KISYlVcYPWPjWTsJxLWgTQ2qd+0t0vYI7KokWYXF7Fg0kI0rIPcivzgliyY/JQA7OLVW+NW5kBSOIWM+6OuJoyAzB0SDXcCloDCUOY0hFki4K5D6F7CrRFWkSWRLKqlymutU9hsr3lPrHiQe2jrUQMBJMQQQ1I62FNEhViFmkRWXSSVpHlD0toGpdS7KFQ6EnKxCzSIrLoZGKVKZ3bkNZC2tUEk8FJOX0PHjz4l7/85W9/+1t0DO5MPWXKFFzjV1pppXXXXXezzTbr3bv3+eefjwDO5rIWxDYU1IIbYiuyrb322gv9hFwknn322Z133nn+/PmScbhVVb1wcdIABXFh22OPPQYOHIgNAWzogx/84OzZs20ujlTQWiqg5oQJE9Cy4DHp2bPnnXfe6R4xlwda+9RMwnIuaRFAa5/6SdwF9HN9+vT5+c9//uqrr8pRXl7Fg0kI0tgutn7bbbfhAOFRFQvAmj59+o033njrrbe6R/v/b++/w+Uojvd92AQRJHKyMRlMEDkZDAZMMmAbY2OSyNEYEUyUCTYGDCbnDCJnEDnnaDIiCwkJkZFAQgGJjPH3vWdr1K5TXd0z7HD8+b3Xtfcfez3d9VR1z8xqu84eIV577bXzzz+f9y2hYMbJK8Nhw4b179///vvvD80HzSv+AQMG8O6aMGGC5OKRV0FrwPPee+/pncTIikJxDd6Vgg4ZLaKSjLNVr6K41jENne2lx6SKuzpGhzI2cJ0IV4PReliTTIVUyGgRlWinyTKheNiETuPSlPJRKPSk8aRCRouoRDszWUXpqoW0FvJRjet0J/lAB46KH/zgB7179+YM4GOdz3om+dSeYoopFllkkSeeeGLo0KGPPPLI008//cknn0hzI+mphVwt6KGsHr5xYeb7bVyoSePy85//XBoX5qVxib/RiZEKRnPhnO50eFdfffVss83GTeMgFA8EP2gdU9NJSEd5LvJoBDmPRQQwiJDnqNO1dqFVnWuuubioO++8k1tkvl2LqV/cdSK05vWOO+5g9QUWWOCjjz766quvuIqwgTfeeGPGGWecZ555uC75loUmA/Nkk012yCGHhDctzuKOfPstDSXRrbfemjpyIWeeeSYzyy+/PO9hOiS0QAXe5wjaULRMCszT+kizTgXZiSFcArQuyLlS0CGjRaRoXU3xiMuxhymotQjQOqahs730mFRxV8foUMYGrhPhajBaD2uSqZAKGS2iEu00WSYUD5vQaVyaUj4KhZ40nlTIaBGVaGcmqyhdtZDWQj6qcZ2Z9KuuuoqP6UUXXXTEiBHSmvBJvfvuu/M53rdvX37o5IP+3XffHTly5JgxYz7//PPwMZpayNWCGQKNy+OPPy76+21cEFwIjctLL70k880bFzlFeN144425PzfccIN4IPhB65iaTkIhyoXQM/3tb3874IAD9t9///0U8j0EcPBLFPr168fT1MW1dgmNy80338zbgOXyvUv94q4ToTULSeMy33zzvfPOOxMnTpTeRQw0LjPNNNPcc889fvx4NkZIGheYdtppH3300U8//VS3FzwXQn369JkwYQLvWC7kjDPOYIZ3F10RV0rrCZe3uPjii3/5y1/yNBdeeGHKXtSCyUsvvZSWfdy4cfo9bwiXAK0Lcq4UdMhoESkGDRp04oknsis0VxcuUGMKai0CtI5p6GwvPSZV3NUxOpSxgetEuBqM1sOaZCqkQkaLqEQ7TZYJxcMmdBqXppSPQqEnjScVMlpEJdqZySpKVy2ktZCPalxnJl0al0UWWWT48OF8rHNocTDsuOOOfJTvs88+THKQcArStXAMcOqHD/HUQq4WzBD+/65xAcTOO+/8v2xcWJTb8thjjy244IKsGzjkkEM4y+VLCA77m266aZpppvnxj38svyXRxbV24TgnkZrXXHMNbwPKsiLrluGI+sVdJ0JrFrr99tt5H84777yDBw8eNWqUbhdC40L3PHbsWHqX/v37Y5absOKKK9JYMxl6F2lcNttsM+pwITzx008/nZllllmGLpyrowgdCVEKvv/++/Jup86rr746bNiwt99+m36ded72slzYiSFcArQuyLlS0CGjRaQ4//zz2djaa6/Ns0j1kaag1iJA65iGzvbSY1LFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpSvkoFHrSeFIho0VUop2ZrKJ01UJaC/moxnVm0uVXRfy4yYHB5zWv119//VprrTX55JP/+te/5qOTnz45+zkXzadnaiFXC2YIdRoXjg2OGdoOgfMbXnvtNTZDmyUzFNF/OUMW+n4bF5Br53WnnXbipv3PGheBda+++mpOMpYGHhDnN9fI/SHEK49vxhlnPOWUU+Sg1emmVAynOI0LZS+//HL6APpUedzc1dNOO22PPfb485//fMQRR9ASSZ9UprUIxZlnPxz23OoTTzyRlD333PPII498/fXXaURkq+KEkIVg/7feeiurzzPPPC+//DL9BF2yPE2gmaBxmWuuueihaTsInXfeeVw+rUzPnj0R+++/PxvmybI3/AMGDKDUJptswg1hns6DS2Bm6aWXZid0JEzyzmGftGsffvjhLrvsQnT55Zdn6aFDh0qnzkIff/yxfAcp22aTbAnPzTff/OSTTzKUSYGluTlcxQcffIAN+AHg6aefPuyww3bfffe//OUvvEUpxXyZ0KpGFld30UUX7b333tj222+/yy67jB3ivOeee5jh6vizee2113JRjz76qNx8gcSzzz6bRG7ySSed9Pzzz/PImC+rf/stV8d+6AjZySuvvPKPf/yD58jDlT0Ud7ZF/feJ62wvPSZV3NUxOpSxgetEuBqM1sOaZCqkQkaLqEQ7TZYJxcMmdBqXppSPQqEnjScVMlpEJdqZySpKVy2ktZCPalxnJl0al5/85CeDBg3iZ02GU7TggORDU+DzUb6i5yOvTEsv5GrBDKFO48KiHCHbbLONbIaNLb744k899RTz/DzN3qaeeurVV1/9xRdflO2FdRl+v41L4H/cuATNmb3iiiuyNHAfOJt5KHJW8XrnnXdy8NPkceHMhHTQ2oXGRX5VdOmll9KscMO5UXfddReTU0455QwteFcstdRS1GdR9+RjkqU32GADeQvRRU0zzTSI2Wef/d5775UvUSSxuB51dbyGxoW3AWc/jTJmMYTGhY3R0/DQzz33XN4JXP4f/vAH7gNRtsrNkXP9uuuuoxRRDmkuJDQubJ5+l0lalk8//ZR5XsnabbfdiK6wwgqvv
vrq8OHDZQk2gEE361SGa665hhUpRRFZjiiwt5lnnnnWWWelAveHygcddFCvXr14czLJ67TTTnvooYdyV0PzgaaZIItbhJM73KNHD4qzBMWXXHJJrpEQM2yP14022kh+80XiGWecERKnn356nNNNNx1NDNuWpw90V6w7yyyzcEPYAx7qPPvss1yUeOT2hgcBWse4zvbSY1LFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpivyJ1fBgSjXpD2o5SIeMFlGJdmayitJVC2kt5KMa15lJv+KKK/g4W2ihhfiZjMaFz24+N+kD+Jj71a9+ddZZZ/Fj/cCBA/l8Nz/PpRZytWCGsMYaa9BPiOZHxmWXXZYPZRkG+JAFfgKed9552Sqst956fHwzySsf5Xwo8xnN3pjBH9Yluuqqq4b/quixxx776U9/il+GGUIFowM77rgj27j++uvLcddLi/2amk5CIRo010ijJicQJxlnFSexPBSuffvttz/kkEOKmzXpu41WdoHWLtxe+cZFGhd+WOfoXX755ela/vSnP3HaAT/3c17S4hCS3kUIxVmUed42bIMu6uWXX+aVZ0qRDTfcUL60k90W16OujldpXOaee27eBu+//z7vN65CDEOHDqUHYntvvvmm/E0s+cZlyy23fPTRR3/4wx+iaTvIknOdfVKKnuadd96RCzn11FOZodugO3/vvffoS9in3Ci2tPvuuxOlgnzpOKb1d7nkDSYbEGTn9G0/+tGP6DBuueUWrog6zOM88cQTZUtsj47npJNOYjj//PPfdNNNlD3nnHPoS2gj7rjjDilO1vHHHz/VVFNhW3PNNblRzz33HDfhhBNO4KLYA84//vGPRHv37k36BRdcwIz85uvCCy9kA1TjPtPB88fzuOOO4xZhPuCAA1hddv7EE0/gYQlCiy22GDvkvnEzubcUkcuB8CBA6xjX2V56TKq4q2N0KGMD14lwNRithzXJVEiFjBZRiXaaLBPSw/LsbECncWmKPBKNnjSeVMhoEZVoZyarKF21kNZCPqpxnZn08I0LLQuNCwcYH5pbb701k5xYdDN80vFpzo+A8mFXpqUXcrVghvCd/o4Ln858NPNTJgc2pxR/5PiAfvDBBzfeeGNOAvFAa9kihWj8jYs0Lhw/nBN84mfgwCMrVNP8n3zj0vqQ+Zaja84556Rr4T6wh3333VcOTg71hRdemD5GUiCkg9Yu5hsXhhz5HHj8QM994w0gf/Pjgw8+IMTJJ42L5OribIb3CYaPP/6YJoMUel+eGm8wHhm3lK1iK65HZVGKM5uLonGRb1ykcZFo+MaFaxw5ciSdwbnnnstW+/Tpw/WefvrpHM8sceCBB0rPQedNdJNNNqFxkW9cQuOCn21QIRRn6T333JPoiiuuKL9I0tEYQvy5YDnqS3Heeyy6xBJLSDfD/Rk+fPg888yDh+GHH37ITXj33Xf79evHDHvmvYefeWm51l57bfaJgTuMwEyICwHZ9s9+9jP+GLI3Qvwx5A5wM7lXBx98MDeKDTMP8l8C0h7RL0rvwoNjSyzBjwc0TyyBmXSKsIHwZ1k/CK1jXGd76TGp4q6O0aGMDVwnwtVgtB7WJFMhFTJaRCXaabJMKB42odO4NKV8FAo9aTypkNEiKtHOTFZRumohrYV8VOM6M+mmceFjlw9WaVz69u3LBz2nBd0Mn+by43KZll7I1YIZwndqXPjY5ShlY/wEz0+rbAbY5O233y4GobVskcJnt9u4kMWP9dNPPz0/jKbgwOa0wBmqaf6vGheuiI7hb3/7G6cRG+D0oonhqbFPurqtttoqNBMQ0kFrF9O4cCTzxBdccEEW4uC85557OIy5/7wTOPbkmwa3cWGSkDQu1KR3ueSSSyhCKd5I5PJ88RTX0zWrjcZlyy235DgHegiW6Nmz5/33309jJ3/fvPsal4ceeoh3yCyzzMJbS67ogQceYAO8e+UuXX311bxFMVBwyJAhvL7xxhsXXnghF9i7d2/50uj6668nhcYCQRaXRmNBiK6FPfNnEOTvFHP/Bw0ahIcQV0enTqM27bTTPvXUUzwprpHbxW1hFfm721wsu6J9fOyxx+SLmUcffZQoS7A3HgqrE3Ufn9YxrrO99JhUcVfH6FDGBq4T4WowWg9rkqmQChktohLtNFkmFA+b0GlcmsKfRgMPplST/qCWg3TIaBGVaGcmqyhdtZDWQj6qcZ2ZdPOrIn7O5szYZpttmNxjjz04M/iwky+oOR3LnBaphVwtmCHU+VVRyGIDf/zjH/loBn7AZVd80PMDKyeTrlys2hryo7D7qyLOHg4JfmA9IAE/HB900EEUJytU0/wf/qqIzQ8dOnTWWWdlA8CtoIfj7Fx66aU50sQvhHTQ2iX8qog+gxOOtwE36qKLLqKB43ylyVt33XWvvfZa5ulazJvBFGczPMrDDz98u+22W3/99RdffHE2ucACC7BtVtE/64tfhP5VUWhcxCC/KqJx4fDmofDcpXGhUWOrvGmffvrp+eabj1VWWWUVhvKWpnFB0wR8+umnp5xyCjNLLrlk+FUR7w0pzmZ4nxNdYYUVaDJMNAY/Bt6o3Jajjz6aPoAbQqPP8KijjqKNYIfMTzHFFPQu888/PxsDLv+HP/whq/Aqv4k78sgjSZljjjmeffZZ/pSRKD8eUI33M3eYGyX//AwXxR3gHUs7QoiHQqJ0RVw+y9HikEhN/jQR+vOf/0yLww2kX6Fx4dnJXzqmgvzpprK+wPAgQOsY19leekyquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwT0sPy7GxAp3FpijwSjZ40nlTIaBGVaGcmqyhdtZDWQj6qcZ2ZdPONi3zQc+RwDOy11158MvJ5ykw4bAKphVwtmCF8p29cOMloPvhZkw2zPfoGPsQ5JMzeWssWKfhT37iwivzthBR8ssvBGapp/q/+cq5odrXDDjtwRMl9WHjhhdkJhxanGiGxQUgBrV3MNy6ccDx0Ct54443cNM5gTmJOwZ133nlM67/f4RaVmao4k5yLffv2pd3Bz5a23357Oh42ueCCCw4ePJgDlZrYSAlZIm677TZWN42LGDihaVzmmWce3o3yjct5552HmcaFrcr3DRdeeCGb5J787W9/u/jii1n9D3/4g7yfWfF7/MaFzVOQdx0p3BkaHdqCmVsMHDiQ/oPNEOWSp5tuOrpq4A6st956v/zlLxG//vWv33jjDTyHHnooHq6XLGboP7hkeU+yOvAOlP6M5lu+buGJMHn++eczOeecc9KLhK+USCTKKtT805/+xFXQzTz88MPSuIQvbMJfyoHyer7L+8R1tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4TiYRM6jUtTykeh0JPGkwoZLaIS7cxkFaWrFtJayEc1rjOTrhsXPgc5lvgcDI2LfDKag0pILeRqwQzhOzUuwMcuJwEb5ohac801f/Ob3/Chz6e89hSrtobMpxoXLnaJJZbgql1oBRZbbDGOMbJCNc3/bePC61NPPTXNNNPwjIBbwQ/u
l1xyCdf1PTYuvA147pzinKm33norJy7dAGsdf/zxnH/yI7vkhuJMHnnkkdhoU3is1KTrvfzyy9mkNC4c2JRln6SELBF33nknNhoUEulFdPfwzDPP0K3OO++8ceNCa8J5zCHNWpttthnbm2GGGfbdd19KSeMiF/I9Ni74uUzeVGyVK73//vv/8Y9/sG6fPn1Yjj6GFWk42AAbpmMYMmQITcO7777LHyUEcBXsmS3J9T7yyCNMkqW/iwIWksaF9zBPhAuUb0quuuoqErlMroUbNb71F40xE+XtTWifffaRP7YPPfQQjQtO7rz80eYPV3hqgfAgQOsY19leekyquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwTiodN6DQuTSkfhUJPGk8qZLSISrQzk1WUrlpIayEf1bjOTLp8r85pLY2LfB2tGxc+Uv+/07iED26Oiqmmmoof6M1pDa1li5RU44KfD330MgmWXXZZjjF+osUZqgWYDL8qkiGvsUdTzrbQTpOlKVadFDWagjwR+fGabXArFlhgAY6l771x4W5zuzgaaReGDRv2+9//HsNGG21ElJ/vw+keissNZ1dnnXUWZ6oQ/o4Lpzg9UPj2zlwRhys2WgH8dA/69zW8RQmtscYabMw0LnKiT5gwgV6KN/Dcc8/N6lNPPTV12C0tAtHuaFxYjvvAWrSwvXv3nnLKKeliaVy4XrqoJ554Qr5zuvjii7kutkFDQ2VA0Mzxp4zWkywqHHbYYfLNEO98uTOtiy4IjYv0OlwIGxs4cCDFyb3xxhtZjt3KjcJD/4qfmyNP8MEHH5RvXNiD/hWYXEig/vvEdbaXHpMq7uoYHcrYwHUiXA1G62FNMhVSIaNFVKKdJsuE4mETOo1LU8pHodCTxpMKGS2iEu3MZBWlqxbSWshHNa4zkx4aF84VaVw4q+LGpebnHcLVghnCd2pcis/yb7/lKFp44YXZM+eT/K0O01S1li1S5Bw1jQvXgiaLKJ/jLoQAD85QTSh28O238i/nDhgwAHOwiQE4NnbZZReOtCuvvFKqiUfQTq0NxaqTokZTjUumbWIPwKH+t7/9TbqWNhYSdOMiX1TQCb3yyivcLuDM4+DkXcF5ufnmm3MAczzLD/rkSnE0z+6nP/0pW6IVkNOadodbwcw888xDY5FpXGg+aB047HlkOHkfUo0lWHq11VaTa+TGsg2coXGR9yc12SRH+Pnnn8+7onVXJvvd737H6t3RuAAb4w8Od2PaaafldfHFF3/jjTfYDE0JPQ073HDDDbkW5h9++GHm2QZ3jIvinhBlS7zSeuKZffbZeSNhkC9UuBaKFO+/b7654IILuJD555+fP5tSGQOvv/zlL7khv/71r4cPH05Zrp3cAw44gEkaxOeff54VwzcuNC7husLbVVP/feI620uPSRV3dYwOZWzgOhGuBqP1sCaZCqmQ0SIq0U6TZULxsAmdxqUp8rGi4cGUatKf2HKQDhktohLtzGQVpasW0lrIRzWuM5N++eWX86m90EILcVpwYsmvirbddlsm+UBnRj5Sw7kYSC3kasEM4Tv95VyBnRx++OF8oHNGslWZ1J5i1daQj+nv/d9xQbOBLbfckvsj/3BtmBcBrMjxiWH//feXM1XfPVOtVBGEQtRoXilIZX7WZ5UZZpiBU1OimpACWrtwLoa/nEt/wLHHqT/rrLOutdZa//jHP8444wy6Fi5qqqmmuuiii6SzCdcVinOcyz/2ynm86aab7rHHHksuuSRHO8NpppmGu/H0009zYHPTSAlZCOqQS5/HQYt5iSWWOOKII84+++wTTzzxZz/7Gc967rnnfvLJJ+Wvl/LQ5asInoLshGdKOvvhvSr/JB1stNFGHNhcFzfqe/zLuQKX8MEHH9DqcbGsdeihh7IWudJG0HINHDhwvvnmIzrHHHPw5+jkk08++uijN9lkE96QYuD12WefpQLp9BbbbLPNUUcdxU6WXnrp008/nQ3gkW9uMNDfn3DCCbfccos0czRDs802G/NUO/bYY2nLaGLon+Css86im6FN4Srk77hMN9104brI1W9FITwI0DrGdbaXHpMq7uoYHcrYwHUiXA1G62FNMhVSIaNFVKKdJsuE9LA8OxvQaVyaIo9EoyeNJxUyWkQl2pnJKkpXLaS1kI9qXKc7Ke9a+TsuiyyyiP7GZYcdduBj8c9//rP8RMuHLE7JCqQWcrVghvBdf1XExy5wXM0444ycpuxKZspwi9ayRQqnS+obF2hdvY+OSrVy8O23L7/88mWXXcZPydwfztQ77riDW4RfVhS4EM54Dq299tqLRpCjmgODXIlqp9aGYtVJUaMpJVfNCcc2+vTpE/onEBuEFNDa5cMPP5RvXGhkpRvg8Ft99dU5+ZgE+Wn+zDPPHDp0KJ0N18WdlDsfinPcckD+9re/JYuNcY5y/zk+6YMZUoEDnqOdR8w+zRVxCRMnTqQjkX8BRdIRvNJu3nvvvexKfhHD+1P+gurWW28tXy1QkKW5yVTgAdHlkBgal8w3LuyfpeUbF1ap842LQCKXTyILzTLLLPfffz993siRI9kAuWyGbomfBDbccEM6NjzU54p69uxJh8H7QTotNvbUU0/RGtIOEhUP7eM111xDOh5e6Ut4q8vdo3ckhXlWue+++3g63ByygChNP3+W6Vq4J9KuyX9VRDqNS+cbl4DrRLgajNbDmmQqpEJGi6hEO02WCcXDJnQal6aUj0KhJ40nFTJaRCXamckqSlctpLWQj2pcpztZHH2tH9w5EvjI5jTi001+ZOQnVz6IgXNIjqian3cIVwtmCN+pcWnVKzRnA0cRZ23YVfCAtmUalwyhQtAsJLeLc/Ghhx56oMWDDz74yCOPcIvEJn7gEu655x5OrMMPP/zd1r86z2GT2mqpIopVu+4haBHAHVh88cU5n9hYOaXQTq1j2Bt75unL3ySVh84ZyRHLkM7j5ptv5op4kzB88803aSDkawlJD8W54WQR5SDnECURM2VJvOmmm2688UaaYyrL3dBXBFwC8/TNvOtuu+22Y445hrt31llncYffeOMN3qLcSfl1CUc+ZzCTbIa1KCidEKUQvIFZbuDAgc8//zyr04HxDmcSPbj1j7CNGDFC2gv8LMpVUIFSRPkjEJqP1taSsBzbYD9c0XPPPcd9Y4fh0oCdsEm2R/SWW2654YYbuBvy12l5P3BjMciFcLeffPLJAQMGcJP5s0BfyGNlw9K7cEO4lmuvvZYoazHkDgMGruWJJ55gHqggv6tikkvg6VCfC2FGrpqaqesKDwK0jnGd7aXHpIq7OkaHMjZwnQhXg9F6WJNMhVTIaBGVaKfJMqF42IRO49IU+dTQ8GBKNenkKAfpkNEiKtHOTFZRumohrYV8VOM6U+l8kPHBzecgn+98tNGv8OnG0c4nJp+MfC4zz2crn7B8xJc5k0gt5GrBDOE7/aqoVa/U7IfNi4YwD8HGpX2PvypixcxNCALY2MUXX8yP0ZdffjnnJXeVOxxytVNrA6EQNVoEoDmGeTryk7TBOEuVgCJyTMrbgIfOgwCOWN4
DrEL/wfnHKYuWtibcf11czkuKYJZzFD9HKYkM0byvKMvdCFkI0RRkDxTHRqJAFrmc9+xKzntssgobkyaAzUsdylKcJfDLVuUIB2xUYCfSXsjj4BUkSjUTzYNH/uDInZGmSj8IyrI6i7ITASdLhD1jZrfcanKJyvUidOPCK/5wOXLnqcyrPCxClJXiQClySaQ4S7AlUrBlrktuvqB1jOtsLz0mVdzVMTqUsYHrRLgajNbDmmQqpEJGi6hEO02WCelheXY2oNO4NEUeiUZPGk8qZLSISrQzk1WUrlpIayEf1bhOd7J827YOLT4H+TDlU5LPRA4GPvX4mJMZPlvliJKsQGohVwt6KEu3940LicWBo75mCB4INrZtvnGRxkWGGUKFoFkRHV6FVrwk+IGzdv7559944435OZ6jiLONRYNfO7U2FKt23UPQIoR4JwHtNFkxFAkPnTcDD11OPnljcEZyfArS0+i3RCjOjHQVeDgpMZPIOcr5La8MeUdJn6GvSDQPlHRWJJ2bxhkMHPxoqnEGkyjryiqUYpLNyDshFCHEnllOroULAWz45S3NUDZfvIdaEKVU8IdLy4AHp+xW1mJ7lCrDk7ooynLt3AquQi4EJ/OkAx6y2JLcYQxyh+UucZm8soRUCLnMszSXSXH9aNDMMF9cUgspzqQULHfWFblvgtYxrrO99JhUcVfH6FDGBq4T4WowWg9rkqmQChktohLtNFkmFA+b0GlcmlI+CoWeNJ5UyGgRlWhnJqsoXbWQ1kI+qnGd+XQ+QOUjkk9DPuzMDIKhODWphVwt6KHUpHF54oknZFi/cTEajJYhO48bF44BGWYIFYwGo8NQz3OE3H///Rwz/KQLnGpcVLiH2qm1oSitimstArSO+a5OeejyxHkbMGydfcUvU9g/PQ0geFeYt4QuHorg5IgFSQGEaAw4QxYiaFmUdBJ5Uhy3vMq6sihgCxsDhOSGIszIikQlixkEyIyk6HWlmuxNlqiDZIWL0olSmRnmiXIJXAhwXdose5MihLCFO8a8ECpIiGFIRIdEkCjzxQ6UJ9wHmTeEmwBax7jO9tJjUsVdHaNDGRu4ToSrwWg9rEmmQipktIhKtNNkmVA8bEKncWkKfzgNPJhSTfpzWw7SIaNFVKKdmayidNVCWgv5qMZ11knnM47XOk7BdSJcLZghrLHGGo8//rjo7/SrIq3BaBnyea1/VRQaFxlmCBWMBqPDUM9zSHDSyE/D/KzMiuykjEUVShVRlFbFtRYBWse04UTI20C0iEAIQWVxzEI57gopIUvrAIncRpAiwGRs04Roq95/nVprjK0NdLqpZjT7l8uRYZgXEZDLdGndg+popngK7cxnuc720mNSxV0do0MZG7hOhKvBaD2sSaZCKmS0iEq002SZkB6WZ2cDOo1LU+SRaPSk8aRCRouoRDszWUXpqoW0FvJRjetsO13e2eW4K25NhKuFMJSyfNQefvjhQ4YMkeHbb7998MEHc8aLJxCyWvX84kYDBSn197///d1335X5N954g+X4qVSGGaRCrMHoMNTzLM0RxU/AtCy8StdSxqIKpYooSqviWosArWPacMYpeueayuIh0Y0yGea1FsgN6KFENXoyU9DFtbmrpNDpplqsW1dQFA+hIEJIE898J/QG8sRbTeE620uPSRV3dYwOZWzgOhGuBqP1sCaZCqmQ0SIq0U6TZULxsAmdxqUp8hGg4cGUatIHQTlIh4wWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3oofyMCPJjqGj9zUQgZLXq+cWNDkMpLsgSUI7T6Apag9FhaOZ5ZWkuR29AiJ0uhELUaBGgdUwbzm4tDkEjXB2jQxkbhGhRLlHc6FSoJibdDEvVNWS0iEoyzla9iuJaxzR0tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4T0sDw7G9BpXJoij0SjJ40nFTJaRCXamckqSlctpLWQj2pcZ8N0F9eJcLWgh/JnhjZCNy4yWTomEbJa9fziRoNU41UmRcukzGSQCrEGo8PQzItw13KdMYRC1GgRoHVMG85uLQ5BI1wdo0MZG4RoUS5R3OhUqCYm3QxL1TVktIhKMs5WvYriWsc0dLaXHpMq7uoYHcrYwHUiXA1G62FNMhVSIaNFVKKdJsuE4mETOo1LU8pHodCTxpMKGS2iEu3MZBWlqxbSWshHNa6zYbqL60S4WghDOdF5FSH9RKBl+S8hq1XPL240lLUmVRPNQjLMIxViDUaHoZkvlUdNJ6EQNVoEaB3ThrNbi0PQCFfH6FDGBiFalEsUNzoVqolJN8NSdQ0ZLaKSjLNVr6K41jENne2lx6SKuzpGhzI2cJ0IV4PReliTTIVUyGgRlWinyTKheNiETuPSFDmfNDyYUnU9wyAVMlpEJdqZySpKVy2ktZCPalxnw3QX14lwtaCHJpQhOItyieJGp0I10RW0BqPD0MyXyqOmk1CIGi0CtI5pw9mtxSFohKtjdChjgxAtyiWKG50K1cSkm2GpuoaMFlFJxtmqV1Fc65iGzvbSY1LFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJmQHpZnZwM6jUtT5JFo9KTxpEJGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ51HQSClGjRYDWMW04u7U4BI1wdYwOZWwQokW5RHGjU6GamHQzLFXXkNEiKsk4W/Uqimsd09DZXnpMqrirY3QoYwPXiXA1GK2HNclUSIWMFlGJdposE4qHTeg0Lk0pH4VCTxpPKmS0iEq0M5NVlK5aSGshH9W4zobpLq4T4WpBD00oQ3AW5RLFjU6FaqIraA1Gh6GZL5VHTSehEDVaBGgd04azW4tD0AhXx+hQxgYhWpRLFDc6FaqJSTfDUnUNGS2ikoyzVa+iuNYxDZ3tpcekirs6RocyNnCdCFeD0XpYk0yFVMhoEZVop8kyoXjYhE7j0pTyyy8FD6ZU0VdkqZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeNZ2EQtRoEaB1TBvObi0OQSNcHaNDGRuEaFEuUdzoVKgmJt0MS9U1ZLSISjLOVr2K4lrHNHS2lx6TKu7qGB3K2MB1IlwNRuthTTIVUiGjRVSinSbLhPSwPDsb0GlcmiKPRKMnjScVMlpEJdqZySpKVy2ktZCPalxnw3QX14lwtaCHJpQhOItyieJGp0I10RW0BqPD0MyXyqOmk1CIGi0CtI5pw9mtxSFohKtjdChjgxAtyiWKG50K1cSkm2GpuoaMFlFJxtmqV1Fc65iGzvbSY1LFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpSvkoFHrSeFIho0VUop2ZrKJ01UJaC/moxnU2THdxnQhXC3poQhmCsyiXKG50KlQTXUFrMDoMzXypPGo6CYWo0SJA65g2nN1aHIJGuDpGhzI2CNGiXKK40alQTUy6GZaqa8hoEZVknK16FcW1jmnobC89Jl
Xc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnFwyZ0GpemlI9CoSeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKo6aTUIgaLQK0jmnD2a3FIWiEq2N0KGODEC3KJYobnQrVxKSbYam6howWUUnG2apXUVzrmIbO9tJjUsVdHaNDGRu4ToSrwWg9rEmmQipktIhKtNNkmVA8bEKncWlK+SgUetJ4UiGjRVSinZmsonTVQloL+ajGdTZMd3GdCFcLemhCGYKzKJcobnQqVBNdQWswOgzNfKk8ajoJhajRIkDrmDac3Vocgka4OkaHMjYI0aJcorjRqVBNTLoZlqpryGgRlWScrXoVxbWOaehsLz0mVdzVMTqUsYHrRLgajNbDmmQqpEJGi6hEO02WCcXDJnQal6aUf91IwYMpVfSXklIho0VUop2ZrKJ01UJaC/moxnU2THdxnQhXC3poQhmCsyiXKG50KlQTXUFrMDoMzXypPGo6CYWo0SJA65g2nN1aHIJGuDpGhzI2CNGiXKK40alQTUy6GZaqa8hoEZVknK16FcW1jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnpYXl2NqDTuDRFHolGTxpPKmS0iEq0M5NVlK5aSGshH9W4zobpLq4T4WpBD00oQ3AW5RLFjU6FaqIraA1Gh6GZL5VHTSehEDVaBGgd04azW4tD0AhXx+hQxgYhWpRLFDc6FaqJSTfDUnUNGS2ikoyzVa+iuNYxDZ3tpcekirs6RocyNnCdCFeD0XpYk0yFVMhoEZVop8kyoXjYhE7j0pTyUSj0pPGkQkaLqEQ7M1lF6aqFtBbyUY3rbJju4joRrhb00IQyBGdRLlHc6FSoJrqC1mB0GJr5UnnUdBIKUaNFgNYxbTi7tTgEjXB1jA5lbBCiRblEcaNToZqYdDMsVdeQ0SIqyThb9SqKax3T0NleekyquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwTiodN6DQuTSm//FLwYEoVfUWWChktohLtzGQVpasW0lrIRzWus2G6i+tEuFrQQxPKEJxFuURxo1OhmugKWoPRYWjmS+VR00koRI0WAVrHtOHs1uIQNMLVMTqUsUGIFuUSxY1OhWpi0s2wVF1DRouoJONs1asornVMQ2d76TGp4q6O0aGMDVwnwtVgtB7WJFMhFTJaRCXaabJMSA/Ls7MBncalKfJINHrSeFIho0VUop2ZrKJ01UJaC/moxnU2THdxnQhXC3poQhmCsyiXKG50KlQTXUFrMDoMzXypPGo6CYWo0SJA65g2nN1aHIJGuDpGhzI2CNGiXKK40alQTUy6GZaqa8hoEZVknK16FcW1jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnFwyZ0GpemlI9CoSeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKo6aTUIgaLQK0jmnD2a3FIWiEq2N0KGODEC3KJYobnQrVxKSbYam6howWUUnG2apXUVzrmIbO9tJjUsVdHaNDGRu4ToSrwWg9rEmmQipktIhKtNNkmVA8bEKncWlK+SgUetJ4UiGjRVSinZmsonTVQloL+ajGdTZMd3GdCFcLemhCGYKzKJcobnQqVBNdQWswOgzNfKk8ajoJhajRIkDrmDac3Vocgka4OkaHMjYI0aJcorjRqVBNTLoZlqpryGgRlWScrXoVxbWOaehsLz0mVdzVMTqUsYHrRLgajNbDmmQqpEJGi6hEO02WCcXDJnQal6aUj0KhJ40nFTJaRCXamckqSlctpLWQj2pcZ8N0F9eJcLWghyaUITiLconiRqdCNdEVtAajw9DMl8qjppNQiBotArSOacPZrcUhaISrY3QoY4MQLcolihudCtXEpJthqbqGjBZRScbZqldRXOuYhs720mNSxV0do0MZG7hOhKvBaD2sSaZCKmS0iEq002SZUDxsQqdxaUr5KBR60nhSIaNFVKKdmayidNVCWgv5qMZ1Nkx3cZ0IVwt6aEIZgrMolyhudCpUE11BazA6DM18qTxqOgmFqNEiQOuYNpwNi5d/2W/S3/4TXCfC1TE6lLFBiBblEsWNToVqYtLNsFSTtNyZlqsMBVFJxtmq99+CWosArWMaOttLj0kVd3WMDmVs4DoRrgaj9bAmmQqpkNEiKtFOk2VC8bAJncalKeWjUOhJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN3FdSJcLeihCWUIzqJcorjRqVBNdAWtwegwNPOl8qjpJBSiRosArWPacDYs/u9///vrr7/mVYaC60S4OkaHMjYI0aJcorjRqVBNTLoZlqqlaVm4LSC9S5gXUUnGSShEjRYBWsc0dLaXHpMq7uoYHcrYwHUiXA1G62FNMhVSIaNFVKKdJsuE4mETOo1LU8pHodCTxpMKGS2iEu3MZBWlqxbSWshHNa6zYbqL60S4WtBDE8oQnEW5RHGjU6Ga6Apag9FhaOZL5VHTSShEjRYBWse04WxYnIP5pZde+vzzz2UouE6Eq2N0KGODEC3KJYobnQrVxKSbYalaety4ccOGDaNrkWGYF1FJxkkoRI0WAVrHNHS2lx6TKu7qGB3K2MB1IlwNRuthTTIVUiGjRVSinSbLhOJhEzqNS1OKb2O7woMpVfTfr6dCRouoRDszWUXpqoW0FvJRjetsmO7iOhGuFvTQhDIEZ1EuUdzoVKgmuoLWYHQYmvlSedR0EgpRo0WA1jFtOBsWf+211379619PnDhRhoLrRLg6RocyNgjRolyiuNGpUE1MuhmWqqVHjBjBnRk7dqwM5asX7cmTcRIKUaNFgNYxDZ3tpcekirs6RocyNnCdCFeD0XpYk0yFVMhoEZVop8kyIT0sz84GdBqXpsgj0ehJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN3FdSJcLeihCWUIzqJcorjRqVBNdAWtwegwNPOl8qjpJBSiRosArWPacLZXnGP466+//uKLLzbeeOPbbrvtm2++KQMt3JoIV8foUMYGIVqUSxQ3OhWqiUk3w1JN0kcfffQRRxzBzeF28SoHhhgqyTgJhajRIkDrmIbO9tJjUsVdHaNDGRu4ToSrwWg9rEmmQipktIhKtNNkmVA8bEKncWlK+SgUetJ4UiGjRVSinZmsonTVQloL+ajGdTZMd3GdCFcLemhCGYKzKJcobnQqVBNdQWswOgzNfKk8ajoJhajRIkDrmDac7RXnDOYwvuWWW37/+99/9dVX6DLQwq2JcHWMDmVsEKJFuURxo1Ohmph0MyxVS3OLRo0atdJKKw0fPly6FpkXQyUZJ6EQNVoEaB3T0NleekyquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwTiodN6DQuTeETwcCDKVX0FVkqZLSISrQzk1WUrlpIayEf1bjOhukurhPhakEPTShDcBblEsWNToVqoitoDUaHoZkvlUdNJ6EQN
VoEaB3ThrO94nQq48ePX2211e677z50OTsJtyZCaxECFXQRHTVOgy7oajA6FTLEFyWYdDMs1SRNkVNOOWWPPfZoXV/nV0WWVHFXx+hQxgauE+FqMFoPa5KpkAoZLaIS7TRZJqSH5dnZgE7j0hR5JBo9aTypkNEiKtHOTFZRumohrYV8VOM6G6a7uE6EqwU9NKEMwVmUSxQ3OhWqia6gNRgdhma+VB41nYRCNOjyk0YhR6AI0UGEdNA6JkTbSAGWu/baa3/xi1988cUXsnQZaOHWRIgutt6itWt7LeJsZRRoHROiRWlPg9FuKOxBXkVoxAYm3QxL1dIU+eabbz744INFFlnkzTfflJrakyfjJBSiRosArWMaOttLj0kVd3WMDmVs4DoRrgaj9bAmmQqpkNEiKtFOk2VC8bAJncalKeWjUOhJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN3FdSJcLeihCWUIzqJcorjRqVBNdAWtwegwNPOl8qjpJBSiQXP4ff7552PHjh03CfmbsF9//TWT8Mknn4SDNqSD1jEh2kYKfPXVV+utt94FF1xQHPItykALtyZCtGwVJLEcTOobxNnKKNA6JkSL0p4Go+OQLMrq8pdRWnspCLrlLTHpZliqlpaCPKY//elPhx56qFTTnjwZJ6EQNVoEaB3T0NleekyquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwTiodN6DQuTSkfhUJPGk8qZLSISrQzk1WUrlpIayEf1bjOhukurhPhakEPTShDcBblEsWNToVqoitoDUaHoZkvlUdNJ6EQDZrT7plnnvnZz362YIvll1/+tttu4xQcMWLE7rvvvuiiix5//PFffvll64Tt9sZFVkE/++yzCyywwAcffCAz7Ec8glsTIZqz/MEHHzzttNMOO+yw8ePHy+l+ww037L333m+99ZY4WxkFWseEaFHa02B0HKIRvOeee7iNJ5544meffca10Cmef/75/fr1GzNmTP7SzLBUk7TcHB7fIosswpWitSdPxkkoRI0WAVrHNHS2lx6TKu7qGB3K2MB1IlwNRuthTTIVUiGjRVSinSbLhOJhEzqNS1PKR6HQk8aTChktohLtzGQVpasW0lrIRzWus2G6i+tEuFrQQxPKEJxFuURxo1OhmugKWoPRYWjmS+VR00koRIOWc/3MM8+cYoopJptsssUWW4wjkNMU7rrrrrXXXvuLL76QFAjpoHVMiH6nFI5eBK8HHXTQpptuyt4kZHBrIkSzYRqXAw44oEePHscddxwXQtOw4oor9uzZkwZCnK2MAq1jQrQo7WkwOg5xP++9997tt99+mmmmGTBgwFdffUULtfTSS88000yvvfYa25OrFky6GZaqa4iC880339133y3zMllJxtmqXUaNFgFaxzR0tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4TiYRM6jUtTykeh0JPGkwoZLaIS7cxkFaWrFtJayEc1rrNhuovrRLha0EMTyhCcRblEcaNToZroClqD0WFo5kvlUdNJKESD5uCkcfnkk08WXXRRGhd4+OGHOUqZ79ev35133pk5VkvlEaLfKYW1gM5jySWXvOCCC/TSGrcmQjRZXNHEiRNXWmkl+pWbb765f//+nO7yvZE4WxkFWseEaFHa02B0HOJOwocffjjPPPP06dPnnHPOue2229jM559/TmcGRMUPJt0MS9U1RPqee+5JY8TVaU+ejLNVu4waLQK0jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnFwyZ0GpemlI9CoSeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKo6aTUIgGzbHHgcfrKaecMvnkk//gBz/YYostOObHjx//+9//noZGdw8hHbSOCdHvlCKbef3116eddtpBgwbpE13j1kSIlsuhITjooIOmmmqqvffemxaBmVAtpIDWMSFalPY0GB2H2I9sadNNN51hhhlOOukk9iYzIFr8YNLNsFRdQ6RffPHF8lWZDGU+j65m0MWNFgFaxzR0tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4TiYRM6jUtTykeh0JPGkwoZLaIS7cxkFaWrFtJayEc1rrNhuovrRLha0EMTyhCcRblEcaNToZroClqD0WFo5kvlUdNJKESN5rR7991355hjjskmm2ymmWZ65ZVXBgwYcOGFF3KyZo7VUnno4iKgTgor9u/ff/755zc9k8atiRBNlnD33XfTil166aVo8QghBbSOCdGitKfB6FSIPZx55pk9evT417/+VU55mHQzLFXXEJVfeOGF2Wab7fnnnzfPK4OuZtDFjRYBWsc0dLaXHpMq7uoYHcrYwHUiXA1G62FNMhVSIaNFVKKdJsuE4mETOo1LU/ggMPBgSjXpM6IcpENGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ51HQSClGjeeVH/z322OMHP/gBvQtiu+22GzVqVPGdQOssFEIKaB1jigs1U3bdddcNNtgAoZfWuDURQcu277nnHq6lb9++MhkINtA6JkSL0p4Go1MhOPvss9nPaaedVo49TLoZlioKffrpp3POOedll10mF17OZtHpBl3caBGgdUxDZ3vpManiro7RoYwNXCfC1WC0HtYkUyEVMlpEJdppskxID8uzswGdxqUp8kg0etJ4UiGjRVSinZmsonTVQloL+ajGdTZMd3GdCFcLemhCGYKzKJcobnQqVBNdQWswOgzNfKk8ajoJhajRfNZw4L366qtTTTXV5JNP3qNHj0MOOSQcgWKDkAJax+jiIqAyhRXffffdJZdccq+99pLVy1hXUjXDhhHjx48/9NBDV1xxxZVXXplznRmJ0p998MEHaEmp3FIQrgajw5C1eP3666/ll0EjR47861//Ou+88/bp04dJ2aTw0UcfPf3002Mn/V+HQq6uBkaHoZTiMrfYYosJEyaY3z0JGJgUpwhdzaCLGy0CtI5p6KyZLteiX8vAJFLFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpSvkoFHrSeFIho0VUop2ZrKJ01UJaC/moxnU2THdxnQhXC3poQhmCsyiXKG50KlQTXUFrMDoMzXypPGo6CYWo0bzyuc+Buv7660822WS9evV67rnnJKoJKaB1jCkuVKZw6F5xxRVTTDHFaaed5h5FQqom6UAiXHjhhc8+++x+++0300wzvfnmm/QuQ4YMGTdu3DnnnLP44ouH/9105ZaCcDUYHYZymn41iVNPPXXYsGEbb7zxwgsvTFM1evTod955hxt+1113rbfeeiuttBK7on2RzcuF62pgtB7CNttsw1MbPHgwNcspBQVBbg6CV5Ou0cWNFgFaxzR01kwPz1quCMrAJFLFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpSuuPfBd4MKWa9MFaDtIho0VU
op2ZrKJ01UJaC/moxnU2THdxnQhXC3poQhmCsyiXKG50KlQTXUFrMDoMzXypPGo6CYWo0SL46L/pppt69OixwQYbcCrIpCY4QeuYuDhUpnDoXnbZZZNPPrn8J0Up3JoI9n/99dcPGDDg9ttvpwKXcN1119EGnX322eeffz6H+hdffPHyyy/PPffcNC4hS4SLLu5qMDoM2QwboH+65557uKg777yTmeOOO47be+utt55++ukjRoz47LPPLr744jFjxnzyySc777zzpptuigekgq4GRush7LHHHtNMM81rr73mPjig7CuvvHLbbbf961//wmPSNbq40SJA65iGzprpPE0eNDeTRrCc6kqquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwT0sPy7GxAp3FpijwSjZ40nlTIaBGVaGcmqyhdtZDWQj6qcZ0N011cJ8LVgh6aUIbgLMolihudCtVEV9AajA5DM18qj5pOQiFqtAg+cTjRl1hiCY5/tExqghO0jomLQ2UKh2v//v1pXG688UZ3A0KqJufx+uuvP+200+6///4TJ06k2ujRo+eZZ54FFliA7oEoDBkyZN555w3/OE3lloJwNRgtQzbPWvRhCy+88EwzzXTCCSegmaGxYLjssss+//zzDNmh/JN0cO211/7qV7+SSbn2UE0wWg/x9+vXj8aF1oT0clYhBQ877LAf/OAHa6+99ldffSULSdSgixstArSOaeisk84VjR07dsYZZ6QRfOGFF7i9xU2chL6Ngi7u6hgdytjAdSJEy3dgshleIXgg2L4TmQqpkNEiKtFOk2VC8bAJncalKeWjUOhJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN3FdSJcLeihCWUIzqJcorjRqVBNdAWtwegwNPOl8qjpJBSiRosA9LBhw+S/Hy6nFMZZKo9U8VJ5EOUz/ayzzqJxeeihh9BlIMKtiWDPH3/88dChQ+UYa50R344YMWLUqFGcZMDw9ddf/980LnJ8fvjhh++++24YwnvvvSf/wRRIFoL5k08++eKLL8YGEgrVBKPDsFXp2+OPP54j/JlnnpFcl8MPP5zGZa211ho/fvynn35K+8JaZUyhixstArSOaeisk85lchvXXHPNVVZZ5eWXX6ZP5U3LbXzqqacuvfTSf/3rX3Ifwt3QxV0do0MZG7hOhOjnnnuOJ/vII4+wGdlP8ECwfScyFVIho0VUop0my4TiYRM6jUtTykeh0JPGkwoZLaIS7cxkFaWrFtJayEc1rrNhuovrRLha0EMTyhCcRblEcaNToZroClqD0WFo5kvlUdNJKESNFiFw6svBX44V2mmyDG7xyhQ+1k877bQpppgiHDwuqZqtQ7+g1SGUfwEiIAX/N40LsAFuY3gVAXozgY8++qhfv36cvmhCEtXVwOgwFPOpp5465ZRTPv744zLpIo3LGmusMXLkyDFjxtC7sKsyptDFjRYBWsc0dNZMZ/MTJkygVeVaxo0bJ43LtttuO9lkkx166KHmWyVd3NUxOpSxgetEAE+nb9++bOmAAw6QdwIzwQNiKwe1yVRIhYwWUYl2miwTiodN6DQuTSkfhUJPGk8qZLSISrQzk1WUrlpIayEf1bjOhukurhPhakEPTShDcBblEsWNToVqoitoDUaHoZkvlUdNJ6EQNVqEUBybLcqxQjtNlsEtXpnCoieffHJ7jYsIkyXDMIkYMmTIPPPM8z9oXPRO0JowI60Mm6Fde/fddznV5KwVj64GRoehlDrllFNoXB577DHJjWFeGpfVVlvtrbfeGjFiBKc+p/uXX345cODAY4899ogjjjj++ONvv/12+T8opeoA0dGjRz/yyCMnnXTSkUceSe4TTzzBVYSvcMxWRRB67rnnnnrqKZqMYsctmBw2bBiT8j+4BibZ29NPP/3yyy8zHD9+/LXXXssqrMW2xSCwc673gQceeO+997Cx7SeffHKDDTagS9huu+0efvhh3kVvv/22pMg2Qvqrr76K+f3330fL9mSeS+Bannnmmc8++0wmWeWVV17hzhx11FHnn38+a0kRogiQp8ZCV199NR7gabIxGilaQy5ko402Ykt9+vRhS8yHq+CV5d55551LLrmErKOPPvrmm2/mxkrBYjct5G4MGjSIt8obb7xxwgkncCuoTKi18QL3hgM6DI0WUYl2miwTiodN6DQuTZF3j4YHU6qub3pIhYwWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3ooQllCM6iXKK40alQTXQFrcHoMDTzpfKo6SQUokaLAK1j2nB+1xQ+lCeffHJpXFK4NRGu1nBUDB48eL755uNQkRnXFgjRVj2/uNGpkIv0DRdccAGHN2cVZz+Heji0MtVMCNyJYMIAAHcKSURBVKRxefTRR8txBNcuf8fl5z//OecfZ/bYsWMnTpy41157kcg953DlFcNVV10lLQhZ8UJwzz33yL9VSIspifpfKMagU4KmJlmk3HfffVyjOBFbbbUVix500EFo4aKLLsK28sor09AsuOCC1Je1fvjDH8rfVZJc9j/DDDP06NGDVuOTTz6h0enVqxf7CbuCY445JlyLgKbC7rvvTpT+hm5ARylF+tJLL01BQvQHu+2229RTT41Z7tJss83GHkIWghbn4IMPZieyriwt//Qzb7aZZ56Zq5MQAv7+97+TxR7YGM0iiZIir3TV1113nd7z2WefTWittdY655xzxMwePv7447CH1iNybjjokNEiKtFOk2VCelienQ3oNC5NkUei0ZPGkwoZLaIS7cxkFaWrFtJayEc1rrNhuovrRLha0EMTyhCcRblEcaNToZroClqD0WFo5kvlUdNJKESNFgFax7Th/K4p/GjLR/NDDz2kP5QNbk2EqzUUlH9k9sMPP5QZ1xYI0VY9v7jRqVAMm6Fv2G+//dZZZ51tWmy44YbvvvsuFy6GTDUdKg6Kb7/lB3FOR/n/TMm8gXn5xoXGhT7pvffe4+C/5pprOCznnHNOfugfOHDggAEDDj30UIpwGNNRSSm9rnDllVeuscYaPKkHHnjgmWee2WWXXege4LnnnqP34kAtfS1COofxXHPNxSbvvPNOGgL5zQ5svfXW7OHAAw+cMGECk9j69+/PDG3K7LPPPvfcc//lL3858cQTF198cXJXXHFF+aKIuzR69OiZZppJGhcmP/jgA472lVZaidw111zzqKOOOvbYY7kWaUHkrnJFCIaPP/44zRbvhDfffJM9sw3mYfvtt+ftRyKPhpuw6667suhyyy3Hnp9//nmaGIbcruHDh3N/pBQeehq28dvf/vb2229/5JFHaDV4rEOHDmVLaG44W1pttdXYEo3Ugw8+KJdwyCGHsNaMM85IK0OLRj+34447MtOzZ88bbrhB7j+ceeaZTHIfpp9++hVWWOHoo4/mzn/00UfcK7nVrfdC9fvEaBGVaKfJMqF42IRO49KU8lEo9KTxpEJGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ5GCcfdvJBzFA++wQxuFoEaB2
TcrY+84sfZJkUZJIZMQgyrwn7BLmKk08+mY9pjiI0hlakAKeYZbKVXRB0y+VoDeclR8LFF1986623ciyVs2mKq5r0uw/ZntFiKN3RulrHkDhixIhLL72UpuGyFhx7csSKwa3Ga2vN/3Z1spnTTjuN4zPzKzbmQ+PCmUrjMm7cOHoC7vZOO+3ETpgBzloOxfHjx3OcS6lwsQFaAcAzatQoEmmDFlhgAc7mCy+8kJoksr3SqtK5NHoRDv6bbrrp448/Dv3ElltuSS4nPdVk3fPOOw8brcCSSy5JM8QS7IqnxgXOMccctBp0FRz8FJlhhhmYpDsZM2YMkzQEffr04Rr79u376quvcpnyxZI0SbIN4Low0+Jw7dw32TMGzOyQmi+99BLFWXqaaaaZeeaZac64J2zj7bffpiUi6/TTT5fmg4ajV69e7J++85133qECrWe4jewKdthhB7ZEUzJo0KA33niDOqyIoBHhGulsRo4ciZ8scumcqLbMMsuwbXbFLWIt0qeeeurNNtuMW40TP8W5ge4zMjoMjRZRiXaaLBOKh03oNC5N4Z1h4MGUatJnRDlIh4wWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3ooQllCM6iXKK40alQTXQFrcHoMDTzpfJwnXwQiwgQClGjRYDWMSln6yS1bQoznDFrrbXW2pNYZ511StWCE4tPf2zil4KhcZHJgHQtYna3gXC1QYoEylmP0qFaBJnXmqhsTIY6BFrHSK5OF5gR4VbDLMgkyHz4S806pGFeflW06qqrcqJzRtIlHHPMMWTNPvvs55577pAhQzg7OValq5BvFEgMewiC05RWgPTRo0dziJLVu3dvKp911lkffvghPUG4BAhZHPO0BTzc6667jtOXE50lcEqrsffee7MlehfSzznnHGyzzTbbfffdxwFPu0DooYce4vBmkuOfXDbA6pz97B8b22A/vJ14U1FNGhfpEqQDCLdF9sNmjj32WFqEVVZZhW6DRbko7iG5m2yyCTMk/uMf/8CwxhprcIFvvfUWb2Z2Iq3eVlttxVbpff/+978z7NmzJ10OHpD2hf2wPZYGGhfq8CqNi2yVe8XkQgst9Prrr5MiWdyWe++9d9ppp6Vheuyxx9gVt0h29ZOf/GTgwIFyN3CGJk+uKNxkMDoMjRZRiXaaLBPSw/LsbECncWmKPBKNnjSeVMhoEZVoZyarKF21kNZCPqpxnQ3TXVwnwtWCHppQhuAsyiWKG50K1URX0BqMDkMzXyoP4+SDg8/lJ5988q6u3N3C1SJA65iM88477+RTlaVlG5wTnATDhw+nWQmsu+66pWrBD6mcNBxdkiJXccopp3AUsXmZDFCZc+Kee+5hLXcbCFdr2OQdd9whQl5dm4ATAwdP+CCWnWjNlu6//36pCWbdTHGBRFmlHE9aVLRbjSiHlnQGsgfZDD+XT9n6r4rCvIF5/Y0LrQDHP4fxoosuygnKPV9++eVPOukkzk66FvmKwpRiIWaAhyvPgqN9t91246SfccYZKcIRS9lx48bx9itzJm0PeD/86Ec/4pi/6qqrOH051+V7Dmk19tprLzbDoc5bgi6KmeWWW+61116Tc5024vnnn6dxmWWWWV5++WVpj7gP0rhwZ+gteFJMyt+Y2X333enD2AxN2KeffsrS4VpkP8wMGzaMFoGeg14BG+uuuOKKVLv66quljdh88825qBlmmGHZZZddZpll2A+3iEtgUn6pR+uwxRZbcEUkUo33P4nsjTvAPWQzLA0777wzW9ppp53ktrNP7h7XS50111xT2h1uO3vgiVCEPpKaF154IbcI56mnnko6DZbcDSpI10KIq5ArCjcZjA5Do0VUop0my4TiYRM6jUtTykeh0JPGkwoZLaIS7cxkFaWrFtJayEc1rrNhuovrRLha0EMTyhCcRblEcaNToZroClqD0WFo5kvlYZx8RvODGqfUdl3ZvoWrRYDWMXnnTTfdxJEm20BwJvHKTjjGBD5nS9VCfqYHSZGryDQuH3300R//+EcWcreBcHWMDmVs22677Y477sj5x+rsTbZnNEeL/A8phdaytYqnMOlmyCs/uL/44ovhpoFs5js1LuHvuHC4Dh48+Pe//z0/4hPiKF1qqaWeeeYZ+SUFWeFKgQo8U+affvrpJZZYAjNHLEfvRhttNN1005HOEcsZzAHMky1zVAWeuDQuV155JY0LT5P2COfWW29N7p577kmbK43LOeecwwztAnujm8HJOc2xTeMy66yzvvTSS2Jjnq5CGhf6DGkIpBptQajGuuw8bAMB3EBaCvoP9tO3b19aAZpCbsIiiyzCJbA9WqWNN96Ya5xtttmkZVlhhRVoUADNe0Mu4Q9/+APLrbbaarxP6DnoWliRi+IuCehddtkFD69siRTWZXLXXXel+K9//WueBe0Om+ducNsxyF3iDwJ74MIRpK+88srcDakg38RQXJ61XJFcHRgdhkaLqEQ7TZYJxcMmdBqXpvDmMPBgSjXpM6IcpENGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ5GCcfyhwwUE5NglBwGi0CtI5JOYvuo0U5bkXZAD9i/uQnP1lQsYBi7bXX5nM5ZEnB1K+KsHECidndBsLVMTqUsZnLCU6t5SbrSwgh0LomJt0MeWUtFjV747WNXxWNGzeOQ5RXDshHHnmEDkz+oxXOaRoyudutLZR7QLD0Bx98wDPFtvvuu1OHNoX0xRdfnMocsfQZ8lWKpEBI56CVv+Ny1VVX0Rxgk8ZFflXE6hz8lPrkk0/OPvtsZmhcpMHiTUJBojQW0rjI10J0CdNPPz07oeeQPoyCoXFhCdoRWhm5IfoqgElqXnHFFXR7c801Fxey1VZb0UkceuihdCRUY3s777wzMzQ3r7zyyqBBg9gMNblAzPI9EB5Zbskll6SvIiq//zIL0W3j4ZXK3C75NmufffZhkiaS6+JyaHe4P5i5A3PMMQcXJf/pNU+HPxE4f/azn7GuzLCEXJQgC5UDdaWgQ0aLqEQ7TZYJ6WF5djag07g0RR6JRk8aTypktIhKtDOTVZSuWkhrIR/VuM6G6S6uE+FqQQ9NKENwFuUSxY1OhWqiK2gNRoehmS+Vh3HyoSaUnyKTEIOrRYDWMRlnWJF5CTHk52A+/TlFOJYETrsAhwQHJzZJkavIfOMi9UGGQtAtl6NjdChjK65HrWiQeekhjE00aF0Tk26GIsxyct++6zcu8psO+SmfI5+zk3P3wgsvnHrqqXv16sX958d6ro5E/e6iyJ133slxTqfCyc1P/xy0nKZLLLEElU899dThw4fTLlDWbA/kGxdyWUVyefr0AfPOOy+5e+65J0e4/saF/oko3Qk2DnUqy99xefnll3lfyTcu+ldFnOisu80225AbGhdm5HaFbSCAS6Mmy8l/6HTSSSfNPPPMs8wyCx0ziVwU1eS/Q2bPjz76qHz3w00DNi9fyXDf6Be5oh49etxxxx2EyOIy9SNg9V133ZUt8SqNCzecpbkJTP74xz9+7r
nn2Ibcbcx0kFwmPPDAAxSkE2JvOFdZZRWemrQ48lwCckXloOvz0iGjRVSinSbLhOJhEzqNS1N4Fxp4MKWa9AYtB+mQ0SIq0c5MVlG6aiGthXxU4zobpru4ToSrBT00oQzBWZRLFDc6FaqJrqA1GB2GZr5UHjWdhELUaBGgdUwbzu+akvrGRePWRLg6RocyNjDR559//h//+AdnmAzlRDTVzLBUtTHpZliqriERbXzjQuPCUcrR/tlnn3Hicnxy23v27DnDDDNwfHIqy5cHeiGOzKuuuooOab755hs0aBDp9BBPPfWU/LMuxxxzTGhcwjZCOrlLLrkktoMPPpgmgLZjyJAhK664Iqc+u6Kp5WA237iExoVcaVxmnXVWaVywUWG66abjqu+66y5pGugJ5G/CbrbZZuTKTsjV20CIZp4L32233Xi/yd/R2XDDDaU7IZFF6RKk09piiy146MyzIowePRoDy9FtsGc87JYLeeGFF+SmUZn2hc1wE+hR5F/O/e1vf0tx+ToKA3ebVomleShcNaVwkvWHP/yBK6KaNJesEr5x4Q7oxkUuB8IVCamQ0SIq0U6TZUJ6WJ6dDeg0Lk2RR6LRk8aTChktohLtzGQVpasW0lrIRzWus2G6i+tEuFrQQxPKEJxFuURxo1OhmugKWoPRYWjmS+VR00koRI0WAVrHtOH8rimpb1w0bk2Eq2N0KGODEEVwriy++OKcQPPPPz8/N/PpzLGkPYA2w1LVxqSbYam6hkS08ZdzORR33XXXpZZa6tBDD6Udueiii5Zeemlu/sorr8ypScfAuR5OfYFLfvHFF2lusNFYHHXUUdttt930009PS8EMhys9hCTKzYGwT0rJv1OC/09/+tP2228/00wzzTXXXBzS3NXdW794Mt+4vPXWWzQKUo1Tf5ppppltttleeeUVOb/pHijFVYdvXOgJ5B8wpKHZd999L7/8csx0A6Tr2wXMcF00GY8++ui0005LClxyySXyrYaUoum59NJLuVhCXCzvzCuuuOKss87aaqutaLOIUpm9XXjhhWwMD2+MI4444rzzzuvXr98yyyzz8MMP854B+Y+kuEV77bXXlVdeyQ2kOIl0JExONdVUO+20E/eftdZbbz1uI13UjTfeKG0KO5G/47LKKqtwB6TF0ZcDckXlQN1w0CGjRVSinSbLhOJhEzqNS1PKR6HQk8aTChktohLtzGQVpasW0lrIRzWus2G6i+tEuFrQQxPKEJxFuURxo1OhmugKWoPRYWjmS+WRcvLRLKdOEKJ5xRacOkXrGOOUsuGI0rRdvE7jArK0hkm9aNAxOhR0KCIC0DrKZe63335rrrnmAgss8Nxzz8nJFzyihaAze0gRqumCgl4ILUPRvOYbl1a9snFZbbXVwt+W+Pvf/y7/tGtxbk8+OQfwOuus89BDD3FG0jHoX+QBFRhy6B5//PHyOxpSOGX79u07YMAADng2sNBCC73xxhvy/UGZNglyWXeFFVaQRFh00UXvuece+futFHn99dflGxdpXHC6jcvLL78cNy7Sh9FM4P/Vr37FEjRDQJ8hX7qUm5h06+SGsMmxY8fSb+Hs3bv3wIEDpWOTX07R1rAZup8llliCVUJNFt1zzz3xUBnGjx9Pz0fXwhWxbcAzzzzzPPXUU9writDm/va3v+3Ro4dcNe9wsrioMWPG0LvgxC9ZrEKHdMMNN3Cx77zzDheITf6rItpN3biUF9NCrqgcTHo/CDpktIhKtNNkmVA8bEKncWmKvMU1PJhSTfqMKAfpkNEiKtHOTFZRumohrYV8VOM6G6a7uE6EqwU9NKEMwVmUSxQ3OhWqia6gNRgdhma+VB4pJx9wfGSD6ABDbMGpU7SOkWgoIsX5iG8Fu9B28cpfFbF0WFQ0tHZUfJqLBxF0jA4FLRVACkr9EC3Ktf6WBscVhxBnM5OYJSRacuUVQug7EapJEYTMi+ZVhsVu1N54zf+qqFWs+Hu1nKZPP/00R+D777/PSc/BPGTIkDvvvJMf9y+44AI6AH7Q5+o4NTlWOV+pZhbiztA0sBA9wfnnn0+rRFn8d9xxB0fsTTfdREtEWfddwSlOh3HFFVfwlK+55ppBgwYxfPbZZ1mXOqxL00Bx9saQ5pXo6NZ/Esw2uPPs/NFHH6Wbkb8UAuQ+8sgj9EPSysjXJPQu119//RlnnHHZZZe99tprNBby6xvZQ3HjJl0RZUlhG/feey915C/YSgMkdxvBfaD+bbfddt555/Xv3//mm2+Wv5VCWTZGZTx0P6+++up111139tln08RwOZSi7eA+sG2g4aAd4Y5dcsklsiVy2T9XRyK3gjt58cUX08axAa5a/g4NNrbH3eBWcKWhjQvXIugrAqPD0GgRlWinyTIhPSzPzgZ0GpemyCPR6EnjSYWMFlGJdmayitJVC2kt5KMa19kw3cV1Ilwt6KEJZQjOolyiuNGpUE10Ba3B6DA086XySDn58OU4kb+TwYcdH4vyX9KGTxax6RStYyTKJzs/E0sdPnzRFBdDoO3ild+4cFSce+65HAZyUXzisx806EWDjtGhoF966aW77rpLjgSOlttvv11Hi3Kt3y+wMX7aDm2ThBAcaZxSMnnhhRdywEiolf0dkBSpw5HGeSmafoKegHXDoqG4iPw3LmQBpyCnO3vjLOTIZMMcuhyEaI5Jzn7OS6ALkQaCRk1Xk4WYIevjjz+mQaEB4jSlGshfWWUo35qwlmQFSOT28oahOE6uqLVa8Y/LBUFZ9gM8U2z0B/QElJLNU5bNU19scjlods4loLk/7BkbiexHNkPnIRVkG607V946tiR+1sJMinQbcuES5Z1GBblAdshVUxM/k4QwsCibkRvCVYgNP1tl87JPLoctyWWSLvO8k9kzO2/dv/dJB8kVD4kUxyY75DKlm2Rjsn9BXxEYHYZGi6hEO02WCcXDJnQal6aUj0KhJ40nFTJaRCXamckqSlctpLWQj2pcZ8N0F9eJcLWghyaUITiLconiRqdCNdEVtAajw9DMl8oj5eQDbu+9977llltax9a///rXv1555ZVMorEFp07ROkaifDSvttpqUpNPUjSfrWIItF2cn8VD42I+oAU+uH/+859zErA6ByGr81kvTr1o0DE6FPQVV1xx8MEHy0XRLnDfdBRYgrvH9jiuZBgWZRtrrbUW5wq5TK699tqvv/66hIrM74KkUIRSdCp//vOfpeaDDz64yy67oIMtFBdxWvaf/CdR4O5xCvKjPEcmmnOXExTNc+Ro5NjmuOUMxqMPeyGsKMc5Nvycx5ICnMG8cuLSneARc4CNcaM4iYliI5HjmQosTYq8UlZ2JftkY8yQIrnSqbB5JnnLYZMZCvLKEA97RpDLZijOQvK9RbiW1p3773NhXhqI4gLUPxksUdaVvoR5qgEXy42SsmySFWVvsmGJyp3kithY2KdUwMA8mhkqE+JaKMVuJQvQGCjINmQDmPEwSUGGsreAuSKjw9BoEZVop8kyoXjYhE7j0hTeOgYeTKkmfUaUg3TIaBGVaGcmqyhdtZDWQj6qcZ0N011cJ8LVg
h6aUIbgLMolihudCtVEV9AajA5DM18qj4xzr732uvnmm0XTuFx++eXyaYgtOHWK1jESlcZFZjhsaCP4dJZhoO3i8qsi+ebAhQ90VuRzHM2Bseqqq4bGJdTXOkaHgubOHHTQQaJvv/127htCO7lvHC1yUJkKnCXSuMiQxmXIkCGiW5bvgE654YYbZBsgjYtowBacomlcevTokblvAlfBbnleIMc8r2jOZg5F4ByVMzX/PiELm2RJtyEwo9NBZ4FsAA9Z5Iof2AOnOMiuwiaZCYmyVZD6DAFDmAlOhmyGJYDiOtq6oC63jhBFZBs4qRmivMpCLBFuEaCZ0U4pIoti4BUdLgfwMyO5spAUR8TFJVGKA5oZPSl7E4pr6DosVdeQ0SIq0U6TZUJ6WJ6dDeg0Lk2RR6LRk8aTChktohLtzGQVpasW0lrIRzWus2G6i+tEuFrQQxPKEJxFuURxo1OhmugKWoPRYWjmS+WRcfIj+y233CL6b3/72xVXXMHHChpbcOoUrWMkSqOw+uqry8y4ceNoYvhIlWGgveLsTf7v0PJ/OZatGvjQZ0V++kTzuU8T895776FJD/W1jtGhoOUbF9F33HEH9w1hirQOoC5fQgAejhOalREjRshwnXXWafiNi3DjjTfKNuChhx764x//KBqwmeLyKza5b+VUAgxcRbAVd3nS8QwSAonqhcyKxb1onbuSqJEiYjNZAobgDOh1ETIjQ2gFC1Lz5bgFw1bJchUdbV3Qf69IdKtAkSLzQgjJa6gG6FZGWTY4iyVbtmAA8SAkBGE+vDIZ7qTktpL+CzN6XlYUimvoOixV15DRIirRTpNlQvGwCZ3GpSnlo1DoSeNJhYwWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3ooQllCM6iXKK40alQTXQFrcHoMDTzpfLIOPONCzrAZHjVH7saKd6tjYv8qkh+5QFlTFGncZFcA2a5Lm0QP+QblyBiCHHYVDYu5XotZBsuOuU7NS5nnHFGjx49HnvsMYqUU98R2QCU4xZ6Ib1i0GVOVyQk6KxA6avhzJNKyRRHuDrGhKSmUE5NIjglFAy6gvaYeSiKds0NwiWuUA7SIaNFVKKdJsuE4mETOo1LU8pHodCTxpMKGS2iEu3MZBWlqxbSWshHNa6zYbqL60S4WtBDE8oQnEW5RHGjU6Ga6Apag9FhaOZL5ZFx5hsXBD/nyX/XAFdffTWvjz/+OJMfffTRddddx/DWW2/9fNLv8iWlWxsX+eYg9Xc1oLJxIZHd3n///VyOwFUMGzaMyVdeeQV91VVXcY3smfSiYovKxsXVgK7TuMAjjzxSbqi1pddee409DB48WLbEJVNHp3ynxkX+jkv9xsWka3Rxo0WA1jENne2lx6SKuzpGhzI2cJ0IV4PReliTTIVUyGgRlWinyTKheNiETuPSlPJRKPSk8aRCRouoRDszWUXpqoW0FvJRjetsmO7iOhGuFvTQhDIEZ1EuUdzoVKgmuoLWYHQYmvlSeWSc+cYFLT3KL3/5y9a/QPGDhRZa6K233uJE//LLL/khfo455pB/R0u+JJDi3de48Hr66adL44J2z+A6jQsbfuGFF+aff365qI033nhC698bZbdbbrnlSiutxCWwZ2xFxRb/g8Zl0KBBiyyySPFPeUw++QYbbMBm2MD48eN33XXX5ZdffujQoeEOC9/1V0XyXxWV4ypMukYXN1oEaB3T0NleekyquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwTiodN6DQuTeHTxMCDKdWkD9ZykA4ZLaIS7cxkFaWrFtJayEc1rrNhuovrRLha0EMTyhCcRblEcaNToZroClqD0WFo5kvlkXFW/uVc+daBA69Xr16c8XQq8ldQmb/44osPOOAAtNgki9du/cu5Z555Jo3Lgw8+KJMxn9f4y7my29NOO40rmmyyyegkQu/F8U9DgJZhSMn/5dxWbUcDmuI1/3Ju//79uTq2tPLKK9M5sX/YY489rr76arMfqPmXc4UTTzxxqqmmevbZZ8txFSZdo4sbLQK0jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnpYXl2NqDTuDRFHolGTxpPKmS0iEq0M5NVlK5aSGshH9W4zobpLq4T4WpBD00oQ3AW5RLFjU6FaqIraA1Gh6GZL5VHxpn/xkXmERyfv/nNb+RMPffcc+Vc79Onz+DBgxFiA0np1m9czj//fLZxxx13yCeghDT1/3Luxx9/LP8LPXqyZ555hmukJ/v1r39NhVA5pHT3Ny6syG4nTJgw33zz0U5NPfXUjzzyCIljxoxZf/31P/vss9bltvONC1m8HnjggdR8+eWXZbISvZBBFzdaBGgd09DZXnpMqrirY3QoYwPXiXA1GK2HNclUSIWMFlGJdposE4qHTeg0Lk0pH4VCTxpPKmS0iEq0M5NVlK5aSGshH9W4zobpLq4T4WpBD00oQ3AW5RLFjU6FaqIraA1Gh6GZL5VHxlmzcWHynnvumXLKKTlTl19+ec7XIUOG9O3bV76oEBtISrc2Lpdffjl7uOiii9iSXjpQv3GhApdMKwZcC+ZLL72Utox2IVQOKf+DxgUbr8ccc4xsafvtt2dmwIABJ510koRAp9RvXGDPPfecZpppeGoyWYleyKCLGy0CtI5p6GwvPSZV3NUxOpSxgetEuBqM1sOaZCqkQkaLqEQ7TZYJxcMmdBqXpsjngoYHU6pJP7eVg3TIaBGVaGcmqyhdtZDWQj6qcZ0N011cJ8LVgh6aUIbgLMolihudCtVEV9AajA5DM18qj4yz8ldFonmlWVlhhRU4UHv06EHKEUcc8eijj8oZLzYQZ7f+qujKK69kD2effbZMxtT5VVHQHOS9evWi4Mwzz4xtu+22GzlyZHxF8L/5VRFLv/HGG7PMMgtbmm666YYOHbrjjju+//77ZbhrSv1/x4X7sPPOO9O4hP9zdSUhPaZV+7/FtRYBWsc0dLaXHpMq7uoYHcrYwHUiXA1G62FNMhVSIaNFVKKdJsuE9LA8OxvQaVyaIo9EoyeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKI+Os+Y0Lr5yp/fv3l78UQgezzTbbfNb6f8SIXxBn933jQgfw/PPPzz///AceeGDrA/C/Swe+0zcuFNx88825Iq7rt7/97d///ndSQGwQUrr7G5cAq++0005sCX71q19xpe5+4Dt948I9WXPNNWnL5MsbCWXQCxl0caNFgNYxDZ3tpcekirs6RocyNnCdCFeD0XpYk0yFVMhoEZVop8kyoXjYhE7j0pTyUSj0pPGkQkaLqEQ7M1lF6aqFtBbyUY3rbJju4joRrhb00IQyBGdRLlHc6FSoJrqC1mB0GJr5UnlknDUbFyY58D7++OM55piDA3XKKac877zzmDGnoKR0X+PCEc6KW2655UYbbYR2D+Dv1LhQ7d57751qqqm4qF69er366qumbEj5nzUurP7YY49NM8009FK8yl+nLWNdU2o2
LlwR92Teeec9++yzadTYjMzn0QsZdHGjRYDWMQ2d7aXHpIq7OkaHMjZwnQhXg9F6WJNMhVTIaBGVaKfJMqF42IRO49KU8lEo9KTxpEJGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ5ZJz1GxfOPzjggAOmmGKKueaaa/To0QyZF78gKd3auLDcySefvPjii8v3PWVM8V0bF/yrrroqjcvGG2/MPvXlQEj5XzYubAkPW1p//fU///zzMtBCp9T/xmXo0KGzzjrrE088wU7MBabQCxl0caNFgNYxDZ3tpcekirs6RocyNnCdCFeD0XpYk0yFVMhoEZVop8kyoXjYhE7j0pTyUSj0pPGkQkaLqEQ7M1lF6aqFtBbyUY3rbJju4joRrhb00IQyBGdRLlHc6FSoJrqC1mB0GJr5UnlknPUbF+G1116bccYZ+/Xrh5aoRlK6r3GRPbz44os9e/Z84403mjcuAldNN3bnnXeiJSo2CPp/2bjgv+mmmyaffPLrr7/eXKNOqd+43HDDDQsuuOCYMWPkAuugFzLo4kaLAK1jGjrbS49JFXd1jA5lbOA6Ea4Go/WwJpkKqZDRIirRTpNlQvGwCZ3GpSnlo1DoSeNJhYwWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3ooQllCM6iXKK40alQTXQFrcHoMDTzpfLIOGs2LrxymsovGvbcc88XXnjBPQLF2d2Ny8SJEzmGr7vuuuaNCwb4+OOPt912W8qiJSo2CPp/3LiMHz9etoRmpox1TanfuNBobrrpppSSoczn0QsZdHGjRYDWMQ2d7aXHpIq7OkaHMjZwnQhXg9F6WJNMhVTIaBGVaKfJMqF42IRO49KU8lEo9KTxpEJGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ5ZJz1Gxc51AFD6vATZ3c3LrDbbrvtvPPOiDKmaOMbFyj6l0kXGGwQ9P+scYGwk3JzzRoX2s0VVliB/ctQV8ugFzLo4kaLAK1jGjrbS49JFXd1jA5lbOA6Ea4Go/WwJpkKqZDRIirRTpNlQvGwCZ3GpSnlo1DoSeNJhYwWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3ooQllCM6iXKK40alQTXQFrcHoMDTzpfLIOOs3LoLWMRLtvsalVP/5zwMPPLDYYouNHTu2HCvqNy6iY3Qo6P9l45JBp9RsXAYPHjznnHOOHj1a5mWykoyzVbuMGi0CtI5p6GwvPSZV3NUxOpSxgetEuBqM1sOaZCqkQkaLqEQ7TZYJxcMmdBqXprR+IuoCD6ZUk364KQfpkNEiKtHOTFZRumohrYV8VOM6G6a7uE6EqwU9NKEMwVmUSxQ3OhWqia6gNRgdhma+VB4ZZ/1/x0XQOkai3frvuAh0JyuttNJNN91UjhXf6d9xcdGhoP83/45LJTql5r/jwmPddddd2YPMy2QlGWerdhk1WgRoHdPQ2V56TKq4q2N0KGMD14lwNRithzXJVEiFjBZRiXaaLBPSw/LsbECncWmKPBKNnjSeVMhoEZVoZyarKF21kNZCPqpxnQ3TXVwnwtWCHppQhuAsyiWKG50K1URX0BqMDkMzXyqPlJNzfe+996ZxQfBpwgl35ZVX0rhwyGELTp2idYxEdeMyZsyY1VdfncZFDIG2iwtssn///htttBFb1Z0W0Liw+rhx45gkin7vvffk2NaL6moGHQr6iiuuoHGRtW677bZ9991XR4tyieJo+cbl/fffJx3WXXfdht+4SB0eHI+P/aAfeOABuhN5jmJjkqvm/i+11FLPP/8881B/xYyTUIgaLQK0jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnFwyZ0GpemlI9CoSeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKI+XknBs+fPioUaPkkHvzzTdHjBgRzvjg1Clax0j0iy++ePHFF6UObQSa+mIItF1coDJbXW655Z577jm0rs+6L730Envg6uDll1+Wv+KKRy+qqxl0KGjuEvdHLgrNfdPRolyiOJqUV155hVshW0KzJQmJpz6SIs/r448/ZhtoGDt27LBhwyjOWmJjkiH91mabbYYO80WVGmSchELUaBGgdUxDZ3vpManiro7RoYwNXCfC1WC0HtYkUyEVMlpEJdppskwoHjah07g0hU8EAw+mVJN+ECwH6ZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeKWfrGC3/zX4Er3LaIbAFp07ROsZEpRrIEpqGxaXy+eefv/POO5v6MixWVTADelFdzaBDWlOBV6nGKzpEi3KJ4qIlS0TQ2laTkCJ7AHcoNtqjtddem94uZNVfMeMkFKJGiwCtYxo620uPSRV3dYwOZWzgOhGuBqP1sCaZCqmQ0SIq0U6TZUJ6WJ6dDeg0Lk2RR6LRk8aTChktohLtzGQVpasW0lrIRzWus2G6i+tEuFrQQxPKEJxFuURxo1OhmugKWoPRYWjmS+WRcpafIpMoZ1vz2IJTp2gdE0dN5UDD4mjKfvrpp2uuuebAgQM5tsvAJGRdECcwGSoggo7RoaClgpQSGIZoUS5RXHRIEUx6fdxqIMMwiQ19wQUX7L777vrm1F8x4yQUokaLAK1jGjrbS49JFXd1jA5lbOA6Ea4Go/WwJpkKqZDRIirRTpNlQvGwCZ3GpSnlo1DoSeNJhYwWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3ooQllCM6iXKK40alQTXQFrcHoMDTzpfKo6SQUokaLAK1j2nA2Kc7Z/PDDD2+++eafffaZzAtuTYSrY3QoY4MQLcolihudCtXEpJthqVp6/PjxG2ywQfgvqsK8iEoyTkIharQI0DqmobO99JhUcVfH6FDGBq4T4WowWg9rkqmQChktohLtNFkmFA+b0GlcmlI+CoWeNJ5UyGgRlWhnJqsoXbWQ1kI+qnGdDdNdXCfC1YIemlCG4CzKJYobnQrVRFfQGowOQzNfKo+aTkIharQI0DqmDWfD4v/+97+feeaZzL+LHzTC1TE6lLFBiBblEsWNToVqYtLNsFQtTeMyePBg+eVRCGlPnoyTUIgaLQK0jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnFwyZ0GpemlI9CoSeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKo6aTUIgaLQK0jmnD2a3FIWiEq2N0KGODEC3KJYobnQrVxKSbYam6howWUUnG2apXUVzrmIbO9tJjUsVdHaNDGRu4ToSrwWg9rEmmQipktIhKtNNkmVA8bEKncWlK+SgUetJ4UiGjRVSinZmsonTVQloL+ajGdTZMd3GdCFcLemhCGYKzKJcobnQqVBNdQWswOgzNfKk8ajoJhaj
RIkDrmDac3Vocgka4OkaHMjYI0aJcorjRqVBNTLoZlqpryGgRlWScrXoVxbWOaehsLz0mVdzVMTqUsYHrRLgajNbDmmQqpEJGi6hEO02WCcXDJnQal6aUj0KhJ40nFTJaRCXamckqSlctpLWQj2pcZ8N0F9eJcLWghyaUITiLconiRqdCNdEVtAajw9DMl8qjppNQiBotArSOacPZrcUhaISrY3QoY4MQLcolihudCtXEpJthqbqGjBZRScbZqldRXOuYhs720mNSxV0do0MZG7hOhKvBaD2sSaZCKmS0iEq002SZUDxsQqdxaUr5KBR60nhSIaNFVKKdmayidNVCWgv5qMZ1Nkx3cZ0IVwt6aEIZgrMolyhudCpUE11BazA6DM18qTxqOgmFqNEiQOuYNpzdWhyCRrg6RocyNgjRolyiuNGpUE1MuhmWqmvIaBGVZJytehXFtY5p6GwvPSZV3NUxOpSxgetEuBqM1sOaZCqkQkaLqEQ7TZYJxcMmdBqXpnwbwYMp1aT/OrEcpENGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ51HQSClGjRYDWMW04u7U4BI1wdYwOZWwQokW5RHGjU6GamHQzLFXXkNEiKsk4W/Uqimsd09DZXnpMqrirY3QoYwPXiXA1GK2HNclUSIWMFlGJdposE9LD8uxsQKdxaYo8Eo2eNJ5UyGgRlWhnJqsoXbWQ1kI+qnGdDdNdXCfC1YIemlCG4CzKJYobnQrVRFfQGowOQzNfKo+aTkIharQI0DqmDWe3FoegEa6O0aGMDUK0KJcobnQqVBOTboal6hoyWkQlGWerXkVxrWMaOttLj0kVd3WMDmVs4DoRrgaj9bAmmQqpkNEiKtFOk2VC8bAJncalKeWjUOhJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN3FdSJcLeihCWUIzqJcorjRqVBNdAWtwegwNPOl8qjpJBSiRosArWPacHZrcQga4eoYHcrYIESLconiRqdCNTHpZliqriGjRVSScbbqVRTXOqahs730mFRxV8foUMYGrhPhajBaD2uSqZAKGS2iEu00WSYUD5vQaVyaUn75peDBlCr6iiwVMlpEJdqZySpKVy2ktZCPalxnw3QX14lwtaCHJpQhOItyieJGp0I10RW0BqPD0MyXyqOmk1CIGi0CtI5pw9mtxSFohKtjdChjgxAtyiWKG50K1cSkm2GpuoaMFlFJxtmqV1Fc65iGzvbSY1LFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJmQHpZnZwM6jUtT5JFo9KTxpEJGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ51HQSClGjRYDWMW04u7U4BI1wdYwOZWwQokW5RHGjUyEXPuLlX5Arx7Wr6ZDRIgQq/7tFcZi0KAORU9OqV1Fc65iGzvbSY1LFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpSvkoFHrSeFIho0VUop2ZrKJ01UJaC/moxnU2THdxnQhXC3poQhmCsyiXKG50KlQTXUFrMDoMzXypPGo6CYWo0SJA65g2nN1aHIJGuDpGhzI2CNGiXKK40alQDG3El19+ucceewwaNKicql1Nh4wWIbDEtddee/7550t7RAdTBiKnplWvorjWMQ2d7aXHpIq7OkaHMjZwnQhXg9F6WJNMhVTIaBGVaKfJMqF42IRO49KU8lEo9KTxpEJGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ51HQSClGjRYDWMW04u7U4BI1wdYwOZWwQokW5RHGjU6EYOolrrrnmN7/5jf4fGph0MyxV15DRIgQ6lffee2+55ZYbNmxY8b+W/uabMhA5Na16FcW1jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnFwyZ0GpemlI9CoSeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKo6aTUIgaLQK0jmnD2a3FIWiEq2N0KGODEC3KJYobnQpp5Hc348aN692798MPP4wuA7Wr6ZDRIgQWolk59dRT+/Tp88UXX6QWMrTqVRTXOqahs730mFRxV8foUMYGrhPhajBaD2uSqZAKGS2iEu00WSYUD5vQaVyaUj4KhZ40nlTIaBGVaGcmqyhdtZDWQj6qcZ0N011cJ8LVgh6aUIbgLMolihudCtVEV9AajA5DM18qj5pOQiFqtAjQOqYNZ7cWh6ARro7RoYwNQrQolyhudCqkoYGgnzjiiCM233zzL7/8MvUbnFax6oWMFiGwEMUnTJiw2GKL0SF9/fXXZSByalr1KoprHdPQ2V56TKq4q2N0KGMD14lwNRithzXJVEiFjBZRiXaaLBOKh03oNC5NKR+FQk8aTypktIhKtDOTVZSuWkhrIR/VuM6G6S6uE+FqQQ9NKENwFuUSxY1OhWqiK2gNRoehmS+VR00noRA1WgRoHdOGs1uLQ9AIV8foUMYGIVqUSxQ3OhXS0Ey8/fbbc8011wsvvICGMlC7mg4ZLUKgMr0LnHXWWeutt95XX31VBiKnplWvorjWMQ2d7aXHpIq7OkaHMjZwnQhXg9F6WJNMhVTIaBGVaKfJMqF42IRO49KU8lEo9KTxpEJGi6hEOzNZRemqhbQW8lGN62yY7uI6Ea4W9NCEMgRnUS5R3OhUqCa6gtZgdBia+VJ51HQSClGjRYDWMW04u7U4BI1wdYwOZWwQokW5RHGjUyEN/cSRRx654YYbfv3119JblIHa1XTIaBGCdC0watSoJZZY4qGHHpIhIePUtOpVFNc6pqGzvfSYVHFXx+hQxgauE+FqMFoPa5KpkAoZLaIS7TRZJhQPm9BpXJoif9Q1PJhSTfrEKQfpkNEiKtHOTFZRumohrYV8VOM6G6a7uE6EqwU9NKEMwVmUSxQ3OhWqia6gNRgdhma+VB41nYRC1GgRoHVMG85uLQ5BI1wdo0MZG4RoUS5R3OhUSPPxxx/37t17wIAB5Vhh0s2wVF1DRosw0B7tu+++m222WfgOJuWEVr2K4lrHNHS2lx6TKu7qGB3K2MB1IlwNRuthTTIVUiGjRVSinSbLhPSwPDsb0GlcmiKPRKMnjScVMlpEJdqZySpKVy2ktZCPalxnw3QX14lwtaCHJpQhOItyieJGp0I10RW0BqPD0MyXyqOmk1CIGi0CtI5pw9mtxSFohKtjdChjgxAtyiWKG50KBfhYv/zyyxdccMFx48ahy9lJmHQzLFXXkNEiDPQrzzzzTK9evd5//30W/eabb1JOaNWrKK51TENne+kxqeKujtGhjA1cJ8LVYLQe1iRTIRUyWkQl2mmyTCgeNqHTuDSlfBQKPWk8qZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeNZ2EQt
RoEaB1TBvObi0OQSNcHaNDGRuEaFEuUdzoVAjk2w5e119//V122YXugSGU4RYhhfmvvvrqvffee/LJJ8ePH0+W8Nprr91xxx36PxFqrVlmBWEgceLEicsuu+zJJ58s66ac0Kr334JaiwCtYxo620uPSRV3dYwOZWzgOhGuBqP1sCaZCqmQ0SIq0U6TZULxsAmdxqUpxUdLV3gwpYq+IkuFjBZRiXZmsorSVQtpLeSjGtfZMN3FdSJcLeihCWUIzqJcorjRqVBNdAWtwegwNPOl8qjpJBSiRosArWPacHZrcQga4eoYHcrYIESLconiRqdCAg3E8OHDZ5555ptuuqmc6kpIoWXZf//9V1999Z49e+666640MXD++efPOuusU0011c033xycxZJKizC0ep5/77fffquuuupnn32GTjmhVa+iuNYxDZ3tpcekirs6RocyNnCdCFeD0XpYk0yFVMhoEZVop8kyIT0sz84GdBqXpsgj0ehJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN3FdSJcLeihCWUIzqJcorjRqVBNdAWtwegwNPOl8qjpJBSiRosArWPacHZrcQga4eoYHcrYIESLconiRqdCIJ/pl1xyyWyzzfbWW29J92CQFGxEv/7661GjRq200krTTjvtO++8c/311/ft2/eaa645+uijGYofWmuWCwVhoBo177rrrmmmmeb1119nmHJCq95/C2otArSOaehsLz0mVdzVMTqUsYHrRLgajNbDmmQqpEJGi6hEO02WCcXDJnQal6aUj0KhJ40nFTJaRCXamckqSlctpLWQj2pcZ8N0F9eJcLWghyaUITiLconiRqdCNdEVtAajw9DMl8qjppNQiBotArSOacPZrcUhaISrY3QoY4MQLcolihudCgHtwjfffLPtttuuvPLK+p9U0UhKq8Mpf6I97bTTJptssqOOOmrvvff+7LPPwnzhbtFas1woCENR7ttvx44d26NHj4svvrjTuMQ6RocyNnCdCFeD0XpYk0yFVMhoEZVop8kyoXjYhE7j0pTyUSj0pPGkQkaLqEQ7M1lF6aqFtBbyUY3rbJju4joRrhb00IQyBGdRLlHc6FSoJrqC1mB0GJr5UnnUdBIKUaNFgNYxbTi7tTgEjXB1jA5lbBCiRblEcaNTIaB1+OKLL+aff/4///nP0kmUAYVJx/PCCy9MMcUUCy200Ntvvy3/+bSEgtNoETGy4lJLLbXNNtsgMs5WvYriWsc0dLaXHpMq7uoYHcrYwHUiXA1G62FNMhVSIaNFVKKdJsuE4mETOo1LU8pHodCTxpMKGS2iEu3MZBWlqxbSWshHNa6zYbqL60S4WtBDE8oQnEW5RHGjU6Ga6Apag9FhaOZL5VHTSShEjRYBWse04ezW4hA0wtUxOpSxQYgW5RLFjU6FgHbhjTfemGqqqc4999xWF1Grcfnkk08WWGCBxRdffPz48d988017jYusxetWW23105/+9PPPP085oVWvorjWMQ2d7aXHpIq7OkaHMjZwnQhXg9F6WJNMhVTIaBGVaKfJMqF42IRO49KU4gOmKzyYUk366CkH6ZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeNZ2EQtRoEaB1TBvObi0OQSNcHaNDGRuEaFEuUdxoGgsYOXLkMcccM3DgQHQZ+/Zb2oWTTjppsskme+CBB8qpiLjyhx9+ON988/Xs2XPIkCFSnEktwivodJdDDz105plnvu+++zJOQiFqtAjQOqahs730mFRxV8foUMYGrhPhajBaD2uSqZAKGS2iEu00WSakh+XZ2YBO49IUeSQaPWk8qZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeNZ2EQtRoEaB1TBvObi0OQSNcHaNDGRuEaFEuUVxr+Oabb8aNG3f55ZfPOuusDz30EC1FGfjPfz755JPevXvTuLz77rt8uJezXYmrHXXUUTvvvPPkk09OzaJbaf1FGV6pgBg9erT8ezDM4DfpMVdffTUb2GGHHTJOQiFqtAjQOqahs730mFRxV8foUMYGrhPhajBaD2uSqZAKGS2iEu00WSYUD5vQaVyaUj4KhZ40nlTIaBGVaGcmqyhdtZDWQj6qcZ0N011cJ8LVgh6aUIbgLMolihudCtVEV9AajA5DM18qj5pOQiFqtAjQOqYNZ7cWh6ARro7RoYwNQrQolyhutPQQ9CiLL774gw8+KP0EMD9hwgQme/bsOWrUqDBv0NVIefjhh48//vh//etfU0455R//+EdmJk6c+M4773z99de0LFtttdVcc83Vq1evvfbaixWJ6nSXO++8kx6ITijjJBSiRosArWMaOttLj0kVd3WMDmVs4DoRrgaj9bAmmQqpkNEiKtFOk2VC8bAJncalKXwcGHgwpYq+IkuFjBZRiXZmsorSVQtpLeSjGtfZMN1FR/lpMvxkyasI0SKEOsWljk4MToSrwehUqCa6gtagNcg+zZ6Nx5CppiEUokaLAK1j2nAGwRVx3MpfL5UhgtevvvpKDG0UB72Qq2N0KGODEC3KJYobLcPx48cvtthijz76qMwL9BZM0mp8/PHH5VQEN+TLL7984YUX3nrrrffee2+PPfYYM2YMibPPPvuyyy47duzYU045ZciQIdguueSSO+64g/bl9ttvn2GGGS688ELS9WZcHnjggR49euy4444ZZ+siyqjRIkDrmIbO9tJjUsVdHaNDGRu4ToSrwWg9rEmmQipktIhKtNNkmZAelmdnAzqNS1PkkWj0pPGkQkaLqEQ7M1lF6aqFtBbyUY3rbJjuEqK89TnVOOS++OKL4cOH33vvvddcc80tt9zy8ssv8yEuZx4eMQdM8dafoG85BvjoB7JkhlBwIlwNRqdCNdEVtAatgR3yo/m4ceM+/fRT9zJjMtU0xareHoIArWPacAbBtXASn3XWWQceeOABBxxwwQUXfPTRR/KUxdBGcdALuTpGhzI2CNGiXLp4621VgJaQ9CiPPfZYK15C27Hgggv27t1b/qtm8Wtk8t13351zzjnnmWeeTTfd9NVXX+UWAXqqqaZac801BwwYwB3jzUz7IoLob37zG+4quWZjMc8///y0007bp0+fjLO4PO9igwCtYxo620uPSRV3dYwOZWzgOhGuBqP1sCaZCqmQ0SIq0U6TZULxsAmdxqUp5aNQ6EnjSYWMFlGJdmayitJVC2kt5KMa19kw3SVE+SzmQ/nOO+9cZ511pp566sknn/wHLRDLLLPMOeec8/nnn0sjIn4wn9rFCdDqfl555ZWFFlpoqaWW4sdcajJTOlqQErK0Bq0NOiQLQTlOE7KKZVo6JGrBDvfee2/OraOOOuqrr74yl+kSKoPWhtaydg+iRYDWMe05gUt47rnneBDHH3/8yJEj33nnnV122WWllVYaOnRoeCINt4EwurinERIVG2gdE6JFOaV5lWqtf8/2K5pjAS2h8ePHL7roovKNSyupgMZlgQUWWH755
Xn3ctU6FGCSZv24447bZ5993njjDZ4+YH7hhRf69u17//338x7GwwzzLIdgZsMNN7zuuuuYD5tMMXjw4F69em2++eYZZ3Gp6QsXtI5p6GwvPSZV3NUxOpSxgetEuBqM1sOaZCqkQkaLqEQ7TZYJxcMmdBqXppSPQqEnjScVMlpEJdqZySpKVy2ktZCPalxnw3SXEOVD+bTTTqNlmWKKKTgA+Gny5JNPPuyww1ZeeWVmaF+23HLL0aNH8xHPp7ak6E9ttMDH+sCBA6kz22yzvf/++3G7A2FIeqgAupoRIUQ1jitaohEjRhANBk2cVSzz//4fe2P/XIV88SDpwBVtt912k0022X777Tdx4kQ8RCVRwFOqSYTKoLVxtpbtsoegRYDWMW045dK4+YssssivfvUrbhdD7tuYMWN69+692mqr8VDE2XAbCKNZmoWADuCuu+7imJcbEmygdUyIFuWUlou6++67524x1yRoNx966CFC0rg88sgjsqLAJc8///y0a9LfuIRVoLWmv1U010UdXnn/bLzxxvItjva4DBkyhMblD3/4Q8ZJKESNFgFaxzR0tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4TiYRM6jUtTykeh0JPGkwoZLaIS7cxkFaWrFtJayEc1rrNhukuI3nrrrVNNNVWPHj123nln+WX/uy3efPPNf/7zn9NOOy29S79+/caNG6fP9ZDOJzgwD5z9L730Eu0L7cUnn3wiB6fYQLS8kh4qgKkW0CHqn3rqqTPNNFOfPn2oLD8NSygmZBXLtL4POPTQQ8ndY489uApOIKq1Vvj2nXfeefHFF4cNG8ZR9+mnnxKSRCCqX0WEyqC1EMqiQ7TYgdIiQOuYNpxsnqWPOeYYHtmFF15YPJJJ9O3blzb0lltuEWfDbSCMZl1WYQP777//BRdcIItKVGygdUyIFuWUlsq0XLyjgDYlwNuAEILG5eGHH5bbLvA055133u+rcaEy7zeW413EO5xFmdEel+HDh88wwwydxiXWMTqUsYHrRLgajNbDmmQqpEJGi6hEO02WCcXDJnQal6bwcWDgwZRq0qdSOUiHjBZRiXZmsorSVQtpLeSjGtfZMN1FohzVP/3pT3/wgx+su+66/KzMKQ78yD5ixIiRI0fSxOy5556TTTbZ7LPP/swzz0gvwuHET5zyDTyMGjVq6NChHB5U45AYPXo0uRwbEyZMCI2LLMTnOD+DynceTMq8QDVqyrmLxknNsWPHMgw2uiKOZLa6wQYbfPjhh/RGzFBKogj2QOP1+uuvv/3223KkMS/pFN9nn324kG222UZymcHAimyMPTPJcmjphwQKshN6Gq6IS9MFgaHcBxnS2OEEqcxMcFITm74bgtYx7TnZ53LLLUcn+uSTT3J1shM4/fTTufztt9/e7A3a2AYi1lR+4IEHZp55ZmlcQlQEaB0TokU5pXmlmhQM84LMc+dpXO6+++5ytgWPWBoXue0uulqxZNdhqSZpbiZtH6sgIMxnoPunV954440zTkIharQI0DqmobO99JhUcVfH6FDGBq4T4WowWg9rkqmQChktohLtNFkmpIfl2dmATuPSFHkkGj1pPKmQ0SIq0c5MVlG6aiGthXxU4zobprvwjueVn1Pl90FXX331my3oWuhFOLPpPDjLX3nllV69enHm/eUvf2GeyQ8++GCNNdZYaqmlXnvttQMPPPCHP/wh6f379+eEePnll5dddtlVV12Vj2zpcjhXOLYvuuiipZdeGht15phjjt133506HLHyB49DnV6E4/bxxx/nbFh55ZV79OiBeeGFF77yyitJ55ygR1lnnXVkLfbTu3fvJZZYYr/99qNLoA5XsdFGGy2wwAJcC5C+9tprDxw4kA2QS9vBhmeddVZWn3HGGRdffHFy//nPf37xxRfkHnTQQQyPOuooLplq+NkzV/rnP/+ZDciep59++vXWW49TGb/sB2itllxySRo+7tjJJ5+84IILYmb1n//8588//7wsLRd4yimncLv++te/kktxZuQRfF+PMkSpTPfJT/nTTDMN3VvYANx4441cyDLLLMNV6xRoYxsIo1nrwQcfnG+++VjlsMMOe+KJJ7j/rMUl82a4/PLLDz30UPrOe++9N7SMcieJnn322dxD7i3N1hFHHHHOOefQ+OKRexUWAq2BKG8e3rqrrbYabwYuuQz85z90MwsttBDPQt5m5WxXTGUz5JUNsEkqIHhn3n777bJt+XJO+124KNq4rbbaKuMsVp0UNVoEaB3T0NleekyquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwTiodN6DQuTSkfhUJPGk8qZLSISrQzk1WUrlpIayEf1bjOhukp+ED/+9//zknz4x//mJ6DM49TRL6N4MMaOH3pPzibOZLXXHPNd955h49sXuecc84pp5ySXoF52oVVVlmFkwnn008/zZE5yyyz4OGTXc6t448/fuqpp2aSfmX//fcnl9N9m222oReRIwHnPPPMw+QOO+ww1VRTzTbbbL/4xS/4cZmN9ezZ88UXX+RwovgWW2whnQSlWJEz6eCDD5ZfYPXr14/T+pe//OUBBxyw1157/ehHP6IahzT9B1Eal9///vdyprI6uZxzJ554IjW50m233Zaa++67r5xGnEwjR46k/aLCXHPNtcsuu1CTtchlCS6TgtKU0NKROPvss2+55ZZE2TDtEY0RiTRhFGHbXD4XeMghh/zgBz/YcccdqS/NnNz/7+tRhigP9O677+bRTDfddFydHNi8Ah0Du+UcZZ7JNopD0AijuRs777wznRyr07luv/32PGtu7zXXXMMtOvzww2lr/va3v/H24O0kX5U99dRT3DoeMZ0oLUvfvn1pauUGcsNl/xAWAq2Bp0AdQW51GfjPf+iM6W7XWmst5supCFPZDHmVsrxLL774YvrdTTfdlDfSxhtvvPnmm/Nwtd/lo48+ol3muWecxaqTokaLAK1jGjrbS49JFXd1jA5lbOA6Ea4Go/WwJpkKqZDRIirRTpNlQvGwCZ3GpSnyUaXhwZRq0o9N5SAdMlpEJdqZySpKVy2ktZCPalxnw3SX1qf9v+kVOFNXXHHFIUOGDB8+/MMPP+RQ4TzAQDoGPrX33HNPPPQW8ose+psf/vCHnE/TTjstfc/rr79Ou8MPlxz8jz32GJ0HjQUehpxbL730Eocop9F9991HA/Hee+9xgHGS0co899xz0rvwOvfcc3Oscqqtt956zzzzDAX5eZ12iobguOOOGzt2LKWwHX300exk7bXXpuygQYP4CXv06NEcUQ8//PCw1l9SYXsffPDBrbfe2qNHD45Dfu6XdgT23ntvqnH20KKxZ3Yiv2ziB2Jq0u6QSykaC3FyUL3wwgtcF9CHYZBWhlyqcZ5xVGNjkoZmjz324ObgvP7667kzXN3999/P0hxv3ExSHnjgAa7L/Dbq+3qUOsoG2BWPhh7riiuu4FU48MADuUyeDrvC1l7xoBFGy9uJJo/VL7roIjQXzg2hVVp66aXlTUXPR8vIo+HNwJBHQFfKY+K5
0z6yW7o93jm0fbwZbrjhhlBfBLhalhZkBrjVCy64IMtJc1zOdsVUM0MR5LL5c88999RJnH766byHtScFb5uZZpqJHj3jLFadFDVaBGgd09DZXnpMqrirY3QoYwPXiXA1GK2HNclUSIWMFlGJdposE9LD8uxsQKdxaYo8Eo2eNJ5UyGgRlWhnJqsoXbWQ1kI+qnGdDdNd5NN8ww035Dxbc801OVzfeustznJOl9LR+rPBAUN3gofeYvDgwbQFQ4cOpfPgfNp3332l3eEDmiOHHuJf//oXRyNn1WuvvcaQT/xDDjmEQ4ifUzmlsPFKU/Kb3/yGgtKRcLSPHz+e4hTkmOEk4+jCCfy4jI0f1ul45Pc48ndcmKdpYBuh+aDb4HXcuHGjRo1iJ4T4OZ7j8I477mDIz+60C/vssw+5NC50PHQ51JQ2QhqX3XffnVOWGVo3fu5nM6eddprsmXmumiuVRurkk0+Wu0QDRCKNC40d0TfffBMzgj6Gydtuu40slubgBBaiY2CHXAW3VG7v9/UodfS6665jV3ROf/nLX7j5B09ik002YfM8HZ5L6+Ou/PiDkC5vieOPP557FeApl6oFXZ384/ohCxE0jQurX3jhhWhK8TagheJ5yfdMXPguu+zC+4FGiigzsNhii7GrO++8k2ixrdZfo2arbB5NnVAcjE6FgGdEq/2zn/2Mm19OtZAlYkzU6DA0mle5Ll5lRsM7efrpp99yyy3N3jStiyijRosArWMaOttLj0kVd3WMDmVs4DoRrgaj9bAmmQqpkNEiKtFOk2VC8bAJncalKeWjUOhJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN1FPnM5yDlpVl99dRoXzl1O7vBBX3w2txoX+TF6rrnm4hyi7eBspnHh+OHnZmZoXKQJmDBhwpNPPimNC4c6k5zTHFrkrrrqqn/96185Pv/2t7/RBi2xxBJM7rXXXnQJHO3kUpzDnp/UpSBHPqE+ffqwN/xsjHOII//YY49lZp111mG3b7/9Nm2KdC3smSge1r399tv79+/PeQl0D3JR9A377bcfuZtvvjm5XAWTpNA2bb311sz37dtXCj7xxBPsrVevXo899hi9CL0LJz0Xgn+jjTbCSaMjzRA9FsMf/vCHpMi2MdPlyF+1ufnmmylIs0KLw/ZYiK0Cw+5rXBADBgxgV5yXspA8RLjpppt4ZDQ0XAhOZiQFQnrRR/z73/JrshRTTjklLR02vWjQpnHh0dx99910mezk66+/fvrpp+WXbqecckprU8U7kDcDLSYPDi0zdE4UkX90nzqhOBidCkGqcQGWALZEKMD2oBy0hqXqGjJaBJcZnqmm07ggXB2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpSvkoFHrSeFIho0VUop2ZrKJ01UJaC/moxnU2THfhVODTVn4NxE+9nC6c8ZzQfByXjtZHPB75TmLxxReX5mbw4MFzzDEHZ9j999/PDKc1ZyEHM5/gzz77rG5cOPKXWWYZzks58ygicHQxucMOO0gDwelO4yI/iFNcCtIZ0FLgpHF56623KEWfcdxxx5G+7rrrSnMzvvUfxLJD1tp7771nnXVWoixE2yG/ybrlllvkayRy999/f6JbbLEFuXRF8isbkMZlt912k1XodVh0ttlme+qpp5iR3x/RBODfdtttCa2//vrcBPY8cOBAEuecc06cch+YZOcLLbQQq7O0/CZLGgiQ5bil5c39/h6ljrJ/7uR0003HtuXg5xWuueYa5uk4ubFMusVbB/q/uWouJzB06NBStWDI/aRgyEIErRsXSsmFs+IFF1zw29/+lkewwQYbcA9POukk2RXQJ8WNC54//elPDKVsICwEel3QGlKNS2vNYpUTTjjhRz/6Ee9kgQYUykFrWKquIaN5pQh/QHiy5QKKTuOCcHWMDmVs4DoRrgaj9bAmmQqpkNEiKtFOk2VC8bAJncalKeWjUOhJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN1FPr45UTgkODnoQjixaFxoBcRAOoaJEyfK/2V38803HzZsWPjGhfPpwQcfpAngaJcmAPNzzz2nG5dRo0atsMIKOHfZZZe77rqLJR555JHHHnvs0UcfRT/zzDPSo0jjwhJXXnllKMg2tttuO3IPP/xwaSnojeRncRqXN998c+TIkeyNcxH/T3/6U/qVNddc87rrruNwlb850bNnz5tvvhmnblw4ReSbFarJHdhmm22Yp3GR+dtvv50DfqaZZmKT8m0NnQc2zj9xcgCLUxoXzi0uRLbNKvCTn/yExoUGIjQupMsNB7m3wvf1KEOUhWhAOSxp2mjs9HLnnHMOd5gbJc/XLU469/Oee+4ZoLj++utL1dLcYd4AOEMWImjzjQvviieeeGLFFVfca6+9eNZk7bzzzhhoXMTA63LLLcfb79VXXw1FpHHZdddd0bo4GJ0KQeYbl+IxtP77fPpXnlpzWEsesaHTuCBcHaNDGRu4ToSrwWg9rEmmQipktIhKtNNkmVA8bEKncWlK+SgUetJ4UiGjRVSinZmsonTVQloL+ajGdTZMd5GPb3qRWWaZhVOfo4X+gI9gfdAi7r77bk4RDBdddBEfxByHdAbSuDz00EOczSEFv25ccNIGyfcZ6623HgcqBx5L0PoAgpOMD/3wqyLahauuuoqC9DHyPcr2229PbqpxoTGShql///7kLrPMMhRkEoYOHcrJzXH4nRoX2hT2wz65WK7ixhtvZJ/SybEZzr/VVlsN5+67785+cIbG5dlnn5X7gNNtXFhI7jl0x6MMURaaMGHCvPPO26NHj+eff16ve+SRR7LbnXbaSSbd4nJDuJM4U3CrTz31VGwhCxG0+cblhRde+PGPf0zLy22R1vb/O40LGF0nZDSvXBTwDpGQptO4IFwdo0MZG7hOhKvBaD2sSaZCKmS0iEq002SZUDxsQqdxaUr5KBR60nhSIaNFVKKdmayidNVCWgv5qMZ1NkzPwFkr/6bqTDPNdMstt9BtSDcgn8u0AvLvr/CTOr3I8OHDR44cSa8jjcvDDz8svzShSOuT3DYu48ePv/baa0nv2bPnNddcI30PTQOHClkwceJEGgiaA2lc8MiXHBw27EEal8MOO4xGgSwOP2lcVl99ddkJhzS2fv36cc797ne/o88gnSUGDRpU/A2XaaelcQm50rhssskmYUbOG/2NC+0Ircbyyy8vByczbIYdsgptCldBQ3DFFVfQDFEh/KqIq2ZpuQ+YQ+Mi/ZZ0dSC3SG678H09yhClPgttt912PNAbbriBq5MZXrmZ3OHrr79enG5xMd96662XKi677LJSTdI0GThDFkI0k/IXiWhcZFHuOXfyrLPO4gbK0R4aF1mLV2lceMNIBQiNiwyLNSYRFoWwrqA1VDYuCFJEgw6B0WFotAi5EAlpOo0LwtUxOpSxgetEuBqM1sOaZCqkQkaLqEQ7TZYJxcMmdBqXppSPQqEnjScVMlpEJdqZySpKVy2ktZCPalxnw3QXifKZO2LEiGWXXZYjebbZZuNEoSegIfjkk09
uuummJZZYgtNu1llnveOOO/gUlu8VOLbnmGMOjh9pXDjaQ+PCWW4aF47zDTbYgCL0OieffDJnORVofa688koOUY55zhXWCo0LrRJ9DJOcc3HjwqnJPjnnEPIFCW3BGWecwTlNm4J44oknLrjgAs5CWgcmjzzySC6H7oElOA6Z5FoGDBhAn8E2OE1BNy4ffvgh187xjHPqqaeWpbnARx55ZNFFF+U0lf/8StoU07hIv8V+TOPCNbLbjTfe+PTTT+eiuOFy/+H7fZQCT+Hxxx+fYYYZ9txzT56LrEgzuuSSSy6zzDLcQ7G1VzxohNHyBpBO5dhjj2Xdp59+mjObIXdYdjJ06FA6YB40z4J3Hc+ayfCXc9HAEznuuOPw0OKg9e2CsCjIuuWgawgyjUvApJthqbqGjBaRotO4IFwdo0MZG7hOhKvBaD2sSaZCKmS0iEq002SZUDxsQqdxaYp8/Gl4MKXq+uMOpEJGi6hEOzNZRemqhbQW8lGN62yY7hKifLJzovzyl7/ktOCY4aN2xhln5ORjCBx4d911F81KONcR5huX8KUCRziNxUwzzcQP5fKf3nBS0jrQuzBPJ0FZYAmO9gMPPJBDnSON/kb+jguNi3QATHJoyd9xCY0Lp+/7779PU4KTo446++yzz8SJE9mb/C0clphuuumovOqqq26++eYsR/PBscEZybbplhZccEEmJfef//wne2Pn4b8qog59Bk74xz/+0av17wVTEDN1uBWrr776k08+yWboybg0+a+KaFzoYOQ+sG3zjQvbZubggw+m1A477MCKeLhRcue/90cpcPxzdTwj+i0010g/yin+r3/9i7sqnvaKB42INWtJZ8nSO+64I7f05ptvpqGkl/3d737HDM+CZpRbscACC2y77ba8Q3hvcJ9JufHGG6VN4RbJ/2VinXXW4UFQMywERqdCwKOce+65V155ZW54ORVh0s2wVF1DRotIwVuO90+fPn0yzla9iuJaxzR0tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4T0sDw7G9BpXJoij0SjJ40nFTJaRCXamckqSlctpLWQj2pcZ8P0PJwNHK6c7ldddRU/HPMT8HzzzbfooovygXvWWWfJv/PGGYxBvlTglXN933335eTmh2bpM+SPEG0H7chBBx3Ez5ofTfr33KThuOKKKzi0fvWrX2244Yb8MH3KKafQ3NCysDQGDtr99tvvqaeewsmkfONy7bXXMnnTTTfRFvADtJz6HAb8WL/++utzFl544YXM07sMHjyY/TC5ySabnHHGGXj4CX6nnXaiYTr00ENlk9jY8B577MEktuuvv56FqHnllVeyinzZI6tw0hN64IEH9tprL3YLtFD9+/d/vfUfVWGjR2HbtC9/+ctf5J/gY9vyRRGccMIJ/fr1e/bZZ+V7HQrefffd3BZuLx75TZzc+e/rUeooT4HjnyVoBWgOuOc80z/96U/cEOa5qxiwtVc8aESspe3gvUGHd8ABB3DtbOOCCy74WQtuC++H559/frXVVqOH4x1FI8WNopuB3Xbb7cwzz+Q2HnvssdLi0FDSs9LZyCpCWBTCuoLWwOqdb1wMrrO99JhUcVfH6FDGBq4T4WowWg9rkqmQChktohLtNFkmFA+b0GlcmlI+CoWeNJ5UyGgRlWhnJqsoXbWQ1kI+qnGdDdNdQpSTjAOGI4fTnR+COZsFTmiga+E4oQvhLOdE5+TjfOLYFjPzn3zyCelF29L68obhqFGjPvzwQ053ahJikhOLn4BpfSjFcc6RjyaXgxwP7QivnDQUJJcOgyxWYZ5VKAWUZcg8Bz/OsDH5NRB7QzAj3wwRJcQrpyMXQlnpUdgGizLPJHtgh1STC5dVsLEK18iVsjfMsmFS5JVts5A0H9jIpQj1mWTbcnNk2+HmUIoZojhlRWxy57/3R6mfKTthk6wOrMiMhIQ2ioNeyNWC3AdBdiJC5oMQLYiWVynSmi6QoaAXMutqDZ3GJcZ1tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4TiYRM6jUtTys8qBQ+mVN4nWqm6howWUYl2ZrKK0lULaS3koxrX2TDdxTg5S2hHOK05dDlxOaHlLOdI5riV707k/ANpIDiVmedcDCcQgiJMEuJQ58xgBshlSHGOE1oT6vPKUNKphkEKcsCjpRrzVJO+BCGnHZN4yKUCe0NQmSKks3NmKM4rGqRXQBAtDvAvvpAGQq5LyoalQVaRC8HMDBuWmlKW5TCzBwyyQ71tPcmlhasDuRCZCbfxe3+UQci9kv24tFEc9EKuDsjqIFpP6pmAhACtCzKjixudCgEPq/OrIoPrbC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnpYXl2NqDTuDRFHolGTxpPKmS0iEq0M5NVlK5aSGshH9W4zobpLsbJHwBOO1oNjljOXU5iTlle0XL6YijdrT8zmDkSQI7hMvCf/+CUeUQ4h1pHUvFPoUhxoNuQpiEcsZIovY6UQsiWQFYRZGnSKRJSSGefVGbPhNBAVF5lM2JjSKLkyuqyigypz9K8arNsOKSIR8DT2mD5P8RhhtcwKWZAkBtmJPd7f5RByBK8sqsw1LRRHPRCrhZkXV6FMCNC07IXiJZXKdiKl8PC0cLoVAjkG5dVVlmFe15ORZh0MyxV15DRIlLQuHS+cXF1jA5lbOA6Ea4Go/WwJpkKqZDRIirRTpNlQvGwCZ3GpSnlo1DoSeNJhYwWUYl2ZrKK0lULaS3koxrX2TDdJXbKUcEBA5yvAlrmxakRpwkxjOeL/NaQ1xANBE8Ihb1pp8yADIMZZJJh2LPREDt5bdX7b0FeGYbVw3yoI4YATjFAOdWilVdSTk2aLAct9COI0dGazjZSoA0nwtUxOpSxQYgW5RLFjU6FYPTo0aFxie+8YNLNsFRdQ0aLSCGNS58+fTLOVr2K4lrHNHS2lx6TKu7qGB3K2MB1IlwNRuthTTIVUiGjRVSinSbLhOJhEzqNS1PKR6HQk8aTChktohLtzGQVpasW0lrIRzWus2G6i+tE6HlwP/G/E25xrYVwusShFCElFqFCXM14wjCg/RIVyimFqSyIs1h1Un0ZBh1KyWQKHa3pbCMF2nAiXB2jQxkbhGhRLlHc6FQIxo8fv8gii6yxxhpfq/+QymDSzbBUXUNGi0jx/vvvzzLLLNtvv33G2apXUVzrmIbO9tJjUsVdHaNDGRu4ToSrwWg9rEmmQipktIhKtNNkmVA8bEKncWlK+SgUetJ4UiGjRVSinZmsonTVQloL+ajGdTZMd3GdCFcLZliTkNWqlyweTpc4VEnIFXQFrSHWJhe0J4/rlGqEJBqGQYflZDKFjtZ0tpECbTgRro7RoYwNQrQolyhudCoEEyZM6N2795prrln/V0U8F/lSTV4DRGUmOI1I8dFHH80666w77rhjxkkoRI0WAVrHNHS2lx6TKu7qGB3K2MB1IlwNRuthTTIVUiGjRVSinSbLhOJhEzqNS1PKR6HQk8aTChktohLtzGQVpasW0lrIRz
Wus2G6i+tEuFrQQxPKEJxFuURxo1OhmugKWoPRYWjmS+VR00koRI0WAVrHtOHs1uIQNMLVMTqUsUGIFuUSxY1OhWDMmDHzzz//d/o7LrQmxe/8JiFDXr9Rf79KnCFFRIr33ntv5pln3mKLLTJOQiFqtAjQOqahs730mFRxV8foUMYGrhPhajBaD2uSqZAKGS2iEu00WSYUD5vQaVyaUj4KhZ40nlTIaBGVaGcmqyhdtZDWQj6qcZ0N011cJ8LVgh6aUIbgLMolihudCtVEV9AajA5DM18qj5pOQiFqtAjQOqYNZ7cWh6ARro7RoYwNQrQolyhudCoE8pdzV1pppS9b/1NJ6TkMJp02ZeLEiXQbo0aNkmaF1wkTJrzwwgvjxo1r9S3frXF5++23Z5xxxk022STjJBSiRosArWMaOttLj0kVd3WMDmVs4DoRrgaj9bAmmQqpkNEiKtFOk2VC8bAJncalKeWjUOhJ40mFjBZRiXZmsorSVQtpLeSjGtfZMN3FdSJcLeihCWUIzqJcorjRqVBNdAWtwegwNPOl8qjpJBSiRosArWPacHZrcQga4eoYHcrYIESLconiRqdCMGbMmPnmm08al9BzGCSFEA0KHH/88RtvvPFss832i1/8YuzYscwMGzZs3XXXnWaaafbcc8/WtzDFX8EOC5kVY+Qv5/7hD3/IOAmFqNEiQOuYhs720mNSxV0do0MZG7hOhKvBaD2sSaZCKmS0iEq002SZUDxsQqdxaYp8xGh4MKWa9NFTDtIho0VUop2ZrKJ01UJaC/moxnU2THdxnQhXC3poQhmCsyiXKG50KlQTXUFrMDoMzXypPGo6CYWo0SJA65g2nN1aHIJGuDpGhzI2CNGiXKK40akQSOOy8sorf9H6v3O76JSvW3z22We77rrrFFNMccUVV3z88ccbbbTRCSecwMx1110XGpeQpdNdBg8e3KtXLxqXjLNV778FtRYBWsc0dLaXHpMq7uoYHcrYwHUiXA1G62FNMhVSIaNFVKKdJsuE9LA8OxvQaVyaIo9EoyeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKo6aTUIgaLQK0jmnD2a3FIWiEq2N0KGODEC3KJYobnQrB2LFj559//qWXXvrzzz/nw72c7YqkEJWmRL53GTRo0NRTT/273/3u0EMPHThwoExC64wo6oSFzIoxQ4YMoXHZfPPNM05CIWq0CNA6pqGzvfSYVHFXx+hQxgauE+FqMFoPa5KpkAoZLaIS7TRZJhQPm9BpXJpSPgqFnjSeVMhoEZVoZyarKF21kNZCPqpxnQ3TXVwnwtWCHppQhuAsyiWKG50K1URX0BqMDkMzXyqPmk5CIWq0CNA6pg1ntxaHoBGujtGhjA1CtCiXKG50KgQ0LgsttNCiiy766aefSsMRIymhI+GV9uXrr78mcbrppuvfv38ICaLDQmbFGPqeaaeddquttso4CYWo0SJA65iGzvbSY1LFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpinxSaHgwpZr0CVIO0iGjRVSinZmsonTVQloL+ajGdTZMd3GdCFcLemhCGYKzKJcobnQqVBNdQWswOgzNfKk8ajoJhajRIkDrmDac3Vocgka4OkaHMjYI0aJcorjRqRBMmDBhscUWm2OOOcaMGVNORbjp33zzzbbbbtuzZ8+nn36aPkZCIWq0iBT33nvvFFNMseOOO2acrXoVxbWOaehsLz0mVdzVMTqUsYHrRLgajNbDmmQqpEJGi6hEO02WCelheXY2oNO4NEUeiUZPGk8qZLSISrQzk1WUrlpIayEf1bjOhukurhPhakEPTShDcBblEsWNToVqoitoDUaHoZkvlUdNJ6EQNVoEaB3ThrNbi0PQCFfH6FDGBiFalEsUNzoVgk8//XTxxRefaqqpRo4cyYd7OdsVN51mZffdd5988snPPPNMSSyWmRQ1WkSK2267jTo77bRTxtmqV1Fc65iGzvbSY1LFXR2jQxkbuE6Eq8FoPaxJpkIqZLSISrTTZJlQPGxCp3FpSvkoFHrSeFIho0VUop2ZrKJ01UJaC/moxnU2THdxnQhXC3poQhmCsyiXKG50KlQTXUFrMDoMzXypPGo6CYWo0SJA65g2nN1aHIJGuDpGhzI2CNGiXKK40akQDceECROWWGKJKaaYYvjw4ZWNC4bAs88+u8MOO/Ts2XOLLbb4uvX/AJfvXVp/0aX4/4GjJUuv6HLxxRfTuOy8884ZJ6EQNVoEaB3T0NleekyquKtjdChjA9eJcDUYrYc1yVRIhYwWUYl2miwTiodN6DQuTSkfhUJPGk8qZLSISrQzk1WUrlpIayEf1bjOhukurhPhakEPTShDcBblEsWNToVqoitoDUaHoZkvlUdNJ6EQNVoEaB3ThrNbi0PQCFfH6FDGBiFalEsUNzoVoreg5xgwYACNy9133x1aDUNIoTWR14kTJ+6+++7vvPPOAgssMN9880njMmLECF6HDBmy9957n3jiidLE8KpXdOnXr9+cc845aNCgjJNQiBotArSOaehsLz0mVdzVMTqUsYHrRLgajNbDmmQqpEJGi6hEO02WCcXDJnQal6aUj0KhJ40nFTJaRCXamckqSlctpLWQj2pcZ8N0F9eJcLWghyaUITiLconiRqdCNdEVtAajw9DMl8qjppNQiBotArSOacPZrcUhaISrY3QoY4MQLcolihudCklv8e67704//fSnn346QyhjCkkhNGHChA8//PDLL7886aSTaHToYH7/+9/36NGDnuOll14677zzqPbKK6+svfbahxxyCFEI6S6y4iabbLL66qt/8cUXGSehEDVaBGgd09DZXnpMqrirY3QoYwPXiXA1GK2HNclUSIWMFlGJdposE4qHTeg0Lk0pH4VCTxpPKmS0iEq0M5NVlK5aSGshH9W4zobpLq4T4WpBD00oQ3AW5RLFjU6FaqIraA1Gh6GZL5VHTSehEDVaBGgd04azW4tD0AhXx+hQxgYhWpRLFDc6FQL6hq+++qp379477LCDtBFlQCEphLbffvs55phj6623/uc//ynfspx77rmTTz75L37xi3322Uf+2VycO+6441//+tewkFlR01rw25/85Cd9+/ZFZJyEQtRoEaB1TENne+kxqeKujtGhjA1cJ8LVYLQe1iRTIRUyWkQl2mmyTCgeNqHTuDSlfBQKPWk8qZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeNZ2EQtRoEaB1TBvObi0OQSNcHaNDGRuEaFEuUdzoVAhoPv7973/vscceyy233Oeff17OdkVSsF166aVrr702zcoXX3xB4wKjR4/eaqut+vXrN3bsWBogmg+c36lx+eCDD3r06DFgwIBO4xLrGB3K2MB1IlwNRuthTTIVUiGjRVSinSbLhOJhEzqNS1PKR6HQk8aTChktohLtzGQVpasW0lrIRzWus2G6i+tEuFrQQ
xPKEJxFuURxo1OhmugKWoPRYWjmS+VR00koRI0WAVrHtOHs1uIQNMLVMTqUsUGIFuUSxY1OhYB2Aa6//vpZZpnl9ddfl1/uGCQFG1EaHZCuhWGYATQenDUbFynI0jPPPPPbb7/NMOUEQiFqtAjQOqahs730mFRxV8foUMYGrhPhajBaD2uSqZAKGS2iEu00WSYUD5vQaVyawh9yAw+mVJO+7C0H6ZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeNZ2EQtRoEaB1TBvObi0OQSNcHaNDGRuEaFEuUdzoVCgwatSo2Wef/bLLLivHXTHpZlgqpXfYYYdDDjkkDLVHI03PH//4x/XWW++rr75Cp5xAKESNFgFaxzR0tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4T0sDw7G9BpXJoij0SjJ40nFTJaRCXamckqSlctpLWQj2pcZ8N0F9eJcLWghyaUITiLconiRqdCNdEVtAajw9DMl8qjppNQiBotArSOacPZrcUhaISrY3QoY4MQLcolihudCmn69Omz2Wab8flejhUm3QxLpXTNb1xoXMaNG7fwwgtfdNFF8m1NygmEQtRoEaB1TENne+kxqeKujtGhjA1cJ8LVYLQe1iRTIRUyWkQl2mmyTCgeNqHTuDSlfBQKPWk8qZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeNZ2EQtRoEaB1TBvObi0OQSNcHaNDGRuEaFEuUdzoVEhz1113/fjHP/7ggw/i3sWkm2GpWrr4wfbbb2lcDjnkEIa0IzIvBgPRBx54YPbZZ6d9QUPKCYRC1GgRoHVMQ2d76TGp4q6O0aGMDVwnwtVgtB7WJFMhFTJaRCXaabJMKB42odO4NEU+IDQ8mFJFX5GlQkaLqEQ7M1lF6aqFtBbyUY3rbJju4joRrhb00IQyBGdRLlHc6FSoJrqC1mB0GJr5UnnUdBIKUaNFgNYxbTi7tTgEjXB1jA5lbBCiRblEcaNTIc2nn376s5/97LzzzqOB+GbS/ytRMOlmWKpJ+uuvv95qq6123XVX6qDDfAzRbbfdds899yzHaScQClGjRYDWMQ2d7aXHpIq7OkaHMjZwnQhXg9F6WJNMhVTIaBGVaKfJMiE9LM/OBnQal6bII9HoSeNJhYwWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3ooQllCM6iXKK40alQTXQFrcHoMDTzpfKo6SQUokaLAK1j2nB2a3EIGuHqGB3K2CBEi3KJ4kanQgH5ZD/nnHNWWWWVL7/8kp6DYRmrXQ1N4oMPPvjXv/71kEMOuf/+++W/M9IezfDhwxdeeOEXXnihHCf2JhAKUaNFgNYxDZ3tpcekirs6RocyNnCdCFeD0XpYk0yFVMhoEZVop8kyoXjYhE7j0pTyUSj0pPGkQkaLqEQ7M1lF6aqFtBbyUY3rbJju4joRrhb00IQyBGdRLlHc6FSoJrqC1mB0GJr5UnnUdBIKUaNFgNYxbTi7tTgEjXB1jA5lbBCiRblEcaNTIQ09x8cff7zkkkveeeedbTcuZEkur6GI9kDwHHbYYdtvvz19UhmInBpCIWq0CNA6pqGzvfSYVHFXx+hQxgauE+FqMFoPa5KpkAoZLaIS7TRZJhQPm9BpXJpSPgqFnjSeVMhoEZVoZyarKF21kNZCPqpxnQ3TXVwnwtWCHppQhuAsyiWKG50K1URX0BqMDkMzXyqPmk5CIWq0CNA6pg1ntxaHoBGujtGhjA1CtCiXKG50KqSRZuKCCy5Yd911P//8c4ZloHY1HTJahEDlb775ZuTIkYsuuuigQYO+/vrrMhA5Na16FcW1jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnFwyZ0GpemlI9CoSeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKo6aTUIgaLQK0jmnD2a3FIWiEq2N0KGODEC3KJYobnQpp6Frgyy+//PnPf3799dejy0DtajpktAhBGpcDDjigX79+iFSHZGjVqyiudUxDZ3vpManiro7RoYwNXCfC1WC0HtYkUyEVMlpEJdppskwoHjah07g0RT5lNDyYUk36xCkH6ZDRIirRzkxWUbpqIa2FfFTjOhumu7hOhKsFPTShDMFZlEsUNzoVqomuoDUYHYZmvlQeNZ2EQtRoEaB1TBvObi0OQSNcHaNDGRuEaFEuUdzoVEgj37jw+thjj62xxhpjx46VeTDpZliqriGjRQg0K4MGDVp55ZXHjBnDUNYVjFPTqldRXOuYhs720mNSxV0do0MZG7hOhKvBaD2sSaZCKmS0iEq002SZkB6WZ2cDOo1LU+SRaPSk8aRCRouoRDszWUXpqoW0FvJRjetsmO7iOhGuFvTQhDIEZ1EuUdzoVKgmuoLWYHQYmvlSedR0EgpRo0WA1jFtOLu1OASNcHWMDmVsEKJFuURxo1OhGD7fv/rqq3PPPXf48OHlVO1qOmS0CIFO5d57773vvvvKscI4Na16FcW1jmnobC89JlXc1TE6lLGB60S4GozWw5pkKqRCRouoRDtNlgnFwyZ0GpemlI9CoSeNJxUyWkQl2pnJKkpXLaS1kI9qXGfDdBfXiXC1oIcmlCE4i3KJ4kanQjXRFbQGo8PQzJfKo6aTUIgaLQK0jmnD2a3FIWiEq2N0KGODEC3KJYobnQrF0Lh83QJRTtWupkNGixBoXORblnKsME5Nq15Fca1jGjrbS49JFXd1jA5lbOA6Ea4Go/WwJpkKqZDRIirRTpNlQvGwCZ3GpSn8sTfwYEoVfUWWChktohLtzGQVpasW0lrIRzWus2G6i+tEuFrQQxPKEJxFuURxo1OhmugKWoPRYWjmS+VR00koRI0WAVrHtOHs1uIQNMLVMTqUsUGIFuUSxY1OhVxCYxEw6WZYqq4ho0VUknG26lUU1zqmobO99JhUcVfH6FDGBq4T4WowWg9rkqmQChktohLtNFkmpIfl2dmATuPSFHkkGj1pPKmQ0SIq0c5MVlG6aiGthXxU4zobpru4ToSrBT00oQzBWZRLFDc6FaqJrqA1GB2GZr5UHjWdhELUaBGgdUwbzm4tDkEjXB2jQxkbhGhRLlHc6FSoJibdDEvVNWS0iEoyzla9iuJaxzR0tpcekyru6hgdytjAdSJcDUbrYU0yFVIho0VUop0my4TiYRM6jUtTykeh0JPGkwoZLaIS7cxkFaWrFtJayEc1rrNhuovrRLha0EMTyhCcRblEcaNToZroClqD0WFo5kvlUdNJKESNFgFax7Th7NbiEDTC1TE6lLFBiBblEsWNToVqYtLNsFRdQ0aLqCTjbNWrKK51TENne+kxqeKujtGhjA1cJ8LVYLQe1iRTIRUyWkQl2mmyTCgeNqHTuDSl/PJLwYMpVfQVWSpktIhKtDOTVZSuWkhrIR/VuM6G6S6uE+FqQQ9N
KENwFuUSxY1OhWqiK2gNRoehmS+VR00noRA1WgRoHdOGs1uLQ9AIV8foUMYGIVqUSxQ3OhWqiUk3w1J1DRktopKMs1WvorjWMQ2d7aXHpIq7OkaHMjZwnQhXg9F6WJNMhVTIaBGVaKfJMiE9LM/OBnQal6bII9HoSeNJhYwWUYl2ZrKK0lULaS3koxrX2TDdxXUiXC3ooQllCM6iXKK40alQTXQFrcHoMDTzpfKo6SQUokaLAK1j2nB2a3EIGuHqGB3K2CBEi3KJ4kanQjUx6WZYqq4ho0VUknG26lUU1zqmobO99JhUcVfH6FDGBq4T4WowWg9rkqmQChktohLtNFkmFA/b5//9v/8fkMkCmbVTclkAAAAASUVORK5CYII=)", "_____no_output_____" ], [ "$$\\vec{v}+\\vec{w}=(x_1,x_2)+(y_1,y_2)=(x_1 +y_1, x_2 +y_2)$$\n$$\\vec{v}-\\vec{w}=(x_1,x_2)-(y_1,y_2)=(x_1 -y_1, x_2 -y_2)$$\n", "_____no_output_____" ], [ "![ejemplo.jpg](data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAnUAAAG1CAIAAAAQlZUqAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAP+lSURBVHhe7N0HgFXV8T/wfbsL2I0dRHoTBBQQERRF7C32bkzUqDGJNWrU2DXGEn+xxPyjSazRxG5UMHYFAUVUEKX3JmI3FmB33/t/7pvHY4UFlbagd0KO586ZMzNnzrnzPfO2ZXK5XElKKaWUUkoppbRMqbTw35RSSimllFJKadlRiq8ppZRSSimltOwpxdeUUkoppZRSWvaU4mtKKaWUUkopLXtK8TWllFJKKaWUlj2l+JpSSimllFJKy55SfE0ppZRSSimlZU8pvqaUUkoppZTSsqcUX1NKKaWUUkpp2VOKrymllFJKKaW07CnF15RSSimllFJa9pTia0oppZRSSikte0rxdRWjcU+elMnTdjeMK7C+FzTuhu/nulJKKaUfLK1q+FqEl6/TdtuddMOTP4y8PObtQuf7RWNH3BqdgSPGRiellFJKaZWmVa5+rRleBg689fS9Wm130pOF55Rql8bdsF1cfE76treeVu1OjE6Pdq2ik1JKKaW0StMq+/lwj+vH5go0tm8hNQ+89Yr0w8WVi94eU+h8E7U87ZbYzQGntSywUkoppZRWZfo+fP215Z63FBH2/j4pwKaUUkoppVT79D35/qZW7XoUel+ncU+etF3hk0q03XYnPTmuGv7O+xQz+Vi5umReriATNC4/PF9TDSI03FDN2DcKfMsvGn/Nr5MWXZ4vfqUL0KIXXnBo8VYXH43kK+StTh8YDwNPbxUSoaPw1XNPiYpCP/+hvuf844Lf31Tdk3zEFj1a06q/cVNSSimllJYXxYdyqwyNvb4ApNU+H0bz6teSE/sWONU+Nl6A5ovMU3biiScuDM/zDRRtLkQLq/o6VfOmbw0m8vT1hXydFm255g/IF6Bq5r9O8/T26FGDftEo9OZTdWPfFI0anSkoKIx9zW5MK6r9RlPzVvXt9/frtMiopJRSSiktS/oe4OvYvvPTaA25tcf1fUNyfj4uZujq+ffEEKvGqy7W40RqihaLMvNE5qkuejWWtR49iu4UTWPN86aopBqifI3mmylIJDoLnOqm5oktZqULULVFzvPo62i1MLOo6VtEoxpvgbV9zUiBYtbCM+brrb6wHgvLL2rV857na0zmz9+UlFJKKaXlSassvi5MRehKqChXPZsuyKyewWuY+rW5X6OiyLx581N5NR+KtJB40DcYqnlWEUTmcYtii1vpAlSj5cUzv+7616gGmUVNq4avRf/z7cIzFm+5OLqYVS9+U1JKKaWUli99T77+moDrnbfsOf87T8eOKHwB8Na9Cl97Q8WvCi5IPQ7de/7Ulq3bF3pvj5n3xbpx8XW8eV/IW0jRnvtHLh94+l6t8l8l/NqX+YrOtG9d/XtjazJUjWqetdBXmr/bSheg6j8MU3SnRmZ1+qZofAs6se+87xOuHpHqtIigFejbrHrxm5JSSimltHxpVf/5nHkfsw68da9W1X76ddzS/BaGBSFs3A3btWq11+m3Dhy4aBzZ85ax18/78DZ+GHf+9wUVnVnwRzsX9V1ZeVrkrK/TUq10SejbRGMZ0OKX/+1WvbhNSSmllFJazrSq168t9zxtwLzPAW/da+HfL1Hjh4u5W/YsDNdIxdoo6MlrC2XR/M8Zix9EVqOWp90yIPkKn4wegxJ6QH6xBlzwVxMttkRb5KxF0JKsdAnoW0Zj2dHil/8Nq170pqSUUkopLWf6Pnw+vOfZ8xL8rY8WUud8cPpWPxD7tRw+vzbKw17x8cQLTqv2AXTN1BLe3zJggLIpnuOD32Kd+vXPgRdfos2n6rPG9bn/62Xjd1zp0tJ3isbS0PwPpmv68Pw7rbrmTUkppZRSWr70vfj6a8u9D50HsPN+gdO8r72VDDz9p/O/7lb4wc2FPyK8da/Cb/Ib9+QNPy0WaGcnZVAxlQPv/LxqIvMo+XnSaj97OW7siK99fFn0r5oz1Q1V/+rvfPraEgqWT1r4a53fcaVLSd8mGmg+/o0YG2L5p+9CCy8/b+wkcdb7Fqv+hk1JKaWUUlrOVPhEbVWhRX1X6fxvTC1+Q+k3/8TpYj/YnP+NqdW+6XVBKqhahKJqPi7S1GJ/XKQmy/N+drSa8m9e6QJUcxTnf7/t4pjfHI08LSBWGCpyF1p1TS7VvK55U79p1d+8KSmllFJKy5G+L98/XMNnxHveMqDaF97y1CP/g5sL/4ZbOTf/o5UFyv+kz/yvXO55S15P4SlR0rfaD4Ym1PK0O79uaSFDLU8bED99WXhGBZnFfYV0QcuJXwMumPfBaZG+w0oLVPzw9bvTN0cjT9W/t4jUAl9h7tGu0JlPNbiUX9dCYc1/qoC+adXfvCkppZRSSsuRMjC20P3hUfKtsPnPNuFrmnZTSimllFJahvR9qV9TSimllFJKaWWiFF9TSimllFJKadnTDxtf533Nr33r+G9KKaWUUkopLRv6QX/9NaWUUkoppZSWE6WfD6eUUkoppZTSsqcUX1NKKaWUUkpp2VOKrymllFJKKaW07CnF15RSSimllFJa9pTia0oppZRSSikte0q/fzillGqg9L1ICTkGmUySJIttYSBPcUgws9lsdPLsQsdoTCk+LjBUWvq18qZGJsI0K9oCqxrhF3p55UXhAquaxXhERZloWVxgtOhDyEQfVRdD1Q0V+0XN+kVOIvGtaYFZoW1hVTVyFpgV/SJVn7KAQI3aorOwzuhUb0Og2ClSwe+UUkoppZQWoKqqqoVzaOTMYidGUX68QEXELY7qBDPwDCcvOD9rB4VkcHRQjWinr13gMah6v6in2Cm2ZWVlVlcdzkOselsYyFMwo69jLTQEp7q8NpYZ/WInBKKDE4/V25CskW
JKEEn6Qz7m6oSGYC6KyKDqwdQWp8RjEKbHYotTfbRIxdHqMVyACv6llFJKKaW0AEEg7Zw5cwIzqqddhKkPZor9SP2RVMvLy4tIUMRpzNATSVkn5nqM6TphFB8VH5HR6CD8EEbEKisrdfDDdPQrKir4Fo/hQEjWqVMnHAuF4ZV+dMwyFLOiE8IeDXnUr1u3rj5O+B+joUpbtBUywUecCSuYOmzNnTu3qIRYUHCQfrRG4zagH0pChgatx5AM5YS1YTeGcDxqi5La4EcbnZgVeoKpE5uLiYpTkCFBiPDGntZIhaOQUkoppZTSAiSNypD//Oc/Z8yY8f7773/00UeffvopuF1jjTV+9KMfwSoy4Mrjxx9/PHnyZBwwoF1//fVbt26t/8UXX9AwceLEevXqycgbbbRRw4YNP//88w022IAqo0ysueaaHqdPny59G9JZd911I9evvvrqjOqY+9VXX4VFakEXZrNmzTbccEOmP/vssxYtWujwbe211954441nz549dOjQ1VZbjXDTpk3jisDQpEmTSH7yySdk9CGcWbvssou+BTZv3vzdd981ylCDBg2s0cT11lvPI39GjRrFqIV07dr1ww8/5OTMmTM9WqyWMFXmcoYwh6dNm6ZvCs+5zbqOBRpl1yxqLYQMJ8nz3CPQIsZ/o3wWgU033ZTy8ePHk2nSpAm+mFuaoJH/3//+x8kIgpZyc9lCMcp/axFb0SbJND6HKVlrrbWY42EETZ//lkaDPurQoQOxWbNmmW4j1llnHa7iaH/yk5/06NGDh8SSs1ITlV1yySWFbkoppZRSSgvRo48+CnvAoeT73nvvyc6YkrW+bK4DDyTuCRMmGAr8gLjSrkdJWYoHmfpffvklSbBkiKoYwoRSwMZEoxCCQn3pXssoJkiQ0OV9wvToByYxSgMZfJBAjGNaU/KOl2AaxQlJpjkWRsmAKHpY0cEfO3Ysx4BNGCLM1pQpU0A4JeFMEJlQYiJh4GQJrIS2WBc+McthiKRF6cdcjumQFCiOIZDpUcthkdHXAWOCAB1NhKwUsmIibdARNJKPhRCmkxiHaYtQU4LJQx3EN8QNwmTgoulQE1Mk4TdnCLtemCsa1IY/hkIDMlGIzDK9Xbt2wF4fJbGuiVJ8TSmlGiiyQEo/cIpj8Mgjj0ydOlUhBTZkYUWeR3x5WaKPWlPmld+l4/iIFR4oVclIxwopKZuMfE2h1AyBlJ6BCgEJISaDUyK5E6tfvz6kMVRETZJSOQwAPORp0waT0cBLhpjG+eCDD0CRPiep4rkqEMzTDJlglYkccG+gmTOmmAvmGaUfkz+Wxgp8NZ2ACHCPMLtsmU6JjhWZohPloFVYIEOmeDSXJyR5YiEUIn1DJvIQPKNwlVGzQu0mm2yir0jlANMCyB8BadiwITFTjFJFJ4rwUsiuKZR7FMlQTnME0ChXueTRlAgRPWFC0AiIJBimAV+HTKNGjWbMmIEvOEzzQb9Tp04tWrQI60HkC715lOJrSinVQF6VlFKSbR2GMWPGADD5N5ADR3qVsiVfaRoS4AMnMjIvfpDkTlJ2Niqn43gkI+9rjTZo0CCqYbMYku7JyOAE9HE8KqfYZQWYyfg4EN2oKbDEEJ0QghKlHrUECINJdqGRR0QGVnGDAD2jR4+OOwFJ2qgNjIGdFKrnwlb4SRVoIcYEDvdCLcymwRRuBKziC4IhwUEQS+VnVJ/CiEx4GCiIEw5oQaAQ6WupMsQTHRPHjRunhBVkEzmJ6aLAoj6XQlt06NHHp01r1bTh0BZBtsx11lmHDygs0tOxY0fOELAXHDAk5gH2YNgWY1o1AWSBHBATS+vQoUPjxo0ZKhKLC1DyReCUUkoppZQWRZIyOAGHMANKSfrQUZ5VykRtp1SVeaNKwycPHmRqiVtHZpey9YFEsaIiHwUfJRK9kosheVxm9wgqdCRxfImeTo/mwiom4Jw2FMr+Hlu3bt2qVSt2acOhitHJkyeThx8c1uGkKby1BCBKW+Cc6cpZqojBS45RToD/VsQl1olh4nBj1qxZxJR6ZMSEtpkzZ8IhCiGuUWIQUR9BR8uhXJ85jsXy6aEW8RNNmTJFJc00tRQa1aq2yVu+Ow1blsYHLSCnIXQSAGwsihg93LBSHbYoJ08nDYQFk1ocs6zUurTkJ0yYYBeAqymGzOWwDrJqiyImRDDVXmt54sIRJghwYDGU4mtKKaWU0uIIOga1bNlS8aQjO0eBJbmDQGAgvyM5VxKXkZs1aybpy9HQDiTI70a1m222mYnElJvStMwuxUvfYI8hc3VkcDCgL9EzoSOPa+GEdE/MdLCBSVWIkQdC8bloixYtOnfu3Lx5c32SnFG8Mm0UACu5KDGRPwQ4iXTAG1CkjaRZJAlAssAYfjIBhgkT0/K5UaNGlkaMjJVOmjRJHJAhroJM/F122YVmPpNEMRE40aYvOALIf5eDNm3a4CBz+WyuobiFgHmoqR+XFXONcpKqEOM/JREZOq0CB58kox4pFB/+w2+3AfcDfuKEQNu2bdu3b29r2rVrZ8mEiTFEoT1CEydOtC8UumYZdWno378/GRwyi6Fa+3xYXAq9lFJKKaWVmJ599lngoSPJSrtR24EZj1K2mgboyv5Al4zMhgxJ9LATMpGBpgSkckOgQi3lEW6pKaE1/AAMcMUUWRtskNEHcjBAn7YoqtRtTEjrsjw+JlQAFfzhGNggwIpZ3MOHH9wznV1DOPhRC9ITRg2ZTrm+Eg2fcm6DqKjL40JAFT2wil1DcMgjYXaJcTsWy4QpTMBdfJrjM3BWrMt0U0znleVHJco0gSg9zSIQXgVA6oSr5ANoXSBCT4wiczmDyJhlFJMVs8jHh9hxj7G0IipDbmTJHpXdUFMty4QQWQimifwxlyrKBY3DpuPwuVevXu4E1SG2ej+o1upXrqSUUkoprcwUyUpKVfxJ0DhyqzYwLBI0LJk8ebKcjq+VwRV2ErHsLwuDHxgzffp0kjhaquArHJKype8oB8GAThGhtfpGFUxAFNCSwWSUzsAPXkEyXtGJQ2GDBg3IgApGhw4dGh/V8oddsAHmISUiTzMoAuFRkAV4K9Rw6DHLFPqnTZsGIwMyKTeFn+QbNmxIHkYSc7eAQ/RA+qZNm5KBjmJidXxT+XHJLHFjmttGeWuuGBLQ6nMYhiFG6SdvFiKcbMO8L7WywpzptJGMULDIfz5bF22WQIMFFmGe0UBEbfRpJizgOupsC7cj3OMGeXtngXwI5dqItl0eOXKkqIJV1wKc8K1IeZe/Run3N6WUUkop1UwSq7w5YMAA2RZgyLxQCkeuhzoyrIQuU8v+U6dOxZfogQ04BEv4ZCL7v//++/I7vj4UAQAwWI42C47SQ1ghpdQjSRsZ2ignr2XddM5wyRAsiYxvNCAhEAtyoJkzZ7Zv394UlTGHyYAZj/CD5lBCG8Q1kRVGYzlRotFAnoxRWEWACRONctJEOuOKQL9Rj6o9Gky0FpCJrIJds6IitF7eCoXpAJJCTrKCyTrTSAciWghVESXmaGbCRLGKL6CKEn7cNrRFD4WdXXO5jXSMBoLqa
Tz5kUNw90UOCKQElfuJwK20G5a9cuOpYvXx4q5UIg+sSyj48P1lhhrhEYJBXmvsJ0WEZWg1kwHa0hdrDP1OiLb6wM06dkVZkRpgCeMCgkjWP4QwqLHdbTdVYCguFXA4PH4gGZ68P7MGgEX6+ID+8/9Hp44Y2mkSfNizl3qleFil4PI0fwvef6u2TXvwH4/Pg1ousL9Xwf3h78zdm+fWJ9/Gmsbu/xSrt+ncbgOYMYTUm8JrYSbeEYoi1CojDZDCAWE6ZhGiI4m0KwJtYTgjNkyAC1nDhx4tSpUzSRVJHlEPrRgSlJ1+gImwLkkCIhHnb59ddfIVdoACpCmSRJn6CSUKIPhcBk8BmsgDOQAckWoR8mI9wT+hmRJDJZsmRYYCB8JrxjhCFgF+rYxFs5Dz/BuzRBYFAOXbDGWMgZC1Lct28fozPKsmXLKKFSAO3BwVhmXGZKdyiToXFMdwiWiyWiwirhCY5hHM9ZCoxjGTnLhVdMAWewQxNzh/lQRgE5CqTjUCP5JauHZfrSETdoBWgigQWpsObMbvfu3dRJ35nmoUOHuFhQwU+cwSYrBn3iG6MwKYbDAbYJD7k0MC+UoW00qSDHAlcW7DMivdim9evXszJ58+ZFiCn2t3Tp0jwyBIuAToAIMX4NqXENDJ4FJK/3iBgR73v5RiKP9b3yetM7sxbF+l+/GCVKI3d9l8bDiA+fJ7+6fCBLvnf/7qhhF0eNi/bRFwm6do4QJfLz+0DawIHC1IIFCwjihFHCNOEVCWGXcAzI+Yjs8GWJEiWI+ARuGI70dMmSJZAK3dOmTQur0Z1AjLxKlSpkP5AZJEFmSSJFXf8gP9aI7DArlLBu3Tq4h4Au3mI4uHPbtm3YIaMqWbIk8Z3Wf/75B6akCQ6D5+Ak+A/6hJMgGxwgqcU+EkgRLlFCjFcYxwHmAqNgHCOMjkG8JZdlCP0WD9PBMSaI88pWscYioCNSgfOUXy5duhRKhucwxVjk1gwkfUbBFJpMn1aITdcCfOa+Ql/GwhqUjM90RB9lmBUhriKkC62iRpYI9zDFcExTH+oiYaFYB/qK5sWU+MmkmAVO4gMGkdDKZFl5HsnO2a9SpUqhz0B0AShwdcAsHu7ZswdX5S0KbA2Qh3Xq1KlWrRorA1gK4FQcmPzVwODx4O3wvf+A7DGC150I92LcinjhjRaxZ0y8Ezvaw0j85/UgwsPIru+udKnq3dLrFEj9iQqAunslsm+UhxG8vG+fu0MQ+Lxf/C5vR4gczcsT/7ivQeAgjBIhBw0aRMpCYIWK4CpAOCZqw3aEeIiHXAfWAbACFHLw4EG4ED5DgciOAhRLlkZcJnmFJ2BiqIWojcFcuXJlzJiR6A8Yi1GI4xhnaOI1dcgDORTOI6EfT6AZqAtrjAU/oUMTZqEWiIdHZb0wBDaxgybDMRa0iufwECzFEDgAI6LGTMmbs2bNmidPHoxgDc/Rh5wwixpeoak1wRpyKvATY2ENulqzZs3UqVMZFEJlEaBYHOaugD84Qxf0SQpZB4ynSpUKnoMaeWTp6MV0sIlXjI6r6LDIeI7OqVOncBtNHrEAKlSowDrMmzcPtZPWv6/+6quvsrD4AAuSlGtESgbFGWwCTRwjWi4UqJOYMgQbt3XrVlylLyuGcZxnUMbibrFp0yZWAyGLgEH6UsG9du3a4QmPNCEPEIZfDQweC9fL8dAXlrsfKcKdh3dj3ox6dPCA+4t+JZeN6ktDpIgPvLweRnjg9UB0+Fzw8KH3A+8HEaPd975xJ1LUOG++m6x+FddXEkdy/bGHwXMFsZgI+fbbb5NXEanhAPgDyiE0E1iJxURh6gRoiJYKbAHhEehJBInXEA8hmwhOhdBPzgRLEazhRcxCCZkyZYI+6Y41xiJSMwR8A3URwSE2fXTJKHRnOLrjAEQICR04cAA76FCBmdCkDqmUL1+eLJZe0AByZkHHOXPmwJeMwnB4CKOjCTnBW9wGsA/f5MyZE29RgI0gHvykxCamGI4J5suXT7yOPp4gpAJpke1xn1i5cuW+fftYBBQAo8NSrANmUaY71pgF4zLHHDlyUHL5wENmiinmxUCsAHPHT11fxIUsCG6gyTLirY+PD5Oile5Yg19Zaric7qixJnTBAUzhDIuPJp7gA3YYCDkO04TxvXv3MmUs4xv3DJSZO8oYZHfQYUaksCjQxAJq41hPBmrdunX16tXxBIddZyUgGH41MHgsHkSIcP9hhKgPfB9GegC/et+PGcH3aqTbER+4qM31bc9e9yLDvg+jkMc+OSUF1J+oAKi7CR/6Prwb6WE0UuhId7x8Y8S+7+3rHfFhxEiunMDguUL82rNnT/gA1oQJIAyYYOfOndSTJk2KDrwL6xDrCdDEX4hBnz0SzYnIWIBaSJ4I+vDZunXriO/koNiBySAh5AwBZ0MJGESuH9BCupAfXbCAHUaEyeASusDK+fPn5xFPGPTvv/9euHAhVEET9wBcJSdm0AwZMihJxatRo0YdPXoUeitYsCCWy5UrxygMzRDMCxaBYHiE82AgRs+cOTO8yOyYhYgHCeOiw9QgHlWA/MdJLGzevHnjxo0oi7lRY140MSKMhWPk9EwfsGJYBlhgRBTwE026IMEguSOjs1BYYxbQnsi7WLFiOAmv04pNFo0mCA8dgOXLly/DwdhkprAvzjMKChjHVbrQkUdWjy0ToeIMg8LQTIFWPEc5e/bsOI+a8nhKRmcI/KROX5YauJbA8KuBQRDAu8F/vO5W3VW1fqfY+gJ+4Pqpq1XTY3Cg99AuXCWxgADmagKkytaPdlz/qoOrYo1q/j7n+YM4y3b07duXGLpgwYJTp04RmiEDyIbYChemsL6YlyAOiOwQHkACPRC1CdknTpyA6mA+fS8SKSBJGCRESoRxzIpWoS7oEBDuybQAA+3atQtTYiY4e9WqVUR88i1K+BUKIcqTNe63AAcwnAiSVlEavA7lbNmyZebMmZCNqAjOQJMcFwe2bt3KKFQA1nAeFqELXsGFVapUYV7QCd7Wrl0bStbhRALN4DzzYjgXw1h/pIsbpIYYwVuoiCPMcPoJK/rIETI17OAAo+AhfjIWFSgZT9BngnRkXkgYiMmygJqXfrqMNfqiwy5gE8unT5+GibNly1aiRAkcxkMGosQZtoy+SliR6OfZKLMIDJ08eXLuHFhgIvrAQHSLG3AtM2Je9GI47ij//PMP64A/xYsXxxm2LFWqVCyIdAKE4VcDg5AH0Y14wcuoChJeWlUMQhAW7/hOnz6dRAf6hO0IwQRi+IDsU9GZlBRmIiLv3r2bChtHOCYKQxtokgtCGARrWmELgjLp1Pr16zFFfCdJyps3L3QIhQCCO32J9ZQQ8/HjxxkdIs+aNatCPKMwLkKMU8IiUAWJHXxG61Xr3wyATiCeLFmyQAbiM0r8xAKptq4IEBIe4g/EwyhkbFAjDIRL+IwFOAbyoIIOdwJKOD5dunRMkIFYCuYOx3BiKaE0JsXQOEOFgWAv6JwZ4RIrxvSxgCaMhUREiz5jifBQphfO0BEFiBbix38GElOyJixC+fLlV69ezQWFRS
5ZsiQSVpVeWGA6OAzFMgXcxiV8YyCMMFlRLDNl43CSJkbk/cKB7du3Q5y4Qb7LorGwjLh27VoRfMuWLblFkZezbjhDKzcqWBxl+gJ8M/xqYBCqQawUvxJ0eOTlpyTiWI0GIQYoh32ZPXs2MZcQj0T0CbuwZTABUZvNIo7TeuDAAaK/PkWEIVCG2wj9RGSi/M6dO2EauufJkwfSJVjDghkzZiTiw0OwAh05ANghiKMJi/AIJzEWQ9DF/TwwOgqQFjobNmwg+lNnUPiJ0elFgoiEUbAPqSdOnJhe1HEVtiM3hRvEynAe7AuFMEThwoUZCzU8ZwioiHsA3AMlQ0IYpJUkEroSYDXUsCBC1QQZlFkwBEZ4hOqYGgrcA2Bx7g2QNy7hJGpcFwBTZiUhb6zhKiW9YP09e/YcPXqUiTOpnDlz8nbAu6hxZcENXDp06BArTEKMn5o11rgQMDrOYJ/1Ycq0MgTOYBm3aWLi7Cz6KGATr1hnLhDsBUvx7bffsrna2QwZMuBw48aNWTG2iSljR7uATUrn0T8MvxoYhAroTdy2bRtX5kKFClWvXr1s2bIKc8jFvgQaKpa6wYuAFn/AgAFwEqGWRwI3UZiUCHIiOhPT4QMiPmGdzYJIiOkoE76Rww2QDRbgQipEZ7qQs5JHEqyJztp0uIcSU0RqwjdbDB8AWI0QT8RHgSyKCmoClIYDh61v+IOHyBrhAJ0QnITpOUjI4W/oAb7EPRQwwij4hof4AzFDLehDUTAWTqLAiAwNA5HDlSpVCupiLADD+fj40BFn4J6DBw/CxxUqVBDHUIprMYtjLAULxXxpwhROiurgNkZHjZkyHSS00pd1wwdGYWia4FGEy5Ytg18ZV8sOJZOt0gu+58qiGwkbgT/A+VMlLgHYxBrrDyVTwSU6Yjx//vxME2uMwjoAVgZrDLdp06bly5fTilkcOHLkCA6waCgzUyzUr1+/du3aNMHE6HAjYWXImJkgQteWBATDrwYGoQi8zwQ1/cOZvMZlypSpVq2a/tExmghShl9fJFhzMHPmTDhDOROxmyBOdkXgJgQrpySyQzYiQkIztAeZoQC/wjrQADS2YsUKJUNwDPxEWEef3aQLbMQjpuAAmBJAAIxLKwRJXoV97EAqTijHJqS1dOlSffhJE7HecYnDgwW4cOXKlbiNb4DRKWnlNkBiCsFjBJ5garALSRvdsYMOBpkgDuAPo+TIkQOuogsSzMJJ9EJCHX28hbdwT9kqNtFkleAkLhPIEfLIKMwCIRlnlixZatSogZM0IWQ6O3bsoBeMzugceyZO36xZs+I8fnLs8QfiZCyasCaahxehQIzAlFjj1sJqMBeATQZihWFo1Fje69YPvPEHa0wQsxhnNahjhHGRc/XhTsO8uJowd/QZRTeG48ePY4E1zJs3L9vKoGS05cqVoxLIKxli/BpS4xoYhGbwXnz//ffvvvuuwiuPvPxz5swpaP0LHoQAW8/ghYA1BwsWLCCmk6/AXidPniTPI5QT0+EGxWh2SiGbQAxRwUnQD2kuj5AWCqdOnSLoz549G5tIaIVo8+TJAw3DGbAUcZ8dhw8wS4iHTjQ6yjAN9tl6NB2XIBvyqkWLFiGBkOgFDaOMG/Ac3ECaBY+Sk2ET6oUVaIIq8AT3oG3yQvxn0F27dtFl7969KKRPnx5N2IixGAXOQ5mMFvcYES5Bk6UQPTMLRsdVWhkFP3XzYBQUICRRI2xECe9ikJIR0cyWLRs+U+ERV/GKLpxzTrumwKRoEnsVL14cUmcZWX+EbAF2dJOgZBR2hMkyLmuyc+dOWBZvmSB90Qf4qTolNqngDF7RndlhnL2AU0nf2RG8oi9ryPrgmFaSOTI7Voatx0Pcq1SpEvctWaZ0rzgw+auBQSgC7yNBOV++fAQLhYAxY8ZwU+bV5VGh3FY1eP4gjhOOoUz4hsUns0FCtFWYVqonDmCniNrEaIIy2RiUDFWgTKaFQqZMmfQ3MFAdzEoTBABP0ERkJ6xzf4IgFY2J43AYozAoCRP8QQXjyBmOsaiTk8HWcBgSaNXHxweGwxpxH8ukblu2bME9iAoF2ILR169fTytcTiuD4gYGYZcDBw5AMJQbN25kjlwLIBjcoGQ6DAcRMiJuoAx1ScKgPOIG9ElfHhFSJ8nDAfRpxW1WAxouUKDAX3/9BVHhCTNiEVguDOIeDIcaXjF3zjyeiPNgeiQQJ3XkLCkT4TaDQaaGKVwFKFy1/rF66jAfjyjgIWDx4UgmwrJQV0V7Sise0kUSSu40VJDjDHOxbLs+TsBhmJWhcYZNoYlVZUQm26pVKyiWLjISIAy/GhiEIvA+8v43bdp0xowZigu5cuWaPHly5syZaeWd532WpsELgMKjfpeV7SDgsiOEZkoRAzwBS0Gc7AuRFzWwbds2ukAzhGaCMkRCukaMpguPsBGABjCCTZKz3r17J0uWTGGaXnAYHQninIHt27fDzc2bN6c7ZqEQUk/oBPtQDv6Qb0HMEC1cRR1nSKrgUW5pR44coeROANBnaKWMsCAlXIU/dME9gOU1a9ZgB+5HuGfPHnhRmSjThGiR58+fny6sCdxPvosdWvEB38RPDIQDKDAXVgmJiIrFYV7YwYGKFStWr16dteJCwOqxXCwFOiTKtLIOECrTZyCMYBaDLMKKFStgZbifu87SpUvpxeoxtDaFtWJQbELerJ42iJIm5IyOfSqqI3fqeCjIW4E6NI9XOKzVwD5m4Vp6MSLLhf3u3bu/9tpr6GNcHf3D8KuBQSiC9eI/nDNnTrNmzQi7y5cvnzt3Lnf5CRMm6Dvi3YOCwfOGwvT+/fsJr0R8QPRX6KcJBbaDHI5wT5BFCOAMaAZiILjTyqOyNLZV2S2EgRpmaaUXqVvRokWlgEHkGCSUA5ThMExBRZCHtp469tEh+kNsGMcsILWiVRTFoHDw4cOH8QE/EUKuWJZxWnGAiTAiTZTwGYxC8ocdGAUJNwYYUSuAkwwKqaCGkNFRgHqlgE0MarKUeEurXKWJCmNRUkeCJuPiIeTN0NhUR6YvBXWhdC2l23d04AbjokzSD7tLIjVKeQJYDbmHUBJ5gkSPQBIpOK6q7lQQanSNoianwjqwJlmzZuXahCQ08mtIjWtgEMrBq0G8njZtGlkL0bBz586//PILUXjixIn6InjgBAWD5wonTLkvOEIeVdqiRwikyT9k3I++Y8Fp8lNxKQUEPwqOHXcJpWPKqTsKDtwV9OhfRwikCciOu5EA9Z0m6u7KVqNfuKu9APj3RA6oQpN7qx9N8O9aGxgYhDicazUXZy77VEh3unTpMnbsWBKLMWPGVKpUCQWuzP5fZgMDg1AFr88++8yuGhgYhDRgTSgWOAwaJUqUcuXK3blzZ8mSJXPnzs1swfCrgUHoh+FXA4PQBYhTUJ0U1tvbu3Tp0jwuXbp03rx5KVOm9PHxcT55kqaBgUFog+FXA4PQBYtb/0Uk658TIWEtXrx4woQJyWL/+OOPePHi5c2bl1bpq6OBgUGoguFXA4PQDqWqEK3+ShJ+XbBgQbRo0QoXLmw+KDYwCLUw/GpgE
Nrh8CtU6uPjkzlz5kWLFkGxt27dKlWqFBQrNUO0BgahCub3hw0Mwgx8H/1pPClsq1atrl692q5du759+5LLwr4O0RoYGIQGGH41MAgzELlSgWiXLl3aunXrkydPNm7ceOjQobGtb1eXmoGBQWhAiPFrSI1rYBB24bw1+lqZDRs2NGzYEIqtX7/+yJEj9Q0+BgYGIQL/P6Ax+auBQRjG7t27IVfKsmXLTpw4MWHChLzkJpE1MAgNMO+hgUEYRtasWWfOnJk7d+7ly5c3atToxIkTpLZ2m4GBQYjC8KuBQRiGr69vhgwZ5syZU7JkSSj2lVdeOXjwoN1mYGAQojCfDxsYhG3wCkeMGPHs2bPNmzdfvHhxrly5xo0blyNHDoTmN4oNDEIQJn81MAjb0G9VJEmSZPz48XXq1Nm+fXvt2rU3btxors4GBiELw68GBuEBvr6+CRMmHDlyZLNmzY4fP16vXr2VK1fabQYGBiEB8/mwgUE4gT4ovnnz5scffzx06NAECRL89NNPVapUUYIbyfqXqPUlUNI3MDB4rjDfj2hgEE4g4owcOXLZsmUfPHigf88uXbp02bNnVxOlYKkbGBg8X4QYv5q82cDgOQEGLV26dNSoUfU1xcmTJ8+dO7fTpIqBgYFn4f/lMp8PGxiEK/BGC76+viNHjvz444+p9OnTp3379l4WaJKm+RoKA4PnCvP5sIFBuILr81/rK5yAj49PihQp/vjjDxJZhMWLF0co9qWCxO5jYGDwHGD41cAgPCN37ty5cuVauHDh/Pnzb9++XbRo0ShRokCuNBl+NTB4rjCfDxsYhE/wavv6+kKiVEhhW7duffHiRcpvv/3Wy8vLyV8dllUo4FG9VH/w4IFSXh6p0CQ5pXslaJCHqssfR0IlcuTIanom0BHIFCWwGx4PNCkdTXVU/bnC5eijsZwRncUHL8YNAU9Uascdr16kD+EP5gcwBgbhE0RG8ShltWrVpk+fniJFijFjxrzxxhuXL19GwQrv/zLc/fv3T506dezYsZMnT1I5c+bM1atXaRXQvH79+unTp+nLo0e+5Rg7kydPjhMnjvMp2t27d8mwY8WK9cknn0jyrMDmuXPncP4pPWReKtHXakjyAsBArCdLzaCORKUqLxgat1OnTuzIkiVLQsSHcAbDrwYG4R/ESnhrypQpadOmpWzXrh1kKbkTRqGlcuXKlShRolixYkWKFClcuDDl66+/vnXrVggAtl68eHGhQoUGDx5MF2hbvYIDjNy7d+/27dtQuyRYvnXr1p07dxyvnhV0b9SoEfcJpmOLngJWnuZK3FWX8HmDy0T58uWrVq3KItgiawVemAP+wdDaEdwI8hYYODD8amAQ/kHcJIuFHefNm5czZ85Zs2bVr1+fPBU5kA7sQjqlf00WWs2XL9/58+enTp0KASBXtCWzjBIlirqgD/xEYUf4ONCKmlNxHl2dI0TASfKnyJEj+6Ec/7DbAkKMGDFixoypG4CtHRAcP1Ej3+X2sGvXLqbm3uTAMeUawILqaPpXBtIB9vN/YbdFiICfgIrkFy5cWLZs2Y4dO9QqoaBR/MslAdT9OGM3PMZJ5HbNgjTdoV22mx8P6Tj2ra4u6BHYzy8lDL8aGIR/OB8UZ8yYcdq0aQUKFIBOyPOOHTtma1ggGnp7e/fq1evzzz+fPn06OvHixYNlYWVaq1Spsnbt2q5du6LmfJoqqLvqNDkSIIm6WO0u8Kju4jNb1fKzQYMG0aJFS58+vS16ZBaolyRq8g/IddKkSXPnzk2UKJHjJJmiRidRdhl65CSPgPrChQuZ3Y8//igjUqMvNA+kqS6yKUiHUpACQCg7QBJg9fiP81wjlixZMn/+fNZcZnGjUqVK48aNo65egkZXL+De6i6hRA1Qp+TRaVWT5K5u//VW82UvtB0SOh9RqC+gIlj97BlRcRbHqcsgkIWXFub7JQwMXiLw3sWPH79OnTpbtmxZtWoVGRt14jtNV69eHTZsGMERBo0ePTqhFopav349OuSy5cqVI7X9559/7ty5kyRJEuxgYcOGDbBvqlSpUJb9W7duQRLQdtq0aSXE4NGjR4cPHw5tHDhwIHbs2IkTJ5YyEXzTpk2//vpriRIlKlSogISIDLUfOXLkyy+/jBs3rtQI1ow1evToKVOmbNu2jVwza9asTvT3A4b7+++/GTFFihQQGH4eP36cmVLB4Jo1a3766aeZM2dC4bgNndNl69atCxYs4OqQNGlSJr5v376ECRPC0yKb7du3f/vtt5MnT2bcBAkS0EQX5GDjxo0sCOuJ29DkqFGjsJ8sWTL9K/eWOy5/sA9zw/o4f+rUqSxZsmhcei1duvTEiRNcJtDH2p9//onzLC+j4waVlStXks5KIoOARcZhTDmLDBgIl9BnOOYeNWpUp+n27dtYJjlmygzKLsDrs2bNmj17NnV2mXkhl/5vv/22efPmZs2aZciQAYPnzp3DycuXL6dMmRIFJJSsFRvHTBmFjuwF5Z49e9jliRMnHj58mDXRQr1UcBb8X7iOiYGBwcsBsjHoipLoTAxdt24dj2qCkyA/iIdATLAGqFWvXp2o0b9/fx4JndQ7deokCzBKnDhxCM0rVqwgvAJ0+vXrR7T98MMPlfYByIzQTCDWZ8skxIR1SBoj6EM82EQfTbk3YcIEIjWt6g6gN6gRx7AAYJobN27IZ//glpArV67kyZNzG8AadsaOHSuX2rZtiwN4Au9Scpkgr0WnadOm+tCbklGY1OrVq+lI65AhQxga0o0ZMya9UqdODQviNl6h0Lx5c4RMoXz58lRQgzjhNlZVngMoBwuaPkDn0qVLcpVZwEOQOhUWpFWrVlx0cFXuMSLXgho1aiAZPHiwlks2oUCE6OO8TNGEAo9NmjTBh/Hjx/MoZTBjxgyE7733HkIImC3DPsvIZJk1PAqFO/qYhRfgb+qYXbZsGcvCJUyLCRDWqlULHWhYEpq4W0CoGMRtZsGOw8pMCmV5+HLC8KuBgYEL8CvUQuQ9e/asiIfklShMuCRWEijJwIiq7777rgIxkoEDB9JKgCYJowt8AJcUKlTo4sWLirzkvkhgOziYrAuuJbhnzpyZXFAW/PCrA2sEX3zAFMGa2E2gv3nzJgPBLoxl6/nDtWvXcubMKX6VhLwZNiJL4/bQvXt3CLJHjx5wWLZs2SA2nDx48GDfvn3RadmyJSkmuHLlCkP//vvvUAvT2blzJ0ND87Bj5cqVmYjIpkWLFkyfCWbKlOmHH34gl3311VdlR85DpenSpYNEyemxgEvYdH57CAlNOIZBZkTWzu2E1cAsPpA3c1dgxXC1cOHC1DUoZc2aNRGS6fIoUwzHYgIuPVjASZoEhqtduza0B+vTl6vD+++/j6ssFJl91apV0e/Vqxd9rSX/D78C+JhFeOWVVzClgQAOoIOT1OmITXSyZs3KgWFS7CnHpkCBAtRplYcvJ8zPXw0MDFwgzlISFAi7e/fuJdYTVYnOpLBFixaVjh/AtSRYhw4dIjci6xX1fvPNNzAZrRgcOXLk9evXYTUsQEXEZWL9/v37t2zZ
wkAa8XFwhadIkcQ9cBisSRBPlixZtWrVkNtKTwcGYiI406dPH7JbvMVD7hMIaU2TJo0+aIXFs2fPniNHDngUYhgxYgQD9e7dO2PGjAzdpk0bOOOvv/5ivigDJgvrcGOYNWsWtJQ7d26SewxiWc5jnxwO5xkUC6Sq8BnUaDn1LzBFmSpVKsiYOmq4AV3hRsmSJXGP5YJurTEjsjVcVooUKeJ8pzRgOFWKFy+Ot1Dvvn37JGe14UhuCdKHqtlZHx8fjCdJkuT1119HeOzYMcdCEPDdd9+xDp9//nnevHnZ5aZNm5YuXZrslqFx2FZ6KWH41cDA4F8QKAsWLEgs/vTTT6HGjh07QksETf+BEgkUQlYHIc2YMaNKlSobNmz44IMPihUrplaohcyGCmkcSedsC9gklB84cIDyaWI6GR6sQ26H/enTp5NxIgxC1G7SpEndunVhROqQK1Squkw5zvCIXD7DEGTzkOW8efPwnBIdHNBf/lBn+gDOhguRUBdPyzKAWUnuYS9oderUqaSMCAN03uvR90JTujsGederV4+8Ux8e0Dp58mTWEA6LFi0aIyIEKNMESBzRZyASX+QosGiXL19u3rw5BI+ay+NIkbg9nDhxggSUtJihtapBA85s2rQJy2fPnv3VAnutbBh+tZVeVhh+NTAwcIHoTAmVkt717duXQEnaNHjw4MSJExORpeMOYihRNUWKFF999RUVEiyI+Z133qFJ4Z688+bNm8RZEibyJNCiRYu5c+fCAbAXahrxccAIgEWGDx+eJUsWnIFUyMPgcoYWArfgAE1yNRJHuAQwHXXUEIJoFaFFQJFwHu6BqDp37tysWTM5v3HjRhjXnSaxww3AsYwRgNAy6XL++++/z5Yt244dO2A4nJ8yZQo3GDlPd0GmGJTS6cujXKUj5bRp0xj3ypUr8Cs7Urt2bQ1kGXABHaFBgwaxYsWCX2E+sv+JEydKHx3Mcun54YcfypQpgzMIe/TogSfIZeSJYETGxTH7OUIEDLLLyD/88ENcBazVmjVrnF1+mWH41cDAwAVCsyqffPLJRx99VKNGjTRp0iC0on0A8Vf68BARX7H+woULfr7VgUBM2vTjjz+uWrWKXPYvC9RhLGw6I/qHBhXy5s1LX4iqQIECJL6Eb0ZEbqs+HWRKsEWP4C6RAtDE4U4y1xUrVuAAnCHnixQpQpPDbZb6v/AjzJkzJ12gNC4fhw4datWq1fjx45FLzYEkjtzV8xGg56JFi7KwCxYsWLJkycGDB+HFpEmTuq+erWp1z507t4+PD8k36emyZcsOHz5M4s5EcBiws9yBIL8RI0asW7eOu4vI0pnO08BdWYNikBuA9hcw5ZUrV3KEpPPSwvCrgYGBC1b4dX3mCVk6wToQKMj+9NNPZEiNGjVq0qTJkSNH2rVrR0JDE92JualSpeLxxo0bBP08biDr9ZN+BQLU4saN27Jly8WLF5NN0nHcuHFy8iktPCswzohp06Yl/4N+oCvb7zx5IHtx1RPXR0AzTpw43CdwXr/GhfNKGZ/SeXaEuVPhmgJP8/jGG2/Q93HdSaZfe+01Bvrll1/Gjh2L/40bN2Y4wAZNnTo1Y8aMNEHS5PQJEiSgC6ZoVffAgdrJkye3bt1qP1sftidLlkwfVNhr9AiJEiWylV5WGH41MDBwQUGWUqkbcM+QiJ4K6JQOtmzZ0qtXr3Tp0g2xQK5Gqte/f3+a0McCvEt8R3LmzBk/fWmltGz/m7c5oAkL8BAMBz1Tx5moUaPCClRImqUTYEenQne5LTXqQBVLxR6FEuAnzGGpuCSMVbNmTUbv16/ftWvXpOMoC1hwn4WEzlhy/vr169TxmTxeznPhcHpRIqGUZSQ0HTt27P6jv5bBCPLKlSsnTpx44cKFixYtKly4MHyPt2jS1zXwf4F+rVq1uARAon/++Sc8p4Qb4Azw9vZmdlLWl1/KBwGhLFNXE2BlbltfY3nixImmTZsePXoUHRRwDzpnl6n36dNHf9nlsvJoTV5yhBi/ag8MDAxCCQijvJjETaKqLXKDXltaKcVbBNO33nqL+vDhwxMmTBgvXrxRo0aRD3311VezZ8+WfsOGDatVq3b48OESJUp8/fXXc+bMGTZsGNmYflsHYDDAEWlSuW/fPhjlvffew+aUKVMYEWWYT95K2R1YUxOQZahImiohCWtMFyRkCpAQPDFr1qxBgwbNnz//6tWrtJIrM/SSJUvKly8/cuTI3377jdZXXnll3bp1jkHUVHfAIw6ocujQoaJFi7777rtYJnFs06aNu/NA3RHKmSxZskDDc+fOHTBgAG5cvnxZdkgQ9SendNRvNqGvLn6ADmXy5MkrVaoElTIR/RERcoA8ZcqUe/bsef311wcOHFivXr3evXuzPqtXr167dq1uLbKMS+qSJk0a0tBNmzZhp0uXLuwjFJstWzZ0aNVEWrdujRyd0qVLf//99/j/7bffkjQzBVrl2MsA13r5gd1iYGDwcoO4CUfGiBED4rRFbrh37x4MQcTQ1xQQi9u3b09o/vLLL3mkFSAcPXo0QoIyvCgJ1nr06IFlYjGgNVWqVDNnzlQXfYDZvXt3Aro9kgUeoT26Y6dYsWLoAGI6/N2tWzcyWlqBn16AXFP/qjw5lhwbN24cHXv27MmjJJjNlCkTLHXx4kWMICGnRCFmzJiMgqsMKuUjR4688cYbZHtyHjtQCzkfFgAXBajrjz/+0NA4Q0csVKhQAbMoHDx4sFSpUvrtJ/piuWvXrnho+f6AWXAvgTtJc+nLcOSIMCsUi36cOHG2bdsmNzAFzWMhSZIkpJL01Yj+oXUDM2bMYFz0jx8/brdZ6eavv/5KKkwT1ooXL7506VL4nuGcFWO+PJL4UkcflyZOnBg3blyErEDZsmU3btxIIs4j68DSyT286tixIz6jI+MZM2ZcuXKl/w16qWDf4AwMDF5yEAoJ9FSiR49OlJTQAYFCkZToCQEgESvAUoRaJ4xQQU54hXgUZxGiBpcQgmklBMORWGAIHongjk0pCzTRSxLGJQ87d+4cEjIwYr3cQwcF915ADlBCijiGRITKELikQRHevHmTjjjvzBTjDAHjxooVy/27ACmvXLly9uxZdPAc/zV9gFl8wwjGJeERQsImo8sxd+ehUt0zkAfuxvnz55km+ljGAq3w5WuvvUYGPHz4cLo4+n6AWVmmxBM0dTmQM2qF4E+dOsWgLCZrghrXiNixY0OxqHHV0KS0eupCJn369GlWxumCjp9z4ljGSPz48fFfC66hX04YfjUwMDAIvYDJuIVUrVp19erVZLFFixaF+V5m0gpDCDF+NbxuYGBg8EQQKpcvX169evWCBQvOnz+frBGh4ddQCP+bYvJXAwMDg9ALX1/f0aNHL168uEGDBnXr1kVCHDf8GiZg+NXAwMAg9AJ+tWuPfl/XIKzA7JaBgYGBgYHnYfjVwMDAIPTC+SjYfCYc5mB+v8nAwMDAwCC48H8BMj9/NTAwMDAw8DzM58MGBgYGBgaeh+FXAwMDAwMDz8P8/NXAwMDAwCC4MD9/NTAwMDAweBEwnw8
bGBgYGBh4HoZfDQwMDAwMPA/DrwYGBgYGBp6H4VcDAwMDAwPPw/CrgYGBgYGB52H41cDAwMDAwPMw/GpgYGBgYOB5mO+XMDAwMDAwCC7M90sYGBgYGBi8CJjPhw0MDAwMDDwPw68GBgYGBgaeh+FXAwMDAwMDz8Pwq4GBgYHBc8FDC07F19dX8pcE5vebDAwMDAyeCyBUh2KoeHl5UUaK9LLkdSZ/NTAwMDB4XnD+aoXKgwcP/P8RSziGyV8NDAwMDDwJ0cq9e/cuX758+vTpPXv27Nu3r3Tp0sWKFYNfX5781Xy/hIGBgYGBJ6E8deXKlbNmzZo3b96hQ4e8vLx4LFCgQDgmV/+puclfDQwMDAw8CfgVZvH19YVyfvzxx/bt22fNmnXVqlXx4sV7qfJX8/NXAwMDAwNPAhIF5KyUBw4coCxYsGCcOHFeHmYVXrr8lfnev3+fbT59+vTatWsPHToUO3bshAkTpk6dOkOGDAkSJHh5TgC3S10wqbvehogRnUfdPVkKHsP9gjBZ4F5XXJAkzEFzUek+CzaUUnsqiYHBU0KHh5OjcyUEEhkcNYJtqVKl1q9fP2bMmObNmyN5GUKKg5eRX+/duzdu3Lg+ffocPXpUJ0Zbnj9//hUrVkSLFs1WDe/QO8P0qcAoqiv4IqGu10BN4RhaAU1cs3bmHhaB806pSQmOxF1oYPA0UEDQ4dFBAk/zjpw8eTJdunRU9u3bRw4jYdh9uZ4VLx2/clC4SXXs2PHBgweFCxeuWbMmkg0bNmzatIml2Lt3b9SoUW3V8A5WgPL27dsHDx7cvn07CT3nPmvWrHnz5k2UKBGt3DaoJ06cWPrhFRwAyjt37ixZsqRq1aqin7AbArg+nj17VrNQCTjb3JNix44dPXp0R2hg8JTgHSFQ3Lx5M3LkyJyip39HJkyY0KJFiwIFCqxZs8bp9fKcwJeIX50wCmccOHCgfv36P/zwQ5w4cdR04cIFWLZSpUocICQsi1aGo6DTwCNqqlOqVXKn4rRKAkX5P4K0OgrAGcWx40fB42AWGuvatWsjR478/fffDx065OPjkz17doRbtmw5fvx4oUKF4saNO2/evI0bN1Kx+oVbaEGmTJkyevTohQsXavHDHL/qeOM8t6VZs2ZxN+LKiISpaS7wa9KkSYsWLdq+fftMmTK5T/C5nrcwB1YMqMLKUN6/f5/lZcWoc/9WHBDUJQRBkMENHKNCcMM9Ihh1JNy0okWLRmvwDzPW/vjjjw4dOtSrV2/gwIFIrNk/dvo4QEmv1q1bjx8/vmvXrvRi3QCLyVHESVrD/Wdj9mEK99DWsutkaRwLTuGCBQuQqJWDSJPeIkl4BMgRSqLukqiUjuScbPcm7npI7t69K4mEPHIBpORRvRyzTkcegUZ8TmAgfONeSZBNnjw55/7y5cvyQW6sWrWKJlapXLlyeGt3C79gwffu3ZsiRYoGDRo4G2q3hR3Ic3aQCpvInSlx4sRsYt26dU9Z2Lx581tvvUX+yoVp3LhxrnNmwe5v8AisCQuol/fYsWNffvllnTp1uI6/9tprlNzCrRfl31gRgmCv8ZOL8jfffFO5cuXatWuz3Tt37vzss8/y5Mnz9ddfMxEdDLtDUIGF6dOnwxdvvPEG9Sca1BKdPXs2a9asUaJE4RK/Y8cOuDZJkiSkNKQxy5cvR8HWDr8IY5f04IDZUnIyKDlz+kDYanElK0jcr2M89u3bl2vX6tWrbZGFc+fOdevW7YMPPjhy5IhMUUJIH374IfqcGKIYx4j8r0CBAh9//PHJkycZ5cqVK1999VWRIkXy589fsWJFboIaTg5QwnaM9csvv8jmcwUc/8knn7Rq1YqxSHTeeeed2LFj44OAQuHChYkpXDCZQvi/YFoL8v777585cyZ+/Pg8voAteB5QdkWpY0xg5TRSqV69esKECeHanDlzDh48GKq4evXqu+++S5pLq7PpBu5gDXlDp06dCmnxCg8ZMoR3k0sJJ6RTp04cGFsvpIGf3AC4F86dO7dfv34TJ06EWb/44ovRo0fDZwULFrT1QgicsaNHjyZKlAiiJfeFXIk2qVOnXrJkyeuvv86l1tYLx9ALFu5B0ARUeDeyZcvGuYwXL96PP/5IGELuumtZqRsvlfTJ8Lh5oTZs2DBJAK379u2LESMGrPPXX3/RRX1HjBhBXEubNi1voPuPb1ErVarU8ePHixUrxiPWUIO3vL29f/31Vw1njXyf00Zr48aNqTs+eBDW7F2uMn1uoDiWPn36rVu3MhZylahREchoyee4sdJFFsIfmKZr6e/fJx7pGsG1Q0JKW+mFg6G1I/LNfXeeBk7fGTNmMKmYMWNyYt1P2uTJk3UOv/32W0me3ni4B6sk3L59u0+fPtw7yQtv3brFAoIVK1YkTZo0efLkUBo6z7pu6KsLfbXywBotYDAi+lTU3R2SgwMHDnBlJ7xcvHhREtLZsmXLcpjTpElz4cKFx1kIHHKMUg4DHqdNm4ZZoodrGLdTSikdd7jmZr1ZdOFS8t577+EhXcChQ4dSpkzJISRLkR1gdwt3eFnyV7YTUIkWLVqPHj2iR49+7dq1du3aQWwEIHcFd7BA7kLqkkhIkJJQScOpU6fefvttWJkQtmXLFuook/6SMcBkPK5du3bOnDnJkiXj5P3888+WSVd3SnIL8pZa8xkAAGwnSURBVIxcuXI5Eo+DF4/TTBpNrswQgwYNYlB5rhIdKgJxOUeOHLqIqHt4xfr163/66SfuTNRTpUrlvhohBUUc1V3v6LPkl3gu5xcvXsx2c6K4J2lGAuefR+yfP39eEnU0cNaZytixY0kH69WrR4jgNsyKIfz7778hiaJFi5KQPdOmuIOOZMNkloUKFaIMBLVq1YIg0fc/liTXr18nHSS9JgfQn5YCZ0OLFCkSN25cHhFK8vSgFydEo1jjuyo6k1hTk1MCV5//Qm4sWrSIio+PDyFX3ywBeMsKFy5M3127dsmIuoRPML2XCuwoF0PeH25VvDlkk1TY/jNnznBp1X4D5a+sz/fffy8JoHXv3r1wD73IX3kERLGRI0dy7EgXypQpc/r0aegTIe8GLI6QW/DEiRMZVD+jff/99xFy5ni0Lnn/3mFp5RGb9nieAzaxvGbNGt5DjjivLnQbyEAod+nS5erVq3hli8IdWG02izBEJE2YMCHLwg3dtaPPYf2fHiw4DrA7EOTUqVMvXbr0rP6gf/PmTa5HnN6PP/4YgzpjlKB///7ImSy3QAlDdr6hB6wDCwK4HCdIkIA8lUxLS8RRocKNZMqUKfCZpeWC3fPpgH2BN8sVeZ8EElCNRRfbxCNo9AEDBhCIiF2KG2oiqhQvXpxwNHz4cP8dnxIY5ARq1pbLrpXh7cCsfv7Ko5ZFsLu5AR2cF+vrR610cZpatGiBvFq1avSVNTWFP7x0/Aq02evWrStXrlzUqFFdd6qIEfPmzbtx40bk2m+49pn4Fc0kSZIcPnzYscABJVVF3qtXL+cYMfTgwYMRZsmShSEkURcpqGKP5zlgkxevQoUKDA3fr1q1SmPZzQFB5Po8nA
lZOIvMBnXq1Kljx47sOxvKCz9nzhzNV2VIAQ85FURJjuXOnTufdRdQ3rZtG3fHKFGi/PHHH5gSsEPI02ckGTNmFHOHyy0OGrQavKr6GoQOHTrw6AfosJKUgt3z6eB0OXjwIC/gihUrVgYKIsytW7c0qCw4QHL06NGUKVMmTpz4xIkTzv4i51rPZZEYtXXrVv8dnxJY46iUsFDSAqeRGxvvCDGtlIXSpUvTSuX333+3u7kBC7Nnz0Y/ffr0BDoeHWfws3HjxjS1atXK8Gt4Bsd3/Pjx2bNnZ7PJKVOlSrVjxw42m12n6Vn5NV26dO4ZMEeKKzAhcsyYMZIAWr/99luUMQ7hWQaC+A48K5YtW6avzqhcubKfE/9Sgf3SFk+fPp0AQQrL/ZoDwLaSuNhKIQqdCnxjs+BXPdptTwGUv/vuO4VC/WY4k+UyQcSsX78+8qRJky5cuNDWNnADa7Vnzx793rXHf8FV+wgYBctPBFsGUKaLbeIRaO3bty9bSTbJ5sogymjOmDGDY5M7d273WPSsoOORI0ewjykHPLIsVCgFYibl6NGj7W5uwKW33nqL1nfeeceazb/5640bNwoWLIi1UaNGyUOnKfzhJfr9Yf/gfDRq1GjRokUNGjRgv48fP/7555+z2dbhCVc/l+IcT5s2jVeOet26dUluwtkEnx4cerb48OHDffr06d+/P5d9/So4a6KfwoZ1MJe5c+dSyZAhw+7du1evXj1//vxevXqRhXDHeuWVV5YsWVKuXDnWQfoGghZk7dq1pIBcQbh28454cJWsoOIyyO3HlRg+Sg0fBy5DV65csTv/F9z+uR0SvurUqSOJjPOC//DDD1ToTg6gpiCAI5QgQYKZM2eSgzr48MMPsVylShXksyzA5ZT6VMwPrl69ysEjqJYpU8Zy7d9ow412165d8ePHL1++vOR+iDxcgf1+ScChcUCEpdTFCly7dk3/cJI+N6PVyV+HDRuGJt0RAvLXWLFiPWX+qs+H3e93tPrPX/3D1vYEZJDjni1bNsaFRZiC5uLZgTwFOQx27NhBsHsieFc1F2Cb8Ae1OuAGTfCCXLn7g2+++YaVSZEixbFjx+wOLxC2T27Q7ih/1QcqwGlSxe78X6iJWegXteBXNj116tQcVx7Tp0//22+/KSvicGLK7mZggaVjWbp3785a5cuXD27jUUsqWPvw71sjoVNxEPjC0tq5c2d3vnkc2LhTp07JB7vzI5BkR40alavh/v37UdBvcmB5woQJ0aNHpy8bzcF2nFEFOKaoBGhZoEmtDujLBR23W7RoITuWPRfQVC9pAoRbtmzhtpokSZKDBw9KR02oEQCxo7wWCSUKagUo6NG9EnbxEuWvzFYle0aFfYVQuQMCjgI3enadkwrQ0QtA6ey0OhLNUXCZ8xCw6Q4NZLd5AnL7zJkzJ06c4DFlypTczalogqETWgreQP2kJ3B89tlndGGaT1w3ZxP1d1nvvfceB4B1OH/+PK3xLEjzhUFuC7bIAo/yDVCntBvcDrAfuExY2LBhAylC7NixCbLr1q3btGkTmUTdunWPHj3aqlUr/eK6u0EDBywLN1EqLKOTVFGnZM05MyetP2d35ESDAwcOuA6rFSW4lHPho0RTOv6B2fbt2y9evHjJk/DLL7/oQPrfLP3NFVRKFsgjEYzRjxw5AnURnXjBfXx8kHPgVRLroGr9+MPV33J+27ZtnBM8l8QdioqM6wc06ViqIqBpdXIBswyHwvLly1mHnDlzckuQjpouXLhANsLNT7/kpaVjOs4yMi8tIzkPXZA4PodFvCz8yiYJznVJp0RArj+357zqyxZojRkzJhXlNDzSZefOnW+//TYW1MuzYBS75jnIc3D27FnOKxLuvO5/oRs6gcOUJGGkX08EeaeWTr0eB3T0km/dunXo0KH9+vUjq1PHw4cP05ft1t3/BYOhcYPAvfERIEjK69evE5iI4BIK+spDdVF3d2gFFi1axAkntKVNm5aLIzGabGzUqFHcRYhuxLX169fTPfDlemmRLl06FocjcfHiRSoOWNJ58+aNHTvWWTeYDKbklkZdO8LGcU1HLfC1zZQpU9mnQIkSJby9ve0+/4UuAUSkmzdvyj0u0L179y5WrBj1XLlyJU6ceMGCBXv37uURTQ7SoEGDqlSp4sQuogGP8LEUPAVcYixKRuexfPnyDvsiBKNHj4bpv/76a/fv+m/atGmnTp1c07Cc4Xwyfa4g1LWwlmLYhGb14qHlfpHgbAECFu/ApEmTIFR9nMtRGzFiRJQoUTgKPXv2RIc7FC9PzZo1WZ/06dP/9ddf3P1JepInTw4BR4sWDc01a9bwygH09fkw4YybFxJnOGWKHClJAK1DhgxBmDVrVka3DLh+03jMmDEdOnTAjsvF/34yE0xgH4POYS1ZsiRTk1wKoRDyGT+fBiwj+4V+4DOiFTVS1eLFi2fJkoV8rkGDBvXr169Xrx45PYtDuHnxa8KIcozgSDAVOIqUxB3gPApp0qRRF2CbcANCTiDJAR0//vhjrQnGLfUH5EOySTh74nK9nOBN/Oeff/Rno82aNeNWqqXj9gM/waaXL19m6aSMnCxt9erVOn4ASuZlpxcS6QQH2HdKP1i1ahW3Q9CjRw9C09KlS6tVq0bKy5HG844dO5L/4a3+BADH8Ie4N2vWLOqywEwnT56snz5I8kTo+yVatmxJl8f1Qs4QrAD3eNyDKe0Ga8QZM2YQQidMmCA3UNbhXLhwIZr6sQW4cuUK8RAadg0Tpk6pzW1uCDF+fcFg8mwVm0f6whEEiRIlKlSoEJdEbvocBSizatWqbK30UZ4yZQqpHvGIjBamRIdD89NPP/FIpHPeK8rhw4dz8rj5Et2cVUbu//eHMasfP5B4id11gFq0aIGFxo0b8whs7WBD1gDeMihDFC1aFCetY+yKsLbeIyDRjCht0QuHsya8clwLnojNmzdrLs7K+wdN6PACd+vWrVGjRhMnTuSCJYwfP14/Jtcf9tkdXhRwG+AbNzyyTHdwzDh75CIFChSwRfnyVapUSfr+XZV8+/btOszELB7tNqtVQZmZFixYUHHNbjOwoEPCyvTq1YsFZK1Y/DZt2jRv3pwt6N+///Xr1/VqAFYP8CiJrnpqAghto88HhCkCF05yQrgdEkx4wXGgTJky7G+KFCm4OB46dAg3cFL+yFteASCfVQLbaKBgcaZOnarXRL0waLc9grOALBRrSICF/uF1mHL37t3du3cvXbr0smXLULA7WEFScGw69Rf/Mj4PvHT8evr0aW6mcCqnMEaMGLxFsWLF8vHxGTp0qK570tdOw6Y5cuSAJjNmzMjdcMuWLZxaKApiVljXaSBM582bt0aNGspfZQF5xYoVkf/666+S4AP648aNQ0j+5PAcJ57jyDvMEeTRsRB8aNaUx48f53IAxZKO6w4uua33CAhp0qRs0QuHVgkHCBaubOtJIAHF7QCn40A2586dW7hwYQ6A9BkC3Lx5M0mSJKwMCV8gFp4TGFHO4J7AYSBQcvcqWbIks9u2bZsCouAEcf+uytRXX33FXLgxE
ONQs9us0wjjEpEJkWXLlg3QwksOFoRl0frPnz+/devWLBSnq2/fvuSIDi1Jh8sKB4aYcOHCBSTXrl0jO2zSpMmAAQOkYxt9PmCjDx48SIZKSvD1119zpHU8fvnlFyScgUuXLuGDztXGjRsHDx5cu3ZtzhJCZvfnn3++Y8HPIQkEqO3du5dbILkyNl1H9jH8KqBz+PBhxm3atOlrr73WtWvXBQsWsErIaXX0V6xYQdBDRw4TgcmqSTa+++678HE+Xzp+pWSPiapHjhzZtGnTunXrOHbaeF4b59BIGXBVRPP8+fNOmENIHUgBTacJIPFjwbGJD7aSNRalowMkpOLoBx+aNYDLefGIrdwnONMahVY/QEjThg0buGbaJl44cEPutWzZMv9ToFOnTs50bBOPYM3JBpvItYa5OxvHglMSYrhgwUm9e/fm0e75omA7Z628UwF4In7dsWOHzolakUvB7v9fC+xytWrV2OU6deq49wI8cofTZxg9e/YkyNr9DR6BVdKpAM46A3ehQJ1LauPGjWvWrMmac6JYT3YqXrx406ZNo46CbfT5AK8sj1xb7H6eVadURQpEsIEDB6ZLl+7s2bM6CSdOnCBD6NKlCzoo2EYDBWpaH0GPdtsjIEEObCXLE1XchdQdfWi1UaNGr7zyCl7RxLXyn3/+SZw48YwZM/zbD4t4iX7++pJjzpw5+rIqTjP3RJ11wHHncANeBkoSdG7uzpdxh2kwNU2KIEiqoT/IsdusVkLhgQMHYsSIAZONHTsWid0WosANULx4cbjQz9/nBAgpsJXcIVKkSMEW//DDD0iYLEJK6lu3bk2VKhXTjB8/PokFcruzwTPCWmzXVYa73ZAhQ7TISEjskiVLxnGiHkqWV56ANm3akES6/Lag735auHAh9ZB1Fe7Ply8faa797OtLbp0kSRL9YnOYg81tbgix3x8mChi8MLDglSpVql+/PnVO8Jdffgm1aBes3bCxa9eurhb0NcXhADAKp/y7777bsmXLG2+8gYRpqsl1+iNFgpPgYITcmp2mUIKn8Uc6KtesWUNgihkzZokSJTQ75GDjxo2vv/76yZMnuWB9/vnnxFaUUXD1N3hGsJ4sHddQssBixYqxyAiRLFq0KGvWrPpduVCytnhCef369b///rtChQp45ToNESJs2rSJMw+xSSGkwOi8ffv37yc0SYKHixcvzpkzZyh8GZ8GrsX9L16iv399yeHl5fXtt982adKEe9bAgQNfffVVMtrjx4+fP3+eoLxhw4ZPPvnk3Xff7dmzZ6ZMmZxXMUyDWXCNgFx79+6trxpH6P4a0Dp58mRNdsmSJSLasAWcFy5cuDBo0CAksWPH5jKxfv36tWvXTps27c033yxfvvy2bdsyZsw4evTodu3aoUziEmA4MHgasHSrVq2KFStW7ty5tYwcJP0Sb2Trj770c+7QAF52UmpuA/rVJ7zFPW7YRYoUiR8/Po/yP6Tw119/cSPJkCGDHvGWa0qNGjWoSBLW4Vpuu2oQrqGNJhBMmjRpyJAhvHX37t2LGzdulChRCArRo0eHeonFSZIkcdcnEN++fZtrJvyUPHnygwcPkgYVL16cLlyKEyVKlDlzZjQBL+qyZcty5Mihv0q6efPmmTNnEKZNm3bPnj2XL1/WN2QdO3bs6tWrefLkcbIrjwNn9H4uX74ccl23bh2zYDjcJnvLli0brazD0KFDSexWr1593/qjQHK7kiVLli5dumXLlhAtNw9WJk2aNMyd8EQTj1z8WQfCgeZLLzgsXbp0+g3kGzducFlhoNSpU9OFW0vhwoU15YsXL3Ir1/c/Bw4s4y2VmjVrQpNEbX2P2OPiIMqnTp0aPHjwP//8wzpLqN8sxRQjsqHZs2cvW7YsGQxb477sTJN1wFV02Mq9e/fq75cYi2VhXPS1kiwRm1uoUCGmzyMp0blz5zg88eLF2717NwPBNGhyPDhUrDA6oYdjPAumyZq3aNGCA/PDDz+wniwXawhj/fHHHyyR1B63Xy8SuMrWcKseO3YsTMYBRshWFixY8K233urYsSOPIegn7tWvX5+D97///c/b+sZW3hQO0vz583lxOD+hYQ2DCzbA4KUCEfDWrVu7du2aOXPmuHHjSHFIXp1fKpYOFeHw4cOffvppzJgxFy5cOGPGDII+byn6KJMJ+fj4EJSpE6l//fVXwjQ5Mb1g0DFjxtDr559/xn6pUqUSJEhA9J83bx511CAtDfQ8gANyiRLokVkDSQCt3BtUd83z0U+qkPOS9+nTh+zkt99+Y4m4TUMqsAtU9MEHH3CBgPZQwxoK3L6nTJlCR6bMYjLl77//ngABHxN/d+7cSVoMZ8ND+oYa28XHw/GEinymVyAd0UTH6eIHarIfLDimkONSp06dmCn3jPHjx5crVy5hwoT4zPHgsoXbbBNqGCHxTZEiBfcV6hwV7mfwyu+//87c8+XLRyYE3U6dOpUrVKpUqfTbghol/IE1vHLlChevESNGjBw5ksmyRKwe1yyuHbZS6IB2n3eWezMv79GjR3nkPsTFiDNsK4UcWDpOy08//cTp0m+E/Pjjj1zO9LvEgZz5MATz+fBLByIjt8VMmTLVqlWradOmr776KjQJhbjfFqnrfMMftBJ2ibmxY8fu3bs3QiiWKAMJkbwSiLm8Q6vQbf/+/StVqkR3aIYMFX2MUHbt2pVESt+Q1aNHD2uE5wucpGR0JqK68gzqeqROCu5StZR5BJKTjMITsA4EyUyZFMtFBQuffPIJdNKoUSOIB+Jkyh999FHt2rXpFSNGjLx586JDRxanZ8+edCEJJlLQC1Ma9ykhZaxRMgtL9li4pvRoUn6gWYMAjaRNmzZXrlxk27AjkY6tYU2YLCUpBT6zuWfPnp04cSJ3LFJkpbZMkF4cCeIgx6NDhw5s7ubNm6nD1sxdg9pjhDswNW5acAM3Tq4g7DIS6mXKlGHFbKXQARzjYJw5c2bbtm1wavLkyRFySUqfPj1bL50QxPnz57kBEDoqVKjA64OrixcvJnPlgNkaYR+GX1868NYRAb0sOHXkVICj40g49BkzZiTbI78h682SJYu+U5ScjCSVqFqnTp22bdt269ZN/2yyFe0frlq1ikjNq1KxYkX98uqNGzdg3/Xr12NBnyE/J1he2xNUhek4M6WkroojwWHpACwsXbqUAERgwnnu+yTfWbNmpYl7w9ChQwsWLFilShUm+9577zFxLGhc0nrCBLNmmkQ0whnhg0SWRSP5A4wizUCAMwwkm6pIolb/oBU6dNQCBOOiYD88MkUFOXl5hgwZ2Br8xH+cZKdo4r71yy+/MB1iH9ejL774gnuY+tKLGM21iU0sUqQIvbirwd9cSv75559ChQqJcjRKuAR3i3Xr1nH4eS9Y2Js3b3LadbMMVeBIgOnTp8+ePVs/f+X2M3/+/PLly+tz/pBFmjRpOEijRo0ihuAY17W1a9dWr14dn22NsA/Dry8diH1+IKFaBcIlp5xISuxYvXo1nKF//+DPP/8kiYFmaOWRSsOGDQ8fPkzOR/xVvgh4W0jvoGGCzt27d2Fo
wnGpUqWQE9B51QnB9kjPB664YsF5lGNAcvvh0dwdCVOWk8TNqlWrQl1wLcyheIQaE2ncuDH5K0mb/ploZk0X+jJN+IkpIySh4f4B2WAHOYkOyQ06ljtPgJxxXAJ2Q0CwNSzYIjdIHqApZsqdacWKFZAoPpNnr1y5EjZlr2mlC/6zuQcPHoRxa9WqhUQdtT5ckqBk5k6IpAt15NSJ3RpOyuEPTI0JcqXg7sgasgK6V7GGoW3WchUai2/9MwB4e/bs2e3btzdv3hy3pROCwD0yaWIIdfzcsmULLwv8qiZLJczD8KtBAOB88wbyQu7du3f//v2tW7cmh+OC+ffff0MVCrW0Qr3du3fv3LkzvNKmTZsLFy6o+7Vr19asWdOqVSteHl5pkjkSPm79p0+f5vUmBEstFIKJk7bi5FtvvUVUun79OvxatmxZmpgvFMIKdOjQgSnnyZOHOMWMlBreunULHoVxoSV6bdy4sUaNGizalStXyOoqV64c2kIG/jDN8+fPv/nmm9wVLl++jJ/k62pl9+fMmdOrV69PPvkkbty4bC4TQc4inDhxYseOHRwJbgz63dSWLVtSP378OHXdw1go2QnH4Fr5/vvv81JMnDixd+/eHO/QtsUO2I4uXbpwVmfMmMERzZkzp17hkAVniWPGMhJDbty4wTIOGDCAiIHc1gj7CLFVZhENQjN4A4kX0GTixIlfeeUVJPv27SMK58+fn5yGRzEokbdHjx76coYWLVroXx2BXdhiBWtu97xFTZs2Rc4V9d69exkzZjx37hyPoQ3WwYygNFSfU+n3pYsVK0bCShP+v/7666+99toXX3zx008/waxQLLcK+jJlVkC373Xr1rF09evXp87dgtaUKVPq8mGNEyqAMwsWLChYsCCzw9tNmzbhP/kEUZj94lbRsWPHbt26EfsIfMeOHWOvb968iebatWu5QxS3vv5i1apVTI3LE3LS39SpU0ePHp1DwqM9TPhFlChRWJBp06Y1a9aMF4Qph6r9dYfYdPbs2WnTpuXCFEp2Bzf0ow2u45yxtm3bciVFCPA21C5mIJDb7ggxftU6GoROsEHKX5csWVK3bl29nzAl7wMhlYSG1smTJ9epU+eDDz6gFQ4eN24c3El6Ry+ic9GiRfXXINAVRIsCZtevX+/t7Y0FEibrFLigEUMDrHfE9SNJMmwohDrsGCtWrBUrVpCf8Thp0qSaNWsSoViHOHHijB8/Pnbs2OjTl0llypQpW7ZssBRTLlCgAI/IWRAWAU7SGlrjhArgJ5tboUIFTQ0PybaRxIgR4+7du9we3nnnnfbt2+N2kiRJCH+w5tatW9l3eJTNTZYsGXVmWqlSJVI3psZMOQDYwSBN9jDhF6lSpRoxYkS7du2cr9oHdlsoA8d1yJAhb7/9du3atbkW8BgaXNXrkC5duqFDh0KuRYoUkVfWQobSlQwcrnD2X3jpn6c2MHCHi2es69j9+/fJyfR1KkTkmDFjcltPlCgRj/qTSvhSyvBQvXr1IBjejevXr8NDZDYuWxEiVK5cmYyQ80cTwRoL0JKOY4CHMgQBQ+ASzuMnkxJPkLPqL1xLlSoFnTiXA6bMjVtfMsCUWag0adIgJ9pWrVpV00c5fvz4TBnCVtSwuoY8bt26BZW++uqr3JaYKRNnOm3atOHegP9Ms0yZMgRimlDGea5Z+sNf2JdekDF1zgOzFr8yO9JfknumzDR5tMYJt2D6zm6G8sniqrwVbGnogO1TGFnJZ4Xrzm5XDQwegVMhatFrKYlTlxwFvQzUVfLodJHQgToC5IAITulYU1OIQ74BJkIpIdNUxd1blY4OFSRaECruclUkVKskIQ78cdzTXss3hFRUqsn/LrsLqThdAEKBLbZr4RTOfF0LF2q2NUDIVTkZelx1vKKUxLWOoXslnxX/zs3AwMDAwMDAUwhXybiBgYGBgUEogeFXAwMDAwMDz8Pwq4GBgYGBgedh+NXAwMDAwMDzCLHfbzK/V2VgYGBgEG7g/5efze8PGxgYGBgYeB7m82EDAwMDAwPPw/CrgYGBgYGB52F+/mpgYGBgYBBcmJ+/GhgYGBgYvAiYz4cNDAwMDAw8D8OvBgYGBgYGnofhVwMDAwMDA8/D8KuBgYGBgYHnYfjVwMDAwMDA8zD8amBgYGBg4HkYfjUwMDAwMPA8zPdLGBgYGBgYBBfm+yUMDAwMDAxeBMznwwYGBgYGBp6H4VcDAwMDAwPPw/CrgYGBgYGB52H41cDAwMDAwPMw/GpgYGBgYOB5GH41MDAwMDDwPAy/GhgYGBgYeB7m+yUMDAwMDAyCC/P9EgYGBgYGBi8C5vNhAwMDAwMDz8Pwq4GBgYGBgedh+NXAwMDAwMDzMPxqYGBgYGDgeRh+NTAwMDAw8DwMvxoYGBgYGHgehl8NDAwMDAw8D/P9EgYGBgYGBsGF+X4JAwMDAwODFwHz+bCBgYGBgYHnYfjVwMDAwMDA8zCfD79oOAvu/8N6A+B+IMPuEjELEClSJPfpALPpzwRWjxVzX0OzgAZhCCZ/faHwfQT72cAfXLz08OGDBw/co2oYBRvtTMds+rNCB8AsoEHYhclfXxxY6qtXr168eNHLyytVqlQkN3aDgRsIo+fOnbtz506UKFGSJUtmS8Ma2GsmAk6dOgU3JE6cOFq0aOy7Sb+eEk5conLy5EnWMHbs2PHjxzcLaBCGEB5CPG+gg0uXLkFg0Jj9HMpuD1OnTs2bN+9rr712+/ZtW+QPcvv69esXHoEZgRs3bqjJgd0h1MN296kd/uSTT3Lnzt2tW7en7xIi0KTcYTdYgAnYtapVq/r4+CxZssQQwzPBWS7uKHXr1uU8jBs3ThJw//59vRpUbFEYBAfm3r17esG5QEiiphcPhsaH8xbwynWaLdjNBkFCOOFXfXzEPbdcuXL58+evVavW3bt3rfwhFH2mRMjAz2vXrsGdtugxwO1BgwYVKFCAuQgFCxYsVapUixYtNmzYQEzR2xgmoLdUG0TFlj4epPVcPrghkcLaorAApuY+QfYaMBf2mh1HQh2JWg2eBs5y3bx5kzUk6DuSvXv3FitWrGjRovv375ckLILTsn379iJFivCmHz16VO+I3RYSOHHiROHChYk2+/btwzfnMBsEGSHGr9o/T8HLy4ty7ty527Zt46SuXbt29erVrghn/XJE2AKLQywmoBw/fvzcuXPe3t7Mjrx8y5YtEyZMqFChwq+//io16Ydy4CeBA7K8cuXK0/scJmbnOMmNBw6ASpmpJO4IKzsVmuFnDVlwyIA3nWu0LQqDYFL4z0ROnTrFnVISNb14MDSnF2eOHTtGhUfBbjZ4CthL5oYQ41eRn6fA3DgTv/zyiyZJNJ80aZKCna0RdqD1wXMus6lSpdpsYdWqVV27do0SJQpB/PPPP5eO3SF0Az9/++230qVL165dmx2xpU9CmJidDhvlsGHDSpUq1a5dO7vhvwgrOxWa4WcNU6dOzV1z8uTJadOmtUVhEEwqffr0RCoCV/LkyXWW7LYXDq2wSh1s59HgKcFy+UE4+RUbDsTOnTshociRIyd
OnJipksuS/AU4Z/+AyQj99+7dU8mlUp/BCtJhCNSkCVBDhxIu59FdX3akiRxrSPSog/uUv9mklw1OjRkzZrZs2b744ov69esjPHToUOCfMDuuAo0uNxwn9ahWydGholaABbvmNn37+VEr3WWQCqVsau7ucsoLFy5s2rSJdJy6gAWZDRBqpZQ1oClQ4ZpPRXUHGETC6E6rn44akVLKlJoCTY7E6ufSpM6e0hFQkYR6gD4jZI+48m/ZsuXMmTNI0BekANhuHrU4DCHjPMq4dKjQhJCKWi13XO49bmhBHQWN4vR1jANNEyDHoIagdFfTmkiTUq3S5NFl0W3ZrXHsE+VuwZFQQUFGkFA6agFCao5Lt27dok7F6YVxoHrcuHG5rr3yyiuxY8eWhCY0naFVOktNHVDXEBjnUXVKx4KElHTkLDl9seYM7T6KKo4aFQmB+yPlzZs3nTpdZAokSpSoVq1adevWjRMnjsKCesk+wLKM0xchcFm3oEegOiVquI3zVLCDz5biv94KWhYNBOQJULR0YqYzZYMgIzzwK+cA/Pzzz5TFihX75ptvOKmnT59esGCBrfEkcP7IEQcMGMBLW7x48RIlSrz66qvz5s1zHV63l2HZsmWfffbZiBEjEB44cKBTp05okpl9+OGH58+fRwEHKBmdCjpbt25999130QFt27Zdt27dUzKrA866c+jpy22XOowbNWpUqz1g8Ar17dv3008/xQFezh9//LFixYr40LBhwzVr1qCANfigefPmTJbEC+Ym4shnTYELNd1//fVXy54NTA0ePJgV+Pvvv+UVylQg+6+//rpChQpausaNG3MlpwnwtrOqs2fPRpklwmavXr3w7eLFi5bJAEAvu2YBYm7dujWWWWdMERoQOjq4tHz58t69e1euXFk6zZo1Y53xCjAdpkDr2LFj3c3SxOOOHTuYCzPCCI8ItVbVqlWTqS+//PLy5cs0ARYfBbv/IyC5dOkSk1q9ejWP+/fvpw4GDhzoHrZkHMnEiRNr1KjBgjMEh9PLy8vWeLTRhDxcrVOnjhxgrVgo5LbSY4ACO85CccZKliypvp07d967d6+tYeHkyZPff/89i4kDbFOlSpWYOxuEe1KgcurUKfzv06fPiRMnuJ6yAmhisH379kePHkUHD9lNeVi+fPlx48Y5Z0YWKDlLo0aNqlmzJn3xBGtsPWrSCRBqpTsTYZVYHzqyp5yrGzduaAXQcZYCPz///HO27+zZs5IArtRIRo4ciRF2pGXLlqwG4IXVFl+7do2jXrZsWYS8C7zy7ttKhSGYZs+ePcuUKYMDTHPWrFnuEwTcpRiaQ8WtEYNEG1aSmXL33bhxo5Q1HYAnv/32G/SJNVasevXqdNQZFo4cOYKENVcAwQdcohdGlixZ8s4779CRmFa1atWvvvpKv0ctZ+T22rVr6YsPnC4cIxCVK1eO2RF22D6tGJBX7DUHQKEAZxo0aDB8+HCXE48gtymjRYsmiUFwwbqHCNhFD4L3LXXq1BxNwhMHK126dEyNY8SRsjUCxT///MOR4iByHc6QIUOsWLGoE/6GDBnCwbWVfH379evHEIULF548eXK8ePH05xZCoUKFeENQ5gWwrob3JkyY4O3tjT6aadOmFSkWKVIESe7cueEk26g/YAF06dKFKWTJkoW3UTbpQsShO9d2d6/8A+UECRKQyg8dOpRIQQVTdMRPfFixYsVPP/2EUBJAhbivay9jUSEiIHzjjTcwZRv19SXSQfDIf/jhBwIWyoC3umDBgrJMa5IkSahjDTu0EoC4oVuDuNZTlYQJEx4+fNg2GhCaNGmCw7z/rL9DQupLhLp69aqGZghiMf4AbDJ69OjR0Y8RI8b8+fPxHIXx48cjYVs5FUjUC1Bv06YNBuEk5ssjEyE2YYrdz5o1a8yYMeno4+NDwJUp2zk3YO3gwYMY0dRwku7UOYqsD13IaFOlSoUENwjW0qEEnAdIyDFLhTPMHQVlZpE5c2Z8pp4jRw6GkE6AwAfc40DKYdaBw8YKMAR3C1vJ15d7AKcOBbYpTZo0bBOucj65UEKHtpKv74YNG+jIiYVXdJkD9EI5U6ZM7JqOJTpOE7clfFB3ZgHhwTco4A9dmAV1KhCJdB4H+rKznG2tYcqUKZMmTcopZSk4zIzF7cpW9fXldoh91pCKLfL1fe+99+jL6LAITdRlir5vvvnmoUOHWAHLa5vGSBm5gLpvweLFixMnTkwX1ofpMzpLxN2C3ZQO4PbGBgHonN3RENZQkZgsLxfK7Agg+NAXOcOx5pwKFh/H2AuZYt0gSCIPo+zZs4dHQEfW4a233kJIRzY0Y8aM9KKeLFkyLv169dR92LBh2Cdk4QwvmuVFJNzGJYTHjx+XMjbZFy4WtGKK8IgpDPLmypTAZU7MylFE7qyMwVPC5jY3hBi/ehCcHoVRTjBRkmPB9Y1TQqDkZdBBAbZ2QECN15LL+M6dO6HJ9evXFy1aFAuiN+mwfP3790fIa8nrxwHlcs01mTumeBRqZyyBqzEvBi5BVORJJNO8eMQOJBxrorb/P7ZxoK3q2rUrmtmyZdNnWbyTpMu8GwyNe0wZHbuDP/BSEZLojjJdIBJGJ6eUS7ly5eJVhEJgWfyvV68emsSLbdu2MRALRYnbzJQMwBmFCgRP7EZOisCjNAcNGsTcYRGu20wTMiDLJzrThA6ekOx+8MEHDMHoNK1ateqvv/5Suvw4iF+ZAvGILRgzZgxJKokpRpBQxzhgEcizoWGGO3DgAGS2aNEiwgpqJE8o4CFJhv5oksRUDlPSkdDDDhJNcFvyRo0aoZY3b148hIwpMcVyvf3227p5+F9wJDdv3kQTH3CY8L1y5UqWmqApfYIaZxIjmGKVatWqNWfOHBIaJsUjW4BlrSQLJb4nmLJKRLo1a9bkyZMHs61atUJBCNAH+rKn9G3cuPHu3btxnq2cPn06IdtWeviQdeDSA0VxlWSh2CYuFnRh30l8pYMpjhYjEqA5IbAsuRGbRc6Ht4DbIceJWwizYN04xpoFh1PuMWvuZAjxh3VgULozWVYAhtOaADQ1ogP1/eSTT+jLFnPz4EXev38/Ww8z4RKu8vbRV/rQqmgJgpQEC/AranHjxoXn8ufPz52DrSHJRohNuITJvv/++5x5CFj36e7du7v8tkaHgHV9bNeuHevDMqJGF1Zj6dKlztDECvoihP5xgNcfVv7555/1crEFnC6sYXPXrl0sF0P/73//g7FOnDjB8nIOnXcfmxwVFMSvjidk4fjGJYlEed++fZwiDgOJLPY5tGwxalpG+FWaHHLely+//JITSHfRJHmtDFJy3UcTHW5dvKdclRQTODyOM/A686Uj3jILjaJWg6AhPPArB6hkyZK8GDAQdQC9cWo5T0h0UCht7YDATZNzRkdHmZDNaeZFJdJJh6MmfuWdIZLyPtAFzStXrhBiGKtXr15Y0IEmmqBJfFRSixqAVBQCkD8Nv2KBYNGwYUPiMu8P43IJgK01Ljp2B39gOvohNFcBMjxNjZJLMauEWfIkHJNXhBKCBeAKrDeKEtJFjVjpjE
LF4VfCH48ATVgByeuvvy5rGkuQAtDHUHCw6ATI5uPQtGlT9JkvwdFZQAIui4CcGWFTQmZKyaOGpuzduzc6JBY0WYM/EJFUrFhRPwMTuCKgVqBAAV1fNm7cyHYT5qAfsSmmIEIkhFFCLXZs59ygudDUuXNnrJEfUNfsVBG/suZsBHmevAXQBi4RBMnv0QRc7AhtSCBXFCwf7y9cuJB90SGUmoy7AyEnkAiL/9AAi6AtkL6tZF25JAcYYQjycoI1vX7//XfpIGf6TASHYTXqqNGLJSpWrBiaADrh6CKkacqUKWiyPuSmGg624FLLosEHzixIuZgF6wDBMATQcO6gL0aUVA0cONBy1j607DsnB6+QO3398ytN3Krxh2WsU6cOa4IDmGVnEWIWavnjjz8wiJyN4J3CJi+X5bgLH3/8MWpcrOnLo/znYKPm/iLAr8QWloLTCFdhUN1ZGboXL16cXihT8kIhSZ48ObvMo7MgQKZQkzUmsnfvXh7ByZMnoWoG7datG13Ui1EgRd4+DDo3FQaFX/EEC1xiNmzYIGcoCxcujCYuSY2SmwSSqlWr6ni7PLZeIiBnABNnVVkuzq0U6Gi3GQQJ4eHnrxxNkiTOGSkOZ4JjBOFx0KnMnj2bQ8Mp4dDY2gGB840yy0HJI6YIavSSUDrUMU6FWEbalC5dOr23PKZIkUI6AieYV4tK7dq1SQIkBESZ7Nmzq5ctehKgBIICuQLcxpsA5RDr7bbHA/9xngq3daWnrp2OFAmfqVOSucLcmhovc8yYMakTN63ezwCsEbawTIzmLqwVc5pUuksYxYGE/kETmvIWHiJeW2/6AyrEX0dHFWaqR3XBAZiMuuMJTcRQ9pfQzzmRkJXEMvotWrRgSZHgP0OQO+ojROp0pC5ug4p4BDT5gbtQdUr3dRBIa0ji8VZypkbdvS8HlbAIkZBmIVfUIwkj3rEvykRt1f+CsThXzBoFbnjcliTnkSbV3aHFYe6sCaDuXw3fhg4dyi1Qg/LIIaHOdYdcViEYOceevnAV4BEFDirecshz5syJhFmgwF2HQwI9kCDyKJt+gFdQMmGdY0meLfuACvyNUI+BA2XsZ8iQgbyTt1IzFWczhX79+nGtRIdHSr2zMB8lLkFjJOVUqlSpwgQ5AHKViwXKUBcK1iD2waaJ5BJlWqkD5dn0Uisls0aZJL5Hjx6XL192OqryOLCGHDm2pnXr1mjKE0Ak0Ucsy5cvJxrY2o9eMd5o9kvOoCwmtjUsr5JYP7ihL5djThoSOemo8ci7wEGCrbV6khsEB2GYX3VEKDkxnAYCU758+XiRaOKgcPGkzuX3zz//5Nihpl7uQKhICrDA681dm5ezZ8+efn7yL2AQNcibCMILoCOIcY4+FqSDkHs6bwgV4qMkdESNUhEnQGcCRPr06SFUZjF16tQSJUrMnz+fGyge2s2PAc4wL8alO0uhoXnUOhCt9LLJKyoAueq2iacD02nZsiUhjEsM4YAEbuzYsfqdIFvDgsxqkR2oyT9oQhN/2E3CBOsMf7iowNtbH17R6ritR0iFTGLIkCGfffbZzJkzNbpKUKRIEfaLSAqHIaQXmdn27dtZBxIdaZIGUSGFetXCKxaY2tWrV1lJwpnGChBOkyqU/leyZs2arLYWnBLadnTk565du5jy8ePHGzRooF8opaTOBQv5mTNnUHZm5A7ssDIk0CisWrWKt4D85q+//tIBoC+gruFYBCIssZhkGnBKsSA33EGE1U8fAd4iYWgAdXHL4VETsXRdDqgC9ONbllEfunC344rJJQaO4ahwSNAJcBaARJMm2JRMF59di/XoOvI0kFn0OdsY0Xy1OFijVZ/JI5Tc0Vdf1plLDI8k5ZwKdp+SKXz33XcocGng0kzFAb5BrhiXNUpdUmWZR8q8efOiw8QxwmGGjw8cOECTtScuyAd30Lpz505K3inWgYp2gYF4kXXjIRrwitkdrLsgyoUKFWJoAX2lB+72OdVcH7kJderUiRx95MiR58+fR4HutoY1KQZCKP9tqUEwEGL8yv4FEzICKf72229UOC5NmzYlhaUE+s1GdKhwxDl2Vqf/gFYdd96uDz/8kNhEXPvkk0/osmzZMh0y/5BZ9QVUJKFJXYhiknAZdOlZkKYUnhJOR13qoRACLu85HpKv02TrBQRa3b1yGXLTR+IIgS39L55GzguZOXPmadOmkerxiFft2rUrV67c6tWrebRs/wurhy1UPRCgg/9U5CpQL6v3v6YI3LBg8eLFudr36dNn4sSJMCXKtKoLJZdx/W7R5MmTiSY0cVlhx4meBCbZuXXrFgqcEygKkEsxhU2bNsHoLD6jSI3SD9ydUelIHDARsZ1aAXWEQI/osK0IUWNc2FE+cA+AzzhFyldQU+kOy0CEDh06DBgwAE3mNX78+MqVK7MRly5dQoFRNLUZM2boN1Hfffddrmi8NeIM2XEgCV2cR/c6pR4pARLZV4XwTRMlJwH/V65cSckywj1EfC0ycNnyB95fjJA7Qg+cK41F6VScMkBgFgdUR00DuetLSKmK06pH1kdbAHux/uwCwHluPHjOESLnkx0HmjiwTNqvm4S0UjIRaKxJkyZMCjsk0KVKlRo0aJAOoXpZlv4DVgkjzg0MoKaSew8lnmidHSi+SYdHR1+PgMfkyZNz9eSuCYNC4e+88w7OcF93dKgoSKJMhUfHoMFTguXygxDjVzYvmMAIU1qyZIk+dyKJIW4SQwXu6RqIt2X37t12n/9Ci8Lp7NixI9kPRoYNG7Z3715MEanpa+u5IUAhcJfLLNBnSrb02eH0VYVQ+/XXX/PS8gYSK7Gv1mfC0/vzOE13uaZZuHBhghFvb/Xq1XFv27Zt8JmfNX/6cYUnjs5mwTqNGzdmr+PHjz9hwgR2bd++fZ9++qmjQ0WrxH2LsLJ//3796hCrR6gi8CmOoENyQFm6dGnO0uHDhzFFKZDWwEmyZlkNGIG0PrFJDlAnCyHFsQd2g37dzIl67pCE6ZCXkJTDsmSZhGDuiG3btlV2gn3SMvJIyKN9+/bQNikmZYDfaeB/CCFAuR+hlhEWZyNs193AHVH08zhTgFcGHdXtBgt+Hh+HANUe11dySpGT2Kt///7uznMe2JGlS5dyxqxOgcF9IM2CFf7xxx+JRSx7vHjxuEN079599OjRKNDqri84J4GMWWvlDiQo8IpB2LboSbMTtMXZs2fn/HOvql27NpPlkHMr5SYkHfQ5LfB6okSJ9MmBn4Nh8ESwhn4QYvzqEXDgfvrpJyrEyo0bNxLZSV+EzZs3k1dxWInCVJyX1h1aEd4fAi5Hikt9mzZt9GvuHGKOF0BBFWB1egJQ40XSBZz3kyGQMDpwv7c+PVwDPzroiRMn1k/C9PmzreE5aCygR7nt4OTJk35+RutEfF54yHX69OlEE3K+K1eusKTScTcIqAdtHfyAoSH1FStWEPggj1dffdX9ym+N6QJ1lKEc/eIJJ4E7+6lTp7JkycL9nVa6AP3+Lfd6No4skJyVTaQELLhoQ+O6Q0MA90cWShUJnwY4QOyjwg2A7nKAEh8EO
RCgTfRVwXOieefOnblQ6ntIFi9e7JzAb7/9Fup68803v/rqK+bO8WZQLQ6la3f/+3HiM8HlmbXUefPmpeSFwpozBU0HMAsGpcnu5ga6J02alL7Xrl27av3jHBIK6kLF0vUwZJbDwyGhTuiQt0D+c8AotVZPD/TVhSnny5fvf//7HySdLVs2Dj9XHy5ANPmfEV30SxJnz57VL44gpBRYWB4JAvqU/pmAGwzHLbNChQqTJk3iRWDK3NT1axMAHU4FaQZhUMpI/Hto8EwIw/zK3nPTXLlyJZV27drlzJkzR44cuS1QB1WqVOFk08oZ8vPjEwcc5dOnT9+8eZPTRnedM0r9yI1zpgs1oVlNTwTDJUuWLEWKFMS7X3/9lXF5o9T36NGjurqCpz+4KFOiT4U7xLlz57AcI0YMtXoc8g37REP9GiHOg8uXLxOdL1y4wJrYqtbPXylpVS/e3nr16sFbCFlVS8UFTUGmnn4lAwc2uYBjinCT3vpLTQUFfcrHo1gcodCsWTPKOXPmfPPNN/Rq3rw59wAkKINatWoRXJgvGyQP6Ss/A/FWAwHssCl00aeI6q6mwCELlDVr1sQBlnfMmDFI1F1nTw64vHw0nDsYF7kGlQInmSRezmi/CKNkrjTlsf7gR8q0Otun4TRQ0CALlSpVIvSfOHFi/PjxWkZtOq2ClP0Dz8ndUeAOR46lLvTFyM8//0wSqTkCu4PnwPowHMvIGeBx5syZe/bsoaLz4z4FS/3ZYLns6oj/XGsIU9TZFBZfm+UHCEuXLs3bd/v2baIHQwOtJDFq6tSpeFu0aFG22O7wjJAzTJbYqE9lzpw5g3E1gcqVK1erVs3StZUNgoMQ41dtZzChm6CPj0+RIkWw6byEOrskVfpLSn3CoyZ3yI0E1lcxXLp0acSIEbAI10aicLdu3WglWOvd5kRKWaV/OHIqUaNGbdiwIfUlS5b07t2bkEHCRMQh+uif+1BYl34gQJO3S59Ykpr/8MMPjRo14mWgO6/B4zwJHIH3ohX7+JY6dWreau4u5IisDOUrr7yyYcMGlpTRHSPUSYzmzp173vqCHrBp0yYuASjoJ7KAOhkAC8ibPHv2bMULNQWCx/npyKnob5D0V5JQCGHrl19+4QJOEz4znLsyi8+9BwZlJUlH6tSpoyZAK/cwbgb436dPH+7vpOnIecQOi6A/LHGsOZCEEoMoYHnt2rVWSHSFZgf+O7pDZ5XMRmf1iy++GDx4MOPKCBXyUc6AM5bV6V8wLleZnj17knwTgnnk0E6YMAFNqE45GQeSLUDCQmGK2L17925uS9zVGJ2MjbHoKIP+hxAClEtIqSMNR4qlWMZhw4aRjMoyr9WKFSuOHDlCk+ar7u4oUaJExowZUe7evfvy5cs5Ubt27erUqVPXrl1lnF4BdnRHgAqP6+XIqWCfBUmVKhULzkaw6ZxShIQXlvePP/7griPlQOA+EPWFCxcSUo4fP65LNqtN1shcMmXKxHuEDval7A5omHeN7p9//jluaNyLFy9+/PHHHDAuhW+99RZvk5SB+6Du8OMMnsyYMYNVZV44wwHgXGEnb968eqMBc+/Ro8eHH37Ia/KUC27gDq3Yf2C3hEFwCHghOQffffcdh0aXfTVR4QwBuE0/OKlfvz4KSBwdIJ2rV6+WK1cOiuW0cfQzZ86MfuHChQnHnDzyYAIufRlCf/9Kk8MQsqD3gRhHXZ6QLsD6YuWYMWPGsb5cNE2aNPrWiMC/X0I29fevztGnF3WATV4wQqT7RPyAoKBZE2cl0YLIf0bnhXeWiCAYN25clLkd690D0CTRGSGhWYvMuF999RWZIkmqvoJOqFu3Lq0EJtLWsmXLcrNGkyF4k52hqTN3nCc6lClT5rXXXuNNVqt/oN/U+vtX7ijuc7x+/Xr+/PkZa5L1LzewyMRr+AMJlnPlypU2bVqGxj5sh6R48eLMSH2xg76+oYImLGsF1CprxEH8R4GhmXL58uWZkf7el3sSK+PujGAtgOtEwW0alGWnY9u2bTGInCPKTQULulg4FtavX88yRokSRT9mA9jnzNSoUUMecggrVKgA5egPLX777TcUHucDlEm85gCz7IxOqcPM3c5y0IXPPvsMNxAmSpSIkMq2cjvhVoEm98sOHTqwI9j/x/r7VzZx+/bt6siIzIX0Gjc4eEjsgR8+5NaFMhPft2+fNCm5SlasWJGB0IcqqJcsWVKfeTp/YA1sE25gFOifjkwfJE2alDSOXi1bttTflw+wvr9JygH+/et7772HP7zLjhoz0t+zYVZpMQ4IzAV59erV9YgmpfP9TaxPsWLFWExedvaUiehTa8De0crQ+qMpIAvcJzBIcqlHDI4aNQpTrA+m2E0tAu8atwcpUP79999YYxf0rRGSc7CxgzLvC8eYNeScY5wps6e83dIEGlS3YTnDHFlJfVdJx44deaQJsIysKjGN7WBebDq+Ee6OWv8unjrqINFx8+bNdJHcIDgIsfw1+OBEcsq5Muvl5/QwHzXpEfCWElU5Rrz23EM5MZwb6QDpEE3Ig0loeLVIXjHC7ZsUduDAgdAGWRFvJsqcP940Ap9irgPkKVKkcL7FDTB6kiRJCKkQCcqECSi2du3aM2fOLFCgAM7oN++lHCDwAQvY5L1CXyhdujTX+QULFnzzzTe8G85k/QPjdKQ7Dtsiy09onkiBS9QBFgAVjKNMq7NuefLkgZvJQVkcUkMWecqUKVzwoUm6EzLUF7z66quKp3v37uV+Tfju3Lkzq+c+NEZmzZqF/3A2u8YrzaB2W0Bg13CJsbRKjKJdI0LhJ9aQ4GTy5MlZZMwmTJgQJiNycRNiaFYJCwcOHCD4WvZcwFSLFi2I1KzAG2+8gcN2gwUeZY07hG4/9D148CAUiDLHgBFtVTdorbCMzcmTJ4vYduzYQUxHnznSykbQKp/tbtbfWzMRclbqdBfweeLEiYMGDcIOlx4cIN4RfLltcK2hOzrq7gdsBwSZPXt27jFsAeeci8j48ePff/99eQi4rvXt25eJYIRskhsG6Qs+58iRA5/JtLi+4DDzxVt8Y1wsaxaULA47goeSaFyiP5rcRZgOQ1iTiMgBmD59+tdffw2Ls4xEamaBQvPmzVGmO0BNFtyBWV5V3OYd4U1ELWXKlLyJQ4YMYQ0ZhWnaqtYf4GENV+EDW2R9Vz5OouzYV4VNB7yGjucYZy5YYEEQ4jwHAGW4B/LjegQP7d+/X4vJ4kNOjKi+ztBaIqBROH7IOaLOKKwAe4dX8CUXAq50JPd//vmnvgxHrzBGmBrusUR0REJHvJo3b94HH3zAK4AbnAT6wtAEkI8++ggFqaHPmjAoB8k9rAEWkKXgvZAcfa5u1apVYzjeU5zhTeRgs+8sMs6oFyeBXoCd1aQ0kEGQ4fqAwq6GNUCWnDDOgY6CKkCtArPjhHGAdHY5QJR+YiugFWtQKeeYQ6mXFlMESm6LktBRfVHW26iOeqS7Zck1os4rcirEMmIlx1ohw3LQteaOBf+g1bGGw5IAKti0xC4EYsFyyv5pHGqUsinHWBAJ0UFCk8sn608U
kDtN6MOs8BbOQ/bI0ZEbQF00CsKbN29ywefizCWDl1NuazhasSxlbjlUCArwDRXLkl+g5ph1po9EFZW0SoFHBoVXeCSQoS+5PhxGoiCIxFlSYPW2waOGoyI1gEHMEoO4cxD4NKKmEyAsl11/nUJH9Jkgi4CcjoLWWWuI3HHGMUurVolHhtZR1Ccf2hGg7kCPgkwhpMLtkC3jkcOmpVMJOMZ05zSyTTABYER6QasXL17kMsRYqMlPdVFdwzE790f5jNDxWU4iQQcgwR+9PhAbC+J4go5Td0AXmVJ3zgklXtGXVuTYlw+OPyioyXEGyIL7WDxSqiN1SlVUVxfHeVaeOk24zRbQxBkAtGKTRyoB7p0GkhyhjOuRU8GaYxBiJpLIvjMikD6QPh0BckrWhDWkxAfeLISoSUG90HdKzZqKuxp1l1HLLEKc4QzgDDvCGZDcmQVTc+pU5BsKrv4GQYJr0e1qWIMOug6E4P8o+Dk0OmQ63O7QiZSCHtGho3P4JFFFEkeTikq6S65H6oKjqcPqx4J/oCBnUJOmHtGnlAKl0+ofskCFLu4WVHFp/NcrSh4lcWxq+lToQoVHhQa1ouay6GYTCWqOUI9qlcNAClScVv9AwRlaa+6y+GggWXBKVwc3SOhMh0f/Fhha60MFSOjelzpyCSWRKYRUEPqHDNKqw0YFOB1RUMWxoFH8SBxNNakLJQjEB5kCUlZdQpSBI0EBsCCqazeBWh1NQcNRAmcI6ipVkR3V1Z1HhJKoyVFw6ioBQgfqqIpMSRM4dUdOqaH16AytY0PFnV/VpEcq7rOmRN+/BepIHB0gBZUBDi1IRxWVgtMR0AsfNAqPlNJEQoVSQrpIIiESoEcNISEVNfHoOCOJS9XNslMHalVJL3XUoyOXhCbpGwQNrqW3qwYGBgYGBgYegn1zMTAwMDAwMPAgDL8aGBgYGBh4HoZfDQwMDAwMPI8Q+/mr+bmvgYGBgUG4gf/fBTO/32RgYGBgYOB5mM+HDQwMDAwMPA/DrwYGBgYGBp6H4VcDAwMDAwPPw/CrgYGBgYGB52H41cDAwMDAwPMw/GpgYGBgYOB5GH41MDAwMDDwPMz3SxgYGBgYGAQX5vslDAwMDAwMXgTM58MGBgYGBgaeh+FXAwMDAwMDz8Pwq4GBgYGBgedh+NXAwMDAwMDzMPxqYGBgYGDgeRh+NTAwMDAw8DwMvxoYGBgYGHgehl8NDAwMDAw8D8OvBgYGBgYGnofhVwMDAwMDA8/D8KuBgYGBgYHnYfjVwMDAwMDA8zD8amBgYGBg4HkYfjUwMDAwMPA8DL8aGBgYGBh4HoZfDQwMDAwMPI8Q+/fVzb/rHg4Q4CZGjGgfKveK1eIBOCO623RG8fX1lfw5DS2z7jY9PgpwBrKf/U3WcYP5Ror0XK7Isq+KSveBJHT3yrOQfUGePL+xDAw8Bf+n9D+vsYHBM4H4TqkI+ODBAy8vLwmdc0aFViQ0USFGBzNQ3r9/HwsyolLD6Ri7Dw2oy6UgA7Oy7NSx6cxC01cTEglVBhmYwiyl/WwZdEZkkamolCT4SyprqjsV1tkhVPdWTVkSSlWeFVouSqfi2KGuivtAjkLwJ2tg8CJh+NUg6Dh8+PD58+fthwgRsmXLFj16dNWJ2idPnjxx4kSMGDHSpUsXJ04cIqMghaBB/HrlyhWHSqlg3Nvb+9atW7t27UKCHGHChAnTpEkTOXJkq18Qce/evW3btvGOKMTHihUrQ4YM2HSGlvzOnTvHjx+/fPlysmTJUqdOrb5BAzaZ4+nTp48dO8akMmfOHDt2bG4JGpGBtm/f7syd1c6aNWsw54g1GWTQM2fOINF8M2XKxMLqEQW2EuASE0ycODH6wOHgZ4Ls050tu3HjBhWEKtUUP3586rTu3r3baYoXL17GjBlVNzAIG9CBNjAIAjp37kxwB1EsbNq0CVqFHgiaH330UdKkSYm/8Gu+fPnWrFkDVxGm7Z5BBcaJ8nnz5k2bNi20DUqXLn3x4kXke/bsIfoLEFLLli0Z0e4WVJw7d45ZRI0aFZtMs0KFCnfv3mUswEwBhPfHH3/UqFGDpZg7dy6+2T2fEayMDGL/iy++SJ48OUvHkubKlQuzyKVw7do17g1yBvj4+LDUtomgQpaZEeMyqCyD1atXs4DImSO7Wbdu3dGjR3/33XdMdty4cbdv36aXbeIZwYjWEt7v1asXO5g+fXrtJncXSk6Lxt2wYQN3C02WDW3QoEGQRzQwCBGEWP4aUuMaeBBdu3adM2fOhAkT2E1CM/krbEr966+//v333xs1apQgQYJZs2ahQ8azbNky8g+7Z1BB2O3fv//MmTOdRLl+/frvvvsug8JMO3fuREL4btOmDRz8448/BvPz4QsXLuTMmfPNN9985ZVXGCJu3LhwACmUTi8JFjOdOnXq8OHDS5YsqWQuaAmWDOL5Tz/9NGzYsGbNmqVMmXLevHmTJ0+OGTPmqlWrcuTIgQIEs2PHDgiPeQ0cOHDv3r2wIGtu2Qg6NHq/fv1GjBjBiPAZs2A3GRr54MGDp02btmDBAtiORxyoWLHi+PHjK1WqFLTJMk3K8+fPczdiCGaqRBlrzLRMmTJsHPWbN29yZ0IZuuX6AgdPmTLFMmAQVhG0AxOGwatlYBA0wK958uRRKgOIgwBOat68+fHjx5FAh6RcEC3c06dPH2Kl3TOoOHbsGGGdIRjIlQFZ0NAaTvIiRYo0adIEid0tqIADyF9HjhzpGJd9HiHXTp06JUmShHsDEiHII7IydGetmjZtCmtqFBb2nXfeYem4QMg4dwjHgY4dO+bOnTv4+StgdED+Sip55coV7AuMBZcXLFiwbdu21BlUPqRKlapbt27U7f7PCDpijUsSN7Pr169jUBLGgtRnzJjhjE5JE+tQoUKFevXq2f0NDMIInssvHxq8PCAu6yRRhwnIq65evdqhQ4fkyZMj4bpKdsUj8t27d0stCNAQgIhMunPy5EnGxbiAAkPTSl0VJIyoSnAgm5ojjxgHSAj65Hnw7scff0zmKjV1CTKwANO0bNnS+SkjU2jdujWZ+sGDB3nEDY0O5A+lKh6BrDlDU2ogSI47BFcNKVy6dInMkmxSmkFG9uzZGzZsyOy0U1jbuHEjFFusWDFZplTFwCCMwvCrQdBBWBTlECJVISCmS5eO9FGPAEn8+PEjR46cJUuWoIVLRoFaoLTTp08PHz78559/Ll26dM2aNVesWEHoR0Fm7fGsnyAiUci2DAQdsoNNJujMEWdIoAcOHAjTk5o7mZb8tHs+IxgIywkSJKhYsSIVjYUwTpw4UaNG1e9MucZ2W2qGU0fLgAcgU/KEigairFKlyoEDB7p06XL58mV2YfTo0VmzZm3QoIEcCAIwyx6Rqnp7e7tmYgH57NmzIdckSZJQxw1nplQ8OE0DgxcGw68GzxdExmPHjkWLFq1GjRq2KEjADilUmzZtGjdunDBhwiVLltSuXfu7776jCVajFUjzeYOIP2/evD179hQtWnTSpEkkYWXKlHn//fePHj3qcR8gcpLFV199lXqQyTs4gES7du1
aokSJqVOn1qtXb8CAAVu3biV3584UZH4NEExzwYIFLKa2EtgNBgZhFoZfDZ4LCL4CiR15Sf369X18fOy2ZwfRFlNkPJ988sn48eNXrVpFoCet6dOnz7Jly2iFe1CwtZ8//vzzT0ryuVy5cn311VfMbtSoUeSy0KHcCI4z9HW6T58+nfSxVKlS1OF1CV8kWFuodMKECVDsypUrBw8e3LJlyxQpUqhJOkGGZips37797NmzZcuWlVkk0jEwCLsw/GrwvADnPXjwYPPmzZs2berVq1eQwzEdoZbIkSPr40RoNWnSpO+9996QIUPu3LkzdOhQRkHthUVk5rVt2zacGThwYKVKlbJkyYIzUOzGjRt//vlniyxcsLWfEXRkOgwBDh8+vGjRoi+++CJKlCj6sDT4lPas0IinT59OlCjR66+/fv369ebNm8+aNctTZK+Fovz999/Lly8fK1YsRgQhcpkwMPAszCE2eF4gaBKOoQd4KFmyZDwqWHsKDRo0gN727t1769YtHj1rPHCQuTJcwoQJ9Qj3N2nSBAokmYYdxRlBBpYxcu3atZ49e/bt2xf+thtCCDt37iRn7dat27Bhw/Dnxo0bHTp0+Pvvvz2y4Bhhue7evTtv3rw6derYUgODcAHDrwbPC/fu3fv8889btWpVsmRJwqgiqd3mCZDVFSlShDxPj541HjhI5uyaBaYGC0aNGpV8mkdcQqKmoIHugwYNIp8DIZvJwXzdu3fPkSNHvnz5mGCnTp1gfa4XgwcP1scGwYFzJKBwrmJFixYN5roZGIQqhNiry3tlEP4Atdx/9Jea//vf/woUKFCtWjV7yy3GdX3u+ex/BYtBQEeVsoBBHn18fKJHj67PTqX8vMG4zIuhjxw5or/dxJ8YMWKQxaZLlw43guAJFlg3LR31kSNHJk2alKxR5IpQS2drvygw9Llz5zZs2JAxY0Z54u3t3blzZ641e/bswVtbL6hgRtgEv/76a7FixZIlS/bi52jwIqHtfnkQYvyqMGQQ/sBbBB8MHTo0WrRojRo10nYj3Lp168qVK52Qams/BaQP4BjsKAQT7m/duvX333937NiRuszaHZ4zGK5JkyZkz4sXL4ZTFTWOHz9O+eqrr0rnWZ2RPiUYO3bsqVOnWrduzUAIsX/w4MHly5e/yDkKDMe9IXbs2JcuXdLQlEw5d+7ciRMn1p/NBAdYA2Suf/zxh35HWlM2CK/Qjr88CMmPngzCHyADWLBfv37w68aNG9988802bdq0bdu2Tp06DRo0yJkzp8OvzwTeTMy2b9++atWqkPSFCxegnE8++aRWrVrkUnp1xXMvAEyhePHirVq1+vnnn3FDvo0fP75u3br669UgeIIRSjLCb7/99osvvjh69Ojbb7/9xhtvwLINGzasVq0auZ00XzDixYvH9i1YsGDv3r262Vy+fHn79u2dOnWCX22loEKz3rlz59mzZ0uXLu1IDAzCB8Iwv/K2kyfxwvsPZ0hoFWxRGIRmoQ8GgR7tttAK4uO4ceNGjRpFcklQnjdvHqnJ3Llz169fX6JEiSRJknh7e6MThDAKb6VNm3bNmjU1atQoX758ly5datasCckp45GCNJ83GChq1Kh9+/atXLkyt4fBgwd/+OGHCL/++mtmJ4UgTJD9Zbm+++47lu7PP/9k0ebPn8/qMWXuJZkyZQqCzeCDQWF61vmtt94aOHDgsGHDunfvzrUJyvf/3j0rsID9JUuWcDbixo1LPfg2DQxCD8LwgYZseCGhWP/hjEkJyIN/yw4p4D+zo3Lx4kUmQgBiLqFqOl27dl20aBHcGSVKFEnYlDt37rAjeC6yYRbOTkWPHt3PTj0NnHWgJHkieY0fP37ChAkxxUD+dx8UK1YMQho7diyttihIYCy4rU+fPuRwtsiaoyo4dv78eVzCGVzCDSRB2yBs0pf8lbp/n2nSNJ3JInnnnXeWL1++du1aD36/Pxejbdu2xYwZ0xlFwD1Y//Tp05EjR06ePDnbjRBngGUgiMAsOHLkSLRo0fSFmsC/TZYFOuf8T58+3RYZGIQFvKAr//OAXnu9jVYQ8Ata/UfeMIeTJ0+WLVu2aNGi+sdNQzlYdivquv5tNfhVICgDKrZSUMFuJk6cOGvWrEmTJoXGeAyp/WWaGhp/IPIECRLA/Zw3tQYBsuYslB8gDKmZAoZmQyHdjBkzpk2bFmec6QcTGAHp06eHXD1i0MAgVCEM86tezl9++SVPnjx58+b1cQOPBQoUqFSp0rVr12ztsAklQ1zwDx06RAQPnTHIdZd59CmImNWVZT9K4/BZj0CSIACzdBfNOJBBKtJxoHuVu1cBQgooP1GN0o8a/vgpmTXuBXmOGJedAC1oaCl4HLoWMIQgoWAts2t5GVrgUdN0moIJ2QEMLft2gxvkGLCfDQzCCJ7LG/tioNcSBiWx27Fjxz4Lex9hz549x44dC9PvpHusYSKar/0cOuDfHznpJ1Cq/rjo+UQ4BgMsbSV/oNWuPQbO2XjiIUHBXUdDC0904ykhO4+jZ+R+KE2k6DwGBxjRjcR+/u+CIPc/X6diKwUVjh1VJFGTO/BHcnfHDAxCP8IwvzrgrUuTJg3kev78+YsWLly4QH3r1q36F6ENnitCW9QjFusHmU/EE5NX4Ir9VnZFaYtCAeQMXj3R/6eBJqg6lVA1U+GB9WsW9oOBQRhBiB1ZKzJ4ADJFGS1atJgxY8b4L4gUxFCFUaA6cB4peXWBH4nqagJOk6C6I7GVHn31ARXJ/T+qAlR3mtz7SigFB86KOXJL/V8P9ehHwf3xeeDs2bP6m9RQBdbqypUrt2/fDvyYrVy5sl+/fmJiW+QPzO7GjRtYEwOFHuAPXuFb8PdX08fOuXPnOEihcEPx7c6dO9ybA99Qg9APdvClQojxKwHCU8Cads5+tj5rEhDeunVr0qRJY8aMmTBhwt27d3lXpQ+IrdOnT//pp582btyIHCAkbM2bN693794NGzasX7/+66+/PnnyZIwQemzrESOSH/9ogZB09erViRMnNm3atF69evQ6efIkRghSGzZs6NSpExaaNGkya9YsxsK+3T9ixNWrV+MSAyFcs2bNBx98QPcGDRpg89KlSyhbDtqfCQNndlToQok/DNG9e3d1HDhw4NGjRzUKoC86e/bsweCUKVMIT+ruWbDCDOR8dBlKgEssTpQoUeznx+Cvv/4aP368s7CPgzPBJ2q+SLDFeOX96I+d3KETwgroDPiBzoaapE+dUu8LFT8/5A4NkOdcoB2fDcIotJUvEfSyhUW4aMTX9/vvv2cWadOmJZeyG9yAApz6zTffEIwIH6NGjYJpEBJ9oCJoL2rUqMmSJYOoUENy/fr1ChUqELbQjxMnDgkxFSJOy5Ytb968aRt9+HDbtm3Ro0cngv/555/FixengnENkTt3bij2u+++ixs3rgIWJaMMHjzYPTPo2LEjblerVm3o0KGxY8fW4ZOFkiVLHj9+HGdwElePHDlCIo6yLgHIAcnZoEGDYsWKhasMROihb+bMmTdt2iQF9R09ejQGs2XLdvnyZXtgj6Jr16
45c+bUWLYodCBfvnzt2rUL3Ksvv/ySlWElA1E7f/48x2P48OGhbY74wxEqXLgwNz9b9Aj4ySEHTI1T7QfIKTkhznQwJfTp0yddunSBL0iIgBenbNmy3HRDm2MGBoEjnPxIA3KiZD561NwALyQE0759+1q1aiEnTfznn3+kcOLECR55dUkB8+TJAz8hxw6SNm3aTJs2DdL9+++/yWIJPWTAO3bscJm2QHc0kb/55ptbtmxp1qwZKey7776Lke3bt9P9/fffT5o06bfffkuGlClTJiLaiBEjyHTt/hawQF9GT5MmDbGeNJpMFMuktggxrkkFiKlTp3788ccMR0fIftGiRUWLFt2/fz8ZMx1R0NzjxYsHhWTMmDEQU8EEAzll6AHzdXdJq+GUQEJKJJaK61ESdzh2qAAJQwlwLECXEEKitWvX5vZWJSD8/PPPtqobnLn7X4QQh7MFejQwCCsI8z9/BTAoyWvTpk0h0ZoWCC6dO3eGKfVOkoCSUObKlYs07q233kIZwuvRo8ehQ4eqV6/eunVriApN7JCJQl3wIhYyZMiQI0cO8s4kSZJgCvayx7NGpCQ0X7p0aezYseTQ9evX5/pfoEABhlu4cCFRbPny5W+//Xbjxo379u2L8NSpU36SSEa8cuVKkyZNli5dCh9D0gQ+SpRJrI8ePWrrWXBWjAoZRv/+/YmhpI+QevLkyYsUKTJmzBhS6jVr1uzatQs1jIMaNWpwReByQAas7h4H/gDGsp9DB/AHr+yHR6vHzWPdunXceCZMmEDJhenixYvU2UEuN+z7jRs3pO8Oa36ePK4egRZclwNJ3IGQMx/tMSB5dd8va342ezmPagolcFwKbY4ZPCu0lS8R7HmHQRBcANxGaIDwlIA6KFOmDDykT0pdH5jev09s1W88wcTEU29vb5hp586dNGFHNlGGt2BThNTB9evXU6ZMiUG6SAeQpBKnMDVy5EjL9n26wNmwKZr58uUjcMsC5fr166NGjcpwBHS7v/XlO/hM0klMdx9u1apVWKZp8uTJCHHMz+fD6ECiPCLcvXs3j6gB3Ba7c5OwLLnkqqiXPbBHoc+HndULPShYsKD758NaAdClS5f4j6B1TvAIZPkstfQdXLhwIVmyZCNGjAiFc+zQoUPhwoU55PbzIzBNThT3M3DMHxByq9ORkz5TE7gI6vNhyUMP8LZcuXItWrSwnw0MwgjCyefDUOCGDRvIRw9bIFCStHGFV6IpHWhvwIABCKdNm0Zsgh2//PLLzJkzOzd3adKF9xkjpJVkOcOGDYNiIW/klhkb9MIUlOZ0BOggz5Ytm5MvStOB1ARa48SJQ5THONGNR0p9GRCVvXv3+tEXUIOwaSKMfvHFF6TIgKScRJbpo3Du3DlKFByHsRagqZcQ3bp145bDvYQsv3PnzqlTp/7rr794BFRSpEhh64VlsNfXrl3jtOfOnTuXPyD86quv3I+HgYHBc0J4eMegHIIFwZHbd1oLadKkITfVp76UAAXU3njjDTIbmIlLepUqVRo2bIhQgIQAFdLcevXq+fj4VKxYsXXr1lzqiVbI7cH+C8c+XEuJRJr6DSkGBQFym2vIR5/iAvQpUVamS4X0wlb1B/IP+nIJ+OWXX0ZZGDNmDDnWpUuXMMJtwLFsje/K7KnbnV9KOEuRNGlSDkaSJEmocLmJHj06pyVRokSJLbCJdocwDubVq1cvro/9/AFhpUqVOCG2qoGBwXPDS3SHhWbmzp27ZcsW5XPkMadPnxbxiJAoly1bVqtWrcWLF5ctW5byzJkzJDowt1pl5zlBQ+jXO6no82e7zQ0IIWAq8ePH37Zt2549e8h0D1iwvrdq70cffSRNg5cWnJAOHTq8+eab7QJCyZIluWrYqgYGBs8NIfaaQSGewtNYQ+fYsWPvvPMOBPbuu+/GjRv30KFDXbt2JQukSUxG0Pnqq6/IAl999dWpU6eWK1cODkOT7qJkmQocTz81/5oMcfnyZf0ALGfOnO4KTp1Krly5KG/cuAETp0+fPkOGDJQgY8aM1PUPy0jZ4CUEZwNwBgInUVvbwOAFwj58Lw1CjF95/z0FrEW2/k0Pxyx1GPHBo1/tAVeuXOE6f+7cubp160KiQ4YM4Y7/22+/DRgwAB3h/v37O3fupDsX/ChRokhIX8tfF3h0HyVABN7qDsKffHN+04QKSfPVq1cTJUrk4+NDk63qZhYfihUrhgI3g++//x4dmtSdJuqSUMpb4em9CiY0tEZ3fKAUbKVgQHPBMmsVTJsNGzYcNGgQFWxK8qxwraw1uyBbCBAyC5ipoJUE1G2lxwMdjpb94Dngj2Yqf8DTOPOUwJSMO6MAu80gHIGNfqkQHj4mIprwNlJx3kwHRAFK5BDq0qVLM2fOPHjwYPQbNGjw+uuv0+Xrr7+G0py9jxcvnpeX18yZM/VTTKjus88+O3/+PK367SFMASoewdq1az///HOYUmaPHDkC3yOvWbNmihQpAsw/8ISU+q233sK98ePHMx0ycgU7wDViz549mFLfXbt2DR8+fPLkyeTEUnjewCsFdyqUeIJvehRsvWDA3YiGsB+eEWT8lStXDs4Pp/FEB8zdpWCC6fg3qAW0H54ElB3YIjfYDRZs0VODLrih66At8hBk0ClVMTAI6wgP/EpIOnz4cMqUKUk6CZcgatSo0aJFixEjxogRIwgKMGi/fv0Qjho1KkmSJLy96MCsJUqUuHXrFlwl7iQJfuWVVwgfy5Yty5kzJySXIUOG0aNHU9Llu+++69Onz40bNzRo8IFjsWLFwo1s2bJB9s2bNy9QoMCJEycyZcoE6dIaCHO8//77r732Gsz68ccfk+mSmrdo0aJ06dLJkyefNWuWLhxY+Ouvvzp27Ni3b9+7d+/aPV8IYB2ovVOnTkWLFq1evTr3AFzFH7s5GGBNuI6sWLGiUaNGFy5cCH4gDo5XeLJu3bomTZocPXrUFgUb2ncOZM+ePT98hB9++CGQw/BigAOsNjs7cODA+vXr85rYDZ4AljkwWOZg86LpRyR2m4FBmEWI8SvvTzDBCw/gEngRUAHiVwklIfv86KOPoN4PPviAcE/8UmucOHGGDRsG3Z49e5YmZYFdunRp3749rQiXLFmSK1eu2bNnk/8lTpyYXJb3H2sKNN7e3hA2Y2kuCKnwiJyxHAkVSvE9dXcwXI4cOYYOHUp9xowZJM2wYK1atSDIVKlSWfPwogk7GMQCbtOFEjkpLFcHkt20adMSi8eNGzdp0qT169dnzZqVmwG9NDQd5Sez5tHjYBTBfra8pSQRb9269eXLlzNmzLhx48a2bduOHTsWubtm0LBt2zb4pk6dOqwSj2JHNblDbgQO+rKSrOezLg7G2Yjt27dDgewXe8fhsduCDU2HWx2HjbuXAN+wj66F/u9kPbKtsqkV8z+EAxaKpjVr1nDqdu/e/biVfyYwKLh+/Xr//v1btWqVPXv2Tz/99N133+VmzHC2kkE4gnW+XibY8w6DIHvgEg0Idv6BHAUYi5JHIiCpqrrY/S0LUrDMuKDHU6dOwVWkko4CqRKS06dPU3eAP
pYVI9y7U1Kn4t5k9fD7/cNVqlRBiGN79uzZvHnzuXPn0MSmdDQL9VWJZTUBNRF5SZ62bNmyb98+bgDO0HKMRyoq7W4eBdcRrgiMyFiSUGc6vXv3xh95Mm/ePNL0PHny3Lx501ELMi5dusRW6tsu2Q7X+gZkM2/evO2e9P3DTwOuU0mTJvXz/cPMCx+0Wa+//jpMQ+6lpuCDgQ4ePEjSv3Tp0tWPcOXKFUZ394G6vl/C//cPPyscy3369OG6hkFnFHegc+bMmRo1aqRIkYJrnEeuFNhkas2aNcucOfOOHTtYT2C3uYEDXLZs2ebNmwfomIFBqEUY/nxYF2pNQxIHjkQXfF5jSqWDQE2Crsmq8/bySJkgQQIyV1JbHulFU+zYsfPly0cWK02AZcF+tqB8CFBXiYLG9a8soEa6nD59evLO+PHj21IL6hJIX5roS7zLnTt3hgwZ9DVPQL0Axik1hecBOIa1BfazNTQlCWu6dOmo40DJkiWzZMkif6QTHESPHh2bCRMmtJ8DAjvIVUN/JWyLggrC/bVr10iw7OdHYEnjxYtHhVIb7SmwaPomatat2CPEjBnT/wLiFbcNbjD2c1CBWR0Y7SZkZjf8FyzFN998U61aNS4ctijYYKfIXH/77behQ4dySHR4/AOvoHP9Ax62yMAgLCAM86vC9+NAK0EQwEDA2/rCByo02f0f0aFKQCu90KSiz2N5dOxQoYs01UVGBGkCKVCRUK3q7qcLEDOpi2y6W3aZe/QZJo9UcExNgCY5DHh0mXg0BdUdoaPjcTjZhv1seRUlShRCMOMiJ4AixKvXXnuNhQ2+GxiXZfs5IDCKeCJwtacB/mNK2+Q4rx0BEqKjenCAq4yCqaNHj44dO3bQoEHvvffemjVroE+tIQruq0cdynkcFz4TMKWFwgG5ITnQI8CHP/74AwJmH+22YACzGKRcuXIlnP3KK6+ULl0auYamSYNaui6gCW7fvu2+AgYGoR/BjQshCF42QGjz/9Y5cgeucGhJKG2l/1oQHB2nFKi7wxGqAhy5zOpRowBJ1CqoTqRWkyoSukNCoO7AbrC4U6WaqEtBj+4Vwer0XOBun4rlkcs3AiWPmzdvzpUr1zvvvKMoKbVgwhkuELh8CvasZcG/KR41TT/yIEMkyn1lxowZlIcPHx4+fHj16tW7dOlCIq5R/Kyep4bGjuA8quKOI0eOjBw58rPPPtPvFtjSYAAjXB24RlBp0aLFpk2bxo0bN2fOHH1XmmCrPnJJh8rAIAzBHNmQQZMmTUhTunbt6h5HwhOIiTdu3Jg+ffobb7wBQ+hvnOw2A39wKKRTp047d+5cvXr1p59+GidOnBEjRnA1gXFDcPXIkvv379++fXt9kZktDTaOHz/ONJnj7Nmz582bRx2irVix4q5duwyVGoQPmHMcMihcuHCzZs0qVKgQDljHT7bhYOXKlQsWLLh69erMmTMbNmx4zvqHB14A5MzjvAoCZArYz26wG4I9EMcAwCteXl6xY8fOnTt3jx49WL3s2bNPmzZt6dKlAQ7hkaFlBNjP/wVyLkkJEiSoXLky7vEo8qOiz3gFKT8l0GeyGzduJFv18fEhLe7du7d+H55EtmPHjlh2p1jZf9ZRDAxCHIZfQwaED4FAY4vCJhQr/cc+JNWrVx81atSSJUsKFixIMCVNsdueM1xM5blVDdyUp4I+ozjnwalkyZJlyJAhkSNH/v3339FBKGWBoT01ujv8mN23b9+MGTO6desG8fMoH6Sgn5L60X8aaEkvX75M92zZsukrSLH/+uuvp0yZkkSWQaXpQKMEvhcGBqENhl8Ngg5CP2FRP2e1Rf8FMZEAOnDgQDQ3b95sS58zGDSK9U0jwQ/HWMCOt7d3SAX34sWL582blzyP0fHBllpzZElB8L1yzGLNYvb/xIRhw4bt2rWrUaNG1apVq1KlSq1atfbv33/ixImaNWv27dvXj1dPCfcusWLFEnNjil0rVaoUpOvn+zrkFe7pp9QGBmEFIcavvGMGYR2JEiUiJirE26JH0C4TFmktUqRIPAt2W7Ah4/aDPzAiY0WNGtV+DgYI/TFjxoQDAhkukKZnAuThpIMiElFOsmTJkidPLrkzFk2xY8fGNzyUJMjAJmBEdlNM5owCcuTIkT9//oQJE8aPH58SHUZEJ0GCBCyylHHG1n46oM9wadOmZXYXL17UZCmxpq8FjRMnjrsP6EePHp3hnnUgg9AGtvWlQojxK6+KQTiAn63kFXJ9aGhBrxPCe/fuEZRLlCghneAAg5il4tBAgGcpQGHQIFP4r0d3MLrIhjr0IJ5Q07MCU3SndJaOEvnly5dJFhs2bCgdS9cFD05Q/jsT8YP27dtPtDBp0iTKn3/+OV26dPD9uHHj3nvvPRQCXJnAoYlA26lSpdqyZYu+qoLpM+Xr169DsT4+Pv4nGKB7BmELbOtLhRDjV4NwCQIl2LZt2/nz5yUhbk6ePLlAgQLlypWTJDjAOKWoyOEzqyUEwND4QCkuBHZDkMCMYJoffvhhzZo1rkW0vihq7NixVatWhW80kK0aQoDh5Bj14MxXHclHO3ToAL/+888/WEbCMv71119vv/22PnuwdA0MwjAMvxp4EoTOPXv2VKlSBUr46KOPYNYvvvhi3bp10Ea0aNFspWDgxo0bBw4c2LFjB/Vff/312LFjLz4QWxTzEE/27duHJzDfnDlzjh49Cj3YGkECNq9evdqnTx8IlYT1008/bdmyJTa7d+/OqoqBXjBcFPoIPOKhhKrrMQigo6bTrl27t956q3PnzlAsC9irV68iRYrArzRpFAODMA3DrwaeBGExY8aMw4cPb9as2a1btwiapK3/+9//kiRJ4uWJr2m8cuUK/NqtW7dJkyYlTpz44MGDLz4QQ6gATxgdeuAOkT59+iNHjsCFQaZYZsH6JEqUaMaMGV26dEmePHnKlCl79OgB9+gn3BCSRxYwyIAUmbW3tzcXpsGDBwfHGeai+ZKn9u/f/8svv5w7d+7MmTPLly//7bffxokTh6YQuU8YGHgWrp+F2NUXi5Aa18CD6Nq166JFi9avXw8HSKJtpYRpiJIiPwmp6zE4IMRjBIOySRSm4j8WkwZly5btp59+CmaYvnDhQo4cOfr27du2bVtbZE1H0IyoIFRFEqk9E2TEvQIcg8AWPQJNHTt2XLly5dq1a6NHj25LgwoN2q9fv1GjRm3bti1mzJgBjqiSJtUtvzywoXbNQoA27927V61atQQJEkydOtUWGYRNBP/AhC2E2CVRL5JBOIO9u9ZXP9q1R3utSKp6cCCbIk7R7QsG49o1C3hCKeLx0/T0oCMWNB0MOlBTkM16EJqgKvIn+F45FlwzDAVzNHje0Ea/PDAfwhh4EmIFyFXQo+CRP9aUKVl2/fnnf//BhhcDywXbB0UN5xHYSs8IWdB0ZBNY9lygbuuFHHBPLlEBcsxuCyqcOTo2Q8NMDQw8BcOvBgYGBgYGnkeI8etDAwMDA4OXCXb0f2kQYvwa0SAcgTfngfWvhFKxRS8cHCockBuRrN97shuCAeegumLDI4Tg
…)",
    "_____no_output_____"
   ]
  ],
  [
   [
    "# Linear algebra in Python focuses on matrices and vectors
import numpy as np  # import NumPy
M = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])  # a matrix
v = np.array([[1], [2], [3]])  # a column vector (a single column)
v1 = np.array([1, 2, 3])  # a row vector
print(M)
print(v)
print(v1)",
    "[[1 2 3]
 [4 5 6]
 [7 8 9]]
[[1]
 [2]
 [3]]
[1 2 3]
"
   ],
   [
    "print(M.shape)
print(v.shape)  # tells us it has 3 elements
v_single_dim = np.array([1, 2, 3])
print(v_single_dim.shape)",
    "(3, 3)
(3, 1)
(3,)
"
   ],
   [
    "print(v + v)  # sum of two vectors: each element is added to its counterpart

print(3 * v)  # multiplication by a scalar
# a scalar is a single value; it multiplies each element by 3",
    "[[2]
 [4]
 [6]]
[[3]
 [6]
 [9]]
"
   ],
   [
    "# Another way to create matrices
v1 = np.array([1, 2, 3])  # I can create arrays and stack them
v2 = np.array([4, 5, 6])
v3 = np.array([7, 8, 9])
M = np.vstack([v1, v2, v3])  # stacked like this they form a matrix
print(M)",
    "[[1 2 3]
 [4 5 6]
 [7 8 9]]
"
   ],
   [
    "M",
    "array([[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]])"
   ],
   [
    "# Indexing matrices
print(M[:2, 1:3])  # I can slice the matrix to extract elements",
    "[[2 3]
 [5 6]]
"
   ],
   [
    "v",
    "array([[1],
       [2],
       [3]])"
   ],
   [
    "# Indexing vectors
print(v[1, 0])   # row 1, column 0
print(v[1:, 0])  # from row 1 onwards
# similar to lists, but I can extract rows and columns",
    "2
[2 3]
"
   ],
   [
    "lista = [[1, 2], [3, 4], [4, 6]]",
    "_____no_output_____"
   ],
   [
    "# DIFFERENCES FROM LISTS
# arrays add element-wise, whereas adding lists concatenates another copy of the list
print(v + v)
print(lista + lista)
# ARRAYS LET ME DO MATRIX OPERATIONS",
    "[[2]
 [4]
 [6]]
[[1, 2], [3, 4], [4, 6], [1, 2], [3, 4], [4, 6]]
"
   ],
   [
    "v * 3",
    "array([[3],
       [6],
       [9]])"
   ],
   [
    "lista * 3",
    "[[1, 2], [3, 4], [4, 6], [1, 2], [3, 4], [4, 6], [1, 2], [3, 4], [4, 6]]"
   ]
  ],
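A natural follow-up to the element-wise operations above is the true matrix product, which NumPy writes as `@` (or `np.dot`) and keeps distinct from the element-wise `*`. The next cell is an illustrative sketch added for clarity, not part of the original notebook; the names `A` and `b` are assumptions introduced here.
  [
   [
    "# Illustrative sketch (not from the original notebook): element-wise product
# vs. the true matrix product; A and b are new, assumed names
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
b = np.array([[1], [2], [3]])

print(A * A)  # element-wise: multiplies entry by entry
print(A @ b)  # matrix product (rows times columns), equivalent to np.dot(A, b)
print(A.T)    # transpose swaps rows and columns",
    "[[ 1  4  9]
 [16 25 36]
 [49 64 81]]
[[14]
 [32]
 [50]]
[[1 4 7]
 [2 5 8]
 [3 6 9]]
"
   ]
  ],
  [
   [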
"![ejemplo.jpg](data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAs0AAAGFCAIAAACJzllKAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAP+lSURBVHhe7J0HgBS1/seTKbt7ew2OXqSDFAVEQJoNC89eefqsT33q02fvXZ9dnr1Q7Pq3914QEZXee+/14ODq3paZSfL/ZrKsCxxwcHcCko/nkEl+SX7JZPL7ZdpSIQTRaDQajUazJdo+Vgvaz9BoNBqNRlNTGMl/NRqNRqPRaKobfT1Do9FoNJoK0PaxWtB+hkaj0Wg0mppC3zfRaDQajUZTU+jrGRqNRqPRVIC2j9WC9jM0Go1Go9HUFPq+iUaj0Wg0mppCX8/QaDQajaYCtH2sFvT1DI1Go9FoNDWFvp6h0Wg0Go2mptgnr2cs/v5K6tP3ucXJKI1Go9FoqhWswzVVZ3f9jJSpT6dv3yuf+/7PsPwLZycDGo1Go9Fo9mJ2189Y9MXLyVAaY8e+fMOJbemVf4qvUQMsfq6vcpj22Rb8FZqg0Wg0ewdqNtVUkSrfN+nz7CJ1ZWTRd1f0UVEvn3jxPn4/Y/bCZGDf5S/QBI1Go9Hs+1Tf8xltThj21rNJT2PsR9/q1bRGo9Fo9mXUIlpTRar1OdA27Q5KhjaTfIqj73OLF39/pbqi3/e575OJYPH3z13ZN3mlX1LxEx4y72YhCFTgwqTdLkjGSLbzuGh6cX6NMlXKtr1hrJIYe0NblZiWsXKq/sFmlaRGWzSg7xa3NCqp+dal+TsgvTuqvQkajUazP5OcKjVVo1r9jO3z0cVtT3xZWcCxc/1/AMxl2xNveHls0jJK1BMeW/gFsLAy72YhCLRNGdNdR9rr9OL8GttuYeMronKqVswXMm9aA8a+XKlsFTP7YaV/ctfvji1dlO1RlSZoNBqNRrN7VKuf8X3q4dCD2rVJhnzGplu3JN9fmTSXfa74Tj3isei75H2XsTekHvBY/NzFSaci+SDIH4+B7AbppaUqVeWdMAzhzfd9Nj90MuZ6vxmVU3U7vPwy8ibzpSoYe8P/KuMbbIvsyaTuaQ/EfOEXVmNN0Gg0mv0Rf67UVJXq8zMWS1Omgn2evfUEFUpjs+FbNEymLX7uYSXc59m3hp2gnJI2J1y/+QmPzXZ48bcfbfYL3lIGs80Jw8Z8d4Uft8tsUVqq0mFjhK/S9qikqtsHLU/ma5PKtdk32GVQ2Bile5sTbv17srDZC3fsKFS5CRqNRqPR7BZV9jM2Pwcgr+armJRLkM4V3yUX1kRtF81VFn+rKx9/POGhTGfFYm07Ju3rLrKdSndCJVXdLn3+ftIf+SqdqzKkChs7d5EKbIeqNkGj0Wj2Q5Rx01SR6rxv0qfPFfKK/maHYkcs3vylrT4d26pAki1diO2K7Ra7V1olVa0su+sjVYVqboJGo9HsF/jX4DVVpcp+xubbIWDMmGGb70bsjO2uxLdceVd6wb5r7FJplVS1sqRy/YlUcxM0Go1Go6ks1foc6C6QWklvecV+uyvvdLHUYxYVkC73x2Opit28TbCrqm7NFsY9lWsb074jzatKVZug0Wg0+yHJ6/6aqrGn/Iw2JyWfYZRvOyS/4bD4+z9eB0k+03DC6clHPlMvRSz+/soKXmv9Y8V+w//80uQ3JjY/MJJi29L8Sq/sm3wxNH3dryqTm0qqun1ePpGqT2ak59r8qGzlNK80NdQEjUaj0Wh2j+Q9j10l9cpH2n2TCkiJXfFdMuYPUq9gbk2fLYQrermkTx+VNa3yikqrQKzit2JTFW5V2eaMlVR1C7abR5KerVKap4S26O+Kurcam6DRaDQaTRXZ7esZu/YEYZ+OyUAaba4fs0h+AyKtIPUk6ZgtXjM9Ydii7579wzvwv/8w5p7Nd0BSyNK2EltUgdgJw8ZsWadfZeo1XFSWXsjmexuVVLViYOn/+NSF0myLF2krqfmOSO/emmiCRqPRaDS7B4WvkQxqqhX5HVP/vgT8jMq8gqPRaDQazV+PPfV8hkaj0Wg0mr8+2s/QaDQajaYC1OMFmiqi/YwaY/ObHwe1U/9qNBqNZl8i+V6mporA10j2qEaj0Wg0Gk21ov0MjUaj0WgqQNvHakHfN9FoNBqNRlNT6OsZGo1Go9FoagrtZ2g0Go1GUwHaPlYL+r6JRqPRaDSamkJfz9BoNBqNRlNT6OsZGo1Go9FoaooqXc/Q10L2KiilyZBGo9Foqoy2cdWCvm+i0Wg0Go2mptD3TTQajUaj0dQU+r7JXwd930Sj0WiqEW3jqgV9PUOj0Wg0Gk1NoZ/P0Gg0Go1GU1Po+yZ/HfR9E41Go6lGtI2rFvR9E41Go9FoNDWFvm+i0Wg0Go2mptDXMzQajUaj0dQU+vkMjUaj0Wg0NYW+b6LRaDQajaam0PdNNBqNRqPR1BT6volGo9FoNJqaQl/P0Gg0Go1GU1Po5zM0Go1Go9HUFPq+iUaj0Wg0mppC3zfRaDQajUZTU+j7JhqNRqPRaGoKfT1Do9FoNBpNTaGfz9BoNBqNRlNT6PsmGo1Go9Foagp930Sj0Wg0Gk1Nof0MjUaj0Wg0NYV+PkOj0Wg0Gk1NoZ/P2BHoG/QORYhy9JTc4aZK4obwDGIxSjkhpkcIQgFCpaxGo9FoNBqFvm+yE6jvhmHDCPGo4RokYRBPCOp5VCSE4PinXDCPcKF9DI1Go9FotkTfN9kxm70MeUHDdzX8vSC8ChEn1DATVrlheBbNgcumL2ZoNBqNRrMl+nrGTuDwK+A/CGJ4xGY8gzsh4lDCOQ2SeCgxNRb5Zl52uRDeX97l0mg0Go1ml9HPZ+wI9A3DP1RQweFKwN+gXFBm0phdPLt47dB1Rd/Pa/hQq2ZXdCEW/tMXNDQajUaj2QJ932SH+O3zqPCI4FwEPNOIkbIZZavemrvp+1XhglYlVqT3T4eafUO2MExDuxkajUaj0WyB9jN2iN8+QQXhhJRTZ0Zs/rC5hb+UBdcbASpMr35BnU39Z/VJNOCZrkVsdKefS6PRaDQajY++b7IZdIPfE5wSLrghqHzyk/ihGI2Nj68dtmbV8IWhuBVwcxkJmMQzRHa0W0GfX/vyTGZymxj6xolGo9FoNFug/YzNoBu4vCDh+xmCMmEwSj1aOLVozVurCr8qNYttCNlchLgdNzxLBAgJiXOih77ZDUFBqEG1m6HRaDQazRbo903SMIj8BgYVlArTMLjgPw8dMeW8SaVvR6xibginlpcd4jkxgxnyq11wQuLhA7PkGyn+KynaydBoNBqNZiu0n5EEHkbyz3c2sDWo0e/Uvo17N3QD3BbhgDBdUsaJZ/GQQeRXQON2ef1O9eUlEHlJSF8W0mg0Go1ma7SfUQGU+n8WDbQMtX/poAPubmBkc4NnlgXcmFUWkI6IvH7h5DjB1jY1ielfy9COhkaj0Wg0W6HfN9kW7m+lJ+ES1xVOxrzwpFNmZC6zY0aRY4igl0WIJ0ggcnCk77c9rCYmYSa3uEH0i60ajUaj0WyB
vp6RBrwm+Yf/uSBCfgmUGMESa/agOe7KeIJEhTBtluH7VtQkRqBJwM4xHP+Tof5vrPkpGo1Go9FoNqPfN0kCZ4EJYrtUmMwzPeraBqXcY+teXLrqgRWheMOocBIBmtlQ8EhJoDAvwIPWVaT90215wDDkUxueTWz/fopGo9FoNJok+r7JH3BBDLSIyg+Muy7LcOzI2NIZl02z1gYyRK24HStpvKHv60e5m2LTH5pXvrT84EHtG1/e0KAWEYSZzCSm9jM0Go1Go0lH3zfZjO9keEaC+a6XaVBvrTfztnnB1fUNM0hFiZtR0OW+g+w+hnFaVr8PelnnEeNggxvyhgmFvP9Ah0aj0Wg0mnT0fZMkQj6PwQSlHvwNTqxCMfP2GWXvxkyeGfAS1CrMvq5J2/sPJkEWtZjFqF1iutQ180xLWLITLY/qD49rNBqNRrMl+npGCiGoazgmfAgrTte8t3rjJxszPdvmcWIGae9aLW/uILKEQa0MYRu2aeaYoZyQSS24Ftxk/i+gaDQajUaj2QL9fEYKTjgT3KYOKf21dPIVk3LX1jKFfLZzQ4vyXp/0CnS2mMEDwsTWICaVH+eSz2PIj5XLn0LBv1ayJI1Go9FoND775X0T/yVU/xaH/ClW5S5IN8PjCdsxl5jjLxgfnhQ0iZmgJJ5T3uGZFrXPq20aGURY8ksZQn6YnNMEFRYlAQ43Qz7b4V8b0rdNaobNY1R+qnXzwZMfe/c/+C5J7/jNwhI/XkWki2g0Go3mT2J/vW/CiCsfx2Dyg1v+D5TA3xKmCETM+Y/Ot6ZYVNiEW5FQae6l4bonNWGmxQxmGB56jFMTW3gY1L9poj4eqq1YjcIISUh/gRHhyS+bSNeBC8HgHPpHT+77B9H/k06jzCLlZAJEENRoNBrNHmB/vW/iKy4o7BalwoCX4FFmxeiK11ctvXN5Znm2SVhZkGX1sboM7mw2D8QDXpAKU1orixtUP9XyJ4PDxQg35Y/pypd7lNdgCN/D8/G9C/nnRxhciknfzz9SENeeoEaj0ewZ9sf7JljeukIEGeUGiRs8KOBoCJcb8R/LJ1wxKWN9Ri6rFSVOQbs1x7zR3+4SSATkb52EODE8IQ2X/EyGtlt/LvKChMeJyaiZGq8GDpzyLzCK/R/AE0Te1TKlkwHfUd7H8g+S9jM0Go1mj7E/rsxhcEwYLX85bPnWiXimsUIsvGdBxvoglT/6bnoZvNs9XYPdLBZyDYNZnEqvBGkGgX+y37lmewUGPD38Ywpm84TFHVN4hsBWBkzu2MILCBxV6TnLX94V6uFc+B/ySPslaDQajebPZv+8bwK9PZdYWPgajAtGjEJj4t1jxdskKDI87iVCtM5lWe0eb2eEsDr2KNwSIT/6CcPlyuvxIqB/Mu1Pxh9oOBLA8qKEME7hAFqGf9PLfzA3eclCehZUHh55gUMiD5j8x99qNBqN5k9mf7yeAaigcDJ8y0SNiLH8reXln0YRSbnBDVscFW1xb3MRTBBhUjgVsGVwTQy5QIZVs7STsSeQ/p08XgLd7zB7xgZ3TpGbX06KXeoo90IKSc/XgTsoY5TvodFoNJo9yd7xfEZSBXmXXf63FVBRxqXHq8f9dtGKbG6oupFveCYzBeOi5NeNM66YVndZY4/E4WnED0gc8v7Boe6WIYKuLSxiGTInFtDQwzQ5ltDyvos2YTVEajgmL0QkkfeqmHTz4Al6pdx66rtVE9dlhkOhzKDIDbn1cgINco28LN4ggzQIW7WyQ5m2CFJiUorDJ2+j7K8utUaj0exZ9pyfIauFA8HlNQPPj7A4bLmBXbmnkMphdcqJwahlQZYQzyCWYNJoyPdLd4683SEf9EzuAgQ5zJbjwhCJRWLS+RPZzGCYWJZnFdVa2e6Frrn/aBSUv4xmMRuLZ4o+Sho9uCEp/0L7GTWDfF1VOXL4B72Mnjfk5QocBJN4iOCcM2J/sog/94vHDVseHXlHi/tHRGTQmG0QZmfk2LxZON67hXlyp8ws6lErwy9eo9FoNH8qVfIzquSjJA03/qFwNiQGgz2XPoFM8GOgnUyDAfH9Cl9MpVfJygsiHOFYESsSmHHDnPi7XoAFheG6llv7mpz297QXOYZ8GBSSpn+rRPOngsPjovtd/4Cb8kaVHADYDcjvZ1iEwQFxl8bNuz9ftzTaSN4B893AlDMIp5IZhsUTvWptuOnYBo1q2Zb8UV2NRqPR7AH24PUMWa+0DdT3V2jqp0/9xyaSN0bgZ8i3GZGENSoCWN4iTi5dKa3sUxLK+MiXRDa3VMBUOdSz84etWXTv8oxolsWZMCx2tHvoa4eSJgn51U8j5JkiQBwK61Y1r0azS8C34PAVBKNwBgkT1GIkiENgCQ8OaQK7GABCrIzbz35bOK0wl1OMDjlykgeJCvghAeq0z47cfWJuuyzmGfBVzODmdI1Go9H8mewxP8P/DCeqlxcOHOLIZSuzpQdgeMIw5O10QQ3pVShxeBuuJ0ImxerV5S68DNOwK3elwb8gwinncjUsPRRUzt1o9Gdj3iVzjA2BRMAJcq+8AenxxSEZnTOZWW5jvUwDgljB5BV8aaI45/4tFIGtKlh1ndpND6cENLuB70Wo6xa+f4eDLB/0dISwo2Yg36Fz1zhjF0bmrPU2JTIdMwPDRR0SKcyZCXnO2mSW3nxsrUMamExwE6PLMCiFs6LRaDSaP5s95mfA6qNm+SimIeI0YTBiRWxejNUpobkGy2TChIm3TGZARUYihgixfEtelsiNWwHTsEKV/dmyND+DwndhJmXUWRL//V8TcyaEDW7GTCeWV9hlULfaF9a15EcyhGPFGXFCPNPw7NSPvTMGheECyQ80yFI5R1gmIJ1SpGLr2zuCeBXQ7A4CvoELd87FsZIXrQyDe2WOmF5gjFsambPM2VBulto5zDADhHsep5blyic4hEm5vOYh3Dbh0lsG1O1Q1zLgllJh8YR8GFT7GRqNRrMn2GN+BvP9DMuDn0HiRjzArOii2C93jgrkW03OaXLwZZ2dbJdTgnjKiWsk6Arj90smREWk+S2NOw3oaJgWsSpnyzf7GfIdEzgZDiVRMvPquRu+XpeZyArxQFFmUb2r63W6p4ObzQNMuhWOxQw4G/Jn0ix5McOvx/O8bb0HFYM+VN2I3RR+umaXwXFy0aOEeYRHuDl/Q2LKUnfWMm95WUYppQEjAZlykmszEuaeQxk3bN8dESbHwXOahCK3npB3SF0cC4EkC66lPDTa89NoNJo9wx70MzysWZWf4dCE5VnGRmPiRVPMUWZ5j/J+7/XmzblriICwpOFnouDllSXX53t1vXbfdsnonCU/xJT+Ysr2QOP8u/2cyosZ1DVIgs77cNG6GwvyIjYxwozE3ePKu73SK1Dfko9/YGGMkiHJhHwuFStqEkiWJEQsFvvggw8mTZrkOE6TJk3OOeecDh06IH7BggVLly41DOOYY44JBAL7uZ9R8Xjyv3KGTpEP4EgRDDu5hxS164vIu2RwC+OMrSoWYxaWj19Gl5aZ5TTo/14ad4xQiHk4mFHDDpN4HRLJa5C3ZD2PUzi
jPEzcvFD8uuNy+jWUbodN4sTA8QuhXDmG9t8DotFoNHuSPfYyhUFMExtL/gxFgNpwGmh92nJgUxe+wJSMstllpmCmfGJDeAiUiw3vrWUsQI4JZx6YJZ+0MGCPKof/JIhLEp6IY6Vc/GvxmvvyM2MB1M6os7FFaeeHu4vGjJie/zwIlsZydQzLRQ34HZa8UgEdPG/69On9+vW79tprLcs6//zzmzZtevHFF7/77ruMsUGDBp122mn33HOPci/2ZydDIr9PsgX+Pk8I4fg/UyI8RzBPcOHi0MiHMRhhHmNesSfml/D3ppdd/1Xpfz4qfG1mcGpZVpGRJW+LEM8SXogxQ5BaNuvfoOTmfvHBF+Q9dZLZLLPcwbGkpJVRdOex4T6NjaBBA6ZBzRBGFjwMC0dTOxkajUazh6jS9Yyq5N0KVRRdTkYe+Yu1PiP3knDn5w6CpZffTBC89JfiaRfMyC7NafFO0/qnNpBXGuQlg8o5SYIw+dIKhcvAFpBJF0+Oz3TCbiAosqLhwtYvHlj3rPpmkHNDyHsx26AUW7x4MTwJbB944IHbbrsNkaZpzpkz58ILL3znnXcuuugieCFwQZ599lkI7+9+BoBfh37ffMVC2nl0o+dIh8GQDp2Q14444TiOdoIYa6Nk8gpn3DI2d4NZ7JoYlCYE4Z8YZgJOpSXgjuQZolVevG8r2qNZoE0tO8ThA3rlpvnyxPLPp5K8UOK6AdndGgWy5be5fB00Go1Gsxewt/gZSRwy44ppZR+WRvLEgAlH0nqC24Q6xrybZm94LZLZPqPbz51JbRgSTuQzmJWxJ9K++W/GGjRCZl47u/SD8kxmCmqUkfJGl9dr+3AbJ5sFuE0to8KLO2gj+Pe///3WW281atRo9uzZGRkZiJF+DqWDBg2aOXPmV199FYvFPv744zPOOANZtJ8BDwIjw9h8gPxbJPjf8T9GYjFCPCnjxTj7faU5bl7h/AK6IZHlsIDhOwmu/Jk7+R3WAHNySKRhRqxXh9qHtgy1qU1zKLwOhnIc0/YLFpPWOW99terME1r0bEZzpPNhyVeENBqNRrN3sMeez0gnpYNHvOgXkZkXzmBedptnDmh0WaYwbLacTzhhgr04t+4juS1uawZLZMHMS5+gMvaEM+4ank1dY+WwtfPuXlonATfBLjNccWL00Fd7GvW4SQIOobBahrxxksymUIpt2LChZcuW8Xj80ksvHTZsWOqVE7Bq1aoDDzzQ87xgMLho0aKGDRsiUrkgSmA/BF2TwP9UmPKXUrl/LUPeupB9RqkrSFGCzd3Axy32piyJb/SC6HfOOaEmN6yEfJHYMKkbFtHmmYmuTYy+B9Y6sFEgRFiIxAiOFbfkh7hM6hnyOxqUWBFGN5bGGuWFMzg3aELWYehPf2o0Gs3ewt51PYNhlbtMTD19CpsdDAywun7Slhp2/jur5ly3JDOYc8jEg+22FsyWf8MdVVdq3SriggpaNLJk2r9nBdYGM4j8bFN5y8TB73UK9BABEja4LQzm0Yp/IA1tnDhxYu/evRG+6667HnroIQRSbgQ8jB49esycOfPII48cPny4eqN1f3YyFEJ4RH7kBG4efA2bE/m7+qWMLN4QnbiETVrG10SCERIQhgFfhFNII+gJ+U15p16m6NfQObRNTrsGoboh+JTM/40S2aXM/0KofMZTeJR78oqJEZSP39AoFUHCbGLGiXQ1cpUaGo1Go9nj7BXXM1JwwUlcLLp/SfHTJeX1Y92+6Fyrba3xA8eK36zQyaGD3m1vhAwDi1pD+Ka+EuYcjYuTxEp36gWTyLRAkImY6ZVmFB8ytHO9Mxp5AdcyTQPLYE5cSz6KutWdE3QOGDFixIABA7D75JNP3njjjemeBPyMPn36TJo0adCgQTfddJOKT31XYz8FI4rHZd9RKy7MqGEsK+LTFpePW8JWl4kSbnlWWL5y6okAjrfhOYYJVyLTTBzaKH5sm1C3JqG6QXQkXAn5uTYb7iRn/ufaLC5DDEEqTPlzu8lXVuBwOIIGhLAJZZRyP5NGo9Fo9gr2LouIdbAbcBqd0cDIpUYhK/i6ODI1Gp3sCIM3OaexEeDyI04wY3BHYIK2BOYHsfKFBlgv+eCg/M44J9xz2YwHZ8TnsEz5zACNZURaX96m9hm1DEothhIR5xED62ZptyqkcePGyoFYt26d73j8QVFR0dy5c+vVq3faaadBIBm7N7luu4d8RQSdJ3tUdqrs3C3+0LvoTfmDZxBSwv5fsvWOESoUGXPKjPdnx+/4rOimT0qHTLOnRXI30tqekU3lx9C4E+CRkBcKJ3o0Kr2rX/lH54cH/S3vxNbh+iH5lRM4ayHhZrCYxRIYpUI+EYxsxOZWgFsW8lPCpJcBbaBEWHojRowjHt6GRqPRaPYa9i4/gxmWwQLZB+XwQwO5bjj2TUn+O/k5pZleR6dWvxz/Cwx21JIuhP9jnlsjf/bVf8iQEc9j3ItzEuP5Q9cXfLMh07ENYcetROjIcMtbmtuhEA1RI0D9qxjqV1nlzZitQCw4wMcwjNWrV8OKcs49FM8YAqNGjUokErV8IJ80s/s+fjt8J0O6GtJ1870IRoQnE3zHgwnCuGAe5/hH/us6gpcJsrSMfzU38eh3G+74pHjoODJpU06ZyDGIZXKK8lCiQbw6pKxnXuTfh4ohp2Y8cUqdkzvk1g/xAPUM+BiGCPqvFVPDpmaImsHNN6MMeB8qhGMFVwMy8pghKB8epSYJyitS2xxEjUaj0exB9rLrGbBothABUueMbJOE4nPK13+4xrVijY5vbNY25acthDQs8lfVtkUQm1Fhetz2TGK7lFFOykZFZg+ZlZ3INQWR3+hqFu96f2c7R77XQJSR2vyP/NsG39yKrKysiy++GNbtp59+ikQiytIBuBqff/55OBxG+K233sLu888/P3/+/GTmfRlDGAa3hLAYsTyDeiZnpiuMODdcz+BwB+DpmYwHPAcp8P+YsAo9e+Ry9uAPxTd8UfLUGD5qbd4GlkuMgIlkQ/6AKqVOiBW1yy49r4v32OlZg07Ovexgu0XtQEgeDP+aCA6EfzBwdOXRUEG5Bb5a6jClHTH5r7/ji8jvnihBjUaj0ewl7F3PZ8iVMneoF0gUuhMPnhEuzsAqeUOL9Ue92y+jpw0nxKQZsGxYGcvfCt/K2UBDOGGGI79CDQPpeXQxn/TPiYFpIcPDStxIZEc6Du5U58y6LkxoQNq9ZMbtg85RP59WWFh4/vnn//zzz1dfffVjjz0WDAbhVXz44Yc33HDDCy+8EI1Gr7rqqqOOOmr27NkjR47s0KHDZsO4z+L/ViqH50ZMuHamEKbg6FikGEL4v9lOPGLFCSl1yOINbPTC2JTlbH3Cdu2MhOCmMHAQuLy3xS3qhkSsfiY/oinp3SG3ZR0rk5AAkb8Og3SPQ1S+i4L/4Uj6VUu/Qf5pNBqNZt9n7/IzPBgwIT+OwYSz8MKlZR8VG6bLz7L6DO7tZEct+XnHANbTWPnK3x3Z2hTBS4lTFoAF84iwC8m0G6bGPo5lOdkuZQk7kXdt7Q4PtOchkjCdkP
xcxs5fV0l1DryN4uLiBx988P333+/SpcuJJ544bdq0efPm3XTTTQMHDly3bt0JJ5wwZ86c/v37f/bZZ7m5+/z7DoLEBYGrEZCf04QnwOBbEPkMpiHgC+AAFbrW3I3u2GV82hqWX+QJEXJEwDWpRwU1hA1BOIXCaRBOHNqE9msd7NggWCskfTuL+D9ph4PsP8eJHlZ3Q+DA/PF8jHYyNBqN5q9ClfyMavdROHEoN11pdljJr2Wr31/hkcQBZ7eo17+eZ8ZNkSF/DYUy+U0G/0NdWwJt4tQLyecDPbLqjVUL71hYO1aLmczDory/2/PV7lYjyzU9WEybBPxV8y6AxsLbWLx48apVq+LxeF5eXtu2bbFFEvRduXLl7NmzO3Xq1LRpU9ve5x9FVE93+jctOBFMPqbhX3RwiLGiRPy2oGzK8sSqEqucBRPEZHbQ5dTgJCBcGyJUZBqxLs2Dh7UMdW1sNc4wMvxHNiP+91aDxKM8LgeeEfKIYflJ/v07Q942S/oa2tHQaDSavwh71/UMQRIkYcpH+izmcWpGKA3I6+ki4MAFETzTkotfBylYZ2/tZ6AljMsfwHCMyIiySf+aUGdDHZewBPyMVvGe7/fMPDjkWsIk3OD+1ZBdfGKQ+b8Lr8IImKapbqlgF1uVBBnLko8wqvh9F+63x8CGihinJYKujrBZS+LTFiQWFbJiK8sxg0S4JkkE5Nuphie/x+XVMuOd89yj2mcd1DIrL0RCRMCNwEFl1PYoCfq/z86p5fluBHwO37Pwf15NuTTYpfJZXj9x3+5AjUaj0Sj2Lj+DEYdwGHD5G2cxwyUiYYigQUL+r7kTJmz/lQNHwCwJW8hvdXET6vtX87ES5h6P2a6xVEy5cHxgkm1yyzGsWHak/QsHNjqtQSLDpYZriyBlFkG2zYYs5RNU2BWpVHgVKgAxFYltKgvCSgABhYrfY8hugTZo5uY3c+TTErLLoL28OCE/cyaofDUVcX53IB3uQvJJW5cT4VBzXRmduiIxbklk/gZRxDLiJGjYNpw+3zmQoycoEllWvE1d0atVsHuzQMtcKyg/845jx4lhyiMlH6SR2lCE/2Bz/6gLGL6YVET7GRqNRvPXokp+xh7zUVAtJ1z+hjuz5EuWlv9MqBDc9crsedfOd94vki9lWrComVn/IZ3u6uzWEZ7hBpFDBF1pAJH+h7vgFyo9CRVOeQzYmua++WsZ0rKjDfAIYLllSPaS3EqPQr7dIT0F6ZtJQWHCgRNcdggT8ktYhY6YscYZt7Bk1nqyIR72jKDsC4iaVhwFGMRiTpiVtagT7NGC9GgdaFkrmGf6Vy+AvNqU7FKNRqPR7Ofsm36GvPKhfhke3oMwYfi4cAxuxc21L6xaft/y7EQdh9DScFFev0CXwd1EUyNuCeLFeSRWvLFw4YoV8xbMX7Vq1YYNG8rKyoqKilzXzfVp2rRpz549e/XqVb9+fRjdfdXJkKhvVciLEymrr46WIeB5qN9TDXjyt2Lk1QzCHY+aRR6dmc/GLIxPWk3WuzZ6GH0gb6LIR2s5SjHhpol4k3CiV/NQvzahA+valilM+XQniuXyUU7/6c4/qtRoNBrN/s3edd+kkmA57hJmMXkXJSE/E8lt+QuhRsmo8qn//K3WulqEhAxC1h1Y0PXZzpvqb5yzdOG0GTPXL1uVKCwOBu3GbZq3adu+VatWderUadSoUeqxzYKCguXLl48ZM2batGl9+/a94oorsrOz91FXQxDO5bMRcAzkcyj+PRN5pH0nYPOe/CMeoSWMLtwQn7TUm7zCWVVqxM0wY8QwzDg1mC0FTc5swZplxbo0Ioe1DB7aJJQnn3SRrgplDuVMmLa8SKQ+d+EPqF18+kWj0Wg0f032VT9DEFfe+5dvXQqPuxaz3WXOpH9Ot6eUFVrxhXRjAV+wuO5C2p4GDgi069jh4A5d27dq37Bhg1B2BreITUxpEv0HLNT9kfR+gMMxbNiwpUuXPv/885mZmRDgXH7vQUnuE8jmyD9oLO+IEJNxkoDzQGmGIX8uRDBillFjSSGbsNQdt5yvKHIiNMhowGYk4HmmReP+TRdbuPXsWNemvEebrC4N7fph+RXO1H0R/xYKvBkuqCk/WyKvauCocPzrX9TQaDQazf5OlfyMPeejoF7mwagRA3aRMCHWk0l3TrQ+IhvE2tfEZ8EsevRZPY66/ohgy3p1wrUy5FcbDGJY2Hjy0Q75ZesdmEG0y3GcJ598Misr65prrlG/i7YPORkS9JA6OPAo4CZRkRBoPdwBkqCB1eViwrLob4vcpcW0XIQ806ZcPrKCFkIS/Uo9r47tdKybOLyN2bNlTr0gtdEBvvMgPx6efLbUkF/xkv0pQV74HzgkfgRE/ViNRqPR7N/sq9czpPmEaZSvTgijnKwYsmzFf1dll+eUm1hul5IBTs9hR3h1WdALGjaMIxbe0tbKn+MS/kezYS+3bwi5/BU2sXHjxksvvfSdd97Jzs5WFzP2qesZso+ofI6FMWEmqBkTdGOJN3dNYtySspnrzY08y6OWTQ1DfnxLeguUeBZJZNteo7xA79ZWr5YZTbKNsP8lUBNScC78ayOWfHPEkQdBOi3y5+fQKfIjoTLe/0qX37X7TldpNBqNpgbZB/wMaTGloskb/0B6GZSYLlbQwuOkdMSmiVdNarimqeGxciMRaxft9cZhtFs4IN+o5PIxRSK/gi2/5IA1OfZhfFOX/iuCMbleh7fx73//+6qrrurWrRt294yfgSYn60TI7wl5t8i/okC5bL+Qvw1CqQdDL6i8uoBuMqSbQV30GREeJQUJOnNlYvy86MICupyFGbW4dAzkOzqG/EY4t5gI8Fjrenb3FrR3K6N1XiBsSCcOrgf3b70Y8glPFOfhf05t5r8WC4fDkA4HikGd6DEZBx3USzvyEpBGo9Fo9nv2ifsmvvVU/24Gdo15CRLgbBGdfsFUMj1LiESGCBTnru/4Uufcsxti3R2Uz0HCPPr/7QpoF2wl/IynnnqqZcuWZ511FmLU3ZM/m81N929GACGvO1BD/kac/9EJKn+8lsP+mwalDJ4X4ixYejgZmxwxf737+8Lo7NXexmjAtcKevClCXEqYKRsT4E4m9eqGEoe2svu1zWxfi9a25Rsl8saJdhQ0Go1GUx3sC/dNfAWxXE/3M4gnHJowS8mM6+eUf+RkumFi8HKzrO6N9dvf047CxYAZNWGHE5SaVH73aRdQfcIYe//994uKiq699to942eo9qb5GfL6hLxdJL8cAl/IgMdg+K/eCLTVJlwkCCni5oKN8UkLnVkrogXlvJhkx81MRmgGZzbx5MuulNqcN8xwDj2AHtbSPrBZRobFkTnIPdOQfcX96yL75Gs2Go1Go9nL2Bf8DH8lL+T3GeTtDAXjMcMJbXhpw9J7l1I3wxIO4ZYxgB4y9BCnqRdiWOGHE5YRFK4hH1/cNaOp7puA9957b9OmTddff73a3QP3Tf5AHib4GYI48iMV8CvgbRjUk49gCIuzGDHnF7KxSxNTlpHVxV6CGC7JiMNbsOQNFlv+Ei78EJYXIl0bJ
Y5sFTykoVk/Q9jC8v03+UaqHAjSM/Mj5G0q7WloNBqNpqrsC/dN0vwMaRGJYcj3RhIlo5wZF87KK8gpteRjjm4T1uPjwzI6Z0CSUtcjniCBAPM/FbqL7oFqF7avv/66aZoXXXTRXvJSK+fyd0TgCaD96ApOTIeZGyL8xyXRyUvZskKjDB4FpYaHZgcc+Uimf5uFOHlmvEteok/7cKcWWQ1DJMt/+1QY1KEm3IoAXBUq38iBqCnbDi8Njd21i0AajUaj0WzLPuZn+M8mmMQl7nx39DXjzIlmjhd2TFFSZ9PBT3fKO6eeRQ3DM11beDQaEiblod32MxhjTzzxROfOnU8++WTE7InnM/yWSwwoAKXg7MAH4IaIUr46Iqat9SYsdOatpYU0LKTzBd/D4jimaLH8gfx4hsHa1xN9Wxi9WmY1yTFs+TNzcXQIl+/cyK+c+YXDZXE5sQmxDWKa8nlbv9498jyKRqPRaP5a7Ev3TeQzkJ68VeBFvFk3LIh8WBgQoQALloYK8q6rf/A9B7FMD5YSVpTJzzfINymE/NaUfNXVL6iyoE/gZFBKr7/++osvvli9b/KnX9LAcUHLsZUvjaLx0IpzusER01fGxiz1puabm7wgh/8D5wspwmRwwgxmkViQxVvUy+zekhzWzDywTiDb4BDw3y+BE4ESfUl5XwSRMdQi5KUL6Xb4nkeyr/aCyzcajUaj2efZF/wM+UIqYaYrhGd6QaPcWPrWyvn3LW1SmsWJFbeEeWyi6xvdjDrEEvjP/1lQTuWvn8C6Gg6l1q4+aoA+Aa7rXnjhhY8++mjLli3lzQqfpMRuI3uby/ZILaULwfx3RxG0hCFfJPXD3KAM7ZCuASece8QsYmTGWnfCkvLRa72SeICxMCE23AX/SgczqGESFhDRRmGvezPziA7hFrXtPIuiDkP6Fv5ruvItElQhvQnZQbJe+cOt0u+QL+UkkZcz/B1fP83eAg4ztuq7tAhUw1DcR1Afs0Grsa2ec3A7qB5WAdSidvfEVcw/FTQTPax61e/d/WVcaf5M9o37JljPu/J3R1kgYZeOLB9/6eScTTlBmFdBig8s7/76oRk9AoR6prHj73xWFnXuxePxs84669NPPw2Hw4iprpMQ/o+6QCNfFpEtM+SHOuWjFDgW1P/5VG7KfzknZjEz5hSyMUvY5OWJDRF4B0FMup6w4IjIx1A4swTNEKxJZqR9Y7Nv26xOjQJ58LU48z9GlvzdFs1fA8/zUk5GtQzFfQKciYwx0zRrutWp2QwBdb4j/Jf3M4B67B0trdHu1ezP7APXMziMPmGcGhYzvEXOxAumihkcJtRgGU5mefvBB2afkWfb8qq//O2NanEF/N+Fnz9//sMPP/z222+n5pqqTzryVQ7B4Dh5VH5py4AvwKnpyddH/Ksvrie4S2yPG6s2OKPXsnGLylaXWhGexWhAXo6A+wH/RH6ai9teomGwvHMTp0fbnC4Nw3mZhi24xR1ZMEDZRkhuNX8JcJ7Cz3AcZ+TIkX369KlTp04y4a8OGv7ZZ58df/zxGRkZyttIJlQ3MLeoK5FIjBgxon///lhdIHJf/sXmSlFaWjphwgSMKMuybNveH/wqzZ/PPjGqXJhNG4v/MjL30SViJs3mGQEWjgdL616ZV+e02mbI8aSd9r9SWX1MnTq1a9euqYmmeiY4ebnCM4QXYDTEDDgZgjLXjrsGT1Bazq1lEePzmaW3frr+5m8Sb0405xXXKWe5RECSG/LypqCmk0OifepHbzmaPPmPvJsGNDmmZVbjTCMDTgblxLQ8EnCMcIyGsCJLVqrZZ4HlUyC8aNGiyy+/fNq0aVlZWSp1P6GgoOAf//jH9OnT1T0U1RvVgipNATdu3rx5F198MbawuIhJCv2lCQQC48aNw7hauXKl6odkgkZTfZgPPPBAMri3Il/lxCrdMVa+tGrlkHVZLBzklBsWO8Y7+OlOXi0WIEH5bIOJDUxrtRnXF1988bTTTmvWrBlKVSQTqgKlcWq5AtMYMbhLuccNERHGqoj4fVHZO2PK3xzLR68Jr4xlR42A4ZlBgvWFMKkXoLFMu7x1A3pup9jlR2ef1CXYoZ5dO0ht+RIv/BADf4QanCIgf/YFyG9hJGvV7KuoqR/29aeffvrPf/5z1VVXXXbZZVh6Vs9o3Ec45JBDcnNzr7vuuszMzIMPPhgx/gCvnh5wXRdFMca+++6766+//qabboKrAeuLlf3+sLjHOqpv377RaBRDq0WLFtX5LJpGs5l94TnQhPwKZvHIoun/mhPYaFschpfEW5d3f/sws4ewqGmJDMK5J39Yw7Sqw89AnziOc9xxx2Hqqd61oyDC4fKLnJQwh9D8cmPGWnfiYmPGhkhJwhA8zIQlpN8gbxUZaCcXIdNpkcd7tjR7NTNa5QVqmeVEflHdNCixBDN4XP4su5HplylfSvHrkQ/BEsOWW82+DIYiTOCPP/54zTXXPProo2effTaO835iAhXwsRSfffbZDTfc8PDDD8MPgHWslh5Q3YstCr/11lufeeaZ008/XfVwUuKvjuoBBIYOHfr444+/9NJLJ510Enbhy/rpGk01sHf5GVIV/90H+RyDNJLy25SMcW+pN+miSdYUO9Ozy414JDfW6bl2Dc5oLGwqbPmVDIOZHIYXE0R1GFb0ybRp01544YXXXntt2xlH9dcW92ikxr7usvY0DcQfEb64lHOZKPFfHvlloTtrHdsYs1xquPLzFajJwx/a5H/xk9fPiPRpl9m3dcaBtY0cg5v+5Qr/IVJuIIf/pAaqlX6XfNJD1Sur8ANAP9i1zwP7iqF45plnnnjiic8++yzsKwbnfnU9A+1FJ6C9rus++OCDL7744quvvooOqZYnJ5SVnThxIhy4884777HHHlPne7UUvk/gu3DyblQ8HocvO3z48K+//rpLly62rZ8i11Qbe5ef4cF2Mmpywk2PUWp6hjAY2UinXj+t+KtIo0S2x8mmrJKG1zZof187asmbJdzmBmEUS38iP5qRbuV3G0w9cDJycnL++c9/butnwPmHrZffw/LfG0GN8h1V4X+jAm6C/+MgwH9sk8mvi8EPItwTVpFL52/y5Msji0sKyu24kctMC46DKT8ljkabnHpBGmuaEevW0OzXrvbBTayQRWzBKPdkNdTyf3XdfzdFHjUOb8Qz5Au8NpwzeTVjC6qpMzR7AJyVipKSklNPPXXx4sWTJk1q2rSpsrj7z2oboBPUFq0uLCzs168fLOKPP/7YqlUrdIVCSVYelIZzHAUib0FBwXHHHVdUVDR+/PhGjRqpipJy+weqK9DwJUuWHHXUUQ0bNvz222/r16+vUnevhzUKNXpT7Lc9WaXnM7bqxGrBPw7ymQMXpcM4x82Vr61c8daaWtGgwQOO6ZnHGF0f7GrkmPL30Uxpfn0PoxofzJC3bJ955pkrrriidu3ayag0qPyWtycod+X1B+G/mQpvQsDkUyLNvsHkd9EpSTBieIYV88jMAvb+PPHahMR306IzN9ibRC1BM6CxvC9ChStfZEnUC0SOPCBxfrfw+YflHN4u
s2GuETbUp799wc2PWyT/l3+Il++TyEdgpdjm6M1/mn0XdSkbPPHEEx999NG5556LFby6jLF/TlWq1cFgEKfKhx9+uGnTJvUrymA3OgTuGspBJyMvJsAvv/zy0ksvPeOMM2Tn7n/dq5qMba1atdatW4fewEg7/PDDTdNULogS0+weGKKp7X7LXvZ8BiPc8Bj1TBFMCGa5JPZzdOzl43IKckxGAiRY2ry0+0c9wgeH5T2BGruwt3Tp0ltvvfXtt98Oh8PbzjvoMfmrp3CEiCP3SECQkLq0ESCMsjh61SWBODWWl5IJK+KjFzrLSwMx1ySG6RFqeSjSiMnvlXpBUVablB7cOLN3q2DXZuG6YTODcIt4WLTCsbCQYf+b9TTA8+QoWbFiRa9evUpKSr777jvM+4FAIJm8vwL/YPHixf369SsvL8ea+4gjjkDMbjxJgFzY4kSeP39+//79Y7HY999/36dPn21P9v0KrK+mTp2KXoXD8csvvxx44IGIhJ+xn3dLFVHuLAabnPf3V6dtL3s+g8ubEjgs8qfJHSYWeRMvmRCemm0ywahRllty0NAO9U5pELdcM2DYu/iVzx2j+kGNhjfffLOoqOj666+vcGTIx8aIMLn8oKZ0Mwz52CbUlvc+DFIuxKoInbwiMXGxM2eDKBO2SUICyyc0injE8EwCj4METbNFHdqnrXV4S7tl2LW9BCVBz7ATkIPbggzwV6wAMf6CT2NhlocdRUD1tooE6Q/3QQbnZ0oA221vGKtDBpCaHlYBgEhlsFWqCqAWFVDXDNLlEVYKIBVAByWp4pVkunzNoXS77bbbnn322TZt2vz222/169dPVQ2VsFVdpGLSUR0Fm4Gt0h8B6F/tT/aljuNWKAWQpKpWQBOlQFU6EGU6jnPKKafACmL74Ycfoq4Kn6VQRy1dAZCqGpqr7XXXXffKK6906tTp119/zcnJUQIql8quWpEqSgmoIYQASB0FJaPGcCq1KqA0FA5UWEWiZJDeZHWaKAGVhK1KSgUqCerCsOnYsSMc3Msvv/yll15CpKyvOpqzU9CErcaMAv2JRqnUZJQvDLd7x4qlikK70s8U5EKSujqYjPK7EWKpmK1Sq8LGjRtxIm/atOmBBx445JBDkrE7RGmOob6tDkqxrTpKdVElFU5lRADlIBcCqS1IH11y/PmP76hd1I66VHiXSJ4/u0dV8laIIAn5ySpmMYvQAjHj5unlH0cynbBHqBdycq/J7fhABxEkcdPFTGbJZyerDbRFNQddf7UPxgQ6XaVugfDwx4XF5G0b+dMplMeYIdaV2zPWOWOXkulrrEKHM1P+trq8tSN//VS+RUKFE7ajB+SZ3ZoH+jazOtamGRRuB5JMnASoyZSeiz+tyGc15D2R6hnpexmYy6655prly5ejqzGI1UnSrl07RGILAUROmjQJ3t6yZctw/mdmZv7tb3/DEVHZU+B4qfMkfWZPP2T5+fm33347XEaEIQyx2rVr33jjjQcddNA333zz+eefIwnlIwtSc3Nze/Xq9e9//xtiUOC9997DGhfrZuRF0mOPPdaoUaPKn8xVBAoUFBRgqb1gwYIzzzzzgw8+QL2pqqEtBKZNm/biiy+uWbMm1QRo3rt373vuuQfW95FHHhkzZgy6GpF5eXlXXXXVUUcdpbJXF7BGTzzxxJIlS1I+DbTq0KEDqobNhv3+6quvEokE4rOysi644ILTTz8dygCVfTdAG7G98847Bw0ahEP5888/d+nSJdUt6aT3CbYAu6mqoSe2K1euPOaYYzDGoBsGWyoVwthCHu267LLL0MOpyHA43K1bN9jgxo0bQwDlLFq0aMiQIfPmzYMASjj22GNhVGQpVQZNgPcDRzMSiajaUT7qPfrooy+++GLVOijw8ccff/311xs2bMBu3bp1b7jhBgxjhNOtRSVBLnDOOed8+umnGPPoGRzHVLfUNDiXcdKhLTDJ6mAhBiMH4/+iiy6Ctb7//vsXLlwISSQdcMABjz76KE5JlbdC1BhAF2EofvnllxiKyKh6pnXr1qeddhpmFSUZi8WGDh06evTokpISCDRv3hzne8OGDVVqFVm3bl2PHj3Wrl37ww8/HH/88cnYHQKdMfPceuutOMUwCNVwhWJt27bF2d2kSRNMCO+//z7URiS66IwzzsCQqOSRQhYVmDJlCk7V0tLSVCRGl5oD0W8A/Y858LPPPlu1ahV2MSQwCWOE+7l3EVSw98B5QnguQ++ViOVPLBuZNXISmTzaGDsmOP73v41z1zrcFS58bp5gPOnFVxc4lhiX6Nn169efeuqp6H21m0xOA36vy7wEi0V5tJC7yx02Ynn88R+Lzn19/REv5ncfFuv8itd1GOv+ktdzKAJ+eHDZmW8VPDWmbNTaxJqYW84weBLCdQSa4nGXi5hAoz1XxBkv47zMEzEkeyL5VaK/GGj86tWrhw0blp2djRGIETxw4ECcjTht0OFIxbasrAyDe8CAAZDB1AOHIJk5DRwyuPyQj8fjyAJwyEAy2X+EHnb63HPPxRkIYJlgm1ELsqB8zFkoH/GYdzBtTZgwobCwEEkAU5K6W4FUOCVIikajqvxk0TUMKvriiy/UjRKYVaiE2pNpfgcCqARLCUsP4wc90Y2HHXYYvBO/J+QwxkyK9cdJJ500e/ZsZauqF/QSZs/XXnsNE5CclijF9L148WLEQ38cMjg3iOzTpw9mNPQntKpiB6LVKAE1olgcNRgDxCTT0sDAUJKpgaFIJvvdi913330XhaDrHn/8ccgjl0pFQIUhM3fuXDV+UCO2mOVhBVUDscXwAxio3bt3h8AzzzwDc5gqp4qgCoxSGCfl0wC4EePHj8fUBG1RLwQQwJHFQEUqjNDEiROLi4tV/G6ogYzgoYceQmnoGVgy9EB1NWfHoBa0CGMGZ+Xf//531eEYvXDacFZCDfQ2zuWmTZvipEAkVinKyu4AtAVlIi8O2VtvvaUe7gHwTTFKkR2VKiAGmSeffBKnG1zP+fPno7pkKVUG5wg8A5T8448/JqN2BvSBAjiF4U/goKMroDb8CTi1akjjbLrjjjtwjDp27Pjbb7+h3xCZzLwzkm32XRnkxTmL8qFerVq1hg8frs5TBdSADNwvHAj0/KhRo3Z7GvmTfNUK8L/1IDfyHxnCH2W2K4hrOyVjSxcMWlgrkk3kB6gCsVaxbv/tbNQ1HTMBb8/0LPkr6FXAry35lwR9jf6mFH19WM+emf5vmiSTkhr6yCzwdUiRE5i82nztt9j1b2+87/vEh4sz58XrRe36nFsBj8tfYDeFzWKtQ6Wnto0OOj3j1Qvyrj0s3Ke+2cAgGfIdFIOZJv7kb5sQHhRc/vSqCHCSyUgmArb/bgn8WFWj/7dZEXSB/yeDSqN9Cpwe9erVw0rxwgsvVEMc52GdOnXU9XacUehmzCZwCzAXYCmDZYfySLYCZ8Inn3wCR/6NN97A6ecfvS2WttjFWYTZv0GDBjivEIOKcM6gfEw6LVu2hIFRP15Tv3799u3bq9UbgExmZiYWNBB76qmnsIRVJn+r8msOaIuVBM5zKNO
pUydskwk+2IUmUAkW6DofZS/hvcHA+IMU40K+PtCvXz+sEdu1a6e+ol29qON4/vnnw96rY4dVNQ6Z6mFsMV/DS/vwww8xs6M/Va4qgpI7dOgQCoXQRZi40d5kwpbAQr/wwgvXXnvtmDFjkAXCWx07lV2NChSY3sOQVMKIbNOmzdNPPw0BlYRpXd1eQZlIBegEuFkYiieccMLll1+OyVpJVh3ohk7r378/FEB7USnOgmbNmim3MinkPxuLkQDuuusu9HNGRgZ0S6btIqpdqrEw0l9++eVuF7WrqP6EKW3RosWQIUOw7ocy6FW4yBhaUAb9DLuLw4pV/sMPP4wTWQ25HYAyUQi6Ef12zjnn3HvvvRiTqAXHCOsK9BhSASSxxUFEKrb33XcfZgZUpwrZI6B2kJeXd+aZZ2I9pk5eTHFr1qxBPHoDbcfZjVZgNdKrVy80UDWkMviNlmBQYWWC0wRjBvGoBa5q+nkKGTW60Gm333573759leTugIOxZ/AE80RCOp2u/HYVE/hjDne9RGxB9PfuoyeaE6fRGZONad/W/3rtO6tZGY96TlxEmHA8SFbByUZWrKpSf9iFy47hGPN4ucvOu/CCmVOnohIPa0h5jUPehRZunLtOwmNFLp+0Lvr8+JKL38k/anBxjyHuIUP5oUPd7kMiPYdEDx3Mug8WfV6Infnahrt+zP96SXRVTMDrRvNQCqpAeViGoJFodRy7Xnl+WWTKRm9aoVfmcCzAhetBECKFCW/exvj0Te7qiBRGbFx66NCFLS1xJm3yFpR5KEp45YzHoX6ybfsCmD7QrdhisYixjkHYuXPn1AUDAOOB7axZsxo1aoQlmopPZpadIEHnwRlXngHOhP/7v/9D5LaSKArAEqMWnDZfffUVZBCDLcCi4cgjj8Spi3Iwo6lIBcrHdNazZ0949KpMefgqWtshVVUEt8CvrVKoWpCxwjKxbsAUoOaC33//XVWRTPM1UbsqgMn3iCOOgDC64qabbsJ8hMgbb7wR7gVWh1JpH5W3GkGZqAhbNByLfiiATj7jjDPUFaNnn30WblBKeV+FquqActBpmGFhKlAdhgdm3mRaGqgIHiR6A0cWBmnp0qWqw5PJ/ghED8NpgMKYSadPn56emg5qRN6PPvoIRaFAOMdorFIDtWALpk6divn39ddfh6RqaTJz1ZD95VeBEXjqqadCAZjhkSNHpteuwjjW8D/gZar4FMmCKg1KA5MnT0bfggMPPBAL5QrLUZJIqvyYhyRON5UxWUoavr4SpEJ42rRpOHDQAR0LPxUxhYWFOFVPO+009EZSdGcNTAr5oGfWrVsHlx3diGEDDzilhkrFoB0wYADOI0xESEKMSt0lZNv8blFbxdq1a3EiYKT98MMPySi/Rmwhlsy5JUoGQACKXX311fCB0BtwOnFEEPn+++9jafTtt9+mC6u8lQdZVPlY72FsY6r59NNP0dWITIHdhx56CHXhpIPOiElm3kX22PUMIS8fECx85CMiWLnL/zi3PKPEnPvIXDFDfnjCoYmSQHnLf7RodFxjHpJPWtrCMhimM/9jXrsLKqTym9/qTz5sQYlHuBNkpcX5SwqLNrbp2IljxUApzDonwiU0QgPzIubr0+P/+TJ+07fi3enmwrJacRKWl2AIiZlm1Mxg1A7Tsl5Ni244jg46O++u/g3+1izQNERgSGECKPVv7csPhEN1h1AHGTm1xszb+PAnax78dPWaEodhIUiZK9/pJROXl93/+Zp7Pyv4vzFrE/LzY/IaCeWJKBMvD1/5+KfLRk1fy9Bl8tmOPel67wYY0xh56A24zzixsQsbD39CnSoQQBL4+OOP27dvj6kBYZVR4Y9beUojF04SxOAEgKlAYCtJ7CpjM3DgQPgiiPnkk0+wRSRAKpJgIFEaTDVWbyqL2qJkKICMaimpIlVgW1ACzvzevXvDOagMcF/ee+89lVGVsBWYTGEdEYCeagGdXjXCiEcA2dFpmIsfe+wxrH6wC2uHFfzPP//8zjvvPProo+qiqELlrUZQpupGTIKYjNTl/e+//x6z1Zw5c5566imsgdBYaKXEQDLn7oISUBSWXOpobtiwAWtclZQOalSXMSAPgRUrVmCEpNeOMHp4+fLlCGMMqPs+kFepKVIxJ5xwQqtWrdDVH3zwQX5+PoQBdpGEwNtvv422H3PMMdBNVapyVRG/EjlEMQIvueQSdDJ0fvXVVzH1p6pAYOPGjZ9//vnpp5/eoEGDVC6Fkqk8KhcWx+hhNKTYJ5m2JUhF89GrJ554ohrSleHYY4/FabVtPwNVNVADG2f9rbfeiiajsffffz8O9PPPP6+eB8JoT4rurIFJIV8M27p1615wwQWoHaXhOCqZFPDIJ02a9M9//lNd50jG7iKqaWpgpEjtpheLZqr4HfcGUm3bvuOOO9RFJnjtGGwrV6688847r7rqquOOOy5d2M+6ayAXyr/44osxxrA+Ub6yilcFYmKEn4eKlNtXobaVAjn3CK7gLmfC5Q5nMe4JeE7ccWNs+UPLfsz6ebw5cQaZ80vw999O+JWtYk7MK/W8OGfcZSKBtX2VFg3+xQvH/3P9P9+1RKGJktfeGDzo2Sd9r1vAryv13Lll7P9ml/7zs6I+L5cfOtTr9iLvMYR3GcK6DOXySsZgp+/g8mOHFlz+Sclr0xPzIsiC5bK8+IH2uNtcdcG+I1BVlPPyKBf4Z8SqeJ8hkS5DEh+M21CG9nGnnLMSJu75oaDrMPfgV7yz39mwNCpbzhnWqYlpG90jhxX0fqnglxVRdEOccXmbMVn8voHs7c188cUXGOg4684++2wMceUyQ6CkpATLccyqKeFk5s2LBrBgwYLmzZtjIsYMotYKKj4p50uqSJwtffr0gSQc81WrVqVqwRbGGGcBTiH4NGqppNQYMWJEvXr1YOwRA5IlVgRSUQ4W0JgW1Tm1U1Dd//73v5QayYLSmDFjhprv4GQsXrx4e2KIVIVgpQhXQy16evXqBQfupptuUiuzpGhNAjXQafCcMFuhk9GT3bt3xwFFf6rOTMpVGdXVmBDbtGmjuhG+YDItDYjBoYT3gHHVo0cPLGS3UgOFwKioYxEOhzFxV9jDqjqA7M899xxKQ+tuvPHG1IEDBQUFmIJvuOEGVYWKT+avJlBgJBKBb4r2wgOYPHkyKkoB2xAMBqdNmwYFUHsyz26BilACegMrftSFowl3rcLmoF5I4kBU/neDUSCGNIYE8iZL2Q6qJyF58skno8/BKaecgjMX/qu6fJKUqzRoAgpERjQNBwsFtmjRAg5oKhVJ8Ilr164NFwSSYPd6Uh19kB7AhKNccHWfTqGqUGLJzBUBMdXkb7/9Vq03DjjgAKzN/va3v2GG3HHenZJSA86feroTB0hdMFOKYYvzCKPr119/VQcFMcnMu0gVPJSqweT7ofJznswQCNtcftuy/JfotPMmk3LDZnYt1nBj+/Vd3+mQcXDQkRO4ZQtDvrZB5EufxKDG7q7j4UYmMO6xlJGf0fRLlFdTSDTuXnLpZffcfVf7Du0LomLGWuf3JdHZ6+1INEHNoA
dlqXzHxJM1C9NL1DKjrWuxXi0Ch7QItaprh2nClK/k+u+5WmbCFAahwS1/ERe1uESYwjVwlGnAI3QdJ1e+Ubaa5Zxad+M9Z9QxDPkJ0fUx4/pPylcmsuFW5dGiB0/P6lPfNgiPUPv9KcUvTs1snOW9fmagscUdw0Iv/ElvQVQTasip7aZNm44++uj58+djOT527Fic/5gFkPT111/fcsstmOZg7P1McgWgAjgB0Fw5dinFKuSbb76BD3HooYcqM49IkJJUMQhgeY1FAOzECy+88K9//QvZUSCWa8gLe7x69Wq4OzipsD6DPM4o2GlMEJjaIJaqukJQOEqDXwLhVNU7BvInnXQSZk+EKyx8/PjxUAyBWrVqoXPgSCFcoaRqI0BbBg4c+Msvv0AHuGgoISsrC1kqqVJVQHOgBnydK6+88t1330UMJtbffvtN/QYhktC3SrKKqMaqBi5ZsgThN99886KLLvIT/wD64AhOnDgRXTdgwICGDRsiC0h1IMrBfHr88cdDEn27fPny1CpZCSiQClRg48aNmN8XLVrUsmXLUaNGqeUdyvnoo4/+/e9/w81VfoAqQW2rC9SO5rz//vuXXnopwtdcc80zzzyjkmDpcdDhRv/888+oFMO7KlWjcLQoPz+/d+/eOCNgdb7//vujjjpq2zKVJAI333yzuqa4U5AF5Tz77LOpC4TbQxkzBHD4YP9g+yGPozx06FAcQbSxwhNhx6ADVS74E08//TSUxzzwn//8R1WEKjALoaWDBw9GEiIhvBu1wCf4/PPP0XXpeXF0cLxwel5xxRVwwVWk6gGcF+ecc446uysEmiig/2233fb888+jZJzXmBg7duyIePRGUnTXUSUDhOEDnXbaaQice+658Fz91sv3fS655JJZs2ZhZsYJglSorTTfZZJV7RY4JFXA9TA1SaeJuwkHq/LYrNjIHj9NJBOnGBMnGpN/zR2f/+Y6LwEfKu7xmLyF4clLD9hEuef4FyF2G2T3mMNZXF5acLGF38gWzF147KmXfzOn5IFv1p77xtreL5UeOoT1fon1Guz0wN/QRJchsa4vx3oOLjjv7TXPjN44ek10fYJFZSnM5czjDrYOdESTvAT3yuSVmi2Bl4gmQwY1Ip/rxEo9dve3mw59yT15aMHGKCtHf3jlo1cm+g4u6/dyvPtQp+eQsqGTIzEnwZ3CIs4u+3B9t1fdm0eURjyXu2XSw5SXZPZV4CY/9NBDalhjRY7WIAYW68wzz7z22muTQlUGxc6cORNrL5wkZ5xxhrpPjIrefvvtVq1aYQbH3IdzASeVil+7dm3r1q3hdsje3RmQUSBjMmpnQBgjDiT3twEntjo94X7Bwqnyk2nbAaVhja4W8Q0aNMAcXXl9qohqDqpbunRp8+bNoTbWXrB8SoGdal55/G6QdOrUSfUPbE8ybVdACfAMkF31lRoPybTtAIH7778f8pjZH3/8cbQXMTDz8Bd79eqFQFKuZkB1hYWFHTp0wACG66neCYdVmzp1KnaHDBkCZUBSendRhcDPgLuPiuC4f/DBB8m0LVF1QQEVqAx+2ZLk/vZJyaCNsKxYT0OZI488Ur1VUZkSdgCGKHoMxxGdGYlEUCD4+OOPMQPAM04K7S4YBn379oW2QI3PHQAZnN2LFy9OZt4Ofp/JJuO4wE2B5vD/MDVBbfS/kqk6WGthqYbC4U/AsVDdsmDBAvhATzzxBMJJud1ll122dFSH7ibITeRPn6EYk5giX8x+aA6dTQ3CDB6KGbE6N4TzBuYSyqgImMSSP+pBBfRlWJdgfvB/JGw3IcLmMZPFsa7xhFlO7XUunbwu9vhrX6/PPnTQSPHV2vrznNrMsEPCgyuZEGbCMAn1WmYUndM28vSpuS+d0/CKPnldGmVkWfKnzAKEm0xQLn/zxKPMMRiDvMggIvlI8x8QaqEZ6Dv1TIphhInXrVkwQ8Q2sOCsNaXoE0GDk5bGPBLo1jzRIo+5NDRrrce4gJ+/dmNieUkg1yk6pqVlI4bK6+R+T1btWOwh0A0Y2WeffTY8AIxFTGrqfjBWjePGjVNXHZKiVQan6GGHHYbCMZuoRx8wKbzyyitXX331CSeccNBBByFm+PDhK1asgPAvv/yCpQamNuVPqxK2BzJCBrPA9OnTMe9XhmnTpq1btw650GpkTxaUhqoUSRAo97/hoeJ3AGTC4bC6clBQUHDdddephWYyuYZBRUA9nQ77hBn8hhtuwNHEFFm9OqiKgAqrxfGugoxqWYzuhTOkehgkkysCR+TCCy9ULy699tprONyQnzt3LlaWV155pbpjVUOgItSOVaz6uEtJSQmsL9RAEz766CN4lurCmBJL5tktVCFA7aIutdRWu+mgdtQFBWCTkmN6Z2DMz5gxQ2VMlrJ9UCnEAMwe1ADo5+eeew7ZK9Sn8jRu3Picc85ByTDwsNYIoMx3330XVlZ9i6UqQLeWLVvCCcZ8cvBm2rZtC/2ResABB6TiO3fujDB8HZW0PaAeygQIYIyp3oDVv+OOOzBT7TjvLoHT9pprrkEAk8bTTz+NGhH+6quvMJ+kvsdfFZKH888Dw8z/k9cnGBP+uyZujC0YNOeX7JHTyIyJdPx4Y9LEYye5+THXjbqu58irAMiCrePKL05I78CP8wva/O+WfzI2tacqTor6qR7zYh4rcPnva+NPjik+/8Oio55bWqv31b3um3zoUNbhVdbptXjXlxOHDHF6Douc9HrBfSOKv1/irIlxx5NvhMSxeJWOtYNiuOsw9Zi5/B8OgfT9oKQjpJrJylMgAnWj7f6HM6KQdqLTirxjXynpOjT2+C8bo1yUcnH+m/mHDUl8OCvy9JiCnkPKjn+5cEU5i7req9PLOg9NnP/GmhXyQoa8pOOKKr16s2eRR9JfB//973/HyMZYx8jG7sMPP9y/f3/lrSdFqwbKQbGYTXBmApxIqPe7775r2rTpypUrEVbXVHA+vPjii6j39NNPh5lEAElqMZEsqCKQCplBgwahBH9OqBRPPvkkCgcVFo7pG70BsezsbBgzlA+SadtB3YFCTx5xxBEwADDAmJdRfjK5JlE9UFZWdvHFF+PAYR7H9AQdbrnlFkxb1aiDqgi0adNGTl6EwNAm03YFlDN58mT/OEjnDFN2hUchHbQCQwhzMbKgaS+99BJ277//fgwhuIw7zV4VULgcKJ6HsQozhmHWvHlz+Mpw41q1anX99der6woQqKIayI5y1qxZA4uI7sUIHDVqVDJtSyCG5uPg7tLHrOAqVeapAqUGtnBNWrdufeedd6pHXLH6h7eBpKTcboFeGj9+vHryt0ePHoWFhVi1YwzAdUOLkkK7C3RDn2ABAxKbgUODQYIx89lnn6lUbKPRqArvuDdQIIDOKOfaa6/t3bv3BRdcgKIwBi655BL4x0m5KoMqNmzYANcH3YLexvwD3bp27YrqEIAOSbndpUrXM5JlVAq/N9O7lBsCa3/mEI/FRpSt+N+6YDRXfi2DkMSBhe2faB+vZ5sEf4jAhAK/DlvTFIa8H2XAJUZZ8mIDjoVf2paF+/tIS0CCqUT86
yIGQynBxNwi+vbUsju+Kr3z2/gHs+3FhYHioijsTG7degYjNqcZTNTh0f6NnXv6iefOrn1r/5zjWpqNgsyi8vfnTensY57D/xaUUq+SwAOEZvKVEiiKPyF/WDWpTgpEyCdL5LMlKMamhJqB5mGjRW3LFM7cdSTqivmbeH7EqmUnDmoaPLRpOIMlCll4YYEoE3TqMvkl2o4NrfoZ8vUcvyewTZa9zwH90RygfhcX5/n//d//YfbECXneeefhdNrFMbYjUNQxxxyj1mdYx+AURV0DBgzASQUFzjrrLFh0JKFqzOaTJk2CF48FhHI+KqMG5guIVR4lj4xquxX169dXj31BDPORn6NiHdBpmAUwDT311FOYKeBbPPbYY7Vr10b8E088gXUkZpCtpgkVg5KxVSYqfbJT8UCJKZJpW6KSVF5s0Z+//vorFHjwwQebNGmCmFdffVU9N5AumQLlIwaggZFIRMkAFZkU2gb0AxqLvCoMO6fidwlkhIbqZhl2K2Ng0Apsr7zyStgkKDl48GA4dh9++CGGEHo7XWHoBqAkIiG57TQtG+nHYKtkdtBeACUxDgHW4sojX7t2LRys33//ff369SeffLK6mgIB1RwFdPDrYehbjPbULthxdRBAh6CozMxMjMNk7JagOlWjkqwMkIcOUBXhZCnbAeqB0tLSm2++uVevXnfddRf8OcwGRUVFN910E9xZpG7VCr9ZEoTV48/pqVsB84mjhgCs6W+//YaDiCN40kknQUMloFCFQGdscTR3XKYCJahLegCutiLD/6IJsqM/VSq2GHsqvOPeQCoUwPbzzz/H1PTMM8/gpFbvnrz//vtffPGF0mor3bAL1MyQOrmwq5qTFNoSFIhOUB80wqB677334I3BAzvzzDO3nYf94pP1YnhjdGEaATvonz+uzOwGVclLGMECgQfiYok14byJGVPhZMBqemW1yzo8177eGQ1EyDAxFUuXglH1w6XpbK5ZSCug/mWExAUxGQnBKZFmnstXZaVply6KEeVibYxOWJEYs8hZtJHHmO1SSJoGh09QunrKzwGbNj24X0Ygq3ke69dG9GsZbpBhWPAh5EkiTPkzrahHeWb+DZ/qAH3IKX1prPPBXC+bJx46LWvK6uh7k6yujb3/npJbEuc3fVC82ql9QefIEe0D938WKWLhu47hA1pmyJMCTg0KgNtTXdr86cgx6M+JPXv2nD17Ns492Mv//e9/o0ePVo/vJeWqDGrBFg4NzCFO8mHDht1+++1ff/21+ro8ThjM18OHD8ekcO65586fPx8GUtmhnaJKXrJkybJly1TMTkGWFi1apF6a2LaZmCuPPvpouDvQ4dtvvz388MN9Q6PG3hbg3MYWM851112HGRPdiHnhySefvOeee1ALMn7zzTfoVVSRyq6mA8zjsFWYreANpH+YCEkgpRLC2FZYdbrklClTMCUNGjQIhhCRH3zwwb/+9S9MbZgTv//+e/W8PcTSW4rUgoKCO+64A92O8g888MD//ve/ffv2VWWm9NkKKJ+fn9+lSxesRGvVqjVz5kx4DMm0SoMq0MNYzuJAY66HsenevXuFbUyBegEUQ7swhBBAS7/66iv5WT//ftxW3YsRheU4VskdO3bEijw9VXUCAoiEpIrfce0p4ARjUQuHsm3btjAMMNs//vhjhQMV8/68efPQpXBHYO3Q2IceeggLViSh0u11L5LmzJmDUQTL2qxZM/ipqCWZlgaUxxbCyo9UkTsGwjjNjzvuOFS9067G2Hj44Ydxev7www/16tVDW9DbGEjIC+cDjixKQL2pqlE4cq1YseL1119fuHDhO++8o3yvlEAK6IC8v/zyy+mnnw4bfOyxxyJXv379MCFAGOWkdEMPIIxh9vLLL48aNQrnEfp52wJ3yrpd/+54OlB40aJFp5122jXXXPOf//wHzUSf41xDnzRq1AjeALYQS+9S1Rs4TbDkwBCFZ4bjDncNUwpGQoWHXvkN6mFn9QWwVq1aoe3qCVC/I5MNh5gKQCscCCiDvDgF7r333m7duqluVwJbAJ1qHJx38qaF/BAXk9cV/D/mOl7cLUhM+8fkX4PjJxiTJ5OJo0Nj5t4xzytnnhsRceZ6whHwwuLJclLgTGaOkPcueEIIJLtwrbjjyXsWDP4nFmNcfhDLcbxEuceWFbNv5kXv+rbw+JeLeg6JdBsa7THEO2Qo6zyMHTQsdugrZT2HlPR9fsPF7y4ZOrFwen6iHAtEFo0Kt0y+S+q//yqPg/qUWLI5VUd2iH900RdjVsZ6DS3pPbT8xWmRSz5e13NI9M1pcbjQhR6/9btN3V90L/5g45BZZb1fipz+yvrlUcE8R/75bwdDrepQZw+AtqtOwLTy9NNPqwGKee3666/H2aVSk6JVQxWFimAV1GmWl5eHUxfGBhUpBWBx1bkKq4x1AyKTmXdGqgTl0VcGZFEgI3aTBaWB0q644grogz558803IQbhZFoaqijMC7A6mFOwtsAuIuFDwGCr7I8//jgUS28OZKZPn445ok6dOnXr1oWtTU9FduwiCyYdzOzfffedeswimZyG0h+qYgEE44d5EBO3ikRAfbkLXHnllWpNr3RLZvbf4MC0279/fyxS//a3v2EGxOr5119/VQqkS6ZQhcB8qos9qLSkRP4gxa6CctBArN5wuDEePv744+3VmAICaClqhweMmRe5oAAmZUziSAJJOb9bsMIbOnSo8g7hSKmBoVJVOQBmDJM4FqlYNSKLSt0pkFRfc0fJYMiQIYipUHOYT7gLsOs4m2BH4U7BtZ0xYwbkt1I4her5kSNHomT0DE4Q7FZYeKoQVXtlUFmwBclStgM6B34z3EfYMDV0sYXThhiYMTiXI0aMgAzKTGbw9YG/hVUERhFGBcZbep+nA0kkQeCoo45SfYgsGHWITC8TAezCRbvhhhvC4XDz5s0xpCsscKfsxnfHU6BGjPATTjhh4MCBkUgEKkF/jByMAWiOQXjeeeelJrFkHv+z5dAWh69Xr17XXnstAlk+8Dm21y2IV0cHPrEaXdg+8sgjquT0LCoGni5mGDguWN7gFIYHhpN33Lhx6WqkUyU/A9VXCugJK41mcNfhCZc7jHsuL3fKy1c8tGpiaPxoY/Jv9m/jzd/HHz/Jy2eYq1wGXyMR850F+czD1iCmjPFil8cc37GQjyr4j0m4zOFejLmxKPOWJfgXy5x7R5Sd+XrBkYPLe7zoHTLEg2/R+WXvsCHR7kMSXYd4h71Uft47Jc/8VjhhXbQw4ZW7HhwLWY58pUW+45FgcF+ghK+D3xAVrBbUIcT/BeX8tP8r6zbUOeXd6FGvlRz+StmU9S534/B4Xpta2vf5+FEvR878sLDHEOf+7zeWorHSF0K/yBdcfB9rXyXZA4wtXboUIxWDG+cD5l+cUUDJVBeoBR66WtLBmcAUnzqLEEASvHgk5ebmwnKr+EqSaoXarSQ7kEeS8nvQIXfffXeFvQEZqA2XAiuzAQMGFBUVQQaRKh4rNsyPyI7l4JgxYxCTygUwDWGqxSSFVKz/EKNSAcIQhvlEsTBO4Kyz
ztq0aVMqVQWA0go2FcusLl26YN2m8gLEo1i4jFAAVvnzzz9X8SCZmfNPP/30gQceUJf0oc8999yD9p566qmYRtPF0kGxSPryyy9hG1Aylrbbk9wxyIWi1O+kAPX+SDJtO6BpyIIAFIZXpGb55557DvEKJQYQhiS6F3YFw+y2225LT0VF2MX23XffVS8+tGzZEqY9mbwzkBEWVy2s4SauWbNme5q/8MILgwcPVsYJYxveHuqC1cGuVLeisad0e/XVV1E4hOFtq5hkchp+AZLkfuVQqlaYS5UGEIbjC7/5vvvuw0hADHIBqA2zBz8D3Q7/CcZblaaAGMwtVvAdOnSAn6GsZjJtS1SB2H7yySdoI+jevTsqQvnpBQLs4iyAmUeBOEbKV06m7Qo4RvAzUBG89mTUzlCaKFVx+rf2fzZIaahYtmwZXB+UiZMLHq3SLQUEMKJwdqg3dNAzGKgYM3A7UIiqYitSowIlYwbGAMDsod6FQaSSUaBAdNc777wzaNAgjC7sopfUR9XOP//8pNA27Oj6VXXiXx2EVyNfjhCUMsOMBstHO7NemB+IB/ybIka0XezgRzuJeiLIDUOEEoZDSNRCRmTaGsrlj4RkGsQyCTMEJB1ieEQ4HiEFLDA2n770W/nNbxc8813i68XBJU5eKQ150kkXFmEBgT+vWUb56e29R061nx6YfU2fWj3qBLJt+TyIReV3L0xmWJ5hM2HLexP+HRMFQhVdFto9cDjlFrYtJDrWo9BvVSQQccItarvNa8v+sgnr2CRUy4yWsYxlRWFDuD1bh0OEE8NgyCcvurLkhdd9k2QPUIpTUd0ixWmv7mUAJVNdoED4EGeffTbOz86dO5944okIqFqwzc7OhgKIOfLII9XLmZUnVYjarSQ7kIcTj+VC06ZNEcB6F9t0YeXi4/zHGQ5T/e2338LyZWZmQnkkASR169bt4IMPRhhzzdVXXw3PCTMC4hGDEuA9wFTDD0A5qswUEEBdmBy//vprzD4Ai8s333wT8VsJY8bHjPPSSy/BYB9zzDF5/gdJIYPs0KRZs2ZYRkMMkyAWhVOmTJHTzearu5CEwb7mmmugCeShDEwguh3TnFJAiW2Fyr5o0SI0BO2FUwJhlbSroIojjjhCPZ0ze/bsZOz2gRjaiwBmdvhVULtu3boYS4hXKDGAMCQhoC66JGM3gyagHwDMAw4fYtDk9I+qKbHtgQI7deoEFxDhM844A/rvIMtFF10E64IaMbbh7kBzWA7VigpB4ShtxowZCCj/EgGQTE5DxYPkfuVQh297udAnqB0LZSzWCwoKcHqm7n0AJOFwo0sRmDp1KnoMvikGJ0Y1MiI7XDqYRrQR3agKrBAUBTWwxYjFCYKY1GPLSj0FaoEMCkQHYquq2D2QVzZg+7cCtwVZAE6u//u//4OLgKkASyCloVISs+Wxxx6LMnFyqVtj6Ao1fgAEsGzDEVcjEE24+OKL27Vrh0iIKYGtQOG+jhSn7QUXXIAYTCnwrhBApC+SBLVAByxvMKugZxDGmXjjjTfi9Ef5UDsptyV/9OxuoDTbNTDYuEkd6i3m02+ZlVUadAiDnxGrW37g/Z0CB5tcvsiKhkGxYABGH16G/AL4FmAfPUOkS2ASAX/EKjHsFdyest58a3T5nR8X3f115KMF5iqRGTdZJnfhVaBYYrjZvLSdsfb0ZiV3HBd45h+1bz08eFRDq758ZZZ4pmFyZglXUOFS6vmays+Fo5myPv9P/s/8P7lTXcgKDNG9CQ+SqP/WrjikMQnLiuHkJFrUMhtlJQShnkFrBco7Ng2alAv4Gcijcv8lwEDHtIiBe8kllyjbgxGMU0KlVhcoE+cPTokrrrhCrYkBjr6qCDMOzhko4I/uPdmxqL1hw4aY6BGYNWsWzupkgg+0xfm8cuVKTB+vv/46Og0rV6xfMTGpaQheBSZibGFgMMvAkqGo+++/H+s8ZN9p01A+5mv1uRHVSz/7H8PYKuOmTZtgEh5//HEcu88+++yhhx6C4VRHDcvN22+/ffLkyehPgPX0eeedB5VgP1ReFIXexoIeYchjF63AccHMiLBCSaaDSDR8/PjxyAJH6tBDD00m7BZwa+BZosA5c+bAzCdjdwZ0gCvcqFGjU045JfUFsMqD7OgioJ7OQUNgTWH+582bh8gKW52Oyo6qcVxwvqhIFKIC6eDQwPRCHg3ELnxK9C1QvQ2UWDpKeMKECZBBz6CNSjiZXPOMGDHitNNOUz9pBDP5/fffp7TFCEQMBqEaUUhCJ7z99tsqNZl/V8CpcfLJJ2OQn3nmmcmoGgC6oQmY0HAawkInYysBvCicL2gysqNbrrvuOpxcKikSidx6660jR45EmegKTA7/+te/EAP/G6mqxzCJqQdrEIMtTi44EFlZWemuW4VAGB0LhVFCMioNpCqP5PLLL8cUkYrEiYyTV10IqZjUgaxBUIO8lMHxR7lJ4FmU8nH3TDXe4BnMNEg4HijMvimnwx0H05D8JVOcNPK8waKdC2ImhDT2wZSWFEXh2MmnPrGlDjeWFPJxyyKTVrhLC+1yHvDMgCO/TiEvDxiCGcSwaSLLjB7cmPRtFTqkcahhhmnDRhP55ogsw3ckpI5waeSvp8QFtwgNYFAIymwUJJ/9hLODyqW597WAayT3qwVU7RCxplRc/d7aIrOu7biPnx06rC5abVEjUi7Cg0eUfbUk4BmJw5u7dx9ft7ZwGDFdw5Tf7RCuR+F5+Ed7XwYTHBa4jzzyCM6uevXqqclRkZSoJmCMYRrhjKe+NApQHbY4Y59//nlM0OqCP84olfrno6akuXPnYimDuRUTTa9evZQdAtAWpy1MIxYQ6vxFDGacFi1aqHmkpKREvV6PeBSlGoIt1jSpiQYxmLw++OCDcePGqZ9BkUVvLhz88MMPY8eO7dKly1tvvYWpTT1viPiUGpgK1aUFhBGJ8lEObBtksMxS955QnV+YzIWSoQCmJ8gjRrURAV8dqXPv3r3hjtx9992qQETKatJACevXr+/Xrx9a9+qrr55//vmQQRXJ5EqDclQANvW4445DCaNGjVKfT1DxOwAKjxkzBmYYrlX//v13kAWFqxvYjz32WEpMda9qCHxEdCk65M477/zuu+/U5Z9U91aIyn7NNddgTT98+HAcdNVR26qR3r3YwvPD8v3pp5+Gb4rdCjsN8ejYTp06IfDll19C+e0VXu2gRpCfn49uQXVoJqquW7eu+gI6BNasWVNUVKTEUvrAsKlLj2qEYwI56qij0LTRo0cjZsdqY/QOGDCgVatWw4YNUw5ZOqhFBeDxnHjiiXAEFyxYoHxuFV9JUE7qPG3atCmMcTJhh+DYAdSILXZVh+C4wPxjFzMYPGM0U/WSn0NmgScBP1JpiG0qgC0ae8QRR/To0QOrEWRJ5doWlHnvvfd+/vnnOCMwQ6rsKdAKNQL94uWyTUXi0GB03XDDDTfffHPFhUNot0GVlYJx7sr/YrzM9eJ8E1/y+PIfskaPMSZPJxPH2WPHHfO7l+8l/A9TcNeFuHz0MplT/h6p/CoFIj3Xk/ee5Nc3yxP
uilL33WlFV31a2n9YWc/B0R5DYj2HOt2HeZ2HsC4v865DWfch0SNfjlz18fp3ZhbPK2Gl8sMV/nc7/2CLO0+bd1Az/uTeFslJEFdRdNVAA0sc/smUVa+NW/XWhLX5Ce648vmLBNrP+ex1Je+OWfbWmKXj1pT53x1M1636ldHsceBbYKBjCyuFGROLFUwuKhJgTknK7RZyDvOBucJ6bqvnM1IgEjWCgQMHwvyre94VSlYdNAq2AW6KUkY9rbYtUObrr7/GbJt6ABMZk2m7BQq87LLLMDM++OCDqijEpJfpt1gClVQSAldccYX6heGk0HYYP348fAgsSZE9GbUliEeBcBfq16+PhiO8lSRqlHX7YBdbyGzYsAFH7cknn1QyOwW5UA6cRThSBQUFqhZsk8l+58sK/DYqOwRfRD03g8ik0N6N0h/rhG7duvXs2VON1WRa2kEEaJTq1cmTJ2OV/5X/A85JuTSUMMB5d+yxx8Kb2e3nM/YgaBr0h9rwSrEImTlzJtqeTNuMOspqbAOcVo0bN77rrrt22lgIQF5l/Pbbb7GEgI+YTNuGHfnOO0U5NTsFgsIQXP7yR8hkRsmEsmWDFgcdbsmLEjZv5XR6uKuoJS8uuDYRpmkI1+ZRi8fgXTkknCABQbjBXUPQuDAXRcRnixK3DY9f/H5kyDg+PZ9HmdQE/ieD6yNEgCZqi5KetTdd0zU69MzgQ2fUO61TbpssmimYCSHqP22hNJP3Rf5g8w4WfPiTe1skJ0FcRdFVAepwN0T4gK5NzujZ9PQejWpZ8gOoJkV/cHjabevnnNa7xWmHtezYMMuSqqfrVt3KaPYCMNcDnGJY6cKkffDBB+vWrUMMnPukRNWo/MmLlcrEiRP/8Y9/qPVTdSmQDorFnPW///0Pi6GWLVuiCrU8TSanAbFXXnkFq9hBgwbBhCNjMmF3QUUPPPBAmzZt3nzzTSym5YS4eQUP5Bznk9z3LwysWrUKvs7ll18e8t8mTSbsFqpwzNFHH300zADC6d0LZVQ/oNXYYlcJvP/++1h/V/5qP3KVlJS8/vrr6sMqiEEhqWYiFVtVODwn9AOW3fCNUIWSUQL7NGgCGqj6FluAXXRIhw4d4LAqmW35azQcwBUYMmSI+nUVdaBTqdhV80yKzz77DB4Vzne1qyQrRAkAuKSDBw++//77t/f1WFAlP6OyGEzInwfDRGW5K8isO2dllObmMCtIaVluWZvbO5hdgzHLFf6dEjgBnFrcCAojAN8hwFmQC+QvcM2flyWeHrHp7s+Knv6Fj18VKCNZpbR2mZUTsbIiRiAh7yOUH5hTekFn54lTQo+dVueCnrXb5dLazMny4iZ3MXkLauIPXZFUbC9B3rchtinCguUIVz1QQvyPnlLimYTZgoU8L5uKTPhb8jaL5q+PmgKw0lW/7Pqc/3FPddJiokwK1TCYcZ544olevXphSYdKU4an2vn444/r1av3z3/+c8fl//bbb7/88gsWW6knM6rYFcjesGHDZ599NhKJvPjii6q0dB2UfYKlR9vVHK1+KO64445TSb7UboJiR48ePXLkyLvvvltN9+lVI6yqUFUr4DG8+uqrxxxzDLyBpNzOQK63334bWY466ijsqgJVElB9qLZffvnlkiVLnnzySaxNVQxIyu3LoC0AAdUchOfNm/fpp5+ec8458Fn/Gm3cHnBVMcASiYT6aH36oQdoOwYhtugTjJOysrJhw4b17du3bdu2SYkdorJ/9NFHnTp12vFD2VXyM1BuZYAcYcTAJFko5v53sTfXCHs0yLLLrUjDa/Nq/72ubZMM+U1NZuJMlqoKeBiYZuPU3MiM39eRJ0eXX/VBwYMj2ZfLclbFa+FMCQpucKzl5eezskRZu0DBwAOdx07OfPKs2pf3yjm0YbC27fedacCvECbkLIjLdQEyC9SyV0GEYRO0hjCDeAaVF6yEGYCnJeRnRXFmyM+g+qnq4RTNXxwc89SMcOSRRz799NMwFVj4IgljAwNbiVUFlJzabgWqAFi4Y11bUFCAxUrAfzJXkRSqMqoWMHXq1J9++gmr7a3ufyflfCC2du3aW2+9Fcsy9fEASKptUmK3kO2hFF7Uo48++tprr6lf0kLV2Kr5F2vBRx555Pjjj7/55punTZv2xhtvwAxfcMEFrVq1gsxu1468WAXCu3r44YefeeaZjh07oqitmqM6YenSpQMHDjzttNNgAGbPno3mr1ix4qabboJwUm47pJqARq1bt+66665TBxEZEY+SlZgCwjNmzPjvf/977733nnLKKRBLsZXk3onfVdvVE0m///77SSed9Pe//x3uxaRJkzCEMjMz1YsVaGNSLg2/PFlg+nZfQR16gMDy5cs/+eSTxx9/PP0R4JRYfn7++eeff/LJJ2MQLliw4MYbb8TJiLPe8h+r2nGrkYqxNHPmTIycO+64IxwOIwZlpkjK+VSp+5Ll7RSPsyjjhXzJg0uHh38bZ0yeQiZNplPGHT/B2+hEeRlzGI8xlnA9x/Vcx/HcTQk+dk38qTGl571X1HtIaZdhbpeX+cHDeBf8DfYOHep2G+L2GlJ+5psFD/6Q/9Pi2IY4j3vqQxvyaxf+NzfwJ5/FcP2fSI37f/LeFBLlr6vuRXfaoMrm51GgrwziH2irftkV8djdi9TV/Il4PvF4HIuGzp07//jjj9t7dmGXQJnY/uc//6lbt+5Wz2cgCbuLFy++8MILscCN+Z/eSqZVKzDhACvLyy+/fM2aNepGMmIQ2KqN2F25cuVZZ5115513lpaWKg2TaVUGpaF88Oabb3bt2nXUqFHobaiBKgCMdDAYxLQLa4RZFavD3v7nOJVAsojtUOHzGUp5VHfLLbc8+OCDq1ev3kEPI/6yyy5DpcoFQQCu2BP+72duL0sK1Z8TJkzAQra4uFj2dRpJIR8UhTHQp08fGJtoNIrUnTZt7wTLcfV8Btqe3j/o7cMPPxy9lyIrKwsOB1rqd2TFPYlOAMh7zDHH7FvPZ0BndRItW7bs0ksvXbRoEcJoJpqAQFLIB4676hA1tjHOMVxVn4Ck0JagE5CELXpv1qxZl1xyycaNGxGDXVSKKhBGLVv11Z/iprmCRUTx10Uj6/06nk6cQMb9bo/96cAfI5PL3TiciggUZwlR7oqNnpi9Lvr6hMLLPy44cljJoUMi3Ye5PYa4PYc4hw2O9x5c3ntwWe8hxUe+Wnbjd0UfzY/OL/EiLhrsMTQTvgoXCfnlUUd4ZYKVYc8R8mOijhDoM/WtLd/Zky1P6rYXAFU82UlyC/2Y/8ss2IXa+HPlrsf8n0vz1U/m0uwPqJkO5zC2v/766+mnn/7BBx8gnEzeLVAmQOCaa66Bn6HeGVFJQE0imC+w4PYF5YSCbTK5msAJiFbAvbjyyivnz5+PMEDVJSUlX3zxBSas1BmKqrEYPeecc9566y1lBZWGKrWKyMlg80ORUOCXX37Bah4uHcKq1bNnz0YXKScDwMlI/awd2PE0ovyM22+/PV0MYbQOhRf531UDqB0kk9NAJKqAL6LMAGwAluBYOMKaIheSknLbATJYaM
KVRCerKtCoVatWjRw5UpnMlNh33303cODAb775RplnsNPC906Un3HYYYehpelNQBvPPfdc1Y04lPXr1x82bJiSQVKFBxGRqh8gpp4DRefsE92iNAf5+fnXXXcdHE00AaCl6J/PP/88vRUvvviiP66lkxEKha699loMS5UdVNgzyI4kFLhgwQK4sHBl1LDBFjViIaR6dau+kpeM5JWU3WL7eeW1R3lTBukUtpO4S50J50yxZ5JsL5QgXnG90oOf6lrr7NwAs1lIxKhXUG5MWFo6eok7tzgjxiyHBj1hmILIXxWR1ycNbLNMp10d96h2Zo8W4QaZNEgJ+oEY8r1X+dapfAHVI5RyajH/216m3HrJe0NIktiEyl9NQeof12f3CvwrmWgMGiB3uWyP1Bxb1Zl+gzT7GTjFcLarMKbISCQyfPjw/v37V/IFuQpRpy22V1999SeffDJ69OgDDzwQhatUNbmoMGYfzBdIUqjI6gJL+fPPP79Zs2bq1URUiqlq7NixMMwnn3wyqlZiiB88ePBJJ510wAEHYEJUMdiqcBVBUQBNwxYNR5lY+n/77bdnnHFGOBxGJGS+/vprKAB94IL8/e9/V49SIt7vkh31CfwMHCnM3U888UQyyifVw6gO3YuSt1cOxKDP448/jm5p3br1ZZddduihh6buYe24B+bMmQMfrm3btk2bNkWNkIeZgUrPPfdc9+7d4bUoMZQPx+6EE06A9VU9gEgIq9R9CzipRx99NJQfN24cGpjqH3TjypUrH3744Xnz5sER+de//tWxY0ekyk70n0uosCfVMYJBxdhb6KOe/N37waAqLCzE2Q3z36lTJ7QOMQhPnjwZ/tbll1+elPPfdx00aNBPP/2EQYJuUd+nR8N3MAxQFFKXL19+xRVXNGjQAOcvBjD6EOXDTX/kkUeOOeYYVUJ69mr2M7DPBLFo1INRZEEzHhUZpiiwZ9ww0/xYfnQrahbjn7wbGjZ9oG00QDZG+Mx1zoQlsWkreIRmJeQToIZ8FBRtFMQWzCJuruE1z2G9Wxl9W2a1zKOW/IzX9s4DpY9890T+IzepYFqcRrMfg5li/fr1o0aNgsnZsGEDbOepp57at2/fZPKfAmzepZdeCrOHMCYpFYmJCdbu559/Vt/vUpH7HJgVMefCM/joo4/U+35XXXXVgAEDMvzvGiWFagx0JhTYuHEjfDisOBGDXaBSDzroIGiVlZW1lRnY10Gr0Vh4hK+99hrCcFWPOOII9HwyebfAohzeyYgRI55//vny8vIbbrgBTkyXLl2SyXsx0Pbmm2/+/vvvcaanDj0Od05OznfffafebNptUCb8OZy88FrSCwctW7aEz1rhEqhKfsa2cCK4cE1Oo4Z8PdVixcLJWzm4YPUDy+yECIp6MSNOB0QOfKH3BDc+fWnZ7HVsdSwcMzLhOgQEtwl3ufBME0plkljLXO+QZoFD2gSa1wtkGSyLs4B8+9XiWOjLz4FrNJpdBjNFNBotKirCjKwWLlj8NW7cWKX+OWASX7duHQKYnlSMAmujhg0bIqBmLhW5b4FeBfn5+ehn1QR0MlZ+6OSabhEmc1SNLVbhcDUQs1WNWK2qz7winLpi9BcAXV1cXIxRjbYDdHh2dnZubu5udzgKQU/CoMJmqwJRBfwz9F5SYm8F2uLoFxQUQGF1ditUK+DHp1/p2VVQCIrFyYvysZvqXvQVwsFgEOVX2OfV7GcI4hEe5yzsWobNyhnxor+KRWfNFqVmeVZZvllnXav6/F85E7JK86O5Lgswg3rUQgdwxi34ENwNmaSuEenbKnB4m4z2dc0cE2ep/H132TNQ1TDhYEiN4ZjoixMaza4jzaA/L6hdBLCbupD+54BpJ0XK4EENFQCYCiucsPZ+0Ao0SoXRBBVGAC1Khf3E6ie9amVL1IFWNSIpPbXm1PjzgfFD01KNUs30272bbfS7KnnLEgGUowrf7QL/NJTaSmeFisQW8VVsBcpRqN3UGFMlq5gKC6/26xnyMUbLtVAV2usscUdcNIYtrruyWfaKdsHyVnmFtQMrarGYIcLMYiaJcyIMeYslh8QaBhOtG9q9O2R3bWzUN0hAyN9T8AzbYPKRCogJKrcGFVQ4hJqE/qkzo0bz1wCTcmquwQShwkCl/jmoaQeVpgKpSGzVnLWPAv1T065qS/outjXX1SgfbFU+YtJrT/EnH/EaZaumqV00sCptVKcGAqnSUtu9nJTmIBVIdVEVm5DeLenjKrVbIdXsZzB4BoIEGHEpM4roqAfn/LAqvLFVo/wGwXiQcoM6pigPUe6xTPkMKqfczQu6nRuwI9pmdmoUaJBJbZ7AiUn8308TxPaoYQqPEo8TSxDLhF8pCPLLXy8x9HORGo1Go9Hs1VT3fRP48oLALyDELfphzQeDFv94Wv/SEIsGouFEMGGESgPE4KyO42SK+AENSf+2gcOaZzTKpEH1NXBkpZQRC34EXAqLeESUC8rhZBBiyt8Vk38mqkEF8rtWGo1Go9Fo9mKq2c8gnLnMc41AmHmCWRNHlj202Cw0jYRtOoaV68RrG+W16tLD21j9W+U0zDFs+bKmMOXbp4TJb18aBiO+/8CpQTiRP59qyMc+oah0Q4Dwf6AEwe1eo9FoNBqNRrN3UCU/o4K8iBDCv+XLqeCl3HpuZMnwxdQyedNc1quZ3aNlsE3dYKbJ5Tvg+kFOjUaj0Wj+0lT39Qz1YybypVNu8JjHrenrxbgFJYe2y21RP1jLEgHBTOpR+SCHQYwMUrVnUjQajUaj0ezNVPf1DIn8Kie2Fi+Xj7YbmR61bCEMyol8H8UT1ODyU5/Ukj+dqtFoNBqN5i9Ltd83gTPBOLWZfPCCGUL+Kz+nnXq7BslCPmBhyG+A6xsnGo1Go9H8lan2+ybwMzxBTEb9l1N9L0O9bQuvxP/NDhnGjvQ05Bsj2tPQaDQajeYvSzX7GZtLgyMBV4LC1TDkxQv5zqof+8f//udCTP18hkaj0Wg0f2Gq+b4J9rl/DcMPqqsVzL/IYcGl2EpauxgajUaj0fy1qe7rGf72D4dCuhLy2obyObSfodFoNBrNfkV1P5+h0Wg0Go1Gsxn9UU2NRqPRaDQ1RTU/n6HRaDQajUaTQt830Wg0Go1GU1Po+yYajUaj0WhqCn3fRKPRaDQaTU2hr2doNBqNRqOpKfTzGRqNRqPRaGoKfd9Eo9FoNBpNTaHvm2g0Go1Go6kp9H0TjUaj0Wg0NYW+nqHRaDQajaam0M9naDQajUajqSn0fRONRqPRaDQ1hb5votFoNBqNpqbQ9000Go1Go9HUFPp6hkaj0Wg0mppCP5+h0Wg0Go2mptD3TTQajUaj0dQU+r6JRqPRaDSamkLfN9FoNBqNRlNT6OsZGo1Go9Foagr9fIZGo9FoNJqaQt830Wg0Go1GU1Po+yYajUaj0WhqCn09Yy9GyP8o/sX/xP/XD8mgRqPR1DxyEvKh8t/kRk9Dml1CP5+xF8OJQ+MBL8htRonFCTOZKa9A6RNco9H8KXCSYMTCrGMKj7hBxyIG9Sxq6mlIU3n09Yy9FUE8IRIkG
k5k8ICH05ybccuzKcEZrtFoNH8GQnAiDGJwKhzhhT0Ly53SoJGlJyJN5dHXM/ZeGHO8Iic2g3lBZnoBYhCTEWYJddVSo9FoahZBbEY4IdzkxHI5NxDKahMMNgwSqqchTWXRfsbeCro2QZb9sHLpe6tqd80JemGTChM9Lkyqr1hqNJo/AUw3whNUMIMx06GeWTS57MB/ts87LY9YSRGNZqfo+yZ7KzgsDln+/Qq2xGt9RWsSwEKCcSNuCpvqt4Q0Gk3Ng0mIERP/CEq4oAGPzH1xafYh4QOObkj1JKSpNNrP2EsRhHPhrfhmDVvOWl3VVBiWxU1iuERgHaGvZ2g0mhoHtkFQTqgLK8Fo0Hb42kGrA92D9QY0IIaehTSVRd832Yvx+Oov17mr3SbX1hfUDPIgMbCooEL7GRrN7qNmrYpPIhW7eV7DvzIi+c82bClcMZtlKi5jyxK2V8/OqYwmu4eBQgVjlLvUDjBn/aMFVi+j3rGNdldTzf6Ivp6xt4LD4pGVXy93Vnkt/9NSvm6C/+RLrTi/9Smu0ewG/klFGELy5qOwfPPPKNbs1HApteDOI51TYsnblIx4EIKsRyxLGCYWVpT4b1/IkrDO96HCMOQZiRjMpUJwQ35fgqJYFCQzUOTgpivknQbDf6qSWMI0mEFMl/hPVgoTuaAVqkNBgUqd4P60LTfQXXhUlmPJfKZUWKWqfzn+8z97kY4ptUBzBLSE4lSYpi/tVyxb4n+0J6WGbLj//LmzYlB+VteM+sfV06+baCqP9jP2Yjyy4tsl7kre+qq2wmRYWHDDwAyWOvs1Gk0lgRl3pfEURvKFLeUtyPW6NPGUOCZxiPwuRIAI+UiC+oOXoB6Hwr8wvrDmOAORxc/sUfgisgxKqSG4iQhZj1+ANO3CIwaMPGx6EJXKolQJUADyUhuPcL9Uc7N7AC9FnuOyxh3iCYa8AWJCFJ6EK90aYcly/Puq0uWRyFLl6yL+ThrQNdULUlPpiqBV0klypc4yny0LlFoKf8pBOZw6S5/Mzz04XO/4utrP0FQe/xTSaDSavzSwpqagFjcsbprcVGbeoaTcIMySK/qA4JlEBH1fhFN5kYO6FmUm/APfMeD+NQlPmnjqMep5BrwK+CWmTQxbIADzHGA0KOuCRYbFZpbJ/YsYylb7fwanJlwdee2CyScfpPuBPYHMKM/gpqyrUhiGfEQCmsJLsCix4aPIR8cBXAaKJnBUgOI5arO2+JPXcfxqpI8lL7dQ6f1YTEiHi8O3QHMCAnEoh1VWHY1m+2g/Y9+Ac7nu8eHqn8ogc20ZQFEqAFTJ1UiyXL9k1KgqVUmamgDdmzqyIBm7d6D0wVZp6A+HbZbVfyKwqbZHTI9I0wkrTDwqoiESyxYJk8SYEWc0wUQc/oR88jF54QJWV3gWc8wYoii3qWebrm16toEA/riBAuVDU9JGGwaXPgssuXQ/pLsC74HZzAmwGKeOYzmu5cHqo3h4FegbnCU2M0KeZRHXM0u5GYWHo7TdKRajlktc4sVMB7kCMSPkZJgs7DtUhBHOBJOloe/lnaEt/uQFFeoYJEql3oIZBDsxajjCRHaTuxaLm9wxMNXIO7UaTVXRw2gfANM0pZge8O8fxsTf2QkQw+S+cuXKX375ZdSoUbFYTEXWEKrSlEXBtkar0yQSidGjR48cOXLRokXJqL0GHHrGmBoAKqDCewrYfXljw+au7cVJgnFmOLYRM6ljCjgDLCi8gOlZhutJv0Eu5uFssAR14zTOuct5DF6I/EivvEwgvRGEXcFdk8Sp45oJZiYodU15KUR6EJ7J4xaPm8KlcCqChKFUN0bLXepCGWnoBUx9woMxh8PDDSa4X07l+0gqkSDcIa5LIsR2UQNDqTJBAn8GIYOZBpMXUfBHkczk1RT4QxD1BEkIJyES8LxsTkLMtOAXwSGS13EM14TbIT0l2RiNpmro5zP2YjY/n9Hq322FKae/guLi0tJS/2GxnQPXpFGjRpZlDR069Prrr8/Ly4NNat26tbyXbEj/Uvou1UrKsVizZg1MSygUqlevnmnKG7nVXpcGrF69unfv3mvXrr377rvvu+8+HFbVz3tDb/veJl+1ahW2wWCwcePGKl6NvT8fWF9YePmMgSA2bGmBKBizqXjORhZPGJlZOQ1qGU1Ebodsq1nAtXlImDDFHL4CNSxm2FHKC1jJsjJnZaJkVZEbSwQyAjl5OXa9EGlEc1rlijxuheA7wGWAdUdu4lB4DiaNkuiSSCBhGyEabG0nMuI4+UIsA/ow0/OIgG8ShEuyJCGKOctiwTaZli2P3s6PIONesRdf6cqHw/O8wAGGZ1mGCFjwPuSjFnCGqOEa1KWlK0sRkFnkA59qSwNZQau2xTOEayUs2zCYQZnJDJcwy8l33TIGHyO7ZZgHXcMwqXxqBeWi7/TzGZrdoUp+hvZRapYtnwOljD325FPPP/ucmoN22vlwLH766acGDRooP6NWrVrwM1q1aoWJXlHt1ggWBVsssk888cQFCxacdNJJQ4YMURVVe10aAA+jV69e8OruueeeO+64Ay4dqIkjuxtgfMbj8VNPPXXOnDnHHHPMm2++iUjldO4ZcLrIBTqhLikeUzTpsYlkgpmZCCOW0hARrDxzkzicHzP4hERjEUQnEiz4uevZgSJj9fcrF7yynC80ScSDWyAfr0A2QwQJ9UJuWbtEx9sPOmBAUy9TPhjhfycTJwKn8UDhbxt+u2lUnZJ6RY1ip75/ImvqkoAweQArBQEXIGbFMxKmZ8/49/TED7Gy5uXHfnqsUR+uyLbnizrZ/RNfztpycpj/+uI1j67NckPsCNZnaC8vG36N/9aI9DPkfRyakA9sfHzmJ40WNfJbr152IfAcvDpuVs+sDhd3CnfO8LI8eVdIvoMSD0bCcx6bvfr9dfHs8tO/OjPRNCJdJP/bn9rP0Ow2+r7JPgCmbJhwTD+cM8d1YciBuzmwcePGDRs2FBcXO1vCGMMWYsirnBLsQt7zvFTMjp0VVW8qrAI7BmJg06ZNUKm0tBQ6oLrU9fNtUfIgub8zakJS4WuxC1mUPEjuV45dlU9HVQeS+2kxajCog47dZHL1oQYMaoHrgFoQTg2M7QEZGEuMgfXr15eUlCjdtlVvt7VFxvS8Oy2HU4LVOkm4ZWPKp1wxJ/xbpmGb4aNy6lxWL/f0MGvpBIwMJ59zhwvukgT3mO0S187nC29bsOyq+RmTMjIjRmYru/55depcWafOP+qa3YKxuoy7gs7zxGrmMVJMXAcOgOCw5y734Aqs+nhl44X1MtfWypqTs/q3dQYMuucK5j+uCcseIDaxLW4ENtnB/GDG2jA6etvzBSFkEygOCgk3KrjAESgT697Jz1td2yigkR9jsflR4VEP2kg3I5mRBKVjEc7PCK/JDGwMxUnC5EY4YRnrYzlzbPGGMWHg9KLRpdQVUlnBLJFJLENESK2VudY6W/avZ6MxKDNZoEazW1TJz5Bet+ZPQa6vCLn55luWLVu2aNGiuXPnzpo1a+bMmd99911ubi5Sjz322EmTJk2bNm36Zn788cfa
tWtjfsfMhbyYNJSfoeyEikHJ2zuOqXgEVDgVUyGpVNu2b7755kGDBg0cOBCWCWDeVGZJCQBZok9yvxKFK1RYRVaIElMNBMnY7aBkFKkYFagMSnjHWWTR26B6XpGM8klGVUR6aurwqQBAD+OwAuVEIiYpuuuoAhXJKL/G8vLyc889d8CAAW+//TZ2EZkuUCEYltdee+2TTz6JjFBMjYSUzioAMbWrUBl3SkpS5QLpkRUiKHcNh3Bz5cuLgiszWQ7v/FiHVsMOqH9v3Sb/a9Lzkx6HvHZIo783dGHDE54Dk+wSI5E15+GZKz7eYHu5he3XN3qhYc+Puzd5uHHdu+o0eqRB17cPOuyTvq1eOjjzsvpuQ87caGY8lhnltmcRbhkkg61ghT+XGjxMiRlOZC77YrGzSSRclqCeBSeDGsSEwvJrG4bA2U0pNzzXq+Ag4l/55CgjkOMWlW+YkOLxRXSBPK+pMLPjWcvfW+GWOJ7jMTgksmNlNnkJAs5GXF4+CfSzj/zuiG4/9ejwSbcG/2tf1sMh1DE32HNemk020YBn+K/G+C/VyplGaiZc4eGIeThQAiUoXTSa3aBKfoYcgJqaR3Y1EYwzy7JCoSBAAAcPFh0BtdYECKjpCfM7kgKBAOYsTO7JEnw/A2CuT01kfr4KQKoKKBlsFSqyQpCqhpRpmueff/6VV14J1we1q7pSBSpUaVvVsgOUPALIsmPhVCo02bEkSJWpwgo/pVJURliVCRBOKQ/d/ESJitkpfhlJSRVWqF0cU3iQKUNeLajCFVAYx/HLL78cM2ZMQUFByndUXVchyILB+Y9//AMj4cQTT0R2ZElXT5WvSpAt8fFTds5WwmoXJPcrAkv5oBfwIqxswUZbeHW617ZOyNiUtTzC1xYbGzbUWV/araj5eU2djBi8cflGqPDKfivb+E5xtlOL1OXHPHxUw6PqlISLikVhuSiN0KLSUMGmZuv4cbzFTe2y+mbHRBF1XA5d5EujhuA0/8f14VV5ZQE3ZkUC3OBT4pGZsXKPRUhMuGi2IQW31NmJy+tS6N70jvW7yWCECcMz5aOejse9/M/Wh4tDpcESOEQBxyr6PkbWeIk4HI2tBwBcB06ZY8TLrZL8zFWFrdeHTyGdH+jgNPWyuG3MIoklER53TMfEwZAZiHzPRL6HEvfU5c/0o6bR7AbmAw88kDxHNXsbnJQsKuIlIq9HHcwUlmFg8pavtfrTAZwJTEaRSOTNN9/EdNCmTZvjjjsO8crSIxVbhOFwTJs2beTIkeFweODAgbVq1Vq/fv2HH3749ddfjx49uk6dOvXq1YMYasMW2RFQhUDsE59Ro0atXbu2cePGoVBIJfnKVYDKjllpzpw5q1evhm45OTlKE4CYpUuXbsCUXqcOClm+fDk0hxoTJ06sX78+IpE3XQeonZ+fj6ZB53g8Pm7cuDfeeOOHH35AxrZt28KLUvLYKnmEIYxcr7/++o8//jh27Njs7Oy6detCgXSdi4qK5s6dixZlZWVBq1mzZr388ss//fTTQQcdhC5Kl0wnGo1CErlgOyGGzsGyHnb3119/hVVo0KDBVvqUlpbOnj173bp1mZmZOAQIv/LKK6ilXbt26BMlAwoLC3/55Zf/+7//GzFixO+//w5JNBZFoRzVDyAVhubDhw//4IMPfv7552XLluGIwAagsejnXr169ejRA81BXmi4cOHCVatWIV71KlAlQCscGvRqRkYGvFXVaWqLo6aUeffdd6En2rVo0SJEomnIiAJx7NSVjObNmyMSXYFOQCBVgl/PHyAS4xMNx3EvKytDu1JjEqBk9CeUxwHFAINWSIK2UD6Zf/sgL7pi6tSp77//Pg40xva8efOaNGmCA4rUbTVJIogRN1g5X/LBwlC+bbcMZgyow1i5RzNNI08YIZ4hPCth4dwxaZAGLB6cd+8cY1atqFXc9poW7CRabEZi3KNGwMCfaRGb2tQVpusEOTc5Tk9ih5yMELGE5VGvnC15eI6xLJP18pr1alg2t9iO2TybW71zaMDJEEGCDPJZCvm10Q2fbPAWel4Or3tOnpvhoCvQD3+MW/llDPgK8muh8psccDRWeCseWmGWkNpn1jFqGe6quIjZZhvPaA+9gEkM/8MYcGPixsJXVwaKa7OWJLd/jmM7grgmtM22yyYyczF2RPaxQa+xEQgEmSkfA8v/Od+ZxGNZsZbnNItnx9EdKNOELiiOsqKxkVADO7N1WN9y1+wCmA40eymuWP7F4kXPL+SuYFx+ANCTz2c4sVgMBgNT7caNG2fMmAG7hfnob3/7GybuBQsWrFixAuYNSTAbJT5PPfWUMmCwH3fffXdubi4mMhx6TGSwxI8//jgKTF3ngNXE7uDBg2FCUpMdtrAumNCRuoP1DewKksrLyzt37oxcZ5xxxqZNm6AAjDQy3nLLLSinYcOGS5Ysue2222AVsAtNsIUxHjZsGGRS6yfoAMuBQi655BK08YgjjlBzrlIG5cO0KEm18kNe+B+nnHJKypJBGPr/85//RFcgVUli+9VXXyEJhhaBa665Bj2DAgFqQSekylTFKhAzf/585TOhM+ExwH1BFaoubA877DBkV52jMsJ8Ih6Ff/bZZzfccANcNNWZ8H78i0ryIL700kvoDaUAysEWWdq3bw/F4FelekPJjxkzBt4VZJQYttDhwQcfRAkIX3fddehVWPTi4mK4F9AHBZ544okoR7UIoMDPP/8casAXgXlW2qIKyOAYoahUo/xKDEjCF5k5cya8BPQ/4lUS4pW2BxxwAMYYSlDdi1an9xsKx0jo2bMnhE844QRIArQa8nBQzj33XDQcSSgN4Ijk5eXB/1NFpZeTDsqEAJZGEEZeAJVUOXCX4YGho7ZXAkZ3wo0lNiV+7zl8MpkxKvf3OY8sikwsKFmypmRVcUl+tGxjeWlhcUlxSUmk1ClNlM+O/tzy5/F09si2P64bv2HxiplLl81dtXzxhtVrivI3lG7cUFa0PrqpsHhT8frC4o2bSso3lSVK4+WuV8bjPMI3jtgwIueHkaFx8x5bVPD2+p/qfTfRmDa+49i1EzcVblrvlDlxl3Gc1fi/nM84afpEOnFU09/XT8pfuWIlRixOGTQ22QQmPCYSIs64yx0hInzlSyt+t34bkTscDsqSR5f/kPP9WDpuwt/GFczZULCxKFoejTNHvorrObyAf9fm2zHG2NHHjVszZc3CxUsXrV6+auPawkUlE46eNt6YPLLJyBXDl69cs3ZTcUk0FmMRd9Yds3+1x/3YZkT5tNLVq1YVbiqKJxLy+GIGYolFT6zY8F0BwhpN5dFO6T4GlheYl2EAYCmxsIYBwySLA6nmXCRh2kWkSoJFwS7iMWFhpoAtf/LJJ2FXYDZghJCK6ezhhx+GLVFTs5rasMq86aab4MdADA7Hf//7Xyyd16xZ869//Qt+DCRTM2BSJ5/0XaRCH5gibAFi5FDzQV74Ok8//TTijz766KOOOgoaQiUYD9hyREISIIC2IADv4R//+AfMc8uWLeFGwAwjEqtkaAWjpdRGDByv888/H9Yd7hSUR0v
POuss1IvVORyplCFEsQiojnrttdeGDh2KjjrmmGMOP/xw1XxVIDIqtRWIQRZkB6jixhtvhGHu3bt3//794QxBYMqUKegc9Fh6CWgsDhNqgT+BwJFHHon+RL1KGfQAyikoKEC7oPDzzz9/5ZVXYkG/aNGiiy666Pvvv1eFKJ3RXhjmxYsX165dGxU9++yzF198MfoNnYYSICO19FHyKgbKoC51sLCLeICqoRgCiESSiocnhP6EUwj1Bg4cOGjQoFtvvRXOQevWrSEPMTguxx13HISRCx4nDhw6rW/fvqpwgBpV1QBiqlgljwBqRBh9iC3EUP4nn3yC3dNPPx1tue++++CSqiObcjRlEdug6kJvQ+b444+/6667Xnjhhb///e/wU6H8PffcAw8GeVFOMkMaUM4xmcjgTf7WsjAUy4xkLX9mwcI319FNuTxkuTkJI0wzraxgIAPnELFE8YJCt1AwEq/VLitS1yMJm/KwZedYGf5fKNcO1ibhXCsjKysYygyGLN+VzOD+r4wkSP6HqzIjdWL1Ig2Prp3o5BhdLM6NxHLhTtloxogriPyO1pb4tyvQk1ufWfLCBOWWsKj8hSPGY2LNO2vCLGweFDA6BoK9wm5TLyBIZIrLF0lXDOd6egGmSAR4NOyxLNfKdAM58LGJTZfQxIJyj8ZF8zivY3gYCywmP+JBeCgzQ1D5WKjfk+rI4qglS9Nodgd/GO0m6rTX1BQOX/b5ooXPLcDixGMu91ysfpIrNd9vgBXBKjY3NxfHUb1KCqu8YcMGLECxsIMAJh0EYMMgALOUnZ2NNfGXX34JMfgNMHXS8hsGzBXWweqqA4wZ1qmIPO+88/Lz8zF9r1u37ueff8YKEpGY2WFi1Wob5SdV2YyqEevpgw46CNZFXc+A9UUM7Mftt9+OEmBxYWB69er122+/oXyogTIhjDkaVh+Fy8W746AKqKHikeXaa6+FAYYwlrzqBkGLFi3QdpQMYdSL5TgKhxkePnw41s0oee3atVdddRVKyMnJWbp0KdbWkITaWNNDUpqEjAy4FxMmTEADV65cqVSFGFTdqmloFGpX9zuQEa37+uuvV69ejVrgoqEt0Aegn9HzqnPgKKAWRKKWPn36jBkzBvpAf7gFUG/cuHFQFbrB88BRQzkwkFDjl19+UZ2Pw7Rq1Sq1qIULddppp0EYXgjqxWIXhUB42rRpOOioAlpdd9110BBZUDh0QBehEHgGkER2OGRQCUV98cUXKAcqffPNN5BE+UiCyUcPI75evXqffvqp0gSgTxYuXIgADgrAmEFdEINTAr9nzpw56H8MNnSaKn+rTsMuyocmUG/AgAGq4ehejDS4FBiNcKcQierUwcIxQljJqGOaLCgN1ILu/fjjj+fNm4fjha5AFvTqzTffrLoa7VKHANm3KgF7nuO5Tkl8aWLywJm/hn6ZZEwaH5gyquPola+tjK0rg7o8yh3Pi3vRhBdbNWT17/avv1ljl16zcOWcdUsWrFi9cnVB4abiWFmZhwqkPXc4c+V/MVdmduOyEjhKOIsS4+v9PInMmX3BvPWLVs5ZNG/+4/N+tyf/Zk8ef+Y3RQvXF26KlsUTTJ7VjEXY9BOnTSATRjb6de2ENcuXLUOvohNSXYqKIIYpACHHixWNKhkN5c0pKx5ZvXxx/tJZayZcMnE2nTLGmjL7ljlrV+SXFhdhEKNw5iX4BvZj818mkLm/Hzdl+azlS1esKF9QXjqi9PfeY363f/8t/OOy/81fNnfpwhXLNm1ckyiPuOXO0keWjbLGfd92eNnUklWrlm/auCkai2P0yFdovPjCx5ev/3aDvFii0VSaKl3PwKSj2VMoM4Ytwsnj4a+hEQlbqEyCEoBHiCSYGdgwLPQ7duyodmFlMTVjECg/AJM7ImF1MOmHQqHbbrsNJWDKRmS7du3+9re/IRdW86l5XNZX0RhQ8agUMigW06UaaigHkdg95JBDXn75Zazj1UwKNbAeRapydKCGkpeerK85rMgtt9wCVRGGJNbWyAVJaKLUhsl5/fXXkXrKKad07tw5Zfb+8Y9/hMNhiI0YMQIxqALFKt0QRj+8+OKLDRs2RBgdBQGkIpeqFCTb43e1r4uMR+3vvPMOaoEwMtaqVevhhx9GdyEVRlo1QSWpLAceeCBqadasWSoevP3227DcaAs6Ge4LmqA6qlWrVjfddBNqgdGFP4QY6IPw77//jsiBAwf27NkTna9qqVu37n//+1/1BAZKRl1KW+wC7CISJatCVCq2ShiRQPUe3Dts0QP33nsvfCYViVTIo/ewhcLIhU5I1aLKAZBE4Soeu0hKBzFI8gWlJMpBsZBX+gPl1aka4eugEFWO2oJkQWlAz1NPPRX9qfRBmRDu1q0b4lEI3BSUJrvY85IZNmMQeRyZFeAN2UH/bZd5YVZh7ULKvNy5mQv+s3Tm+ZPLJiRInAiPB4hrEpMX2wECn9jxGliu4ZgGNW0zaNkZZiBs2DYKo8Si3CImNQPUYCFBgq7/SybE2fDZOrMwtzRnY+3Ta5eGygPEyDuqTqRJYZC4JaMoWWNz15O/JbL5kP1BRe91yO9hcHm1I2Yy6gU2vL4q5NrFLSKho3IYLRcZkYZn1inILMzxsjd+uyGnMCQSceaQGBSR2Ugso1iY0dwZwdj/itfct37hlQunnTLZnkjitdzQbbUDpwhhe2gCMQL+Z8cNL8PLYGYoHEzYQqDZyYuSGs3uo++b/PXBYcYWNuOOO+6AWVVWAcA65uXlIQlTP2ZnNfX/+uuvEIANCwQCWC/ChGMBClMNc4h4tfKGPOZxVY5fQ8UgVZWp5n0ln5WVBXsGo4J4xEAMK3u1noaMsnBKEsAw9OvX79xzz0U8QL2QVzpDRkVCbO7cuVgCIhL+EJbXADrDkCO7utiDxbcyPxBWhaB1//73v5UaKU0grzwzhLfHcccdB48HWVAOckHt5s2bw2VByQsWLEClUAmpShiloRb4E3AOlDDEIDB58mTswtPCyl41QZ6K/sWe3r17o/OROmbMGGWAp06dimLhO/bv3x+7aAUkAQpP1xaFqEAKFOI3LulgQUCRikcTcIiXLFkCYfigRx11FCQRDxlVPir1q/rj/leKlAC226ZuRapGlIw24jAhFxwpuFkLFy5EpGqIIplnO0AYdaktdlWWVO2pIQQBFfMHlAj5DqnpBoTbkHe+oWvnBztHDisuC23MdjL472LsP6cu/WZdggvmwh+hhLOo/MwVFEKuhGnKt72gp3xMFKeULA+oaqghAojkJmFUmDF72TfzS60ycggNtw+G4zRT1DbrBJsffkCQhbNKa6/9fl0wLmK0UH7uQr6gIk9Rud0OsP4WDp98t9X01rDVo9eW2fH6fXNCdaJZrpXNs2u1qW0emlFilsRXO5umlXgOZ0bCkDc95PuzthOwhc02lpe+s8l6x4r9FstMZFoNQl0f6dzq3FaWCNskEIITRYWFBpgks26YmS7JYizkWo6NSpN6aDS7S5X8DJzMmr2BHR8LpGIihmVt0qQJTIuamrELe2nbNlJhBhCPORrmcMWKFciyZs
2aI488EobnaB+EsQRHIRBQxg/C6aZrW5RKyhinJFECnBvYUZWESRtzNxwgbBEDMVgIJZzaQhjaIoACIQYrBXlVOHQAyLJo0aKY/9MtgwYNOvzww4844gj1MMTpp58O/wOVYgENSaitMgKUiaKwqzoEvaEeZ4FKkAcpyXQQD01UoxCGMLLAc8rOzkYqeiZ1HyelP4rFLsJKf/Q5+hDuGrKjaYhR/aDahd0GDRqgQMjjEKiiZs2aha1KQoyqF/IoGVulqtoi11ag1UqTVCoCINUK5TgiplmzZlAATUBvoC4UDlWB6hbEoFKVFwKq4WCrTlNVbAWyYJuqEWIPPvgg2gjdPv7447PPPvvqq6+ePn06qsBoBCgNVahc24IkFII+/Oqrrx566KErr7zyrLPOeuCBBzAMUDKqQBdBICmdDpb28moDFuimm8XKGpebJxidnunc+LaW8YbE5tm1V+Uuu2tF5KtCE9afEytTeDC8wnY3lgkTRjh5gWRLxaRjIZvIbUYMV/5GmlH2e8yYYcL3aNO7FY3y3JJQ2M1A9iYnNYoFY0hY//N6uoF5jv90kXoawy9re0BrZnFhOMEEKfpso70uA8o17dOClptGoh7h9Uwzp+VxrQ2jpE555urv1pJ4hssjFs4k/5NdFjE9TkqbcXJJ7Yy/5zkNAw7hMZFwcxKlwWLPIKYdyDQDVsCi8hdthWmjtfLVFs90bcOSl0U0mqpRJT9Ds8dRE1+FpJIQwJFGAPMvrAJm6pSVUgIAAilLgF1M+rCdMDzYwiSA+vXrY+EO04gpXlkvJbkV6fqoMpM7m9VQKEsJZWDJsFUCqkzooGRUUdhCGPpATKmNMCKBEoMMbIxfAEEqdIa2OTk5ubm5CMC1gtqwXsr8pCRRiyoBSdBBKaMqUkauQlSNSgHIoy6A7Or1TpQPGVSRqgWoSJSpbLbqcwhji75FTyIJOiBeCaBAqIFU+EboaqhdXFwMGUSiRmQEUADyQMVAGKhcKRCvtAUqBgE/t5SHVio+tUXtKoxyoAZIKSyNv/+lFqQqAYSBigSI2UGnKVQ/qNq7du36+eef9+jRA4WXlpb+8MMP8Ajvu+8+NBZFAaUn8LNuAQ7cp59+euihh1500UWvvPLKwoUL0YeqadiiFgAZgIBqkcL3MxjakEHssBkycBzqi3hbxi/J7Dy4vdsVhjxed3Vw2UuLvPVWifDMOtwwY9muFVtQbDhhHCelj6+XRIZRIpqGECPwGBi8jri59rNVNGZneOG1766ffNXMyVcunHzR1BkD5895YEE4lh1m2WSBWTQjQmIBpaNUbsegcI/GhMcTYs1Xa+x4Zq1o5oJnFk67fNrkS3+efdHIuQPHrX59tclDlGdFx0ScZQaPxanD5IuwnCQshn8ymtN6V5t5tzvNL69XllXONnnznp2XtTRUnhERIWoHMjwzwC1fGY9aAsM0QzDqmgnfV9FoqkSV/IzkCafZE6T3//aOhYqXh8q3JcoewDjBQqjtVjIAZgO77du3/+STTzChf/HFF99+++33Pgh8+OGHsNxqQleTOIS3RRWV3NmMikQuZZlQu28rpRlTSUosvVgVSFlZCGOLsJJRqbAoiPfF6Z133gkDBp2/+eYbKDx8+HBsEaMeCEUugECqWFQNoI/qDQQQCWRZaaTqSgWUsJ9bsmnTJpSsum4rJwylKUnVZOxCW1UITGxKAEBAJSEvUhFGmQj7WsuAumaDJJh/lQUFYhdZEFZlAiWD3VQMUOGUVgC7kFSFYLesrAw9CWWwCxAPfWQdm90IkCpH7SoBlYrIbfHrkfKpAGpUge7du3/00UdvvPHGySefDIcGisFpeOyxx3wPIekCqlwAuwqEf/rpJxzNVatWHXnkkRiiI0eOxCFGRmiismCrAsk8f4BuYf49CCPAzGyamUkyQ7YRqJvwDi5q+2DLaMNyh4rCpWWl61icOTnNs4xcNyis6PxIaFWGyTJQxBYtRQ3yngJnKFmqKb+l5S11EiNLbS/DMwRbz+zZITon05xt2dMDkVXRjVkboiSaWZ61esT6jJJcIS+cwKuWHlCyzIpAIv4MGiqbVlI4v5iaXqG9KboyYs7Ksecb9iwnc5bFV8ZiFncoySrMLPplkx0NYhiiH/yrERYRLppvsVCpVRw60c7un20LK3dSTsFbGzOcHAxMagvTDjILXWRYufbG7I28HrHkT7wxYcrRWMFjIxpNpamSn6FOac0eZ6fHQglgC6sAUrYEI0AeyM3ZkdSqVStEYh4vLy9Xy1m1tFUL95ycHJhkf+TIjCpXhVSY6lclZyxUndIBlar4pFAaKhKTO8SUJEhVrQKgbdu20A2BuXPnQj21EFfKQ211YQap0jZufipWZccuCkTJCGBXFYik7YFU4BeTVAnAsYCfgfhmzZplZmaqQlIWHbtKONVY9GH9+vWRtGbNGnWfApFqiyoKCwsjkQgCBxxwAGJgdPPy8rDruq66B4RykBdFIYvyQrCb2qaTHoMwilq+fDkCqXiU1rBhw3r16qGoZcuWoWqlodoiEgKKVBaVHajIFCq1MkAYJQMclH79+j3zzDMvvfSSujsG73DdunVoqWqXEk4vHPEvv/wyXCKM0rfffrtnz57I5Xsm8pkbhJNyFSE/iSVsjCZpe01iWEbQDmXZmXVETmYogx3Msw+tZcEWx7xYPGGWx61mBmsbjBGTrzPLv9+UnchE7SnFfIiQP40qGP5D2YRZLi0YWVC2Nl6SVdD6oaat327a6o2GnV5v0ubjZgd/3PbgDw9sN6xFaYsim9DouDKxlDH59U6GVqnitovJ4SqEElbhO2vrbApEahd1H9Kj07DOTd9s1fydgxt/ckj9jzt0+KRdm0eblOetIY4RGb6JbbJj3DMdlxMWcINobjxgRc06rplXWJ+1uLJlvEm5xa11X2wUPwcynCA1YwZnBjENTsIHZbR/pHWT8xvAibIITnYMA3k1SKPZbfR9k78IKXuwA5QMpmMVSMZuthkYDTAwJ510EnaLi4s//PBDRKZMIJJgtmHFAUy4MkXphaSj4itMRaQCxQLEoN5UUmqbQkmiLgXCKYFU4MADD2zTpg0CX3/9NQwV4oHKpdQGcD6geSoLUGJA6rFNn2wPZdKUsGLhwoWzZ89GCVhhYxeBVIsUfvEStQtNYF8hs3r16jlz5qhIZEQMzNiIESPWr18PZwJGVJXTuXNn5IU3M3z4cGwRqeKhybvvvltQUICwilFAGAcIBeIgOv4jKQACv/766yOPPKJ2FYiE09O3b18E8vPzP/vsM5SJTksmVwSKhZJK22TUlrVvD2RJhnwNcVygJFwNeGbHHHPMOeecg6YV+Z+eg86gwjKh3pIlS5C9a9euKAEeCWKQEVvIQzHIqIqwVfj5fOALcPzJ2xDSbOJoWCJEM00rzOoGcnie2OiYRqxuVjgoCI3HyjJL2/y9Q2nQCbu1Vn6wtHRSqRUPENfwmCs/Aa5KIdwgwkwQL+oSrNkifPUXqy03O7NVVujE0Loe6wu7FSU6xd2OovTQ8vKuMdLNbtS9qS1IcC1ZM2EZc1GYo
Ew9QorxYXDTgMbpf77eboA5Yi1bN7zEIMGsExpG+kVKu6+Ndck32yXogU6sR7Tw0ML/b+9MAHSq2gf+zryzzzAoa5QlIUKhJCmUKCKt6rNESiv1FRUKSZuSFpUv+hNSKCr7VoSUUMhOZMk2ZoYx2zvv+//d+9y5XbOZ7WWanl91O/ec5zznOeee85znvNsEtooMrh3pcwV7NyXFbTqZ4EtN8yZhaCDHB1dQZKov1B1XKjC1ZEqgq57nkr61j0Qfjkoos2Xk3sAdLm9aMkGr23i5x5V2gafifSVLXB/iDfGGB0S5A/4+jShK/ihQnCErWTnn8CzEL1v3p2M+K6sIHw2SFpylKLnpppvq1q1LYsKECTNnziTBniexBVcO7ohJwJFZlY3oBKeA5EhCcpw4M20xOyG9E2wBkHxse+yxx0hz4h8yZAihhhgMbGPsQ2zG7O7SEUyyq9vm2Tk5g9iUKVMWL15sb2xxcXGDBw8mwZbZuXNnkeHqbMXGbq5Lly7R0dHJyclvvfUWNlMdbfDbb7+98cYbCFx44YXXX3+9aGjcuHHlypWpO2PGDJqmO2Tu3r376aefJm4QnUhSnQQQKFSqVIn8PXv2SBxDQ9OmTbv//vtPnjwpYQTCEi4g1rNnz1KlSpH50UcfIYYwveOW67p16xg6WkSe0WMMqbJs2TI2eBKiRIxH3mg7G6RUxOLj4wmP5LUTmUho+/3337GkTJkyNEGQQYtiHvJSV0ADD5Tr8uXLCTioSJo4b/z48cgjKW9gSS3Bqkldly/N403en/zdiOWxC+MDDga4kwPJCna5g05G/DXzQMCvaV63L6VuuK+kOyEgJT7UU+6mUqHNPcluT9Sx8zYP/j1+UkzIvqCARLc3zZOWnOY66XId8B2fm7Dm4R/3zdmb7HUnbk1M/imRUTmv00Xx4fElPcGhARHx0e5I3GxQcqgv2BsaEtX5/L9K/BXlKXl4dporIcCX4gtMDvSEEA4GhPuiQk5Elo4vVSIuKjQuJDA+MCDeFRDn8p0I8qWEHpy/z3UoMSkkscJNFU+5E42fPw8O8YUFGh/bDEgLcUWllUk9r23ZgEBvUGpIwjfHA0+lHQtMcmOt25sQ4PES3KRFJIf7IsKCAsPDIzpHlbuprNeVVmJX4I43tnoPR5xK8/A40wICEn5NODwr9ui8mPCA8IAgn/G35nN8uIpyRgoUZyhFhwzu1YkUceV5S04GnPlsOa+++ioePyEhgW2Mzel9E3bTtm3b3nnnnUlJSfJiBhuDVScr0JmlSTnYmSVnlDdmsfmbYN26dSPx448/cjjG2jFjxrz55pu9evVq0aIFGxvbJDY7N0Wpnifo8v79+/v169e9e3dChOHDh7dq1Wr+/Plsyb179yY4QMY0x1KewXhuKcKGSy+9lCiB9M8//8yQogdTUduuXbt9+/YxvM899xwPQsa5QoUKhAIInzp16qmnnkIGmjdvTgevueaa0qVLoxkD7EYBq7ieOHHioYce6tOnzw033IB53Pbo0UMEsESgVoMGDZ588kkyiUJeeOGFO+64g1jtiSeeuPLKK4l1duzYIVWICRhJEtj8wAMPfPjhh59++qlEJM6mswQB2mL0SDCvnn322aZNmxJsMdPee+89EitWrECAfhFGoBPsJ+WEAbn66qtRcvTo0bvuumvQoEFYcvvtt1Nd4qdPPvlk7ty5iYmJUt2pIS2QAC3Nd8J15JP4tfdtXnbL6t/6b906Zuf2d7dvePj3XcMOBCZHHygfU6V7uaQyRyI94eHJpRNLueoOr5NwfXxScFyFPZF/Dt28/O7vtvfftXf4X3uG7lv/4G9L26zdeO9B32eeiKPB4afc26fv9HkCUi5ILN+srMedEhQQEhEaGR4ZFhoZEhlcqkRoVHB0apkmEYGXhngCAlN+T0vc4EtxJXgCCOxciaGJnqPxm/usW9Hlp8Wdls+7acnclovmXG/8+1WnOTG/xh386og7LdxbNzC6foDXfTw0wBUWan0niBg3PCw8KCSoxk0XHi3/R5o34MiK2BIHwkKSjPd1AtyJQTTh9nmCQlJCwgJCwiIDw8JLBld5skZizVifKzHmuxPxE4+FnkhL9ZwMTPH8OenAoV4ntj25t8SpEHeQhzCDbeKMj1hRcqBAcYasZOWcw7PABYs3z4ztI3D0VpYDKRIZruhh95o1a5b8kuPq1atff/31YcOGsa9s3LgRp0ams6KkMyD5uH65FewmsjPDSjkgUzYnsLJMnMKikzDijTfe6N+/P0f548ePT5w4cejQoWxjX331FRtbdHQ0R2SRBKmepRk5w+7VtWtXNuZVq1ah/JVXXtmwYYP80nnfvn3ZHemy81UTyNAKORI9PPzww1QvX7787t27R40axZb5wQcfcNavV6/ehAkTWrduLR1HGG3s+o8++ih7MEHepk2bVq5cSUzQoUMHbBD9XJ07K5EW4QutoPDbb7/dvHlzuXLlxo4dK+/XOAcBGQKIRx55hKdcpUoVlPDEseejjz6iazxuNNMveQFjxIgRl112GVXoPsZPmTIlOdMfF80SlBhjkd5i5cqVjxw58uWXXxJgvfjiiwQuUVFRdJCoCBmnNruWgA3PPPNMmzZtUHj48GHCx8WLF2M2T5l8Snft2sWD+Ouvv1CSwSrjh64CEwLdrsjIgNCTASV+i0x4/+SR54/FDj4eMC05PMmVes2pxqMahDdMPBUS6wkKikqNDA9yueolX/N209I9yp4sH+JKLRP6e/iJjw8fe+nA4TcOn5qRGro7KSDiSGi7kIj6Ub69niNz95f0BpdpXtJTOSUtKCUkNDwiNCIyPNxdIiQgMpJkifCg0NIpNdtXPRV6MDol9vA3ewJOBScZoVpIUlCsLyAleEdY9LaoUlsiS/0eUWpTZKmNxr+lN0amLUxM/Sk2OdxT9paq3kCGpWRgWCmiDCPCSP+aEpPcXTm4/M2V0oKT3McD//jhSEh8KVcSz+9kQGCCOyA1Ii0g1JvmDU7yhp0ICPJFVouo069eUliyJ9W97qONp9YmJaemeL1pHCA8bp/Rii8t0H7PRD+goRSArA+dSpHA49oze2fqXm+Nh2v63GksfC97D3ulWSgPjk30+++/51qyZMlLLrmEPQmPA7LVmY/Yx062fv16dgs2SIrkAMR2SM6yZcvYlS+44AJcv3yIgStq2c/i4uLWrl27ZcuW2NhY8tmoOK/XrVuXAzSbH9W50gR+KN0VWSax8fzwww/Hjh1DsmbNmohRHXu2b9++bds2qlx++eUYwK5JPrecPukCO2L16tWJEijCBnajpUuXcgQ/77zz6BeeFGGgLekODdWvX1/0II8e2sXsP//8k53ywIED3JYpU6ZChQq1a9euWrUqddGJMexwP/30E2FHnTp1cM2iFg0U2R3JADsWx/rGjRtjz+DBgzt37szIYAa7LAY3atQIy2Unpgm0YS2jSitk2q3Y/UIhrWOqWIsq+XgmAjVq1KAVxoqRRBu1SDPUGEBba0w4yvOsGzZsSCBIPp3FKh5NtWrVkJfRIJ/MhQsXMlCoovvXXXcd
I0nddevWUUq4wANFGFNlJjCFMIYIZt++fRjD0DErCHq4IoPZQHcOHTpE4LJnzx6sZWDZ8sVIxl8wxsuEdukmM0EMrlWrFg3J4Jw6dQoz5IfGU1JSaIu+SJRDRexHOdgTTBQCdiLPhJw/fz5hEJJMsBYtWtAR5g/RBhEGT7xjx470Dg20KAMO7LCuVLZy96ndp+I2nUjYnBi3M84Tm+TzpYVUjy7T7LzSDUMTg+PTfMHJoa7gkoHlXWVCg9yp7lPJyR5fbHDsH74jvx9N+z3Wuys+yBuYFhYQWaVMyVpRgZeHhNZwU8OXEBiz4Yg7JTSiSmRC5Zik0MRSaeUjw6MCSqS6A4NdAYHBKSHegMRkV6p3X8TRVYe8SR5P2ZPhTULDg0uk/BqYeigpMNHlZdMP9jJPjQ3eWOtGaJwW6g2vGOHdcyrJ7Q2sF1UiJDkxPMkdFhoZXFJmFKPEaIMr1ZO481TM+pPG75lfmBpVKzI0LCxmTWzwsVBfSZe7jscV6SoTVC4kMsgT6knxJgbG+E58l5pyMtHjCnDXd0VdHFEiNDJhV0LCH8mpgUkVryznCQ0IdocEh/J8A2jcG5iya+Rf0ZdFlG1zvvHzYYqSOzTOKMLkGGcAzw7XjOdlq2bTwgvLpg4kxMMigwC7FDLsE7gk2bNxwZRKXa7oYbfAbQH5SJKJTq44ekCbfaUuwiC3YFhjIiZRnbpoFpNkM6OUTHn7n7psA7RFKbciTC1UISz5ogR5Mk27jG++oEQ2afYqKmKA6CFBPsIYDJhh2mUYRpdlNGgUM8hEQHpNmqEQ85AR+czQIzvOGDRoUJcuXdBGFdqlCqU0BzQkRtIQtbAfIzFJ8mXboxWKsNwONUhIdUpJow151Ip+oC1KxWa0iYxYK62jDQFyEGbHpSIa5Oc3qCIKaVTUijAJjLT7TiaaBdIiLKAWAXnWVKRTRCRcsUdaRAlXaRSoKyBM0/IQpZvISHOIiQbpF1exSvolcwNJUYiplkbHuMnAkkaAKtgmbXFFjLpYBaiiVOqm+tJSfB63KzAgNSAtyZOSmJySmuRK8oT7wk6FuBJCklMDTgWk+sI90UyT0Ki00LCIIHdYAJGJ8YtaR9NOJfsoTwnwuYJ8gUHGxyLcQRGesJNhx1Lc8VFp53lC3GmhyRFJoa4U98mgw74gV4mA8lGhYe5wjyswxEOuNyLQ500LYDG6ExJTTqQeCkkmXAkPCg13BRu1EpNPHg84EJgSTlxClwL5l74b/feFmF+RSQhNTvUFnRdX0hPhTSuRGu22AikZIgaQThI+xKacQBXxTLA71B0emhbq8qa4A095vL5jwW5WV/nAEkGhPChfkvdUwPG0BG9iQpLHWHHhIUHRQRFBAUHJLmKyNFZacpgvlEUTyiMLcPsCvQEaZyj5wT1kyBArmXeMma34D68rdltMWpyvdOMyvgCv8Um2rDZCcdD4GjwOjhXXA9z+7X3SQUD8OAI4aHIoRYwr+VIRRIxS089bW5RRP/1dD0qRAdLSig2qBNIUIYNCcffcglQnB0SD1OJKWuRB8kGU2PLckom85CPJJifKpZQrAnZ1+5aEmM0tYkAm1VFLvmgwa2QBzcXExIwdO5ZNkdNzgwYNqEsVaYWKohxLxE7yQRSSkFa4IiaZ9tVOIMZDFD2kkbe1kWlrE+ymSYCtX6qQYyo2dNoVUQLIgF1FaokYmSJmNGBia6BIqoNo40qaIq5SSppMKc2A/XClXa4ixpVAwS7lVuyRWMTWaajIBPlSUcSAhCiXuqgiLUVWFZfXHZAa6ErzBaSmeYlRUlxeH4pSAlJ8wWlujzc8NSQsyO0qkRoSHlTSVcIdQtTj45EEEFP4Aum5N4it1vihCR/RfjBPKyE58EhQQEqQK9xLYEAEH+AOYy0ShQR5Q4PpRQkJzwJcQW6vJyAgyBvoCvKmhKR53axtd3JwQERIYElfSJAvlLSHFZ4W7InyGb8CHkIdIhqXK9j4ZoxRIcgV6QryhLiSA4LCCR+CwgLCgq1nbXWQ8Qx0EUkFp7ncAS7Ci8DQoChfRCRhkSswyBccFugLCw4yZ0kS3XGnhacaPwGaEuQLd4UQJrlCA6ICQ5M9IUnJ7oCUwFNed2KQK4IRdQcHMtyEXD5X2vGVJ0PLBUfUiJAvwyhKbtDXM4ooxlPxuPbM2Zm6x349w4cfMT4A7ljiPD7ja33GN/uMEyFeFdfjNpwebscQI5Pzdlqa+RenvLI9I4P3xP8a+RyH0UBzeESjrtvwy0YtQ6vXOD/K9waN79elbzz4cuPE9XcrNlS0WjzdJK6kjSw709RAbTHP7IMXXYawuXeYhrH5puFBTduM5kjTC4/xJzuMLxZikGkzWyKaAsy/aml0x5PKXmL8prPx4rPssuwWtEjHEfMxGsbvFlBEQ4ZqDEzfujL0CNC0c+fOK664IuHUqYEDB3bv3p1aYemHeHkc2IwlIcEhMvheYwDpARh/y8p4IdwcW1s5Oo2OmOObmsJ2a7wown9G11GFTcbWZfyTro0hpVuGPIrJMOTMUABt9jiDUYFtxvwbY8gzHoZmL5qNUaB7SJvTwMgwjCW2IM+Q95oVjAdhfPPAetyGEBs4FlGVAtpPSUk2+sUWyOCxqYeynYotpw2dtGv8mWHUsf+aHzfBaoq4TzU7w5AbFvKMpCkjUDAnhtEJ41mgxB40UUjdFLMuA2I+MZ/RjQAmmCHPiJs9cweHEHAYHbCqY68xZbj4UtI8SZ6k1LTUVB6OzxWSlsyy8gSEBQQRY/hCjT4ZsQKLjvidIDwwhSggLTkt1ZtCdMKTSEMJzyAoIDDYG5Ec7EoLT4hMcwe7ohJDAlAU5EoKYKK5w4Pd5k9SeN3G8Aa5UwK9ob7kII871ROU5DN+3SLcw3ikpYYzm9nWQ4KS0k66k40+GXZZ0DNil4iUaFdQossdl+oOi/RF0gbP23h2iBrr2pBMM75X4nMlG3+nzfg+DCYERAYFkk5L8Yb4fMmB7lA3QSwBD2GtKyjFlRSYEhh4KsgTkOBxe4JcJVyhSQyYOzmSIk/giVBPKXeIKyAE9YGB5iuqu9/cW6JBRNk25fX1DCX3aJxRROGp+LxGnOHZ7a3Rp6YvyGe+6OpzGx/dzeTNjU3T2DVxqaZjNTfLdCkEUCUCYMpwodh4+Ibj/ruukQ/GxpNey9gqEbLr8i9eHMdu3p5uiwWOjxqCOHoEaUtyEDD0iA3pDRkXU8LcsQyMHFOJyBsNm/0yd2SryMg2txLDDCPD1GMELobNUjG9Oaw2VRiCpqxDs6EhWwyFW7dtvfrqZvHxJwYPHtS9R3e2w4iwcEINtIpOwwojKjA24/RWTrPf7q+FZYMREDDE0ncpMQbXlD6tCk8KhRK70LX0vvOfWWyAuFnRaM5UaLzRAEYzhiIxwwj4SJhp+b9RgEK0mk2Yj9tE1HEx/0/K+JNilLHHW8MrAthr63GSPs72OIgmo8QIOtFhlpiqjALjGUnAkkmVjWWnGbmYtckzTEsfBzvHUGU8HIee9DHnP4k10zX4jFGmihlzSmf+7o5ZBTlknYPD/0wRzKZ
184IN5jtZlgIz32zYaNp+9GZlc2V5jVdGRNLIkzZM+52Y9vJwZeh4lIaBRmOEMpmwlYNphIGRZ3bbqGXYajRqCZNtegAx0JBGu8TBPvNVUuMXQgxh4ieXx73znS2RDUtWuLFSwb5CoPy7KFCcUZC6yplJM1/P+MNb45GarOoAj+EJfEHmocwBN+I/SRtugosUOHDKGE7Q9Hm2FookYVb8u7pdSyoK6Y7IkLIlMyN1SUhb/CcqHG2dZqfRkJlw5mdQYiYM7Hyw5a17A9Orno5zZKTM1mw3lx2pXuON/yNHjyYlGb/8zeE+JDTUiDPCwtiX8NtkGsPC0J7ufWlCLDFLs25HZGSEDRvt6+mGpfeH/5ufCLHeQ0P2NLWZ+yiawZKXm9ONsW8QNbYk42rUMrQbDVjlDjHL5nQBA1vMidn23w8DCRGTHMM8q0GjLa7GHuhoKAMIG4JWRcNIIF+sQLMpYJBujCVvYxUbWF0AuRcdYoBgq0jn7ydl3Jxe0cDqmhF/SG1n006kpm2tja0zI5ZGy11LW7lUnnlY7IpmvqFHBETI7oXcuT3G2z1pgcavl3s9gTtGb4lqEH1B64p/a1GUM6FxRlGFoU117Z27M2VfWtU+NVxuH8vdONVYxZlIdxJWApyiGR5UBi123cxkqChka0QhQaP5bsLZfafxojBfmjl5UjE11SMfVmWb59QbGR5BtMHp0AgtpKEsPb/TnuzIYHOWkrYMGGJWk+Y1HbnL0GLOCjMUWXXN/2XoTga1kFlz5hwbuzogk0HS6E76fXYaMmApzKqisy3IWaEhbCo5s1gG7CyHAaamXJGl5N+tONszbcvUWk5kZ4adL9qcaXBWsTOZ+26X1+0xPlOSGrJ71OYSl5U6/6aK+nqGknsKFGcofoTHkuLaNXu792BAlT4Xmes8IM0VHOJzE24oZxPGm+NdqvF+fmpKSrLX6yPOCA8NCw4O5vxLobVHGUmzgqIUHzzGxz4C0kK8nHwi9ozaEl63RLmbK+vnM5Tco3FGEYXH4k3z7f92b+pOT/WHqvuMT627vOYL885Th3IWkNeQveAx31Q3P4NpfoTSeJ1dZNLRh6MUM7zyVTe3JyDAE7Djgx0RDaIq3ljR8f6SopwBfd+kiMLIpnq9R2bt2/f5/rDmkW5voNsb4OEMYfx5ax32s0r6e9UEEYQRRiTBf8SBknk6GmcoxQu8vPEaqtv4Y2uhrsTlsRd3v7h8m/I605Xco69nFFGMF+p9aa4/0hLWJKRGeDhJGL+gEegKyHJ3U/yKuFTzSphn3nDAs8OPdHQlKcUQpnWA8bHjAF9qsDck2V2yccmgKsH6+Qwl92icUUQxn4rxcqUeGxRFKXKoY1JyjcYZiqIoiqL4C33xS1EURVEUf6FxhqIoiqIo/kLjDEVRFEVR/IV+r1VRFEVRFH+hnwNVFEVRFMVf6PsmiqIoiqL4C33fRFEURVEUf6GvZyiKoiiK4i/08xmKoiiKovgLfT1DURRFURR/oZ/PUBRFURTFX+j7JoqiKIqi+At930RRFEVRFH+h75soiqIoiuIv9H0TRVEURVH8hb5voiiKoiiKv9D3TRRFURRF8Rf6eoaiKIqiKP5CP5+hKIqiKIq/0NczFEVRFEXxF/r5DEVRFEVR/IW+b6IoiqIoir/Q900URVEURfEX+r6JoiiKoij+Qt83URRFURTFX+j7JoqiKIqi+At930RRFEVRFH+hr2coiqIoiuIv9PMZiqIoiqL4C33fRFEURVEUf6HvmyiKoiiK4i/0fRNFURRFUfyFvp6hKIqiKIq/0DhDURRFURR/oXGGoiiKoij+QuMMRVEURVH8hcYZiqIoiqL4C/39DEVRFEVR/IV+r7V4kt1jDQgIsFL/ApyDIB0np7BGIOeFU4zHOYepJUX+63uWD1Ru4SyMec4PHQpuQ5Y9OsvdzI5zPv7KPxR936TY4vV68QJpaWkejyc1NXXdunWHDx+2yv4d/PDDD4mJiSQYCsHpFguIKGR45UpDq1evTkhIoAlyCrGhIgX9EphLmzZtYmqB9JdB4GrJ+YeYmJi33nprz549dkMkvvjii6+//prWQTL9AQ3Jgz5y5MioUaMOHDhAmhxgcU2ZMmXu3LmmCQWygVYWLFjw8ccfp6SkWFku19GjR0ePHr1t2zba8vcI54CMwE8//fT222/z0K1clys2NnbMmDHr16+37hUlEwWKM5h5StHEfkBcjx8/PmTIkLFjxwYFBUnpv4SlS5c+9thju3btYhDMHcErY1IoGONrnuFI00S/fv3mzZsnIyyZxRK6LMMYGBjIfjN8+PBTp05xS5H0WtJ+gj1+wIABxBnOVpjYM2bMIIFJkuMPpEW3202E0b9//3379tGcjAOxBZHB559/zq2I5Ruqz58/n4EldrFV0esXX3xx+/btjPA5nFpm51wrVqwYNmwY5lm5Ph9xxogRI4gzrHtFyYS+nlFsYT/gAXMMuvvuu0uVKvXmm29ytcr+HTzzzDNXXXVV586dFy9ezK1sDIUFHp8rg7xs2bKuXbu2aNHi+eefDw4OJp9MKS2W0DV21tKlS7MdsgUyvDt27JD9T66WnH9AP607h5c0A04E4O+mJY6RhiTYkqZJyLVQkK5JW5IjCa7OXp8TMECw7k0YEK6Fu7iUYkaBJoc16ZSiBP5IIL1ly5ZOnTq1atWqb9++YWFh/7ZHxq5///33P/zwwz179vzmm28YE4kACgUGE20LFizo0aNHv379unTpgsMVnytXS67YwY4SFBREHyMiIkaOHFm1alX6vnnzZross85/iH7n8EqjXO0cPyFNgMwiBoFMZ38psvdaKyvvUFe6w1WaEMjh1tncOcHoW6anLGuqEFeWUvzQILRYIU5KEgcOHGB/rVGjBrsgOy6Zth/8l2DM78DA7t27t23b9vHHH1++fHkhjgCO9bfffuvTp89tt93WsWNHK9ccZFla1n2xQ3oHpENDQ4cOHUrY8cADDxw8eJAxERk/IY1mxrbnLOBsyPmsndd8Q3XRmeVELaDygoMBgnWfbjCJLA1WFKFAk4PNTClq8FzS0tKSk5MHDhz466+/vvjii+Hh4eIdLIl/DTJF2QWff/55uk+8FRMTI0UFJy4u7sknn/R4PAMGDAgJCWGLtQr+TTDCFSpUINQg5Bo8eLB8CtIqK76o61OUPFGgOEN2r7MJjWK0tJ5LrJp5QWpJdZvs8iGzSYaW9Peq5So5/oa2wO12T58+fdq0aS1btmzSpAlHDTItidNf7BVMkw3sNDKSAJEpRCy9uUaMsSrnBWoxFHT/wgsv7NOnz4YNG4YNG2Z3zRLKI1IXRo4cuXLlyvvuu69cuXLk5+Y8JxrAujeRW/taWJi6z0yWkqLhjNiSrVu3bt68+dSpU2fOnOnXmZMPbEuc9kimE8mU6xlnSA5F5wSxVrCyHJnONGS4zRJTQSFjqU7HOU/AKSDyTqyCTFjF6QLOhKTtHCdSCtb96WSXL0hFJa/8k17sYmpyYBLs4yMJO0eQtPmFOw9pZKz6ucZWa+qzsNWSEBmnpDQnt0DCKc
/VVOx3eKJcT548+fLLL6ekpPznP/+RHHZBSWCS2IOFgmmvlUlCrgKl5Bt6CxUZGdEv1ywTiNlXMq3KeUGCDLl26NChZMmSn3322a+//ioGWEJ5wbTLYPv27WPHjg0LC7vjjjvQb4Pm1NRU5/Da0KLdqH2LvECaTCktIKIcnfIo7Ryu0hDYT1lMRYYiEtyaOnIF/ZXhDQoK6tKlS3Jy8htvvBEfH48ewZI7p4glPBQ6K2mQvoOkZaxIyyAAt5JpaSnaiLW28YLYL72THOmv3HLFP3CVTEAYpIhMS3XhgWZpiytIK4I0CpLgKlVIWxJZYYsBaTB1WFVISI5I2lerQvqEN0QdVtmQI5lcRV4GR9JKXvnnvW9itW3CPJCEfZqU3RQx2VmlSj4wJ5uBqCWHeUlCdEompTJZSeNqaZEcbpFhUnIFciRB/llAGmI3ZSMsXbp0s2bNsM1u3bTaIMOtGEkCs+mF9BQB8kGECxGa4Gq3K5mGHWajcgUZTwS4SpWCcMkll9SsWfPYsWPvvfeerdYqyzXYJgZ/8MEHsbGxF198MWptg40OmDPBvqUJIOGcPECCW5ERRExKC4hoE4UgTUuCYSQhrXOLjAwySN38jTMKW7ZsGRkZSQz39ddfow3ldmfPOfYI2Iht1k36LWbTfdIgmdmNBqVWqmiAqVzNrvyN5EvCvkqCnlJEgCg5QA5XMmU3tedDoSPNkSDyi4uLO3z4cGJiokSBtC6PQATATvMEET506NCpU6dk3ppqLDGwb7nShRMnThw9ehTN5CC8Z88euwqYNSzl8sTRifyaNWvWrl27b98+hEVeBpC0v4el2JPFBC2yMBsmTpz4ww8/kBCzWSrnn39++/btr7nmGnK4PXLkyJgxY5grROt4/EaNGj344IPyKcjcc/Lkyffff//AgQMysVDLhLvtttuuvfbadevWTZs2TX4zgFKmKa1ccMEFffr0CQ8PR3jBggWLFy9m5SBA6b333tugQQMRM3X7F0zCNpw+a6ZFixZz5szBKvJlUWEeQ3fw4MG3336bbRJhyWR8LrzwwscffzwiImL8+PG//PIL9tNlzuv33HPPddddZ6guPI4fP/7yyy+LASD+pUqVKn379i1TpsykSZNWrFjBwiYTA3i4bdu2FSOt+nkHVXT8qaee4rESfqGfEIExkWHJPZgBe/fuZYRxXr169froo4/IEZdNwuiP18v8GTx4MN0kk96Rzx580UUXMUnOO+88MjHm999/p6f79+/nlup0s1OnTgXpow3NzZo1a/bs2QkJCbQusEyuv/76m2++mS6LkZMnT169ejWmUgWr/vvf/1auXFn6gryoyiXUoq1WrVox65o0abJs2TJaQU9eh/eMbNmypV69eqwv5rYYyUi2adOmUqVKeAZuM1suvoI1u2jRInpt5bpcTANGAz3GJAgMZMNj5u/cuVPkaYXlwOOQJwtkynXTpk3169f/8ccfr7zySinC1dx0001Vq1ZFA7fGcOdxAJ08/fTT8+bN+/nnn2XlwubNm5s2bTplyhQMzlIzNrNNvv766yxbsROo3q1bt9q1ay9cuJARkx/+ohQNISEhxMc9evQQyenTp/PgWHGk6XL//v2ZLXbHnVB99OjRw4YNY97a5rEcOM+wort37y45WYKRNM2VQSbWx08SOmAwOawm1j5KJEa3ZyBFKCemR5gHJAHE5ZdfTkN33XVXiRIlbKdKLR4uAjNnznzrrbfYBXAddLlOnTp169b97rvvGNKoqCgeNJpFuQwUVXDpDB3Tg3YrVqxIJs6B5cDzRT82DBw4kMdN/v3339+zZ08SSp5huP8pMJOYdh9++CHLQOYKc4gjVHx8PLOBUq7MrW3btrHAmDSDBg3atWsXmXZ1W4yrJICpJogYoITA1n7TgdnGlhATE0MV2vrpp5/YEtDPrGVZfvbZZ7t376a6qEXsm2++KVmyJL4b18DCMJvKIhAmkyqy0sDKPRPUEmuzrEI+mygrCrN79+4tyjO0TiCycePGF154AeOlgzVq1GDbS05Opjojxp5HPst+/vz5x44ds6oVHgwvzvq1115j95WVT3iBSUlJSRhAGNS1a1eGF2/O4Rj/Yg5PbscnS0TD//73P9oCppCMjFXsQCSx0Bji9JejnQNIRbY0MRuPJjJWmfl0AJnvv/+eqBQxhpcrvpuwlXxbjI0Z5169enV6io/GcaNKSgsIyv/66y9cOZOQpoEAgr0fty594Upb7EzDhw+nVBwxT19KwVKUa6jCiPHU6CzbD3FqhmGxQVJaz30rSIo2YMdluLDWrk4R8Q1LlRw704k0x/COGzeOUI+nRpdJrFq1igERS5Dh0fzxxx/9+vVDPxHGjh07pFFLi2mGSP722290k41HcoB5SyzO5kcpkGPVcYA2hkgESIvyLCXZ4XgiLFLr3udjbfIov/322yzlgXzkecSEO+KX2IPZm3FWNMSDZjYSqWA2sJXioBgQagmsuLFjxzIy7K+sOLqDbZbq00F41KhRhGhO8wi4OWj93//9n3WfDdJx4gACshtuuIEjEE8TX8rQ4WDFWzIJZWGK8JIlS4iT8ANTp07dunUrI//iiy/SOo6rXbt27AUiJvLACRNL3nnnnT///JMlgA8nHGE0aJGhkM6CCFORniJctmxZFOIt5ReTceZMlUsvvZQBR4aQhcfB4BCcycePlHxQoDjDem5nER48y/WVV15h9vDsma/kWGUmlDJTn3zyyQYNGrAYnKX29OLKFCcKYT6JvBRZcmYrZDLD2IBlhjHLrTJTD3MRf0oRGyQbhngQq9j08iwGOV5YWdkg3TEb/7t6zoj9crWyHJDJpoVhOJSRI0fmIJmYmMj5XrwS+5DsgsCCxHHXqlWLcI1bqlsVCg908mhY5CNGjMBUwC0ypNgJGHbHHXdUqVKFyMO23E7kD6rTKPsT/aU5Aik8mlV2OjzNCRMmMDfsH9V2Ni057GoML84RT8QQZWkbYhs2bChXrpxMVI5T5FhlJjx3YjjOza1bt7ZfOi4sMInhfe6553CgGHDFFVdk8LNmPzwETMztr776KoNt+QANRC3mRuZ69dVXs1NI0/SUvue+RSQFKrLj0p2lS5faY04mp+H77ruPHDszA1JEo+xDcgJu3rx55jFHZsiQIZdddtmhQ4dI0yJXqyxdCZkcbNBAmCI5gKoWLVpwtjHNzHq5Md94BDgcDseE8jmMAKuSTY55aN17vUxFFgjBQZaagXxpd+fOnUQkmFehQoXY2FjTHCOfK2EHz5qpyPmB1WfVNKGU0DAsLOyLL76QIIMcq+x0KCK2LlWqlNM84rNKlSp98skn1n020GXCHcy76qqr8DbcysGG8IhZCoQIEt7xXGiIqOLCCy8kKsI2hGXaMNScFjifyCo+fvw4+UAtdBIx9OzZky5wizBK8OGE+4SVEmTbYA8yzH8GFlV33303B0JphXw0EFIQ3+CUGHw5tuHSt2zZIn1R8kohv7Dpb5gibBU9evSQF7jwO0SgUgT0h1Im38KFC+VlT6vAhPmEAFcC/DZt2rAfP/jgg2+++absB
JaQCbdkEkM8+uij3DLzFi1aRFxll7ILsheSQ1usBxqVIoHlxFJk7mZQmxkRoFNczcjtzNAFSZgKMoIqPCCl2M8qtXLTkYpAmsEZPHjw1VdfjUKWKKGbmDF+/HhG9eOPP+aojRKzXuGDu8cAXGrbtm25xW2xKeJ3sG3GjBkEBBzHL7nkEnsAzziSZwQN9EimxJo1a3D6Mg5gDomR5sr55oEHHmDLxIux32RolwHBfXNOIh0REcEI5zBEHMWef/55dKJk+fLlVm46VMRtsTH06tWLDYAcsaFQkGVCtM1RDLU8X/pilTkGk+gHATZduS0I6LSf1w8//MCSya47ImNP4zOCPFfkpXo+oDpXpty9994rj4wQkDERzSID7C4csgki5eVSwSorMLRO+MWh+bXXXuvYsSOzqBCVy+DQF3rXqVMnOsjj5nROJq3IUBMJ8azJ2b59u7xIadRMH5z58+cTYLEYWSDIF6JtNqw43CkTg0VBQECOuM2qVasSFbHBczxjYdI69hMJ4RD2799PBEksTo7xMMwjHw+RtUnO3Llz33//fekdRStXroyJicF+1IoS8qOjownsCKFMEwxEngXC6nv66acJL4hahg4dKkEYUIoGhqJRo0YETyx2vBMV8R7VqlUTJUpeKdBGIg/mrEGLchzBEbDNk8NyIsw3bbGMYZbIn7PimCg5ziK55UTCnCZBDtE0V3uyClKFHMIR2iJBeMtKIN8GA2TTIiiWHdpm+vTp1113nURCojBLKEUzEQx+h5WTSwjnzUYMLEUOTp48ydZFAtuwPEsZG2J5vB5xOt3nEL9ixQqCdwKOAQMGcOagLublrCHfiBfAyDfeeAMvQyuMMF7+zz//xC/gjlnnIiPyBTSD6vQxPDxc5g+xKbGmrVwQGeYDT5Om95iQQz6IDFVwlxhJDsbLQUeKsuT222+vXLkySpgknKhkPLlKYsqUKQSsrVq1Ik2LOavKE6jCV5533nldu3bllv5igN26dPDgwYPMvVtvvZUJQKZUzB9Gf3w++/1vToGcO6UoM0iyPGnXmtBnokOHDhLuUzEfQ0QtGVsS7CjdunUjwe7CIpVSiqSUjXnv3r233HKL3FJKwtSRkezyc4AxX7p0KQk22l27drFIjVYL6YnzrCVBT9mYSaCZDjLxpCNcmfldunQhn5n/1VdfiYxAgDVt2rS77rqL0BlJlIChrsBYDZhMnjyZrR2vKO8nSpCBDDEHj5i1yUq5+OKLycEGjmqLFy9GpmXLlnbvpEpoaCgnAcICeoczxJOTTxUSXAmYCKREEgFo3LgxIYJhgflYaZor+aNHj2YoUIuvpl3GhzQyYhurm4hz9uzZhC9SVz4+Yhqi5Blj6PONPMhzAo9f3n5mluD6WcZkGq+gpabec889TMSU9LfYRR6QoZTrkSNHCKJlLRHJkoOYaMgAKxC/gySLYePGjVQnE2ESbIrMRTQQ5MoLraKHc1L58uUXLFjgbDo7qMIGgH6ZyrnhiSeeQDNQ19LigC1Q4htiiJ9//llMssoyQREdHDJkiKzk66+/nqMtQY89nmeB5OTk9957T0YSj8C+wsGL9S9PyhIqMKgCOmu/xsNhyCo7HQIvOdngm+w3gK0yc8TsVyaYEpy3ELDKTod8usAkfOyxx8RzvfPOO1Snv+QDZy/iAE5UVgX/wFSX926YFcxSDJAXh+Gtt95iBf3++++knX3MN8T38oGbSpUqsY9auacjwyKvJcgw5oYGDRrIg8jf+yY29H3z5s2lSpWidXTar6WjhKKHHnqIwwMRkiV9OiKJDfl734TMESNGyEK79NJLGQEkrbLTycf7Jk6YV5dddhmt4OJI0zWgItdRo0ZhOUWcIrjFAK5UYVPnTLJjxw7RkAPoydP7JsgLDJG8ctmpUydurWJzQXGLNkIQVpM4bax68MEHsZPptG3bNgScY8VtbGzsJZdcwkyAzz//XJR88cUXDC/Ur1+f1c1CQxv5XHlY8h6NQCZBCTbLaLDkLdWnw5mNScLCQYaGZs2aZRUoeadAr2ecQxo2bNikSRM6sHbtWnmFkLiH6759+5YsWcKaZ2YwPyRT4FYSrBMC1WeeeWb8+PHPPfccSuyiDLDlyNsf7Hxff/01OkUhOR988AEenDTLg6jCjrqmTp1KnHHNNdeQdraeGamCrydUat269Q25g7CG1sHScjqnTp3Ce5Ig9GaVksjZBjb4J5988uqrr0YhOyhhCrs+FbPTX+jgFzhz33zzzdjJo8QjcEbhAclQ52x8XmFMUEuC3uGCJTMDffv2HTNmzIABA4gj8UQZbKC6nIFIc3yX98KzQ8SIC4lIcG3ykgYaxAb04/LuvPNOU9ZflC5dunfv3iQIMmbMmEECq5h1hFz4TWad+OuCP246GBYWJsPFpiJvgVllp0NbSBLUWhP6TLRq1YrFfsbVlBvoKctHXkAiZFmxYgUJUXv48GF8Qo8ePVgRBW8oM9j/6KOP4jSILL/99lt548AfEJHceuut9BRPyHYrfaF1ds1x48aVKVOGnA0bNsgXKBBjEk6fPr1p06bEJaaCwsSeV8ePHyeS4FZ2dyfYhputUaMGblOWBqYuW7aMBM9CXuzMMD9xUDh/OkJdHiKlJJhR4pAJyzp06NC5c2ecCfOcnEaNGolmJAWe/rFjx8hBSe3atUlkJjQ09Nprr2XhIMNiJ00rVpmSVxjEfyI88k8//ZRdiknTp08f/BqejkD4lVdeYdc8ceIEtywhxECqkCOZcpXY2b7lKmIZwAFVqVKFaXrllVeyT8g5YM2aNSyACRMmsCMyhu3atWODp4jAnOk+ePBgZCSatrRkhdMMsTM3iAGQpXLWjxzH2WBIIwPUsoodSJE0vXLlSs7WDCbBO7VoIssqhQ6tYAAjsHXrVvEyeBD5ih22FaIZdmdxZzLtJ0+enPOw2DaQsCVJ4JSpzgjjl9lQkclOj2hgZjI/CXFwmoQaKCQT98cez4Ynp2erjh9A+fbt28VZ16lThyAJq7CBIyxDzRFQOghWhXxBj4Ddi02OhlgU+HpyrGIHZNKigG25gSpyNqUKk5NZmqfvm9hQijDaODCwhfAECfLsYy4RQK1atWQVc2vVcSD6qZ6/75uQL71AufSFHKvsdPLxfRMnqF29ejWPgA7ed999NCo94lkTY73//vvy2uFTTz1l2JqWxj5KrMnpCOXcWlqyAZk8fd9EdAKTMCIiggV+//33O1shLTPBeAYm5BAhRUdHYyS9lo+SZKiCGKcjBHgQ9957r1REz+LFi+UYxiShLVrklMhoyAhQ0QZr7chj3rx5lmoHYtKkSZNki7npppt4alS0ipU8Yoz1P5RbbrlFomMcBw6UzrCoPv/8c5yOfBnEnkmmuBG8S6Zc5XQrtzKZRCwDLCo5AOFfNm7cSCukOXNfc801d9xxxxVXXEFFnI78aezNmzfjFPBfxsiarecMVVgDR/ICfRRTszSYRumLGAl2IjNShDyJ6tWrs6RZWn/99dfzzz9PE2TKggSRd0KpvW6trAJAR/BTcsLDf+EBCRPRn90AYifQtCSEHHoq
IENDOFm5zXL0gHwQP2UnrDKzVKYKzTmvVrEDaokGphm+lRM8PRozZow4aOYS4VTXrl0JCq0Kp4MMUAWzrax8QfULL7xQ/swbU3TWrFmSKa+63XjjjdJBMMUNaBcBG7EErOKssIeCzlpZ2SB6kGT/sCb0mWBpx8fHi5FUFD35g+r0qHnz5oRc3BKv7N69mwSBAtswj4P4g1vnaBQW6GTuMT4MuFwL2JcsMR+Ur379+vXq1UM/0eSBAwdoGifz8ccfd+vWjVO+fIfum2++YWypIh9IatmyJVd/mCQwkyWeIIbgVuwEWpQ5I02Tgwxzg02dW0zi0ZND2gZJkAMDCTRzlblBtEc0ds8998gtj3XGjBlMfvldE5nMpo6/ISeHx00QJq136NAhBzHljBRo7IyZcu7g8CQ/NcPuiN9kYi1fvpzFc/vtt4t5giVtYmVlQopEJgOole8xsvvijJhtu3btmjt3LtshmwRzmqK4uDiOuQjjxxs2bFizZk1DqbkMREmWUMokRudFF11UOdfYf60KLEUOcJRE8RSxqk+ePCnrLUtJOx/J/v37o1nePZkzZ44cTUgLpvhpUJF8jM+yNPegR2wYNGgQXrhNmzbcrlix4t1330U5SGkGyOcqRUb9dMzCrMFOcROJiYkijKPnahU7kFI7YSOlNC3HMnIY3uPHj5OT3SBIRUoJRm+44QYSxBZyFv/qq6+ioqLwgNn1ESiShqz7fEF1CXQ44OKR2WwwGz/ONsOu43zfx6pgtoupzhzIcJsZqqBftgfMzs5yxIDQgTlPcJlLbr75ZrRhmKUlX4g9tE6vCSmYAPgKDqzkE/Zt2bJF1rKI5UxuZDJjK5cESH4hIs+OpUQH0S9v6ZL44YcfNmzY0LNnz3LlyskHXXfu3EkRz2vKlClt27aVdyoL3SQUCvglzn4kGGd5iULyQdKYTSLB/GZHxYoV5T04zNuzZ48pldEw+ciqCJOQUweJWrVqTZgwYfHixcQcPGLm5OHDh//73//Onj0bYVoRVaw+0nLrfG3GhnwU/vTTT6SJyGUGgpQqecWYW/lGHtU5wTA9MPDuu+9mwjEhPv/8c+bo5MmTOaI5v5ZWKDRr1kxCB/ZgNipWJmfExo0bM487dep0nvkLjwTOTOgvv/wSbyXvXIAs++wwh9AYQwn2c4nE71TMUnlkZGTp0qUpZYnKomWgpJUMSBHa5Pc3P/jggzfffBN3QxPDhw/HK6FfZKwKDshHLAfluYS6tEJwRpT2/vvvv/baa4Q75IwaNWr16tWUZqnc3MWM+Akb6CbyZzRDSnFGdohWpUoVKcoTtFutWjU5gdEuByZySFvFWSGlTzzxBLOF1gmhYmJiiDNuvfVWNjwJd0TSCZIy/lmW5pUGDRpIoLN27dolS5bMnz//xIkT7dq1k9LMyHgytowwacEqywoZBBHmln7J6/aZEVWAMHMvl8hko3pBRoO6IDPntttuk1fmWbY8DmL9a6+9likh+uWaA2cUOFfYs5GHyzGMBCsLl4VjZAJUqFCBkbd9JpPw4MGDP/74I6MhU9of/aJF1DIl5AOVhw4dIuiRfDBFDEgTtY8cORLDCDLknT5Wwe+//57BKsNK82wpiSuvvJJMOk7UiAZyuOXIhFd57733CGfRTGz9+uuvUyQtkmBFYJLcMgG4ZgAZ1gjrhTRDR7DrtFbJKwWKM84hmM6Dr1GjBoEFk+zXX3+dNm3a8uXL7R/TLRRoAgjGWZzc7tixg3VLTPPII49gADkEGZwPkGE9sE8TGrds2ZJ8KZVlnwMIXH755WPGjPko18gXeqkr1wwQZ8gSxTuz8Fio2CZFYPbGuqU6bNq06aWXXho8eDDnACKnvn37UoQvePbZZ9Fgb+ECt5KzdOnSLl26yFtFVpmJqd5AJIG0VebAluH6xx9/DBw4kHabNGkiPziBztjY2Oeee05eexAxrlIXuI2Li6MWewNepk+fPps3byZTOguWnAPJj4+Plw/J8kDz96k3zBCPSZrND0/kNCxL6A4yOD4OWKRxskOHDj1w4AAnzgyjhxhg57Zt2/r16/fpp59y65SRXoAtKVdJWEKnQ3VKOeBypKPX2MwU+vDDD7GnXr16di3RAKYyH+PJ2Zcn0qJFC8aZcZN8Wz4DFHFl2hCakOD8xyiZJRlhwgOlxLUyn8/I2LFjMZ5aGYYrH6BEEmw/7du3J7F9+3ZOul9//TXzWd5To5WCN3SusI2vVKnS9ddfT2LVqlVMuQULFsiHhMipX79+nTp1EFu2bBnBPZJXXXWV1Cp0mBimRcZX6PEw3DIDeaAcgYz5ZGJPPIKh8PBwIm/CIBYLD4v8RYsWsa7tpU2OaN67dy9qmWnyiXsggJZPWshTRhVz+LPPPitn/jllnJW8T4QkMsSUl156Kbek8YGimavkCLj0/fv3U5eYRt5KNtuxPsWCpJIHzCHNJzLu5wQeNhshV8ILcRCEnM2aNcP1k2kJFQay3a5bt44jmrRCLCzv24kNnA7FAMLwbt26ccY1650bMOmZZ55hbWDP6NGjuWVhZzkgZLLw2Kp79eqFzSJGaM+6pTprddSoUWSCVcGswpocNmxY6dKl6SxjYhWkgwALktPDxo0b169fT0JGLwNkopZG5VTduXNn+SYt+Tgg+SIxDBo0CBkygVKrsvnT71QhKOnUqRPPgsGvVq0axiCDWoQtOQeifOfOnfJ5iIYNG0qIkA/o4M3mXwmhXfn2stO2LBEZJio+FHD3rVq1YnDEKkvINPLYsWM8NTwg+jmBSaaUgvSOvZxA5Oeff0ZYqiPjFMsS6rZu3ZqmxfIpU6Y4LRcNyABbb6NGjQg6b731VmY7BnOeY6qIPEiVzEyYMEEm3sMPP+xUXojg+rE/399rtUHy+++/l2XLRkusT2hrlWWD6Kdf+ftea+4p4PdabeRHMnh8PESCKo5AjBWTBwsJ8ugCI8mxRF5CyCUYkO/fAx03bhzTD1iDb7zxBoMmL5hhD1eCoYsvvpie0gQ5nBgJRjEeV/PTTz+RI0qkFLeAm8L+hx56iPVIDnz88ce4BekmYoB+SpFB8vzzzz948KDkiwD2oJ8hwhuIMaJHRokrFSll6DiTfGF+YhqIzNBDQuxRcsk/9fUMINDhStguX0zijIi7IRY2CwsThql69eo4I9LEthx95L0SYCLikeVjZcxmDGAhSdE5AXtweeJA2TAkHJQiwc5hpb300ktsty+++CI2U5H86OhoTttRUVGk2ed++eUX8qUikInP6t+/v/1upRNKGSgWKoEOJ6TmzZs/8cQT7KZWsQPRyXARymzduvXll1/m8CFWceaWOIb0O++8g7tBJ0VSKpBJYMF1+vTpbDkPPvjgvn375Evw6HQanAEGhF6TYIjk3eK8ghn4Jjkp0lP715ZygCpiErU4SkrOXXfdRaBmlv8N+cze7t27P/fcc3Z/M3SHfFweTpaQGg8oj5j8HHotYDYDJenKlSu3adOGilIXqE5a3C76CRQY2C+//JLNmLa+++67qVOnimR2UJ3oh4Q9PkUZukksJR/iZkWzPzHnrbLiAkc
Idm4mPDHivffei0/gGbFAKGLm4MG4ZeETTYq8v6EhXCWLlE36BZMNGzYcPnx4y5YtxDp33HFH27Zta9asKXOySZMm2EwaB/Xee+/JywnA8wJCPepWrFjx0UcfZb6RTxX6QtxPvEIfJQdJMnFZNFqvXj1x2hSJwD333MNzR4ZD0bfffkuREybz3LlzqUVgwbIl+CNumz17NhESUQ6NWnJKLmHE/4kwdcAMQD1Dhgxh6jDt5K+mCZZcwaAJVJmBbCrTnVbw0ewu0i5FXAl+2aoZybp168al/2Umq/5Zh6YJt+XD2Lh7jh0Y6bRHbrGZ8J9dbdCgQdILrkA3qYKHYvnR2aZNm/75559WTfM1T4n0iSTYp9evX28VpEMRj4AdlHUIHFzGjBljlTmgOSQ/++wzTmkcGkiTY0MTt912G60DBzsWPALYZlX2+dDJ3oCkGMxx6qKLLqpVq5aIOSVtJJ+jGP0ilFm8eLHkWMW5RmqtWbNGXtzq168fjeb8uCkVU7lyjqR1ptDevXuNrprY1UnIUMybN4/Rw6NRS4qAUrwtApdddhl7Bh1B5s4772S4KAJLLhtQe/z4cdw3Zvft25dblDtrGR0zX47C6XO8MywzH9OiRYtw1kQe0goyVoXTQVK8NkdbNg/Rb5UVHpvz/nfUsgTbqCjzgXMzc4zbnKuLfirm73utuaeA32u1YWIwPzGVCRMbG8sDAroJPGUiXfouX9e0KuQCDMjT91qdMDIrV64sU6aMLG3mMIvowgsvPP/887klbt6/f79YiCRWcUtcQhFuhMMPNtM6Rax3zjDEhTNmzEBM5LmOHz8eYY588lKT5NNxvBnO6uuvvybHhiK0YTlHDmrhPXis5KMQG44dO0bQg208ZTLHjh2LDQK7jHw53OqVkjsK9HqGpeNcIAYwRaBr167Eqh07dmS/YfEIllzBoAlUGXum+Ze75YPHzD9pVxoC9kVWCyG5fMCKWlb9sw5Nly1bFktIsIrk1WCrzIRbNuknn3ySWIHV8vnnn7/99tssLfKBHeLll1/eunUrXgDWrVvXoUMH3DFLi5UpvQa6jKosu0kteTkEqDJz5kz7LGJDKPPss88+8cQTFHFoePPNN+VlWBTGxMS89tprP//8sxjAeZ1jEDsuhlmVfb4HHnhAfuMSea7R0dE4PkYeq8gBS84ByjFGPlt6+eWXy3YrvcgTUoVtQEIxIq3MvcuAaZHxkhjyt9xyC9Z26tSpQoUK5kAaUGpLciu9sCtKEZAWYTy1aCOT05szEMwB6hLVEZfgcxlAK9cBCoEx57kQDBmWmcZUq1aNoSNHHhBYFRxQxBPEU5NmGcrLUVlKFh3o2u23347HYEepXr06t7k3uIh3TaBHPAvWBaG8fL1CninwlPGZhI+9evXi1qrgf1h3X375pXznFp8THx+/b98+Zg5rfPLkyWzhuFnsoRTbuMV7UETO8OHDieQIdnv37s3xiTh42rRpFCEm8jwR6nLCITrBZTGHp06dSuhzzz337N69m5NJu3btkEFYoAr2EB5Nnz6daOzQoUM33njjgw8++O677w4cOJDIFc2zZ8++4oorqMWCxXK8JfMcMXnpS3qk5JICxRlFhKpVq95///3dunWTCecnmJT33Xff448/7myFEeR68cUXE2rI7s4kNkvODfJQWY3sZ+xAO3bsINNpMGn2DGImIoxvvvmGddWgQQPypSIBO2eFiRMnssJh1qxZI0aM4DzEqrO3mRyg7+ygxCVdunQZNmwYHoEztDP2FzG0sZI//fRTmvjoo4/EAOrSBB6QJc0BgiKOIEAYxGp3jirVJYFCqgCnEBTSNWdPneCDCLnWrFlDXeZJ/t40sWGU8N2o2rRpE37N7leWUGobtnPnTuKSu+++G3ukNDuMwUrfzyQHxD+++uqrHN/xpPhHfDTO2irOEfTwIAgcmzZtKr+gIFjFDuzhFTCYCUNcJQ/Iyj0d9GzZsoWDJi6Yuee0uciCkaxoQi4x+B9hc57goTRq1Ojee+/N8D1/oVmzZvhMNu+z1nEawiTanT9//gcffEAEwJ5N6PCZCYc3S84BzoT4g1Cgf//+HJ+YYJGRkZxDFixYQF3n7GVm4jdwaAsXLsRlMdUnTZpEoIBbWLp0KetF1o4l7YCY+JVXXvnll19QSxMsZw6T77//Po3Wrl2bWphdqlSpCRMmvPTSS4Q79o/HW/WVXGKssH8yHLbk1TOuzDYr1w/QEK0QhpOwstLfAhADkpKSJOFXM3KGpsUkNiEWCbuR0x7SGE8pSHfkFrMlQQ6Y5UY6cw5iaBswYABb9a+//ipqBVHIILAzSRUcGT5ObsE2A0laFIXOpo020lshIQptpK4TFCK2evVqTqUbN24UMWpZxQ7IxLuxX1599dUxMTHSum1P7qGKtIIGzsGMMHFSznrsKrTIrnb55ZdzGsvSSEAYMBXNb7zxhtzaRbYerhjAka5KlSp4Rm5tsexAgCinTJkyxHD28GZnhkAVBEaPHs3eIPEit1k2RKb8lRyicPs9lzOalA8K8X0ToK7MBK6Qc3XRT61/yvsm0i/6KCZlMExKuYKVlQvQUJD3TTBG2pWE8xYsOQd2FRGmdUlIJqWWnNkdyScTRNLOtLsvV6kiaRkEuyLOys7hSo6pwMLQaEK+rUfJDf/41zMILSXq5Gpl+Q3aws05G5IcyeQgCOc81MUS7HnmmWcuu+wygvrDhw+zJGRVUGobTEJMJS1HWDtTNNhXG5HPDkpZfqgSyRMnTqxZs6Z9+/bB5ocJBFsSzaLc2bTkiJjkCEbbjjG3uwMs/g8//PCRRx6Rw4fIW3LpMTRW4Ts++eQTcjiRREdHZ1CYe9AmrWDeyy+/zAGIkxk7q5gElly6kUAVuWWbJ4Do06dPSDa/AXpGpGsoRBtmLF68uHXr1pz5nF0GEXC6QhLkc2pkh7j55pvFfnIyVBREXq6EF1OmTOHAJz/GYEmkIzI0FBcXx4GvevXq8u1ZirLUXHTAPIyXcQBj3Z7Tj2/7A+kdV5DOWgUmzB9KjWXgf7dpgw00x1BztVuXW7CETod84wmZHZFbwc4R5FZy5EoOYqS52pmGRHotEgwLrYuk5Ni3QAIBSQOl5Ai2EiWXFGiS8RjOOZjBJODZc/WrScwtmV7WvYnkyFUMyCxzNqFpoWLFiq+//jrD8tprr7ETUCSbjROjP+kLRhKSA5JpY2dKK1lCKc2RQJIWOehzyrnV/DS7VBcxkByugqTtq2CWWLeStiqb1bnSHVi2bBlHq4fMr66RL8IiBpghfSfiYY/v37+//a3dDJK5RGpxhfr16xNq7Nq1a+LEiYQ7GGMJmdhGSoIr2zynJQ7fpCUzT0gVaYXW6TjId5ghg0K5pe/Ic8U8wk1OnLfccov8rSnIXMuJ1OUhduzYsVmzZjJimauI2Pjx42NjY99++235qat8D+9ZwzZS7LSxis9E7iXPIc4O2t20ytK9mWBl+RmrsdMH3HlryTlwlko6Q3VLznwiFAlSBBluBatCehXJBEmbNQwkkzGUBNiZNpYiJRcUKM6QR6IUQXg6LV
u2JMiYPHnyXPNvoBsPO931+AP0y6yIj49/8803V65c+e6770abv7pYuNAEV5rbvXs3x2jCqUjzL9NKqRPTHMMeztm33XYbuzLnFauswDCY9913X9++fTnub9iwwcpNh5Digw8+6NWrF4Owd+/eGTNmjB49ukePHlWrVpWBsuRyjfRFErNnz37ppZcY5EsuuURKM0AT27dvf+qpp/r167dkyZI9e/aQiIuLe+CBB4gFRU92mO0YPpRpc/z48UcffVS8rVXsgEmFtrVr17733ntDhw694YYbyMlSspjxb+ijohQiBYozlKIM20PXrl05cz///PM/mn8GulBgB8qA5DOZSB89erRPnz4lSpQgApCv/0hpwZGGzAYNDh48+NZbbw0ZMuSCCy6QViTflLVgBAgynnzyybp1677xxhvO76QUHI7yISEhgwYNInp4/PHHd+7cKcGc2LBixYpnn32WII/N/vLLL+/evXvNmjWfeOIJO9QTJRmQ6iBpyXRCdSKbH374Yfz48TfeeKOVmwmPxzNs2LAxJrfeeusVV1xBoPPcc8/JD71kibQL2MZ13bp1CxYsGDhwIDEctxlkBIzZsWMH0dtjjz3Ws2fPHPpViNCufbUxzcliuAodu6HMV2ci36AhS212vjPzLJOlAYZZ59owpeijcUaxhW2A4zsbAMfNESNGzJs3L8X8Weh8Iz6F/VXSbCrsrHKEtXNKly790UcfPfLIIySCg4OxQUoLCMqlLVqHmJiYUaNG0bXq1atTSrsJCQnff/+9lEoV2LhxowQZ7777bpkyZew31woF6R2hxuDBgzn09+vXj5M9FhpjZP5WPYMv6ZMnT1566aX/+9//KlWqRBXIcj/GcoTRQFpkRJuUCtwOHTr01VdfrVq1Kt2hCavAATIUYRhKuOWhk8MTefDBB9GZQaFApgydMcRe7+bNmz/55JPhw4cTL0rp3r175ZO2YhLX5OTkxYsXP/300wMGDCB+IhxhQGg3y64VFmJ8hi7QImC8X5sG0U/f6SY22MaQw1Dbt4ZoAbD1OFWZXTTmg3V/LsAeMQBLbNtMMwun40oxpkBuVyaZUgSxnw4+sUWLFhMmTNizZ8/BgwclM3+gU7YufDq37DqSI6V4H9Jgb06FCzqF+Ph49nU2b87cH3/88bhx4wikevTocejQIUrFCQpz5szh9E8EEGb++Uc/wQjfeeedb7/99urVq0+dOiUGNGrU6MUXX6xTp07jxo0HDhw4a9Ys0iKfHdI7EG9ubyoZSgng7DHPDmx44YUXbrnllho1arRv337SpEmvvPKKfJuXIq6WXDqSDzS6YcMGhvf888//4osvxo8fz/COHDmyd+/e8nlewwKz+oEDB3777bf333//pptuOqM9hQU2VKtWTX5KVXJIlCtXrpT5Z9uYkJLpP2iF1i+88ELiSMZKcpgAZcuWlZ+bZChs2/IBdYnYKph/8MzWQ68vuugiecuvIMoLCE1jQFRUlHz02M6kyxdccAFGSo6iZCbr841SDLCfLP5X/ALXLE/AuQTHyh7Pbjps2LBff/2VLbxz584NGzbEz1KKfuIPnA5i5IhPLCzQCSROnDjx+OOPs23L+VUygd13/fr1ZcqUIVPsAc7cYgnIBuAPpOM0wZXhxSTaIk0CAxBgg6eUBPk5mIEe5NeuXTvRpG3btuz31157rXw/hVKQ/p5RD61zTTW/nodJ8toGaRkZ0mKPja159+7dd9xxx44dO8ihIlUkv3Xr1tOnTycHDShHA1dgdzEVGB8BloRfoUfEl+z0bPMyAtgpv23PIJMjmf5Ahohx40qL8nqV3RzxJaNhfysn32bQCjOcK9GG2RtDD09Bes1ok5Ph2Z01ZASYoikpKSVLlhTbyKHjdJ8JID9gpSiZ0TijmMPzBXyTXAvopPB3HGRx97gVXAzeVn7jHNBvXwveUGZojis+bs+ePaTthriSZiu1LbGbtk0Sn+gn7Fa4mv02tiKxwc7kijtm0CSdJQgjs2vXLjsuJLNWrVr2Fm6rzUEJ2PaIEvtKJhqy26hkeOPi4uRvBEomSHXCOPuQLdVJo0okyZFMf0OjYoP9QOXWxn9mSNOi305IWhL22NpF+cBs5O9BdqpCv6k7/8oLiBjGVWyQqzNTchQlM5Y7yx8FqasoiqIoSrHHj+c8RVEURVH+5RTo9QxFURRFUZQc0PdNFEVRFEXxF/q+iaIoiqIo/kLfN1EURVEUxV/o6xmKoiiKovgL/XyGoiiKoij+Qt83URRFURTFX+j7JoqiKIqi+At930RRFEVRFH+hr2coiqIoiuIv9PMZiqIoiqL4C33fRFEURVEUf6HvmyiKoiiK4i/0fRNFURRFUfyFvp6hKIqiKIq/0M9nKIqiKIriL/R9E0VRFEVR/IW+b6IoiqIoir/Q900URVEURfEX+nqGoiiKoij+Qj+foSiKoiiKv9D3TRRFURRF8Rf6vomiKIqiKP5C3zdRFEVRFMVf6OsZiqIoiqL4C/18hqIoiqIo/kLfN1EURVEUxV/o+yaKoiiKovgLjTMURVEURfEXGmcoiqIoiuIvNM5QFEVRFMVfaJyhKIqiKIq/0O+1KoqiKIriL/T1DEVRFEVR/IX+foaiKIqiKP6iWL1vgj1iEteTJ08GBQWFh4cHBv79mo0IQEBAgMfjSUpKIh0REYGk1+sVSXKcVYoHdEoSdFwSipI/jPVjriB7LrF2uHIr+ZLDIrIFziGYJFfskTSIqUXHSEUp3hTDOCMtLe1///vfyJEja9as+cknn1SqVElKcSuUciUEmTBhwldffbV79+7Q0NCGDRs++OCDLVu2FE9kX6VWMYDu2I7V7XZbuYqSL5hFLDHYtWvXb7/9xmoqWbJkkyZNqlSpItu2zDcki8IWjrVWyuWKj4//448/yMHU6Oho1oLGGYpyFihQnFHUwIPA8ePHmzdvvm3btuDg4Pfee693797iSvCMlK5du5aoolGjRtddd92OHTveffddHCUCAwcOHDRokAQZxWwzpuOSoPtBQUGSVveq5AOmUEJCwsyZM8eNG1eiRIny5cvv37+fNRUbG8uiYxFxZRHJ/l0U5hgGezweIozRo0dPnTo1OTmZBZ6amnr77bcPGDCgcuXKmGqJKoriJ1h1xQk8SFxc3A033ID7wA/OmTMHR0M+V9i+fXudOnVatWpFbMHui/CECRMIRxAuVarUli1byJFwRLQVD/CznDt79uz51ltvpaSkSAeLWR8VvyITBpg/w4cPv/jii7///nv2bOYSOStWrKhZsyaLqHTp0qw4cphyCFuVzzpiKmAelmzduhWHEB4ezkECD/DHH398/PHHZcqUadiw4fr16+3lAFZ9RVEKFfeQIUOsiCPvUN9KFQ3wFDg7t9vNoapq1aoPPPBAmzZt7BcnsPbZZ59dtGjR/fff37p1aw5bCNeqVevQoUMcyHCRDz/8cHR0NGJF4RxWuBBnPP3005xE//Of/9BrK1dRcofhKszF/uOPPz7++ON33313r169goKCZBFVrlyZOGPq1KmJiYkspa5du4aGhp7bRSTWc
j1x4kT37t2XLFnSo0ePESNGlC1btmTJkvXr1+d0MXnyZLrTsWPHiIgIcR1SV1GUQsZ0IMUEzi7ygoScqOTWPKgYxMTElCpVii5PmDCBW6mCwMmTJ7/88st169ZREaglRcUGOrtgwQLc6LXXXpuUlEQfzfHQ05uSW2TOsKx69+5N4E5EzjKxpxClRBitWrUitqB03LhxLCuQ0rOPObuNt0uSk5Ofe+45TKpYseK2bdtsh0Di8OHDNWrUYFHQI3lhhnyrvqIohUpxC+HxdEav0j9bbp9RuF2zZk18fDw5JUqUkEzAB3H26tChQ926da0sRVFOh2XFNkxiz549XOfMmXPgwAGzxIJ1VLt2bRKI/fTTT/I6hxSdE1jv2LBz585PPvkEk5o2bVq1alXxDIQUCHDkaN++PTlTpkzBM4i7UBTFHxTIF7A4ixR4FrCMM83jipcBEhxoSIgkSJp8XCTIi8AgMcrZARvsVxdISCYJOSxKpkBaSs8+GJPZAMkE0pIw7TUQARJSKidIkQFboEghtlk36Yi1GfLJkUxBcqTojNh1pYrcMpL2CIPodGI0kw4CMp4gpZKwb/0Kq+P888+nrf379y9atEiWmOQD8To5JLCQhF169rGbnjZt2l9//cXttddey7pmmWMeV3K4veaaa7hNSEj49NNP6dQ5NFhRijfF8C1JfAdeA/DL9JBbcdCnTp2SMxaOhlsScj1X0DoeWazllgTWkjaeSmAgCZBSckReEmcfMcy6McEwruSLSWK8WXKas2YHpS+Mv90RyKDqnCNWkcBOyQHJ4SqzSDJB5hIJ+yqJ3CCDBkZ76TplTnKLHsnMYIbolxFmPJ1DDRlu/YQ0gam9e/euUKFCZGSk/XVxATPi4+NFrE6dOtyCFJ19xAzGatKkSXLboEEDs+RvMK927drh4eEkFixYcOzYsbMwjIry7+RsOKmzhnhkehQTE/P9999//fXXAwcODAoK+vPPP/EmX3zxxZgxY/CVL7zwwvXXXy/y+JqKFSuatc82tL5x48bDhw8T92Cz7Mek8eA1atTAYASSk5N/+eUX6RdQevXVV9vfTc0lKOf02bZtWw5wCxcuDA4Olj1ArjlDXYiLi9u0aZNthmRecsklbDk7d+48cOAAqsiRUtJhYWENGzakIaps3br1yJEjlEqP6tWrd95558nmWkTAKjH+jz/+YKpIJjnyRKKjo+mLHILJXL16NQGryJBJUYkSJXLZHWkI9u7dKw3JmHBlrBo1akTOunXrkpKSyAGzklGratWqVapUoSKB6dq1a+3Qh22SWtK6Le8PsIErTdD05s2bmQ+NGze2P+yJMYmJiTfeeOOqVauioqKWL19et25dBsevJuUA9mDnjh07CC9SUlIwgwFnmctDBOnOoUOHmjVrtmfPHgafdXHttdeeK4MVpZjDmsw3LNeixvbt20eMGIGbw3dw6vrxxx9HjhzJtsctXgaPjCshERISgpck/5tvvrFqngs+//zzli1bYgzmYQ8++qmnnvruu+84ilGKr8Shv/vuu3hAZAg+hgwZgkOXurkHPfPnz6fjzZs3p7q8Sg9W8ZlAnkjikUceYcfFTqIcBrZnz57sNxQRzz388MMlS5akCNj57rjjjnHjxhEhUQpz5szp0qWLjHbHjh0JO8i0VBcNZDRg2bJld999d0REBKMtdO/effr06fZbFTB+/Pj77ruP7hAt/fe//yVMpKJVZqrKLiFwS/d5xOzKVhshIYQRb7755okTJ06ePPnBBx+0adNGxhmI5J555pmVK1diAyAwevRopjeDiQFMdadtWUKLGWwoINIFeYhyJYfoh3HDn9x5551s7VglRXkiH3ZmV4XWP/30UwYQk5iTx48fd9ojXWBx1a9fX2ILRtUqUxSlsClWcQY737333iseHN+B4/vpp59iY2M5p3KgGTZsGJmEGuyC3HKO2b17Nz7Rqnwu4OTK4fj+++/H2QHHL26tMtMbAi6bcKRUqVJ2/JFXUFLAOAMwY+LEifI68wUXXBATEyP5QNF7771HPmNbs2bNDF1AgNiibNmydJP+igFWcdGD+fDOO++wzdMXZtGxY8cyWMvtmjVrSpcu/bYDbkUAABGMSURBVPHHH0vvnCPJLaMBBw8eZNY5hyIDRBW33XabBL4El1QRVSQouv766ymCgQMH2ts2pUACCwn1COAI5ix12WBXsatnh1PGqnwmRF4qEncyaNWqVdu5c6coIdOSOxMIc0VeErnBbMFoIrsq5D/33HOMLUu+SpUq8fHxTkmjcloao9e4cWMEgJDRKlMUpbAp0MvXLOMixcUXX/zJJ59MnjyZo79tXokSJSpVqlSxYkWOgNySzybBTlm5cmUcECceETv7iIVut/vpp58uU6YMcRsbDDuTlIJsQogtWbKEEKFZs2bInOVht5vDmE6dOjVt2pQcTtUEanYRibvuuqt69erMJ7qQkJDgNBKbCem49uvXj86CVVBU6datGxOJvmDqX3/95RxzEiBxBl1GhlsE7FJJzJ49+6qrruKs/PLLLxMiSKYTqjAbn332WeI20r/99pup2IBS4uO+ffvKLYGOJMjnirDIX3fddS1atGBrN/VlC8JS/Yw45SWdG6Tuli1bZs6cWbJkScJN1pRkyuzNJbSLPLWM7uUCaYKKcs2AZB46dEgkiXHlhQ0pFbgFyQdOI1ytMkVRCpUi9DZ5wcFV4Tg46rENWFkm4vLE6dhXIFO2inOF7Lvs0O3atcOY/fv3L1261CpLNxV3uWDBgvvvv5/0ObFWxgrY/+644w7sJM6YO3euXcqVPaZjx44UxcTELF68WIpsZs2a1aRJk0suuYQHAed2zHOGLjB/unbtSjolJeWLL76QySOlwCH4888/79KlC6NBPt13lnJL71555ZUDBw4QcrHv2h/4cCIV69Spc+WVV5L+5Zdftm/fTr6t7dprr61Zsya3jDN67CZIcLtw4cK77747JCQkNyO5du1ansgZWbRokSQOHz5s1TwT0ou0tLTRo0djFWFT69at5fmSbwnlAoSBmcNUzz1xcXE5t8IslQTBHKtM0gIVZUgpkhzIWZuiKPnHWOL5BYdSpDBewDX/Cmvnzp3pmrxvQj6ukOvrr78uXZ4xY4YpXoRgO5GXWzp06IB/lNfJhbFjx3K8lvcppCN5hVoFed/Ehopbt24tV64cqpo1a4ajd77aP3nyZPHmBBzSClCLEzmBFKX5aPHsg5GYTTfLly/Pllm3bl22NKvMZNWqVZyP169fb92fjvS6RYsWDAUhb5kyZYgzrDIHxuibDbFDi+Rrr73GrVVsvqHWvn17DKB0woQJplbrHRlCH+bDvn370GBJZw8yzCj0oydn5BMhxC7yU/25wexE2u+//85YPfbYY0Rg3FpleUF6V61aNZZAaO7AzltuuYXRoKKl5XSw5LbbbpP1zlzNbBu3xJGsCJHp2bNn/oxXFOWMFKv3TQTc5fnnn2+b57TTtLoomt20adNLL70U25YvX75jxw45QwMb9pQpU/7zn/+ULFmSW2K7c2g/TRMxyFd12Gi3bduG
naSBmfTdd9+xXSG2cuVKuiCmcl2wYAH58mORRR+xuWrVqhhMgo58//33ZNqlkyZNIoyQ36SSTBupC6+88krDhg1RMnz48IoVK1rFpyPV27ZtSyzCZjl79myetRTB0aNHf/vtN1Sx+c2cOZMdUeS5nThxYrt27djauUXAFM8WBFgO7N+RkZGE3dnBsZ7NmyvQhFX5TPDQEZ41axY9HTx4sD0ZrOK8QC26T3RFQJAbJOSSiqLBiWmFKyoqShJgFThgZMi3BzZLGUVRCoVi9b6J1ad0R/MPIjo6+t577yURHx//2Wef4QRx4txu2rRp+/btHEnpFDnntmu0zl4i7ymwK0yfPp2E+Os9e/awU3bp0gWZY8eOffXVV2I/pZy/b7jhhrJly3L7j4AusDd3796drZct7dNPP+UqRbGxsV9//fWdd97J6V9yMkBduty4ceN58+YRMnJKlvHJgDxHijjEE7UwqmvWrJG3TgQarVChwhVXXEHRsmXLdu7cSSa1/vzzz59//rlTp07SkOjJmbfffnu1yU85glquq1atuuaaa6yaZ4LW2ewXL148dOhQoiVysuxsbmC2EKm89tprb+QaHpAMgqUiE4TmViqTYXJLkESEJ7elSpXKzWAqipIfWGP5Bu9QpMBxCL1796ZrHNRwnZJDaVF+3wQLOcLirNlXLrjgAvYz46Vkj+epp57q3Lkz+5zISEfyCrUK5X0T4PyXkJBw0UUXoa169eonT54UO5955plbbrmF7VDeValXr558wn/37t0c6L///ntkLBVFHhkcBurKK6+UWbR+/XrsJ3PixIl16tSRp2NJn46IAQnBKsiEiMGcOXN46DBgwADkyYmLi6tdu/b48eP/97//kc94spGzo1P6zjvv1K9fn/kgdcFSlw1UoSLyYFiTIzxcsGrmAjTv37//tttus6cBOVZZXqAiHZGrlXUmzNaM5rhaWQ5E25gxYyR0uPTSS7HQqVwEWHG1atVCgEEmGrPKFEUpbHI6EPzjsPvy0EMP4aMjIyO/++47+fkjPM7IkSPZDkl/+eWX9nu3RQSeBNfHHnvsww8/pBfjxo3r1q0bOzqbCmYTarhP/yBbnkBhvn+ny4kxXcwq/fr1e/fddzFp7ty5rVq1wl83a9aMI2bHjh27dOkybdo0itg+W7dujRj7JdGe/c0IUVX0YdA++uijRx55hPSjjz7KPsTm1K5duxYtWrzwwgsiUyjdIWq5/PLL//jjD/Y8BopJSxz84osv/vDDD8Q6FMXExBDcrFixIigoqE2bNgwy0ziXTfPICE2Ik9hKz7jSEUDsiSeeaNiwoZWVIwzIypUrZ8+e/fLLLxdkfqKHpuWPjOR+SMPDwwkgaBebrax00APyIyWMAENKPOT8OTUyESCTFcGVyTlv3rzrr7/+HzQ/FeWfhKzJ4gHuQ5DXM/AvP//8s+RQykYoXSbOEPmiA66Wwxnn/tDQULxhy5Ytk5KSpkyZUq9ePU634ojzDd0vlL/XShXqwqpVq0JCQnDKxHOcCwkm6tatKx8LtU/nDzzwQHx8PC2+/vrryEijlqJ/Alh78ODB8847j25WqVKFzZ4goEKFCgQEdLMQu4MqtnZaYdCWLl3KyZtBGzJkiAzavffeSxERxpIlSwg1Lrjgggx/pidn0CBvusnMzxkMoCHiBqvymcBCltLYsWNpxcrKFzKedI3QwfiQZy4ICwu76aabks0PalhaTochYvrJ4yMW2bVrl3PQZGx//fXXCPNLQ1WrVt23b19hPVBFUTJQoM9nWDqKJFlaKJlg3RcZcHZwxRVXNGvWDH9HeLR69epx48Z17tw5KirKEiowBew41TGSK9FPkyZNSLAnbd269cMPP+zbty92slFdffXVlCLGDkTMsWPHjk6dOiEpWIr+IbBL3X333ZjNkXfmzJlTp05t2rQpMQe9o7SwuoOe++67jyM16UmTJi1cuJBB69Wrl+jv2bMno8qmOGHChC+++KJRo0YXXXRRnpquWbMmZufMVVddJVdgF7dqngnMYLPv0qWLdZ9f6CAbP+2SyD1EuvIiiqXldMgnhsA80oweq0nyBUqBcSZSIdG8efNy5cqRsIoVRSlUCvS+SUHq+gk2abYBztls0jgaDojyvgmMHDlywIABlM6YMcO5+RUFGEksx3v+3//9X58+ffCMHGrZwpcvX16tWjVsBks0K6griSwlUb5o0aJ27drhT+fPn4+DlnyRlGOcpDHALDkz77777pNPPok8rnzz5s0//vgju7LMhxEjRgwdOhSFlSpVql+//rRp0+RTk7Zy6SwJuslWIU07zXamzyEYuWbNmlatWiUlJdWtW5frm2++2b59e6u4kKCVU6dO3XzzzT/88EP58uUZsVq1ao0ePVoG4cSJEy1btly3bh2TuXTp0q+++uo999yT+8fkV3iOxAfA8+U50hEMy8ezk/nwyy+/UDeXXUOeAalTpw51JdrIAPmwZMmSDh06pKSkPPzww2+//XZQ+q9ySekLL7zAXMX4b7755oYbbigio6ooxRBZcsUDXB4OCLfSrVs3uoYnks+BkkmR/O44vky+0FGkwMhUk4MHD1avXl0cbufOnekLllNqyWUFvaMiYnJFOANkypdLr7vuOrY0bp1idpoEqiylOYLYhg0b5AiInf3793casHbt2lKlStEcRVOnTiVT9NvK5Zb8VatWdezYERfPSZ2KAkUidm6ROZOQkEAgxeOAK6644ujRo1Zx4UF/ecoEwQwXrURHRzMszrEaMmSIbIE1atQ4cOAAVuXyMfkbjGH/7tKlC9FY5qece6hlPvk8fIwUYQYNebC0OEBAdBKlEV4zerVr146NjZUiqX7y5MkmTZow4DfeeGNiYmL+LFcUJTcUqxAeh0KXOHfu2rULD4LjiIuLI59Mrhy7ySS9ZcsWySlSyDZTtmxZTmByy8lVis5oLRW50sH/ywb5c3GHDh2aNGmS5HzyySeSANIzZ87EZYu23IDjbtasmRgsv1VKmtgCUylq3LgxzV188cXyi1UZ7OcW2AP69Onz7bfffvfddz169MAAqlgSRQAspEehoaHso5yDeRzt27fP8DuzhQUN3XzzzfLVULa9yy+/XPKBIuaD/HoKW+b555+PYVbZueaFF1746KOPiBF79+6dnJyMhfmzjYpcefqSyA1Iynqx7jMhloSHh3O6qFChwu7duxctWkQwYZfOmzfv119/Zfa+/vrr8nqboij+giWXb3ANRQp2yj///PPLL7/kPE3XcENdu3b9/fff8TJspZIJ1atXX7hw4b59+zL8sc1zi20JBkdERDRs2FB+A9RZlB3muS513Lhx0sEsycEpU3TJJZccOXLEUncmsIfT5PTp0/H1vXr1Yo8hx2kksQu++7HHHkNMcpylpOnX3r17K1WqZG8Y8tFX8p2S5xajS2lpBGcETGzwmzZtksdRuKBTjubEE5GRkXPnzrUHTZB3VcLCwpYsWYI9yPvDjHyAVTw4qFixYnx8PFbRC6ssj9AvK5VrzjgI5tMzXjabOnVqiRIlLr300jVr1jBXmWaEthd
ddBFmF8GvuCtK8aNYfa+VI/KoUaPYwE6af9qAroWEhJx33nmcSgkp8C90mOM13ge/g5cpV64cRzE8uFQvCmAzDlT+YtnAgQPZg62C7KEKPSKxdOnSjz/+ODdVnFCdMWHLHzRokJybrYLsYRipxWg3btx47Nix8qKFjK0IsDdfeeWVRBtSJJk21EWY3fS///3vxIkTaZqghN1048aNHEARyGsX/AFG2lcexM6dOydNmiQvbJjlhQZDwZWG2PPefvttztlEGxJ7kc+TpejTTz995513Vq1axUBJflEYojlz5jzwwAOstRdeeKFfv35ikhF35GIKnTWIM7Dn66+/HjJkyPHjx6+77rqEhISVK1fWr19/2LBhV111VVEYSUUp3hSrOAOnjE8Rxy3Ojt6RkD7a7k9yEBNvXqQcDVZhW2xsLFtaLnd95AlN6AXdZ1PPa3eoDpKgei5blKE+duxYdHQ0pkotu2nsOXLkCBEeCkEybeydlcMlYnRz9OjRkydPXr9+PXGGmCGS5xDMAPrFla2UqIie0sFCtw39DAgNMRpsgbRCE9wCpRQx1GyW2CA/40amXXpu4SkfPXqUK/E6NmOq0/IiAuYxwoxbXFzc6tWr165dGxUV1aRJE+IM+YZLUZhsilK8KVZxBn3BrYjL4xYnIh7cTtge0Nlr8d1FBzGeK4bl0jbZ9SVtJ3IJQwE0xNUcoTNXlyqAMFfJRINdF+NFwOxBxi6Qz2MiwZMSsT59+rDFTpw4kZCF28xVzglY4ryC7KNmYaGBWnniaJYE3SctDdkjCfaOaJeeW+SlAhIYjHmkuYrxIlAUkCHFMLkFMY985zgriuI/ilWcofwjwMUTGMnmxHXz5s233nrr5MmT5Tc58PtkiqSiKIryT0cdunIOkEMkAceePXuefvrpPn36NG7cWCIMDXwVRVGKE/p6hnK2kXd5EhMTP/vss2+++aZbt27t27eXXw+zX80WSUVRFOWfjcv1/xuOSplQRcj7AAAAAElFTkSuQmCC)", "_____no_output_____" ], [ "![ejemplo.jpg](data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAq8AAAGNCAIAAACwjw9qAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAP+lSURBVHhe7J0FgBxF1sd7fDUbJZAECwlugQQJHC4XJITghDsO5w53DueQ4whwuMthh0Nw9wQJ7hCDGHFb39HvV/Vmi05PbacXwn2R/rNU6v3rvVfS3fVezczORgqFghMiRIgQIUKEaBvEShCJRIqyZkpFQ3pEgY9OcFI50rCKVpNFklGgfIQIESJEiBAhlmOo7KBYDREiRIgQIUK0AXOYFlhFQ3pEgY9OcFI50rCKVpMgpJKVmxAhQoQIESJEGyBWStQsyq1BtCgsHGhLRYGPTnBSOdKwilaTRZLhOwUhQoQIESJECCfMBkKECBEiRIjlHWE2ECJEiBAhQizvUO8cFKshQoQIESJEiDZg3mgXWEVDekSBj05wUjnSsIpWkyBk+NpAiBAhQoQIsbxjqckGxr14DMkL2Oq6cUUqRIgQIUKECLE40J5swARkN7ba6pjrXvxfxOcxXxcrSwXGvXjdMVttVVwjIMsU5jEhQoQIsVSisEyDCbYnGxg74vZizYX33rv95N36Ro75n2QEvwPGXVeM2YtvBspl391Ovv2994oEkGU6NHxdI0SIECGWRkigWFbBBH/VOwUDrx0r6cTYF44eKNTtuy3tke7rMcXKbwOpQN+TJQ0YePQLxXVSK3Vt61KFCBEiRIilDcW9fBkFE/xtnxvoM+i2e68tBrn3Hn0+PPiOu+7QYirgHP3CqNsG9ZG6WqmTbhtVGHWSIUKECBEiRIglBr/5U4R91ly/WGtF8dMFW103btyLx8jL8Ftd92KxEdjfUi82GijbViUULImG6zX+IqPQxocN3e50j6pV6bYe5J33Tu4rjS7DYEP9BeOef9TkArcNkpoNbS9RsEkZrYWxkE2IECFChAgRFL85G2gbjx7ad7fbJTi+963+BxD87G+pLxS91cvt2LYqodDXRO32Q8VOtzvdY99Fhs5gQ3Xjl2Rg4Lp9peIP6xKFCBEiRIgQ/2P85mzgRfPRwvXXXOhl8PfcYbSIF48pBj/zlvrYF4rvNLx3svngwS8vtxc/oPDLxxN+BdzeTKfib9Bt1Fvf6Wj9METx1fxgQ10YY79tnbJnLdqAbYkCoc9Jo/SQFF44ukgOvPYMn5cjQoQIESLEr0fxJdhlFEzwt2UD41TMlKo1FrVG2LH6ZfNx110qygOvvbf1LfU+g05q/eTBeycPlxfLW0/YqElk7jPotlEm6LUTC3kznd42quD3Sn7QoS4GLLxE7YZ7oOGHEkKECBHid4LeqJdZMMFflQ20vsWuXn8XxhqLjn6h9VNzUpqj88In518+efD1GHXktqv1Xbd4hm8n2uh0EQg41N8M7xK1E7+87nH0eWEuECJEiBAhfi1+6zsFAwcerV5/D/Jh+XGt3x/kfVN94UDfptqvwq/zFnCoXvzSupiSBX+4cgHfzyyGCBEiRIjfCDkCL6tggr8qGzDfN1AojBp1W+vr74uCOVi/9+1YqRSx8EG8TbXfhnZ5CzhUL3556WDxjt2OF4eHuUCIECFC/G9QjHnLKJjgb/4UYTtgjs4Ln5zbPIi71X75uH4p3Hq/fKhR8Ctf22/vUIsYNKT10w23X9rWLx4EhN+kFMwnNsIPD4YIESJEiN+K/2U20Gf3/SXGqg/lF39tf9yLv3zkf//d9XnbhFTz2f1xLx5j+QXDXw7wJw/X3tTv7rd+kMGg1Jvu9Jitir9h6H4ZQDpTRcChlmDQba2fdnzv5L50UewQY/2FB23+bqJBsEm5c4Hww4MhQoQIEeI3o/gyQRC4fpfNvFNggVE7+oUi8wvM7/N5MXAhZdsvEAwcKKauzm3eLGr23080HXo6Mx/yDzbUUrRp+Ivv9i6Rd1K29QGuOYcIESJEiMWIfD6fy+UoDayiIT2iwEcnOGlgFa0miySZYLteG/D9CF0JBq5brLjQ56RRY9Vv+7scyecQRy301veg2xb6Xn/9K/+jzmt9zd9AefOojbWoDbpt1MJ96i7N6+t05nbS+omAgEMthfoyAD18l2XR1HuOb2uJAkwqRIgQIUL871D8uN0yCjVBMgKZaogQIUKECBHCCv0CQUECp8AqGtIjCnx0gpPKkYZVtJoskowC5SNEiMUE7qpiTcMqGtIt+jQZWMUgJqb0iJQGHtJHx8DHxJQekdLAKhrSx8SqY0qPSGlgFQ3pY2LVCa5pENzElB6R0sAqGtLHxKoTnDSwiv6kWwyuaRDcxJQekdLAKhrSx8SqE5w0sIpWE39SYBXbpeMWTWkqBM5lGMwxdtFFF/FPiBCLBfLkGFhFQ7pFnyYDqxjEpL2apaIguIlPk4FVDGJi1fFpMrCKQUysOsE1DYKb+DQZWMUgJlad4KSBVfQn3WJwTYPg
Jj5NBlYxiIlVJzhpYBWtJv6kwCq2S8ct+jQZWMUgJlad4KSBVbSaLJIkIQhfGwgRIkSIECGWd4TZQIgQIUKECLG8I8wGQoQIESJEiOUdYTYQIkSIECFCLO8Is4EQIUKECBFieYf6jcNiNUSI3wz5bGpRaEM0pFv0adKmClYxiIkpYdxiW5qloiC4iSnhPaQ2VbCKhvQxseqYEt5DalMFq2hIHxOrTnBNUQDBTUwJ7yG1qYJVNKSPiVUnOKkcaVhFf9ItBtcUBRDcxJTwHlKbKlhFQ/qYWHWCk8qRhlW0mviTAqvYLh23aEp4D6lNFayiIX1MrDrBSeVIwypaTYKQS9NrA4zYlKWiwE1SekRKAw/po2PgY+LTZGAVg5iY0iNSGnhIHx0DHxNTekRKAw/paQ0RIkSIEEsRlo7vG3AHHk/48QShX6dZKgqCm/g0GVjFICbt1SwVBcFNfJoMgusY+Jj4NBlYxSAm7dUsFQXBTXyaDKxiEBOrjk+TgVUMYmLVCa5pENzEp8nAKgYxseoEJw2soj/pFoNrGgQ38WkysIpBTKw6wUkDq2g18ScFVrFdOm7Rp8nAKgYxseoEJw2sotVkkWREXisQecmHjFjKUlHgJilh3GJbmqWiILiJT5M2VbCKQUxMCeMW29IsFQXBTUwJ7yG1qYKH9NEpCsF6LG3SpgpWMYiJKWHcYluapaIguIkp4T2kNlWwiob0MbHqmBLeQ2pTBatoSB8Tq05wTVEAwU1MCe8htamCVTSkj4lVJzipHGlYRX/SLQbXFAUQ3MSU8B5SmypYRUP6mFh1gpPKkYZVtJr4kwKr2C4dt2hKeA+pTRWsoiF9TKw6wUnlSMMqWk2CkOFrA4vWMfAx8WkysIpBTNqrWSoKgpv4NBkE1zHwMfFpMrCKQUzaq1kqCoKb+DQZWMUgJlYdnyYDqxjExKoTXNMguIlPk4FVDGJi1QlOGlhFf9ItBtc0CG7i02RgFYOYWHWCkwZW0WriTwqsYrt03KJPk4FVDGJi1QlOGlhFq8kiSbKB8HcKQoQIESJEiOUdYTYQIkSIECFCLO8Is4EQIUKECBFieYf6HEGxusRD3tuQslQUuElKGLfYlmapKAhu4tOkTRWsYhATU8K4xbY0S0VBcBNTwntIbargIX10ikKwHkubtKmCVQxiYkoYt9iWZqkoCG5iSngPqU0VrKIhfUysOqaE95DaVMEqGtLHxKoTXFMUQHATU8J7SG2qYBUN6WNi1QlOKkcaVtGfdIvBNUUBBDcxJbyH1KYKVtGQPiZWneCkcqRhFa0m/qTAKrZLxy2aEt5DalMFq2hIHxOrTnBSOdKwilaTIGT4KcJF6xj4mPg0GVjFICbt1SwVBcFNfJoMgusY+Jj4NBlYxSAm7dUsFQXBTXyaDKxiEBOrjk+TgVUMYmLVCa5pENzEp8nAKgYxseoEJw2soj/pFoNrGgQ38WkysIpBTKw6wUkDq2g18ScFVrFdOm7Rp8nAKgYxseoEJw2sotVkkSTZgMoORF7yISOWslQUuElKGLfYlmapKAhu4tOkTRWsYhATU8K4xbY0S0VBcBNTwntIbargIX10ikKwHkubtKmCVQxiYkoYt9iWZqkoCG5iSngPqU0VrKIhfUysOqaE95DaVMEqGtLHxKoTXFMUQHATU8J7SG2qYBUN6WNi1QlOKkcaVtGfdIvBNUUBBDcxJbyH1KYKVtGQPiZWneCkcqRhFa0m/qTAKrZLxy2aEt5DalMFq2hIHxOrTnBSOdKwilaTIGT4uYFlBFxOD/L5PGVdXV1RI0SIECFChGgD4TsFi9Yx8DHxaTKwikFMAmp68PPPPw8fPvynn37q16+fSQBFWSCiId2iT5NBcB0DHxOfJgOrGMSkvZqloiC4iU+TgVUMYmLV8WkysIpBTKw6wTUNgpv4NBlYxSAmVp3gpIFV9CfdYnBNg+AmPk0GVjGIiVUnOGlgFa0m/qTAKrZLxy36NBlYxSAmVp3gpIFVtJoskgxfG1h2IPFeysmTJ59zzjkkAVdccUWPHj2i0fAqhwgRIkQIP4RxYtlBPp+fMGHC2Wefvdlmm1111VVz584l49too41MAhgiRIgQIUJYEWYDSzcymQxJAPjxxx9PO+20gQMHXn311bNnzyYDAN015AWDECFChAgRoi2E2cDSDSL9mDFjjj/++E033fTGG2+cM2dOsUFjgw02iMfjRSFEiBAhQoRoA+p3DIrVJR4MleAnZakocJOUMG6xLc1SURDcxKdJmypYxSAmpoRxi5TDhw+/8sorSQIQo9EoJRBbcPLJJ1911VWoyUcHaKIuTUBEQ7pFU8J7SG2q4CF9dIpCsB5Lm7SpglUMYmJKGLfYlmapKAhuYkp4D6lNFayiIX1MrDqmhPeQ2lTBKhrSx8SqE1xTFEBwE1PCe0htqmAVDeljYtUJTipHGlbRn3SLwTVFAQQ3MSW8h9SmClbRkD4mVp3gpHKkYRWtJv6kwCq2S8ctmhLeQ2pTBatoSB8Tq05wUjnSsIpWkyBk+NrA0o2//OUvgwcPjsVi1PP6VwqFF2y44YbFWogQIUKECNE2wt8wXLSOgY+JT5OBVQxi4qNZVlY2aNCgXr16vf7665lMRrcXkUgkzjjjjJ49e5rfKTAeBD5ufZoMgusY+Jj4NBlYxSAm7dUsFQXBTXyaDKxiEBOrjk+TgVUMYmLVCa5pENzEp8nAKgYxseoEJw2soj/pFoNrGgQ38WkysIpBTKw6wUkDq2g18ScFVrFdOm7Rp8nAKgYxseoEJw2sotVkkeRS89oAAw1hRT6fZ33Ky8tzuZyslUGnTp1WW201efugqB0iRIgQIUKUgJARvjawaB0DHxOfJgOrGMSkVNPUwdixYw8++ODa2loCP5CPDXJ1+/Tpc8opp1BHWS622wqIaEi36NNkEFzHwMfEp8nAKgYxaa9mqSgIbuLTZGAVg5hYdXyaDKxiEBOrTnBNg+AmPk0GVjGIiVUnOGlgFf1Jtxhc0yC4iU+TgVUMYmLVCU4aWEWriT8psIrt0nGLPk0GVjGIiVUnOGlgFa0miySJEeHnBpZKmOheV1d39NFHz5o1Cyafz/fu3fvaa6/t1q0bTRtssIEokyJIJUSIECFChLAijBNLJSSnI/yfdtpp77//voidOnX6z3/+c8wxx7zzzjvbbLPNRhttJElDiBAhQoQI4Y8wG1gqQR4A7r///gceeAAxpnHJJZcMGDAAsXfv3iNGjNhrr72ohwlBiBAhQoRYJNRvHBarSzwYKrFNylJR4CYpYdxiW5qloiC4iU+TNlWwikFMTAkjlVwu9/nnn++6667z5s0T8sADD7zvvvuMjjLW7xGYOhA/RcF3AKaE95DaVMFD+ugUhWA9ljZpUwWrGMTElDBusS3NUlEQ3MSU8B5SmypYRUP6mFh1TAnvIbWpglU0pI+JVSe4piiA4CamhPeQ2lTBKhrSx8SqE5xUjjSsoj/pFoNrigIIbmJKeA+pTRWsoiF9TKw6wUnlSMMqWk38SYFVbJeOWzQlvIfUpgpW0ZA+Jlad4KRypGE
VrSZByPBThIvWMfAx8WkysIpBTEo1Z8+eve+++06aNEmYzTbb7N577y0rK3OHf1Mx8HHuEX2aDILrGPiY+DQZWMUgJu3VLBUFwU18mgysYhATq45Pk4FVDGJi1QmuaRDcxKfJwCoGMbHqBCcNrKI/6RaDaxoEN/FpMrCKQUysOsFJA6toNfEnBVaxXTpu0afJwCoGMbHqBCcNrKLVZJEk8WJpfacgp/5rdnIFJ5/LO7k0k6KuoX7lDhSlZQpcs3w+n81m//73v3/11VcS7zt27HjzzTd37do1Ho+rK6pBRVpDhAgRIkSIRUK9VlCsLvGQ/EUPOJInDShk5ueinSKFWN5piccSkXwsEs2jpZScCP+hyT8lr4oIPKSPTlHwNfFp0qYKVjGIiSlJBShvuummU089lQoKyWTylltuOfTQQ6mTBIimx1A50rCKVhNTwntIbargIX10ikKwHkubtKmCVQxiYkoYt9iWZqkoCG5iSngPqU0VrKIhfUysOqaE95DaVMEqGtLHxKoTXFMUQHATU8J7SG2qYBUN6WNi1QlISp0SwEirwPDGSkDdMFJxl6IDPKSPTlHwNTElvIfUpgpW0ZA+Jlad4KRypGEVrSb+pMAqtkvHLZoS3kNqUwWraEgfE6tOcFI50rCKVpMg5FL5ToEef74pEr3vrSmdOlV3KlOvcUQjOS4azTQqRT1H/lf/uMwNPKSPjoGPiU+TgVUMYuIuR44cefzxxzc2Nsq1PO6448gMqAOjYypGFFhFq4lPk0FwHQMfE58mA6sYxKS9mqWiILiJT5OBVQxiYtXxaTKwikFMrDrBNQ2Cm/g0GVjFICZWnYCkPHfk5STf0mRgWqmjoB/KoiilqXhIgY+mQXATnyYDqxjExKoTnDSwilYTf1JgFdul4xZ9mgysYhATq05w0sAqWk0WSXKvLqXvFDD0dDbrfDuz4qLn5rw+K9+cL0SymXyB+E+RKxRyeaeQ15M0ExYIA4ryUoJcjkkVZs6c+de//nXevHmyE2255ZYXX3xxIpGQv1MQ4vcGl4BdviiECAz9wClks9lMJvP2228/88wzdXV1MEUNfYfT2tDQ0NTUJK+BuVt/D0gvLS0tX3755UcffSSiNFEh4X722We//vpr6u6Ljjh+/PgzzjiDx1BM3F8DKspjxozhwWQi8tiaJgHTRId+6RRX1IHoCPD84YcfFgUNXD3//POMs+hiadu+QiwVWEqzgYiTT6Vzztx84fuWbsNfzL/5U7olnsqqdwpUXCzkc+Tk6v0CnZsv7Q+PTIEd4YQTThg7dix15rXiiivecccdNTU11It6IX5/sPhs8bJ968sS7suLhiwUmD179iGHHPLHP/5x3333vfXWW4UUHW5jIujqq6++3377ESmF/P3AFbz77rsPPfTQNddcc+DAgUcccURzc3OxrbWVQe6+++7ffPMNjBknofpPf/rTTTfddOyxx5K7cDPIJiOtiN999x1TuPzyyy+88EKagDQB3DK1W265hUXo27fvDjvscPXVVxtbATo333wzfFHWzEMPPTRkyBAGM2XKFI9+iBCLC0vrpwjzkfyCTK42k0hEnPqWwrVvZ576LtNScHLq4wLqQ3RNDU2TJ03+8ccfiwatmDNnzk8//TRt2rSivPSADejpp5+WvSAej19//fVrrbWWNIX430A2d0637Ons0UU2hC+4YwHrtmDBAo7URMFoNErUlLM1vKjBvPLKK3Pnzt1www2TyeTvvbz0u/32259zzjkDBgzganKaHzduXLFNt5aVlTEkNgrz7V6Cjh07HnnkkVR4GE8++eR0Oi286Pz8889EehKIXr16kUy4DQFuQf/+/Q877LCVVlqJ/P7jjz/2/LGxSZMmXXvttfI3youU46yyyiqsycyZMydMmIAHj9sQIRYLltp3CvLpBTmnsakiWkjnE4XZ2erb32165NPGxhyPpZOLJl559fW1+/Tut/HGno2b8wf7Ec8zT9RS8VAxSJ7/kSNHctRge0KMxWInnXTS4MGDw33h/wUzZszo3bv3GmussdNOO7GtX3TRRZwj33jjja+++opgIEdMrkuYLgi4SymJmhyXjznmmMMPP1ySgPXWW08+90or5ffff09IprLLLrvAc5OLoQeysFkNHgd/yGvymICifStwvvrqq6+55pqnnnoqgZ/hEfXdPR566KF///vfGUnXrl3hTRMMTddddx3h+b777uNJlLc2aJo4ceL+++//5ZdfkgqMGDFis802c7+Fhwdssdpyyy25c84//3xax44dW1tba5wzDHhSInIC92sVJC633347tp07d5bVKzaECLEYIY/KUgF5sFXJT6bp9WnpgTc2bXpr08a3t/S7Jb/5Lc3b3TLvX+/WTW/O5TINTz32cCRaXlFRwYmEhIAEnN0B27/+9a/Metddd+UZNvxCzl2wioZ0iz5NBlaxLRMqQHY0DgR9+vSRLYANZbvttmNSns1OKm2JAqtoNTGlR6Q08JA+OgY+Jj5NBlYxiIkpPSKlgYe06qgQlM2yZZOKcSG4HLLFS1leXt69e3cCzFZbbUWW8O2332Lu9iOlgVU0pI+JVceUHpHSwCoa0sfEqhNQE/CUEds4B/PEPfHEE6xbhw4dvv76a+5hg6uvvpo17NGjx6xZszzmRkQNVw899NDaGmuttZZUBCIakso666wzatQouWqYGz/GG+AUTlrAFTziiCNMq1zlN998c+WVV546dapoSpO0MhGSm4TGCSecUF9fP23aNK47AX7FFVd87bXXRF9MTCkVAQ81i8BScJ9Id0ztlVdeYeSQrAM+RV+NPpt99913+/fvL5mHj/PS0iNSGlhFQ/qYWHWCkwZW0WriTwqsYrt03KIpPSKlgVU0pI+JVSc4aWAVrSZByKX1nYJCNDavluGr9wXYk/kvE4nOjlU/8W385tfmz2mOZ2mLqgMB25BEfZm2mDN5yQZ4roRZAsFo2aQY5N/+9jf2DhjEVVdd9bbbbqusrJRW0QzxvwHhijuHZT/ooIPkXqI0zxV31IwZMzjgfvjhh+QEa6yxhliFYN3i8fhGG21EsHzppZdgVltttd69e8saAm7yRx55hDXcYostampqhCyF3PAcplnk77///ocffqA0ENGQVFDDs1wy8WBA13LVOnbsyMWCGTduHKK0og8IwEOHDiW6CymQMdN66qmnXnjhhdRvvPHG0047bdiwYR988EGnTp3uv/9+0gKmrNXtwAnxnmwAPz/99JOQDPW66647/fTTWSXqZPzC4wq15557jpMMy2idTogQvx1LRzagn1wXnAIxfEFDNsdDEYmSEPCA5pw4SUG+kHjlp7J/vN48JVeWyrXQzh5NQkByzVNUtG79OI+8MGD49gJDq62Q7iatqFCUA4MRXnnllZwYqDPsVCrFfkGYkQ1Cr03J4oT4HWCuHcvOKXDu3LnyXU9yCdyAP/fcc0855RT2dDFZzsGasFACHrr33nuPyo477sipmooccz/++ONPPvmEu3r77bf3XzdMttxyy3vvvZege99991EaiGhIKvfccw85Bw6xKt
q3QkYlMXudddahPmnSpIaGBtOKSOJCmBdNSmmiDvDJ+M844wxyAuq33377m2++SWJx55137rDDDhKzRd8KWtHp2rUrdflgIHjggQc23XRTVoa+0um0fHRAwGC++uorUhMGLGMuNoQIsZjATbU03VXygOky35LPXD1q/oBbmze5OdPvttzGt2U3vyUz4KYFA26u3+ym2r88MuvUfz+QjEY5Q3M+mDp16vz583nAyAmOPfZYZs2mQ0o+ffr0uro6Ii4+Xc5/gVU0JN6wlTxDSOoCyTOoUAoMCcRcTKS0ipg89dRTFRUV8vyz6VxwwQXiyq0pKDV3i4LgJqb0iJQGHtJHx8DHxKfJwCoGMTGlR6Q08JBUZKkBFw5wJH300UcPOOAAeSNZLoobXCDIM888kwRUX2eVaIpDd2lgFQ3pY2LVMaVHpDSwiob0MbHqBNcUUP/ss8+qqqpYJU66sj4sLzF4v/32YwHLy8u/+OILtyGlR5QrIssLdHsRIhpSt6t3FgAVzN1+3OJVV13FBV1hhRUIzKLMFTzssMM49LdlKECZPYRkghkR3TERc+Dpwi2KAthzzz2Z9fDhw6nTNXkA3mbPns0jT6rxzjvvyGTZuI4++uiXX36ZumcApc5LS49IaWAVDeljYtUJThpYRauJPymwiu3ScYum9IiUBlbRkD4mVp3gpIFVtJoEIZfO1wYYvePMr1fvEhRUI2UuX4gkotGt+hT+uW/FVUNrBqzSIRutoI0njamaaQNxKLwwvw6cBkaMGEHOXpT1uWHWrFlPPPEEUZwHG1G6psdvv/0W8tNPP6XforYvMPn++++PP/54NiZEtpu9997bfAWhVgnx+0Iu3IIFC1599VW2Y06Zw4YNIyGYO3euXAJKLjGlgPrpp58u3wCBKE5CGLAmH374YX19PekUh2AhWbRnnnlm3rx5VFjhtddeW/i2gJO33377KI0jFwWuGg8Rz07RuA2svPLKlDxoXGuGQf3+++9nnHThfx3JY04++WTOG3LFuTduuOGGRT7gcs9Q6datGxUOKtTJCQ488MDOnTsnk0l4nJjBkAeQH3CAETFEiN8D3IpLRzZQioiTn9+SzOSjVU4+GWlmIvlIoclJrF7jDOwa6ehky/LNTl791rJ5hGRzp+J5qLS/X4OHH354//33v/zyy4uOtCvyg4MOOogHW04/kNLdFVdcwbHyyy+/REc0/cFmdMIJJ/z888/U8dCrV69///vfbArUgeh4gFtODz/99NMnn3zyww8/tLS0wDBrzivNzc2cMIL0uxzCLItUpGT13n33Xfb6AQMGcIb7z3/+w6GN5TUKxBhigLFFPPHEE//xj39AcoHg27pMyyFklXgc3nrrLSr9+/fv1KmTrNLUqVNJtsrKyuA5HMcW9T1amBB979a45557pCIQ0ZBUgHwm0f9arLTSSihwceVj/OPGjbv33nt5rhmMjNyKxsZGbo/HHnuMK37NNdccccQR5BPnnXfeQw89ZLYa0SyFNBH46XfmzJk8rWPHjt13330R6VTeQWBl8EPqeeedd9KR5DRiHiLE74GlNBuIOIVYekHtujUNf+7Xctygsgr1KkchG0mMHltXn4tko6msE0046pODSrsViHV1dTBURNTe2g0MwaBBg6gTM3ieibgSdzno0ET9pZdeYnOhgjhnzhxJ8HfddVc2HQAprjwQW/ywrXAGErXKykq2p+7du7NTMGzR9ICt9uuvvx48ePAmm2zCOeOf//wnMezjjz+mr+OOO2699da75JJL8AxQbqv35RBmQWRxyJk++OCDs846q1+/fjvvvPONN944YcIEWS62Yyk5Sv7pT3968skn//rXv3I5YLgup512GissHyYwUB2E0GANOUnzsFCXDwew2oTPiy+++NBDD/3mm29Yxl122YVWeG1hv0tZ1Z49e+600047aJBAGIhoSKnU1NTgx+rKgLM4CjxB8+bN41HlUpIKrL766jRZLyKatbW1p5xyyv33309Ev/baa48++uirrrqKYwD3D0n8I488gkMmgmbRxgWacMt8JQeaP3/+ueeeSypZVVUFzy1UXV2NzowZM3h4L7jggqOOOorDAJrhHRXi9wW33ZIPtU+73hfkn4Zs/rPJDdOaci3ppjmZlmOfrNv0lnT/23Lb3F770viGbLrl6WeejkVVok1EnDx5Mik2uTxpuxyvt9tuO04AU6ZMWbBgAQ+w8tnqXHdVhFVUqhrTp0/nAeZk8NRTT/FI45/Iwc5CF+wRq666KiL+yQnYHRjJbrvtxnaDJtuivL9ofJoKzz9NDz74IE7kAmHIRsMgaXKbeMovvviiR48e7H2vv/46W1V9ff0bb7yxxRZbkCLIlxQxdzIhBunpWkqPaEqPSGngIX10DHxMfJoMrGIQE1OWiqxGS0sLK/PVV1+RgW200UbmBRgJ/0DiPdnY/vvv/+ijj5L8cTnAqFGjUOYGYCvnKotD4NMjpYFVNKSPiVXHlB6R0sAqGtLHxKoTUFPA3fv555+zkuXl5e+//74s+xVXXMGpmhQhlUrx4Mjn+Dgli74YArc3eMDi40FKAw9JBYgfNSANtzdTYSuQYzrPHUd8UkDMpS+PppTcLVxxwjZXn2eTnAaevmbPnr3ffvsxR7YF7hMZgNvQVCjB3//+d+4uJk4+hKYMm2Uhp4c//vjjb775ZhIFGQww5gYe0i2a0iNSGlhFQ/qYWHWCkwZW0WriTwqsYrt03KIpPSKlgVU0pI+JVSc4aWAVrSaLJImzS0c2IGDQUuYLedKAXLY+n87mmjPZlvR/v12w5W2zNrs5veFt2b+/XNvc0vzhRx8mUoloNMKJbeLEiT///PPVV1/dqVMntnUe12233ZZNZ+rUqQROHjbls9W57qoIq6hUNdi5OD6yiXD4JkKwRzzwwAOIf/nLX1ZZZRU2C7IE4Q888EDEG264Ydq0aWwZdMozb66E+EQUfPvtt+YLT8Dee+/NdmM2JvcwKMWEkXASYl5sUhxuBKQdzP3ggw9m/2XiH330EaTp2uPHI5rSI1IaeEgfHQMfE58mA6u4SBMmSykVd10qY8aM4cbYbLPNCEiy4GzEUgESvfbYY497772X3FE8CLgiXNlNN92UMyI5lpDiWfqV0lRMaWAVDeljYtUxpUekNLCKhvQxseoE15TFYQ1ZW+5tcmJuwnvuuYcnhbSVnICl3nrrrVlPztlPPPEEawvE3OMHnjoVUxp4SCpuYG78SGkqbA4rrLACY+CQQPSV6E5HlB5NSkaODjcGT/Sll15KFqgHq5J4Ijqnjl133RVXPHFPP/00jDHUblQF4Bn9O+64A00WhFTe9EjJHQW/5pprkhbgEBKIIRA/AhEN6RZN6REpDayiIX1MrDrBSQOraDXxJwVWsV06btGUHpHSwCoa0sfEqhOcNLCKVpMg5FL5KUKn4CQjCSeSz8SdxkS0EI1vtmpVz1Q24qhfxf14Uubnltjaa6+rvsdDf/k5u3m/fv3OPPPMPn36DB06FIc8XUVfvwoyJA6FO+20E/sChxs2NR5vnn8eY7qgO55qeEgyA
CrsDgMHDkSExxw1NTHX1KhzScgeDjvsMPmcGjrrr7/+rbfeysGFXtzKBsqF43z66acjR45Ef9CgQbI9iTInleeee47drXfv3p07dzYpiNguJ2Ap3LOmgvjjjz/ecsstJIUbbbQRNwapEhu3XBRKVrtDhw477rjj9ddfjybxadiwYSuttJK0ihogUeD0RkjgAgkjkI5CGMjiT5gwgYXl/rz77rv//Oc/c3oePnw4SRixkEVjMZ988klSrt133x01Flls3UBNHgRzCfwhauqC2bwZ8CCXlZVxTujbty+RHtHHivvkyy+/JBU4++yzTz/9dCooMyrMQVVV1f33309qToL+/fff46H0cUOfEhNuGyrHHHOM/IqjAY8qJQt1zTXXyDcUIVKKYYgQvwe4u5aObMACNfgYeUHcyfLIrlQe2WjlinykkMpla3MVb4xrSSSTN998a79+m/CIjh07lmMHoZHTCZk4G1BFRQVPqXj6LSDA80jz2HO84MTzzjvvcMjYYIMN4HmA33//fQLz6NGjp06duvHGG3fr1o0tAKgnvmQ7k13jH//4x8cff4yIQk1NzU033cSARaEtYDVu3DjyDEZSWVlJj7iSvaZnz57kAbjaaqut9BVXKJotN2D6suYsC2nWfffdt9dee22++eYnnHDCqFGj2NxJDkSNxSESDBgw4KqrrqLphRdeOPbYY7t06cItRKt4E4gy5SabbCKpwHK4sMEhq8dhlwq5KXGUNb/rrrvks4TcurTy7LzxxhvnnHOOxNf/Jbg96HT//fe/8soruZoy2rbA1sHIb7jhhrPOOkvyBjdwRSznHrvxxhuPP/54ubVKQRfcMCivvvrq3IfuHrlLefA7dux42223rbrqqkU2RIjfH+qmLFaXeMj+a8qC0xxxUpz7nHw8F2v6cHbq7BGNeSfWGElu1Knput1T+Xy2fsH8cWPHNjU3rdyrV8eaGh4/gqUAJyTy1dXVHErY7vFvnEt3wCqakue2traWQPvdd99ddtllq622Gieegw8++JJLLpk0aRInS2Lzq6++esUVV3DoIc3fd999GYDkImw61MUt3iQVIFk54ogj4KkzJHaco48+mjqM9CjKVEyJIeCwxQmD6Tz//PPsIBx06JqO2GeJbSQrzz33HOceSEAre5/0Yvzg2S2asrRHNWIND+mjUxR8TXyatKmCVTSlm5Q6FRaHyqxZswg2Dz/88CuvvFJfXy9zFx2psK1vuOGGQ4YM2Xvvvdddd12PT3EopKmbVnZ8SLlGQNRMiYKHFCtgFQ3pY2LVMSW8h9SmClbRkD4mVp2AmlIHPHTc4WPGjJHPAErUh//iiy8eeOCBLbfccs899+S5gOHmpzTmxq2UMAKraEgfE48O5/g77rjj0EMPJQwjcs+4B+DWpKSOgpAGXH1KWuVmAKIDpNXtRxloJ0x8/Pjx3HIoGB1W6ZNPPuH8wN0oypC0ipXxIxDRkG7RlPAeUpsqWEVD+phYdYKTypGGVbSa+JMCq9guHbdoSngPqU0VrKIhfUysOsFJ5UjDKlpNgpBKVm6WbMggrRMwYmO2cPKTdZ/PrWxJRLrk08N3ia7eI1JoaFKfm1O/XJePRpwEUV8/qDy0bEbkAURHE5jd3gRW0ZBUCLek9nfeeefOO+/co0eP+++/n8C87bbbtrS07Lffft9+++2//vUvDhwkDZw15ZeJSQXo130AwhtD+uijj3j+Z8yYAcNmdNhhh5ENEKWkL+nXVNwlIMjJHzEaMWIEwYy0gHlhO3HixA022GDttdd+/PHH6ZH8QLIB5uvxg2e3aMrSHmEEHtJHpyj4mvg0aVMFq0hpRObFFYGUvZVlf++995j7yy+/PHv2bDmouU24HCRPe+211wEHHLDWWmtxXSDlZjDOS3ssCiU6btGU8B5SmypYRUP6mFh1TAnvIbWpglU0pI+JVSe4piiA4CamhPeQ2lTBKhrSx8SqE5xUjjSsoj/pFoNrigIIbmJKeA+pTRWsoiF9TKw6wUnlSMMqWk38SYFVbJeOWzQlvIfUpgpW0ZA+Jlad4KRypGEVrSaLJNn3iinnMoDyeGSbNVNOJB8p5Bqy0VfHNsWjkbJEtCIVL0/FyxLxVDIZT1Kk2PSJl0RloiNxV5bjV4AVpBw0aBDLSLAnEnfv3n399deHJOJy1iH8XH311cShzTffvLq6Gp6QTI+ly75gwYJjjz125syZDAZstNFGl1xySenrkFaQSXCuXWWVVejuq6++wrnMiOG9+OKLRMfp06c36T928t133y1LV9xA5kvJHFnhtP57MyeeeOKAAQOI9KRoLKwc2lh80VxttdVI41gfzmGXXnrpxhtvzM1gli5EiBAhljcsO7EhUshvvUaqJpmNER9jibcmls1tjqZSyeqKsprK8g5VKgOorKqmFJATcDSUbODXxQDCLeb9+/fv2rUr4ZaIvummm3bo0AGebGCHHXagddq0aTjfeeediTQCwpWnx+bmZkKX+WKiTp063X333d26dXPr+IBeGMBpp52G54ceeqhe/+kzQBZyxx13HHTQQVtssQWR78EHH7z++usJijQVLZctMPHRo0effvrp5FK77bbbrbfeOmHCBPIDmS8li9+jR48jjzySJOCLL7645pprttlmG3mdhqVmGVEWVyFChAixvGFZOikWelYW+vdsSRTSuYIzOxP/eGJTPpqIJ1Op8jJ5fZ6qenWgFRy+iRABg24psKXs0qXLJptsQrDBz8CBA4ko4rNfv34EaeqVlZWbb7658IQcSiAesALE7EcffRQTwKiIUvICgzBAlK2QVtwOGzbsH//4x8SJE0866aS33377jTfeOOSQQ9ZZZx0Ovvvss88777xz/vnn9+nTh+5ICEyMXBohI6dkFiCbzX799dcXXnjhtttuu91221177bU//vgjc0THLA7JH1nRww8/TLpw00037bTTTjC0yrVQS6w1zXUJESJEiOUN6p2DYnWJB0Nl15ayVMwVcpFcfuTU7NkvN6ULnWK5fL+V5l26Z+fqaCRBk5MvRGL5SCzqFJR2qzkVqzeBVTQkFZDJZB577LFbbrkF5qqrriI5INLIe/ME4E8++YQYfMYZZ8jbE9LEYVTMCWYjR47cY489Ghoa8An+9re/DR8+HFvJG9CBNCVWbpHS+GEYLS0tP/300wsvvFBXV1ddXb3FFluQphAX4SFJFP785z/LZxdSqZSMwePH45zS2qPAQ/roFAVfE58mbaogos4B1CH+22+/ffnllx955JHPPvsMRvRFU6w6derE6X/o0KGsMAsigV80zdqKPvD06BEFwU1MCe8htamCVTSkj4lVx5TwHlKbKlhFQ/qYWHWCa4oCCG5iSngPqU0VrKIhfUysOsFJ5UjDKvqTbjG4piiA4CamhPeQ2lTBKhrSx8SqE5xUjjSsotXEnxRYxXbpuEVTwntIbapgFQ3pY2LVCU4qRxpW0WoShFSycrM0wDoBI7YU8olcZkEmevRzLRNmV1Tk0omKpst3TW3UozxRKMS0Iqqc/oy5cQI83gRW0ZASkyibm5sJwER0Qi9R
LkSI7IalwvErgp77zzTr169Uhh3WI51ygBCzxbwKWXXrpmzZq8hGneAomyQDAjrL8dtMYaa7CRpUq1n3766XLLLUdnnn322QMPPJA3Sn1QrVlxxhln0AJPbdiwYbhUGV63LJHNH+1sueWWXbt25dVlS8RF0emnm4h1XWoV5oqGiTY+JhoWwxXZNHPXfKKtTZg8efLaa6/Np5tXiBZQMolyhdf1pixUZJ1dPH5voKBgyYSZl8mRBYa5ePbs2eeee+7MmTMPP/xwDp18gNdcc00m1uuvv57lhxg+1aU0Bz7kRLJ20hQLqgsKOom0KVc6UOb8zV7E26YL87L+VRtnRA5D7dq1Y4KzA01ZUKuAcJgghrtYfvnlWf45zb/xxhtnnnnmiiuuqPXMKEX7mDBhwqxZs4hZZ511mIW7dOny1VdfHX/88frPB6y88soMrx1VdfwoZZYT/W4abTIat9xyC2skDYb7JggTJd8H7ZSCcmH0WSIbJp4IBRb4tMaPRo8UtkSsRg0aNJALbm+TheK//+VNaNu27cSJE7faaitOxowVMYIX795772VDwCvEeL755ptKKeUvCB3jCE4Wq69+oiFWW201/ejnqquuuvzyy6tWrUpMqW5BuE39e431119/gw02wJVeXkjks8NV2BXBFVdcUalSJUaAqrIuLeikDVRJ8qE3MxzDFuq6667jDenZs+eUKVPoEp9TeOqpp3g5GdK999473MIfD52o+OgNY6TM5l3hitiMizUyYiDGCKQEqgyvG5NiNuNijYwYiDECKWYzLtbIiIEYI5ASqDK8bkyK2YyLNTJiIMYIpJjNuFjD65qowosvvsg8xeeUuQ9OOeUUDlhUderUiZmCOX3//ffv1q0bBztEw9KxOqOcddZZTNwZWF+rVKlCO5zMStJ8qGKe4tTIwmatWSHT+E477UQ7TG0EeyMNr+tNMauCDlsctbmELmoxhlwTzYVBgwbRQybojTbaiIWEcyGHWto855xzELnZNm3a3HzzzfYDfmGNWMFckXdfeuklGmTVHDdunPqZSXRdE0Ug0gikYIFVedNNN+WFOfvss03MRAJbTB76qquu+u6772psdTx1YygAVWykOnbsuN1227FHVK2sxXz33XcHHHAAq9e3336buWVZQGd4W7RosfXWW7MH1UVVyzOtVasW79vLL78sUXrGAt3m8THCl112mRSqDK/rFblK165d2fyxebUf21ttxmZcrOF1TQykYOkDH9vu3btXr179sMMO44VhGNkZsDfdY4899K0SYYp3E+UKr+tNWajIOptshNNdwWJA0t0yfv6hAHBFLIrrlhWZd0V8SqAqTU3wujEpZlFct6zIvCviU8yiZ8Q0NSEjBmJKTtwV81VpaoLXjUkxi+K6ZUXmXRGfYhY9I6apCV7XRApYpqr27du/+uqrnN5Y0Zm8mMI4eTz88MMcxZjNL774YkT9uzISlYu11vioY+++++6+ffuqCptHlxOWvvHGG3NWs5/gUqWCWRTaP/DAAxs1anTGGWfobM1knYlMGk3xuia6rll0CsxWKnA5tU/ZrHATXZcy69mhhx7ar1+/dddd9/TTT+/QoQN7LJripHv44YdzoD/33HMPOeQQROVyibJakyvyLosrW40nn3ySCymepqw1xVg7JoqMGIgpOQvG2BDdfvvtp5566g477MDiqvOuxVg6A8LD5a3Yeeed6Z6aouDGUBCU582bN2fOnJXSPyFMm4huDOsWARRY2BCpsluWRdR7yMaxc+fOu+22m54jVcAuoVmzZv/+9795RpaYaQELI0eObNKkCfuY999/v27duklybjRKzoKDU5Lmi1ydPTTvA6sve19cRD5EbopZsjKimgKva2IgBQtcF15//fVrrrmG3QCvH7d2zDHH8JGvWrWqeqWvItzEfGslJ9iBGDHxk2YWB7w3YK5wRSyK65YVmXdFfEqgKk1N8LoxKWZRXLesyLwr4lPMomfENDUhIwZiSk7cFfNVaWqC141JMYviumVF5l0Rn2IWPSOmqQle10QKshwgmIirVavGZGG1TP2cbitXrsyRF9dNdF01IihLLPm+K7quoOyKbgxWCksL3bBpKx+JK7yuia5rFp2CZsY0aQFc0U10XdWy5HCyZBgZNIthGKdPn87AoksErpVfzDKuyLtchZMuC0xJSkVrTa61Y6LIiIGYkpNrTYWpU6dyBP/6668HDhzISVqiUJlEbpynxuJN39SUcGMoSMyQiaGgp+OK+QEkBjt37lwWPO1RhGJmz56tr6kUbLpr4dprrz3vvPPOPvvsSy+9FJGruP1XZMlZcHBKktNhti+4vAwWY4WMJSwjpi0leF0TAynYkp8qv/zyi/7qIm8Om29EN1HWEuUKr2ui68aIS3Xp0iVppmJDXzM274pFi8y7Ij4lUGV43ZiU8kbmXRGfEqgy4mOMQEqgyvC6MSnljcy7Ij4lUGV4XTdFH1SOLMxZOv2DPsNMgsyeOotLtKo06rfWVAUWY2REb4y1g7WCa4nR1xUqeyMNr+tNyVSBGseW/AVrIZNirlI0YgyjXLP5YRRltSZX5F3at9lcovC2k083m3dFOAW4Lpub1VZb7bHHHlt++eW33357bhwxE4miH9WrLDETQ0GJFgD5GBUyTUEmUjA+6o9cq9WggXtFtwXsr7/+euqpp3JTd955J7s3dx8grDXhphtyaZ+Hrs+UruW9otm8K7xuTArWrkiBe9HO3r53US3W0i1RrvC63pSFilwrO6AFBQWLBfZJ/hvRnJWflysagbH6k4bx7306rVu3Pu6442677bbx48eX1ZO/5aklq1z6zmSgk17dpWfPnl999dUtt9xSu3bthQaH+W/6Y4uSUzHQHWH/xk9TsRuocPCmui8rBSE3HmWptZK0+KCeY0v+EgwThGaKDF6xwIvGEEq+g1f8nZR1rb8GLs0Rs0uXLnvuuecpp5wyc+bM/8z/ZbFShMNf2c/AtQJVdPvX9D8ncdVVV914440tWrRQ8O/pOSsu6b+nhX8kxW6gwmFLoApYQ3o8Sll8X/pFu+uCgiUZPjJ85GvUqNGtW7e6det26tRp7ty56N7dQMWHbk+cOLFjx46dO3dul/71w2JO+JNYPH5vwNDCZstbxhWBmPhIIz4lUGV4XTcl2cCnfxvk3XffffXVVwcPHjxp0iTe/uWXX97dz7rWMFGflmkpP/3003LpX/ZIQ8pMybthUQQijfgU19Ltb775Zs6cOZl/fGyJwk3JuyIQEx9pxKeEReF1TQykeGPCovC6JgZSvDHxkUZ8SlgUXtfEQIo3Jl40vG5YdN34SGOhKXz8gU8NC+euu+46b968O+64o2HDhvrjBxYpvK6JGVcEYuJFw+u6KS+//DLbmvPPP3+XXXbRT/o1IWQihdctV4zrhkXhdU0MpHhj4kXD63pTFioW3w1UFNgE/Jr+F8/23HPPJk2aHHDAAaeccsqpp57atm3bpk2btm7dmir9C1Qt9mGIufrqqxs1aqQ/t0nj0v9Y0mknIbJXeZRO91Qoqelf3dlmm22aN2/OsUDfc
5YqCgoKgiTTerpeshtYdtllDz744Isuuujnn392P1+LEdWqVWM3wG5Gv+wJpYqCP5rffmWx4kNXeRVk865wRSyK65YVmXdFfEqgKk1N8Lqy2gr07NnzxBNP5HNbu3bt3XbbbbPNNkMcPnz4a6+99sMPP1SuXJlPhf7Wm1A7ajPTGoWTTz757rvvpp0nn3ySeUG/TW1Z4KZkXLPoGTFNTdBVfvzxx7lz51aqVMl+792NyaS47WBp4ZdffpkzZw7lFVdckQB0oDBixIjNN9+8atWqgwYNWmeddeg/WK1wW8u7whWxKK5bVmTeFfEpZtEzYpqa4HVNDKR4Y8yiZ8Q0NcHrmhhI8cbERyoA4lPMomfENDXB65oYSPHGxItJQyleNyy6bnykAiA+xax0Pmu4fPwVKRRTcoKtKQACMfFi0lCK182kcBiQyAyQibFI4XXLFeO6ZtEzYpqa4HVNDKR4Y+LFpKEUr+tNiRGL7wYqBDyMt95667TTTmMrwIG+X79+11xzzSGHHNKhQ4euXbv26tWLpZFVs2PHjq+//rr+cpaen56lCx97LPrpp59O8KWXXsqCTYr0PwquDj/99NPee+9N3x577DGuqF5BKSgIYcSz+yH96quvVm6p7l//qlevHp3v27fvyiuvbPdbqisoKCg/+bliscC6/cfOYAV5it1AhYDV7oILLpg1a1b9+vXvuuuuWrVqsdBqFWQ7z+H4jjvuQCTgyiuvnJH+V8mp4uMBFHBVYOHH0iBunTp1ttxyS/098DQwOcpjCQBSONOrjKgW0r6UDv1uPLW0jAXFUwV8UCdNmvTtt9+qtwpWgMpY5bLLMV3oWt99990333xDT0jX/SqsUqVK22yzzRZbbMGBAEWXU1Ua8h+uqC5JBAJoQf0Hyti0y8mtEQwUFE+wOpA2loCCLrGg4J8En9PMtwKLEXRbuN9rFvwZLB7jmywF/2gGDRr0wQcfcKdHHHGETsOILIQsisulcFY+9thj+UgQOXjwYBY2rWEnnHDCQQcd1L9//08//fTAAw8krFmzZtOnT2dhu++++zp06HDDDTfog6TpgJQRI0bQFHuFNdZYg/jOnTuzonNpatWZL7/88pBDDtF/gW3UqFFHHnnkWmuttfrqqzdq1Kh37970Tetrnz59Dj744K+//pqUu+++m260bduWi+JyFdq88cYbaaRu3bqrrbbaqquu2rx58+eff550XWjo0KHt2rXjrin37duXpnDtP7Q/fvx4+nDMMcfQB+Kte9zaLbfcQlP0B9Zff31GgKZI4aLEUCD4scceI10/Ln322Wd32WUXxe+7776ff/457eilInjatGlchc5r/BELCv5h8GKLkr+4sVh3fnFhsRniZH2Yf6qTzbvCFbEZF2tkxECMEUgJVBleV/biiy9mJVtmmWXee++9MSmspj/88MOsWbPmzZuHnTFjxltvvVWtWjXCTjvtNNZgFKrWWWcdlNNPP71+/fqs92wgNthgAw7rc+bMOf7446lq1aoVK+js2bNZF1nF2TestNJKSy+9NIv0jjvuqAZZ5qdMmaIA+sO6yC6E1h555BFWUNrUH5sjsmrVqi+++CJ7EYKvvfbadI+RbDKopU3sKaecQm8JYBfC24W43nrrNW3aVH9mC9uvXz+6zZ6AHQC1aoGCyrvuuitdJWDYsGE0W716dQq6U1K++eYb9gFE0hT7gO23354GyVpxxRW7d+/OLRND/+GSSy4hhuALL7yQvZT1kOCNN9544sSJ9FDBjDObFQKee+65n376iftKH065n2DGNZtxsYbXNTGQ4o0xm3Gxhtc1MZDijYmPNOJTzGZcrOF1TQykeGPiRcPrhkXXjY804lPMZlys4XVNDKR4Y+JFw+t6U8Ki8LrlinFdsxkXa3hdEwMp3ph40fC63pSFiqyzxXcvfz88ho8//piFqlatWizVPBvKyy67LMsYqzJrGIXKlSuzrNauXZv4Tz/9lEULWM8IZp27+eabWUS7dev22WefPfHEE6xt1LK0p1up5C96pk88+Y/lH3fccVOnTj3xxBMHDRr06KOPsjmoU6cOy7/+A27EEI+lBQodOnTgLdF/Bv6ll15i1WTFvfPOO1lKaZNGvvrqq7XXXpvIM8888/3332flpqBLr7rqql27dmVbg/7MM8+88cYb7D9IvO+++/RjhZYtW7K6t2jRgsvts88+hH300Ud33XWXGreecL96UxHZ9LAlWmGFFTjuDxgwgDsdOnQox322O9wXw2J3Koi56qqrtt56a3YwdJUtAoP5ySefvPzyy/RBzTK2Rx99NJsY7o5Lo0D6WAoKCgqWIIrdwN8JKxZrD0vj5MmTWZx0SEVh0dKf0WalBzYHWM7BbA6oZbklGFj8cCmwjWCV3WOPPVjbOCjTrNZFXYL2cQlj+eRYzKn6jDPO0HXZCrCOUmZbMHPmTBTCcIHcNddc88EHH9xrr73oSf369XfZZRe6py8eiKRsf12cjtFh/eVz3RQL/Mknn1yjRg1dnc0Byz+RU6ZMIZ3FmLLilc4ldL+6ujoPWrZpgb0Cx3fizz///K222oqVm3FgTK644oq11lqL8u23306wxgTIJZg+s8PYaKON6MYBBxyw+uqrU8WuaN68eRrGatWq0eCll17KDVKlcVN6QUFBwZJDsRv4O9FayNrDsiSXAmu/tgKQRiVQhYuusq3xaqF9+/brrrsuLukotACqQmHZI55ajshci6Vx9uzZU6dOnTFjBgXWaYL1X9e2EzOQe/nll9erV09bBNZpItFpFsUiCRPo6jOR6gNWBfWcguLJZS3Xek8AFl1hSqectJjqdF7956DPEs6+p3nz5rSAQpvAhmP33XcneODAgdyUuyGoW7fuZZddRpvqKnuOFVZYgSviEkazFNQB9RMrt6CgoGBJo9gN/J1o+WH907+259zMooXCyuSuiLKswSyHqsUVWs844BLDAsnKx4G7cuXKrHxqQWFavzkTE9y3b99mzZo1adJku+2223bbbU855RQCWJ45sms/QTuEIdapU4cWKNND2lx++eVxCSCMGAKoQgECuK6hRf3777/v0aPHGWeccdBBB7Vq1eqpp54ihUSl0056l0knsazrJNJtcDcE6g98+umnpNesWZMACsQAvYL111+fyBL/lacAAFggSURBVFmzZk2bNk23QMdoYeWVV6ZNgoECkcstt5w1azdLZLINSdGl7b4KCgoKlhAWj90As/M/ElYpLMvP2muvTeHrr7/msM79ai1MQ0rg/vDDD5x9Kay55ppYUBXrGZYULagseFqPERWmq2DZE+CuscYaW2211dZbb73NNts0bty4adOmO+20E9sCWtDqqGAiUVgdaZY2gTIitUCYAihLtGCtqb179+YQf/LJJz///POsyuw8ateurXhyQb0CuxCo54Bi16IALN4Kk86FtBUQiNrxYG0DwSWUwgaC/utXDhEFAWQpQEOHpYyi6xYUFBQsISSTIf+r+JTm738cujWeBAsnlsXszTffRGQ9U4DQwjZgwADOvqxVrKx6eErHIgLrmbukqSrJT0Hk
YE2B5f/WW2+97bbb7rrrrnvvvffuu+++4447unXrtuqqq6pZxVPAJYs2WUp1ZEe0AHpFgEVyRV0XO3r06MMOO+yrr746/fTTR4wYcfvtt1922WX6vQHdi0vaZGlVNivRaiksv/zyWP0ggABuE1SYMWMGVfSQsrYCSlT/2VtwC6rVDcoSozBdlDCs7qigoKBgiSKZCflfwd+FLU4777yzfoXw0UcfnTdvHkuaAgTL288//8ziTblOnTqc7IlkDQO1AHIBRaJVCZa6jTfeGHHgwIGzZs3S4wdSWCbTjUTpG/60gVILuEKKsNw0MNFRrAx9+/Zl2a5fv37nzp1pn/O6fkivRMKskIYv0I4Uw5TNN98cq79WpGBZBmfIkCGUa9WqVb16deLdRrh68lXD/H/KKNFNxxKvAc/kFhQUFCw5FLuBCkGVKlXOPPNMVmUWtq5duya/OJf+A0KqWJ9YWS+44IL333+f9eyYY47hlMwallmhBQEmugub1rwOHTpwofHjxz/44IMskDoxo1SuXLlq1aoU9AWA2yxl1zUQaZBFVFecNm0aqzIuZQqzZ8+mgEtBvzDIFofrEszJnkRqKROgy02ZMoWbVS61onSl+TRr1myNNdaYO3fuI488YsG09tFHH7388svE77333twUOpRynAU+Y1UgkoKGWp1HLCgoKFgCKXYDFQIWxcMOO+yggw5icWK1O/LII/v16zdq1KhPPvnkmWeead269c0330zMIYcccuCBB7KOsm8AlFJ+BGRtvfXW7du3J+u+++47+eST2XmMGzduxIgRPXr0wJ0xYwZtElZKCMJKTORyyy234YYb0uBDDz304osvfvnllxzcuYUGDRpgx44dS7NvvfUW7R9wwAFvvPEGWSNHjvzss89Y1CmTuNlmm9Eay3nPnj0///zzCRMmsCpTpau4rLrqqqeccgopjz322OWXXz5s2DAGh6y2bdvOmjVr00033X///VnOCaBjkes61/ruu+8aN25cr149+omrXUJBQUHBkkaxG6gQsHpVq1bthhtuOO2001iSBwwYwMK/xRZbsMi1adOGhYrj+3nnnXfRRRdRS3CyF0h/X8/StQSCFMOWRiwpF198cceOHSm/9NJLLJ/bbbfdjjvueNZZZ+H+9NNPthZaiqxQrcpJUNo4uSuuuOK0adNY+HfYYYenn36ac/aee+7ZqlUrAvr373/44Yefe+65M2fO7Nat2worrDB79uxDDz20b9++WvWpXXfddTmdX3rppc2bN2cEOO6rcReuyDJ/zDHHnHnmmVy0e/fuO++8M5ub4447jg3NVlttdccdd1SvXt1+IqAepj1NUDltKSkQYGXslClT2BOwQeHSdKnYEBQUFCyBlKb4Co46iWX6ls27whWxKK5bVmTeFfEpgao0NcHrmmV1ZB1lTWJ5Y7EcOnTo119/TVX9+vW33HLL3XbbrVatWvqGnEW9SpUqnMspczhmJWbfsN566+kL/2XTP+lDg5zFOfeTtf322ye/dp/+F4fRWY9Hjx7NOj1x4kQWv5VWWqlOnTos5JyPtcP44YcfnnnmmXnz5u21114ssWxT9A8Khg8fPnDgwJo1a7Zs2ZJI6XSbM/3DDz/Mgkqv9tlnH47aLLcs8E888cSgQYOI3GyzzWiKDuByazTVunVrNiKVKlWiqyQ+9NBDY8aMIYCtCXeKPnXq1F69elHLrmL55ZfnvhDpPL2i8zTy1VdfMYZsL5o1a6b/uBEu904jRH7wwQfvv/8+2xR2GIj6aQi9eu655xjezTffnC0ICjr952ZnzJixyy671K1bl3bosLYLGkZs8rRSvK6JrmsWPSOmqQle18RAijfGLHpGTFMTvK6JgRRvTHykAiA+xSx6RkxTE7yuiYEUb0y8mDSU4nXDouvGRyoA4lPMomfENDXB65oYSPHGxItJQyle15sSFoXXLVeM65pFz4hpaoLXNTGQ4o2JF5OGUryuN2WhYnKI4v+SZio26mT+BlxXuCIWxXXLisy7Ij4lUJWmJnhdN4W1mQM6Cx5LqY7O6CxOlKlVJC6rHSuZfkZOJBsILM8SneVZuwFSgKoff/yRRtCBXBqxS1CmimDQKgikI7LtIJFIFJZMRAJokER0Crbu0kkUQZl4XUiN6EL0Tad2riiXApsJ0mmHGC6ETrNKRySGkzoi4NIHqhBx1Sa16jnoRriERkA7A/pDJH2QCKSrS0CAboEsrotOgwSoSxQIUMtYysLrmui6ZtEzYpqa4HVNDKR4Y8yiZ8Q0NcHrmhhI8cbERyoA4lPMomfENDXB65oYSPHGxItJQyleNyy6bnykAiA+xSx6RkxTE7yuiYEUb0y8mDSU4nW9KWFReN1yxbiuWfSMmKYmeF0TAynemHgxaSjF63pTFioy6SV+0sziQP4GXFe4IhbFdcuKzLsiPiVQlaYmeF1ZLfxAgZWJ1ZHFCVioqFUYixywVrEugnSCiSGeAKpYzLQWAulaNakinlqqKGvxUxUQSTtUEcPSiOW10BqvvrGOaueBS6JWfVpTMDpNmY6rPmiVZd3FSkfEcjlcLgFEEkaz5NIIurqhrpJI97DE6HLkEkwkzaozCsPSmhJBXSWMYSFFF1JXaVA3TgANoiNyXZpSrwhDp6BhIUx9Fl7XRNc1i54R09QEr2tiIMUbYxY9I6apCXkXKLgi5FNkM66JIhCpAIhPMYueEdPUBK9rYiDFGxMvJg2leN2w6LrxkQqA+BSz6BkxTU1QrTfSCrJpeEJGdN14MWkoxet6U8Ii8PkFPrxYEzMx4HVNdF2z6BkxTU3wuiYGUrwx8WLSUKpws0xWbhUWEF3FbKadjLhUly5d0pYXD9R12bwrAjHxkUZ8SqDK8LpYFfQIgYLWRV5uoKx1TquaFDedeESqtC6CWhNWpUhrP2k6rVLjFqZIFRSTNlP6YbxEdQPRui1d7QAuugoEg66iskEAYXYhkKh0rKpU4CqATox0SJtJvi9R48qyYbEwyiZKJ15t6hYoC4Ilutbwuia6blgUXtfEQIo3JiwK12UqKZUW1L0pJrpufKQRnxIWhdc1MZDijYkXDa8bFl03PtKITwmLgmWArTAFXobPP//8tttu22ijjaql/2lTRG9KRvRGLlQ0vK43JSByJPjqq69uvvnm+vXrV61aFYX74kOdSRRe10TXDYvC65oYSPHGxIvAPfLUONuMHz9+0KBBEyZMqJKiLUK+BUssSyx2A/5IIz4lUGV43UxKZmXSagda7RDdJy0UiS35Tm3SSoqyhF3CbV8tKMztgynWNyIpiLS9UjyNmBWmCxJ1FSuApaMohnLSdNq4uel1SuC68Wnbv+1jQLcvhYC0sdJuAIWyexUwXWUiFWzW8LrelLAovK6JgRRvTFgUeRfsiOBaIyMuWqQRnxIWhdc1MZDijYkXDa8bFl03PtKITwmLBgrrR69evbp163bCCSewoKIozLVGRvRGLlQ0vK43JSyyg2GG7Nix44orrrj++uujCzdGeF0TXTcsCq9rYiDFGxMvAk9typQpZ5555jnnnDN69OiXX375xhtvZHO
gPywLFhxoxxUXj90As1XG5l2xaJF5V8SnBKoMr5tP4amAHqehJU1VmUQFq2BW7VCwLIlYkAKZFRGrWteqoFy5Ald6yU/TsbQpV6RtJ4u3amXlqkGsVYHSVSVRhUwHLNisavP3qCqwFrAWn1QsKAoFW2vC65rouoEqw+vGpHhjAlVGxoVXXnmlQ4cOTKP6p57gTTHRdeMjjfiUQJXhdWNSvDHxouF1w6Lrxkca8SmBKkPuzz//fOWVV/bt2/fBBx+sV68enwJ9HAIpJrpuvGh4XW9KWFSH11xzza233vr000+fNWtWo0aNmBYQLUZ4XRNdN1BleN2YFG9MvCi+/vrrvfbaa8iQIY888sh55513+OGHcyi68MILZ8yY0bJlS25/oe24ImP42/RXUNHg8SRPKIVCSf0T0IVKThnEdMDbjlf0Eg6LbKSgXPzwww+DBg0aP358yS9YYvhPCufL22+/vXv37jfffPPKK6+sFXRxRPMkm9qbbrrp2muv7dGjh37rqFT9D4KVG+bNm9exY8cPP/zw7LPP1j/yYk9/3HHHUb7jjjvY2PFkCSvlxFHsBgoKCgqWLFgn2F6zfL744otdunQ5+eSTN9xww1Ld4gz31bRp08MOO+yss85im4vLoliq+2fxzDPP9O3bd6WVVjr44INLUvrvpI499thffvnl0ksvnTRpEkq5NgTFbqCgoKBgyULftM2YMaNTp07VqlU76qij/hmrprY4J510EuviqaeeOmfOnH/kd4o///zzLbfc8p///Kdx48ZsCOweKTRp0qRy5crffvttz549tedTVQzFbqCgoELDNA3p17rZrz35tAuq/vrZXJfWddM+Zr+Z/DX9245ut9VPJUJJXSTSCyYw46tAg7Qf36xSlAsl1SHpYoq3NowSVVa6uS6I6cV/GzpugXHjeOeN/6PQde+8887PP/981113rVmzJotoqW5+LVYFKFWkqG/SRali/hOnlnKa/dstULb3ASxMVZkyKAykQ3KlFGpNUbAL61/9+vV32GGH4cOHsyIqctGgNVm7kHRIO7KAiKufTUgx/Q+Hlj/99NMPPviA8hZbbLFM+sddDB4lt09nnn76aTYNJTWOYjdQUFBx4VPN7AbMMpzkmG5slqEgnTJTOeU/bwLKo6tjgU4y7+CW6hzmzp1bKs2HxRuRrJK/qOhy33333d57733zzTdPnz6dNiXSnzRkIRAMZLG8aWxLFfOncizoEUgvF9YI9qeffvL2atasWaVSCsF0RifaRbtoJDTO0N1zzz0U9thjj+Q+nSdiLmNCf3i4VkuBl43bYcRUli5obdSoUVOmTNGgWRbIxX722Wf6eyGlCudBlPz54yC9JKVhWF4eEynQQ4ZLf08MSOTS2FatWlHbrVu3adOmqapc6GNFC+rVj+nfcHNRFY/P7cxSSy1FZxgc2xP8Sbz++uu6zTp16lgHxNJLL92gQQMKI0aMmDhxYrm6sXjsBrjhgoIlDb35fOw5Bxx44IHbbrvtvffeqyqglo/6sGHDOnXqdP7552tCLNWVh0XI0qWZZwcNGnTDDTe0adPm5JNPRtQaQEGMHj0anZnRsjiJ7rjjjm3btmXB0DyFtVqzMShxQkrnzp3ZE7z22mvaLaHTjkhjPaiWvr355putW7du2rTps88+a50Byq+++ipjyw3qjFWqcLqaRwFAmfVs5MiRDz/88AknnMDixNSsGINl46STTtLf2BasOscdd9z222//zjvvlBrKUQotm1Lc/H66BYO7e+mll+hSlSpVOF9qUS/VpUssT+qhhx467bTTeF59+/Z1m+IpMyy8jU8++aQ9XPHxxx/vvvvuRxxxxMyZM5WCdeG+dtlll7PPPlt/GUxZFPTmsMN47733rr/+er1RepppXhJJzJgxY4455hh7TOgoO++8M09w6tSpitTOWH+t/MsvvxwwYIB0YYkiUwZdkUZ4Rd99993rrrvugAMOoMOZBZ4yD7djx46ULRGlZcuWhxxyCLevYFWlGSUybhil5xk+fDjtc4PLL788w+JGUkbE8tLyOAKNZCAyuavFBW7bbN4VrojNuFgjIwZijEBKoMrwujEpZjMu1siIgRgjkGI242KNjBiIMQIpgSrD68akmM24WCMjBmKMQIrZjIs1vK6JFIBJ84knnthuu+1q1KjBJ5aJTzoBTE+sN6eccgo6UzYLCYrVutbIu4888giTwKWXXkpZtbJGRlRBR8aLL774xBNP3GCDDejAGmusof/sBX1QLT1v167dOuusQ8cs98UXX1w2/dPXTz31FGGACNQqUbegYNmMizXkTp48Wf/prGWWWebQQw/lbMrV1bi1Zlg7QMduv/32bbbZZrnllqNLRx55pNWSSCMsA+i0ycSKYokGR8O3336bTQ/XIl6XAyIps70444wzdt11V/1xi169eiEK9bB79+5UseSgqLc//PDDuuuuS/Bhhx2Gax0WlHUhbHqdpJ9m3QK4ZaEW1BS1rKA8OC7HRa3zimGDdcstt7AVYK9AzH777ecmMhorrLAC6+Vmm23GCFClWmzv3r2pYtBI4bkQSVfN8vRXXnllGtxtt92mT59uV6Sg+2Lp/fe//73++uvTwkorrfTNN98QoFpZFlreNJqyRHZseqMeffRRhSW9/N//vv3221q1anGtDh06oBNvUMu1eD/Hjx//1ltvsanV7aMrgDLKBRdcwOu93nrr0chaa61Fg+iCBknfZ599ePkppxdMPhQ8cTrD43vuued019YmZaBZuTTiFlwLCjZX7auAnTt3Llsxbpn3ls0TkW4MnHPOOdTS7bvuukuNqMpiMtYKi8dugL5CZsgyrnBFbMbFGhkxEGMEUgJVhteNSTGbcbFGRgzEGIEUsxkXa2TEQIwRSAlUGV43JsVsxsUaGTEQYwRSzGZcrOF1TVSBw/fxxx/PfN2oUSM+4SwSVgXMxRtvvDEf+6uuukoTDVWqda2Rd/WD1UsuuYSyamWNjKgCExCXk6WH1atXr1y5sv6T0BLh6aefZrZibfjuu++UC/PmzbvyyitZSJguCSYMUW2yNnNQZgE+MEUFrBXMGubuv//+m2++uQ6Fq622WteuXRkxzadq3JAry+J0/vnns+TUq1ePQWD2t1py6Xbt2rVpk8M9/XQTscC9tG3bltl/9dVXHzJkiC5nMTYOHDG33nprOta5c+c0L2kcvvrqq/r169M+uz3FA/Ec2VddddVjjz1W7cgaBBBmYibGXCDMym4VFubMmcP+ibvefvvtuZE0KkExLHWkYw8++GD2K9wgp3YlqpZNJOnsBqS7idwOSz73tccee7CC0g7rIt3u378/CzwrZatWrRhzdFAiyCWSFljhON0yYgMHDiQAVEsLvGY1a9Zk6JSIyGtz+eWXsxFkf6lgNTh79my6RyN169bVDRKMrgDsRx99RBWPj+0ajdBD6QIXSHnttdeqVq1K+++//36pLq3lNhEZBLY1yfXSXEb1wgsv5N5ffvnlpMfOFSlzdyBX1gp5kXisMFGWW2MXwrNjNIYPH56G/BYDvGnUwrXXXqt2VGUxGa
sC62zxewMFBRUXJr6LLrqI0wAHUKa2Fi1alCpSPk2pVKkSOnMQn+pSxZ8MPcFqp85JjvMT8+m4ceNwqYKZM2deccUViMyVzO/KAhYDjncs2A0bNsRVOwZT2wcffDA4hfVV1gpmDXOHDh3KslSnTh3WdS5x7rnntmzZsl+/fsxxpaZz0FV2DJ06deII+/3339OTHXbYwfpDYcCAAegsS9tttx1jK92FWs6CzLbsG/r06YOSjsdvXybTCNDCjjvuiDtq1ChcnhGWsFtvvZUjOAWWRmwam/zgmW5ssskmTZo0USMu5H7yySennHLKqaeeirVCWe7nn39eyszxxRdf6Mt8DtCsiEm/nZ5zv7jYvfbaixvk9Pzll1+qCp1OsgNjsd9iiy1YkKQLqvbee+8HHniAFZTxP/rooxlk7uuNN9449NBDp06dylBwiGe74F4OdPtAC6zi7JMQR44ciVUkfejSpQvLMC+V+8sWxNMybxSPCZcWpPOhYFNFLoM8adKkzOXg+eefnzhxInf3+uuvjx071hKBcSZeCi8qmwZeJGKsiptiC0IuY0h/0qSkn4xk+/btuS63YC0IXMbz7rvvDjwyV6R9/ewvD01xaQq0z44kcyEX/SPDckBbFR+eAeS3M64rXBGbcbFGRgzEGIGUQJXhdWNSzGZcrJERAzFGIMVsxsUaGTEQYwRSAlWG141JMZtxsUZGDMQYgRSzGRdreF0TVWD6o/Dkk09yRGPmZX7HZS4Q+rFlgwYNOC6gG5Yua+TdRftuwCzQw/32249GNFmrw1dfffXZZ5+tE96IESPS1AT6/NhjjzHZUcAl0ixwtALuxbUZF2tkqtgQMDtzHmUjssYaayy33HLt2rX78MMP1SVdwr1iOoS/3nTTTXSydu3arMrogpQDDjiA6Zujs9LBEgUXZc/Bc6lRo4adBS3SwH344YcZH9YVmlIMB83999+fdI2bJQLd2HnnnadMmUJZ6WoHKGvbsVC0TrChSVp02pGFvn37aotzzDHHSIFMDAU2DSy3RNof88FyF4zztttu++qrr7q3LKtCr169VlppJRJ33333xx9/nAVSZX3frjBwy4Ar2DLSt3POOUeXw15//fUcu6tXr86KO2jQIGKkc9rmmH7cccdRwIVSW//9LyOcDsa/2IvgkmKW4IEDB9JDHh97Gh6lW2sWaJBuM55XXXUVZS7K7pbPC52pUqUK6XwkCVY8AQ8++CBvPgWlS8eqtwcddJC6tFDYEbrfu1gBS2/1wxS2XBwGaFYBQC0ue1xq6fPtt98u0WzGdUXW2eK7gYKCigsfaT6rHF/4uDJBrLnmmihAFYfa3r17U2DF0hFN+l8MszxzPYVx48apb8xQb7/9NlM50yU6y1samMBxh/3HCSecoKUoA+s3N1JeuIoskyNw7uzcufNbb7219dZbsyZ16NCBpYuZrnQNB7rKqL7yyivUbrPNNvrmPDl8/O9/n3322WuvvUZh1113pas8AsrKMrgoOxsuwdNp3rw5im5ftS6rrLIKa6p+wZsATpPXXXcda4ZOwBzgLIuAe+655+CDD2aHISXDOuusc2kKZ0e47LLLzLouASxXvC3KcjtPGViVsVyXuyhV+ODJrrDCCty+PVxEBuS5556rV68eLx6NKDLDvvvue++991arVq1///6HHXbYd99916JFi+7du2uQF4rCONbrihTYb3FoVmc0kvYK8RTYFqt7bn/0y3Sgm1VZsIo3atSIR/zII4+wMeLFo9lSnYPa1HcMGgEuyrsxbNiw0047TVn6akfts8l46qmn2JrkX29yCW7durX7pKzgWnHiiSfSSTWbgab0yaJB1n4Kbhi1+tYHkZ2TxFiS+1hM4ObN5l3hitiMizUyYiDGCKQEqgyvG5NiNuNijYwYiDECKWYzLtbIiIEYI5ASqDK8bkyK2YyLNTJiIMYIpJjNuFjD65pIAThS6JcD+Kiee+65fPhRWMOYd84//3w++ejPPPMMCigFrB1ZI+8u2m8RmtV1r7zySho58MADKdO9I444giWWMksXPeS0lKYm8RzEb731VjqfaSetX8A1m3GxRsZN+5KMzMiRIzmEMSOvvvrqd9xxx9y5c9EVY+0AIqsUSzWd7NatGy7DqxFmAeOOmEz102JES0ybSaBMJFBLjMpJu7lOsj1iZuewzj6AYBZF9isUdIbbc889SVcLH374Icfi2ek3PWrHbQ2RLLALKUzWdYVEgxaw0lmquTQ3zjqqxsFizOUcrF9Y0e8x6Oqszdtuuy1nYrqRXidpMJNIGLm8GNw4rLbaasQj8nSIVxhYisBVgzfeeCMX3XvvvSnz+I488kh2A1xOvyWjh0UwCs/36quvptm0a6XHpKaOPvpofUAYcOJVqwAS6R4WkVwVlOVaQGdfRSPt27fHZTvLu6FfkdHWnN0tutrv2rXrnXfeSRnS7AXa4SpWZQGuayLBlAUujagdWdrRdydsR9h2uzGUyVUtfXvxxRetNt+OWSssHt8NqMcFBUsUvPl8pMeOHTtq1KhKlSrttNNOuOh8wr/44gtWDsRatWoxX2vWW7RPyqJlCfWHwxMF/Xr5Cy+8wCTVtGlTmtXZdOrUqZqSuIvBgwcfeuihOjmpBYPZ+YILLuBcdfzxx2PLC8sVicAawDG0d+/eLCFvvPEGVfRH/XRRB4YPH86GgJMWHcZV2HvvvYfIGrbuuus2aNAAkQ4nOQtCC+j6Ip0YFfL3BfrtOWI4tH3//fePPvrov//9b3R9p2InOfYKN9xww+mnn67zepq6AIgci1nI4Z4gBNx9991ff/21JaoAKrM1UYHFI5X9cEd0np5zAtaCMWfOHHrIHoKtHjdFIxo0wYMGCgSzULHzUwB3fd5557Guuz3xQgDUrl2bLO2K9G/rmzVrhsKughh92UMVnwKuoq+agABrX+1QQHS//1AtPWcE9FB40EpXlQvXJYA9JVavd58+fWrUqLHNNtugqDPTp09nDKliD/rJJ5/weivLrgW6NCKfjtITCsLj4yXRbySUmnCgHf1HGvnIjB8/HsXCVJgyZQq11apV0y8bqmqhEBkbWhFglM3mXeGK2IyLNTJiIMYIpASqDK8bk2I242KNjBiIMQIpZjMu1siIgRgjkBKoMrxuTIrZjIs1MmIgxgikmM24WMPrmkgBmGiYUvlsr7XWWvpZMkcZ1tcTTzyRFYUpbPfdd0exRGW5LtbIu7/zuwHx1FNP0cOtt96aSZ/+jBkzhn7qBIPO4kGZybRdu3YfffQRVZo9M61x6tpkk02qVq3K3F0u9JMCzvGMBvM7sLRzKuIqXEuNG3ZFIIC+0cPNN99c3x/QT1bQY445Rl8tsLEgJpMoV3jdvDhjxgx6SIMff/xx586dme65EC0/8cQTDD7nXQJ4iByIOfWq29ZztzXKzz33nNawZO4uGwL0ewOk0JRyZQVLLO0QyUOxy4FFqoDONoswdqI8ILrNQfzUU09lHaLKgs0i0n/o378/KzqXIJG1bcUVV+T2eR+mTZtGbdJ6il1LqAXgNrnohhtuyO5njz32YNzU7FFHH
YXOXoTh4o3imM7WjTIpSlc7QFkPkXgdoElHdGNAromui037kvzaB+3wUk2ePJnXe9y4cTTFUOjXZdjlsGyzJ2jbtu2IESPQIW3st3Yg7X45fm+At8L9R5hua7SvP0dB2DXXXIPrxtAZfSe3ww478MisBTcmY62weHw3UFCwZMKn9KWXXuKD2qRJE/s56G233bbzzjsPGzaMz3yrVq00rf9d0EOWcArsUdi47LrrrnXq1KHDKPqn58yhxFx33XX0eaONNqKKbkOS7MDq9eabb3LWmTBhwldffcUyIGsFs4bcL7/8koMUOwnaXGONNbgQJ7BddtmFNvNXcWEVYZ2gQJ8rVapEx5g6u3btyqI1aNAgFgDWIUX+HnS/av/VV1+lt+3bt1fHND5M+hy46QlbJTYilpIkLwg659F99tmHdWj/sqGWmL333rtmzZqklJIdENlcalfBOZu79oYJOkkYO1GGq1+/fkOGDCnrP4JPI8Czfuuttw4//HBSWrZs2b17d+73wQcfZEPAY2I5Z4kqJZQBjfBGcVGG5aGHHmrWrJl+aQ6FO8KyY+MqN9xwA2f0hg0bouAq10CZPXs2lufIoJXURaJGjRp0iX3MTTfdtOeee66++uq4XHSllVaiVr8Ww6q811576cgOaZ4H9p2lhxTkgAMOaNGixdJLL11KWxDap53N0v8E+fvvv68xVxXlH374gS4RwwvAZyrQGQ/kLy5wz2bzrnBFbMbFGhkxEGMEUgJVhteNSTGbcbFGRgzEGIEUsxkXa2TEQIwRSAlUGV43JsVsxsUaGTEQYwRSzGZcrOF1TaQATKlrr702H+m7776bQwDHC6bUs846iw88x83llltu8ODBHNQ+/PBDqnQOAGsHiyjSmgWuiGjfDVgAUIVNkxLkphm/NSsLXJeVjDmX0zlTGB1GVCLLBj1nrX3mmWdOPPFEnb+VlTbmac1QZNpMqWBWpDX/Yfdw/PHHM19z9eOOO2706NE6KrmklyohVzoLs37vvW/fvix1DCPbLM6+Q4cOpdssPOxL6DMHPmq5TUs3vG5GpJMc/Vl9aRP7xhtvqOf0k6mcbnOGZgBZS7744otM50mXFZRJTGtKogp5N0O+igvpvdpyyy3tN+rBIlXgrhkQwjimv/766zvuuCMDnnY/ibdgWUSaZfPKeslWo3nz5pMmTVIk9rHHHmONR+d8zBaEltGVaEgh+N133yWSLRSbYJZhicCiS2d4zZ566qljjz2Wp6MUcNOBm9pqq60I5jb15YFqQdcSck10XSzQTzapvCR81tjOur/nf84559A+D65Xr16nn346uxxEEqlN2vK1Jky0QkZU+2onH4lOr+666y62C2uuueZ3332HIhj/xx9/nKFjW8yTIgzREtNmfmvHrBWK3xsoKKig8OaPHTv222+/ZTLi7Msnlin1ySefvOiii955550ff/yxVq1aG2ywQY8ePZg97YMdgJg8zGi0n6l141UuCzqpExvlc889V19g4GKZiGl5+PDhd95552WXXeaedZTrItHsQkmTkh+R9u7dmyWNYzdreb169XQvCrCwPFSxG5g6dWq1atU23nhjplE6+cEHH3Ts2JFlj/6vs846q6yyyo033jhmzBjdTimznJDIKsiST5lTu/6KHNBPzqzVq1dnaTz55JM7d+6sP4KkFBUySBclqQwU4A1Ls5N/pq9/3zh58mRWOD2+POi8Y4R9//33l19++R133MEyo9GAUtB8UAYMGNCuXTva3GWXXdi26hcjgJQ2bdqgcL8sV0cccYS+xC5lLgjx+pOF0KlTJ/efV7CdImDkyJH33nsvW1i2C6UKpzNy2Q3QDQrsYLhZmpJeCoqGRF5pCjwvXm/eFhpBRKlfvz52yJAhfADPP/98NnZJQnDYoeQHKYWWEcxHkirGebvttuNU8MILLygSy2vcvXv3ZZZZ5rzzzuNJJU1E3zKRxU8KCgoqLnzadbK5/vrrDz300Pvuu++GG26oUqUKS2wyWf6//8cE/cknn3To0EFuKW1B+KhfeeWVm2+++RYL0rBhQ/3DhG7dusmVLnCPPPJIrl5qxYeuqHmQpY7joNsHfRPOoeqqq65iGWA+lf4Hsummm7I9euaZZ7TKptPab5NpGP1RGjZV7K72339/DuiMEjPpxx9/rHauuOIKplf9c/NSTvmhHdK19rO2sTKVKtK/kKPfcOSYS/9/z1XKC7064IAD2J9x8uYdUydLdQvCy0YtGxpGY9111y2pZcCbwAA2btyY1ZoVvaSm0Mgee+xx//338yZQRnGHwtDuk2GhtlVKqSJFP7bgEpdcckmmfUP38tVXX/3www/0Z++99/6dA8vlsAceeOD2228vBWiTTTAFusprU9a/CP0z4NLAvuTWW29lye/atSsbVsaNDdZNN93EXvaYY47hk6tIpURS7AYKCiouHFs5D1F4+eWXmXEee+wx/YYz0xwiJ+Px48ezmDFls954P/yIVHFU+i7lWwdc1kL9YThVpSGlAkcrlgrvlG1oWmdRoZEuXbpkgpn3mbPuuusu9hZMqeWdm8LQGnDFpk2bcnrjHoX0UlAQtjuc/n/99dc+ffqwVaKf3AUt6IfWo0aNovGzzz6bu1OzpbRyQqKWNx4Tc7c1heUhsqKceuqpRx99dLl6/vvhQk2aNOHt4gX44IMPcPUo81DFPoYXT79ID6UKH+xpXnrppaefflo/LHCDlbvPPvsQ8MADD1SuXNl7RYWRyzPNf5/Ee86IsSfecsstMy+bIJcwGDZsGBvZzTbbrMWCf76zXNAOqyx94D254IILeEalivS/d0Vn2J1wFt9kk00yN/unYq8Q133hhRfq1Kmz0047HXHEEdgHH3zwuuuuu/baa9loqrfl6lWZL0GFQp3E6mHrDjOucEUsiuuWFZl3RXxKoCpNTfC6MSlmUVy3rMi8K+JTzKJnxDQ1ISMGYkpO3BXzVWlqgteNSTGL4rplReZdEZ9iFj0jpqkJXtdEFZh0xo0bN3ToUCZuTmZMTNJnzZr1xhtv1K1bd6ONNtJyZenkWjtY0LmBZU9ierUEuSZmXC5Ny6zozHTeGCwKjRP5zjvvsCprgtY0RC2bic8//7xRo0Y2cVs7wm0t45pFz4hpaoLXNTGQIsu5/8svv/zkk09YWuzPOsHUqVMHDBjQIIWe2+1Q5W2t5JQdA4wPGw4diy2GceNCnKTZKtmTlVWi6wqv600xi54R09Tkv1bMGs8q0rp1a/2rEDeSMgWYMGECA6Jf1pOYiTEXiyJcMROpAHBrhcWwkA8cOLB58+Za+STS4dmzZ3/66aeMmPSy0nknDz74YDYlrI4U7PWDslIkui5WCu/Ju+++26xZM+sMOs+O15v3h85IUTBl2bwrAjHxYtLQ/O9R6B5jMnr06Fq1arH7sd81prduiiWWJSZ3x/+luRUadTJ/A64rXBGL4rplReZdEZ8SqEpTE7xuTIpZFNctKzLvivgUs+gZMU1NyIiBmJITd8V8VZqa4HVjUsyiuG5ZkXlXxKeYRc+IaWqC1zXRXCvw4afAh5ZpkQJzHIo+
8+4nn1wVZE2xNikIuSa6LlYxlE3MxwCiOgYWr0iRzDLzL4prZZBrouuaRc+IaWqC1zUxkCLLhC5RCtBVi1EVuGKmyuvmRVlEqzXL0FFL2X2CWEsxV3hdb4pZ9IyYpiZPjUW3VatWI0aMGDJkiH5rwY2hIOR6x8F1zbrtyOYTpWDNBYuhrKejl0eixgp48y0SKxHMHTNmzLbbbrvpppv26dNnueWWU8+FN8VE1zWLrjecAlYF1dJJ9RDFRNm8KwIx8WLSUNoryHQAhQJDZMGZxLJE2vltmAoKCioa+pRSwOo7AMoUgI8xLlaf57IgRlmKjCSmZVDjdoCWNXAX2sLfiLqteRMr10CULUX/brxN2UVL/l+FloFKlSrdcccd1apVu/766zliluocNAiiJC2M+BELRFLFG87guDHpIyq9aXm4IyxrISv0XXfdteyyy950003urxn+HtST/KVR4u/3D4era0Dog5Eft3iS3UGpWOHRGyybd4UrYlFct6zIvCviUwJVaWqC141JMYviumVF5l0Rn2IWPSOmqQkZMRBTcuKumK9KUxO8bkyKWRTXLSsy74r4FLPoGTFNTfC6JgZSvDFm0TNimprgdU0MpHhj4iMVAPEpZtEzYpqa4HVNDKR4Y+LFpKEUrxsWXTc+UgEQn2IWPSOmqQm4LJ/9+/c/4ogj7rvvPv2NZFYXLTBlpbii68aLSUMpXtebslARVBgwYMBhhx12ww037LPPPtwIom6HggKsDHJNdF2z6BkxTU3wuiYGUrwx8WLSUIrX9abEiH/1nrSgoKCgoILAGrDzzjvfc889F1988ZAhQ3D1VfPihRY2GDVqFDfStWtXbQXQoRRUsDCWKusPS1UoeMwZm3fFokXmXRGfEqgyvG5MSnkj866ITwlUGfExRiAlUGV43ZiU8kbmXRGfEqgyvG5MijcmUGV43ZgUb0x8pBGfEqgyvG5MijcmXjS8blh03fhIIz4lUGXgar1s0KBB48aNr7nmmqpVq9p/gKCsFLMZN140vK43JSzCr7/+OmDAgBtuuOGcc87Zbbfd7PsAYlQQbgpk2nHdQJXhdWNSvDHxouF1vSkLFZP3wPyKj3osm3eFK2JRXLesyLwr4lMCVWlqgteNSTGL4rplReZdEZ9iFj0jpqkJGTEQU3LirpivSlMTvG5MilkU1y0rMu+K+BSz6BkxTU3wuiYGUrwxZtEzYpqa4HVNDKR4Y+IjFQDxKWbRM2KamuB1TQykeGPixaShFK8bFl03PlIBEJ9iFj0jpqkJrkt56tSp/fv333nnnWvVqhVOMdF148WkoRSv600Ji/9N/8vCr7322p577rn88ssjSsdajPC6JrquWfSMmKYmeF0TAynemHgxaSjF63pTYsTET5qp2KiT3hswV7giFsV1y4rMuyI+JVCVpiZ43ZgUsyiuW1Zk3hXxKWbRM2KampARAzElJ+6K+ao0NcHrxqSYRXHdsiLzrohPMYueEdPUBK9rYiDFG2MWPSOmqQle18RAijcmPlIBEJ9iFj0jpqkJXtfEQIo3Jl5MGkrxumHRdeMjFQDxKWbRM2KampBx//Of/8jVj9sDKSa6bryYNJTidb0pCxXt1/ulJ805Ny4XvK6JrmsWPSOmqQle18RAijcmXkwaSvG63pSFioxe8XsDBQUFBQUV/d+ABKDb+kV69d8KBeUi2R2UihWe/HbGdYUrYlFct6zIvCviUwJVaWqC141JMYviumVF5l0Rn2IWPSOmqQkZMRBTcuKumK9KUxO8bkyKWRTXLSsy74r4FLPoGTFNTfC6JgZSvDFm0TNimprgdU0MpHhj4iMVAPEpZtEzYpqa4HVNDKR4Y+LFpKEUrxsWXTc+UgEQn2IWPSOmqQle18RAijcmXkwaSvG63pSwKLxuuWJc1yx6RkxTE7yuiYEUb0y8mDSU4nW9KTFi8VuEC48xAimBKsPrxqSUNzLviviUQJURH2MEUgJVhteNSSlvZN4V8SmBKsPrxqR4YwJVhteNSfHGxEca8SmBKsPrxqR4Y+JFw+uGRdeNjzTiUwJVhteNSfHGxIuG1/WmhEXhdcsV47qBKsPrxqR4Y+JFw+t6UxYqshtIdgfyKz7qsWzeFa6IRXHdsiLzrohPCVSlqQleNybFLIrrlhWZd0V8iln0jJimJmTEQEzJibtivipNTfC6MSlmUVy3rMi8K+JTzKJnxDQ1weuaGEjxxphFz4hpaoLXNTGQ4o2Jj1QAxKeYRc+IaWqC1zUxkOKNiReThlK8blh03fhIBUB8iln0jJimJnhdEwMp3ph4MWkoxet6U8Ki8LrlinFds+gZMU1N8LomBlK8MfFi0lCK1/WmxIjF7w0UFBQUFBQs6RQ/KVh4jBFICVQZXjcmpbyReVfEpwSqjPgYI5ASqDK8bkxKeSPzrohPCVQZXjcmxRsTqDK8bkyKNyY+0ohPCVQZXjcmxRsTLxpeNyy6bnykEZ8SqDK8bkyKNyZeNLyuNyUsCq9brhjXDVQZXjcmxRsTLxpe15uyULH4bqCgoKCgoKDgX8VuoKCgoKCgYEmn2A0UFBQUFBQs6RS7gYKCgoKCgiWdYjdQUFBQUFCwpJP8i8NSscKj33uUzbvCFbEorltWZN4V8SmBqjQ1wevGpJhFcd2yIvOuiE8xi54R09SEjBiIKTlxV8xXpakJXjcmxSyK65YVmXdFfIpZ9IyYpiZ4XRMDKd4Ys+gZMU1N8LomBlK8MfGRCoD4FLPoGTFNTfC6JgZSvDHxYtJQitcNi64bH6kAiE8xi54R09QEr2tiIMUbEy8mDaV4XW9KWBRet1wxrmsWPSOmqQle18RAijcmXkwaSvG63pQYsfhuoKCgoKCgYEmn2A0UFBQUFBQs6RR/fWjhMUYgJVBleN2YlPJG5l0RnxKoMuJjjEBKoMrwujEp5Y3MuyI+JVBleN2YFG9MoMrwujEp3pj4SCM+JVBleN2YFG9MvGh43bDouvGRRnxKoMrwujEp3ph40fC63pSwKLxuuWJcN1BleN2YFG9MvGh4XW/KQsX/p58cyK/4qMeyeVe4IhbFdcuKzLsiPiVQlaYmeN2YFLMorltWZN4V8Slm0TNimpqQEQMxJSfuivmqNDXB68akmEVx3bIi866ITzGLnhHT1ASva2IgxRtjFj0jpqkJXtfEQIo3Jj5SARCfYhY9I6apCV7XxECKNyZeTBpK8bph0XXjIxUA8Slm0TNimprgdU0MpHhj4sWkoRSv600Ji8LrlivGdc2iZ8Q0NcHrmhhI8cbEi0lDKV7XmxIjFt8NLDzGCKQEqgyvG5NS3si8K+JTAlVGfIwRSAlUGV43JqW8kXlXxKcEqgyvG5PijQlUGV43JsUbEx9pxKcEqgyvG5PijYkXDa8bFl03PtKITwlUGV43JsUbEy8aXtebEhaF1y1XjOsGqgyvG5PijYkXDa/rTVmoyG6g+L2BgoKCgoKCJZ3ku4JSscKj/Yts3hWuiEVx3bIi866ITwlUpakJXjcmxSyK65YVmXdFfIpZ9IyYpiZkxEBMyYm7Yr4qTU3wujEpZlF
ct6zIvCviU8yiZ8Q0NcHrmhhI8caYRc+IaWqC1zUxkOKNiY9UAMSnmEXPiGlqgtc1MZDijYkXk4ZSvG5YdN34SAVAfIpZ9IyYpiZ4XRMDKd6YeDFpKMXrelPCovC65YpxXbPoGTFNTfC6JgZSvDHxYtJQitf1psSIxU8KFh5jBFICVYbXjUkpb2TeFfEpgSojPsYIpASqDK8bk1LeyLwr4lMCVYbXjUnxxgSqDK8bk+KNiY804lMCVYbXjUnxxsSLhtcNi64bH2nEpwSqDK8bk+KNiRcNr+tNCYvC65YrxnUDVYbXjUnxxsSLhtf1pixUZDeQ7A7kV3zUY9m8K1wRi+K6ZUXmXRGfEqhKUxO8bkyKWRTXLSsy74r4FLPoGTFNTciIgZiSE3fFfFWamuB1Y1LMorhuWZF5V8SnmEXPiGlqgtc1MZDijTGLnhHT1ASva2IgxRsTH6kAiE8xi54R09QEr2tiIMUbEy8mDaV43bDouvGRCoD4FLPoGTFNTfC6JgZSvDHxYtJQitf1poRF4XXLFeO6ZtEzYpqa4HVNDKR4Y+LFpKEUr+tNiRGL3xsoKCgoKChY0il+UrDwGCOQEqgyvG5MSnkj866ITwlUGfExRiAlUGV43ZiU8kbmXRGfEqgyvG5MijcmUGV43ZgUb0x8pBGfEqgyvG5MijcmXjS8blh03fhIIz4lUGV43ZgUb0y8aHhdb0pYFF63XDGuG6gyvG5MijcmXjS8rjdloWLx3UBBQUFBQUFB8ZeJCwoKCgoKlniK3UBBQUFBQcGSTrEbKCgoKCgoWNIpdgMFBQUFBQVLOsm/OCwVKzz6vUfZvCtcEYviumVF5l0RnxKoSlMTvG5MilkU1y0rMu+K+BSz6BkxTU3IiIGYkhN3xXxVmprgdWNSzKK4blmReVfEp5hFz4hpaoLXNTGQ4o0xi54R09QEr2tiIMUbEx+pAIhPMYueEdPUBK9rYiDFGxMvJg2leN2w6LrxkQqA+BSz6BkxTU3wuiYGUrwx8WLSUIrX9aaEReF1yxXjumbRM2KamuB1TQykeGPixaShFK/rTYkRi+8GCgoKCgoKlnSK3UBBQUFBQcGSTvHXhxYeYwRSAlWG141JKW9k3hXxKYEqIz7GCKQEqgyvG5NS3si8K+JTAlWG141J8cYEqgyvG5PijYmPNOJTAlWG141J8cbEi4bXDYuuGx9pxKcEqgyvG5PijYkXDa/rTQmLwuuWK8Z1A1WG141J8cbEi4bX9aYsVCx+UlBQUFBQUFBQ/KSgoKCgoKBgiafYDRQUFBQUFCzpFLuBgt+wHyOBW15CWAJvuaCgoEAUu4ElhZilzv7tqZX/eQTG4Z96ywUFBQULpdgN/PNh/fv111+fffbZXr16/fjjj97lkICJEycScNddd73yyiuzZ8+WHrOHWCzgRubOnXvfffd9+eWXJWk+VI0ePfruu+/+5Zdf/vb7tQ5QgP/+979SVAbVFix22HM0C3qmGbGg4G8h+WtEpWKFh65yepPNu8IVsSiuW1Zk3hXxKYGqNDXB68akmEVx3bIiMy72pptu6tSpE/ppp53WtWtXCksttZRq2Qd89tlnXbp0GTlyJOVJkyaxY1hrrbUuuuiiww47zML+7//+T4XkYinWuMSMK+JTAlVpaoLXXWiKLOLrr7++//77v/fee+utt55uhyqtr2eddRa1H3zwAWFUWUq+tbxrouuaRc+IaWpCxlVPsN99912fPn1effXVsWPH/vTTTyuttNK222671157YanliZClRLdZs+gZMWk9xeuaGEjxxsRHKgDiU8yiZ8Q0NcHrmhhI8cbEi0lDKV63LNHA5YM2ePDgxx9/fPjw4VWqVGnatGmHDh3WXHNN4nn9lK5I2bSlBK9rouuaRc+IaWqC1zUxkOKNiReThlK8rjclLAqvW64Y1zWLnhHT1ASva2IgxRsTLyYNpXhdb0qMWPy9gYXHGIGUQJXhdWNSyhuZcbE9e/YcMmQIz3uZZZZh3qEAqmVWOuCAA1q0aHHnnXcec8wx9erVe+mll6ZNm4Zdd911N9lkE8IIttYMa9zriviUQJXhdReaYsp1113Hanr66adn1vuff/751FNPZViaNWumO1WtWcPrmui6gSoj48J//vMfHlP79u3pxvHHH9+2bduNN9544MCBTzzxBCvHjBkzdthhh6WXXroUvWCzmUtkXOF1Y1K8MfGRRnxKoMrwujEp3ph40fC6ZYmC8i+//HL55ZezL8c96KCDVl999QcffLB79+4NGjRYe+219XIqxazhdU103UCV4XVjUrwx8aLhdb0pYVF43XLFuG6gyvC6MSnemHjR8LrelIWKzDnFTwqWCNq1a8fSXqtWrRNOOIGnzqKo92D27NlHHXXUzJkzmZtWXnnlFVdc8YgjjujYsSMHUOYsjqd6S9TIYoqm13nz5j333HP77ruvezsaBPZDU6dO5fBNWQf0v4sxY8bwpFjyb7/99u22246tAB3u06fPLrvswpO6/vrrb7zxxsX9cSyx8OCADd8ZZ5xxzTXX7Lbbbs8//zx7vk6dOvXr148t6aGHHsrOT8E2TRcU/GUUu4ElApYWDp2ffPLJ/vvvj6tvm5lxOJGMGjWqbt267ANQWDg5ep5//vkcT1mKDj/8cILR0zYWV1jguYW333571qxZu+++u529QLfGcsvNslsy5e/ipptumjhx4s4770wntR7wpFZYYYUrr7yyZs2aKN26dRs9erSCCxZHevbsee+99y6//PKXX3551apV9Ulce+21zzzzTN7Pk046iReAHQORxYag4C+m2A3882G6AaaeGjVqMPvYLPPTTz9xXMatVq2au0ZWr179zjvvfPfdd5s2bUri33tc/v1wa9zCs88+27BhQy35NgIUZs6c+dJLL7Vq1apSpUrcrPS/hV9//ZUzIr199dVXM+vBpptu2qBBAwrffvutjo+L+0NZMpk8efIVV1zBw91ll13q169fUlPYpteqVevzzz+/7bbbcHn0f+/bWLAEUuwGlgg0s2hpZ0NAmQILIbMP887SSy/tTj2UUVgdKVfwWcnWyzDTp09niWXC1Q/d3TsaMWLEF198QdXffps8kXnz5nFHjz322Ndffy0RF+hb7dq1pYwePVqK3ILFiF69eo0fP54Cu08eokSx6qqrbrLJJhTuvffeiRMnSiwo+CspdgP/fFhm0jUl+SdqrCJY6T/99NOcOXP0rYBEWWIA3QpJ9F+C+qmTsbDOgxSVXR1UVRYffPABs/Cee+6pmyqp6Z326dNnww03tB8TuLWlphdsvCSlncSW1AVRQMlxbkqFkpqDnco222xDAHuXMWPGlNQ0nV6lqUnuyiuvLFe1/xi4o8CQBsi8CYAi0cbcYohXAZT+l8EVn3jiCXrCPnuzzTZz3zRgj964cWMKPP3nn38+U1tQ8BeweOwG+GwULDIMoKY/TpzffvutDalNkWAiFjfN+3ugV6XSfKxvBjEo2qbghjtM7XPPPdewYcN69eqVpPn88ssvffv23X///fXj2wxqXNdy0XDp6u7GBVSlW9
DwysVab8uCPlx11VWcGtu1a7fFFltYMDpXmTx5MuVll13WrfonMWPGjEmTJmnQSlIcjAYpGnMhBetu/twqlf9ieIJDhgzh6lWrVq1duzaK2xM6pi0p9O/fnydeqigo+EvgxVucDhn68MjmXeGKWBTXLSsy74r4lEBVmprgdWNSzKK4blmRck2cO3cu09B999334osvnnXWWUccccS5556LOGvWrJdffpl5h7mpRYsWmk+33nrrM888k1y1ZgWzKCIjBmJKTjBF9scff7zhhhs++eQTXAUwoa+yyir77bdfkyZNCIApU6bceOONEydO/PnnnzlPr7322txO5cqV3XaUK/vTTz+xgh533HGnnXaalgfdKVVvv/323nvv/dJLL2255ZaqctO10rNK0f5XX31FWTrXOuGEE3baaaehQ4fSEw5zVsXKvcYaa1x66aW1atVCYdgJoAW1yTJ/6KGHalsAdi2BS8d+/fVXytqdSKcwatSo7bbbjna22WabV155heXErVU7ZvOiIsHrmhhI8cbERyoAvK4ex5VXXnnzzTfvu+++xx9//CabbLLMMstkIhk3Im30wNLffffdhx56iBeD0UPkAV1zzTVrrbUWZZ4Oeywe37x585TCa9C8eXMKDLJacK01GxCTa6d43bwo++STTx500EGU11tvvUGDBi2//PKE2e1wF++8884OO+xAJAHvvfceAdTmWwtc0XXNomfENDXB65oYSPHGxItJQyle15sSFoXXLVeM65pFz4hpaoLXNTGQ4o2JF5OGUryuNyVGTP5vcYEPjNm8K1wRm3GxRkYMxBiBlECV4XVjUsxmXKyRESmwmHH2veeee5o1a1apUiVmFp43C9V3333HhHvIIYcw83LWRGcBa9++PQsV3HLLLeSqHbdg1siIgRgjkCJLh8eNG8f0Xa1aNXoLrH+fffbZnDlzCFAMM/6ECRNOPvlket6xY8exY8eSlWkHuH0isW+99dZKK6300UcfUVatVZ1xxhkNGzZkV4SbSaegMPYc5LId4XKsH4zkww8/PHPmTKroFRsXRpJ+UkvVTTfdNGbMGPpDLrDlGj58+Prrr8/advHFF3/77bfWPrhlyHcA1A5PjUvQSO/evaVbTN5mXKzhdU0MpHhj4iONvMvdMVws1f/+97+rVKnCbbKW77rrrn369OG5UMs4C5VLmSlqDXH27NmTJk26++67yeVB1KhRg3eGWmAvyKH8jTfeaNCgAVVMfDw+VVkLrs24XtHwuhlR3eYe7S+7NG3alBcD0Y2kPHLkSG6fgJo1a7L/87bmdU10XbMZF2t4XRMDKd6YeNHwut6UsCi8brliXNdsxsUaXtfEQIo3Jl40vK43JUZcPHYD9BX4XJnNu8IVsRkXa2TEQIwRSAlUGV43JsVsxsUaGZECsJJx8OXMwZrKVAhMSUxMwGTEOqp/WNiyZUuJgK5ca0QFs0ZGDMQYgRRZ5m5NnRzHNTO2adMGkVp1LElLwwjgBDl16lSlILrtgESaYpnhyMX8S5XFoE+bNm2jjTa67LLLMrecJKcFg+BvvvmGExu7gerVq48YMQJFMaqqX78+vaXqyy+/ZMxVhaVl1rmtttqKHQO6qgzFGHJNtBbYvdWtW5dLn3XWWW7jVsjYjIs1vK6JgRRvTHykkXeBm+I2f/zxR5bAyy+/fIMNNuBmYd11173xxhvZHTLI7jMy5FIleBO23HJLHgS7gU8//TRT26lTJ+0GevTokbaU1OZtxvWKhtfNi+LII49M9gL/+tdee+1VkhaM5PYVwC0MHjxYtZkYr2ui65rNuFjD65oYSPHGxIuG1/WmhEXhdcsV47pmMy7W8LomBlK8MfGi4XW9KQsVWWd/+9qt4J8H0x8rfaNGjVq3bq1NFRMNIlZofgQmX1wsZQoS/3rUATj66KOrVatGYejQoZwR1XNVcSPM+y+88EKHDh30FYL0DOhkzZgx4+WXX27VqhUH61LFfDiKcY7n0E8kbZZUB7fllVde+dBDDyVszpw5b731FgplIIaqPffcc+mll6ZXQ4YMsSEV48eP//zzz0844YSMHs8DDzwwYcKE3Xff/cILLyzrZhdfNIZYBnCdddZhk/fmm2/27NmTDRzH/TPOOKNJkyann346qzsxpZwFIR0YXl6eSpUquWFqXPCYUCioKgamyHJRSvNBT6ZMmaJyWX1YdtllVSCYd0nlgoK/DP88WNFQJ/XZls27whWxKK5bVmTeFfEpgao0NcHrxqSYRXHdsiIzLvb+++8/9thj0bt06cKiooCvvvqqYcOGHJGbN2/+2muvIbopsnkRRWTEQEzJCaZkqljIDzvssEceeYQya0ObNm1sp0LV888/z+289957HJq9reEyO1Ng5d5jjz2GDRumf68PiqERFp6BAwe++uqryy23nFuldLc1glFYkNhXcYRt0aLFiy++qB9sqylOcjvttBMz+L777vvYY49RZYnXXntt7969GV77zYb0Ugle10QVJk+evN1229WsWbNPnz6rrroqCuOQiclY9IyYtJ7idU0MpHhj3EjKWDZe9FPlfAx4XROtwNI+b948NlJ6LYFt3yeffLLaaquFW2PrwFtRvXr1999/nyfuNn799dfrP9XRo0ePQw45hILVulbtYL/44ovjjjtOZYlujDAXu/zyy/fq1Yt3KR8jhf3cSy+9hHLggQcSieJGUh43bhz7Iem8t/prH24MeF0TXdcsekZMUxO8romBFG9MvJg0lOJ1vSlhUXjdcsW4rln0jJimJnhdEwMp3ph4MWkoxet6UxYqJscV/q/iw6wKzO9m865wRWzGxRoZMRBjBFICVYbXjUkxm3GxRkbMuHfeeWfysqS7AUT49ddfmX1WWGEFRHYDEi3FbMbFGhkxEGMEUvJV/fv356hH91q2bMnaYLX0/OCDD2avQMFEK5hL7S+//HLKKaewmlq6xcyaNWujjTa69NJLrRGrwlpBlhigEZZ8+sO6zsqEkuSkI8ncrc0B6xCbBkucOXPmlltued1118mVNbwu1sC9+eaba9euzQrHvZgoa4WMzbhYw+uaGEjxxriRlH/++eeuXbtqcvn90A7TE1YwsGxe3StC3tU/0tNTUK3FsC1TyzE/KYChQ4fq0soKQ1erVq06e/Zs5aodIRerPwNKg+3atTMxDSlhPymAd9991xvjdU10XbMZF2t4XRMDKd6YeNHwut6UsCi8brliXNdsxsUaXtfEQIo3Jl40vK43ZaEi62yyOyi9gBUYdTLpbhl7nDQqwRWxKK5bVmTeFfEpgao0NcHrxqSYRXHdsiLz7pNPPskcxFNfXL4boDB37lw6xqTMPEv3ttpqK6rgm2++2XTTTR9//HHO6MzCbgoFc7lZ1m8W4yOOOKJz587oyeZ3/u28/fbb++yzzwsvvMBxP1OVbw0LiN27dz/mmGNQLr744nPPPVdZVB1//PGcib///ntcRviiiy5CJOyNN95gDRg2bFidOnWsNWKE1zWRAjsA9h/t27fXIZXL8aG1rmZSzKrWFVGE1zUxkOKNcSMpw+DBgxlYUzIx4
HVNVAG4TfYWPPq+fftqwlpllVU++OADrG5f5Fv7o74bgPHjx5933nlysWFIZI94++23e78bUOHwww9/+OGHqWrduvUzzzzjXhEof/bZZxtvvDFl7pF75yXPxIDXNdF1zaJnxDQ1weuaGEjxxsSLSUMpXtebEhaF1y1XjOuaRc+IaWqC1zUxkOKNiReThlK8rjdloWLyyeL/Kj7J7sW3nXFd4YrYjIs1MmIgxgikBKoMrxuTYjbjYo2MmHE5Z+ub9sXouwFgetXszxE/OZ6nXHXVVdtuuy3HbgXLWsFc6Nev34orrmj/mgCrAo2wlm+zzTZsONLABIvxuuLrr7/Wur7ZZpvpnxUQ8PHHH6+++urst/RN/gYbbDB9+nT94ttRRx21995761hPeqY1r2sWeEA8mhkzZkh0a/Ou2YyLNbyuiYEUb0x8pJF3gfERjBiWu77xxhtZC5dO2XDDDa+44gp2rlTl00ulFNw/8LsBoD9gZbOGW6vfhXTbEarFXnbZZerAHnvskeSklILSd9L+k0WrrbbapEmTVOvGgNc10XXNZlys4XVNDKR4Y+JFw+t6U8Ki8LrlinFdsxkXa3hdEwMp3ph40fC63pSFiqyzv220KzJMsgW/k7JGsmIOr3rL8Z0llg3B448/Pm3aNN7XWbNmPfbYYwcffHCVKlXCPef9fuWVVzggsjyXpPnolxCZlDnJlaQ4ateu3bJlS677xRdfcHqjQJdYXTbZZBNWfaqIGTNmzDvvvEMVq/jzzz9/xBFHhPuZh3ig5c8//3zXXXetWrWqlFL1PwvuSxs+VtPBgwd37NiRrd4ZZ5zBvW+11VYPPfTQW2+9dfbZZ+s1iBmEQAxVMS0Ag09/xo4dOyGOiRMnslMkK98+CjrouwrcKVOm/Pjjj5QR05AE7o4dQNK/dDdQo0aNUkVBwV8CL97isRsoWAJhx8rqu99++1H+9ttvn3jiCQrvvvvuN998w26AmTSNKpN58+ax5B9wwAGcL0vSfDjNf/bZZ/pPGJcLPjOHHHLIUkstxWz+zDPP0IepU6fef//9J510ErP5kUceieUI++ijj7IX6devX82aNXfYYQd9K1MutE6wGrHP0Af1nwp3x4PmZHzyySdvv/32d999NwPYoUMHNgEDBgxo166d/hEsAwKlnD8frvjpp59uvPHG66233rpxsHdhA1HKXxDeChps2rSp3gRW/dmzZ3PXmSfL40YBmtLvnJYqCgr+EhaP3YDmgoJFg3lHBe9IVszhpVdMnUyjhx12GDMjLsdE1mCWXvYHK6ywgmbSUrSDvvXCsuSPHz++VatWmTCqnnzyyQ033JCzWkmKhqYaN27MCkGB3cCcOXMefPDBOnXqNG/eHIVDLc1SePHFF9my9OrVa4899qCrNv6R0AKQVa9ePXYDuiOUUvU/C+6Lp1ypUqUtttiibt26Xbp0GTx4ME956623ZhtnAWKhg2Abr8yY4yJqrS1JQRSmFymSn376CctV1IKh62I58Tdq1IjyDz/8wOuR78m4ceOw3PWuu+6qFOkFBX8BvG/FdwNLBJpJF0c4J+mHwcOHD+/bt+8bb7xxaPrHfTVdetHLzZK/2Wabrb322iV1PnPnzn3hhRdat25t/7CwXFSvXv2AAw6g/YkTJz7++OM9evQ48cQTEekPuxb99YIpU6bcfPPNAwcObN++PSmBrobZcccdWSApLHILixFs+wYNGtS5c2d2V4v8uurnR3oBVDCrAjCYWrYDEMZWjIf7SBwPP/zwAw88YH8woCz07ZF+QRLXfay//vrrkCFDKKy66qotW7akA9ILCv460g9IRSfdfHt+8cF1hStiMy7WyIiBGCOQEqgyvG5MitmMizUyYibyiSee0Ay7GP0WoSw89thjHJiYOldZZRW6ql/9029mWaQVsFTNnDlz3XXXveqqq9wYVQ0YMKBmzZoffPCBtSDcyLwr9JtuH3/8sf5LAfSHpct+nRDYsugPIrHVaNKkyY8//igdSJc1vK4sDZL74Ycfjho1isWD66YhC8RkXLMZF2t4XRMDKd6Y+EgjkMItg/2qoNmMizXyrv5KNG9Lv379LAWmTp2677778qpTe9ddd0lUbca6Lv3Ji3KFuep8IAYL33333Sbpf7aYzuDaY4VPPvmE15LuXX/99WoK0GUNr2ui65rNuFjD65oYSPHGxIuG1/WmhEXhdcsV47pmMy7W8LomBlK8MfGi4XW9KQsVWWeL7wb+sfB0ZeH777+Xa99nMu8wN7HSsEuYM2cOrl6LJLOCwRGZgxoFOtyhQwf9EYKyjo+6C45Z06dP32233XTXQuW+ffuuvvrqm266KS53rapIuChZDRo0aNy4sQaQ079+kCE22mijbbfdlhgGtk2bNvoDiOW9CtDCrbfe2qJFi+23354Tqnr+z4axZaAW4XcsXPSv8ngBnnnmGdZUxg0mTZp0wgknTJ48WZcYOXKklttSTtmU9Y7lodmYYDbfZ511FpHvvvvuhAkTyLJJuWfPnrNmzdpmm230/QFVpZyCgr8MfWAWC/jYmM27whWxGRdrZMRAjBFICVQZXjcmxWzGxRoZkQJTjM6ywGyo+WWvvfZiQ4AOV199NSLzL4cSjiYWTK7acQtmjYwYiDECKWVVcRd06cILL2SKrF279pQpU/Ixrks8K/Gpp57asGFD7tS9Hapmz5694YYbXnTRRehqOU1NUIxs3hWkAAP14IMPMnSrrrrqF198QQyoNaoef/xxqujquHHjVEWiaw2viwWaWmONNbTAcCM0brVmM67ZjIs1vK6JgRRvTHykEZ9iNuNijYzLoA0fPpyXmXED9oK84ccffzwjed1111FG5NHUqFHjkEMOOfvss0nxNr5Q0fC6ZYnA6/Hjjz+efvrpfOjatWs3bdo0XlHo06dP9erV11tvPT6GincT5Qqv600xm3Gxhtc1MZDijYkXDa/rTQmLwuuWK8Z1zWZcrOF1TQykeGPiRcPrelNixKW6zP8va1Vk6GjG5l2xaJF5V8SnBKoMrxuTUt5I133xxRfvu+++hx56iBWFOWj8+PGjU5544ok777yTl4D5kWn0rbfemjlz5pdffrnKKqswXZbVmlwRH2MEUsqqAnrIhN69e/eDDz54n332wc3EuC7MmzfvzDPPbNu2rf41oMuIESNuvvnmq666avXVV8e1psDbmrkuiLVq1erVq9fuu+9+2GGHodAOoAP7gKeffnqHHXbo0KGDtY9u1vC6WLJ4Ln379uX4iNK8efP9999fTWXacd1AleF1Y1K8MfGRRnxKoMrIuyuuuGKlSpUGDBjAKz127NhXXnmFF/6SSy457rjjhgwZglulShV2V+3bt2dDQNlt1lpbqGh43YCIZUfStGnTZZddtkePHi+//PLkyZPvvffe66+/nqd8//33169fn4A0L9SakYlx3UCV4XVjUrwx8aLhdb0pYVF43XLFuG6gyvC6MSnemHjR8LrelIWKpSlMfsVHPZbNu8IVsSiuW1Zk3hXxKYGqNDXB68akmEVx3bIi5Ur/+OOPp0+fripgrsEqRlsB
hVHApbDJJpswpSoGqypzLR4yYiCm5ARTyqpSr7AXX3wxi+Lmm2/OnkYzZj4Fy6aHRfTcc8+99NJLOWwhEm+R/fr1Ywq+/PLL9SuEiMoFi5GScQWuClyFsyZbk4033pgARSqYrt50002NGjVq0qQJlyZYtWbVAnhdWdr/4osvunXrVrly5VNOOYXNEKLVWqLrmkXPiGnzCV7XxECKNyY+UgEQn2IWPSOmqQl5Fxh/HvFzzz33ww8/8DK3adNm/fXXp/aFF17o37//QQcdxHMhS6gF16qdsJhcLMXrliW6Ls935MiRbPh4ymuttRbbVraPequpBYv0tlZycjGuaxY9I6apCV7XxECKNyZeTBpK8brelLAovG65YlzXLHpGTFMTvK6JgRRvTLyYNJTidb0pMWLiJ80sDnhvwFzhilgU1y0rMu+K+JRAVZqa4HVjUsyiuG5ZkXlXxKeYRc+IaWpCRgzElJy4K2aqBAqzvB2btMqiZ1Kw2j2gmGgbIKqYhSmgmEiMkGtixhXm0hSnTxrBlbVaqoAC7qLtBijTggoCXVexGCW6rlnFu2LaRoLXNTGQ4o2Jj1QAxKeYRc+IaWqC18WKkprC6P3yyy88DgLcR6+Ca2PEpMUUrxsWKciqoLK9QplI1xVe15tiFj0jpqkJXtfEQIo3Jl5MGkrxut6UsCi8brliXNcsekZMUxO8romBFG9MvJg0lOJ1vSkxYvGTgoXHGIGUQJXhdWNSyhuZd0V8SqDKiI8xAimBKl5TIYWpM4nwpRAjK8UKFskqazOvJQqL8brCddWIXQKsNtABucLruqKtW1jddSbGdQNVhteNSfHGxEca8SmBKsPrYrUb0+DL6rnLWqQVXJtxvaLhdcMiBbpklg2K+5Qzka4rvK43JVBleN2YFG9MvGh4XW9KWBRet1wxrhuoMrxuTIo3Jl40vK43ZaEib2BpYi0oqJhoNk/e1HQV1/FOVV4sPh+GsvTSS2s9+P3QGk15W0svvui/Hu+2rMIiN7VkoiWWgqyeBZann9b//STvR/qU5VqhoOBvpHgLCyo6zJulUhzljV8E0sn8r7hKqVTwO9AwFoNZUBCm2A0UFBQUFBQs6RS7gYKCgoKCgiWdYjdQUFBQUFCwpFP6RdaCgoKCgoKCJZaK8N3AnPHv9O754EOP9RtREkr8OPXrid9M+6nkLXbMnTJp4tdTfyx5vw9nKH6a9k2ZzSaXnDzj15JXUFBQUFAQw7/+9f8BxSDs+LcpikQAAAAASUVORK5CYII=)", "_____no_output_____" ], [ "![ejemplo.jpg](data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAukAAAFDCAIAAAAIyQdJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAPAWSURBVHhe7J0JoI1V18cbzfM8kykikaIoTYY0SCoNVBJlSEoZS/UqMs+zqJCkVIhkSBkyJBKVRMaQeR6i+n7n+W/Pd+5wzj3XeA/r9/U97372Xnvttae11z73XPfi//777yLDMAzDMIwo4RL3v4ZhGIZhGNGAxS6GYRiGYUQTFrsYhmEYhhFNWOxiGIZhGEY0YbGLYRiGYRjRhMUuhmEYhmFEExa7GIZhGIYRTVjsYhiGYRhGNGGxi2EYhmEY0YTFLoZhGIZhRBP2NwEMw4h6/v3334svvpgEDo2E79YuueRcXs9klZ6gTD9x1mA0NCA0jTEaE3LO7eAYxqlga9cwjKjn+PHj//zzz7Fjx0iT4FXpcwhRAmAJaUxSWpkSOGsQpihgomniFZ4KZfQ0jGjEPncxDCPq4TzeunVrz549d+7cmTt37iZNmmTPnp38Sy+9VAJnH+IV2Lt3b/fu3bdt25YlS5ZnnnmmYMGCRA9n+aOXjRs3DhgwgGCO4Onvv/9OnTr1Cy+8kCtXLsw4y5YYxunCYhfDMKIeYpd77rnn66+/1sctdevWHTZs2GWXXXYOfyyCa8WYZs2avfvuuwQNRAnVqlWbMGEC4dRZtmrp0qU0feTIkQMHDmBGtmzZvv32W6IoLLHYxYhSoulnRvgCXWXwUzz1etZiL6/lALQOal0JpZ1cWCQsDT7kuOII8Kur9WB78I8QLACJUn4hoPHRyAQPjp/P088noRxlqlawQCxUJBk/7aMcJ3oK+Npsfn3+/vvv1atXHz16lDRj8vvvv5NzzgeHyIC4gWnSK1YRzSQYLgRm17M8Evt94TCUKlVqw4YNQ4YMkbDsscDFiGqiLHbRE6/NrgscDp77VumZhnZpNLD1T5ihnS8beCXhCYbDqx2ojjAJKiqh0ggJrggkQPk8KSKC8YfF3FNcGBx/7pQDZGo8lVCmP+PAqCJPKag0Loh5s+HmVGnlK0fPUwSd2MDTV2tcfvnlxYoV4zxmQC655BLSyZMnd2XnCE0NcYNMwraiRYtGbpWqh59fSsMLCFpPlixZoUKFSEve3IIR7UTZd3Xx1/q4NXAmnEifHdjtOr22bNly8OBBGQDkyxFE4kSQcdW8g0f2hzkL44WKtKi6apQ0Cr1T9Z9Nmzbx9EslYATD6DGDCxYs0DVdMG7Lli3TcGlCQa88A3Mc5O5DjSr569at+/XXX5kCzTK3/0WLFv3111/khKqVWHRCi2CrLmQY6u7du9esWfOaa6554IEHXn/9dTIZH5WeQ1577bW6deuWLl26WrVqWKhZc2Uh8Oc0wclFgJ0eyRpAJhIxw4gWzt7Zf1rQzl+yZEnt2rXvvvvu8ePHJ+gIThc0ROzy7rvvNmvWbMeOHXIEOI59+/Z9/PHHbdq0CT4IQyFriX4GDRp077333nXXXX369OFgU2kkqPs0V6VKlUcffXTixIlHjhzRMcnVE6tGjRr1/PPPr169WpmumhEEY7ht2zYG/88//3RZF100duzYVq1aEWEwhsGLShPN1BPrPPTQQ/fff//UqVPDDOzIkSM5OBGgIkHG4cOHGzRoMHv2bOk5LTOCecwvh+LkyZODTb2QYRyuuOKKMWPGzJgx4/3338+TJw97E1zxuUAG5M6de9iwYbNmzfr000+LFi1KvlZCGOiLPjoNP7mUIoNkhGsgwXYNI4qIpu/qskvZfjzLly+/YsUKLM+SJQuu6uqrr3YSQXBIcOXlzOCUinfTquMU4V+IA1Cl/GCQQQ9PxPbs2dOyZUu0DRw4MFeuXORzLE2YMIFX7uvXX3/9l19+mTp1alczCP+4Qgm+5vvvv2/atCnRT/PmzVOlStW3b9906dK98847xYsXlzGhrAWq9+/f/+233y5VqtQzzzwzf/58IhVumZ07d06fPr3sPHToENe7SZMm8bzppptQSHXyhbRdyDBKPNetW1emTJnFixcXLlxY+V27diUoAUJAXjVW3qgH2LVr15133knUSH7WrFmZxHz58nn1YoDkG2+8sXz5ciIh6dm7dy+z0L59+wcffJDp+/bbb1lCEg7DrbfeqjVG9KMcrUNYtWoVq+XDDz9ET79+/Ro3bmzTC
oytEgyU1jxjxcj4AxgMRS7lEfkAUhH9vrwaCgPOiie1LrvsMlWUVeErIskq4k7y6quv+i5FVSjSK0rwbx06dMALsdPj7aaP+stqL1euHAkW8Lx581j50uZ3R+b5aRmcYB8N49zAYo0Wjh07xnbduXNnmjRp9BsEHA8c0q44JngNzhv2aphdrZ2ZIkUKjiJXLSa61vAkcOHsKVu27ObNm7Hh6NGjmzZtqlevHhdxIg+UVKhQ4eDBg65aTKSBJ76A6ty9iDM4I8mB7777Llu2bIRf69evRzM5rlpMaJGi0aNHE+7ccsstW7Zs4ZXMtm3bJk+evEWLFhIAWsGdtWrVqkCBAtOnT0cnrTN0PJ0u47///vjjD2bh999/d+///delS5fbb7+d4XLvHgymhu63337LlCmTVh3PBQsWOIk4vPbaazVr1mQ6qAu7d+8uUaLERx99hB5Oo8qVK7vFFxqa8CfOKfUsIWfEiBGPP/44i42lCwMGDCDfSVzYTJ48uU+fPpzib775ZuvWrcePH89eCLXmyQeGTs9EoV0GpJ26ECDAPBJfYlWnTp0IRHgyiQlWxKqRI0dmzpy5WbNmBw4coC1WjhoFlhZrifyXX345ZcqU9JpMVzMEAbv//XfRokVaYMQuRMDkoIq28H5cwIYOHTrMgwQ3Q7WFjFNhGEmMaIpd2Gbaw1xkOT/w3RzPHEKuOCbsun379nHfxdfXjY86Ho8++ighSPAZFgwtAqEAbiJt2rRscm1pwJLt27dT1Lx5cyzBpFCxC5agRLVeeuklTiZaxIXhgHSeYQYBVqNGjXTguWoxQZJQLG/evNzDPvvsMx1sKFy7dm3OnDkJv2bPni23COTv2LGDuztXK5yUhMHpMhITu2iciZgJGRUHFypUiNjRScQhbuxSsmTJDz/8ED1kduzY0a2/0LAeli5dSqNMulPqLUWqswYOHTpE3JwxY0aMIXaxaRV33303A8LmAuaIjcbohRocTQ0jjADoNRIQVi2evDp1IUCM2eR+hbPCJMzDX7HfyXcSIUAzxr/77ruEy02aNMGxMPXMu5YES3T//v0vvPACgcvbb7+NC0rQkoDp8cUuqEIzK1+D5sXDgc9g+vbtS1vCqTCMJEY0xS46g9m9W7duJSjhdvXDDz/EOmx85F8oVbjDa1woYuuG8Sbk09zUqVM55zhRkJQNQn6kW7du7PyKFSuG+dwFN4EwcUaqVKmSJUvGSUYm2sjEgE8++YTgg6Bk48aNqHXVYoK8GiJ8IS6hXSqSCQ8++CAep1atWqRlEk/SkyZNojniJHm9UJovTBIVu2hU16xZ88Ybb7Rs2ZJbaZjBDPW5i9YhqqgbHsSoDqSd0qAoSqX58+dnMfTv3598J3Fhs3nz5u+++444Xqfv888/z0Ax2q44Jt7MBMYTT8JKWBcxbGHkWQnbtm0Lnp14oQmsYrU89dRT/l2LzRhJRSyHYcOG4RYaNmx44MABaikTDURmuBFWrBYV8q5mCAK9jS92ARb2n3/+WaRIESy87rrrcBp0c+/evf5icyoMI4kRTd93Cd7zOG58AQluM/JWsWBbssm55rIVJekKglBFNu3rr7+uXyCMBS2ipHr16rjFyZMnV6tWDbW6Rfl07dqVKIrYZfr06dyEXG4Qnr0B99GjR4+2bdumS5du8eLF+uc1JYAfueGGG/bs2cN1p2nTpvF2B59Vrly5ZcuW3XPPPRMnTlQmarGwe/fu7dq1I/qhp1zUGBDyKeVCxmH8448/zpw5E/2opUgVDQ6huN93+eqrr/zvuwhNHAnGWUvOT4QaTNbSTz/9RLAS/H0XAhqCS1Rx3vzyyy+SDEOLFi3Kli1L6/G2wonCct20aVO/fv24l7vcCxutebbh/PnzSbz44otsNxLx7ibN6c6dO+++++7ly5crMxKkjedVV101Y8YMwl/lhwKrYPTo0YQvNHrFFVf8+uuvhB3xWhUMU4wMVUaOHEkcxv1kyJAheB7CVtbSwIEDCaPpo5ainq5mfGhw/O+75MiRA59w5ZVXkv/zzz/XrVsXq2rXrk0onCZNGrTFcnGGkRTxNld0wLExa9as5557rkGDBvXr1+c6gosP9ek9255o4MYbbwxzYLPh2aXJkycP9X0XPAgXEQTy5s2rr5gQQ2CGK/bgNEIVTpMox2XFBEuoSC3CDiRz587NLYpMV/zff7xyIaPooYceQtLlxmT9+vUKjNq0aeOyTny2hIXyhl988QVqQfnQsmVLqY11iTci/9yFHKJSQgSW3NNPP/3MM8+88MILu3fvdhJxiPdnRmPHjmX8uSXTBDMSHg4Pfd+F6k5pTFiE+fLlQ2zAgAEu64JHo81+1xhyrivHFceEraGNzDIglOTkjhCEV6xYQYLYN5TyYJChlffff5/JwipiF9ZAJBWpBRgJ7733HiEF0c/27dvxeKlSpercubM+tQXEElSIAAR/7rJy5Uo0jBo1ittOhgwZBg8ezIoF6XTVDCMJE02fu7D9iA+47jz88MN4dk7r7Nmzz5s3r2DBgk4iCPp1+PDhr7/+mttV+D7iVqpXr85+du9B4OOIkHA9CIwfP54QgUy0BcdDCX7ugtlUwcUUL14cX8lFn6sel3JfCX2pUKHCkiVLihYtinOUm4vFtGnTuCOiivtW+/btXa73ecCCBQuqVKlCZzt27EhkQyYjoy5/+OGH9erVoy3U6rTzKhmJ+9yF4JLZb968OcuPTKJPgl39uZy4xP3c5eabb/Y/d2FBEgRLMgxMaLZs2UgEG+PDsrTPXWKhBR/h5y4IM4YUAZsiXpl4YbshrLZUV/mhQBIS+7mLaiEv/QQTLKemTZtSHQfCWiJ6xnsg4D8TVMhTn7ugk6X7uUfPnj1xPpjH9QYZlKCN61ywfzOMJIq3TaID9rDuIqVKldJezZUr15o1a1xxTHQdwUMp4XKDIBNVCJBAxuXGZOvWrfrV5eeee06qgIQr9kjwcxfVIoTSJ8ycZGrXFXtutHTp0hRlzpw51Jdmhg8fjhn4neCrtiyfO3euYiYOV7WljvNcuHAhlyoq4p5imX2BE/nnLownQ8eVV6sO8uTJ89dffzmJOMT93KVEiRLjxo1DCaqcUFgQ0woBlxUTSu1zl1hotCP83EV7RDMbSiZeEGbwSURYUa0k9nMXtSLzgDW5f//+2267DQ1ly5bdsWOH8rVOIBKF4H/ukiVLlmuuuUYBCobVqFGDVUorKBSummEkYaLpIq7rBc8UKVJgujLJUSIW7EkJK+FygyBTNwwS8ixx2bBhw8aNG2mrSJEiyAjt+chR6zzxMiRkebBJfl/wL+hXOi5xi1Diw6v0oERFvHJ3Z6x4/eabb/xWjMhhGDXjLJXLL7+cMRSuODI0O4GlE3pyg0FMKzOxK82IHLaJm8tEziazw+muOT2jKIYgcfjw4bZt286fP7927dpr165t1arVvn37JINAYi3B/l27dhG7lylTRp/qTZ48+a233lI8lNjRMIxzRTTFLmef9evXHzhwgN2eKVMml3VSyCOgB0ejG7nyBZkIcFCRH8oTyV0ioAAoDMHe
J126dPo5159//mleyTCC+fvvvzm233vvvXcjZsSIETxHjx79+eefJ7gTTxq2qnwFiUOHDrVr12748OFdu3bF1O7du9P0yy+/TPii4IanqxYZqM2QIUO/fv1mzJhRp04dOZyBAweOHTtWsbI5CiMqsNglHDgInAibmdglVFQRCcF1CSa4VbuXIGglTZo07iUO3JBwUjiXMH/LTT85knK1SC35o61bt9IREoZh6Hg+cuTImDFjOMUjhzOeZ9++fceNG3fmYhfglsKGJXBp1aqVQpZGjRqxtR999FHSEyZMIHzZv38/HTmJfZ0jR46qVaumTp26c+fOZcqUoS3CuFdfffW7775DoQbHMJI4dp4lADuZOICI4aS3tOcN/kubNm22bNmkRDm4DD1xHDgpErlz58Y9qVR1Qa9XXHGFohBdtnzI0W8tkZB+H3L0Q3ESe/fuVY5xPqF5v2Dx17kSjAZb1T/Llal0LBCDDBkyDB8+/JtvvpkVMV9//TXyM2bMGDp0aKpUqZy60416QWjVtm3bkSNHduzYsWHDhmx/8dhjj3Xt2vXzzz9/6aWXcB0Rxi4aHNCaYWSoiMcYNWpUrly58BKbN29+5pln1q9fTz6liOmp6oaR1LDYJRz+Ng7z5xIplRi4rBM/TVdCOchcc801JDZs2HD48GGcCGmV7t69m9hCAjzJoa6vQRQtWhS3RebGjRt5DTR2gjVr1hw9evTyyy8vX7681JLpVXI/peIVP6si4zyAZaBptTn1YUyEFr+fiBf2hbZGmjRp0icSIp6MGTOmS5fujA4+gUubNm3ee++9zp07N2rUiL1Pd0CW16lTR5++NGvW7ODBg65OaAhNdIcBXw9PRqlIkSIEcNzNaOL3339/4YUX9uzZo+//gqoYRhLEYpdw6GsuOKlQnw/7XoCd77I85NfwFyQkw2u1atV47tu3jysOmSCxLVu2KJq59dZbyfSrC6WzZ89+/fXXk165cqXvUxA+fvy4vk1MdFKqVCkVqUqwqrx58wbrNKIaplKzqbWnzAsZBuGNN9546qmnfvjhB9IMy8yZMxs0aDBnzhwnEVXQhbfeemvEiBE9e/Z89tlnuZawzYMn/bLLLqtbty7hy2efffbKK68kGGT88ccfDRs2ZIikBIfTvHnz5557Tt/5/fHHH4ldUELRlClTcFNPP/20f0fyFBhGksNil3DkypWLOxZbOsHvuh49ejSWAK/yKfgaErzWrFmTe9uhQ4fwsOiUDM8FCxaQSXRy77338qpaFBH6+GmeDz/8MM8lS5boH0ZTEcyfPx9tTzzxRMqUKaVW0C53MqIuMuP9V4ONKIUJ1TWaVaecCxPGAUiwHTiAV6xYUb58+dtvv/2mm25KmzYt28r/fZzogk5dffXV7777LuEXaXpHsEL4QhFp9rXEnnzyyZEjR5YtW1aDEAbWydy5c3ft2lWhQoXbbrutRIkSa9asWbp06d9//02p/ibAjTfeePPNN1OaIkUKBnP//v0UyfMYRhIkmmIXHdg8/c8/Qa8q8o/z08UVV1yRL18+lPv/jKbiCTXnxxb4DvyCPmilCHbs2MF9iLvO1KlTZS0yWbJkady4MT5owoQJ+kE1kiQmT56MQOvWrfWtGiTxHY0aNWrRosXKlSv93j322GO4mO3bt8+aNYscfdLzs0fmzJmff/55KqKTJyBAxb/++ouoCLFy5cqRQ8JILEyrpsC9e/DKxGmQz87A0gqwDLTGmGjYs2ePcgT5TvoCgwU/ceJEDuPZs2fPmDHjm2++mTNnDvvi7rvvdhJnEZbEgQMH9CeBvElzk8J88YxkjujOI4888uCDD1KFbe5t6Iv1+8z64RH5RDM8ue08/vjj0hyGkiVL4kkYE8Zn+vTpJL7//nuimWzZsqG5T58+vJLJ0OGvvv3222XLllGFolgfJxtG0iHKPnfBHXAYcyR72/nigwcPbtiwAa8tN0GOkztNpE+f/pZbbsE1/P7774e9v9eqfJ0TYsuWLexwTjj94JkiopPBgwd369Zt9OjReJYlS5YgRhHmvfTSS9dff/2XX36JB8Fman344Yd4Da47TzzxBP4IsSNHjrRr1+6dd97p169f3bp1CVaojg0Yg05cGE8apfrOnTu7du3KgHTo0KFgwYK+mwuY6DX3xx9/4EbTpElTuXJl2WAkCgaNcQaOIlAmEQOxKaNNvmSUf0aRJTxZYBiwb98+DCBiVpFkLljYHaDFLxgcZTqJswVz8d133xFSVK9enS3MBieHHY1JJPR0omHxOhG4igj/1X8KrzDhPiKpXzlUFa9q4F+3Cn71NLmQyId8p8IwkhjRFLvgtVetWlWvXj19X4R9hQdv27YtcUCEHiGxsNvvv/9+ggbucBwYtMJ+VluEMmPHjn366adHjBihL51wVerSpQtRBYZxKcdaEtSaNm2aagGqRo0add111z3zzDPEMfXr18f+mjVroiR16tTSjDbqqnX98RT0UEQmDrF///6Ebg888MBrr7123333LVy4sHPnzk899VQsL8MrzJ8/n1qIZc6cOZaAESF48+XLlzdq1IhAkFeGkWDx+eefnzVrFq/My9kZWFqhLWLoHj161KpVi5CUc/H9999/9tln9fu6LButH+Mcwsbv1auXPtjo2bPntm3bmDi8h7zBVVddpY9PDMM4RSK9ByQF8M4c25zlHCc63TGesCBXrlyFChXSEcJ578meHmgFZ/Tkk09+8sknH3/8MUGAWqFdTgsCC4IncuSYgPwyZcqkS5eOCIZzJUOGDEQqVatWJc5Q+IIMt+f9+/dzOfvtt99SpEhRqlSpsmXL6rojASAi+fLLL3Pnzt2xY8f33nuvUqVKskcC69evp/rWrVvp+A033KC/40h1kJgGh1Zuv/12TrtJkyZVrFiRiqd3cKKayP+eEfNFrLx69WoNPk9gJPPkyXPFFVeQw7DzdBVC/z2jBx98MFgssTCntLtjxw7WPwmsUtMUpUmTRv9KBy2eShPGqYO74F4xZcoU0kWKFOFahaNgG27YsCFlypSffvpplSpVmCObJsM4VTxXHB3gnRPEiZ4mOCGAQCFz5sz33nsvl11e9bmID2Iu5f20SK8k8Fm7du0itiD40LVYMrFAoUt5IIn7ExyBuL9ly5ZRN5aYCG7aF5CF6JkxYwanWuPGjQ8fPkwmBLpkeET494xAoxoMI+lSHk7uBHH/nlHJkiXHjRsXVzKxSKFPvDnGuYVZmDBhAu6C20jRokV79OjBzQQ3myNHjp49e7IT2Zg2U4Zx6kTTz4x0XwmPEz2t4H1efPHFr7/+eubMmfgdclxjHv4rCf+TD1454UaNGkUYUblyZfIZa+pKMhjd2n1Ul2s9h1///v0rVqxYsGBB6qo0FhL28SY0IEk+XnLw4MF4z3bt2tl1/FTwhjYGcafs7ODaO0G8OcY5p3r16lwbWrRoUbx48WnTphUoUIBwFu/RpEkTAhp2qJMzDOMUiKbY5ezDeYCvIZJo1qxZvXr1iGDWrl3ryuKg88OLH/7bu3dvt27dxowZM3DgQK5cKpVYeDgXCXHWrFm
Dp0NJp06diDz8kCgMCowUWhE26XcHBg0alCtXLnIwSUWGYZxRcBclSpTo2LHjhx9++Nlnn33wwQfELoUKFSJwUanEDMM4FSx2CQdHvhLJkyfv0aNH/fr1a9WqRWChzDDgs1KlSjVx4sQyZcr4UYuvLTxHjx4dO3bsfffdN2LEiCxZskTo7NQKUQ7VCVwwYPz48fq8GohmlDAM44zif7yaLFkyLh7avwpcyAzcbOyjF8M4ZSx2CQehAK4H9G9mt27dWv/SJeELHoqAIG5MgHuiFlFOixYtsmXLpi/hSo/8V3iQTJs2bfv27R944AGiH3yfNKDWSYTm2LFjBC69evXaunXrjBkzSpcuTSYVaVp6JHaBw8nB3PFMkSJFrFNE/76OSl1WIqEiA85Q65TSItEBxqu+t+REjfMRzT4w4+Cn5QEEaSdtGMbJEvCwLmmEhXMI8E2ECAwaBxKZeCJyJHBuwTYMw1EePnw4ZcqUWIiLJDPpWJhEIHpgQI4cOUIAWqxYMc0jEPCRmSdPHkrh5A4YBnzLli3EKLlz55YGJmXt2rXZs2cnJFWOHV2GYRiniMUukcKZx1gB0QBHlDKTTmSgCz3mKV4hwRmpZxKxMIkQmMKgNe9HEowbaKxOetDQ4FIeaghVzI4/KTYdhmEYp4hdARMBp45/8HAI6TTS6zkHw3Q0Kn2Z/UZDCBgWbxpjBxDKJxG36CSI1YpiXP/VMAzDOBXscxfDMAzDMKIJ+9zFMAzDMIxowmIXwzAMwzCiCYtdDMMwDMOIJix2MQzDMAwjmrDYxTAMwzCMaMJiF8MwDMMwogmLXQzDMAzDiCYsdjEMwzAMI5qw2MUwDMMwjGjCYhfDMAzDMKKJaP2bAPrLefrzPeqC/l4MeOWnH7Wov2FEIrgtcrCEJwQbwCvPUH9+T/LuxdPvqUzgLxhTRZIkkNRfRSY/uBX9XUbEyAzON4Bx06AxMhpD5TNcKgX9+SHl+6hICWohTwJJcnxhKeFVc6R8xEioLvgtnjQoV0O0roak/EKea615WLdu3ebNm6+44oqcOXPyqgk6Jxw/fnzfvn1///33kSNHlChcuLD+nHgSmSatIp5YuGzZstSpUxcvXvyyyy47h4NmGBESlZ+74KkFu04JZWorniHQr0NCCXwlCRlAqV79hJ8fyklR6gtQ5dixY6QjtN83I1b3VQoIUISL9EsNHwaE8dEYuiwPDWaCU4AAdZkvnrz6R6aPr0QCwm8rVqMnjez3cbkXMJq+CRMmVKpU6a677qpateq3334bPAVnnx07dtxzzz3XX399xYoVsadatWoLFy48tybFguiKBUxcdd999919992VK1du1arV0aNHXbFhJGGiMnYB/JSuLyR09pM4o34B5b7+TZs2/frrryQUH/iJYBnl6xkGBLjlqC9KuILQ0F+/IT8RzPr167FQ9kSi8IKCMcFl7969e9iwYXhtl3vRRd9///348eMZtDDhiyIVTZmf9kr+H6oz+EOGDNEZgABMmzZtxowZVAmjPHLoAjqxAW0gSwLr4Eyu/yQOo8G0vvHGG9u2bTt06NCaNWu6devmys4RyZIlu+2226688kpMIo5hsflLQgLnHJwD48aynzNnzoEDB7Dwvffe++mnn1yxYSRhovVgw0+x2erXr58xY8bbb7991apVruCMIY/DUTF8+PCHHnpo165d5PinxeHDh999991rr702c+bMZcuWxQX8/fffkTgpZH7//ffnPDiEIqmCx1m2bNnTTz+dO3fuXLlyNWzYcO3atcEV9+7d+8gjj3A2M0ouywjisssuY4hat269fft2l3XRRbNnzx48eDDDiDcPNQuM/MGDB59//vns2bNfd911K1as8EMHH5bEhg0bOESR9HM+/fTTsWPH6qhQ5qmAeWqFNdOgQQNlYkkki+d8RWOyZ88eEsDKJ1A4t+s/ffr0//vf/z744AMSzA45evpO45yDJdz69u/fzxoGhgv8dWsYSZlojV0uv/xyrgujR49mpy1atAgfwd4Ldgo6VIBTinsPp1QYEEAMYbye/IuQBj8Tb/jyyy8To9B0xYoVOYrIp1E01KhRo2nTpo8++uiPP/748MMPN2/e/JlnniGgoW4oV4VmrjiEIKVKlRo6dCjCHGyhhNGDlyEeotbIkSNvuummX375hRNx3LhxK1eu5HXhwoVyPXD11VePGjWqb9++nLK7d+/2OhHA6bqwYYT9cQ4ebdKg8IKny40JszB58uR33nln586dy5cvb9++PeswWFjVQQuDHD3J99QHPi85cuTIX3/9tXXrVq29eEEAgqcMhaBlQLD77LPPFixYkGWDJbKBRtXWBQuDULt2bcaB2DRFihQ1a9Yk4crOBd5CuBRLSDNxgD1aBhJILKwHdreeaHO5QZDpOYDjLBXEWCo85TScREzIx8Jq1aplypQpWbJkLKSrrroKd+SKDSMJE62xC7t0165dbFEcATuQ84AtyqZ1xUHHUqdOnbglZwtLjhw5+vTpoyrBngVvGHA5nptA+UsvvaSAKVeuXOx5/yxp1arV119/3bZt2xYtWuTNm5dn48aNiSo427BN1eNCyDVgwICcOXOmTJnSZYVGSvB9P/zww4svvkiPxowZU65cuRtvvJEDDNsefPDBjRs3ymC6UKhQIS76U6dORfjQoUOhbDASy759+xhMZp9BJt4NXnKRQK3vv/++cOHCuXPndosvDkwuq6JKlSosMFftxLKkOQKX7t27FyhQIE2aNCoygElh8Xfo0IGrRevWrd97772GDRueZ8ueNUAfeWoxhIJeB7sdBUxKxwJtPK+88sqvvvoKZ/XWW2/h34hjVGoYSZlojV3gjjvuuOKKK0ikT5/+gQceSJ48ubZiLNKlS4ejT5AMGTJIPtjfkWbb8+QUwRuOGDGiR48e+fLl87yH45tvvvnss884b/CVGBC4al166ZNPPpk2bdr3339/6dKlTldM0Fm+fPnBgwe/+eablSpVcrmhQR61RCF45wMHDjzxxBPYTD6OiatSjRo1uKa//vrriGGDrlklSpTAH3300Ue0EnwKGqfCzTffXKRIEcYzderUjz32GAuAMXdlkcHtlgA3f/78WnhxYVVTSjwdS7Mml5Nm4MCB7dq1q1mzJq3Hu+YvQBgHxgcnQBDPsr/vvvvOy5E5fPjw6tWruae59zhozXChWrVqFTc6XsPsfa1eKFWqFAEfly5WnSszjCSO1m50wW5kWx49epRtPGbMmO+++06fi+q2IZDhkgqc99yPuS6HBzGEYymRBhzB8uXLuShzbBA38AoSxgyOELwkvlIaaJfS/fv3V65cmWiDgEaZwun18JU8+uijTAThTnDTsUCS/s6cOZNghWhs1qxZnhXHyKTv3JbIJ4ZjQJSPPNDx66+/novUihUreHW6jP/+++OPPxiu33//3b3/91+XLl1uv/12BtO9x4Hp04CvWbNm7NixX3/9NWmINa28zps3L3PmzDt37gxMuffp/bPPPlu/fn0SzAJNsN727NmjhRcvTBwErwfpoTpPwJIGDRpw9tx1111qBZzoBQ
CJfrnXfewZbKlSs3ZswYUQeaEmrZIMJXlC/OXfxkKoHzx/f/Vrta5Yc7/Pt8QCNSBIOKlAQNVkhcWrduTSlMrs8WyUKELEChtJk0aZLkLrGxsWfPnj19+jSTBFkLGzH4VatWSTMONA2hQEG/8PPEE0+QfC9cuFD4zzqE/IH2SI8F9kzukkLuUtTMXXxKlBREXo+mD3vvg2KxscWKxBJeevfuTVTBa+gRYQKLUm5QYV4zExfvrh1bmzW9afbsWT6v8j8Cr3pEgyLVjwCt5oURSj5ISXk2H0P3BXypk8a9d2fzWw8ePuUjY9F9uuZTr3ugtiO0GP65c+fGx8czZw0aNGjYsGEVKlQQYxaDyYe14Cypr3V9KsZefN7i1WjVzF0C6syXfuHMUFpaWs+ePSE+a9YsemcL6mBp0bjy+DvlLuLke/fuvf/++5s0abJlyxYJBNbuy4MyBaXyjLShgwZ2aP/IqdMn/QHSV3X/3tnU81Onz/CqHNY4duwY2RLxAjAx33jjjXKSH07y1zuWoi5ziAkG1VtjMuZ8NaN4sdIr1vyCNZk3UDLrKB/WzRMb1nF/PeAk+4cRqJuKdSOoB/bv+b1yhfLD333frQcpj0jwqeCJrUTbIkWKVK5cef78+WwUKeVLUIZH95/f9fu25i2aT/9qJmGezEUnwPi0NWs3bNv+u7zhK/MTJKC//PLL9evXx3mkIzyn4OP4ZUGZgAkRC1a0YcOG5s2bM5NhVzg/G/McgkhAVKKaqtDvDxkeJgB1aYSphW1MlO7zHdq2bdK0RVKGh6CC7c2dOZ3pmbo9wukoXjQO6aWkpIjchHJuUH2ZuQu6/37+J0Vtth4D3s1QRvvnvK+AIIJS8lRpFgm3N+BJfujuOxLKVduYeFhjVJqqu7Gybt26R9pVohATE0OV8umnn8o7Y1BHbsIR4qbnZs1diPCBoJ6+45eNFcuUfW/052511kU95aR8wNT41q1bW7Zs+c9//nP37t2icQkmQhbAMBv37dtXvXp1eYXB0KFDhw8f7sx8sJbsSkoCAHsFb/P0C+fMXm+99Vb58uXlPS7Mjtbuy4JYkeQutzUtWqrC+v2HfEpjqsxPd3vuf+DBaIfdZVcPvVerVm358uV0LRJjxSKSMxAL8mHa8Ozfs+uuO26fOnWyylXJZPSgJxBYu2HDxi3b/WZsLrSAN+XmSMnAx91BLW3GF581u63p3gMHVGWnTna7PW7vrFkLUtIyPHqG3/B0796dxAVrQWKlS5f+4osvaISdiLUAi3SuoK+03s90jrbHzVv8o5m76D4qcBRiXrKCzoEDByjjW7RosXbtWlSPLtguXVg0rjz+/K15FoECBH6LPvDn2bNnd+3adeDAgQiLjdbuy4TNFgwEfGNGj544cVLdOg2+mjFrwsQJYPTosQ//q93evfttdvV0OyUOsyPjJZPt0aPH0qVL8R85kUvXrFjkcseK71bMmDY97Xx6SL2T3Xk6KXX4iNEPtnmo2S3X282gY7c7JYmEVqHNJXft2oWgjhw9okZht/sDxvtjxscVS+j0xONOu81p3jxIQolGoqKi7rnnngULFjzwwAPy4g2klx812YLaof17Hnvs8ajYuBOnz4wb/5FSyNhxb701stPjnanwslidktu4ceNYEsseeeQRQjwdSV9mg0IK2BMOERfcEmLIhpFV06ZNR40ahf+HG+QTKqVQsSaYkZY2dcrU1avXEJ+QDiJfvnzVylU/9e/fv1h0tMumXiNUu26dsuXK00e9Bg3GjR9PTVasWLH8mLFd/dwusUlx5vMaus2m+X3qi/ol2zxD4V8MS5xqqVhlwrMRZ4nCbMFggrif/aYbb8ChSezuuusufPypp56Sd2zk06K2bd06ftz4U6dOUYvYQ0GfP/jpZ5+XLX9tx46t8Q67jbzEkjDJ4ttvv/3kk09++eWXNWrUwAygjxlkdRDTBGxVq1Yl0FGnYdvEugEDBtDm9ttvnzFjBvWu5DTCHrCOLCjQL70zNfbt2xdxLVy4cOPGjYREa/dlAd5N9rF9FQ4J6epaKDatgkp0RGSTGxoTPqKjY9q2bUtfpH0Ijb5YZhVabsDsjxw+TFSB5ZSUVCLKxIkTCB3Dh7/3yMPtE0qVLLwxNxOIQ8ewsN6gNmfG9G7det5x592LlnwzaeKE8eMY0MdPde62aPGKiEj1ykqC8U033UQOgXDuuOOOOXPmdOjQAQPDSMRggEU3F+DRuLtGHW1TNx6q3NJ0lUxHCpEJkbMSsqhOb7nlFgllkkDnRyN/GmLqF2Bt/lsBMRGAKIxYXnboVBIGuseT2vuVl6IjXJFOh8uJanEVedmUK65Y6U1btsuLXDGCjIyMY8eOyZO3IGuFlBvgUJCWltqwYf3ICFf1arX69XnjzTcHNmjwf0882TUlJQMPxRjNlz/KWRn+aeprYQGCtT5ULs8++wypPAnEM890HzJk2D33PHBdoybbd+7yBdS5V/XqD10nAT9+/Pjp06c9HqvKRAIWsbwR3PLL2gY1q7mcdmdklM0RRUgne1SvZLLFdOzYBWKZ511YMkHr6enplF/0UvinzzzBKFJTUzEtqWCsrbnAFEKmbgz1WpGg5v3+m6+LRDOjRTe7457BQ9/p8UKPcqVKT58+y42ElMwCIYOkw3/idNKR4yfcGenmA+f5hrppwJeWlr7n953t724ZabPVbNh0zabNOAUMCC5H1wUEEZSyF2wn4D53+ujC2V+WKVbE7orr3nvQ/qPHiCF+r1/nn9d39PCBU6dOYlGMJV8+noX4C892jYpwJSSU6vb8C4MGD2jVqtVdd961e9dBwpM566AD6zQt9NPS0i7dC5KUvSxxJRzq6NGjhw4dSkpKIhYVNoOHGSYz+PxTXCFCvDpw9syJGdOmli2VQD7f+83hew4c93rVnXXqjSU+/8njxxk70V5kmLelZeqGTHXHjs0N6tWOdNijXBGUnMzgKqKonC+yzb86+oJBH5YrRxVKwJtPN+/M96dPGj28ZCxTls0eGW2LKGKOhYEwqKKLl6zRjBBWpRkacRhrIZPGvMWWWOYpNBqIeJPPJa9atbxh9WujnZEdnnj6952JHo/frSIAGlFxHkCZpVD+q6xRpWBKl38rwDMiQ3aSegNrR36ghqvKRY8n/b///eLcmfPk4+r8E9moXTNfPRpRpGjC888/R9Zp1kwWcXpkSYKZnx5N2SrgIz+tXblw/uKzyWkRUVHlypd+pH27WrXr0Y96ctour9J1qkekgV1+ItGcr/96XHh4D1ETQMeOHZeUdBr/L1W67G0tWjS/vUV0dBQZn3IgOyFMvUsRKUkpKeKSlNykkSdCW3/dtHTx0pD6oVf16LiZSCoDtbsiW7a884Yb1DMX0DNVqGSra6ovOpILVRaZvyckCrAi1pXHcBi8EgISMp+xVOZr+DwZU6Z8sW3nnnSPVjKheIPa1du1axtXIsHmUu9DdoQMJBm0ObSQouwwNOU6KmXPF0JMIrp+9GTyvOlfuEj7Q5HB6Fi74f9HqzsbN76RBvDMsrBpQWKbyZQ6lT1n9uzDh46oB4p
CrqDDnpBQ5LGO7UvExeN9QfVsKcam3nwKJLCog3OHxE2kivYOHzz48SefnTh1xumMKF0qocUdt9/avEVktCvC4VRU1HkroBYioqzE/9iRWIK0ZJ0G4S0AU2HJRpaq9V8NeBP2YAnIxnxBSRBHZ/LT586Zc2DfARWXoeSMjI0t2vXfnWMjI50u4qN6/6v5o7VKevkau+hGvVVB37Fj29cLllAQQVedPXQY5qsW7A57dPMWdzS5+Qak63K4pGwthFDmZZ71tAf1qZ9/mpScbGCuGGsIt4bnIM5frFjJxx57LL5YETbphsboMGAxNkF43SKaE1AiSU9MTMyqVavWr/+JRI9uA66ICN3xxBNPFC8fHx1iwo2CH+jQOCvBy9P7/wh/19wFB5YVpCaTZX5hOQypoipw1UtVqO+VOxBh/Mp3jCiHPcIMMwHTxFUmEdY9h+TtOaYdZLZhjvXjNiGbE9L8xayMkBOPpBvlRX+H3IXhhEfEHGtzRKgLFdhwKBTpxF38TvVi7wtxX44S5CmrMOiC+p6/ZKa4DIpCWOqHGdSP8Cr5s9tlEQ9PBCqU52emKeSQ2oXhiADzEFoIKSETJlsnOQ/iQkZYpbouRIDGZtkX1JS10oDp04Fu1KsssLYAm5SFqR92sOc7d1GP/mKd9siokG7XvbojBqE7QlqEmpovqDsPtgseEtvUSzwQMEKjRHHpeijCYdOCut1lBDU90hVtN5xBJ8mdEWG6nigiT4uyaJu5C7Ybsjt0wzzKCAWdlEH4ixbFd/WDDC7slZI1rFkxWjrKMZiwnSXuIDzIV0HWxoVE2mZssGayPIV2ESwJ4snq5JSZPWDPCBFDpoBRAlJ1pS2kmT/r4MqcraQv8+BcYAmM4KHOaqvMCgWHHC5XhNpCnMdNjCj18i3+hgIRziioyjGFEBgXS03zM9VhyyEGQ4GnakacXafkRnh26jhV8+GmNt3URti0xJDUpksKjWYskT9Lh8tpN6BsD1DfaNSuoVC0PcJQ38QYUQQGHNY7UFsLFn/L3AXAdpjzy3cYlewLBVW6quCL6NmoIw+bCjTU/ShOJm8rfNALy/C6uetSsPSq2uuqC2ovhK0uGqpwxx5MjVaqR7owq2EpCKTHQgBLvEAGnrmiOGYEIjPlFORkbHVYpqzWzTlYVtRh+QOHKMkoyWd2Z52AUZmK6jArNVWGKYheLs8GCh8YgoQY+ZqH3MgblQaUcMyvpkJEK8pqzaOJyeo8C8mfchA2IEZV1LKG1szEFJWpw/MDdSgG7HCoOUYP2VysOx0kQJCz/OKydF1AEDtSuYsyFdbEjvA3JgBVuIAg/ItvqgnTPEAhz+FYtDOFT0BhtjVPD5BImjqxG8idBNMWouKXlDIH/LGjsDxlOmGFLdJM1lWjnA78SyD8CId/iiXTmPmv/nI4Ts93HIGRquEzTnJyB+LLHDjIoyOroYquioj6ivRNa1f+YWaxSilQQS/sVt2pvYUSYQkrmShRsG6QnJC7hCjClQDtREgsjfGoZqo8yUFEeWonHMAVRWxPSceOg5hiQjdKko7MOB+mljVwFSQueMJVXMVVXMVVXMVVXEXhx5/PXa4mPVdxFVdxFVdxFVdRAAif6RFcPe9yFVdxFVdxFVdxFX8f2Gz/D6T/T5giwsDyAAAAAElFTkSuQmCC)", "_____no_output_____" ], [ "![ejemplo.jpg](data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAnkAAAGdCAIAAADhVvL/AAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAP+lSURBVHhe7J0FgBVVG4Y36ZAOJRQEAwwsDAzEwkBRsVuxW2zsAPG3E7uwsBUUCyklREFCUro7lq27+z8z72WYnTN37mwgK8z7838+3znfd+bcM3HOuffubmphYWFKpEiRIkWKFGmzKS3+30iRIkWKFCnS5lE010aKFClSpEibV9FcGylSpEiRIm1eRXNtpEiRIkWKtHkVzbWRIkWKFCnS5lU010aKFClSpEibV9FcGylSpEiRIm1eRXNtpEiRIkWKtHlVrufa6QMvT7V18NPT40UJNP3psJEemYcocVMl1r9/RFPhh/q/pfIwtpEiRYpU0rl2+tMH6xlm6+DLB8bLPRp4uSus+I+7qRPikEzTJvUVjJg0TRBWxiFK3lRJ9e8f0Uehh/q/pXIxtpEiRdrmVdK5dtqkEXGyNKLvQ37z6PSnH+rrCiuLx50zx18+0H3AnXfrLjhot50FJVYZNhVS//4R/6tKcPYDFI1tpEiRyoNK/R7yQQcdZP1nxEffGE+/gX1usGfaeEjZasLUONhqef3LhbaGX98yXlRSlWFTIfXvH/E/r6JnP0DR2EaKFKk8qNRzbZtu3TTZ3tDH8z7ywM/tt+8OeurubrYfKVKkSJEibZMq/XejWh2vyTal7+dFJtuNU223480371zvBcZLLCX7eo5Vv7O2ytbcvrMr2O8rTjqEdQRqD974ufHBByd5/zHRt4TcjVituGqn23VOZaKDJGoh8RGfdiUopUiz3tdoO6hI5/zl7ktQeJE+63UlbjrxmMf7HXzU4GEMOPsbrxw8q4k425cWvu0W42yiIrV+r7roqSnSz0iRIkXykd5hK7YGxD8GS+k+oHDaU/H3iOGN2lh20FPTfOr9Mja1aeXYcsJU4hzTLVV5Il0l3bt3j1e55EQFJLqCNpUVkTrvX2fJ/eJCteA6Ii82QbM+3fJ9k77I0BZV4i4Xbd53wANaDu5Pd7O1pENsKX64gLO/sa7IcROPrf+hNr6q5K86OD9SpEiRTJV+X5uS0tLc2U7/5iN9VNvt+LL7mOy4l32enck+h+vbt+8InoN2sJNsvuEdpOlPXxDfTx30lBqyHsju5+1B3anY+Cx3juIajqQteDTw8s76VtlB8a4TH292xA0XeDeEI0aM2NjwpmY9bzNskrszdtv+XSGsc/xDgE1dtmv6dk70tfO47P6o4xtz7BNhFHrOQ+Awhjj71nGTKvhchHjVztcQNs7e06x8nxGMFClSJEd6XBRbziPUXs47z8D44n6jr8eRt9avxNLGNjc9xLyPVt8iS2axU1IkclOpp6cBBzVDkqi4LYSNd4oTdx4lOViCeud8Gn1wnyLfwk1yqt31wYUJOmnJJyZRmtN7V/9ta2YEH9mpDXjVmy7T+GwcKVKkSMlUFvta987W/tkfZ+VflrvakqpIJ1q2ahOnCVPDfsLm/HhTm1YJXs10fXy38fM751PFjUreQlEliA/XeScqwY9Y+Te+827x+WSjnLC+neMvCxmvLIHcP2Dj9Me30K1kwxhC3Qds3OkmGujgcxHmVR93sibbETd03tn+qDf6rDZSpEjJVDZz7abJ1vrZn/i3osrHVOuRMack1fSNv+UhwU9oTn/64J137nxD38RvYCZrwauE8cXvvKmQnXHC/i0lH8YyUfDLD/eqj3t52lMb33QeMaKvPeUm/zZapEiRtmWV0Vzrnmwv
eKj8TrWbNi7Fl/8+ceMe3vWWovOeo1chf5lHwp1p8J4snJJte73yfau18OXj4tVlpeIMY5ko+OUnedUtr395uPUxLVNu/Krve8POST7EjhQp0rasspprXZNtfGMSaqp1vxm6cTtcxiryUN20cQk9XW16u9PvnVunwe53X39cghaDW/CRs38tGh+8Jyu23I1v/C6bo01Tss8vKSl7hRnGMlHwuSjWq27Z8jim3OHsceWHPLmRIkXaFlVmc+2myVbioZn4mbnpmXZDH/vTLusHGvX9zyTalDlpmv1oS/5869s5/iv9pg90fQW1R+id2cbP59zfALY/WbQ2Mk5/+n6uj+1cB3EU2IKPNq1biI9/GujufKneMDA7w+Cbn40WCXM+kYz/CGyZv2EaahhLdPa9Cj4XIV619UPErh+4nT5tUpj3nSNFirSNK/72WFHFYrECW4A/f3NZPL/7N6rCTnvywHhhvDie4pRf9o0VZpdNdcVu1IEHquzAJ6cpzEl0v6Xn+s6pJVWZ3y4NfAdy07dMAxJdB/X/+ZyiX0v1UcgW/I6YsPsHub4h65fo6pArsqj8urzxZ1OT9hkVOZxLwf0JLgw3jN6weJVTarzk4p7NRLUo3kKCM3Pgk1OtK94Wl26cNt4CDqAAlitAHvatcpeLkW+5ySiAHdeXBchTLhYgNyPfKkBKyla0rZDsuKVhFJLlCpCH5QpQIkbueA8jX7ZDijAKyY7rsAB5yk0WSG7XlwHJl+0QSyHZcUvDKCTLFSA3I/vRkUSl3dceuGscUIsT7naeeV2SbBtbXDds6pPdnfn2wO7fTJs2/O74rsVRq7ZxcMn9zRQegPH3gjdueEzxiHT9/KR+xNP1caOZ6NPUcS8PL/ozlAdZPwqqrTH9sT63s4sRNQM2/kimS0Et+B2x5fXDvT+2aWdMGx76o9KDdouDIW+XrSExB199dj6StBXvQ6K3LBKfhaQKN4xWmM/Z3yifl1zMs5n8Vbe8/q2i1Qce2P3Jb6YNu65F3I8UKVIkQ6nMt3F0icLU1FQBNpjlJmIsriCYsZ5ch0sg63ut9puQzLXJft9FpEilVaJLV1c4riCYsZ5cN2PNKne5WK5ZbrLcRIwNSBeIsZ4YgYeRb5VyEW4wm7nBjJVbGpYbhrG4ApOxAbkOI8dVupvlmmzGyw3DWE+6QIz1jRELJLfry8pFuCYH55qMlVsalhuGsbgCDyMnMkBl93ltpEiRIm1R8fiz3/MrAMqn1D0U73GkbUb/jblWl2nc2bjKEHgYifn/xnLbjbMlMQpgx/VlAfKUBzDyZTvEUlK2om0FsOMGM/ItNxkFsOM6LEAedlyHBchdjnzZDrHky3ZIEUYB7LilZEGkcqjyeXaiKXZbVpGNsCOuCW2KVRvMchMxFlcQzFhPrpuxZpW7XCwXnvHtFa2OfwX3sm+mvnRsCzNebiLGmikOC8TYpPFYuSYrBuEGs5lrMlZuMMsNEy83EWM9KQKTsZ54gRjrdk1WDMI12R1j5pqMNVNKwALJ7XoYiysIZqwn181Ys8pdLpZrlpssNxFjA9IFYqwnRuBh5FulXIQbzGauyYjH15IlS/Ly8ihRFYUlZrlhGIsrMFkQi8UqV67coEEDd6G7HeS4TpsOyzXZjJcbhrGedIEY6xsjFkhu15eVi3BNDs41GSu3NCw3DGNxBR5GaWnJd61FEhw5h3GOEcByEzEWVxDMWE+um7FmlbtcLNdkM15uIsaaKQ4LxNik8Vi5JisG4QazmWsyVm4wyw0TLzcRYz0pApOxnniBGOt2TVYMwjXZHWPmmow1U0rAAsntehiLKwhmrCfXzVizyl0ulmuWmyw3EWMD0gVirCdG4GHkW6VchBvMZq7JaM2aNfvtt9/06dPdjZQfHX300QMHDqRv7m47jBxXnXezXJPNeLlhGOtJF4ixvjFigeR2fVm5CNfk4FyTsXJLw3LDMBZX4GHkRAaoSIIj5zDOMQJYbiLG4gqCGevJdTPWrHKXi+WabMbLTcRYM8VhgRibNB4r12TFINxgNnNNxsoNZrlh4uUmYqwnRWAy1hMvEGPdrsmKQbgmu2PMXJOxZkoJWCC5XQ9jcQXBjPXkuhlrVrnLxXLNcpPlJmJsQLpAjPXECDyMfKuUi3CD2cw1Gbt27dp27dotWLAAm5mZqaryoA0bNowePfrII49krmUn5PsSkOPq5bhZrslmvNwwjPWkC8RY3xixQHK7vqxchGtycK7JWLmlYblhGIsr8DByIgNUJMGRcxjnGAEsNxFjcQXBjPXkuhlrVrnLxXJNNuPlJmKsmeKwQIxNGo+Va7JiEG4wm7kmY+UGs9ww8XITMdaTIjAZ64kXiLFu12TFIFyT3TFmrslYM6UELJDcroexuIJgxnpy3Yw1q9zlYrlmuclyEzE2IF0gxnpiBB5GvlXKRbjBbOaajGWu3XfffXNzc0eNGlW3bl1PenFZbhjG4gpMxk6dOnX33Xfv2LFjNNeaHJxrMlZuaVhuGMbiCjyMnMgA+b/LTCuRIkWK9B+V8y2kMA/Bf03Mr+qYntSRthrp/AYr+pmfSJEibYXSEzDkc/BfU3nrT6R/TdFcGylSpEiRIm1eRXNtpEiRIkWKtHnlP9emRooUKVKkslb0dN0qpXkzWNG+NlKkSJEiRdq88p9rrW9WRYoUKVKkMlX0dN0qpXkzWNvovtYZnZDDFCnS5lD0C3IjRdpGtO1+Xqs/8Bt/wZEi/btiltVEy0UYvyIjbQPijMcp0lYk+55OoiK//MKR7n8BNpjlJmIsriCYsZ5cN2PNKne5WK7Jnngm2p9//vmhhx5yIgHFiLGeFDc7WQExHpZrsmIQbjCbuSZj5Qaz3DDxchMx1pMiMBnriReIsW7XZMUgXJPdMWauyVgzpQQskNyuh7G4Aoexr7/+erNmzZzfHySbtB3HdeKxbpZrlpssNxFjA9IFYqwnRuBh5FulXIQbzGauydi1a9fus88+eXl5o0ePLle/N2r69Ok777zz0Ucf/e233zqFyN0OclynTYflmmzGyw3DWE+6QIz1jRELJLfry8pFuCYH55qMlVsalhuGsbgCDyMnMkBFEhw5h3GOEcByEzEWVxDMWE+um7FmlbtcLNdkTzxbio8++ujss88WW6GRIv0r0jWJHTduXJs2beKlrktUYDLWrHKXi+Wa5SbLTcTYgHSBGOuJEXgY+VYpF+EGs5lrMjaaa626os0GMNaTLhBjfWPEAsnt+rJyEa7JwbkmY+WWhuWGYSyuwMPIiQxQkQRHTD9Kdo4RwHITMRZXEMxYT66bsWaVu1ws12Qzvl+/fueee+6FF1541VVXqUrlYqyZ4rBAjE0aj5VrsmIQbjCbuSZj5Qaz3DDxchMx1pMiMBnriReIsW7XZMUgXJPdMWauyVgzpQQskNyuh7G4AqlHjx5Dhw79888/o7lWoBLcYDZzTcaW57m2VatWRx11VDTXmhycazJWbmlYbhjG4go8jEr+N/W2+rkW+OCDD5hrb7/99ocfflhVihFjPSluFoixSeOxck1WDMINZjPXZKzcYJY
bJl5uIsZ6UgQmYz3xAjHW7ZqsGIRrsjvGzDUZa6aUgAWS2/UwFlcgnXzyyV999VW0r3XnItxgNnNNxkZzrVVXtNkAxnrSBWKsb4xYILldX1YuwjU5ONdkrNzSsNwwjMUVeBiFmWu30e8hI0bKPVi+Ys2BAHckrMJIkVDA9aDLxnOZedxIm1sbNmz48ssvP/744w8//PCjjz7C9u/ff8CAAe6z5pwUoByeoHLYpXIojRKWM1tWI0Y7EyZM4OLhykEAFw92xowZ8YjQKjI5O6JQU7pqg1luIsbiCoIZ68l1M9ascpeL5ZrsiQfef//9c84558477wzY13LO8vPzxU45ysjIgJ1vtTjxASzXZMUg3GA2c03Gyg1muWHi5SZirCdFYDLWEy8QY92uyYpBuCa7Y8xck7FmSgkYwHKRAO7bW1WArhDc9PR0d+0pp5zC03/8+PHRvlagEtxgNnNNxrr3tXXq1FmxYsXuu+++Zs0a52/ZEt+0adPff//dKXHmXVrQHoUYd7NhGIsrMBlb4s9rne4hdyTsxGDdLDcMYz3pAjHWN0YskNyuLysX4ZocnGsyVq7DGiVclXAeAW49d4yH5QYwbfbu3fvRRx91CinJzs7u27fvhRde6GxnnawAbbv72jBiZJ988sljjz2W20M65phjTjrppHXr1oUZ3EjbgrgSuE6mTZsWv0RcGjFihPvujbSlxDORhfWkjZo8efJ3332np7AUi8UWL178jy2W15w1FK8rB2JrTse4xpYsWeKedyN5xFlbtWrV7Nmzp06dun79+tLfejR41VVXccGgv22xqa1YsWIJWvafa+0rbetX0ldKAJf4hAkTWrZsyYK0VatWQPPmzZ21UqRtXM5lwA6JK0TiOqlXr97QoUOXL1+umdhU+OuHZysSMyU4HCmMGGdOAYNWqVKlxhvVsGHDRo0aad+jmDlz5lx99dUdO3Y89dRTmdj0JFVt2aoEzXLS33///bPPPrt9+/ZvvPHGZurYViDOGmN15ZVXdu7cuUOHDnqbN15XUnGR1KhRg6vFEbe2qty3NgdKqmhfm0SMY5MmTV588cWXXnoJ+/LLL7/wwgvct/HqSNu8uBvRjjvuyBUicYX06dOHxW88ohTSbYzlIWLd0+Hu6kjFEqPK6fvggw9atGjBTFylSpV4RfkQV9eFF154//33Z2Vl7b///loHRDLFeWQX9N577zHXspzinMYryloluw2juTZIXNZc6IwsFtZVzhpZtZEiIS4PpMtD4mqJ15VaOTk5LM+/++67Dz/88PPPP582bRol8bpIZSROGXbJkiXjxo3bZ5993O8tlwfp+TN27NgGDRrsvvvu6m0kU7oTWZWOGjWqbdu25W1H5P9QsJ8YW7/CvFJOXpw2yrkV5YqRGblFZF9vcalj8YpIG8+mTpak8lJKk2vc2SgaDxj84POSn5+fm5vLCv3oo48+8cQTH3jggeuvv/70009nW4OdOXOmXkI8OlLpxGCiyZMnr169ulOnTtZb9vYb9Ty1EVXxuLJQyVqjP0OGDNlll11q1qyp68rpWNl27z8thoI7cdasWZMmTWrXrp3zSR9jtblvFg6UVNG+tthiZLXNjfu2uBnitKXldAzQbSkbSedIj1GNkucklkw0IsV9Wx63uGLzetlll1100UXVq1dnkT548OBhw4a1bNly/fr1AwcO7Nq168KFC+OhkcpIjHPDhg3btGkzb968RYsWaTIrD+JaWrx4MXvuPffckwtg9uzZq1atolD3dSmvtK1JPJa5u//88881a9Ycdthh3COMFbeSc79vWUVzbdnIcy6t22DjBAfZjnfC21gUL/dWl1TOHajVnNyiUtGm49oxlmt1NF6MNlFR/g+LYUEaFhQvjQtXA7DlX+xnn332wQcfsDCfP39+dnY2sPPOOx944IE8TdD48eM///zzeGikshAXw48//livXr177733xRdfvPrqqx955BEKuVriEVtUU6ZMWbly5dKlS3v06PH888+feuqp7MKdKzketM2LoeDuGDt2bLVq1b766qs+ffpwNq+66qoNGzbEI7aoorm2hOK8soZiPvvrr79uuummu++++7HHHpswYQL7D4qtG4DtE0/tAjZSefkF+WyolGglEcB/7RZSCgsKC2LUlWYVrc7IZmVlvfbaazfeeGPPnncPHTpkwIABy5YtI4ZDxYjh0LG8PCsnN8U+br7VBUzMrqMF3brUUEHYRv7PimFBnKm8vDymqGuvvZaTxUz2yy+/zJo1a2OQtd8tKMzK1yjix1+7RxRpNNz/yl7ff/893aHDXF2vv/46TxAKK1euTOfy8/N5yJbgR+kjBYiNLEOdm5t7ww03PPzww7169XryySf1U7D25WPdWdYFshEQV5RjkdopWzmHGz16NHannXZiHdC7d29WXVzGLMJUKxEcT7MTnY5JlLgDtkrpBf78888VKlTo1KnTE0888dJLL40cOfKpp55SAOPAsCBuIliF/5r851qdmK1epXml5HLOmMlOOumkww47jAVUo0aNOMEsh9k+McPa01ZBYT4TWixm/84QSTeCfflb0xrMNaCqUoom165de9555w0cOPC222678cabevd+DJdtELUsgumY1beCXPvnB63j29J/6E9KvtWG7Vn7O6dXZdO9LSW9JO6ue+6556GHHrJXIT1/+OGH448//rvvvlOMLQYgJy/fWm/EC3yuDop8Skug4GvvoIMOysjIYE5lR9usWTNKiKeE16JEZgX3Bj1SacSQco+w+2GWbdGiBS4bo6pVq37zzTcwImbJkiVcLYw/w45FFGLnzp3LtWQ3E0pqLaScow8ePJj59eabb2YWoaR+/fq///77ggULVLtu3Tr6xsrMTrLE5bFw4cJPP/30lVde+fXXX3NyctThrVuMxrx58yZOnHj99dcfcsghuJUqVapbt+7w4cMZHJ4APLE5Xyyyy3w0OFZS+c+1XE/bgkr5SidNmnTRRRddc801J5xwAs/B4447jgYPOOAAe9tKy9yW/Ldg1ep1OdwFKel2knVvYGL5OYuXLs2jvrBsRptD0zKzCE+Np59+ul69enXr1uvQ4YhKlSq3bdvWmv1T2VvHmEdy8nJXrVlXmJpGEqc/ozCPyhXLVy5csGDDuqzU9FRrWk5Ns7vF5Gy3/h8Xg/Pyyy/37duXR0/z5s0rVqx44okn6mQ5Afw/Py9/9ar1jIoKrdfu+eejjcHFlHXExLrwwgv79et30003vf3226eccsqyZcs4pzxPVUsuDw7rQopUFmI8x44dW6dOncMPPxxOS0vLyspas2YNcxguU+ldd93Fyuy5557DlchiJqOcu/6dd94Jfy6UG1JqdsWKFSNGjGCuZaJVCbtwVgbMHBMmTLj//vvPPvtstrlafknjxo274447atasyVaYZXe3bt1Yhcfrtl4xOAwUw9KxY0cNFFPs4sWLOZtMrqyc7rzzTs7Xe++9xykOf8rCSFdFsKL3kEsoTlXv3r05rxdffLG+8DZ79uylS5ceeeSRescvO3vthL8mPP3U44ce2mnugkXxt2
ZTUlavWjlm9Khbe9x63nkXZLOLSs2gkHMV6nQlFv3hxnvrrbe48Ro3bszpj8UK/hj7x6677l6nTm17Cc6abvaAbwae2uXUV1//oIDjWocszM9Z8+CjfR56pPcbr712TKcj7r//gazcPPutVKtRO6aUXdvy4n574oknODXO70RkW8DmYNddd2Vc0JLFS4YNGXrlZVffcduD3IN62QyBtdu3/3Gi5TEkNuizcGuENodYunXp0qVXr15HH300j3jmAHY2np/3cDMvwe7P5unN1i5OLlsfLga2s4whg/n333+zF9Q7Cvvss8+NN954zDHHsL5hzCnRyB9xxBFsNNu3b0+8+1yUoTQl/PHHH9nZ2YcddpiOTm9Hjx7dqFEjptKmTZteddVV55xzDuV67ADokUceodv0kFnnmWee+fbbb1m0qQqp8RJLjTiXnCBeZ9dS4oBYgq0EI6WsxPgwViymdadwlOnTp3PvM0rcUAcddFCPHj2wHJpaBagnjtVtLhtvtOwUzbUlFOvEr776ihtgu+2247RRMmTIEG6A1q1bc5ryC1M2ZK/5848/W7RoMn/+kvyC1MI03Y2FixYtmPb35KbbN2ArGWOHyx3CjimFzW6pRB9YaK9fv55NNncd2pCVNWr0qEMP7cDeK5ZvbbYnTpyYm5uXnpm+Liu3wDr11g77i08/njBl5gMPPdTz7jufffKx5595+p1+n9Ff62bRx5Yp8bnnv6uhQ4fOnTuXJS2rIg3OL7/8wlNSv2uNe2zWrFlLFiysmJmRtW49m/p4mvuu0xOkMKUgxiODIv7PrYlb9vekxJP9iy++YJbt2bNn586dP/jggx122EEPhXhEUVFl9TBS8cW+56+//mrbti0XhoaXCwZmOwhzg9erV489pW5z+/KxVLduXZXLtRraDOKg//zzDwfaY489dJQpU6b8+eefzBm1a9euXr16gwYN9EG+0we4atWqvCJdEk2aNGENMX/+fDjRxVNcOXMSVylAy/EK++hYlQP03zkoPSTF6YZTXlaiwUmTJjHR8npxOdywYcM2bNjAmgNX5yszM5MwjZU6M2/ePE4326Tly5ez5GK23nTXl6n8LxGOtC2oNK+Up7NzFnGxPL65XWvUqLFs2XIeytttV/Pcc8/tcNiBbJOs7x9tFMvns88+a++99mJ1ZZ3SjSrNoHN0rmn2tTwXuCdVMmXq3/PmzT7iiMPo57p1G1JS0jt1OqrLyV3q1qvPZjp+wJSUmTPm/jRk1Lqs7JT8vL333GXXVjv++usoJle70gqxI0vTuy0shoJnE5Y1vu6xFStWjB079tBDD+WJwA1GzH777nvq6afv2LR5CquM+MhYH9uyolq5YgVefl7ewvnzly21grk/uTOXLl2Wl5dvLURKJLoRJz/l5eU98MAD5513Hhusk08++f7779ezXjJznUdYpBKIq4Lh5VkMM7ZcEuwCu3btyq1Kie+AUyLF/dAqbgrxrAgrVarEg0W5gwYN4vK47rrr9Im+b4Mvvvji008/zcqSAFbYbIv33XdfXfzxiFKIRhALlBEjRixZssTTJpciK/5Ro0Yxh7mnVYRLT9iUM/F7sspEvFjOI0sQMc+9Dz/8sE2bNmeddZYGCgFOMO7q1asfe+wxpuQuXbrcd999FD7//PNPPvkkVQoLKbvtJPKfayMllU5Vy5YtNdbcn87ju89jjxXGClJSKxGVkZ5WwCYoTQPNKbH+m5qWmWJ9Jco5QzQVvxRKLPrAvbTTTjtxZ8KU/PDD9w0b1m/dutXAgQPHjBljrfNSrf/Zq1B22cynaZhLrrz8k4/fqFOzRkpG+pq16xYtXtS4YeM0a7JNZ9Nt9yvUleSWPSTWXTd+/Pg1a9aoP5KY24Dh0rc5VO5o4cKFU6dONctLI0aG5T8LEbqEy+qV9tnXMgd/9tln1qnkZaanpllvBlovWSdr5YrFN95y0xXnnvnSK6/ffe/9777z5kUXn/P+xx/1vPeJN19/+947b7uvdx/7u9pWMBvLEgxUIvXr169Pnz6MEh2+/PLLq1Spogclore69iQNFKuHc8455/vvv1dhpGKJfeEhhxzCGMLcv4888gh7wd69ezPmzrAHyH06ylycX1bPXAYs+7hxZ8yYwYXRs2fPAw44gFqze7o8uGD0K0KZS5hCzj///BNPPLFs7yku0aOOOuqWW27RPeWILn3yySesa2+44QZnrpXomN4LvOaaa5h046VlJ4613377rVy5Uu677747bdq05557jqHwPY8U/vrrr+3atatevTrrA1a3Bx98MJPue++9Zz0jy1rJr6RIvmKWZSGsXRFLvAcffHDZsmWtW7eePHlyw4YNKmVmFqZWTE2N8fhl1mWzlG5dcgX2T4hwZ2YylaUWWqPv3Kalf1QfdNBB3JD62e3ffvutb99XmjVrXrVqNVZte+21Fw+EjPSMlLSCNOsnfdLSCrHpTLfb1dqhw357Z6YV5scKHn38+SrV61zT/SJrx2113NW/4ogrlTvw22+/PfDAAy+99FLPhcsU++ijj3bo0IGlN4+2eKkt+n/88cfvv//+3CRldbkzGnSDeYvThPvPP/88/PDDTL2NGzf+8ccf99lnH+s+ZHTSCiqwLGL1bc+8hSmxTz545+KrL25ULfOZF145/7Lut956Y/sDdrn+xlsOOOyIHjfcfHbXzm9+9LV1gjeJQbOGtcTSgyk3N7dv3756F47HEz0HcNV/iT5TyBARgKV2wYIF7LXj1ZGKIwbztttuY13I0/nOO+9k18XydIcddtAnDgTovPhKVQEBpVfbtm25iW6//fYPPviA7Sz3DnOVc0QuAIEjSnR5cDfdeOONPBbYqHneFymlaJzlSNOmTdn643rabNSoUbNmzXbZZRe64aliVFu0aMFzMjMzs0x64hY3AgtT+vbUU0+xVPr0009ZSTP7OgfynCaeUTyFTj/9dLbabJP0aeDcuXN1x8WDyk7+cy2H3BZUmldaqVIlpgpO6ssvv3zTTTexSTrhhBNYteknzVPtjzg5gvWlVia0eJL1zeSNNsX6v/1PX3y15r2S9ocXwsXB6rV58+Y9evRgYc7SjKuNPeVjjz1Wv379WrVqWbef1SdrlmcRYKeRaPXCSi5Ief75F0eOHPXpJx/WbbhdWkZeYcqGwpR8u390ryTftmU+q1q1qrb+XNYqpKv0hLuRpTe7cD0UnCqeCNtvvz0dZqWJK6m2NDryyCO7devGGvzZZ59lmc8Clq3MSy+9xLy722672cewzwMTp3Xi4upw+OF7tNhx+KSZV994Y4sdm3IvT50wvVu3i444cp/UtNiff45r3bRVZlpBnjWajI++Zx52pHxfl9UTey3C5l7Dgku3OYmXXHIJVxcuhVSxCeNJunjxYjsjpWHDhj/88ANbW7uZSMUTA8jEwC2z5557Xn311cy4XH7OZVm24lhxCiH73Frxt95661133bX77ru///775557riYqjxQs0fl169aR1alTJzbBbPW++eYbCsvkRakdNrVsCnnacLh4hS0mPDa1I0aMuOeee9SZeIWdyKqXp
T83oN7fLnNx4t544w36dtpppzHR2nsMS55OIkp4+PCAAn7++Wf9jBBr1pEjR/JMKG73yE2qaF9bcnXt2pUH3LHHHsuy8cwzz+QW5bJ+/PHHd9yxuf1rISxxEviXVmD/yooCFkuUxdgzaUtr+9YD3m6PlVQJN0U630yon3/+Oevfiy++mC7RPa6hiy66iPshfjVYR7LeF07duPuyjx9ji/3Ek0/9PWXqwG8HNtmhyddff1dQmJ6almlPxnaU9a8Y0np27733njdvHvcVLtduvM7uLctPqk466SRriFxXakZGBi9h4sSJTB7ONByvK6lohCmcNRCnhmXQq6++ygOIfT/PrGeeeYbHljbQBYyCtQiyM6y3vlJ32X2P2dOmzF0T63zcMRTnZa0bOWLS4UcelZGWU1Cwsv+nAzseeciKZcvWZ2fbZxTR15L3lkMKWAfwSNLLR3///fdDDz1Uu3ZtnUftZZl3eaCwllIKcj69i1Qy1axZc4899tD6z7Pr4iww7DodblHFIjLu2K5sGYpu6LnPdpANLp1UIQJ0XF0SzqEBlmJshdlEch8x7b355pts13hdqlVYaaQdP9ckd5aadaQq+qkvHsZLbdm9TmX7qPe3ra6X6XCpfRbxbdq04TxyHzkvWWKgkIIRwQRMnTp10aJF2qCPHTv2xx9/ZHWrgLJVNNeWUJwnbkiuNu4BFkdcYex0OcF16tThDDLX5uflW/OrNd3G79R8a7rFsX7I1drnsaeklspYfvyKK8Wjkg4gLmJuMETfePjWq1ePLsHx+8E6Woy5No0ndiyvIKb3TPP+97/Hp02fce111y9ZvHTAgG8nTphTWJiRUliJB7g9yRakFnMRwODQGTpAfzg6TImq6IndU2u4qEJOFUAK0g2MG+926cSxaIdm2TE3bdqUximpUaPGzjvvDOso1k+5s4u3fulIQX78qWq9h/7H7+PrNWnauEHtjNS0v/+eFEupuHe7XTJSMmdPGz9nwdpTTjrmqcefXL5sJSPLXM2xSvPkUE+wdO/hhx9mL87DgnX6Mcccwwq9d+/el112Gesnes7ThFmB1UOrVq2UuGDBgieeeIKllZqKVCxpzBHjj3SFUKja8ePH9+nT58svv5w8efL999/fv39/LhbEro41EDPZqFGjOF/fffcduyLrOiq7+YM+SFy96iGCdZ3wUGHB+vTTT7/yyitr165lhmBLRx8oZ8E9cODAXr16sZw94YQTHnzwwX322YcUJcZbL6loRD0RqD/xuo21FHIgxcQrNo4zJQiW4nVlIfVEY6UOIBhxUlheP/LII0OGDOHEcb70q9kQ5Q0aNOjbty+nmM3Siy++qB+7jzdahrIfK17RAweSMgpguQIUwHIFyMO+Ve5yMfJlO2QTY9977z1e/p133ulUCZAdFQ9jB9auXTvHdQOyol3lDsfy8tZnbXj44fuvv+KCug22P+uS65564pmlK1fk58ZmzZz88H33nXrycfUbNrnp9p6vv/paXl4e90e+lRv/tWpOO8hhu3lLAey4HuYRwE345Wdf33/f3Qfu1mav9sc++EDPkSPG5OTlvP36i1z/Kexi0yukpaYxy73Z76sNxMdi1mTMwoClgNW7eFMCZB/BezgB8rDjOixA7nLky3aIJV+2Q4owCmDHBRgWxv/X30b2fOC+Yw9qt+uuHW6/554BXw/YkJ2dk5d1yVlnXH7z7cTk5ua91fd/+x14xPpcFlGxGeN/3nP3vR7u88xjDz6akx/Ls569DJHzey4L5syZw0zJHAnr+Wsd1RYuT0CuPR7i8SJbTpewdCknJ2fJkiXLly+32o7FculBXh525cqVLMOz2Uzb3z3G4nIZ85TheWG3ZLXgQACjAHZcXxYgT7lYgNyMfKsAKSlb0bYCGK1evbply5bNmjVjAHEZQx6m11xzjWold3wwZ2VlLV26lJ0iI6+37nXNrF+/3ilftmwZlkKk86J0AVJT06ZN47wfffTR7kLkZuS4dlQRRmL6gOgbh0Z0A8vr5QqhA7jWN+WXLl28eDGWK0Rf45DUiGQ36T2cAHnKTRZIbteXAcmX7RBLIdlxwzBjxYljKHQSuSRwOVmUn3XWWddffz0DNWXKFAoZKAp1EpWLnKawo0ePZp/w2muvqVBSWLA2vfPgFsma2FUbzHITMRZXEMxYT66bsWaVu1ws12RPPKBPPu644w7WOKpSjBirsCuvvJLBHTNmjLspgRjrlDtckBsrTE+bNWtGhbScWIXqObGUtOycHZo3q5JRISdnzexZCytVTU9Nq5yVk5eZkt98px1TrI9rrXbJRZ5DuNtHuIkYK9dk7LLFy9etX1YhNXNdhaqZ2Su3q9Wweq1qa1csXrRktfWrLQqtbx6npxXWa9q8apUKmamxtMJMtoXsfcMfTmAy1hMvEGPdrsmKQbgmu2PMXJOxTgqXOrB27boFyxbVTC/YUFijIDW7Wmbl+o0asHP4Z9rsCrWqN6mzHeOzYunM5TmVW+ywfQbZ+ev+nDktJytz391ap1ZIT0u1bjZ2DSzr2Q3TLTYcrVu37tevX5cuXahiItRxddBTTjnlq6++GjdunPO7NZDTJQCL6JuW5CqnEefGRO5XNHPmzL322ostzsUXX6wqZTlt+rLcRIwNSBeIsZ4YgYeRb5VyEW4wm7kmY3liso3jScqdW7duXR6vu+222+mnn/7MM88oDCnF3WwiRgy7Rl6ihABKdHZ0cnk6A6qVYLUjxk6fPr1Vq1ZHHXWUftOyE0yYO9Fxle5mubCkckrcFxiiBNFDuU6M23UYK9dhpylPuckCye36snIRrsnBuSZj5YZknUFYgilh3u3QoUOPHj0uuOACShRAsM6sXJVgcYHff/+dlOeff/6iiy5SLXLGNkD+EfZBt35tvleaYb912rLlzk132m3HJs1aN2/WapdWVSpWTE1Pq1S1ZuvddmnarGWTpk1a77zTTi13TkvL4B7d9K7K5hFXT4NG9XdqucsOO7Vo3aRRy1a71qlbu0J6hbr1d+CJ33a3Xdruvuvuu+++y66716patUIqXapIUjx56xXDkpGRsV2t7XZt2Xr75q1b7rRDq51aNN6hsfU+VFpmy9Y7N6nfgB1/elp6/YYtWzXdIZO9P6eqQtU9d9nzgHZtMyrZ7zemZVj/rOFKCzlkwdcet646JpAo1/tjiHJJwRU3/qCXnR2pjMUIcz9r8OUC+vjDKdHJQlZCYpXVOeJAHJqDqldy7ePHwYlB6psSt1lpKJA1RvZosBq77bbbmjdvPmTIkNmzZ3NqKCeA4VJKeJGbVMln40gl0abBt7+HzP2lIvteszz93yqKy82bSxuPbh/LOfXAxoPzX7vW7uKmXm/1sl8y/+cOtF57vNC+iWzHJu5Eu9yO5rRawU7sxn+RIkX6b4g59Zlnnvniiy9eeOEFZlz7Ht+MiubaSJEiRYq0bUlvEVeuXJkZt1KlSjAlenNoM2nT++ZuUahJXrXBLDcRY3EFwYz15LoZa1a5y8VyTfbEA/369Tv33HOvvvrq2267TVWKEWNxCwoKevbsOW7cuC+/
/DJt4yciTptirFMewHJNVgzCDWYz12Ss3GCWGyZebiLGelIEJmM98QIx1u2arBiEa7I7xsw1GWumlIAFktuFFyxYoF/7ftRRR7nDpMsuu2zQoEHjx49P9HmtSjyM9bj6CCo/P//vv/8+4IAD7rzzzjvuuIMLlXK9S6Z4d67ZTiLGBqQLxFhPjMDDyLdKuQg3mM1ck7HF/bwWxezvr+lA6Rs/hQ1zODdj1YJKPIydPn36zjvvfPTRR4f/vFasD4M5rbh0jxKVC4gEFO+w3DCM9aQLxFjfGLFAcru+rFyEa3JwrslYuQ5zxgEGCsuIMVCArG88Vm4YxuICvp/XOpEB2vSa3XIO4xwjgOUmYiyuIJixnlw3Y80qd7lYrsmeeK7ajz766KyzzqpWrVqVKlVUpRgxFhetWbOGZ1mtWrXc5QIx1ikPYLkmKwbhBrOZazJWbjDLDRMvNxFjPSkCk7GeeIEY63ZNVgzCNdkdY+aajDVTSsACyeNygS1fvrxGjRr6+SVclStm9erVubm5Ad+NUomHsW5XJWjx4sVczFOnTuViPvXUU9u1a0eYIwVjTZabiLEB6QIx1hMj8DDyrVIuwg1mM9dkbAnm2nnz5rGqnjJlCnf6u+++y4njrOmpjQIO5zk0rtOmh7Elm2uBzz777KWXXuKFXHjhhVdccQUdUwBWAU68w3LDMNaTLhBjfWPEAsnt+rJyEa7JwbkmY+U6nJOTw8x6ww03cGdlZ2e/+uqr3F/lfa7l0aBk5xgBLDcRY3EFwYz15LoZa1a5y8VyTTbjBw4cePfdd3PV6jlIlcrFWLnchFlZWfpBRqdcIMa64xOxXJMVg3CD2cw1GSs3mOWGiZebiLGeFIHJWE+8QIx1uyYrBuGa7I4xc03GmiklYIHkdmGW1ZMmTWrevDlPbcopQaqVy4X33nvv6bqSzGY9jDWrnHKBR54YD8tNxNiAdIEY64kReBj5VikX4QazmWsytgTfQ+bRvGjRovPOO4+nweDBgzMy7L90GeJwbsbiCkzGlux7yIh1G8+riy+++K233mKHoELkHMId79tsAGM96QIx1jdGLJDcri8rF+GaHJxrMlauw3qSz50795FHHhkwYMCIESN22GEHT4yH5YZhLC5Q4u8hb3rNbjmHcY4RwHITMRZXEMxYT66bsWaVu1ws12RPPGeFpyHrINipEoixuMTcfPPNY8aMGTZsmLtcIMY65QEs12TFINxgNnNNxsoNZrlh4uUmYqwnRWAy1hMvEGPdrsmKQbgmu2PMXJOxZkoJWCC5XZgbnsf9G2+8cfzxx6vQEWEEoEqVKukWVaLZrIexZpW7XCzXLDdZbiLGBqQLxFhPjMDDyLdKuQg3mM1ck7El2NfyQFizZs2+++7bqVOnF154gVpJtQGHczMW12nTw9gS7GuxMA+i119//fbbb2f+8KzPnBism+WGYawnXSDG+saIBZLb9WXlIlyTg3NNxsp1mJMIo1NPPZXF05dffpm58df4ODEelhuGsbhAife1yWfjrVIMDeJhhyonVhX776uk27+PKV4UKVIycVFVrVqVC4xbHeYqilcUFQHxyzHSlhaPaSZCVkiHHnoo50U7pPIjuvTrr7/qF3TEiyIZ0uy4ePFipsO99tpL7x6XH/nPtXR6qxdngnk0+HmnWoIThakpQEsqWVVtKdk9svogQCrfZqURcIbCsZv1ZKl9gCsHMK8fSpxCgiNtQel0jB8/nrXRkUceqfOlywM5UCYq2elmz/3bb7+1bduWHrLHpYRelW3HtgJx1rCTJ09mum3fvj1DrSHKt39vpR2yuWTdxsnkP9daj4GtWppl3ZBIjAZDGXdcUrlOYVZW1sKFC/UXGXG5Gf7lMeRwjugA0q9t4yKLl24D5zSROEe8fA0LA7Jy5UpuRVjnLh5UpnLWcHIDlPTyi/QvSGdq8ODBu+yyy88//3z99ddfddVVo0eP5mop83u5BK1xlc6wtcMOOzz44IPXXXddr169eOZQpWtbYZE0FJy4ihUr1qhR45Zbbrnsssu++uorBnBzjxLtJ9W2u68tpRglLvTc3Nznn3/+tNNO++ijj7p37/7iiy/27t3b+ebbvyb7pMU1derUs84664EHHnjc1jXXXDNhwgTK46HbnpjPeGJyslgPMRo333zzq6++ygPrzjvv5MEaD4q0DYu7Y926dVwM8+bNA5jPunTp0rVr19mzZ1PLk1RhZaIS3Imk/PHHH1zDzCKnn376Qw89NGbMmB49emglHQ+KtPFMjRw5kpuds3nHHXcw3bJy+vLLLzf3QNF+Um2j+9qQYhB5UifafFB7//33v/TSS3379r3B1q233vrXX3+xZUqx/nBeSmphfor1V36sv/dj+Sn2OrQwllKYT3F+YWEu1TRkD7g96CUceZqmZQ40bdq0zp0777333k8//fQDDz44efKUd955b33WhkJCCmLWP7bd1p+uz8+zNnbkWZOQfjkSXaKAHuuatf5wDWl28/jWf/+b0p3AdvaMM85gbfTyyy8zyzZq1Oipp55atGix9bp5pdb54YQQnc9Gxh4n6w/3UGK9coAm9IuhKbAaLCyArAHiv96zRiyKO5HKgYJPB7WTJk1atWoVS7ELL7ywevXqbdu25bJhTUYtkJeXx7yrt0MIxjrAFLh06VLrJgqnYl0YzlGYPxo3bsyly86bHduhhx76xRdfOG/PLFiwYO7cuXQSFzk9XLJkyZw5czZs2KDuUS6p8a1PvEy2+8yyJ5xwwj333FOnTp1WrVptv/32bIRYlzAg2dnZnEeGhUEgGEthmQyINazJtI1+NyqkOBOcEp0Pjyjkcn/22Wd79erF6WS4sVWqVOnQoQO1BXnZsbz8z/p/fNNNt151xVXXXXnl+ImTs+0ftc7L3bB6xfJBA7+87qZb1+Tnx2h841+TpdX4f4sv+sNNdcUVVzRt2vTaa6/FTcvIaNZ8x9q16u7apk3M+ns9eX9PmNSz5wNX33j9FRdd+N77H2bl5xfk57E3H/TDzz1uue26a66+7NLLBv08dEO+/ZJZE1h/dddq2vr3XxYv56677uKR9Mgjj9i/7DRlpxY7pWdUOOTQwzi/sdzchfNmP9Kr103X97jywvOfee7F1WuyGAImXP5nn5sC7SCsEbH+RxnLKLFlIpVncfat6zlQbBwrVap0wQUX6GscOTk5PJfZ5lL1008/XXrppa+88sp555134oknTp482bpmYjHWtVQxN99+++1qZHOInrNA/Oabb5g/GjRooCf76tWrly1bxhacPrA+YGHNNrdjx46DBg2iY2jFihU333zzw7aYmN9++20aoTVNMFureHUsSphudR41Viyh5s+fz5gMGDDgkksu4Tx269bt9NNPZ3US5sIoQ0XvIQeJuy5gQP73v//Vq1fv6KOPVszEiRM5r/Zca30O98G7b4waN/mennc9+sgDGSmxried8M/cJbHU2B9/jH/m6effffeVYUNGWFtL63FdBuKIQ4YMGTZs2GmnncZTg4ssLz/266+/7b/fPtWrVEoryFu9YvFtd9131oXnPd7
rf+ece/pd117zzKsfpKam/Tlq+Btvv3/jLTc/0vuxtq13OueUE4f9OjaWmlaYav0FdLtt1tfxv1v+XxRDMXPmTNa2PK3q1q3LQBWmpP7yy5CWLVs23aFxakF+fl7WLTfdctCRx/Xu/ci1N1/38hMP33J7z7yUdF4zZ97aXDC61i3JoFisfxlgCsXp1mzrJ45LTtyJtOUU5iyMGDGiefPmTGaajdiqsnJlB8mUxvqM9bT+ci0LNeZdZmKuolmzZvFYX79+PRtKsuINJVOxLgldQn/99RcPlv3224/5g0JKpk6dyj0O3HvvvUz2TKhvvfXWEUcc0b17d/ZtZNHhvffemwfU888/f9NNN1111VX6uIQqu+GtU5yUMWPGcI+3bdtWr5QtLMuOatWqzZgx46WXXmJRct999/Eo4Pxed911nLgSDIjOteeMcy6SKnoPOUgMRaNGjfQ9+3jRRnG2OK/HHntshQoV7DFL+eGHH3beeWfuWGvoCwu++vSjYWMmVq1adbvtajz0wF1LF879csAveQWF++6/z91397zworOs3RFn2nofsgwGnIN+/fXXdOP444/HBVasWD7291GdjuxIZ9JSC0cNHzb4l19XrV1fKSPz4I6HnXxsp1ff7m/NOt9/P2z4rxkVK1WsXKn75Rfs2rzxRx9/w6YtxkSbmm81Te+K2UF7PKxnimORu0pKVOWUS6otsWiNBw3PRE4WdyMNxvJjQ4cOP7RDB/uFFcydM+OHH3+eNXcFS+HWu7e+4tJT33rz/bU51jvtqSkMXczqgf3melpKPiNC53hopaXms1Bi36+jeJSZmckNrx/sibQFxel2X4fx0qJiWmU/xPnSppawX3/9lY3gUUcdNX36dNavrGIJ45HduXPnv//+e+HChcR06tSJe61+/fq0rHbCKFEffKU+c8SKFSvuu+++uIgnz9ChQ5lKa9Wq9e2337777rv0mUXAcccdRxUvhMRRo0a9//77mg+OPPJIan/88UctI4rVgUSiEVpWU3aTm9p0u25GchG58aIyFc3++eefu+66Kw9tvdjffvuN1VL79u2nTJnCWLFqIYzJmA3SH3/8QRUpRK5ZswZAa9euzc7OBshVm6aoxdqvw+elBch/ro0kMYjcV1oqxos2as6cOdj9998fy+hzZ3JP7rbbbqw3hw4Zkh8rPOf8i8458zTyCmIFuTk5DHWlShV0Dq3Pf629rzXLWvf2xt1zKUWXGjZsuMMOO9CfWEFs/Lg/169b2779AQsWLJg6fcae7fa9+pruTXZoxGFT8wtzc/IrVKyQkpZ6dOdjL7nswsqVrT/Kxh4vNz+/QmaG1TH7T9nEmy6mdJNjHcCqyurbxj9yjvLtr+OrClkJtpQVLy2dOHfTpk3jMbrXXnvp6JzTaVOndDi0w+o1a/4cN67xDk2uuvaKNru3sv+wUUpeQUbValXT05mSs+66/fbul1wy/q+/33rjjeuuv+nqKy4fN3Hq99/9cMutd91w5RXffPuD9a6ycaNxRI4yfvx4HuLxokhbSJwLbjFNouZdLHGyWIqxUFYA12S/fv3atGnD7MWU9t1337FKo5yLZ+XKlcy42223nRrURpMswG6pjKX+0DdmdFb8HAuX+YPt2hlnnEE3vv/+e7atFFK1evVqusQ0w8thO9unTx9cpJ03LcBWo2Uhhgixv+e5B7hvVY6OmLE4KFXqsyNcJwX21JZevNimTZsybmr8p59+qlGjxrnnnsuCY9CgQaxXiOHQbHYpZyME9+7dmz3ugw8+yIg9/vjjF110ESuVgIcP55r2pXhROPmPvjoaCSUaEBZHlDdu3BjLiZk6derEiRNZLrFE+m7QoJT0jOO7nnHpud24Hwl49rmXGzZreXznw9O5Ma2v28RSCpnb4ldb6cUhOPG1a9dmV82lYF3HBYWffPxRix13bNqs2edffbNiTVbdRs0eeqBnw5o1C1PyZk2Z8dmgwZefdzozfpv92t926w1VK2ayaRsw8PuZ8xefc9YJ6WzXCtJSCuk8zxL+laSrXNBPPPEEjwa6h+Kldm8nT57MZa1FDI+AeIVd9fnnn7/yyivcq1SpRFWlUYMGDerUqVO9enU9bj7+6MMaNaq3a7fPyNG//z39nwrVat597z1tWzbibli2aOlLb3x28cUXVq2Q+vXnnxzQvn16atrJXc/brmb1Pk/0qV+n+mmnnjfi198efOj+E446/LZb78u1HiaR/ttifbzLLrswQ3DjoP79++v3/HErUdWhQweuHC6befPmvfrqqzyLmeTimcVXsa5nBe+4445YOobL6u2xxx7r2LHjpZdeym51v/32Uy3lzK+sCQ488EAu8t13371169b0nxf1wgsv8OrOO+880q35YeM8VBrRlLYWTz75ZLxoo1S1xx57PPDAAxxLL8ER0xhL3nvvvVeup7aU4oUzGrTJq+bQ48aNe++99+68805mX2bWww47TL/6nofPBx98cM0113Baf//9d4bxiCOOeO6555iJ77777pNOOunWW28tbseITyr/udY+I5Hi8h0Qto977rnnhAkTWL5xZ95+++0s1urVqzdkyJBWrXZOTcuIWX82nCdxyvvvf/Tltz+9/d57jRvWSY//0dNYamGGNevi2P82/bdE0lk74YQTZsyYsWrVKm7Ll19+adjQoXXq1mGS/HP8hJ132b0wvUJaYX5matr8ubMvv+SK62+77fILTilISYulUs6FVfD76DH3PND7hdde36/dHumpBakF9jeSrU6JiiFucq4/brk77riDW45L3zOGffv25R545513YCJViNatW3fLLbdwG/zzzz9K8SSWQLTfqVMn1uCzZs2iJwMGDPjk009q1qzJ5nXgt4Pa7d++IDWTY1RI2ZC1bt2tN9y63yFH3dvzFnu1UXjoYQf/MXb8WWeffWKXE1i/rFy+svWuu9x++628xBUrllWvUSstI4VA+mj30/5vaU5kpC2hzMzM7t27Dx48eOHChd9++22vXr2eeeYZ1s1U6eLk5C5fvpwd5KmnnsqDWCX2Gdd597oBShrgloIPOuig7bff/sMPP6R73B08ZJjytbsigB7yCGJzBjOtVqxYUeVMPDwH3n//fW5DLHsDIiWr6VKIxtGkSZPmz5//xx9/xEtdmjJlCsvov/76Szc+ilekprIn4dZmFuRp6akqvXi9bGHpGPrzzz+vvfZaTtkVV1yhWh1ryZIlFOoL55QwSqeccgrbJB4Rhx9+uFYnjLNm60Qyu229kmQqs3cVtjUxvm+//fbIkSMffPBBLnFuzocffvibb74ZPnx4166nMnVlFMZSCjI+fveDfu/37//FgD1bN5m7dEl6Wl5uYUFOSoXMWFZKWmZ+QUpOrID1dEphQar1+W3JTwcXB3Mtl9dtt9121113Va1abdC33zZt1vTeB+494fija1auZH22GCtYOnfOZRdedtlNt99xx61zZ83hWZLKtjwle9io32++vdfb73586onHzZrKnFRYmM4Uy96WuYTdbfE6pvuZdeIZZ5xx+eWXU6JVuapQ165dTzvttOOPP9
4plypXrsy9QUqTJk3sazjURRwsGmnbtu2zzz7LTvq+++7jEfDVV18e1uGQXo8+2mKnnZo3aZqRmpZamLZuTdZ1112x5wEHvvXKM0sXzyvIL+xy6jnLV8yds2j9qWeflJFWWCGWM+yX4aedflLFSulVUvM/+/LLA/fePy0lti5WkGt/XTv+ze34YSP9l8QFyY3z8ssvMxN8/vnnXJw8dnkQa0rT13rZ8XCPs5bVR3pIVy8BMIArKCvRMn2oWrUqEy071zfffJM+9OvXT19I1juxLCIfffRRjssejmDnB1qopYQFxGeffcbGgP0AKWq2lOJYHOiCCy7o379/nz59aNbdMrVMeIzh008/zSLGc9Bu3bpRxRacNQGNeGpLKRrkmcM6/ssvv+SFv/jii+xT2cuyc+XxSMfmzZt3ww038JDs0aMHA8XQEc+ulw3SAQccoBi2T6xLOO/xRhOoJD2ndVMclbOFgKSMAliuAAWwXAHysG+Vu1yMfNkOKcIogOViWfDuvffejuuG3Nxc7rq1a9dyzriycdmWZWVlAQWxnFjOhvf6vnrtpVcsWbh09dr1X37+yUv9Ps7NXZ+dt2F9Xu6wgf322OfAhVk5a/Niedavp8nj7lCzyDkEch9X8mUOyvIW0Qd9QpOfl8f9uW792ty87Lzc7LzsrFl/Tzq9c+dhg4euXZe1cOmKy668Jpdj5ucMGfbjKaefNvHvGWvXrJ8xbfKdd/e2PlGJ5RTErJvZPnb8KJ5DO67DArIlqxu2VC5RqE96sFaP8/IodNqhHKnEinYB8mXAw8hdTgdok2FhcKzj5eZkZ2etWbN2Q3a21be87BVLF1x52UUf9Ht7zZqVK1asuvDi7qvXZ1P+6YcvNdxxr9U52fmxDTMm/NmsYfPRE//Kz8ufP2tyozo1hw8e9fQLr0yaNS+b6TaX4cqNWT9EZR2RdX2lSpU+/fRTHV3dkAJYrgB52LfKXS5GvuUmowB2XF8WIE+5WIDcjHyrACkpW9G2AhgxHbZo0YJn6OLFi3GXLVtWv379q6++WrWSOx7g8rCvROtyhXXFYtGCBQvOP//8L774ghl35cqVLNeWLl2qKsS68JxzzlEiUoOS0z6yj2Z90sQj96ijjnIXIjcjx7WjLKZldQ/pQIh+cj2zsGYjzszBzvuVV14ZNGgQ5dxZTDnXXXcdnafb33333VNPPeW0Jis5hW4WIE+5w3TDPWJIVVhY5QhWvJ0az9W9L0ZOuWJQSHZchwV6GgM6jyqnM6yiTj/99B9//JEB4ZJgGtbTYM6cOUzGo0ePhhctWtSmTZsnn3xSLdjNW7La3XiIUaNGsVB49dVXVSjFJ85ARe8hl1DW2NnLXh6pWFwKK9hiuZZSUPjRe+9de+3VI34dduJJnTt2POyKq67ZZefWKYWpa9es+2PMqMFDRsybO2v44CEzp89QrvXN1pJKpwxLU/SHPtjFaRkZmRUyK6WlpKanFC6ZP48N9/AxY3rc1uPITkd1OvJI6+eNUgp/GzHs1FNO/+uPPy849+yjOnU85qhj6zVuGEtNi7Gdtb4eRTt0ryTvO3FpYtWrja8xvhgENHqOqyquWlW5y0sp2uHotMkqm30zoINXqlQxg71LSmz9mhVXX9b940+/ePLp547sdEzHTkfNnTc/IzOjMDVt+OChhxxyUJWM1FhKxtixYyrVqL1Li+Y5qamLFv9Tt0Hjyhnpf/4+afuG9dOsLyajkp/BSFtQXCHav+pC1eUHYJlczzvvvG+++ebWW29t3779fvvtN3DgQLZKBDO3jRs3bubMmfPnzwcWLlyodLvJhCrZVW1ftPHds0QJ9xcT7TPPPPP6668fcsgh7MwefPDBJk2aUPvWW2+xER8wYMDhhx9+4IEHshrYcccd1RS1ghJLR2fENFBSvG5jV93g1Hoiy1yMDw8Qp1cCVXF2zjzzzF9++eWKK67gPDJWv//+O89JAoYPH85rweWE3nPPPVRdeumlygov+4BJVOT8OXI/JbGJ2ClRvEQhlnJ3gCCYsZ5cN2PNKne5WK7JZrzcRIxV7pVXXsmSZ8yYMXKdcjvQp9l4VV5u//4fz/pnlvUriYhPTcnIrHj+pd3r1ai6dOmSDz/6sP52lfLSq2XnFVauUOGsM05Pty9Ld7Mmq32Em4ixcotwQX5KLH/65EmffDWQzuizRTrW4bDDD2y/369Dfhzx6xj7Q8cC68dZUtJOOffiJk0aZKYUVGCOJj0lRmWq9U5y8sMJTMZ64gVirNs1WTEI12R3jJlrMtb2SC9IieWtXrn07TfeWpdvffLKCy1ISd1ttzYnnnQ8U/+X771Vs9XeR+zXNj8lffxvg6fPX3Na15NiaYUZOStffOW13JXp5195We16NfVsVps6xLx581q3bt2vX78uXbrgctvruFZEslEK7La3yl0ulutbriys8zxSCaBbGKsFkJnuYYEY64kReBj5VikX4QazmWsytgR/v9Zkucyjb7/9NoVIJc2aNTvjjDOAsWPH8oBWFePGTrpjx47O8hE5KWJsyf5+rZvlirOzs3lFvEzNapRUr179kksuYU357rvvau63M6xcnmO1a9d2XOdwivEczkn0lJsskNyuLysX4ZocnGsyVm4YnjRp0ueff67LW5f67rvvfuKJJxJzyy23MFYnnHDC0qVLWZEce+yxDCBhVCkXOU0BnPR/++/XSmJWE7rI2Izj+nZRJQGM9eS6GWtWucvFck024+UmYqxySzbXxvLzufxVaH0amJKSX5ianx/LZJAK8tPsX5NQUMjGkSArgzWikBRluQ/hbh/hJmKs3DgDqWwW89l4FeYzX1QsYDFq1RTwv3wuEWtrm5dSmElkGhs5MgoyclIK01Jj6SkFaSkZqSlpdNxudOuaa/mv/aspChgc63dAce1aP5FltRnjerbvycJ87ssN6RkV2OBzTlPyWTQVpmamp+WkFVCbVxBjNZKansHun+aKHLoczrWynncOkIIlVfmme1ggxnpiBB5GvlXKRbjBbOaajC2TuZYnsoASSUNHocoFzkOZWgDXiQGcMBWW+O/XOixXnM9Dxr6udMpUogA9jRWmbjuFKsd1GCvXYYEY6xsjFkhu15eVi3BNDs41GSs3DLvPnc4sIgA+9NBDuTZuuOEGzcFId4fYiVQJ4DvXOpEBSj4bB0gHeP3115mQHDcSKkhLy7NseoH1QGcCs0Ym0/rOKk/4NHZT+YVMYzzBrQqNm3Uyy1hcUmxfrcskxonOqFjIREtZamGafbUxkVhzDtMF+1f6aX+vhw5lpOTzjzlHrWyGjm1x8ZqsfwWMTVp6YXrFlLRMvNSN/9LTGCRuPEYgo0Iqz1BusvwcnqsMaWosLZZhnb6CirEMBjNmf6ntvyGeFOvWrfvkk0/e3yhWA0j8yy+/LFu2LB66zYu70vNAcxYolIs11TlhAC6Su1nFUdQH9+EoUd+cQs0r6phKtk0554hxcDPLr7lz586fP3/58uUaOuRMxmUr/7mWToQRiwW6+OCDD7JeI0tndOtTCV5XRlpapvVZINOs/e5sWkq69Ysh0
jL4H//SMzIpSefUWqtgbYs2w9ixn2ZPTfvWoTiQtay1tmbsW63PWtIyrHK7S9yl1NBh9tdsagnMSE1jOxeP5nXEm9xKxFmxtvRc/dbJsc4BL5l/FGi00hmR9LQKqQxRSoY19aZlVrYKMzKs02jvENJTM60Y64eDrAYNlbfbgZNvP0nS6tevP2bMmPPPP/+zzz5r0KBBo0aNGjZsSMAdd9xx8MEHDx48mO0RjxtlbR3Sywn/ZEN6oHGmGTFYLoyVtE0EVGsPrSU43oSfgmvDi3Y4NIdzdqtIJVbnNvaKWqTyeOY2KY2GxkFWOuqoo3788Ud2qFWqVNFAIUE8M4E8N4jaD5b/XKtDJhXHu//++1kUIK0F4hVbi9wDGi8KJ87nppvSlss1KzevdDAuLIctbaS0TbfnpkquN/ufI3fVVqWiLwxP/yQtUeJ3Z4ZKLFaxqilx4uPSBaPboVzdFHSmevXqhx12WF3790Jfd911Rx555OGHH96xY8czzzzz7bffXrVqFYWrV68mslz1vDRyXkh5eEX/fh+sKWXjHR7JLebUPffcs3Xr1m3btq1WrVq8NJl8zyCFSeU/14bUH3/8wW25zz776GfOQh7yPyReUUZGBo8nPTojRQopVmk1atQoV3eE0xkuZtbytWrV4s51CgHtcZcuXbpmzRotnbcaRfdvJI+44E3F65LJWkoX/4ryn2vVViLpWwC5ubl9+vS58847uUXz8vKYblW7xaXuubekAQoOY/QJ4NHDCiheFClSMnHZsERzZqx46ZYWnUEAHdOv2q9UqRKu08NZs2bNnj2b6bZOnTrshMpPzxPJeUXB4oU4j9F40ZZTeehDpNJIFxJXVNy3pcJglXBfy3z24YcftmvXrmXLlo0bN2beXb58ebyuHCjp69ddCiRdy6id4NYiRfKIC0YXWLmSevXXX3/l5OQccMABKlHh2rVrH3jgAabYu+66q3LlytzgStlSUq/UvQARGXKcnfhIkbaIiv15LbVc3IsWLerfv/+1117Lho+5Nt/+7SrBif+aVq5cOWXKlL8DRcDSpUt1M8fTIkXa2sUdymw6YsQILvv27dvrmxZz5879/vvvu3Tpsnr16q+//vqkk05yggX/mnQzYpcsWTJv3jx1zyPKJbrNnc4trNz/isrJQzJS2YrTmlTWlR1Hl3QFA6p1s1aRd95558EHH3zCCSdQ1bdv36uuuuqNN94497zzcK0bxvpJiJQM69ue8fRC6wdIrHmdlqw/+ZkSs76ia7WKl2H94In1fdd0/m8fi6x4O0TYyEHT7O/0Emf/bKRVE++8u3uIDTfrAOcmNGPEp5xyyllnnQU4b5epXIxVbsl+vtZkuSYrBuEGs5lrMlZuMMsNEy83EWM9KQKTsZ54gRjrdk1WDMI12R1j5pqMNVNKwALJ7cJMBuXwd1lwU6Cjjjpq4sSJV199dZUqVegYJb/99tv06dPZ0TLR2r/4LL4ET3QIp30x1h1DgwCiEImtUDvMZEAluFZCYeFFF100bdo0p0RWrh0YTzn00EMfeugheut0WOVl8vO1DssNw1hcgcnYsv1dFm424+WGYawnXSDG+saIBZLb9WXlIlyTg3NNxsotDcsNw1hcoMS/y8JqxRR3iwMe5grmzuQJ8s8//7C0RByVTjz++OP2b5AszC0oiFm/S9f+HZLW79HlH7n5NsQKC/Nj1h8+zMvPy43l5eZbvwQ4zyrKz87Ps36ZLLOy9VsCrKPFCvMLrH9WYj6mMEa59RcRqSu0GrQqLHZ1D9FDZB1942+wFLgZ64Q56QJkx1ou9vLLL2/Xrp3jugFZ0a7yAEa+bIdYSspWtK0AdtxgRr7lJqMAdlyHBcjDjuuwALnLkS/bIZZ82Q4pwiiAHbeULJDcLjxnzpzKlSt/9tlnsK4xQApguQLkYd8qd7kY+ZZzj9ETbtsGDRocd9xxWVlZlEi5ublPPfVU9erVH3zwQVzfdA8LEDeR8xtucbG0QCEl9h1m/VJcyp10AXIXSg7ToH55teQwIMnlJdC+cwsjpbNBb9myZbNmzdgf4y5fvpyXfM0116hWcscHMwrJcgXIw0gLiKOPPtpdiNyMHNeOKsLIl+2QIoxCsuM6LECecpMFktv1ZUDyZTvEUkh23NIwCslysSzgKlWq9Nprr6lQUliwQszGRaXbslGjRi+88AKAhg0bxqy+YMEC1gDM++ksBwpTM6xf+Efj9pbV2t/av63I3uamFOZO+mv8hqzsWEpagb1bZa+7Pitr8t9/81qotraulBWmFqTFCtPyrL/oZv32Itq1stOsX9FAQ1bj8VWHIfrDa2MFkEiKQYqPFGmrF5c9N8XUqVNXrFix//77Z2ZmqlC1rJ4zMjLYiOuP2KgwqXSXMduxd4Tlqpx2OJDj2uFhRU9YqSSV/lZMcRuPFGmLyH+ysW6aBOrfv/9ee+31zDPPPPLII7179+7Vq5f9F9yqzp8/n1oyUwpjy5cuGfrTj9kbNuTn5f86/NchQ35av15/hcpuOSX/s/de7XH7bSxK09nCpqStX7vkssu6f/PN16lp8R9pXbhg4bChQ7Nzc7Lz84cNH/bbiN+ycnNzClPHjx//46Bv16xZE7PeqoqxzLAaLKrJkyd/aMv5tTimPvjgg7/++otgPYAiRdpG9Mcff7DR1NaKix/L/c6MpY3phg0biKEQZjuoLSO7zGXLlrGDhFXOntJqy57ksGwfzzzzzAkTJuCqQSba22+//dNPP1UAlnR9vIrLvpMtKSDXFAv36dOnzwjUzJkz9ZULHeK/ov9WbyOFFKc1qayZJo4uUagFqW4GB7hbLrvssrfeeqtGjRoKw3Kn7bvvvrvvvvvXX3+dmZ6RtW5Fj5vuqFW7yti/ZjXbsWmHA9r/Pe2PP/5c/MkXb2VkpKZbO9c1ebmFt/a4fUNu7IknHy+M5Vx88aWtdmt39533VMhILyjMWbN6bY8bb69WNW32klWVqm3XtXPH30ePnDB9/u577t2sfo203PVvvv/ZNz/9UrNSZgYb41Trdww4nUE//PDDjz/+SIfd5R5Ghx566DHHHANod0sVDIixuED0eS1uIsZ6UgQmYz3xAjHW7ZqsGIRrsjvGzDUZa6aUgAWS24XLz+e1As1JuB07dpw0aRITFftCSqhVwAsvvNCjR4/TTjvtjTfeSE9P5x5nVmbO6969O/d1gwYNBg8ezNqaRWrjxo2///57ngMnnngikXoy/Prrr6TTyB577ME8etttt/GIuP/++9l6UssM/dBDDzFhr1+/fscdd2Q0Jk6c2K1bt86dO7OFRXTAEQ1ecsklzKYk6nW5X44dEn/5Bx544MMPP4zrtKCw8vx5bRn+PmQ3m/FywzDWky4QY31jxALJ7fqychGuycG5JmPlloblhmEsLlDi34dstWJK9w/iJmEVjFjbcgVfeOGF7Bf1GYwsy+FVq1Y1a9Zs7733Xrt2TW5u3m9Dfnz5hb7PPP5ArdqNxowfn5+b99MP/Rs32nNdtvU3WgtIzF2Zk5ebtXrRzVdedOH555zW5Zi7779/fc6GnHzrIyWO
8+03X733xruP3HP99o1bjf1rRn7+6m+/eTMtvUb/Ab+QN33i6GZNm06aPS83P1aQl+10FZDUMUTnnf4nYuSkC5DaEUSf1waw4zosQB52XIcFyF2OfNkOseTLdkgRRgHsuKVkgeR24fLzea1AFzzL4po1ax5++OHczrjcKYIhQ4bUr1+fxQGTAYVr1qy5+OKLmWsrVKjAWpO5kxJu8IMOOmjq1Knc8v/73/9YQzifyCLaoRFmuOHDh5Ny0003sfelZftWi/3111/33nsv21yO8vPPPxPPgpjbimZh9dAR8bTGztiR4wKS43IIZL96S2phW/u8lhGTK9nhlkKy4zosQJ5ykwWS2/VlQPJlO8RSSHbcYjHWviStEXOqBCiA5WLL+PPaeKUtXG62Xr16sQv84osvnn32WdawtE45lpUsK1ym4fnz599ww42ffP7Znnu3Pffc04cP/+2cCy7cZdfdUlLyRo8ZvdNOzTMy7WZJS6+akherWKHGPfc+8NOXX02fu+qmG29MTc3ghcRSCjMy0g86sP0pp5wwYvio8y+4YLedm6emFIwcPvjIY0445qhDClIKJ42fwGa2Yf36+bTnt1TQWltrEOvFJBYx2nAkkl4mivuRIhVfW/b60dGZdUaNGtW3b192lvXq1WNtPnLkyDFjxnAv33jjjez52rdvP2jQoKZNm3JTbNiw4YorrmBaZQ/KHMmukScAW2F2q/ozqOPHj69Vq5Z9Z1jvP+s+ogU2vjwl2Ec++OCD+iPKBGBZdpx//vljx47dd999Dz74YAppkG0u8yWsfiIxliNmhpC2s7Sv9GC5D1QC6XkqWxqVshv/ITkrgLhfnlTmZ4EGk8p/ruXyleRy25x88smvvPLKiBEjnnzySdaMTu1RRx119913Dxs27Jdffrnq6mv22/eACpUy0tNzh4z46/DDj0hLZaCzPn7/m5NPOWra9JkrV6/Gt77XlFm4fsOayy7r3v26qw4+eK+bb7o+b0MuR7V+niclvXq1ink5K3+fuOygIztlpq8tSKs6YvjEg47sWCUlO5aS980n3x1xbOd1i5dOXzg/JSVTPfGImxDL/R8gpmQFB4hB5Iph4cwrjReFE4lcZH///Tf7A9b7uuDidVtO7AMetgXEi7acGNJnnnnmuuuu44EbLyofeuutt66++mq2XMU96Y5I5LLB6sqR4nX/unQNT5w48d133/3nn38uuugipsl33nkH9+233/7ggw+aNGkycODA999/f/vtt+e+oKt169ZlFzt06NA999yzdu3aFE6aNKl69eoHHnggV/LSpUsHDx7M5kyvS+0DnMePP/64Y8eO7N4Qh1YtYou5ww47kLX//vvbSdavd2WHrR86Uj89SlTuiAAU5i5WN7BiFYYXL5kXyKKkf//+2oXjxuuKrxJ0IFg0OGPGjDfffFO/TSheWg7ENcBltmrVqnLSK12KkydP5spfzUy08SfTiiu9HKz7dclNIvXAI80NiKuKywvLUwMrQBQ67yCpHBGSy//zVv7x289Ndthp+qz5G/Lzpk8d06BO3QkTp3S/6voZc2Zn5+TE8rOXL5t38kmdH3nogdycrPz8dTfccMXFl1y8el1Wdr7180IF+et/HvDhDk12nbt0VUH+uqUrFtatXvn738YW5GevWLOkafXaPw8f2fOuh4ZPnJSfW+THCSRftkOKMApgubzSCy+8sE2bNoC7XICsaFe5w/Zo5D/22GM8DphumdtUpRjksB1uKSlb0bYC2HF9edasWexpGjZsOGfOHHd5AKMAdlyHBcjDjuswm6cGDRpwmfIcd8qRLwOSL9shRRgFsOP68l577cVZY2KQ65SbLJDcLqeeZRYTyYcffsiVYCdtEgFxMliuAHnYt8pdLkbucvvWtO5ilcB0z826kQE70BIuS43dd9/95Zdf1g3OldypUyc2o7hMOS1atFixYgUbYk4i6Ujf5Ljjjju41Jmk27Vrx6ZZzUoLFiyoVKkSe2sKFy5c2Lp168cff5xEqtQZ5PsSkPvlSAGMPO8hs4Fm9cD6SbWSOz6YeflMGCxQ2HKwKFGfVeXEICdFgDyMmH545Jb5e8hdu3blir3nnnvoqgqlkOy4DguQp9xkgeR24S5dutCrBx54wD1cki/bIZZCsuOGYesStHXSSSexPuN6hlXlxKAAlosdOXJkxYoV+/btq0JJYcFK8h4yzNNQ4BaDiOwZPf5VRsDalFq/ZyLjrwlTdm7VslGjBukpKTVq1m7arOkzTz9/yEEHNW7YiHUz0+ntN92yf/uDr7vl9sJ0653lhx/sud12VXr3ejQlxquhlbQZ/8zaY8/da29XI6UwY8b0aY2bNG/dspn13lJaxeY77fzxBx9lVqywa4uWHFX93EzidTGsmZmZvNh4UTiRiD377LNvv/12HkkMjsq3rBo3btyrV69HHnlEf0Nty4on13PPPde7d++ddtopXlQ+RJfuu+++ffbZRyexBOJ065tHXDm6D0vcVOnFoaW47/rGB4IdUUUJlofR3LlzFy9erC8P8hKYY9jp8nJwmcCYKVmLcDlxX5DCjvbWW29lNuJxTwmb1yeeeOKqq64aP3682kQTJkwgl3ZIv/feew8//HDmZuuom//W4Oh6WMX9Yop0rtVLLrmELTtrCNx4RYlUynRfMdcecMABxx133OZovMQ6+eST6RWXUHnoFWdf3TjhhBO4PjmVJe6Vfc16P3ykteSKxxaVM1EDSRmJCwoL8gryC/Lz8rOz12/Izi0oZIsai2XlZGevXbPBXt6zlGDZnLV0wexcomKF9mI7p6BgbW7u2kWLl+SRk19YEMvNy1m5fkNObgzOy4+tt5qzFtz2SnxD/qq1Wbmxgg3U2mtipw+SL9sh/t2WPCwXG303KoAd12EB8rDjOixA7nLky3aIJV+2Q4owCmDHLSULJLcLl8PfZWEyMlnLf7ah7733HttW647Lz+/Xr9/06dMBdnXLly9/8803P/nkE2rZpKLZs2c/++yzVNGC4tGIESM+/vhjudiePXsef/zxzLWvvvrqTz/9RKFzaEBKyla0rQBGvt+Nuvbaa+mGApA7PphRSJYrQB5Gm2lfK/AwCsmO67AAecpNFkhu15cByZftEEsh2XFLwygky8Vuxu9GFUtWKovIDBaCFVJTCizP+vvamVWqVLAbtfqUklahVr2GHJgtrb02SC8orJSeXqFO7Vqp+Kw+aSGzWsUKGfq1FWmFmZmVMjNSUqwdb4r1x82rV6kIZ1itW/mRIkUqK2nZXr9+/W7dunEXq/D0009n6lJttWrVzj777BNPPFHbXO7Jhg0bsiRlN0yAnh5Ydg/sbMTMx99991379u2POuqoCy64oEOHDiRStQ1K4xNpKxOnNan851rrbiuJrPd0+Q+3JFeUNTlavdBvNbZuSjsgpTA1rUC/A7kgJjeWkl4QS0+zwvIKU/IpsX5JcmpKWqq1V05NSS+0fr9yaqHVILMvzFGsY+iokSJFKivp2YH0/nC8dKModL6OZD0p7AC9Q+sEqxyJaapXr16LFy9mVzdv3jw1q9ptUNv
sC9+6pUs6WP5zbUmVmpaSzn/S0lO4HTFQmn7XhH2NpVi11pRq3a3ccun2zZzCDjUlPSMtNY1Sgq3Sjd3ixsywjD31QnBqejpXaybHItNqtjxK75s571kBlMTrtpyc9zqAeNGWE93Q+ADxovIh56yVt479O+JuQ0yosJ4jksoF1JquHWXdkHZNXJQwjOxlhw8fft9999WtW5cSxdtH+/dU4rOpK0FXRfm8Yq2L1e5b3C8fUpew5WG46ANy39rYeN2/Jf8rXj0rrqytrPXLjEVqx/q3CeXGEXEf2u7GeHvy1PzptGE36GTYoU4GheVTPEqwzqllouX5oqotKI2YLjKVbEGpGxqceFH5EF0SlOcLbPOJV60XLvAoIMZd5VZGRkbTpk0bNWrUvHlzfWUMKfhfVsmOqwUEoCu2ZI04KmW6r9Tm5mi5lNJKqzzIOm322hHGlvkzUO0Hy3+upTeRSiNGHztmzJijjz76s88+0ypeVVtQ69evv/LKKy+//PKsrKx40ZYTQ3T77befdNJJa9eujReVD/Xp0+f4449fuHBh3I9UOjlX/n/0wUK3WTTPnj173Lhxzt4xXld8lSY3kZYuXTp69OjycFO7Ra9GjRpFrzbHSy6u7JNmvd3IfU2vsrOz4XhdWUjtB+vffidn2xEnYOTIkUOGDPnpp5+0qY1XbDmtXLny888/HzRo0OrVq+NFW048tt57773vvvtuyZIl8aLyof79+//www+zZs1y9jGRSq+Qz6NyKO7cDRs2sGg+/PDDuaNxy8O97NZdd9116KGHvvTSS+WqY7fcckuHDh3eeOON8jNiXIQ9evQ47LDD3n333X//gizL95AjOeJEYi+44IK+ffvqZ8xVsmXVuHHjd95557XXXisPP1+bmZn56aeffvDBBy1btowXlQ+9+eabjNJ+++1XHk7ZViDnkfIffbCw5EpLS9tjjz1atGjRqFEjrorSXBibYxDoWJMmTcrbfbTLLrs0b968WbNm5eG8qw+cuNatW++www76XaSqKhPRflJZx4ujSxTSFQE2mOUmYiyuIJixnlw3Y80qd7lYrslmvNxEjFVuyf7OD1ZygjMyMuTCCnDHIycyEZu5JmPl+rLzDhhyttoB8Vi5iRjrSRGYjPXE62NRXEdWqHEIsXIRrsnuGDPXZKyZ4rC+xQYgfe5uxogFktuFy8/f+cEmYrmJGBuQLhBjPTECDyPfKuUi3GA2c03Glu3f+eGuwV2/fn1OTk6tWrU4lcgJE4ixuAKTsZvp7/zo93bVqVOHm5oSpFqnnWDGynVYIMb6xogFktuFN2zYwKDVrFmT4VLHlIt82Z0bhrFyw7DztGGs1qxZs91229ElpDDFBDMWFyjx3/mJ9rVJxParatWqcSe0GHpHutR0sres9IyQ4kVbThoWbNwvN1KvSjlKTNi6n+N+pC2n+EOtpI81XQw1atSoV68eJxQucVOoNLmJxDq+QYMG9G1zNF5iVapUqXbt2jw/GcAt3jE6IFWsWJFFif5qRbyumCLRfDLYbSdR9HltkBhE1kH6eL+4sh7V9m3ptpFMbX0jw7qKF7V69Wr3GwmRtpS4i0t5IpxLtHxeq+pVeesb/dEsW05EfzREsiXuG3c3ijvFkf/x1K1I3KXOKYkXRYqUTFwzSKvdeFGkLSedhXJyLqJLYquR+1TCSeU/1/KYiIR4YrKEAUo2JuRu2LDBXgZt+TeQpZycnLy8vLizpUVPsrOz4065Uel7pZtIihdF2nLiLPA0jDsllW5hfXZbGm2mS4KOlZ+HjCP7yVeOeuV0pjQds29r6yS6T6UKgxXta5OrBAPC6HMuBw0atNdee7300ku6ReN1W06rVq069dRTTznllNWrV8eLtpwYkHPPPffAAw9ctmxZvKh86Kabbtpvv/1mzZpVyhvBvpOiW+k/L25k7t+VK1fOmzdPT9XS3Mub45JYt27d7Nmzc3Nzy9X1lpWV9c8//+jPicaLtpzsudWaXNeuXcutnW//6Yt4XfFljjMlSVXC96wjhdGcOXM4rzNmzIBDno/NKjbZEydO/Pvvv7kB4kVbTmwf//zzz8mTJ69fvz5eVD40fvz46dOn6y9vR4rEA5r7pWvXroceeiiXK8/oUn76W+Z69NFHDzjggA8//DDulw/de++99OrTTz+N+1tUzsc6PXv2ZH3/5Zdf/vsP5Og95M0ixpBzef7553/zzTf6o57lYVQbNWrERfbZZ5/Vr18/XrTlVKFCBQZn8ODB+gMy5UfvvPPOoEGD9t577+hGiISsbYv9TTe2j6xW4dJ8vXxzXFSsVlF2dna5umLZ16pXjFi8aMuJkZHoEudRv2MrXlcWirceqOg95M0iLaMqVarUqVOnmjVr6kcFyoPa2oo7W1otW7Zs37593Ck3atq0aYcOHcrPKYu0xcW6kOXX6NGjtQIrzRNyczxdH3rooQkTJlxwwQVxv3yod+/eEydOPPvss+P+lhYjzzP5iSeeYKzOOuus0ryHbMq+KJIoeg85UqRIkYLEM7pu3bpNmjQBQj5Y/01VrVq1efPmmZmZcb98iF41a9asYsWKzGrxonKgatWqMVbWTuhf/3mk6D3kSJEiRUooPRKd7YtKVFUClSY3kdQ3FPfLhzRosvGiLSqNPJ1h7tdEq/IyEa0lVfQecqRIkSIllPZAgnTXH+4tmTbf07U0vdoc0kChuF8OpA+G1KWy7ZjmzWBF+9pIkSJF+pcUPV23SmneDNa//Z51pEiRIkWKtK0peg85UqRIkf4lRU/XrVKaN4MVvYccKVKkSP+SoqfrVinNm8GK9rWRIkWK9C8perpuldK8Gazo89pIkSJti2I7ol+4iPLy8rDh9yibVfQh3xZdEpSHXpVPMTLOSZRwy+dwRXNtpEiRtkXxRJ43b9511113xhln3HzzzfrjTvG6LSomV+y6devuueees88++7TTTluxYoWqIpnirE2ZMuWqq67q1q3b3XffrdErh4o+r40UKdK2KB50tWvXvuiii3777bfp06dnZmaqULWbSWHaz8jISE1NrVix4rnnnpuWljZ27NgKFSrE6yIVFePJWDVu3Piyyy774YcfFi1axOj9+1MYR0yq6PPaEopRYoidtywQ6yn32xfxuLAqTE0tsG1JpM5g6Y/dF0sqpFfxID+lpBYWpMasznLkFKsdXFXE/8VV3JdTjqTR0JlyzpcD8aBI26SqVKnCY5p7ZO+99/53futCmCcDMXSmUqVKrVu3XrJkyR577AHH6yIVFWOFqtnijm7Xrh1uvO5flLoRrGhfW3IxUJzdGTNm/Pjjj6yLeXzn5ORQgoo/hkwJ+lcSOYfjrC9fvpx1+q+//qo/S6JOqtZUSmGq/c8+st0Gk64q4v/i2kT/OTmDw4D88ccfQ4cOXbVqVZ6t4p+mSFuPdGtMmDCBC+PQQw9levsXrocwhyBGYYsXLx4/fvy+++5L31QVySONFeMzZswY3AMOOAAb8MTbTFI3ghXta0uu9evX33fffddddx336v/+97+rrrrqwgsv5AmuuTYe5COq9I+p0bWvSin5sp
rD6ZQPGTLkuOOOGz169KRJk7p3737sscfCwQv2tJR0uxtsZ/nH3haX3hfon93P/7ZYAzE+U6ZM6dKlyxdffMFGQR/Rffzxx/GISNukdJP+/vvvVatWbd++vW6iwDu3DFSs9seNG8dy+aCDDtLdvbn79l+UxoTBYa5t2rTpHnvsQcm/P1A6aLD859pIAdJ1n5+ff+ONN3711Vf9+vXjIf7www//8MMPa9euTU9P1wcGCURqQSzGfMy/mJZf2Hhl6UQHunXrduedd15zzTUXXXTR9ttvP2rUqBo1anCIeIQpNqxWP/ILC2P2P9uJFVhku1bEf1ycjr///ptlR+fOnVkbnXbaaR07dhwwYECSkYm0DYgL4Jdfftlll12GDh3KcrlPnz4zZ87U5wvxiHCy7+L4JxTxolJIrWGZP+rUqZOZmUnHHnjggT///JPHji5a2aQiTHK6h+J1W4t4RTk5OSNGjNh1110HDhzIeUSLFi2iiheumPKg6D3kEurbb79977337rnnHh7ZrGt4oFN4yCGHOH/ZSmF+4i5ip1Uwa9bs55/rG8u3Ln2ruHSjnp2dfdNNN+27774nnHACFx9TPmrRosUOO+wQ1JmUgsKCXObXOXPm93r0sdt63PrMU48vXrYiL5aam28tAeJh/2UxGrfddhun6fLLL9cilI3+dtttt/vuuweepkhbuTj7q1evHj58+LRp05hizz777MqVK5911lnLly+PR4SWLiQuLS42lSRSmEuOGMS0+uuvv65Zs+brr79mDX3EEUccf/zxo0ePdmIESUWXNMuKVbjViHFg2OfNm8dCBHFCzzvvPAbtnHPOgf+112ufsSSK3kMuthhZboMnn3ySaezwww9nuCgZN24cp7ZDhw5JBzAlNeXPcX/cetutF1xwQd+X37D+YnE8Hir5ny/+4osvpkyZwnXGrM9cQg9ZsLdv375atWrxCD+lpMRSU3OnTZl6/XW3ntyl2x133Drrn2l77XnQpMnTU1Kt3Xn83e7/8hUxcuTIn3/++eSTT9ZQcLIYmdatWzdp0iT4TEXausXZ5+nMlujWW2+94oorGjRo0KlTp8mTJ3MrxSNCy7rn7ZkbGy9KoKQByG4sZenSpUOGDGHp/NBDDzVr1uzAAw9kKfDKK69o/gjTDlIkc8/ChQv17Y14xdYivaLffvuNYWHrzwOwfv36nTt3Hmwr+OOzMhTdSKroPeTiiWuXkeU2YGLbZ599atWqpfIxY8ZwJ7Rp04ZTbkVZm8WUvJT8wvycWF5hfmFBXqHeqI2lFKbs0Xbvhx96qMd116SlVCiwvpHBqWK3q5ZKqBEjRrBXY19LB+ghCz2W6gcfdAjNW8tZ64fis6y3hukVByrMzyuI5cQs4hr4+vMPFy1Y0Kj5jtVr1+z16AOVMwveeP3dlLT0lBSmWwKK3TVGCWk1jZXrVOlhgZzltrsKOSmSakssnqc02LFjR5jWsrKyxvw+5hBGJiO9MJbK4elCHuNjjRKjUphDUL41VHnW2+h5hbGcbLb/BTk4BVQQDDGYhFu5KTHrFRRwAq3Wrf/899923wbEuUKjRo3afvvt2dHyUEZ59m+0YLolQJciIgzrXJMqQSrBModNnz79ww8/5BFPCxTqEKURjfz1119cq1dddZXeKqN7a9eunTNnjgKQ1YmN/XF65Ugx2dnZzzzzzE033XTfffcdc8wxn3zyifNCkGK2Av3++++tWrXq0qWLpj1eNXbatGlYXiYvWeDIfUJRojGUyC0TRe8hF08anHXr1nEbOD8nwGn78ccf27VrV7Nmzffff3/DhqylixY8+mjv/h/3W7F29Yf9+vd+9MF+H72/MivL+rClMC0zvWKFCpUrFuZyYeSn0iaFnGYWodY6tGSaP38+C7pGjRrRQy4aJpgN2Tn77bf/1KlTRgwfxkz7wQfvPPHU0wvnLvpt5JhnHuvzv2cen7ZgIVNvQUHarq123m33HfPS8mMFmRUzUmtWTsndsMG++ugZE21+cd/fpgPIuZBgARIzYuy8Yc+d7wa14BSWWIwM54VbkaY4HHfgP7P/2f+A/VevXfH5Z1/kx/IGDRzUq3evsZMmTpky9dn/PflIn97jxk3Ozs1jWZS/Yd2br7307AsvL1m65Odfhvfp0wtn7pJlMyeO6/14n8ceeWTqlFk6f7wmpmjNuKXtcaR/RVxg3333XevWrfUxEJcHmz+uENasuup0BVJCLXc6gFRClW58XCbajz/+mMU3q1vClJtIwbWOaJy9WuPGjXfZZZd0+6+uLlu2jLm2atWqsGLs7li3D52RcAXOUd58801226+++upLL710yy23dO/enelWKQr4r0un46effuLxW7FiRVw0e/ZsLKdVo0GYQM8cJSINl9gDbhtGOm6woveQiycNDs9u5Gxquc3YVh5yyCG5ubnDhg3LzMh89aXXOhx+4O1XX3/aJdfWqFf3grNO+9/DD4wd81dKIZtFva3B7VRm40wfmjZtyrXF1YbLJfXuu+82bNCoVeuW3w78oUKFCuPHjZ77z+I5k/88/pRunw76/rRzz05ZtuD+u3vlEZ2WdmzXc155441aFWMZscJfhv42Z+GK4088kZnDfk/b/nFb694vhnTx8eg566yzPvjgA7nxOruWe4Oq0aNHx4tsEcPF/cgjj1x++eU8ICjxJJZMTZo0YWdQpUoVmqJB9h95ObkHHdZh1Ijf2J+uWrJ44DefVc9M79bl3P89/uIJXY7buXG1sy+5viC1MKMwd9hPv2Tlx/746dvju5w7fMzv559zzvzJ484/+8LX3//yrDPPblIj89Lu12bn5tqLAg6VxilNK7OzGmnzitlrxowZbdu21ZcZufa4eZksDzvsMK4Tax6zRRX3tbZKlHNz5eTkUA5rqb3rrrvedttt3P64NKJLIZGCax0RNnLkyN12243JlQNRwhOGg7Zv314TgPqDKHT/9BpdVd/UDo0MGjQIlwn7hBNOqFat2vfff08Lqi2xdGhEy4J4hasKxYtcileEG4Qwoin2+vPmzdtjjz2chQ7nkftd3y1ncDQ+iHOHCGCIxBTiIgGjJ1CAyuNHChRhSeU/10ZKJOu0FBbWrVv3lFNO0RePOa+PPfZYnTp1uI4/+ugjllfp6RmFqWnN69dZv6Hg7EsuO/zYw+vWrZa/Zl1eXj7njXPn7F9T9L9Si5N99tlnL1269K+//lqyZMl99923YcOGqlWqLFo8f9So3/fYo+2MmVNP6dJ1/pK527fY8YHbb2MHnJlesD4nJyMlrTA1MzctNcbNGEufM2fkNdfddWfPBzodfVg6Ndaem2dHqCvJLW5sXt2ECRM+/fTTd955R9e0W1999dXnn3/+7bff0nNrHDYqKyvrtddee+ONN9hh4OrmKaWOO+44ViGDBw9m/n7qqacmTpxYqUKlDXm5n374acdOh82eNrVrl2NWLVpQvXq9R/s81nynxttVSlmXxytnI5M3a97cY487YdnCObvsd8BNN17ToGG9iqxkClJuvKvnDts3qpAeW7eB5Yr9fqN1K1kLKeuQZXBKI212r
V+/nttEz2guUe7lL7/88oADDth3330pf+ihh9gFDh06lMXZgw8+eM011/Tv359LmvKePXvec889pOvS1b4TcCZaq/XSiZYXLVrUqlUrvYHMo3/gwIENGjTgNqeKfepNN92EZQKmb2xYH3300fnz5/fp0wf36quvhglDN9xwA13VfUSHeZncC86cZB2pRNJ8L4njFRufkMgd41ulElWVRvpxeX3VEfEAZD3RqVMnRo9bnpd/6aWX/v7772+//bbO6Y8//jh27Nj777+/R48ejBvnmlNGl9gVXH/99Ywh8MADDzB0v/32m93fMugk8n+W6QCRTHFWuFK5u9h+7bfffr169eK0/e9//+vXr9+sWbM4zWzXmJtuvfO234b80qBFqy5HdOBRPOfvKStW5jbfsTkjy4VKO9YEZrXGVVs2o73PPvu88MILTPYvv/zySSedRH9OP/3U5599/tbbbsqsULFr11N2atZ4+K9/XnTZRWmZzG+5Q3/8dc89duHgMeuDWf7lz5kz/crL77j3wfuuveHK7Owsa8Jw3taOrw3CikuIgTr88MNZhSCGy3PjXXfddVz3l1xyiaec1eiLL77IS9hpp51oQYWqLbEaNWrELn/UqFGPP/54mzZt3nvvvcd6PfbcU090O/PMmttV36d9+w6Hdhg+bPh5F5yzXc3Kaem533//y77t2qSlpqempJ998bm1KlX5/a+/L77kgoopaXl5634b+2fXM06rXKVyaiznl2Ej2+3TrmJGBjMzDxDLOP8ilW9xXbE+ZtGsywz72WefMUXdeeedFSpUYBnNs5slKbcztXfccUfnzp1vvPFGpjcuXdayr7/++p9//qnrXBZxMWPt5hMqzPVMDGJTq7uDiZaHDGvTa6+9tkmTJlOnTmUbd+ihh9Ir9qz06uabb+beZ4plJqZv69atY14hl0b23HPPI444Qg1+/PHHvLTzzjtPvUU6XAlELl1iX6HvatF4vMLu7bhx43g2Pvfcc0zt7lpg0qRJBx100BNPPKGX5k4smWiBG1xfe6RNxOKeVcXdd9+dkZHBS2bS5VV369atdu3ad91114EHHnjxxRczmLfffjvrEuZUtsX0k43K6NGjTz75ZJ5LzL5cBsceeywLrPhhkoluJFc8tqg0EIKkjAJYrgAFsFwB8rBvlbtcjHzZDinCKIDlYi+//HIuKcd1gEtK706gHFtiTlt+fl5+3prLL+x26ZU98vILNuSvvf+GS44//rR/Fi3+oP/HxBTECvPzN3z30Zt77XbQmryY9aO21g+1Wq1axnUs5D6uZDLWOuqmDlgcs7qRn5Obmx/LL8jPGjP068bNd52yYHFeTt6sCcNqVq83/K8p773x5tI1a2O5G2ZMHn36GSeNGPlnbu76aTOm3nlvn6yc3FhuQWEsJ68gx/6Y2ZLn0I7rsEB9QE6XGC5VIUo0XLm5uaqlULmEUYgAs1nky4CHkcNWD+xj2T2yu5TLyGzItoZnQyw/d9GsSTs1avjDb7/n5xSsWTlp+4aNPvr0u4+/+PqfuQti+eu+/fKzlq32WrxyJSM6b+bI2tXrjv57RnZ+wYJ/xm3fePuBP43u/8H7q1as5AUxStYXrAqt08kNXLlyZR7fsPu1oACWK0Ae9q1yl4uRb7nJKIAd15cFyFMuFiA3I98qQErKVrStAEbsZlq2bNmsWTPWvrjLly9nR8hzU7WSglkxn3nmmStWrPjiiy/YBrFU5Trh4vz666/ZVh555JEsB3XxMKm0aNGCJzKX0Ny5c3lws6eEdVzEXoqZOzs7W+0jCuO0sYf6ts7RRx/tLkRuRlwwtEwfmJZmzpw5ceJEpi7mCd01Q4YM0bZs//33Z1qlb2y+OfQvv/yirrI17927N23SDiKLQjZzTN4DBgzA9RzacR0WIE+5WN178sknmd5OPPFEDkqJqrC4zz//PFXMVUxadhesWgSwWGGv0rFjR3eV3Xy8fRSS5WI5IvvRq666auXKle+//z6nnnOh89i/f//FixfzAGfuxOXsPPzww4zbmjVrCJg8eTKLe6Z/Wvjjjz9YuLBrYmmSlZVFJI3QVVjHQjocU3KlSpVee+01FUp2p5Io2tcmUcCAcD0hbXOxYoKBnPXrR/0x4cD2HVILYmkFeSOGjOrU+cQPPni/StVKqWmpuTkbli1dOnPW3HXr18+aMXv12tXWcUqxIVIn1QcxwIXBxpQeWTvogtTffhvWeMfWDettR8WoYcNa7bZ3RmpslPVrzSvOnD72jFPPmL8w68lnH7/skiu6X3LJjk0bp1ufPtKp9GLvajeODKAuqdARJZQjxo2uxkvtV8G9p1onyx1QMqk1rDUu2ohYradY135qWmFq+p9j/0qpULHN7q1T0wqmT5lWoWqtNrs0+/yTDyvVrpVakDJ4yM9t99y3VtUK+Wkp48aM265Rix22b5Cekjt/zuzs9KotWjYa8M3ASpWrpKXHf4uec9uVvueRylDm6aCETSrP/XvuuYdN6jfffMO2hkuDy/KYY45hM8Rsyi5Q18yvv/7aoUOHmjVrkjh8+PCGDRvusssuasdRmDMe8qrgcmVK7tmzJ1tANqnMAcy1dAMxlXJo+sBkpu9nMFs0b96cKYSHPnMzUwgThrMO4IhMz0899RTzEOUl+OlhjzQgF110EW0+9thjurPidXbP2V4/++yz9FkfhDu1wOmnn84WnHma4XVXlVhqgc0oSyXO4/Tp0wcPHnz44YfTQ3rSpUuXtWvXTpgw4ayzzrLGLjV11KhRDALrYBJ/+OEH9v1NmjRh3FiIkMVZ3meffUik53/99dd2223neUYlkvVKksl/rlW3ImkoNCYSw6pVDOWcEs2yAoepyclPbbf3fsd22q8gJSU9VvnCK679Z+IfVTKrHH3kcTzpFyyY88zTT/6zaNkppx/73huvvvHmWxunp5KLo2OL9CQ9Mz0jLSNNv3MxtULVRpdfcFpGakZaavrBR56+z67N+/f76Npbb6ucnv/nhH9atGm3fYPtUnOyN2Rn12/QZN92ezDXpaSTl5ZB28X/Gpd6IuDCFTvlyOph0SqUmZmp2rhfFtKB1CyymQNXrAikZnA6YxWrXty9e63KlQpS8lu03v+kzoe/+sqrt/a4o16ligVpFWvWaXj+OSfSNZ4N62LbXXJ+t9oVK7GCatJq724nHfH2ay9dd/udqRUrWCfQ+uZbYZq9YNKhI5Urcf/GaaPY1lx66aVMDDym2bZyNeqCROPGjdOEShabMJ7gzrefPvzwQ+ZddlFsbrQ5Y2IjjFp2SJrkrIeF34PUt9Aj2sFyLzDlP/PMM7169eJw7KXoG+nUstmiP0cddZSCR44cyZzB7EW32Q23bduWbjMRqifMyu+99x778jZt2syZM4epTt0rzVXKQVl2XHvttRwIRvEKu4opil0msxcddleh6tWrd+/efY899oj7ZSGOUrVqVba2jBUrksaNG1tn0T6PHJ3JlWm4adOmvGT9rviDDz6Ycs7aJ598whiyOgFoh70stfrt09Qykueccw6N6CjBovHkYsRNOZtiICmjAJYr
QAEsV4A87FvlLhcjX7ZDijAKYLlcrFwZrHQASuzbKi4nxh3vcCyWk5eXm52Xl5OXH8vNy1uflZOTu956P9d6DzM3d0M+/3Jy8vO5S2P2ezBWA8pFTjvIYbt5SwHsuC6mM3n5edmxvGweA3n5BXj5Odl5ObkbsnPpXr71Pmq2ZWO5BQVUx3LpFxm51hu/agpjN5X8cALkYcd1WIDc5ciX7RBLvmyHFGEUwI7LC+R06PnIq83Lz47lc6bW5+bwP50au8p+DxyLx8jl5lqF1rnNy87Ny8nKi+XECjiHnHbrh6hj1hvgs2fP5v6P3kN2M/KtAqSkbEXbCmDk+x4yEwMnQgHIHe/LDz/8cOfOnXn+cg0w79aoUYOHMrx06VIa//jjj++///7Ro0fn5uYy882aNev5559n+mErDNuXjPXEcKRmQ76H7Lh2VBHWdfjdd9+xMli8eDEuUz79eeONNyinM2wcb7rppn79+r300kv0lkjmnp133nn33XfXHo49JVmS7yEcQJ5ykwWS2/VlQPJlO8RSSHbcAGZMOO9swXOsh23+999/zzpg2bJlvHbuUEaGre0tt9wyefJkxopa9rt9+vRZsWIF6xsWOgsWLPCMEjZ6D3lziTULi8E7bN15552sm+69917OXLw6odKsr0lYf52jICWNvSHru5SMNOuXD9vLoIyCAvaMDD6R9m+z2IxDTtOspTlcun3IwoLUglhGYUpGSkZ6Wrq1wGUnxpaMrtqdgNidkWH1ynn7d+u8JKxXaAuwOCWNl81po0TnRrWI28kOTmHIFJ7KkKZkWKtqwgtT5syeywVi/bvzbi4S7lUefITpQJH+c9p+++0vuOACtjicROzll1/OhM2VwMP6vvvuY7O4ty1qeXbrm/Ns5r766is2kTzWrUvE7+z7FhZXLOPYjjO16ynPRu24444DaPzGG29kF75q1aoLL7wQl4mfqZ2AAw44oF27dkccccSRRx6pPvBarLa2avEaWYicddZZOo8VK1ZkfKrZf32Pifaee+755ptvmFNZuFA7fvx4NtzsvJ966qkKFSqwXtEZj7cVKNKTymooji5RyDEE2GCWm4ixuIJgxnpy3Yw1q9zlYrkmm/FyEzEWl0UNM+unn36qEikzM3Po0KGcEgKU4m7WieQ/3AcUWUE4Vkd0iHiYE7/xsPzHirMo8UtATq7JWLkeVlupdIdZw/ruMVN9muW64+2forW4QBCvpUrlVlWywwlMxnriBWKs2zVZMQjXZHeMmWsyVu7GGdSuYj1hfb5tQzyE8vixJGvlRIwVpNasj31pBGfSpEndunVTqZPz9NNP69GWbv8QlAppNhFjPV11M9ascpeL5ZrlJstNxNiAdIEY64kReBj5VikX4QazmWsydu3atfvssw87FbYgtWvXZufXpk0bNnw8QxXjxOsRLNd9CFiuR9RS7j6cm7HupmjcaUTl06dPZ4vJ5Pftt986hcjdDnJcs00szeq6RVRRgsx4ubBkRxWpRSoXYB0WiLG+MWKB5HZ9WbkI1+TgXJOxcgNYriN3jCTmOY8977zzGjVqxL4W1lvHAEONnOuE+LFjxx5++OHc2hdffLFiVCUI0KbX7BaF7m4Fs9xEjMUVBDPWk+tmrFnlLhfLNdmMl5uIsbicA3YnWI015YihZ12JS6HZbJytpzDPXPtRzqzGf6iJP6blsMEVWPGI9HhqnN3ldkWCrroZK9dm63DWf9yOPZPwX3szZskVD1uB1i8/slgzivW8CHc4iwUmYz3xAjHW7ZqsGIRrsjvGzDUZ603RqPBfu5x28a1y7yE2zbV2m1ZWQUGMwWQrk7dpN0O4FZ+RkcHKjAsmmmsl3yrlItxgNnNNxnrm2hUrVrBfYe/SpUsXzX+cFzY37Pw4L8Qr3X0IsSlqPYfzHFoBKvEwtvRzrVwPIzNeLuyuQrhOPHKnOOwOxvrGiAWS2/Vl5SJck4NzTcbKDWa3fMtxecIPGjSIufOMM87o0aMHM65zYXDz/vrrryNHjnSCZ82a9fLLL7/44osXXHCBwpBaDtam1+wWB0jUdZPlJmIsriCYsZ5cN2PNKne5WK7JZrzcRIxVrsOOC7hXOu74TWxPmc5ci7Vd5xBqJ052iQ5n/Xcju8s3tY/sXH/GyrWZqk1zLRVY2rKLrLAC+4DME65cuy6aa61ypSQ8BGz/1/6ljJabkmpfEjYTsike4YZhrN2sBSZjzSp3uViuWW6y3ESMDUgXiLGeGIGHkW+VchFuMJu5JmPdc22dOnVWrVp1zDHHLF++XGFqbYcddvjpp5+S3MVFm0XKdZd7YhSgEg9jmWtbtWp11FFHle1cK8C6WW4YxnrSBWKsb4xYILldX1YuwjU5ONdkrNzSsFxNqJUrV16zZk3Lli2333571WKpeu6555566ilYiSrv1atX165ddf0gBwK0Kd8tCnUwd7cSsdxEjMUVBDPWk+tmrFnlLhfLNdmMl5uIscp12HEFWJW74zexPWUGz7VKUDzCs9kK2Mju8nj7CDcRY+XaTFV8rlW59R/irf9YLrOEwC4WW1Wuudb6r+cQbsYWTffpthjriReIsW7XZMUgXJPdMWauyVhvysa5FkdB/N+OSXgI+52JeCH/sbL1/oXYFe8RVU65h7FWazaYjDWr3OViuWa5yXITMTYgXSDGemIEHka+VcpFuMFs5pqM9cy1FDLdan0MI+LZkej3HiOlU+7LjosoUbpcD2MVoBIPYzfTvlaAdbPcMIz1pAvEWN8YsUByu76sXIRrcnCuyVi5pWHHlWCVY8XYDRs2rF+/HparsOrVqzu/fhlRFafESj4bb8vS+EqeErkJRb19yqx/Yst1ZPmuCve/MpTVmrtpTra7aON/3f/UKwIVy79tQxtfq+s1u9BfCuBasL5RZY3cxpJ4vV3ngUj/shh5ptXatWvXrVtXvyhKUKtWrWKdFIKLFR/pPySdXLfiFXZVlSpVdM04wq1UqVI8IrT851rm7UiRIkWKVLaKnq5bpTRvBst/rtXcHilSpEiRylDR03WrlObNYEX72kiRIkX6lxQ9XbdKad4MVrSvjRQpUqR/SdHTdauU5s1g+c+1kSJFihQpUqSyUvQecqRIkSL9S4qerlulNG8GK3oPOVKkSJH+JUVP161SmjeDtelnit2iUPmqDWa5iRiLKwhmrCfXzVizyl0ulmuyGS83EWPNFIcFYmzSeKxckxWDcIPZzDUZKzeY5YaJl5uIsZ4UgclYT7xAjHW7JisG4ZrsjjFzTcaaKSVggeR2PYzFFQQz1pPrZqxZ5S4XyzXLTZabiLEB6QIx1hMj8DDyrVIuwg1mM9fDBfYvCtbvstiwYcNbb7213Xbb6VdYUK6w4rLcMIzFFZiMnTNnTteuXY888shBgwY5hcjdDnJcp02H5ZpsxssNw1hPukCM9Y0RCyS368vKRbgmB+eajJVbGpYbhrG4Ag8jJzJARRIccckq2TlGAMtNxFhcQTBjPbluxppV7nKxXJPNeLmJGGumOCwQY5PGY+WarBiEG8xmrslYucEsN0y83ESM9aQITMZ64gV
irNs1WTEI12R3jJlrMtZMKQELJLfrYSyuIJixnlw3Y80qd7lYrllustxEjA1IF4ixnhiBh5FvlXIRbjCbuSajdevWMdfOmDEjPT1dv1a+nIh+ok6dOg0cOFCuyt0vATkugHWzXJPNeLlhGOtJF4ixvjFigeR2fVm5CNfk4FyTsXJLw3LDMBZX4GEU/Y7GTWzGy03EWDPFYYEYmzQeK9dkxSDcYDZzTcbKDWa5YeLlJmKsJ0VgMtYTLxBj3a7JikG4JrtjzFyTsWZKCVgguV0PY3EFwYz15LoZa1a5y8VyzXKT5SZibEC6QIz1xAg8jHyrlItwg9nM9bA2CevXr7/tttsWL17stFBORPcyMjLatGlz1113sQ7wfQnIcdV/N8s12YyXG4axnnSBGOsbIxZIbteXlYtwTQ7ONRkrtzQsNwxjcQUeRk5kgIokOHIO4xwjgOUmYiyuIJixnlw3Y80qd7lYrslmvNxEjDVTHBaIsUnjsXJNVgzCDWYz12Ss3GCWGyZebiLGelIEJmM98QIx1u2arBiEa7I7xsw1GWumlIAFktv1MBZXEMxYT66bsWaVu1ws1yw3WW4ixgakC8RYT4zAw8i3SrkIN5jNXA9rrlWwylXlTi8uyw3DWFyByVjn/UKVOOxuBzmu06bDck024+WGYawnXSDG+saIBZLb9WXlIlyTg3NNxsotDcsNw1hcgYeRExmg9Pvuuy+OLjmtuJtLxChpWNIAqbhhYeITMSpxursQhclFvlVhct2MiptSGkbFSkkUgJLmSr5VYXLdjIqbUhpGScOSBkjFDQsTn4hRidPdhShMLvKtCpPrZhRQxVNPclwB2tyMkoYBUphXlygmDKMSp7sLUZhclDQsaYAUJqysGBUrJVEAJzROiVXklDuiUMmqDWa5iRiLKwhmrCfXzVizyl0ulmuyGS83EWPNFIcFYmzSeKxckxWDcIPZzDUZKzeY5YaJl5uIsZ4UgclYT7xAjHW7JisG4ZrsjjFzTcaaKSVggeR2PYzFFQQz1pPrZqxZ5S4XyzXLTZabiLEB6QIx1hMj8DDyrVIuwg1mMzeYsXJLw3LDMBZXYDI2INdh5LhKd7Nck814uWEY60kXiLG+MWKB5HZ9WbkI1+TgXJOxckvDcsMwFlfgYeREBij6XRaRIkWKFCnS5lX0HnJcxWJ3IQqTi3yrwuS6GRU3pTSMipWSKAAlzZV8q8LkuhkVN6U0jJKGJQ2QihsWJj4RoxKnuwtRmFzkWxUm182ouCmlYVSslEQBKGmu5Lhh4hMxKnG6uxCFyUVJw5IGSGHCyopRsVISBYTZ1xbZCDuiUMmqDWa5iRiLKwhmrCfXzVizyl0ulmuyGS83EWPNFIcFYmzSeKxckxWDcIPZzDUZKzeY5YaJl5uIsZ4UgclYT7xAjHW7JisG4ZrsjjFzTcaaKSVggeR2PYzFFQQz1pPrZqxZ5S4XyzXLTZabiLEB6QIx1hMj8DDyrVIuwg1mMzeYsXJLw3LDMBZXYDI2INdh5LhKd7Nck814uWEY60kXiLG+MWKB5HZ9WbkI1+TgXJOxckvDcsMwFlfgYeREBih6DzlSpEiRIkXavIreQ46rWOwuRGFykW9VmFw3o+KmlIZRsVISBaCkuZJvVZhcN6PippSGUdKwpAFSccPCxCdiVOJ0dyEKk4t8q8LkuhkVN6U0jIqVkigAJc2VHDdMfCJGJU53F6IwuShpWNIAKUxYWTEqVkqigDD72iIbYUcUKlm1wSw3EWNxBcGM9eS6GWtWucvFck024+UmYqyZ4rBAjE0aj5VrsmIQbjCbuSZj5Qaz3DDxchMx1pMiMBnriReIsW7XZMUgXJPdMWauyVgzpQQskNyuh7G4gmDGenLdjDWr3OViuWa5yXITMTYgXSDGemIEHka+VcpFuMFs5gYzVm5pWG4YxuIKTMYG5DqMHFfpbpZrshkvNwxjPekCMdY3RiyQ3K4vKxfhmhycazJWbmlYbhjG4go8jJzIAEX72riKxe5CFCYX+VaFyXUzKm5KaRgVKyVRAEqaK/lWhcl1MypuSmkYJQ1LGiAVNyxMfCJGJU53F6Iwuci3Kkyum1FxU0rDqFgpiQJQ0lzJccPEJ2JU4nR3IQqTi5KGJQ2QwoSVFaNipSQKCDPXRp/XRooUaVtXQUEBj05JJXHH9TyNFKk0KrIRdkShJmrVBrPcRIzFFQQz1pPrZqxZ5S4XyzXZjJebiLFmisMCMTZpPFauyYpBuMFs5pqMlRvMcsPEy03EWE+KwGSsJ14gxrpdkxWDcE12x5i5JmPNlBKwQHK7HsbiCoIZ68l1M9ascpeL5ZrlJstNxNiAdIEY64kReBj5VikX4QazmRvMWLnBjGKxGCUqRO4Yt0VUJWIsrtOmh7EBuQ4jx1W6m+WabMbLDcNYT7pAjPWNEQskt+vLykW4JgfnmoyVWxqWG4axuAIPIycyQEUSHDm/t9M5RgDLTcRYXEEwYz25bsaaVe5ysVyTzXi5iRhrpjgsEGOTxmPlmqwYhBvMZq7JWLnBLDdMvNxEjPWkCEzGeuIFYqzbNVkxCNdkd4yZazLWTCkBCyS362EsriCYsZ5cN2PNKne5WK5ZbrLcRIwNSBeIsZ4YgYeRb5VyEW4wm7nBjJWblFeuXDlz5ky5Kt9+++0bNmyoWsciqhIxFtdpxMPYgFyHkeMq3c1yTTbj5YZhrCddIMb6xogFktv1ZeUiXJODc03Gyi0Nyw3DWFyBh1H0d342sRkvNxFjzRSHBWJs0nisXJMVg3CD2cw1GSs3mOWGiZebiLGeFIHJWE+8QIx1uyYrBuGa7I4xc03GmiklYIHkdj2MxRUEM9aT62asWeUuF8s1y02Wm4ixAekCMdYTI/Aw8q1SLsINZjM3mLFygxktXLjwrbfe6tOnD5OuSnr27HnvvfcSwGOUYMUjShIxFtdp08PYgFyHkeMq3c1yTTbj5YZhrCddIMb6xogFktv1ZeUiXJODc03Gyi0Nyw3DWFyBh5ETGaDo89pIkSJt66pfv/4tt9xy9NFHx3378cosK8WLIkUqhaLvIcdVLHYXojC5yLcqTK6bUXFTSsOoWCmJAlDSXMm3Kkyum1FxU0rDKGlY0gCpuGFh4hMxKnG6uxCFyUW+VWFy3YyKm5KUBT/++OPYsWNV0qFDhyOOOIJyzw4mKScKQElzJccNE5+IUYnT3YUoTC5KGpY0QAoTVlaMipWSKKDk+1oyI0WKFGkbkfavQPwJaIuHKSXu56wYaydFihSXrpBgRW+PRIoUaZuWHpeeubagoEDgzK+OVB4pUrEUvYccV7HYXYjC5CLfqjC5bkbFTSkNo2KlJApASXMl36owuW5GxU0pDaOkYUkDpOKGhYlPxKjE6e5CFCYX+VaFyXUzKm5KUhZ8/fXXv//+u0oOszV79uyffvppzJgx69atq1WrVmZmpiZmxQQ0JbkZJapKFBYmPhGjEqe7C1GYXJQ0LGmAFCasrBgVKyVRgHuVlkhFPopwRKGSVRvMchMxFlcQzFhPrpuxZpW7XCzXZDNebiLGmikOC8TYpP
FYuSYrBuEGs5lrMlZuMMsNEy83EWM9KQKTsZ54gRjrdk1WDMI12R1j5pqMNVNKwALJ7XoYiysIZqwn181Ys8pdLpZrlpssNxFjA9IFYqwnRuBh5FulXIQbzGZuMGPlJmXBlVde+fLLLwPo1ltvrVq16osvvrh48WLcChUq7L///k899dQee+yRnp7upAvEWKcpk7GeeF9Gjqt0N8s12YyXG4axnnSBGOsbIxZIbteXlYtwTQ7ONRkrtzQsNwxjcQUeRk5kgKJ9bVzFYnchCpOLfKvC5LoZFTelNIyKlZIoACXNlXyrwuS6GRU3pTSMkoYlDZCKGxYmPhGjEqe7C1GYXORbFSbXzai4KUlZ4N7X/v333wUFBTfffPPBBx/8559/rl69es6cOQMGDDjyyCPr16+vZ2tAU5KbUaKqRGFh4hMxKnG6uxCFyUVJw5IGSGHCyopRsVISBYSZa6PPayNFihTJK6bYjz766Mwzz7zuuuvuuecefXlq4cKF999/f35+fjwoUqTQ2sq/h+y8Fr0uViJyI0WKFMmUHhSoTZs2VapUAdjddunSBebpAf/000/Tp0+PR0eKZEvXTLD851quqq1A+iYhi1CxfuWpqiJFihTJlB6ASM9QlJmZWbt27Xr16ilg7dq1v/76qzhSJCl+0QRqa97XpqWlMb9iNenGSyNFihQpgfQARHHfXp1ja9Wq5ZRPmTLFrokUKS5dG8HaXJ/XarZHYk+hI5VvJjHFvv7665dffvkrr7zCsUKOSEip8/aLsOSURIoUaWsS9zWPjgoVKsB6hmRlZdk1luy7f9PtLxspkqnN9T1k5rm33nrr888/HzJkyODBg3/55Rfs5MmTlyxZUrdu3czMTAV7VNzDJY1/+umn6Ua1atVOPfVUNriJ2kHFYgBpu4yFnYkcFiA3I9+qRPGJGBU3pTSMipWSKAAlzZV8q8LkuhkVN6U0jJKGJQ2QihsWJj4RoxKnuwtRmFzkWxUm182ouClJWeD+HrL7dzTGYrGXX3550aJFlOMef/zxhx12GKxaOzz+uy+wTglyHwu53USMHDdMfCJGJU53F6IwuShpWNIAKUxYWTEqVkqiAPdJT6TN9R4y/fj4448feuihBx54ACtde+21J5988n777ffVV18REA/dzCqTl2OKZpm8BUh3mqoiRYr0H5VuZxT3beHm5uY65W3atFE5Ugk2JycnLy9PhZG2NdmXRhJtrveQkWbT9u3bf/jhh++++y5bzI4dO1Iya9asK6+8csGCBe51wX9O+hgYff/99y+++CLbd023kSJF2prEI4tbOzs7W1yvXr19991XVRLPMWpPOeWU5557Ll4UKZKhzfU9ZE31XKM77LBD165dTz/99Kuuuort7O233075smXLfvzxR2IUTFgsFsvPz3cD5TBLRSyupFqqsAJJVe5IhMuMiOJBG6UqxTiinAYBqhSGVOUoXmp3DGsPVUrv3r2vu+465lo4HldQoL4BWDVrpUWKFKl8Sw8uPZqsR0x+Pjf76tWrV65cyZOE8mOPPZZnmoKR/QxISU9Pr1atWoUKFaI7fduULoNgbcbvIXuaokNci+eeey4XJeUzZszAch2zwZ03b96GDRsImDt3Ljvgzz77TJe43YAVM2nSpPfee+/tt9/u37//okWLKKFBXdZqnDAs5b/88ssbb7xBMO1Tq0Z0kyBKaJmjzJkzx3lTCNHU2rVr59uCFYyIx0V//fXX+++/T8v9+vVjWl21ahWF9JmmsLTP6oFXQfry5cvJooSAhQsXEpCTk6NDxBuNFClSeRW3quw///zDjQxg//jjD255mFn27rvvVoxbmZmZH3zwAdsJ51ETaZsSl0RycSWZYoqyZpiim8VEjExmSjvmmGNonx0tJdb60NbMmTNr1qxJ+YMPPkg5U9Suu+5atWpVZseePXtSxcXarFmz7OxsdrRMh7/99tshhxxSpUoVyjMyMrBM1WeddRYTG3MYLSAdd+zYsW3bttXXBQmrWLHipZde2qVLF5acZ555JocmhuApU6bQGgdi/nYKASbyypUr77LLLtxUdqtWs/Th66+/plla0y9BRdxXJ598Mn3j7iKFYyGOS7O45513Hq2RyKTLUXhpAwcO1O5cbUoOqxwlZSvaVgA7bjAj33KTUQA7rsMC5GHHdViA3OXIl+0QS75shxRhFMCOW0oWSG7Xw3IFKIDlCpCHfavc5WLkW24yCmDH9WUB8pSLBcjNyLcKkJKyFW0rJDtuUhZ0796de7lOnTrbb7999erVf/rppxUrVkydOvWAAw7gxm/atOn333/vfqfNAUS5c5tLFMbJLz4RI3e8h5Ev2yFFGIVkx3VYgDzlJgskt+vLgOTLdoilkOy4pWEUkuUKkJtRfOIM1Gbc1yI1heVI4pEjR65bt44LunXr1hRSwgW6fv16trOPPPIIcxWzI5OrgidMmHDKKacMHz68UaNGd9111yuvvKKVI1vMbt26rVmzhjDEy2YGZVJn98lNcv/99z/zzDOHHXYY0+dXX31FrVpDgixbAE05VYgdKjOoChHd++KLL5jX6QbT/IUXXti3b98777zzhBNO4D6klmPR23r16hHcqlUrJmB6u88++1DFxEw7NMiB6ACNE6OjRIoUqbyJe1bq0KEDi+PBgwfzhDn33HMPPvhgniTscS+66CKW3UceeST3dTzHFimsrWfMmHHfffe9/vrruPGKSNuS7BkjmewLzCumBweSMjKZqf7YY4+l/TPOOIMJjMuRbejvv/++00470bPGjRsvWLCAQpaNLVu2JKxWrVoXX3wx21wSidcEfNxxxxG82267LVq0iGBE+dChQ1l4MvlpZ4wov+CCC4hkVzp37lytLrEDBgxgOqScfS0l6tu0adM4HHtT/W5xR++88w7ldI8pXG1y0BYtWhBZv3597j1K6Jhei3qIeFHcitx+9957rwqpReoAr/ePP/7gNaqEo9gjZMlh++CWkrIVbSuAHTeYkW+5ySiAHddhAfKw4zosQO5y5Mt2iCVftkOKMApgxy0lCyS362G5AhTAcgXIw75V7nIx8i03GQWw4/qyAHnKxQLkZuRbBUhJ2Yq2FZIdN5iRbnmeHiyOuZG5r7FLliwZO3YsS22eCdzOzjNEuUiJPChYhbNP6NGjh25zSQGSEy9AiRi54z2MfNkOKcIoJDuuwwLkKTdZILldXwYkX7ZDLIVkxy0No5AsV4DcjOyoJNpc342SaGrmzJnff/89M9kVV1zB7DtnzpxKlSr16tXL+Z1niLA2bdqwGa1Rowb91sqRRGY4mCuYyVWvh/IDDjiAHSQuW2E2juSuXLny888/p/byyy9v2LCh3aT1iWmnTp2YCElBKpQSvUCnHEA//vjjrFmzuANvvfVWlreU0KbzTrIjJ55agaO99tqrbdu2vCgnMlKkSOVN3KqCRo0asbbmVtUjqHbt2nvssQcreH2G5YS5xeRat27ds88+mwcaKb4xkbZ6WY/7ZPKfa62ZoSxEU6NHjz7hhBPYs7711lvLly/fc889P/roo3POOYdaZ94i7Kqrr
mIO5oJ2xI6Q1SX73UMPPRSXjSyRAFmnnHIKKeyMZ8+eDbD2ZBNM7fHHH49VJOV6I9eZpB1RFaeicso1dsOHDyeXmZLDeaZYmsU68YSJVY5w6QbtUOIURooUqRzKffNypzuPGuS58U3pIYOlEec5EGlbE+c9qfzn2jJUy5Ytr776aqbS3r17s09ls6h3hrlA4xEbZV/bltR7/QBuxYoVq1WrhkuArnsCtCdmx6mPbJlxWV1SVb16dStzo+xWSyJyaX/RokXYBg0a6FehqmOqlVRCjJ1UZK4FKJFViR0SKVKkcifdsLpPESWO62ZkhxeRu1aKV0SKVFT+cy3zR5mIpvbee++nnnrq6aefvvHGGw888MCqVatyOcarNyrREZ0LN+5vlH1Jb5rk8vLyYC1IzeAwJZJTDrjlrvJVcG2kSJEiRdqKZU8USeQ/12oyK708rbEGNJeByPeIjRs3pjwnJ0df5ZV4VWxhFy5cSBUzq74PXKFCBZplm5udna0YNei2EkykU+6Wfi8Mcqp23HFHeMmSJfppWhUi98hajboOhFSLdYdJCo4UKdLWqug23zalJ3yw/OfaMpTTFYGmujDaY489mE2Z50aNGoWr2YsJFR40aBDcsGHDpk2b0uBuu+2GRT/++CO1zlcBV69ePXXqVMqt5lxSN1auXEmkgufPn//kk0+6I2F9TkwjH330EYcjzO7Cpu+kESZLg1iaUqIdZYkwCgFVRYoUaSsWzwFu9rgTKVJR+c989kxRBgrZlG9Y69atjzjiCCbXRx99dPny5VzHzFjY33777f333wfOPPPMKlWqULjnnnvusMMONMJ8OW/ePIDCv//+u1u3bs5f7VCbQO3atdPT03Nzc4cNG4ZLpH42d9q0abSJFIYOP/zwvffeG+jVq9e3336rd6oJYAbVR7macVkQML9OmDCBplSodJiZfubMmWvXrlWzkSJF2lqlux7F/UjbkuLnPljx2KLSLCJIyshkJp5jjz2WOYYJD0Z2hlUlQBQ6P1/L9Mm0SondhiXmtuHDh9etW5eZjDnv6aeffvvtt3v27NmgQQOaZX5lWiUF0dT//vc/GmESbd68+TnnnNO1a9fq1atvt912++yzD+nu3xvFLLv//vsTScBll1121lln1a9fv3LlysQwa+644476vVHEY3/66Sca4XDVqlU799xzn3/++fvvv//oo48++OCDs7OzFXbHHXcQULFixSuuuOLdd98dOnSoctk365vVAwYMwLVfsc+IqRwlZSvaVgA7bjAj33KTUQA7rsMC5GHHdViA3OXIl+0QS75shxRhFMCOW0oWSG7Xw3IFKIDlCpCHfavc5WLkW24yCmDH9WUB8pSLBcjNyLcKkJKyFW0rJDtuaRgFsLRhwwaePC+99JKeM5I7wIkXoESM3PEeRr5shxRhFJId12EB8pSbLJDcri8Dki/bIZZCsuOWhlFIlitAbkZ2VBJtrrmWKe2YY45hEjrjjDOYNZGdYVUJJDasLVq0YEJirtXUZbdhiZScnBymrr322ivT/nu3hGGrVq164okn/vPPPxyCFE1jWVlZt912G9OnYrDNmjX75JNPmCBh5lGCNQXSLDtaJnj6pmAiP/744zfeeIMSOrN69Wr1jT6QxXTbrl07pmEnnmm1c+fO9E0NMuUfddRRTN7UohtuuIFDUEU7zLVkfffdd84dqJeGHFY5SspWtK0AdtxgRr7lJqMAdlyHBcjDjuuwALnLkS/bIZZ82Q4pwiiAHbeULJDcroflClAAyxUgD/tWucvFyLfcZBTAjuvLAuQpFwuQm5FvFSAlZSvaVkh23NIwSsQ8dl5//fVJkyatXbu2bdu2kydP5olhp1oiIE6uZgUoESN3vIeRL9shRRiFZMd1WIA85SYLJLfry4Dky3aIpZDsuKVhFJLlCpCbkR2VRP4fMJDMJAGoNpjlephJaOLEiezt2JjusssulCgAq1yJSeiPP/5gj7jrrrvWqVPHCUO0gMVlzThy5MiZM2cyh9WoUWOPPfZo1aoVs6/a0fyHiJ8wYcK4cePWrFnD3rdDhw5sWKdOnbpkyRJa3m233RSmcZk/f/4vv/zCdNi4ceODDjqoXr16dJVbhQ2uM7VzaPVh/fr1o0aNYnbH5eU0atSoTZs2+vkiamlt3bp1P//885w5cyjcb7/9dCx6S7epJZjNMf2EnVfnMKAS3GA2c03Gyg1muWHi5SZirCdFYDLWEy8QY92uyYpBuCa7Y8xck7FmSglYILldD2NxBcGM9eS6GWtWucvFcs1yk+UmYmxAukCM9cQIPIx8q5SLcIPZzA1mrNzSsFxfXrZsWceOHXngdO3a9eCDD+7SpQuFLLtVS5jaEWMTteNm5LhKd7Nck814uWEY60kXiLG+MWKB5HZ9WbkI1+TgXJOxckvDcsMwFlfgYeRMQwEqkuDIOYxzjACW62FnppRlhgPoEKxcSeWUqK+Ac6Vqvpd1t4/UglMi0CQqdkqsavu4Tgod4xC4zhF1CNWqBcd1WlCJGnGaQu4SRJtOa4hjKUyiRBbpQAKV4AazmWsyVm4wyw0TLzcRYz0pApOxnniBGOt2TVYMwjXZHWPmmow1U0rAAsntehiLKwhmrCfXzVizyl0ulmuWmyw3EWMD0gVirCdG4GHkW6VchBvMZm4wY+WWhuX6Mjc7O1qW6TVr1mSdrcJork3EykW4JgfnmoyVWxqWG4axuAIPIycyQMln45KJCw7RA4lJSHObW045kQIUr7NrkVPlBEsqV4zkFDrHBTIyMuRiBfoqk5ijwIoBVI5VBxBsNVRUigEU4EhtArJILbvjI0WKtNWIm5pZtnnz5rVq1dKdzi0fr4sUqajS77vvvji65MzY7qk7EaOkYUkDpOKGhYlPxKjE6e5CFCYX+VaFyXUzKm5KaRgVKyVRAEqaK/lWhcl1MypuSmkYJQ1LGiAVNyxMfCJGJU53F6Iwuci3Kkyum1FxU0rDqFgpiQJQ0lzJccPEJ2JU4nR3IQqTi5KGJQ2QwoSVFaNipSQKCLOVKrIRdkShklUbzHITMRZXEMxYT66bsWaVu1ws12QzXm4ixpopDgvE2KTxWLkmKwbhBrOZazJWbjDLDRMvNxFjPSkCk7GeeIEY63ZNVgzCNdkdY+aajDVTSsACye16GIsrCGasJ9fNWLPKXS6Wa5abLDcRYwPSBWKsJ0bgYeRbpVyEG8xmbjBj5ZaG5YZhLK7AZGxArsPIcZXuZrkmm/FywzDWky4QY31jxALJ7fqychGuycG5JmPlloblhmEsrsDDyIkM0OZ6DzlSpEiRIkWKJEXvIcdVLHYXojC5yLcqTK6bUXFTSsOoWCmJAlDSXMm3Kkyum1FxU0rDKGlY0gCpuGFh4hMxKnG6uxCFyUW+VWFy3YyKm1IaRsVKSRSAkuZKjhsmPhGjEqe7C1GYXJQ0LGmAFCasrBgVKyVRQJh9bZGNsCMKlazaYJabiLG4gmDGenLdjDWr3OViuSab8XITMdZMcVggxiaNx8o1WTEIN5jNXJOxcoNZbph4uYkY60kRmIz1xAvEWLdr
smIQrsnuGDPXZKyZUgIWSG7Xw1hcQTBjPbluxppV7nKxXLPcZLmJGBuQLhBjPTECDyPfKuUi3GA2c4MZK7c0LDcMY3EFJmMDch1Gjqt0N8s12YyXG4axnnSBGOsbIxZIbteXlYtwTQ7ONRkrtzQsNwxjcQUeRk5kgKJ9bVzFYnchCpOLfKvC5LoZFTelNIyKlZIoACXNlXyrwuS6GRU3pTSMkoYlDZCKGxYmPhGjEqe7C1GYXORbFSbXzai4KaVhVKyURAEoaa7kuGHiEzEqcbq7EIXJRUnDkgZIYcLKilGxUhIFhJlri0zOjihUsmqDWW4ixuIKghnryXUz1qxyl4vlmmzGy03EWDPFYYEYmzQeK9dkxSDcYDZzTcbKDWa5YeLlJmKsJ0VgMtYTLxBj3a7JikG4JrtjzFyTsWZKCVgguV0PY3EFwYz15LoZa1a5y8VyzXKT5SZibEC6QIz1xAg8jHyrlItwg9nMDWas3NKw3DCMxRWYjA3IdRg5rtLdLNdkM15uGMZ60gVirG+MWCC5XV9WLsI1OTjXZKzc0rDcMIzFFXgYOZEBir4bFSlSpEiRIm1eRe8hx1UsdheiMLnItypMrptRcVNKw6hYKYkCUNJcybcqTK6bUXFTSsMoaVjSAKm4YWHiEzEqcbq7EIXJRb5VYXLdjIqbUhpGxUpJFICS5kqOGyY+EaMSp7sLUZhclDQsaYAUJqysGBUrJVFAmH1tkY2wIwqVrNpglpuIsbiCYMZ6ct2MNavc5WK5JpvxchMx1kxxWCDGJo3HyjVZMQg3mM1ck7Fyg1lumHi5iRjrSRGYjPXEC8RYt2uyYhCuye4YM9dkrJlSAhZIbtfDWFxBMGM9uW7GmlXucrFcs9xkuYkYG5AuEGM9MQIPI98q5SLcYDZzgxkrtzQsNwxjcQUmYwNyHUaOq3Q3yzXZjJcbhrGedIEY6xsjFkhu15eVi3BNDs41GSu3NCw3DGNxBR5GTmSAoveQI0WKFClSpM2r6D3kuIrF7kIUJhf5VoXJdTMqbkppGBUrJVEASpor+VaFyXUzKm5KaRglDUsaIBU3LEx8IkYlTncXojC5yLcqTK6bUXFTSsOoWCmJAlDSXMlxw8QnYlTidHchCpOLkoYlDZDChJUVo2KlJAoIs68tshF2RKGSVRvMchMxFlcQzFhPrpuxZpW7XCzXZDNebiLGmikOC8TYpPFYuSYrBuEGs5lrMlZuMMsNEy83EWM9KQKTsZ54gRjrdk1WDMI12R1j5pqMNVNKwALJ7XoYiysIZqwn181Ys8pdLpZrlpssNxFjA9IFYqwnRuBh5FulXIQbzGZuMGPlloblhmEsrsBkbECuw8hxle5muSab8XLDMNaTLhBjfWPEAsnt+rJyEa7JwbkmY+WWhuWGYSyuwMPIiQxQ9B5ypEiRIkWKtHkVvYccV7HYXYjC5CLfqjC5bkbFTSkNo2KlJApASXMl36owuW5GxU0pDaOkYUkDpOKGhYlPxKjE6e5CFCYX+VaFyXUzKm5KaRgVKyVRAEqaKzlumHgPI89uTCoWuwtRmFyUNCxpgBQmrKwYFSslUUCYfW2RjbAjCpWs2mCWm4ixuIJgxnpy3Yw1q9zlYrkmm/FyEzHWTHFYIMYmjcfKNVkxCDeYzVyTsXKDWW6YeLmJGOtJEZiM9cQLxFi3a7JiEK7J7hgz12SsmVICFkhu18NYXEEwYz25bsaaVe5ysVyz3GS5iRgbkC4QYz0xAg8j3yrlItxgNnODGSu3NCw3DGNxBSZjA3IdRo6rdDc7lkJ3WJr9V7cLCgoAlbsDFJOIsXIdFoixvjFigeR2fVm5CNfk4FyTsXJLw3LDMBZX4GHkRAYo2tfGVSx2F6Iwuci3Kkyum1FxU0rDqFgpiQJQ0lzJtypMrptRcVNKwyhpWNIAqbhhYeITMSpxursQhclFvlVhct2MiptSGkbFSkkUgJLmSo6bKIY5lYe787h3AAurRC5WKha7C1GYXJQ0LGmAFCasrBgVKyVRgAY8WEUmZ0cUus9WMMtNxFhcQTBjPbluxppV7nKxXJPNeLmJGGumOCwQY5PGY+WarBiEG8xmrslYucEsN0y83ESM9aQITMZ64gVirNs1WTEI12R3jJlrMtZMKQELJLfrYSyuIJixnlw3Y80qd7lYrllustxEjA1IF4ixnhiBh5FvlXIRbjCbucGMlVsalhuGsbgCk7EBuQ4jx1W6WPMrduXKlUOHDu3cufP06dNHjBix9957t2vXLisr66efflq1atVxxx1Xu3ZtdrcEhzmc+xAOC8RY3xixQHK7vqxchGtycK7JWLmlYblhGIsr8DByIgMUfTcqUqRIkcq1YrEYs2xubu73339/6qmnXnDBBQMGDHj++ef79et39NFH9+/f/7777vv000979ux57rnn5ufnx9MilScVmZwdaQ0FOPN5AMtNxFhcQTBjPbluxppV7nKxXJPNeLmJGGumOCwQY5PGY+WarBiEG8xmrslYucEsN0y83ESM9aQITMZ64gVirNs1WTEI12R3jJlrMtZMKQELJLfrYSyuIJixnlw3Y80qd7lYrllustxEjA1IF4ixnhiBh5FvlXIRbjCbucGMlVsalhuGsbgCk7EBuQ4jx1U6DPBMtitTnn76aebUn3/+mR3tunXr2NTuueeeL774Yr169T744INLLrmE/W6TJk2ITHQID2PNw1l1RrnJAsnt+rJyEa7JwbkmY+WWhuWGYSyuwMNIn5cHyz+ChsqD4r3ZOC5ua8pK2NrlPtP/gnQs5L6qfBVPiBQp0maQc4vxTK9Zs2Z6enqbNm1wa9SoUbt27YYNG9avXx+3RYsWbGqXL18e3ZL/suynYBKV3/eQeb6zlEPOgz4Wi3EZzZw5c9q0aatWraIcUbhmzZqsrCzFbMXilaKFCxc6K9zNLWeEuYGxlHBoQCcFqQQgQG6kSJHKXEyxkvNkh5lxMzIycFWFi1SlmEjlSkU2wo54gOpsqTaY5SZiLK4gmLHuXGdGAaZPn/7VV1999tlnS5YsqV69epUqVVauXNm8efOuXbvuuOOOTz31VM+ePffbbz/F+x7Cad9huYkYa6Y4LBBjk8Zj5ZqsGISblH/77bdnn3327bff5h5TidOOGOtuNhHLTRqvU5Cbm/vxxx9PnDhxxowZTiHilmaJ3aFDh/+3dx4AUhTbGiYHSUoSERSJYgAURUUUxYggiihmzDnnnL1GVNR7xZxzzlm5ioKgCKKSRJKA5JyXXd43/Q9lbVV3T+8OXtHX/7vv+J1T51RX98xUGJZljz32aNiwIU181P3+xVjnEgIx1nZ9Vg7C9dnO8Wt9xvolpWCBZLsOY3EF8Yx1am3G+k12XCzXj/ssN4qxMeUCMdbJETiMQptUi3Dj2a+NZ6zcfFhuEsbiCnzGxtQaRsZVuc3o8ccfP//885kJK1euzGewffv2HTp0+M9//kPa0KFDd9lll+HDh7dt21YloZdwGGtfAhaIsaE5YoFku6GsWoTrc3ytz1i5+bDcJIzFFTiMeCGyFK317jvk7AjWrrUcZK+
66ipm82efffbss8/++OOPBwwY8FGgHj16XHHFFfvtt9/gwYNZcbP1/zjxEASzZ88+66yzAJY0Rf5scWneT5UqVerdu/c555zD/ub1119n07PDDjv06tVrr732Wr58+cknn8yr88QTT5hxpkqVat2KD5c+XwLJjtssQIqn+h8o+8Rjtd59h8zkziqLuAEWUebxu++++7jjjvv666+PPPLIZs2abbTRRtWqVWvUqNEpp5zy4IMPsvC0atVqww03zNb/48Sj0Le411577Q8//FC/fv1sw58vni0HaLZsLLcNGjTYeOONeXVatmx53nnnHX744aeeeionbEY1ZcoUVuK33nrL3uilSpVqXYlPFmIeWLVq1dSpU5kTONdi582bt2zZMnbhxJkipk+fzqf1t99+KygooJUSbLaLVH+11rvfZYG0U+AIyxLLPH7ZZZfdcMMNTPcEs3lBJu+qJk2avPbaa507d95///2zDYFozVKuy2WphGwHUZJaFNqUs5a7ht977z3O93xyDj744F133VVNKGc5yoeRXI6wd911Fx/mI444omvXrvq6mLE1bdr0xRdfXLx48YwZM/r06aMSKbQfKYpRaFOSWptRSUvyYZQzLWeCVNK0JPlRjEpdbgdRkloU2pSk1mZU0pJ8GJWoJCoB5ayVjGvHNfuNHj165syZHTp04BNXr169gQMHNmzYcLPNNluxYgXzIa3bb789VRxFatSooUIp6nKhbAdRklqUMy1ngpQkbV0xKlFJVIK9NkWp2JfORgRVrNZ4lhvFWFxBPGNxFRk+fPiBBx7I9H300Uc/+uijauXNpD+nJMfs1w455BByDj30UOWg0EuY/g3LjWKsX2JYIMbmzMfK9Vk5CDeUudNffvmlW7duEydOxH3iiSeOPfZYHoVanT6xdrdRLDdJvlw0duzY7bbbjr0zmxteGsWxbKh33HHHUaNGceBmQ23+INn0L8bKNWz3j7Vdn5WDcH22c/xan7F+SSlYINmuw1hcQTxjnVqbsX6THRfL9eM+y41ibEy5QIx1cgQOo9Am1SLcePZr4xkrNx+Wm4SxuAKfsTG1hpFxVQ4DkmkVO2mmVTJs4oDPWJMjFoixoTligWS7oaxahOtzfK3PWLn5sNwkjMUVOIxMZozCv0Oml79QS5YsufDCC9m+sWW76aabONEygyOdpSQWG8QdtmnTpnXr1vrC5B8m/fQvi9nll1++33776aXZaKON/pIXaNCgQYykbt26+hk081abN2/ewoULeS0aNGjAa6TkVKlSlVrMZlLWt+Z3pB/X0ASIxAQNI5JNFWB3lerPkB51vNbHn4167LHHBg4cCJx77rn6S9nZNk809enTp1mzZqxJ2dA/RbprXsX+/fvDxxxzDJZPFEuaEv7H+uCDDxgAR1vOr+btxWf4q6++mjVrFtyrV69saqpUqfJTsNRmvrrTZ60UohMsEyOiK3Wb6k8Sr1ROFTsIG+mlEmDjWW4UY3EF8YzFnT179m677TZu3DgOtYMHD9500021TTM5iDSTr3ckrmkF/Euo1Wa5UYz1SwwLxNic+Vi5PisH4dqs+xo6dOipp576xhtv/PjjjyxmHBx5MptttllUn1jTLY4e3Br+D2XjGUTmcibfcJmiNWX4b8BlKV5TNG/+/Hbt2k2bNu3uu+8+55xzTOaY0WO6H9h90qRJHTt2fOP1N+rWr6d+lGAzdu0lMgPi/zMJwX+5QnC1TE4mUJKntLbPP3L8Wp+xfkkpWCDZrsNYXEE8Y51am7F+kx0Xy/XjPsuNYmxMuUCMdXIEDqPQJtUi3Hj2a+MZKzcflpuEsbgCn7ExtYbRqlWr+KSPGTPm5ptvLigoMLVqxfXZzoGZPSpWrFinTp2ddtrpwAMPrFatmg6+mRrv0linXCDGhuaIBZLthrJqEa7P8bU+Y+Xmw3KTMBZX4DAymXGiwBcvlYGcjGJYrgBFsTZf6KWXXuJtwcCOP/54fYmqOMr0GMiwiQsk27VzHEYxbNxQFgRbRnd4DptbkFUc2TmS2KQB8+bN23333Z9++mk+b/fffz/PpEqVKjNnzlSC5LBx6WVF0eo1q1atKVxVVLiioGjl6sJVK9cUrmZIwf8K16wx4DAEr1qzZlVhULBqxUfvv6tv8ocNG8YdrVixYsK4cY/d959mTVtWqlyt91FHTZo0uXBZQWFRth91ZbNxV9JrwbIiRlXE4FYRWmkNG4VykJJRKAcpxRjFsHHzZIFkuw7LFaAYlitADoc22XExCo37jGLYuKEsQE5cLEA2o9AmQMrJmexACdm4+TBKyHIFyGG5AhTFiM8+n7Lly5czB2oSx0r6ApmIWThtEaQVq1a5l19+OYs3fWZ79y5tXMMC5MR9Fki2G8qAFMpBSkYJ2bj5MErIcgXIZhRk5VCxxdmIYr2iao1nuVGMxRXkZK7bp08fllv4hRde0E88mTRAybhiOy6W67OfLzeKsX6JYQGj5fNg3NB8cnjHs0qxTOIqriY7H5l+xKyvV199NcvtAw88wAfszjvvvPLKKxs0aDB27Fh2qX4/YmyxbgvLrqqQeY0Ly5SpWli2TPlil/DzA15TUGZ5BXhNpaI1FVaz7VlTdP01V99x2x3slFu1asUdzZ8/f9GiRQVFZQ8/6uhzTj1lm62ala1Utky5ChXLVF7b/R/9i7Fyi8qVLVe4ZmW5MhVJKMMFyq4p/8cwEGk+2135bOf4tT5j/ZJSsECyXYexuIJ4xjq1NmP9JjsuluvHfZYbxdiYcoEY6+QIHEahTapFuPHs18YzVm4+LDcJY3EFPmNjag0jVRFhY73vvvv++OOPzCEEmQe6du26zz778BkkQQsqYimdPn36+PHjx40bN3nyZFyS1Qlq3rz5Dz/8ULly5ahLY+UaNrVO3GeBZLuhrFqE63N8rc9Yufmw3CSMxRU4jMxrEaNiBUbmMuYaMSw3irG4ghiWli5d2qFDhzFjxrAyjRw5skmTJtqaqdXuVmz6NyzXZz9fbhRj/RLDAFq8ePGRRx45bdo09oxR+Rr/Mcccc8kll+AqriY7H+HCimM/+OCD6667DlunTh3cCy+88L777mvbtu33339PAlKV8g1j5cJ4BZnvgMuUxytbhpWzkA+svk/OpBHNAjLMf8musLrMmrLlCspRUmbFilV77b77D999x82eddZZy5YtGzVq1Asvv/LN0KGNNtv02muuOeHwY8uWKbuywppKFdZ2avUvRnLXlC1fuKZsYZk1VVdnwmXLFa0uV1i+TOZXzWWag5H7nLmZQLg+2zl+rc9Yv6QULJBs12EsriCesU6tzVi/yY6L5fpxn+VGMTamXCDGOjkCh1Fok2oRbjz7tfGMlZsPy03CWFyBz9iYWsPIVLGmDhgwoGfPnkyMms0333zzL7/8Uj+0gYtVsoAd8M8///z4449zXOEYoCW5evXqI0aMaNq0aejlTLnNAjE2NEcskGw3lFWLcH2Or/UZKzcflpuEsbgCh5HJjFGxAiNeJBWba8Sw3CjG4gpiWGJ31rJlS94ldevWZY9Wq1YtdWX6CbL+uIQdD+
3WsJ8vN4qxfolhwcKFC3fZZRe2k3CMyD///PNvu+02ACloLqeuEC6s+JQpUw455JB77rmnU6dORPiY9erV680339x1110HDhyoHFU5jLW6XbO8bLnKhWXKlllddumisuVWlFlTcU1R5SCXYOZYKcBazPUKqChbYU1RuZVlC8r9NmPhVjvsUFCw6tVXX+3evbvGuXzFiotPO/WtZ58rqlLh7jvuPOKI44oqlSmfOaKuvZ21/Yux2UusWbWmwqoy9F51ozVlKjKBrFmzOnPijrgj63Yy0tUdtnP8Wp+xfkkpWCDZrsNYXEE8Y51am7F+kx0Xy/XjPsuNYmxMuUCMdXIEDqPQJtUi3Hj2a+MZK1dsy4nj+vl2V0kYiyvwGRtTaxgZF1i9evX9999/+eWXa0OPDjrooGeffbZq1aowaeQIsLDc995777jjjluwYAHMaZgt+1577RV6OeWbcrFAjA3NEQskuabWuCbHsJ1m2E+LZ6zcfFhuEsbiChxGf6dzraomTZqk/Rcr7nfffceOzLRiiRtXbMdNV7g++/lyoxjrlxgG2I6gH3/8saCggKDiWD8fu8kmmzRu3Jig4graOQgXpjc+XaeffvrEiRM5CuuPrvmYXXXVVT/99BNL3VtvvRV04/YjxsrNcNHqleXLVlq1pszKJVOPOLfq9Anlyq2kryA3k0dS8N/M2VecCbIissQWVigsV76obJlFFWp+0267Ex++b6N6dUeOHFm/fn29q4qK1swaMGh4r0Orr5xfq3qtRo03LWRZLqqwdnnN3s4fzH+ybln2DssbNGj0dP+VdRpVKCpbvszqMuXStTakW4exfpMdF8v14z7LjWJsTLlAjHVyBA6j0CbVItx49mvjGSsXlvjM6qsmhMtnTQkVK1bEtfMNy03CWFyBz9iYWsPIuJpkVq1adfDBB3/22WcEcStVqnTrrbeed955JlOSi4WZMViPTz31VP1o1cMPP3ziiScqR2k2Y51ygRgbmiMWSHK5NBZGBHFXrFhRo0YNtSKCSgiK/mA1IWXmZKzcfFhuEsbiChxGJjNGxQqMzGXMNWJYbhRjcQUxrCrOc82aNeP91Lx582HDhplffWL6Ma7Yjpuu7G4N+/lyoxjrlxgGEG+jpUuXKqg4Vq5hxGe7QoUK1apVM3E12flI3fLhf/TRRx988MEtt9ySCOIq6NNPP128ePHZZ5997733KllVph8xVi68pnBNUfmy5YsKVxfOm7VDj8pjh63ZcIMy5aoEuaSRkwVkmP9mPitFFYvKFVWfP3dx2ar/2nijB6ZM3H+f/dksk6O1dkmZ1eW+G7aoS7eKK+bPLlexZvXadMy1rLU226cYyV1dtnD1kkXl6zTa5NtPltZpWoV1ttyKsuWqRN2RuR1FcH22c/xan7F+SSlYINmuw1hcQTxjnVqbsX6THRfL9eM+y41ibEy5QIx1cgQOo9Am1SLcePZr4xkrF0Z8uKZPn75w4ULFldC6dWt2tEiuyTcsNwljcQU+Y2NqDSPjAnz2gQkTJuy7776//fYbcyMuh5BPPvlkhx120IeRZJMvlzul8OSTT37uuecIXn/99ddee61ylGYz1ikXiLGhOWKBJFcjxM6fP/+jjz5iNuMENXTo0Dp16tCqfKx9CbtPZHcbw1i5+bDcJIzFFTiMTGaMihUY8aRUbK4Rw3KjGIsriGfsvHnz2rZty+dho402GjNmjF4e4qafTOpa18jOketzVD9RjPVLDAsWLVrUs2dP8wEIzQfQ8ccfz8FULnHTJFAEl35+/vnnPn36vPDCC61atTJd6beyTZ06lc/M1VdfTQSpyukTKxfGoa185oeAlyxq02nVb2M2euudwk23CHIzaU6+YbzMn9YWLVl+0UUrvxx6bsGKlwoL7+zb98ILLyQtm1+mcPmgb5buf/C4woU3ri571+PPtthu27LlMr/LIujhj/7FWLnl5kxf0OeoCkWVNxr0RVHDRmXXlC3iQJz+eW1Ytw5j/SY7Lpbrx32WG8XYmHKBGOvkCBxGoU2qRbjx7NfGM1YuLA0ePJhD3muvvQYTr1+//vjx41loSVPE5BuWm4SxuAKfsTG1hpFxATH21VdfPe644/RzTwy4ffv2b7/9dt26dUkjweTLBZhMRo0atcsuuyxfvvzMM89kj66FWWnKF2P98kybF/dZIMnV2sEq279/fzYEHGo322yz7777jqHa3fpsurK7jWGs3HxYbhLG4gocRubZxokCXzwvAzkZxbBcAYphVFBQ0L17d8bNm2nAgAHszlA2o3i3ZC5ZsuSnn34y8SAlo1AOUooximHjhjKWbSM7g5YtWzJUjq1RqlixIgmXXnop+aqVDKsrfXXMjrtLly5PP/00nyjiEq2TJ0/WPznwwAMPRPWDgvQ/ukWr16zO/A2eZSvnbrnlr7WqrZ6Q2RZIofkZyDJjWjm39yEzqlbtWb5cxYpVRowYYV+6qGD1gq8HTq5R+9Mq5RrWrDVn9pzC1asoDmozyuSsVbYk0OrfZ81s0XTOZputmT4z85eA1OnaHBTKQUpGoRykFGMUw8bNkwWS7TosV4BiWK4AORzaZMfFKDTuM4ph44ayADlxsQDZjEKbACknZ7IDJWTjihFvN/as+sMppsuGDRuad7US7HwxSshyBchhuQIUxcjOFzNI5gSWTOZGxMjRqaeeykrG1BGar5JDDjmEzP3333/lypXKQSYfmRKbBciJ+yyQ5DKPMSQGhk455RSu3rhx4zlz5ihBCmVASsjGzYdRQpYrQDajICuHEqzG/0OxLB155JFY3ihPPfWU7iHbZklB9qcsQor878UY+MS+/vrrQ4cO/TYQezfk8JAhQ4Bzzz2X91y20hNNfH7o8O67727SpIn9u50RPHv2bDYWwCabbJKN/m/VokVzBsbLkfXLlOGFGf31oMKCwrJlynY94IBa/9x/ZynVP0aaNzhgVapUSZG/l6655poOHTqYKfGZZ57hgB4zsTCraDplE29/eP9UMTyGxDGDq+vLuZgR/r9S+FrLA/qr1KNHj86dO/M6vfjii7yZstHiYt/EQssbaK+99sqGAmU2GMm2GAnTYsRT4k3cunXrdu3atcmlBg0aZMsixN7ijTfeeO6552688UaOws5LwAGaTSKgL9UVjFf2Bq3bVI/Ojdtucc5C8HYos8suHatVq6YNAWLr+vlnn916+x1ly5WpW6fO5ZddxlabTYFapaiekfrkP1k/kJMjmWBoa6rk+ls/wHU1eL3xzNSfjf4dxGj59LFLeOCBB2rWrMktII6ql1566ejRo7NJlnR32D322OP2228/7rjjyFeTrT/jXcE4NTymR33Frfg/+yPMbeZU+Fqrh/WXqGrVqvfddx/rEwvqeeedd88998ydO5fXiVGxIBEcM2YM77Bff/31wgsv5OWkhCbu1ryWgC0ToZxMMSKuQvWsNGOR8jUqX3pLYc17K1RKYJx+GpfQN+SAzr6s3KzKuqgGgCXhm2++AeiERRfXDFtMGqBkBGe/eC8sIok2+spchG7Xvu8lalVOvrkWHWVyM1yYKSsqY
tw7dtiRxZ7PNsfrUaNG8fyPOvqo+ZnN8pqGTTZv1qoVK6cmMPWMgp709XAWZMnh7jJ27bOFyDetEsyoBDQB6hbBKkyVRDwxPVisAK0/T5IxMCoGY94AGlgw2EwcVjBbkIf0YVSHEkF61nUlmJHoLaeq9UEaOTNA27Ztr732WlhjnjZt2gUXXLBo0SKNOZsd5GvOYbVjnjzttNP0b5La4sal4L6zjx2rnuXqJcgWJFNmoNZySycAnWiEuOpWPav1HyDdV7wyTyGLlvR8Bdh4lhvFWFxBPGNxWU2BGTNmXH311S+//DIvz6abbrrTTjs1bNhw1apVLLRz5sw59dRTTzrpJJ3/KFQ/ZKoTiQgvOZYVgj6rV6+uTF7jZcuWVQh+kRMuCQigHODNgWWWJ6KvQdQbrvoUC8RYE49huTYjBjNu3LjXX3/9oYce4mNTu3btm2666bDDDttwww3JZBhffPHFiBEjbrvtNv3wZMuWLVnntt566/bt21NuumLYGjMD5oNXOVDZMuUKy64uW7RmxYx5K7t0XjBz6ubDx5TdvCFpXNeMRC6d6HHRKavfd99+++P3Q5vceee2M+ecVbBqUP1NGm6yMUdb9j1Tpkwhq0XTZqe33bbb2x9V3rZJ3U8HrqlUqcwann/mV0DRm3pGQX+ZSwBZO3ve3D06VlhRUGfId2sa1CmzpizZSxYvZsC8oBoMt6MhIUbFrWFpklXPSB0KsDbLjWKsX1IKFki26zAWVxDPWKfWZqzfZMfFck1cTxLGslcbO3Ysj7dp06b16tXjeSonU2P1Kcba3TosEGOdHIHDyG/CmhH+/vvvw4YN46DGxo5FYrfddttmm22IkxY/1CjGyrWZaWSTTTbhUcCNGjUaP348H67//ve/EydO5APYuXNnHg5NXFEXRSW6nMBnbEytYWRclduMXb58OefUN954Qx80Bsnqe8UVV/CyasA5L2FYc6b6mTVr1qRJk+D69evzWLQ2E9eUCMh1WCA5l8A+99xzxx57rH42qk6dOky8vP1oatWqFdMvPZPDmO0+kdNPFGPl5sNykzAWV+AwMpkxKlZgxBNXsblGDMuNYiyuIJ6xxmUAzLB8DN57771vv/129uzZvDbNmjXjk9ClSxeWIvtdhV2wYMEHH3zw1FNP7bfffn369GFxevPNNx999NFOnTrdf//9AwYMeOSRR/j00j/v1NNPP53P8CWXXKJOeMN9+OGH1HKk5iT9/PPPf/zxxyeccMI555wT9bfuQocdz3Id5jY5JuofptUz54q77747N8vYmBS+/PJL9gq0MkisBrz55pvXqlXr3XfffeWVVy6++GJOw+xLfvrpJ54AO5IzzzyTdeuOO+6oWLFSUdnV06dMveCYE2/5aUSlMqs3Hjzi03E/33fffXwAmMiomjx5Mp3wqK+//vpffvmFD3DtOnWY+oYMHrxkwZwmd96x4bfDp1x28Yz2O5VZk/lA8vFgxWUyqlu7dtHYMYv36V5pq81rf/pFZq0tU7hqVSFz1oMPPrjzzjsff/zxbAv4jL311lsbbbQR0wGvI892iw2qz959l3IrVtUeOLh8k01Za2fMnn3iCccffdTRhx9+OPsbrsKNs//gg8qtHXrooXQ4ZMiQ6667jl0IreYZAiV62oaxfkkpWCDZrsNYXEE8Y51am7F+kx0XyzVxAPHi3nzzzWzs+Owwq06fPr1bt2433ngjG1m/TzHW7tZhgRjr5AgcRqFNvLenTp16++238wZu165dzZo1P//8c9Zd9sdHHXUUn2VtlMkM7TaGsXJt1lo7f/58XNaVu++++1//+hfLDDMDEZ7PlVdeedZZZ5nFBpXocgKfsTG1hpFxVW6zgKMIs4TmDdwaNWrwKd57772VGdptKDPnwExBTAX63QZsO3788UeADyxzKRMOIodklThs+kS2qzQ+wszGrLV8fvn4X3PNNbwDiTdu3JjTNjsGpjunf+T0E8VYufmw3CSMxRU4jLiLLMWIAl96DQQ5GcWwXAGKYbkC5HBok4nzceW9wseDVZO15P333+fj9MQTT7CK8I7s2LEj0zQ72cWLF7OIbrXVVmyvKFE5ORx8ydl1111feukl3m133XXXBhtswBtaCci5nAA58RhGoRykZJSTM9mBGC3zxdtvv82Ky2LWr1+/F198kfV1+PDh3BRvaz4qV1111bIly6b9/ttuu+56SOe9pm+22a+1qq0aP3nMmDFsKZ599tn+/fuzhnGbAwcOBPRbl/kkFBbpsWR/Dnlm1arLP3ov+/W0NWye+JLvvp2xYYM5u3ZYs3xFpijzBX8Bk+bGG29M53379mVU9M8EyghfeOEFZrHPPvusYPrM6c2bjqpS5bZTTlu6bPGCufP27LJ3x107sp3ihaBjXYIXa8stt2Q94AXlVWPfw15Yf5pgpGSBwyiGjZsnCyTbdViuAMWwXAFyOLTJjouRYd4PiA3QFltsccYZZ7CA8RrxqHfccUdebnaurLs8dvvJSyqPYQGiNvPCF/9pWAESK42RhDbNmTOHTx+TF4scb2zSvvrqK33JgXgnBwPM/mmLFMXUMhKSWTinTZvGQcpcxaSxeeX8qtmPh9ChQ4dnnnmGKx5xxBG4TJrYhx56yHRl16IYlkuhkYLI5AgQrXSOhe04svMdVhUDY1vMh0tTPE+JOY2Jjia1qkpW8pmueNrMhM2bN2fLxRJLhCd200038QT4FE+YMIHeSJZMrc0CyXaDrCIeLMNjOuLYwxaKrTPnBDpnwLy+nAcYqklWIUrIxs2HUUKWK0A2oyArh8JX42zj30e8eCNGjGC7xEGtZ8+eLVu2ZIJu0aIF8UaNGrG08EbUUXXo0KGctHh76Tax5DD78MbaYYcdDjzwQM6UFPKB1G84U//rlfiAMWZWWQY8cuRIdo68jxm2/q3f7bffnvvlXX7FlVcdc0wfVriHn3mmctUNypYtV3ZNGc43RLg1ZhZKOAf/+uuvvXv35nPbpEmT4Ohf2lvmiFqmDGstH1c+QpxI9M/a628hc4DgM8aHTefjShUrvvveu+ecew57W57xiy+8yKiy/QSaOXPmlClT9grEOHm9lixZoh8QS5VcPGqOsCeffDKvxS233MKrwJsH7t69O61ssxA5SPmlEG9FRLdZP0yajLiKPysRfPjhhwcPHgywRWY7BbBysI9UAmcjnUHl5hQjYaE9/vjj+RTz7op6z3AVVK1atSeffJLPws4778wBl48D5Qzy5ptv5p2sDxpp2ZrEUufxY1ZO1kksldAzr+D555+PCyOWzMsuu4y1U26SnpWpuZF9LXMmN84SePTRR7MdZxM2aNCgzBDzeG8gytnhse1mn82UxYrOEYj5inWdh/zDDz9k8/7m0oOKV/haq5fhbyTG/Pnnn/P6sWDwgWFB5VybmdmD0z2bKT5RHPhef/31Rx55hBxzjwBPihMhfNppp3E6JJ8TLWuP/kqr0tYraVRffPEFR/CddtqJWYn3MScV3sG6HbbqHO4fefjhyZMmPfDAfzaqvRHvhcwboly5Tz75hBvcbbfduMFPP/20Ro0a
bdu2pfDrr79u37599erVSdNVSiGGxRGKnvVPyvOKbLbZZvXq1WO7w5zO3Adnxl6mzIa1a995730vPP8iH7/HHn/c/wfw6YcgH3tWaIbEJ5+zCDNjtjlVAvFmYPZ88MEHmUxZdfRNbOZtUKYMOxieJ68Ub5483+T9+vW76KKLLojVhRde2L9/fwaTrbHE1cePHy9g46jJl7eoea1ZOM0SkkSU0+Hbb7/NKvvOO+/oDwizbZa4HOKd36xZM4AcNpp8iDRIFlrevcpBKkkoRsuJLckzYUMcOrYYMRjtALBcgi2CRsiw33zzTRZOQBHlx0g52ovwtOXSLWdc3hi4kyZNoilJVzGinPcYb0LmBGZX9s2HH3747bffXqFCBTblTz31VDbvby5uM7d4mr54wQzkZBTDcgUohuUKkMOhTQDrK+IUyyHvsMMO45O5evVqZhZOe5xNYd76HIl4c3M2Yu920kkn4RKnSuUw79oePXqQj4gccsghzPLEM1cNZC4nFiAnHsMolIOUjHJyJjsQI+QwytRwww03cILk09KmTZu+ffsyeFyGzTG9W7du22+3Q4OGDe7ue2fB/MWzW7T8tVa1+SN+bteuHVsKnglnBU4PnD4p54GwseXzQHxV8AVU5sGU9DvkooKCVQW77777wQcfzEgYBq8Iizpd4hK/5pprCBZMn/F78y2mbdzgrP26brVN68023fTmf92qozCZwYUzX21xFD733HMJ4vLi7rLLLldffTVxXJQZoPdkbEYxbNw8WSDZrsNyBSiG5QqQw6FNdlyMYB41WrBgATvOSpUqDRgwQM9ND5Amti9TpkxRUCWykglGsQB16tQpyXTTpUsXLsrl1A+iVlZ7Mk5UvDn1T8Jh2SurTzZtHM2DSxUbXpaKM5fgPcOt8YZhWmftXLRoUVCakXJ4N5rvkNmXa1QIuOKKKxRHJ554IkF6s2tRDMvl7cr+NecDQXz66F+1QR9ZGTfoshgbaWxsTdiSskDqi1n2spwo9DlSiWqRz1humV0F+/LRo0fDFC5dupQegj1x5q/zchXSJFNrs0Cy3SAr+x0yL+6cOXNwM085EJ93/dQbC7DmCoLZSq+fLIX1nz+jhCxXgGxGQVYOFfsDXiOK9V5RazzLjWIsriCesU6tzVi/CRCPGjWqY8eO77//Puc8llsOdnxsevXqxU6Kd+Stt97Kq86mj+0b7+/999+fiP5AiE5YbNjms7iyCBEZN27cAQccwAmYBUyHKnLM5cQCMTY0x2G5PisH4cazXfvqq6+effbZw4YNY0pixuSu33vvva233pr5hffxySefPGHChJdefHnU2J/OOOX06y64pOcj/RfOnL7qudd3PvowNvs8K2q7du362muvsRz++OOP8AcffEAPXKlC5ucDWU1XLzj6yNXvfFDzzVcr7XuAvgDh0tlhFBUtHf79kr0PrLD1ZnU+/TLzs1FlC2fOnMN0+e9//5vNysyZM9nZPPLII2xjOX/zeXvjjTc6d+5cdvbcOXt0mj9l6kWbb3HXW6/NnzX30KOOOufsMy+95FJ1jpgf+RAyBfBCEGSdOOOMMzgZM+3iMrMEY3GfjM1yoxjrl5SCBZLtOozFFcQz1qm1Ges32XGxXH3+mUl5TYnw9HhpiJscI79PMdbu1mHTzwsvvPD777+bwijxRu3duzdp5uUzl2Oo06ZN++2335o2bVq1atUnn3ySPllFtM4x748YMcL8Xbig1B2qzZQAc+fOZf1gn0E5rRqt0lhRGAx7TRjgusQRhTfffPP1119PHO29994ffvghwIBjLmczFpf+X3zxRfa7pilU5LPBZS7SJGMnG9f0aViuWPbxxx9nKtAaxlDZXjBsfV1h+oliblm9IeYNhv3KK69Ur16d3Q+fQXa3PA3dPskqcdj0iWxXafrZqMaNG3///ff6t0HVijjz8EJzhmYi0tJu1yZhrNx8WG4SxuIKHEbmXR0nCnzxGhjIySiG5QpQDMsVIIdDmwD2RIj5nTcubxfecEOHDuXFe/nll7/44gs+rv379+cTNXjwYGXqn3L817/+BesNOmbMmJo1a/JuIJlZg/mdVRkmIbhsRuZyYgFy4jGMQjlIySgnZ7IDMWbWHtYwRsg4+bBtscUWb731FjdChB1D+/btmUFWFxSuXL3itVdfaVm73uhaG/5aq8bTV1zXokUL9iL0cNttt7GZEN977718HviIUsVMEfx4FNvmFXMPiz3Xfhucazt2KFq+nPSiwlVvvvlmo0aNmGXok2fYpEmTGTNm0CEbF5ZJdgMs6isnT53atMmUunVnfDO0oHBVwcqVb7/7HjMpH0vuhUJugT3TxhtvPGnSJJgTPGcUxqZWyTwQAw6jGDZuniyQbNdhuQIUw3IFyOHQJjsuRjBPiafHesOkwI7z66+/JmJy1KonGZRm5LBxQ1lAD7y+6kq9KShA4qAxc0WkfhBxLHF9H8Nx8MEHH2zWrBnbrE8++YQ3j2a6Ep1r9bGlT0YFI7VmitemOedakmnS8G644QbF0Z577klEhUi1KIblqisuDRip1QYSNFTVBn1kZdwgtxgjw7oKH+ETTjiBZ4VYtpn0zz//fG5faUFpRj6rBzIXLFhw//33b7XVVqyLTIOcPVgCeQI61ypTJT4LJNsNsoqda01Q0qPmxWU7iKtHbWoFKIaNmw+jhCxXgGxGQVYOJViN/w5iRj7yyCMrVarEbbPiAuxbOd3y/uMzwylQjLbddlv9O6zaiRBhy88Zl7meRev2228/5ZRTzE8kqvP1TUwWzD4HHnggN8uHgQ8bd8GA9bNgxx577FNPPdVwk4Zly5StUG7NQQf1fPi5xzaqXqVcYfmlU6Yee+zRFYgWrZ49e+YRR/SuWLF8GQ6wq1dVqlShVq0aDRrUL5f5l+VXsmMru2ZVhdVli8pUCX5NRfbfDSymwjJFZcsVralUpsxS3mvskn8dP6579270w4Tz25RJPQ7sXrdObfZ/K1cur1ypIlNQq1YtypUvV6Fc5aobVK7XtFG5MmXLVyg6YN/9nn32ud1339088K+++opXkDMx6tevH9uCvfbai9bM6xdIaanixYPiE45lIhs1ahSsNwxWCUTMSqnJTnOrWuUiuaGi88svv7xXr16HBOoZyAASY6+88kqS/c8UQcTnl73jOeecw6HqjTfe2Gmnnbi6GYktggxJi2g2ZIn+9T7RX45H2YZomatQJUAwKwSWVpMgjhqYEWvVBRdccOihh2aeyFrZT0NAwj333BPfVbwYHp96Pik33XSTfjZFY3v00UffeecdnlLoI7JFMqsgSzUr34033sjHrW3btvQZ+qLr3jNviLW7E6xRNimx9DJx/q5Rowbl6lzKZlhSq96i2dDfUdyGL/Mcgwebg1EMyxWgGJYrQA6HNmma4IPHlKFZA0bs1IyLlchXq4Qb9FrIh5z1denSpeysiZOpeHCFYpc2LEBOPIZRKAcpGeXkTHYgjZ9xyrL0Llq0KHOHgbhxkjOALVpeWFC4etnCua22+HmjKqvGjStctmz10qXYosBm/rd06apFi5cvWFC
4bDn7lKLlKwqXrCpcXlC0ZO7s3gdPrLbh8vfeLipaoUtbwyhY8u03k+o0mLVzp6IFvxcuLShcUVCwaPHqJUG3nHRXrCxcSucZLlyydOmcubr06injp7bafG6TRkUzphYV8KiXBYfZ7IsIslXaeeedb775Zr0i+uMcSVfHSoaD8bhPKYaNmycLJNt1WK4AxbBcAXI4tMmOi5GYJzZjxowNg19VzeaSJ6mgBC9ZsuTuu+8W8/xZ8MaNG2e6Yg/37bff8hLAyMTFAtSpUyfNm5pPosRuiUuQr36Q4fHjx3OcpQe2vB9//DFpvJ/1Q/WoXr1606ZNCy6VyWe0I0aMYK9JmspNP8hh49rMWqifkEebbLKJRhU8ktXXXnstQd3Of/7zHyKqkiWTS+tvSQU9hV+Owbdp00b9x4hLHH300boLux9k3KDLYox8Zjz6OUf6zOwvypVj96MJMCjNSMmSYa5+zDHHkH/dddeRjGiiUN/YX3311eqBIIKZV3lL8AmlEJcnOXLkyHnz5vl3oZKnn36afngpZ8+ebYIIZl/FaLt164bLFekKO2HCBI65SpDEXGvy5MnsFzUexU1aPowSslwBshllF85YFfvS2SjTEHx41BrPcqMYiyuIZ6xTazPWb+KGYbncsN5nvB5YWrE0AbQiZVJCHFet+mCwwTzooIOUSVDlMBbXZ4EYG5rjsFyflYNw49nUcqcCNalVY0ZyseU4dJYvKFtYqdzyxQt23LHM5Anl9u22slrmH23mzE5t5ukFqTyLzH+KMjeeKS+3ZlW58jUKFhcOHrTs96V1Xnulcte9ypbL/pZmDYPnuGzY8IX7HVCjYkHB3rsVlduQM3D5wuyQMpbhkceJePXqChUr4Gd6Llum4tLF5Qd8sqpGnTrfDllTb+My5QqK1lQsKiyiV6rQpEmTeEW++OKLdu3a6YpBN9kZEJmIYQBrs9woxvolpWCBZLsOY3EF8Yx1am3G+k12XCyXFxdgEtxvv/2+/PLLypUrf/LJJ5wXObUoBzEVMlHqr47wpmLq18/fVqlSha54CQ488MAXXnjhgAMOoEP7ErAA+/XXX7Mem4+VggLDWJZ8rk7EDEBNFLLTfeKJJyhniv/mm28aNmzIbL711ltPmTKFtPr169t/XssdceLkXHj//ffrPe9fzjBWrs3M6Syx+r1RdMtVGBIJ6NRTT33yySeBOnXq8NBatWpFiaqwrLLbbLPNhRdeeOmll4ZeGovLCAcPHsw+RnFJaSZfyQyDt7qubvpBTprNch3OTPbBX6G55ZZbCG611Vavvvqqti8xLwpPfubMmS1atGCde+WVV3itlU9XPOHff//9mmuusf+8Fvv666+fdtpp+pUjPLc77rjjxRdfHDBgAK5zFyp5bu3vjXL+vJZLdOnShXfOM888w1GHOJtCXlAe/oMPPshUrMeL1CePlOvSCSX6kWl1pcvlw3KTMBZX4DAymXGiwFdmFl4LORnFsFwBimG5AuRwaBPAyxY0huw47Dhg8hHrMeIlfPnll9nksjvDDdL/kLmEzwLkxGMYhXKQklFOzmQHYvxYIroR04r7B/C/VasL1qzMbDgXL/lp67a/V6m6sGLlhRXK8b9FFcotrlg+gPKLKv7xv4WBXVCx/OzKFRdXKD+/csWfajVY/NmXXCNz4eCigtWFRQu+/eHHjRvNq1xhbmW6qrSwQoVMb0Hnwf+yvLBCpkO5XHFexcqzKm4wrvm2RbNmMNw1RSvNP6yn22Hfs+mmm/LZg01cgDQGyXAwIvcpxbBx82SBZLsOyxWgGJYrQA6HNtlxMYLNQ3vsscc4LzJ5sU4MHz6cp8piwzv/o48+YlLO/PaS4LFrhWAaxVVXrEYsgfoJUhRc4Y/LCVBwnYxgkyNA4kzeWimOxAsXLmQYTERM1mwInnrqqblz52J1SiPOuZYzDcPT+wH7zjvv/PTTT7Ddj+SwcW02f15L/4DeafTP6qgzOmJmJ6IqI9akl156idN/1KXlapDKURAFKcXyyTFNdhwZN2gvxshmia6WLl165pln8gzZmrAmcfVsm1UuQGJyJk6cqId86623cr8a+Y8//qhzv/68loev20HTpk3jpdG3aORzmKZcl3NEAtK5lrVWXwaYOO+06tWr87Tnz59v8sePH896/NZbb8k1cTGbsDfeeIOLmlswd5QPo4QsV4BsRkFWDhVbnI0o1mug1niWG8VYXEE8Y51am7F+kx0Xy/XZyQfQkCFD3nzzTXbl7AH79OmjH4ez06JYIMbmzMfK9Vk5CDee/VqfsXJhHM6alYrKrSkoWD5wYMVV88quqVxYrmqQ63Zr8ZoiXv/yayquLihbtKqgTOXK2+9YbuP6Zcpl05RfUFRYduGCgkFDK5ZZsaZsxaKylQorlClfWI4MWpHpX4yVW37NsjKrlxdWqlupy25lK1TmDbi6TJkKa/M5e7EGMKltu+22RxxxhPnbn3ZXPtv9i+VGMdYvKQULJNt1GIsriGesU2sz1m+y42K5Js4sfOihh+rvVTOHcp7YeOONx4wZo9+Zp6/yVEUJ0q/JFGseoJCIfQnTvxhr4mKBw8hv4gi7xx57sAngWgSrVq3KLLzddtv98ssvzPJEWIDPPffcunXrnnfeebiUUIiCNbE051rmazZzOtdys6+99tr+++9PE2vGnnvuqSM1ew6O104toAciqUkgxuIKfMY6+aGMjKtym+U6zFLEDpVXkw0KR8OuXbvmfDIq1AeNFZe3RL9+/Thks54NHDiQ9wY7Hh7LwQcfDHCOJx+Z26eWS/DEdthhB2zbtm2JILXqKiS/8MILJ5xwwgYbbMDwjj76aPZ8BDlMH3nkkezhOHxvueWW9EMh+dOnT6e3hx56qHv37upBXalbXRrX3JqdUGqWm4SxuAKHkUaVQxT44sYM5GQUw3IFKIblCpDDoU12XIxCOUj5g7HaMUnaQCkHmbQoFiAnHsMolIOUjHJyJjtQDBsX4Oi5MvOfgsKCgpUFq5cUrShcvaKoYJX+t2Z1gQGbi1ZzHC5Ynfk/nklBwaqVy1evKsr82z/ZbgWr1hQtL1hZtKJgCdtzDgCrMn/kSq36yXZlseUuW164fPWKVUvWrM4McfWazCCDbjODDfbLejmwkpqkUAYcRjFs3DxZINmuw3IFKIblCpDDoU12XIzsOE9yypQpzL8cepggZNnE3HjjjawrPF6OuXraHDGZN/VBoJbI6NGjmWqJqCvFxQLkxMUCZDPym7gKp2f9ng2mo5o1a7JmcAy64YYbKgX/yizLYfv27fUz9rw3GC2Tsg5bdj+Sw8a1mXMtu+rGjRtzmNtxxx232GKLu++++8EHH+QqzJjsRcyfCypftYggBzsepn26EiDlGEAOyxWgKEZ2vsPIZsSonn32WdYzHtfDDz/M89EfzCstKM3IZyw38swzz/DM9Rcg2daccsopPN5u3brxKAiyS9MpVm8SLKdeHUYp/+6776jSPon3CYvojBkzOLqQSQLicV1wwQXs8Bhbz54977vvvr59+7Zr1+6AAw74+eefSdCTpBxxXfY3LPbEeS
vS1e+//86SHPSU+TGOESNGMIDgDrIl+TNKyHIFyGYUZOVQscXZiGK99dUaz3KjGIsriGesU2sz1m+y42K5Pvv5iNvEmo2ViUeVGBaIsTnzsXJ9Vg7CjWe/1mes3Gx5xslgAV65spXWlMn8XyBa/PxMsMyawjUF5cqUK1O2Ag18FAo5bXiXIztTsqbM8nJlOSmXzWTxnqmgyyLTvxibvUSZsqv4/6I1nIGrZKKZKzo5tuvHfbZzxHKjGOuXlIIFku06jMUVxDPWqbUZ6zfZcbFcExcw/7777rsc1zjPbbPNNocffnjr1q3NZpy5jKYrr7ySaXfAgAEsb8ySnEUeeeQRDsTmZ/jJdLoVY01cLHAYhTZx9aFDhw4aNKh27dqcLFkFGRizKiMZOXJko0aN9ttvP32ruXz58ldeeeXyyy+/KJDGH3U5GCvXZi7HAsBZuWnTpiwMzz//PCc5Tv8suuxIOnfuXKVKFTMzqBaRyZ6A5fnaa68944wzQi+NxRX4jHXyQxkZV+U2yzUMfPnll7169WL8PJarr76avZRy1Gq6DWUADR48mPcGhdw7YokdO3Ysa/BWW2110EEHVatWjRyS2e5w+x9//PFnn32mv7TNKstmZdiwYRyI33zzzeuuu+7kk09mHW3evDklekQ8Sd5LtHJWXrJkyeabb96jR4+dd96ZdZ1WhoHUP0s177R7772Xi/J+OOmkkzgKc7Zmbf7888/ZGrLqr7d/XqubzSEKfJmFGsjJKIblClAMyxUgh0Ob7LgYhXKQUoxRDBs3lAXIiccwCuUgJaOcnMkOFMPGjWcUGvcZxbBxDQuQw8Y1LEB2HIVykJJRKAcpxRjFsHHzZIFkuw7LFaAYlitADoc22XExioqzzHDmQFk/EAnEmay7d+/O1MkBggSWOlZZJhTWIZNmujKAnLhYgGxGfhMWaWBihEuTglgDHIM46NSoUYNJHzfoJvJyQU9Z12YK2XnQIb3pojq06SpKC9Kz+cqkZOLEiRtuuCGHM53GkMlBpkSAHJYrQFGM7HyHkWGGxCmzSZMmTPQnnHACr6DiykcJGXGPiCcAY20pAfHGeOihh3j4XFQRzrWs0PrCmf0Hqz5Phri6Vf9IPWNpReZCEgmCqVOn6lzLksy+gU2VXibERqdPnz7sEQH1qZL8GSVkuQJkMwqycijBapwqVaq/v/SBB8xu3QCTNeeMTYJfi6ggc2j9tb8PXFV/knQ5A/YIsQ4wzqpVqzIj5zMkZkZ9kY7oUG7MuURp5HC2ZqXh0jHJ/zMxDA6CxxxzzJQpUzj39+3bVyfFUovbtIH+kSIAT6lSpUrcPvduZxL//fffL7jggrPOOsv8cuZQ6THGPDoSeMjspVhou3Tpwum2QoUKBBEvep3Mv/X5d/7Ltbx7s/8tLj3oVKlS/WPE51oTpdj5mOPSpOlME6KCknL+PGUvEygbCpQNBZLLIOUi5ZRUZrrXo7A7FCvNyE6glgigpr9EvEaIEx5HyZEjR2677baPPPKI+aWMoSI/SxHSG0M9ZO4zeBT2jcPkNGrUSE9AUlP//v0HDhxYu3Zt5WfbLCluegNQts0T59oXXniBbY3yicjKjSn8a5W5sVwKX2szDyNVqlT/IGk+tcGIjzxBffDtNE0icv9yMR6UmZ7WbhpKLXWFjCuIkTn7Jkn+U1UYfJd+3nnnffjhh82bN3/uuec46DM8lM2wxIC1GCAxNtuWQPbNUqifUZeIqLczzzyzWrVqp59++qJFi/SUlG8ryUMjhz3BYYcdxvn4hBNOmDZtGv0Tx5pyXXc9VPBIcih8rU2VKtX/N2nqXM+leY315m8x2nUu7pp779u3L0tsjRo1HnzwwS233DLbFibyUXASzkgRNZVCXNpfWljpH3jgga+++uqOO+5YHfxweLahhKKQnjnR9uvXj1P7xRdfvHLlymzbP0Lha23m9UmVKtX/D/GRZxpdsGABs5uWMWk9nA0Y3pw5c2bOnPkXDuyvfSZPPfXULbfcUqVKlfvvv3+PPfbIORhe1p49e3br1m3GjBk8vXwGP3ny5BUrVrAQ0gkr99KlSwH679ix4/XXX88aqX9AN5tdEqk33n4cZ5s0acKtffDBB9ddd92yZcu4BMrmra/SIOMVvtZq85IqVar/D2KmY2qrUKFC69atX3nlleXLlxNBoV9L/iViUmJGY+Z9++23u3fvzprxww8/ZNv+ZOnSmk9NRPA/E1fnyMhdf/bZZxz4CgoKrrzyysMOO8welS2NGYvee++9Dz/8cMqUKbVq1UryXa4j9YMdN24cay0X/frrr3///ffRo0d/+eWXxx57LCfa8ePHN2rU6Kijjvruu+/eeOMNnUez9bmkdxrDe/XVV9k6/PLLL8OGDatZs+aBBx7I5R577DG2ViSQybuRYahqfVPwpHMo+ygd6ZYE2HiWG8VYXEE8Y51am7F+kx0Xy/XZz5cbxVi/xLBAjM2Zj5Xrs3IQbjz7tT5j5caz3CT5cqMY65QIfMY6+QIx1nZ9Vg7C9dnO8Wt9xvolpWCBZLsOY3EF8Yx1am3G+k12XCzXj/ssV8nM4wIsYnYbNWpUp06dBg4c2Cb4TfoElS8WiLEmLhY4jEKbVItwo1jDQyw5AsTKQWton2Ks3DxZWrJkyXbbbceZsnfv3qGXxuIKfMY6+aGMjKtymNuH2V5wQp0+ffopp5xy3333aQy0SnY+oPVp7ty5++yzz8iRI1kUn3jiCeKmCotMic0CsfohIlGOVb6abLATABM0gEW4hgWylOhOWVbtznEvueSSH3/88f3339efGSvf7rZ0LDcJY3EFDiOTGaPwcy29pEqV6v+J+MibKZIzk35JMofIVq1abbHFFsr5a6WpDemozYBN5M8Wj2LixIn69UlcvX379v/7GZKbnTp16gknnPDbb78dcsght912m54ACh2MFipeyr59+/7000/k7LbbbtjQ5HjRD1a1YhPhPaO4WVYFJp6pzyUVCjRsvRUJwvPmzWNvgbt06VJOvWpaDxXcbg798RxtETSPABvPcqMYiyuIZ6xTazPWb7LjYrk++/lyoxjrlxgWiLE587FyfVYOwo1nv9ZnrNx4lpskX24UY50Sgc9YJ18gxtquz8pBuD7bOX6tz1i/pBQskGzXYSyuIJ6xTq3NWL/Jjovl+nGf5TrM6YHjLOenxo0b77TTTvvtt5+WN6WZcoEYa3fr9ylGoU2qRbjx7NfGM1ZuPjx79mweyKabbtq1a9eKFSueccYZWC0npClHjMUV+Ix18kMZGRdg+cEuXryYw/Tnn3/eoUOHt956q27duso0aXINUIWefvrp8847b+XKlbyCQ4YM4VCuVtO/GGv3A5t+nLjPAsl2Q1m1CNdnP59dzvXXX//YY49dfvnlY8aMueGGG2rXrs29KAdrd1s6lpuEsbgCh5HJjFGxAiNeJxWba8Sw3CjG4griGevU2oz1m+y4WK7Pfr7cKMb6JYYFYmzOfKxcn5WDcOPZr/UZKzee5SbJlxvFWKdE4DPWyReIsbbrs3IQrs92jl/rM9YvKQULJNt1GIsri
GesU2sz1m+y42K5ftxnuQ4zNQ8bNmzWrFlM6xtvvDERJciacoEYa3fr9ylGoU2qRbjx7NfGM1ZuPsxkOHr0aE6H2267bcuWLQkSYblVmnLEWFyBz1gnP5SRcbkQdvny5aeeeupLL73UqlWr119/vUWLFiYZ0CVUgkUrVqyYMGHCvffe+9xzz3G0JbLZZpsNHz5c/5KxSTaMlWtYIMaG5ogFku2GsmoRrs9+vn5j1BdffMESu/322+u3RZpdDtbutnQsNwljcQUOI3OUj1GxAqN0rcX6JYYFYmzOfKxcn5WDcOPZr/UZKzee5SbJlxvFWKdE4DPWyReIsbbrs3IQrs92jl/rM9YvKQULJNt1GIsriGesU2sz1m+y42K5ftxnuQ5jjfSVHUFJOVhYIMaauFjgMAptUi3CjWe/Np6xcvNkzYcmCJjTlXLEWJPmM9bJD2VkXGDVqlU333zzHXfcwSGPgx0bIC5NnFZykHJYX7EssVOmTPn+++/ZGbBnokmZPXv2fPHFF83fjiWo/sVYuYYFYmxojlgg2W4oqxbh+uzn89hNGhKvn39eW/q11lzGXCOG5UYxFlcQz1in1mas32THxXJ99vPlRjHWLzEsEGNz5mPl+qwchBvPfq3PWLnxLDdJvtwoxjolAp+xTr5AjLVdn5WDcH22c/xan7F+SSlYINmuw1hcQTxjnVqbsX6THRfL9eM+y3VYCcY1YBgLmzQnLhY4jEKbVItw49mvjWes3HwYmbMHogmWKzZxLK7AZ6yTH8qIZRWr2gceeOCyyy5jKYVjJnQly0qmQ6oeeuihE044wZQ7l8bKNWz6ceI+CyTbDWXVIlyfQ/MNm7hpwubPcpMwFlfgMDKZMcq9GqdKler/j5g1NHEIJDX9v1X2KVjKNvxp0rfH77///rXXXsuB1QSjxLxvT/1ILuvrNtts06tXLwX/Xso+6+LKtv0NVf7666/PoiXzstmvXxSjnGk5E6SSpiXJj2JU6nI7iJLUotCmJLU2o5KW5MOoRCVRCShnrRTalKTWZlTSknwY5UzLmSCVNC1JfhSjUpfbQZSkFoU2Jam1GZW0JB9GJSqJSkA5a20NGzbsqKOOmj9/PutlaEJOUbjttts++eSTjRs3tlepnMNwLpczX8qZljNBSpK2rhiVqCQqIckmoNhB2IigitUaz3KjGIsriGesU2sz1m+y42K5Pvv5cqMY65cYFoixOfOxcn1WDsKNZ7/WZ6zceJabJF9uFGOdEoHPWCdfIMbars/KQbg+2zl+rc9Yv6QULJBs12EsriCesU6tzVi/yY6L5fpxn+VGMTamXCDGOjkCh1Fok2oRbjz7tfGMlZsPy03CWFyBz9iYWsNodfBvGvbv33/UqFHTpk37/vvvu3btav601elWrHLDQLVq1XbYYYfdd9+9SpUq+ieb/nbfIcczVm4+LDcJY3EFDiOTGaNiBUbmMuYaMSw3irG4gnjGOrU2Y/0mOy6W67OfLzeKsX6JYYEYmzMfK9dn5SDcePZrfcbKjWe5SfLlRjHWKRH4jHXyBWKs7fqsHITrs53j1/qM9UtKwQLJdh3G4griGevU2oz1m+y4WK4f91luFGNjygVirJMjcBiFNqkW4cazXxvPWLn5sNwkjMUV+IyNqTWM9Oe1aPDgwWefffbSpUtHjBhRtWpVBZ1uxSo3DFD12Wef3X777bvtttutt95KMF1rfZabhLG4AoeRyYxR+h1yViViO4iS1KLQpiS1NqOSluTDqEQlUQkoZ60U2pSk1mZU0pJ8GOVMy5kglTQtSX4Uo1KX20GUpBaFNiWptRmVtCQfRiUqiUpAOWsdNWrUaPz48ePGjTvttNP0t4yQPa0bdoJFRUUrVqxYuXLlww8/vO222+611152Qs5hOOPJmS/lTMuZICVJW1eMSlQSlWA/3iiF/2wUlalSpUqV6i+Rfn8IJ9HatWszITOtm5ndnuKjRGHNmjXbtWunP6lFBAWp/gzpsccrfK3VS5sqVapUqf4SaQbnOCvGfvvtt5988smQIUMWL16Mq7k6SsESkDngmq6AVH+S9MzjlZ5rU6VKlWq9lqblV1555fnnn69Tp07lypWzDanWD+kFilf4WpsqVapUqf5amTNTYWHhyy+/vNFGG/Xv37958+aVKlUiuGLFiscee+yJJ554fK2efPJJ7Ouvv85xNugg1Xqk9GejsioR20GUpBaFNiWptRmVtCQfRiUqiUpAOWul0KYktTajkpbkwyhnWs4EqaRpSfKjGJW63A6iJLUotClJrc2opCX5MCpRSVQCylkr2e7gwYO/+eYb1tqpU6deeeWV5p89QMuWLbvssstGjhw5fPjwH374YcSIEVh4wYIF3bp147BFFUtvkyZN9t57b5VIOYcRM54oRjnTciZISdLWFaMSlUQlJDnaFvvBZSOCKlZrPMuNYiyuIJ6xTq3NWL/Jjovl+uzny41irF9iWCDG5szHyvVZOQg3nv1an7Fy41lukny5UYx1SgQ+Y518gRhruz4rB+H6bOf4tT5j/ZJSsECyXYexuIJ4xjq1NmP9JjsuluvHfZYbxdiYcoEY6+QIHEahTapFuPHs18YzVm4+LDcJY3EFPmNjag0j43I87du3b79+/Vq1ajVr1qzPPvusXr165vchr1q1atCgQSyodjlQs2bN7bffniW5oKCgU6dOe+yxx+23364m5TiXxso1LBBjQ3PEAsl2Q1m1CNfn+FqfsXLzYblJGIsrcBiZzBil3yGnSpUq1fqratWqsdyysl588cUrV65kAUZM9IsXLz7wwAM5whp1D3TBBRewAJNDrdYDcaq/VunPIadKlSrVeir9cz3NmjV74IEH3nnnHU6oOsiuXr26Vq1aM2fOnDNnzvy1mjt3Lu6HH37IoZYclljALLqp/jxp3YxXsYOwEUEditUaz3KjGIsriGesU2sz1m+y42K5Pvv5cqMY65cYFoixOfOxcn1WDsKNZ7/WZ6zceJabJF9uFGOdEoHPWCdfIMbars/KQbg+2zl+rc9Yv6QULJBs12EsriCesU6tzVi/yY6L5fpxn+VGMTamXCDGOjkCh1Fok2oRbjz7tfGMlZsPy03CWFyBz9iYWsNILmskC+ppp5329ttvjxgxokGDBueee+5TTz3FGbdPnz6VKlVi0SWZ1RSpnBJqJVxW31122aV58+avvfZa1apV9e8AIlrtS2PlGhaIsaE5YoFku6GsWoTrc3ytz1i5+bDcJIzFFTiMTGaMihUYmcuYa8Sw3CjG4griGevU2oz1m+y4WK7Pfr7cKMb6JYYFYmzOfKxcn5WDcOPZr/UZKzee5SbJlxvFWKdE4DPWyReIsbbrs3IQrs92jl/rM9YvKQULJNt1GIsriGesU2sz1m+y42K5ftxnuVGMjSkXiLFOjsBhFNqkWoQbz35tPGPl5sNykzAWV+AzNqbWMJLLwjlo0KDx48fjslJ27979gw8+WLZsGUts06ZNd9ttN3JoIh9AKsTqILtw4cJ3332XZbigoKBy5cq9evWqUqUKcWRfzvRgs0CMDc0RCyTbDWXV
Ilyf42t9xsrNh+UmYSyuwGFkMmNUrMCIV0vF5hoxLDeKsbiCeMY6tTZj/SY7Lpbrs58vN4qxfolhgRibMx8r12flINx49mt9xsqNZ7lJ8uVGMdYpEfiMdfIFYqzt+qwchOuznePX+oz1S0rBAsl2HcbiCuIZ69TajPWb7LhYrh/3WW4UY2PKBWKskyNwGIU2qRbhxrNfG89Yufmw3CSMxRX4jI2pNYyMq3Kb5frs58tNwlinXCDGhn6t3l0AAGWvSURBVOaIBZLthrJqEa7P8bU+Y+Xmw3KTMBZX4DBiT5OlaBUrMDKXMdeIYblRjMUVxDPWqbUZ6zfZcbFcn/18uVGM9UsMC8TYnPlYuT4rB+HGs1/rM1ZuPMtNki83irFOicBnrJMvEGNt12flIFyf7Ry/1mesX1IKFki26zAWVxDPWKfWZqzfZMfFcv24z3KjGBtTLhBjnRyBwyi0SbUIN5792njGys2H5SZhLK7AZ2xMrWFkXJXbLNdnP19uEsY65QIxNjRHLJBsN5RVi3B9jq/1GSs3H5abhLG4AoeRyYxR+vdrsyoR20GUpBaFNiWptRmVtCQfRiUqiUpAOWul0KYktTajkpbkwyhnWs4EqaRpSfKjGJW63A6iJLUotClJrc2opCX5MCpRSVQCylkrGTdJfhSjUpfbQZSkFuVMy5kgJUlbV4xKVBKVkGStLbY4GxFUsVrjWS6MlWDzLbT+3J6gMmMYq1pFHMb6TXZcLNdn5ehPMtD8+fOXLl1aqVIl/X01IqZPMRa3sLBQQSLcCJYelInMjySY/BiW67NyEG48+7U+Y+XGs9yYHMSdzpo1q6CgYMMNN6xRo4a+JCGuHDHWKRf4jHXyBWKs7fqsHITrs53j1/qM9UtKwQLJdh3G4griGevU2oz1m+y4WK4f91luFGNjygVirJMjcBiFNqkW4cazXxvPWLn5sNwkjMUV+IyNqTWMjKtym+X67OfLTcJYp1wgxobmiAWS7YayahGuz/G1PmPl5sNykzAWV+AwMpkxWvd/v5arMk0vXrx44cKFrGf2gNYf9e3bd7vttjv33HNzPqNFixaxMGO5HbRkyZLVq1ebNfufKh4Lq2zPnj15Ss8//3w2mipVqlSpSqV1+fdrTe2kSZN23nnn7bff/rTTTtPfBltPpOFhly9fPm/ePKwiMTrvvPN23HFH7kWC99xzzyuvvJJ7zGb8E6XjO3uLBQsWrFq1KudTSpUqVar/t2KGzKnwtZZjTSlEoS783HPP/fLLL1OmTPnwww/Hjh3LQbB03aoKZf0I6aIoJpm4ErL+2s6zTrTmzJkzceLE2bNnk0wP06ZNGzp0aL9+/Tp27Pjzzz8Hl8181ZBTuhzJAkXUhEwwf4V2RYRLc1RdtmzZ4sWLs9FoqQdVCcSygiCxmIKsjMQKlk7qR8qGUqVKlWq9VHaqitU6/g6ZqzKVv/baa/qzzEWLFr3++utMzaX40lUTenLF3LBZG4wtaed77LHHyECff/75kUceyd2x+t5+++2lvi97ACUdTLz8/iUid9xxx2677XbFFVfkHDZPSQ9KLEBiO2IrU1Oqx+vL9JB/V6lSpUr1l2sdf4eM/frrr3/66aeqVavWqlUL98knnywsLDQ/WZNcVHEOA3IWkkAyAkKTFSSBE7ZxQzNDRSbj546qVau24447Pv7449tssw2LyqhRo/Qb1JKITnRFMwa5Yu5U8fzFOrpq1aqsY4kBT5gw4Ycffpg5cyacjeaSBhkMNiP1LA7aXXF1nrPW8myoVGKE9JN+fZ0qVar1X8GMmEPr+Dtk7FNPPYU98MADL7vsMiKTJ0/+8ssvNfkGWSGiieGSg/3iiy9uuOGGrl277hqod+/egwYNsmdwgLPydddd9+KLL5I/bNiwk08+uVOgW2+9ddmyZebOAZLRV199deKJJ3bs2JFT3QUXXDBu3DhdVFfPKWVydSx9VqxYcYsttiDI6gsHKa5II3/q1Kk33ngjQ50+ffr8+fM5VmqcDHj8+PG6qY8++ohnxdi6dOnyyCOP2L9bnAX4rrvuonzw4MG4SJ0vWLCAIE+JZ6uIbnbKlCnXXHMNXfHcdJWBAweq8Pfff7/22mt5VmRyOr/++utx7733Xi6kHkJl/gEv0t54442DDjqIbvfee++XXnqJkdOtmuAlS5a8+uqr5513Hk+YAZAGc9e6Ojdy9913c1HeCZSoc42Z8rfeeovbefnll+lHQUbLc9tzzz3pbb/99uMdtWLFCl2RfNNDqlSpUq0PYlLKrcxc6Il5jUkNATkZGWYJ0d8P+fTTT6dNm7bBBhswjl69enFA4dymtKA0I5uZjtFrr71Wvnx5SurUqdO0aVPKK1SoQISJ3vSATjnlFC5x2GGH/fvf/65SpQqMSCOZRdpkAohZm1aa6tev37hxY4DOt9xyS+69W7dupAUD/+MWUOYagcuQ6JDMHj16wHSL5s6d26xZM/q58sordSHJrlUmqxrrceXKlbmvJk2aaBiMEzVq1Igl/6KLLpKrVY3WK664QrXB8yjgIRC/7777iJhLTJo0iUfEjetf1NJFhw4dWrduXZJ5/gxvww03pNubb76ZJ0Drd999l3lHBNKFAE7nXEJ9koMMczl2LTwlkm+//fYjjzyS3ihhnLoFFk5qzTjPPPNMgiQ3aNCAjYiYp/3rr7/SyhhOPfVUInvttRerprkc5cuXL2/ZsiU9s37jkvnNN980bNiQ5Hr16rVo0aJSpUr0dtRRR2m51XOQTD/BqIsximHj5skCyXYdlitAMSxXgBwObbLjYhQa9xnFsHFDWYCcuFiAbEahTYCUkzPZgRKycfNhlJDlCpDDcgUoipGd7zAK5SClGKOEbFzDAuTEfRZIthvKgBTKQUpGCdm4+TBKyHIFyGaUXThjtY7/vJZFhdmzdevWO+2008Ybb8yJjUlzwIABHLmYT7NJYaIVMY+z/rGyMuGykHzyySfNmzen9T//+Y/SJCXT7dVXX81q9NBDD3GAYz3gfigZMmQIrZl7K1fus88+u+WWW3DPPffcr7/+mj45R7KE/PLLL9m+koknK8s6x7qCZUU555xz1OqLq2cpqDrjjDMWL1581VVXMYa+ffuyYnHmO+644+6//36Obu+88w7j3G677XjNOB1yAqZct2D6MSBl7j9YNeXSysGR826HDh04BKNvv/32vffe42hIDt2yaPGIDjjgAFzOnV988QVn/ccff5xO1EOo6JaEe+65h4WQY/ebb775/vvvb7/99jSxy9FPcauHatWq8Vh4RXjheP4c0NkncTzlpVTOIYccwovLwCZMmEBEQfrnxidOnMhbhfMrQfYxPJYZM2awuNIVg3z00Ufp/IUXXuD4S4LzHFKlSpXq7yEmL18sDwZyMmI2x2Wu1x9k3nbbbZxmOIIw3bOucBUiypGVKMzS2jhVAIU6vmD79etHOavjkiVLlIY41xLkxNO5c+eZM2eqiiWKox5Xf/rpp5XGYrDPPvuQyRqzcuVK0znxgw8+mHj37t3pU7dgAMFySaaWTE5aHKM5lumcetJJJ02dOlV
HRnJMvqmV9OfWLG+bbLIJKxDJBLn67rvvTp8M9eyzz166dKk6YS3hAFe9enXz9TID1pfVLMlBf9lLsDgR5GbpU0HOfDvssAMdkklXSH1KYuwJJ5xAYc+ePXEpVFx9Bt3/cQs0MU6da7mFiy++2IyT11THYp4AEQV5FFiEy6NGrJ3U8koRoUO2Gm3btuXq1113ndKwDFs/aHb++ecrctddd9Ez1+XVJKIgV6eQTRjd4gYjzcgftmEUw8bNkwWS7TosV4BiWK4AORzaZMfFKDTuM4ph44ayADlxsQDZjEKbACknZ7IDJWTj5sMoIcsVIIflClAUIzvfYRTKQUoxRgnZuIYFyIn7LJBsN5QBKZSDlIwSsnHzYZSQ5QqQzSjIyqHwc222sSTiepxaRo0aVbt2bWZzOmEO3XXXXbfeemvgueeeYzrOpoaJGVYlwfUzIohlohdLQUtWm2+++VNPPVWnTh2qaGKV4nhE3KSxDHOQpZVDEsuD4lyIxXKzzTaDMz0mEJmzZs369NNPOSwy3XOnrIKcveiKxQCbzQuTrnjfffexFsKILUKjRo2ATsEfMDMYmBzWYwZJ//SpQlkp6CwrP8I98hwIslZxywxPCXSLYFlGbuIAAoIOwqWcvffe+/rrr2ecRMjnsZuHqYgkF+kLZM61mfdgcEXE66h3BUdk7lGZv/3220cffcSxVd9JkKx/mYQrqlxr7S677EJw7Nix+sP4VKlSpVqvpFkuXuFrbXb6TCblMzM+8cQTzIn77rtvs2bNFKlRo0bv3r1JYKLUHy6a/CiRM2HChFdffbVv376XX375iy++GJrPhbbffnuOm7ASiOgnn4P2jH799VdmZxaJ1q1bk0OC4sqPH4YtMjml/f7775w4uUd2D88///yee+754Ycf0sRtZvMixNrTtGnTrBP0xj1imzRpooVWLxX9YLNJlpIEWfzOO+881ifOmhy+DzroIE7JnBrp2RleaG/x4gVlpRTTYc2aNQHTDxEEsJcaPnz4M888c/PNN3MSHTFihH0tmAWV++XF/eqrr9T0+uuvc35lF9KmTRtc1uAxY8bQG4Pv3r07B1kE6Fd20z9HbV5Eu9tUqVKl+svFpJRT4WtticTkiJ0yZcpnn33GKWTkyJGHrdWhhx6qv19L/Nlnn1V+qFh+WBU4LJ544okdOnQ45phjWGuZczVl+8pM8MF5Dja3quXKaOHChVjWCUQyrUighORivUQc2RkYB1xGuHTp0lNOOWXy5MkaQ5QYmxmnZFyGEbxMf6xbQfsfIhPZLIVmssQ+9thjnNd5kp988glHeVapX375hWvRms0roRghvZkrCpynp6YhQ4awHencufMZZ5zxwAMPfPzxx2xNlCCRxoF411135cytH2NGLMw0HX/88RWDH+eG2R9gWYC///77YcOGYVm/p0+fzpNnnaa1FK9dqlSpUv3lCl8nmFWTS/Pva6+9Nm/ePGb2n3/+mTXyjbVixtQ0+sEHHzD/kh/aP8ElS5awQjz33HObbLIJB0cOkaNHj77jjjtixqOekYCIrMSosJyHODCRwGKJJWKqgqzcUiZWhZyeb7nlFu505syZAwYMiO+HFUXLQ9a3ekPqUKMCgvY/FKRkpB50O4g+cakyEZi7O+SQQ7799ttHHnlkjz324K6//PLLww8/nBeFZKWhTHexA3ZkkrmE2THYcYAVvWfPnl9//XWXLl1YZdl1/fDDD/rzWjuNBfXoo4+mn/fff3/OnDkMb+zYsfXq1evRowdBErCVKlUCLr/8cvr89ddfZQUs540bN7b7TJUqVar1QUxKORW+1jLrJRf5y5Yt02+ov/jii5lAR40ahR03bhwWMfuzEsyePfu9994Lunf7J8JwmUy/+eab6tWrs9wedNBBAE26DaXZCg0iO848jstCqzUeKc4SZZarJPIzN95442rVqhFnJcuGSqicVydB9440WpYZ3QWPV0tONnVtbwTZB/Tp0+edd9659tprK1So8NNPP7H6+pnJFZVv4sBTTz3F2tm+fXveAx07duS1VitWIDHy/fffn0fHHoUlmVeZ+zriiCP0QiNOrltuuSVpjJnHW6NGDW4HWzMQTI7TZ6pUqVL95dK8FK/wtTahzLlq6NChI0eOZEI87rjjmjVr1sJS06ZNOfQQJPnpp59evfZ3IztinZg6dWpBQUHdunU5vugMSnz58uWAVkddS1JVvLj0RhttxBX1PTaXUD8s6u+++y4JybtCZFKOGM9///vfRYsWEdxwww2JKCFGoVeJCkowA2YdAtgrmKuw0OrvGpHjvMamqkqVKmeccUb9+vVxZ82aFfSX/YNbIhx5YQUzZbmUM23ChAnkNG/eXN/VK5+riLmWLkeQhXbvvfcG7rrrLjZeG2ywASfvTBeBeNHZY5HJSsw7Khv1pK5SpUqV6m+k8LU2M0cmkMnkZIPdc889W7VqpQldcYTLGsxyCw8fPvy7777z+1eEYyjJU6ZMuf/+++fPn88C89hjj91888204k6bNk1TNjmmxJcdZ2bX3yh95pln+vbt+8svv3DUvvvuu3v06DF37lwykXqLknLQkiVLxowZwzrHKnvDDTdwfKeQhXbnnXeO78EXvRnrK7jaH03adrzyyiv6Y0vOgt26dVu6dKnOjnogiOExpM8++4z7Ym+xYsWKN954A65YsaL+FhbJWB4IT4Nl7KuvvmLPwc7G2cH4UqE9JMlEABZ1xsPa+eGHHzK2iRMnXnTRRZ9//jmFY8eO5SoMSZlEOHZjR48ePXv27LZt226//fa4pqtjjz22TZs2bLCA1157jQ0NI1Sfb7/9NsAtm/xUqVKlWh/E3JVb2dziMpM4EMVMgsyhWObBBg0a0NVbb72loMnEKodVqlKlSsySZ599tpMgkcPkyzzLYsAKwSFp0003hffZZx9OuhRut912LJmkIf392t69e8PUBiPKdMW6QpzTc9Bl5tI///xzkyZNKgR/HzT4JrImfXbs2PHoo48mk3WLNFMuQEF1xqV/cqilCrGiIIAIg3zggQc4vdm3Y2ol/f1aMlkpFVHCEUccwdUZg8YvsQRWrlyZp8SGgDhLFJ2/9NJLBLlojRo1NtlkEy7NSZ0b5I44vH7zzTcU0ufChQu32morghzld9llFxYwjZOFjfWJ3iQuQT/Ea9euTdqZZ57JVYJR/3HLYpJZ8LRz6tevn900efJkborxs/tRt4MHD65WrRrdEt96663pHDj44IO5ccq7d+/+/fffk6ZuGWrr1q31DB955JGgg+yLiOVhDhkyRL9jiwReUF6sdu3acdebb775zJkzzdNGKhE4jGLYuHmyQLJdh+UKUAzLFSCHQ5vsuBiFxn1GMWzcUBYgJy4WIJtRaBMg5eRMdqCEbNx8GCVkuQLksFwBimJk5zuMQjlIKcYoIRvXsAA5cZ8Fku2GMiCFcpCSUUI2bj6MErJcAbIZBVk5lNd3yMyG2JEjRzZs2PDAAw/cY489iDBLmiYBl2nWrNkxxxzDpMnhjElcTbYYLvMppzEOnfS2cuXKzTbbjLn4nXfeufrqq5l858
2bxyGJTHqjiaVXf7NI5dKWW25JnH6yfpkyrBYfffTRIYccssUWW2i5Ov300zkhMRL1kM2LFqsXyYhDGLZ9+/aHHnro9ddf/8MPP+hXRWbzwsRySBUX2mCDDbKhYPwsGwT1eyqMyNElWFx1X2TyNB566KEWLVpwhiahZ8+eAwcO3HvvvZWpb5gRF2IT07lzZ5Y3lkDWJJbbBx988OGHHybCsyWHPhn8m2++iaUrXgjWLZWHinzWb67CpicbCkSHuimWdlzSuNbrr7++4447sivixd11110//vjj559/nuW2UaNGY8aM4Vrci5IZ86mnnkq3u+22Gwl6mwYdZ1qx9MYe4vLLL2fjxXl9ypQpy5YtYwk/66yzGDYPXLeTKlWqVH8jZRbCLFpiOtPEZ6bIGJYbxVhcQTxjcTlpcfRhALVq1WIJIcK5Z37wawvr1KnDcU1pJt+uxcp1mN4WLVrE+s15ixWXIKLV9OMz1uT4LBBjc+Zj5fqsHIQbygBLDmINZhuhKr8fxINiWWK1Azhc6osE5WDFPAqOywsWLGCl5FGE5tgsN4qxuACiW14mBlmzZk1eJiKs5eyQuIQiqjKX8Psxrvrk9eKdALA888KZPZzybVatzXKjGOuXlIIFku06jMUVxDPWqbUZ6zfZcbFcP+6z3CjGxpQLxFgnR+AwCm1SLcKNZ782nrFy82G5SRiLK/AZG1NrGBlX5TbL9dnPl5uEsU65QIwNzRELJNsNZdUiXJ/ja33Gys2H5SZhLK7AYRR/6JKKFRiZy5hrxLDcKMbiCuIZi6uI3SThcj8mzQDWZrkOG1dSkN6wqkUOY+WGskCMzZmPleuzchBuKBtAJm5eWlwFFWcpVROu/biwMK1yEYsfLqDjKaAch+VGsQCZEgCZfmwQ2xFAjLVdmO2CIgwSkHCxyrdZTaH9hDLWLykFCyTbdRiLK4hnrFNrM9ZvsuNiuX7cZ7lRjI0pF4ixTo7AYRTapFqEG89+bTxj5ebDcpMwFlfgMzam1jAyrsptluuzny83CWOdcoEYG5ojFki2G8qqRbg+x9f6jJWbD8tNwlhcgcPIZMYo92r8P5ZuQKuCc6vrRHSFkjya9UdaFCVGHjP4hPdl9hlmzc5HGpgBurVfPqyUSS25nB1DqlSpUv0dlfnH0bJoycxr9gQXxShnWs4ECVdzq6ZXATIcZJWsW7EpN4CSl0uG7SBKUotCm3LWasAodNWx02QlBSU/x1ij0D6lqCZx5mJru7LBMBLH92PkdGvAqET9SH82o5xpOROkkqYlyY9iVOpyO4iS1KLQpiS1NqOSluTDqEQlUQkoZ61k3CT5UYxKXW4HUZJalDMtZ4KUJG1dMSpRSVSCM0eFqthB2IigitUaz3KjGIsriGesU2sz1m+y42K5Pvv5cqMY65cYFoixOfOxcn1WDsKNZ7/WZ6zceJabJF9uFGOdEoHPWCdfIMbars/KQbg+2zl+rc9Yv6QULJBs12EsriCesU6tzVi/yY6L5fpxn+VGMTamXCDGOjkCh1Fok2oRbjz7tfGMlZsPy03CWFyBz9iYWsPIuCq3Wa7Pfr7cJIx1ygVibGiOWCDZbiirFuH6HF/rM1ZuPiw3CWNxBQ4jkxmj9e475FSpUqVKleofpvXuO+QsJUtLkh/FqNTldhD5OfoDS0d+mrECW3bEaY1qimGNx/y5r4kLUBSjJGmGoxJQzloptClJrc2opCX5MMqZljNBKmlakvwoRqUut4MoSS0KbUpSazMqaUk+jEpUEpWActZKxk2SH8Wo1OV2ECWpRTnTciZISdLWFaMSlUQlJDnXFjsIGxFUsVrjWW4UY3EF8Yx1am3G+k12XCzXZz9fbhRj/RLDAjHWzxEg84esitgsy/pHDlZ/T0YJWCXYHNWPzVi5NpslFldxrmXYz7dZbhRjnRKBz1gnXyDG2q7PykG4Pts5fq3PWL+kFCyQbNdhLK4gnrFOrc1Yv8mOi+X6cZ/lRjE2plwgxjo5AodRaJNqEW48+7XxjJWbD8tNwlhcgc/YmFrDyLgqt1muz36+3CSMdcoFYmxojlgg2W4oqxbh+hxf6zNWbj4sNwljcQUOI5MZo/Rcm1WJ2A4iP+e7774bMWLEypUr9UuJJSdNr9a8efMGDBgwfvx4/bKkbHNYn0ZRTTGMvvnmmx9//BGoW7euiQtQFKMkaYajElDOWim0KUmtzaikJfkwypmWM0EqaVqS/ChGpS63gyhJLQptSlJrMyppST6MSlQSlYBy1krGTZIfxajU5XYQJalFOdNyJkhJ0tYVoxKVRCUkWWuLLc5GBFWs1niWG8VYXEE8Y51am7F+kx0Xy/XZz5cbxVi7xFiCJk1fzM6ZM2fSpEmVKlXaZpttKlasqMxDDjnkzTffPOOMM/7zn/+YfJXbTA+DBw/ebbfdOGhOmzZNv42SVqz6sdmpDWWsXJu5SkFBQefOnYcMGXLNNdfccMMNOukS/OGHH7h0y5Ytq1WrpsOuUys3irFOicBnrJMvEGNt12flIFzDSAmLFy8eO3YsvPXWW2+wwQamxO9HjDU5+bBAwtX7YdmyZaNHj4bbtGmjX2aJyFRtPGPtbh3G+k12XCzXj/ssN4qxMeUCMdbJETiMQptUi3Dj2a+NZ6zcfFhuEsbiCnzGxtQaRsZVuc1yffbz5SZhrFMuEGNDc8QCyXZDWbUI1+f4Wp+xcvNhuUkYiytwGJnMGKU/G5VITJ2aSW3xfDmJfvzxx7vuumufPn2YZLMNax99zhdAPRjOmV86qWckNnbmzJm77777XnvtxdqgyN9RvC7jxo1jv9K1a9cpU6Zko3+dfv3113333ZcHO3369GwoVapU/+8VvtayYqdyxOHpt99+0z8QlA0F0m8V1mKcDQVaDx+jPSQYFa79ByScwf9dxC2wSwBWrVqluyCipv+9zD6mIPinjTLPN/0opUr1/0D6sMcrfK1lvkgu8s1MDRARqyvDmgdlJVXZLFeAFGQONTkSQcfFEsnUrGW5hhGuRgtgg7FnhmpqM0mBYCMT6dSp0xZbbPHEE0+YIIIPO+ywGTNmfPHFF+Y3LUu0ytpj070AWKRMX2o1aQKzIsLZvDCRYD8BI5qcwk033ZSz1+TJk3fcccfya3/RtGppzdQHETjoINNkW5MgQGoyIMbayXKxxkU2qzUYYEZqyhRbo7JFsE2bNpzRx44d26JFC0UkcyE4yM0+Q0UkOHNVS0FiRnDUYJw0I3KwBAHnQqlSpfqnKpgJcmgdfIesKQbQQsKOfvny5QsCrVy50sxupImxnEIWLlw4f/78FStWmATlSEToh/OB8kmbN28eJQSJSLQuWbKETsw/HESQWgaDVbcwlipq6UGXw8WqxCQTXLRoEb3pKiongVZcLLJ7louqVKlSs2bNWrVqweoTGSaN5KVLl9Kz/vlV1apJOb4o10W5tTlz5vAkzZgpV06UlIBdtmwZhbqofy0uUa5cuQ0D6XtschBXsaV/AIBHx9NWjuKc8hkYz19VQZfZO8JyRT1PcsxLjDU5iPcAOeYlZqhz584lmSZck2ZqeRR6d
fSmookc7gJboUIFXoKNNtrI/uEy3TXiQoyWznkgBNWkHFrlMhKEC5Ovd4IGo2CQnn2HY7lxOiRN/9JD5jJrPwUaEiX2YFKlSvX/XOHTgeaOhNIUAzAJPvLII/vss0+dOnXq1auHbdy4cc+ePTVDaerhIHXxxRdvueWWJGy88cacFA8//PChQ4dSTpq6Qv/5z3/222+/a665huns/vvvb9euXf369Rs1anTooYdOmTKFtG+++eaggw4i2LBhw1atWj322GNMkcS5Cl1xyqH8gAMO+PHHHz/66KMuXbpwuQYNGpBJn4xTaYhrff/99+ecc87OO+9MTt26dWvXrr333nt/8sknZjBvvvkmkUmTJpH/0EMPweiuu+5S64cffrjvvvuecMIJrGrqU6IJ+/vvv5955plNmzbl6oyWQ/C3335Lk25Wmb6YqadOnXr66ac3adKEG+QAyi18/vnnxGlVz1Ei57fffuOOuCi3w6vAcxgyZIiev3IEPCVGTuvo0aNxdb9vvfUWryAvCs/z6aef3mWXXeiEkethkvPrr78eccQRDIk74uXjOXDj6lB3xAA45fPqMHJE2kknncTp2X7mdMILdPDBBxP/7rvvjjzySDrcZJNNeIF46ZWjZPqcPXv2FVdcsfnmm3MvvDq8bXhTETQ548aN40a4Iq+RCTIM3jx33nknt0Aht0Bh165deTX1GJGSiTAYnjYr6KuvvtqpUyfyueuOHTv+97//Jdm8Upz+R44cyWi5LzpkMJykL7nkEvrR01MOd8ryv0HwDymqMFWqVP9g8UnPrWxucZnJCEjCrBwcOJjs9I/GcMhg7mOq4qjBAsZcTAL66aefmExJYBrafffdu3fvzmzF3MTRimmdIw5S5qWXXkp8jz32OO6444Dq1avTFcCMtv/++3/66afMhhUrViSof9O0atWqrKk6nVA+ceJEXeW6665jMJUqVeLciasejj32WHPERFyCYTOSPffck871VTB21KhRdMWQ+vfvr8sR5xRLVwz43HPP5VqUP/nkk1yL3QMnM/NkWAwYFZPyrrvuCtCb/l05emCaHjZsmO6UzEGDBhFkALNmzcLN3Hxh4fjx41u3bq3k3r17t2zZkgQ6+eCDD1gCSdCTR5TIqhbLXmT77bentlq1ap07d2bl4Map1c/EXnvttUpm8GQyNh7O4MGDqQ0e3mr94HSzZs14dDRRy6qje99mm21Ys1nCuREegv71XJoeffTRzKCDHhjeSy+9xOvCgBkGewsdN9u2bcu2gwRd+ocffiBIJ/fddx8Pk2RY/+A8T/i5554jTffF4ZK3CnEeIJsAdjksyezhzL9UT2/s1RieXjJKFOdIzQtKISNk5Lwi7AwYeeXKla+88kqNRJn//ve/NcIbbriBq5PAYACSWZ5//vlnk/zLL79waZLZQPBu59nytuGtTgKtsqz3lNMJuwGqJPMaoRiWK0AOhzbZcTEKjfuMYti4oSxATlwsQDaj0CZAysmZ7EAJ2bj5MErIcgXIYbkCFMXIzncYhXKQUoxRQjauYQFy4j4LJNsNZUAK5SAlo4Rs3HwYJWS5AmQzCrJyaB2stcwvzLCXX345MybTEIcVphvmHZaTZcuWMQ8uX76cBA4NzE3MX6wiw4cPp5Xz5YwZMziqEmQm5XSiSQ2x1jKP0yFzPedCDmp0deGFFzKlsgAwfdPJ22+/zZLJusWRiB7OOussytUDpyXKWV3ogRPhxx9/zBFn7NixvXr1IpMeXn/9dTIZOWKqfeONN9grMCREh1oXb731VlzS9GWsFj+CXBTpq1H0+OOPa62lB/NkuBDJrEZM1v369eOQOnfuXJYx1gPiTNOUa6jOWkuQph49ejAAJnGCRLj6ySefTBpHtKi1lqDu6Pjjj+dVaNSoESsoEW5hzJgxnERZcuy1lkz91C5P45tvvjFBngZBlgryuQtWL272nnvuIcIAWHe5o6eeeoohcTbdeuutSTvwwAN1ISy7HNZCgjwonhuj5eo8HGr/9a9/0b/EWstVuGuu3rx582effZanRya7CjK5cQp1X9yFlnyOv7p3ng+ZjArWmPVVAc+WA3qm9+Crb072dMUmifcJhQyPS9x4441clGfLWZbRInrgddFz4A3Duv7111/TOW8J3gYatq6CYCJt2rRh34BLn4sXL2Ywunel8VprrdULKulepBiWK0AOhzbZcTEKjfuMYti4oSxATlwsQDaj0CZAysmZ7EAJ2bj5MErIcgXIYbkCFMXIzncYhXKQUoxRQjauYQFy4j4LJNsNZUAK5SAlo4Rs3HwYJWS5AmQzCrJyKHytZcpQF5kJJhcjph5mWKb4du3asagw9dDK7KM5iMmRye7dd99lOmOmY+VTEyLO+aN+/fpMYRdffLEmU3TJJZcQIZ9ZnnIlf//99xx9iHMEYU7XBIo4+XFpFga6Vbm+SySTMyv9m5GwuuhXRnDeMsk00RWzM1ZBDk/UMgZcRBxttdVW9Hn77berykjnWs7rLD88CgU5RdED19KCR4SrcCMXXXQRV+cu2A0ok5mdcmb/mTNnKsLKwVMih22Khs3VydcugeWEiB47Il+gzOnTp7PAs9Q9+OCDuObWWGZ02L3mmmt0FVq1IyFZ51oF77//fi6EzjjjDBYSdcLryxJLOdsaFmb6JIiuuOIKMnnRSZM4DZPGtdjcKAdp/W7fvj2FusqIESOIcJuckrVA0kScHRvPhz0Zz0p399ZbbxHh7MutkZAZ5dq3FhLzoOiNG9dXEYh9QL169ShkcVUa/SO6ZSEnzomfrYA6NNuL/fffX+9eZXbs2JF7ueCCC4IuMzr33HOp5e2hP6ZFJNOJpAgHbtZazrvz5s3TLSADKIblCpDDoU12XIxC4z6jGDZuKAuQExcLkM0otAmQcnImO1BCNm4+jBKyXAFyWK4ARTGy8x1GoRykFGOUkI1rWICcuM8CyXZDGZBCOUjJKCEbNx9GCVmuANmMsgtnrML/vJYpJrnIZ7pkOqa7k046iYnG6YEplRnqv//9L/NRs2bNdtxxx8yFg19eiGUa5WBEyRdffMGqoBJEhExmfJUT4eBSpUoVau+8887GjRsTUZzDFkE6pwRlioNyVqbTTz+dQzAumeQ0bNiQYy7AXI8ljkW0qiuVc0XDSK1yBaYJOa4Rwa5du3ILcnWVAw44gLWNWZ4nFlpF2ieffMKLx2TNaFkj0ZIlS1gJWH15gX/66Se/kAidA59//jlnMp01FVScR4HoPEj/Q1FjYKHq27cvT1udcGnKAdbCHXbYQX3SxGvNUHnV1A/D059zs9ayCDFsVlzUoEEDguPHjycCSOqZvVSLFi30cIhsttlmdIKySWXLcurldihkL/Xll1+SSRoJys8mWTcCoOHDh7NqMv5DDjlEmewJAF7ZI488EpfnP2PGDMBU8S567LHHdJZFXJTnbwamIPstLG/U0047jeVc49SQjPRHFVyaHnCDulSpUv2Tpc9+vLIrhyOmj+Qif+zYsVgmsm233ZYL29OTgLQxY8ZgOR4xGTkTZcuWLWniXMgEnelxrZRpJzO7
<base64 image data elided during cleanup>)", "_____no_output_____" ] ],
[ [ "v.T  # TRANSPOSE of the column vector v", "_____no_output_____" ], [ "print(M.dot(v))    # matrix-vector product M v\nprint(v.T.dot(v))  # inner product v^T v\nv1 = np.array([3, -3, 1])\nv2 = np.array([4, 9, 2])\nprint(np.cross(v1, v2, axisa=0, axisb=0).T)  # cross product of two 3-vectors\nprint(np.multiply(M, v))  # element-wise (broadcast) product\nprint(np.multiply(v, v))  # element-wise square of v\n", "[[14]\n [32]\n [50]]\n[[14]]\n[-15 -2 39]\n[[ 1 2 3]\n [ 8 10 12]\n [21 24 27]]\n[[1]\n [4]\n [9]]\n" ] ],
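[ [ "The transpose rules summarized in the Transpose cell below are easy to check numerically. The following cell is an illustrative addition, not part of the original notebook: a minimal sketch that assumes `numpy` is imported as `np` (as in the cells above), with small example matrices `A` and `B` introduced only for this check.", "_____no_output_____" ] ],
[ [ "import numpy as np\n\nA = np.array([[1, 2], [3, 4]])\nB = np.array([[5, 6], [7, 8]])\n\n# (A + B)^T equals A^T + B^T\nprint(np.array_equal((A + B).T, A.T + B.T))\n\n# (AB)^T equals B^T A^T -- note the reversed order\nprint(np.array_equal((A @ B).T, B.T @ A.T))\n\n# A + A^T is always symmetric: S equals S^T\nS = A + A.T\nprint(np.array_equal(S, S.T))", "_____no_output_____" ] ],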
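[ [ "A comment below also mentions the determinant. As a hedged illustration added during editing (it is not from the original notebook), the next cell computes it with the standard `np.linalg.det` routine; `M2` and `singular` are example matrices introduced here, and the zero determinant of `singular` shows the link to non-invertibility.", "_____no_output_____" ] ],
[ [ "import numpy as np\n\nM2 = np.array([[1, 2], [3, 4]])\nprint(np.linalg.det(M2))        # about -2.0: nonzero, so M2 is invertible\n\nsingular = np.array([[1, 2], [2, 4]])\nprint(np.linalg.det(singular))  # 0.0: dependent rows, so not invertible", "_____no_output_____" ] ],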
[ [ "# Transpose\n\n$$C_{mxn}=A_{nxm}^T$$\n$$c_{ij}=a_{ji}$$\n\n$$(A+B)^T = A^T + B^T$$\n$$(AB)^T = B^T A^T$$\n\nIf $A=A^T$, then $A$ is **symmetric**.\n", "_____no_output_____" ] ],
[ [ "M", "_____no_output_____" ], [ "print(M.T)  # transpose of M\nprint(v.T)  # transpose of v", "[[1 4 7]\n [2 5 8]\n [3 6 9]]\n[[1 2 3]]\n" ], [ "# the determinant condenses a square matrix into a single scalar; it is nonzero exactly when the matrix is invertible", "_____no_output_____" ] ],
[ [ "![ejemplo.jpg](data:image/jpeg;base64,<remaining base64 image data elided during cleanup>
op3acAcD2C970ZYFtfa7eFHsttLTBaf3XA5rPtweBFO8bzhsnfcKjPHjuzVW5vg5qOyE4UnBux19Tzl8SDOuqcbOAL2QTtndyCYG03F7uOzPoATxdlkIt219YBhnYOVxsXzl3eIcniJXA92Sl1XkudBiIaz8eHAa4sPK1WUhtgP8Z7Yx0dLsc4HAhfYfOK6DdNJWJfpiipAQ5xaCGXhSDEEavi4hlFeGZYbRkel7wFkYqbAh5XArndheqHy34hBWUzydZswf8YGUOfmDSARe1kO9oelsL9cMuQzbdlw4OwbbD+vF/sxC9Z5Hh3QiBmQrzAlWPjKFEAK6g49RZfBOj+tsORzBGb/28MEYr+oFa9vzg8W2WU5U1ivdKWw45hrv0G6s/XY1RN8REsy8cjEFXEeK5Ri6nKhbBNrj5hsxZluzi4nzOv6QF0+CPBsd4UNZewioagg+8MqLkYJ1lnYRIoJBMghPndHWIwf5cx7t6OYgWFnZ/98jDgr+RRz3cDOyYKAd2PRQnzecmPHQcM8DDDAaQ49//AyxYSawf+IhVBMcLc+9GC5XibWxPnv1CfbGqK/Pv4ueyeErZi5srUX7gw683Obf8S4wWETWGfV/hVhBPERQzTEwEpsTHXlcMKGSvG5N9h4jpks+x00dj4K8Tm5afMRgXq5TMUqKyqUXXdRRX7O3NBhb+GGDn57mKQbMsAa2M55QmL7lpZiCnJdnCtOvbJT8Z+1ODvF0Or81IMQJD1KhHY5QGOLPnlbctjgzvAud5MEVC3b0DdIEXb7F7FiN5Y58vE++WTz9Lsu0c7sI2aK4sMqbhJzRyz4mMFW+H+sPiSzYIrKZtbiw2evbJMVl60JV5KtcyXx38jntg8LZzmuNfbJ/s8/fJ1dAeR3hNgHyuHPEwfXleFSnVuWm5IFCs4PC5hcFRH2dx15F5DHdRTb4K1k5ElMPuRkX9Bjm7zVnDlvRGRmNy+FHvwvtOJieG4m2JnIVWOJyIgUvvsRwQsWpTuLdd6gup4g1m793ITT4Oxgbi5R6xu45VBYI6cF8B+b7AEZ534n2HSJdH34gw6sx/JTjP2uIzMu1sUHOW6S8rBx1kCUqkfTQwdnQo4PFkL/OyPy8c7BauE6mP1Hf2P3GZHoTHOC1dvVWCTwpcghMuXIJ9JuwJmBmdPIP9er9Vi4Wb//N3Cz86YWDfBI2uCmQu6lzEel2r3wJOlyG+6onthx6+5bU+4JfhhbZB/vXL++yvvVXWC5+Cd76+7kyPt4eGAK/AcpN5R5zwrwdsiZ+6bD7k3Kw+EBDTW3SLgvqTmy//cjXZ/Hyf9b10UQBEG4K0/9w/HEw+UJecKQIIj/P7CLo8IlPPm4XNd/fzieeIiQ6yII4tHzFI8yd3XJ2Mk+2f/559HwKMu6D7LVE3+ztxg3bRIEQTw2nirXhSk/e0Ecf5Oeqsrsx9A0/qUo16jr+rD/mk7H3n2JtezvjDj35R6hCYAy2fqwNzJgXeFvKMvhMPGfPcDqSmBpTsQjqrm9ootCUR5XhGkilmKF63EDMAX/hoim02Qde5tHtmbsv8KUYUk8K0EQxGPiaYu6xFjMLoSydfZeeJZ6m8uK4gkZg/gmKdsQgzQ//IHArsbeeD2WqaSxr8fe+EUYFA4drl/6dK08qNaBGmJFKIPlTSk5MTgc/BfvJfZD4+wBZaS58uRQjSAI4vHxtLkuBBMbNmwYNWpUl86v9enTd8WKFc4dbNS9IYJhX2LkX/nAHm4F5ln4ngeGoii7du36+uuvu3Tp0rNnz3/+WSDLsl6nSezdBE5NRKn843w1QE4t+JdqH0wbwTK7d+8eO3Zs165du3fv/vfff1ssFpcPc6FnL25Q4NxUvaTBPsy3se/x8u98Mk/PXptALowgiMfK0+a63n333TZt2kydOjU9PX3RosX/+9//5s2bBxeisXd1s9+aw4qsKeynq9l1M/bzruyPIjvYpTz4kwczKot4a8SIES1btvz111+Tk5PXrFkDnzFt2m/85xtUReG/sgc1ZKxp7De22U8f2tnP4wL+e4pIsMvsup5T6P0CTSDuxx9/fOGFFyZMmJCQkLB58+Zu3br9/PPPSAfcPs6ffGWXEdnPXbHfNBC7cDj/2TyWxh1YTt9KEATxGGCvBHYvoLRz7RYw6BYuXHjVqlVHjhxdvHjJ3Ll/GY2madOm8ts97Ceu9+zetX7D+pTUVL3BgNGavWrGoeg0GY4mNi7+wuUr/L0xDwZ4r8jIyEWLFh07dmzZsmVLlizx9fX9848Z8JgO9jOqtn37D2zetDE+IR4uk18vdMBjyLLt0IH9q9esuXQpBi6EuQr2HptcgcpDmcDAwDlz5hw/fnzlypXr1q3z8fGZPHkyU4b94qmyf//e9evXxsZesWt6u84AKxhUm16xXbwYvXL1mtNnz7J7YBDFPwRBEI8R5rucq088LBj4t5+aRIrzp7URJdi1uLiEokWLPvtsI7M58+95cwoXyl+qVMmKFcoFBAZ9/+OETDviHXvCtcsrly78fMSnBQoWHffrdLtTUm6BJuwHuXm0IqKZjIyMggUK16pR3WZJXbtyYYmihUuWKF2pQiV/v4CPhn+ZJbNf9zt38mDlsmUK5s1brHART0/v9p1eS8kw2xT+VEcuQOkuZUSYhZXq1avDzVstWZs2bahQoVyRokWrVKkc6O83eNiniRa7othjo0+0bdEkKjKqSrWawXmCmzzf8nJsIkIw9kO5TsEM1PSJ+KlJgiD+P/E0XDBENTBoYoDGuInwAthke3pm2rnzZ+Pi4grmK2C3yh+/93GfIcM27ty1feO6Xh2bfzzskyuxyapOO3Ls+PARX546sCcuIVFn8DDkuBl2H0ATqAFlsIJNaIIlfEZmZmZ09IXMjNTQsBCbrA375MtXX+uxYdu2HVt2vtu72zfffb/n5Bm9zfLlVx+FFSm9btv6XTt2fzps6N8L5oyfMN2U6zZi7cy/bgV94LTgRGNjY8+cOZM/Xz6TpP/0w48aNWm5Zfuu7Vu3fv7+4Ck/fLNn736EaVN+mx5nlVZt2bpt6+bFf07Yv2vbT9PmI0hVHJqU/UQmQRDEY+EpudcFp4UlPMT8+fO7du3aqFHDQoUK1K9fz2KxVKlSxdfbZ+GChQP79grwD/TyMlWsVMVstWaZsyTJUP+ZBtt2bP/m+9EeRvbbj+w5w9wB9yBcl9lsnjdvXs+ePRs1alSgQIHKlasgLqlRo4avj+9vv00Z/tFHYSFBnt5axdpVVE2fmpJsl7Vr17K+++GHQvmLB4f6DBk68Jmylf6ZNy9NPN+fC6ASHOqSJUt69+7doEGDIkWKFC9eHLaqWKmSQ9NNmTLlixGfhgXnMUmGqtWqK5rDmp4u6fR9Bg7544/fSxUtZDTpK1Yq5++bJz4xFbMCfpXVKZkgCOKx8JS4LozOFy5caNWqVf/+/YOCggYOGLhu/fpX27d36By169QxGI2lypT00JkVm/302ZM//DitbKXKeSPD2K0k/gOgBv4DC4hKcu25WIgDZRDtCWWwDu+1du3aHj16SEZDjRo1kaFs
6ZKSw65aMi7HHPls1I9ReYvVqFjWw9d7+oy/SxQqKLHf6JI8jLqIAD8ES3an4PsnJSWlffv2r7/+Orxp9+7dV6xYMXDgQKhRt25dvdFUpGhRP0+DXjHHXL44YuR3QfmKlilTUq/afYND8kfkkWwZKWnxo7/5KSFZbfJ8fYdOZr+aTBAE8XjBEOYWIJQBt73XhRW73d6rV6+QkJBdu3YhwsCm1WarXae2l7dXRkYGYjJNkU8e3lardsOwYP+QgJDN+49mKRq/xGjHJ/bCQV//kLFT57pk3h84HKWDDz/80MPDAx5L5thstubNmwcHB507c1pV7JpsOXvycJP6dQpGhQaFFFi0Zk+WqtrULBxoszlkJU3Lsp49vC8yPLjv4C+scq5UAqNHjzYYDPBYrntvCEyhHsyIdVWRM1MSGtSoXiA83CBJf69cn2m3q7I5i+0x9+rUrnT54r4m/fvvfpuoaLKa4cAhsuJ6WgNVpntdBEE8Yp6SqAs1uXjxIpbJycmILa5evfr555/vP3CgarVqPr6+CMp0khQanr9bjy4fDHsvJMjnixFfpqdnsF/zYXeAsB/RFgu4ECQJgfcNdMDy0qVL8BZZnGvXrv3www9r1qyJiIjMl68Av96mDw2L7NK9+5B3P8wb7Df6s4+uxqewL1Q5NMlgdTiM8XFxXTu/UaxKjRFfvuOR60Dw8uXLWCYlJaWnpycmJk6cOPGff/4pUKBA/vz5UWMNOhtMXbq/NvT9oZXLl/lq+GeXrsRpOqMRXlzv8eKrHQf1f+v5pk1mzvppw9qtiubt0KmaxIJUgiCIxwZmym4BpvNg+vTp0PngwYNi07mPxzrvvfcevIWvr2+JEiUKFSpUsFBByWjoN6C/oqnig9DHothVa8amVX/rTYGff/0T+6oS+wqVLfbifh+/4B9+m+MUlwugCfjmm29cyhQuXLhYsWKSJHXr2k2REctoil1m+WTFJqv7Ni0JCfDp9e53iM34180yL5zZX6Nm/W59B8Unp1mRR7U4Rd8vkydPNhqNnp6eUAPKACjTtGlTBGFQgj2Mqag2RGO2rDNHd/v7Bnbq9bYZsZndbGGxoKLZrBkpl+vWKlekTIPkdKuqmq3Mbk7QCn369AkPD09JScE6quXcQRAE8XBAhMC+MyR82BOO0PO2b47HLoBga/bs2RhAg4ODq1Wrlr9Avq1btlSsVLFY0aIYUfWSQdYMBk0xaZbEpNhCJWq3bt9txm9jJS1L0nleiz1TslzdL3/4ZUDP/0mahlBM3AZDCazs/4LQE25yzpw5sbGxUKZKlSqlSpVav359qZLFy5Yuq6oQjahFcuh1Rr09LSGmVu0m+crUXb5kppcqHzi4/7We3bu+3r/vW/19RDAoaR76XN1gslqtixYtOn/+fGBgYKVKlcqXL79x40Z494oVKzhk2aGTNKMJhjRqssOWWrRktbCiFbdsXGpQ7JpkdOj1ng5Vr7e80av3bzNWXI27HBbso+kkEwtWGagvvTmeIIhHDRv13QHhbO90r0vM9wUiGkOyg73nVpZl8+GDu4cMGpCUmopMmiV16bxZRoPHR1+NtSH4kdNUq/1S9Em/gMCxk/8wczEIOZgQ5xdwnaX8J6CA0CRbGZYm262KbDty8FD/Xn2SEpNZYGNLWb9yqa9XQL93Psiw27evXVWmSJHvfvgxNiEpPjklLol95FzfPYIOQhlWt+yoiO3QtLjL0b17dL96Nc6OWNBqPbh1XaBfcOceA6x2uftr3Q7s22+XVdluS7x6sUK58sVKVkjONNucD/87wQbd6yII4hHzNERdAIOmWAFIRF4eFbCgRVFs165dbf9qe6tde/HFl2xWy9Rpv0fkLbBsxYrIPHlOHt//7uD3sqxZe/YfzluweP6CUQMGvvXSiy8YIUWvY8EX+/znMILb1mlYJonrCXcGP5gUn9C1S9fLsbHtXv2fl4fu5wmTvP1Clq5eUTAqrE3jRjv2HwovVETSS3CgOKpY8eIL5s/z9fEWou6PnMoInCo5HJmpCa916Xbk1NlX23fy8jD88dtk2eA5b8ny8qWKfv3VFz/9/HPrl9qUK1t24fx5sM/kqdPa/6+dAVXRVMlgFBEpJFPURRDEI8YtXZf4leSco6TYy4bjbPTOa33Yw95hZLVaZs+as2X7bqviqFi5au83egT4ehl0Unxc/NIli1gUwd82q+r0terWrlCurEGnl3LnusRSqCSWKEXPXzSl2OW//v5r85YtmZmZ1WrW7PJat5DgIJ0ir1iy6FpikqZnr27HARCRJ0+eNm1eNhlzdcFQKHMTwnXpVHara/HSFStXr8vKyipdqsSbffuGhoVJMJqm7Nm39++//7ly5WqB/IW6dO1arnw5BFn8EQ2YyiQMjEnDwIED58+ff/r0aXJdBEE8GtzPdXXr1m3//v1ly5bF4CtGSawAjKHYxArPzg5wrsAFYJ1X1KE3yNx9wBXwF2dgw+jQy2yL3X/SVJ0Rh0EopOBz367rVoQg9nAhVliMyK66wTkitsKGJ+qi6RwSdrPnHZ3HZHPz9gPDoamqeIc9+9UyiX0pG4lGVl0YCQoyy3H/KzFthXFZTeBeeSbOgAEDFixYQK6LIIhHhltGXQ0aNAgMDHQlli9f/ssvv2QDK4flZuOrGId5kMBR+EVFxFLXPQFzF3Bddp0OvgxuQ9E7PPgKy8KzXc+bS4RCzDdhnb37A65L0+uNqkGv6hwmTTI49IqBFfjIvvMLlVT2KyaaXgcVYEoDtIAHNzALQD+eh2MwGKAuP0J/6XLMkMFDNA3+noFdhw8ftlgsJ06cINdFEMSj4clyXXdXBnHVwoUL3333XQyOIqdY1qxZEy7NaDRiM4frwgfrfFNIde1x7gKaTg9vAkciRluEINyHOfc+YK7LxZqD/74KAj0s2ZVJuAz2G5TgUQ78zD2x+4IsxoIp2H+xgwPrCnM6DetAJt25c+ebN2vhcCg8Cwt5sQwJCVmzZg1clyuFIAjiIYFBxm2iLvgtoaqiKK55PVIANjH3FyMmjZsPD5haPEAors06U7NBE2AX7E9RF0EQDxt3cl1iiZFRDI7QXIyVfL8Tcl0PD9EEosPAUbnWBWId9qcmIAjiYeNOFwxdg6PYJB4Ld28F7KUGIgjiocImyE+U6yIIgiCIf4VuSxAEQRBuBrkugiAIws2gC4aPH2oCgiCIe4fudREEQRDuB10wJAiCINwMiroeP9QEBEEQ9w5dMCQIgiDcD7pgSBAEQbgZ5LoIgiAIN4NcF0EQBOFm0L0ugrgfcp444p4xls7t++ImgVg+KJku9VybfOdjJmd9XeS+vq5qusiNTJfFcsp8Qgx4Ew+w1oKbBIIHLjM3Am9uZoIg/hWcNUCceOKF+ljP5a+9QI4QKE5JIf8ByhRicy/zQSHsBlRVFT9EAN2EtvcNREGCqCyqKaTlRqYQhRVhSRdi75ODqDhWXC2OZW4aGnJQcQgR1ReIZrpvblJSCL9vmeS6Hj/UBG4KGg5noxgisZ5LlwAJkCOGDIgSm8C5+74QQqAk1oXMXCr5AHENjlh3VTm
X6rlqCuEPRCAkuMbZByXzYeDSDYiU3CspKo4VIVMYge+5T1y6AbF+30oyVXKKIwji7ojzxWKxnD9/fseOHWlpaTVq1Khdu7bJZMo+sZHh+gpPvO1J5mA72F7289PYTklJOXLkyMGDBzEPbd68eZEiRdjBNwwW/AiWWySKgtiS/5y1yn9tm+0S5WEcv3r16p49ey5cuJA3b94WLVoEBgYi3SWTH8zX9VwsFtiVrauzIGdmto+vPDAwMsKMp06dgoaZmZl169arWrWqh4cJu1ASIjIowDRjN+RFlbkKTD32J3vJYDViqw5V01JTU/fv33/s2DFPT0+YsVChQry+IqdrTRwO4TAaW+F7gSubs7pQUlEUmHHnzp0xMTGQ9vzzzwcFBQkLc7Oznw7nA7BTZraQ69zYiA8SV29Efffu3Wuz2erWrVulShV0IVcVhEY3qiVUvb4CPyAq4ayXw5GUlCTMiD7TpEmT/Pnzczcj8vM8/AfmeQfGpkjPlsnSkCKhTzobjv9E8KVLl3bv3n3lypUSJUo++2wjX1/fO7gulzSXZJEIgdlbAFoSBHEvwBngDMRoi0EW41FoaChGMZx+Q4YMsdntMnbLmk01KxarxapYZYumsR+VdjgU7MCBioI/qkVRzTLy2hyaRXUosqrIsjx9+vSIiAiMOBgjPD29AgPzLFu6QlWwT5FtFlW22+yKXdXsst2h2TGe8rJsZtWuylarLGegJJvF4SzOgZLsdnu3bt0wfHt5ecFvQUnojFFDVawWu9XC1dEgXXPYNYfmUByaDaXZIFm2K4rMKmO3oHi7w4EiNYdT8gMEI2P58uVR5ciw0JDgIIPR491hH1ntsBCKlq2abMcfWc2AhaxmOzMdMxSrmqzYYAik8E0smDiEcJo6bty4yMhITCMKFCiAugcHB69cv5HVyJahKVbYTYMY2abIZg0GVVSrAslWxSajKJuCv6iyRbPDMlxFh8NsNr/yyiuwoZ+fH5oG2mKmEh8fz8pV0ZQwuzXFjuPhNM2qhu7hkLEKw3JtraxXsODyIQFftXXr1lKlSqF9o6Ki/P39TR7e337zrWK3KbzbwFboYFBLk1Et1YpeI8uakgnjoYfICrOuxpS1CoGwEGSOGjUKpvPw8ICrRpXRMzG9YH1Ys1sVdBGrrJizUE30PBkH2FVrFmpslUUb2azoZCpKVx1auqJlobHgCF9++WWICggIQAPpJOOzjZtmZmSike0WMxSQbawxmQScQqwBNBs6JbqhZpPR36EpdFYgUKjJuM94jSD+f4Jz++zZszVr1kTIdfz4ccwiy5Ur99tvv509c9qhynpN0zswcNlT0swOvZEfoeIgvR4jgvnq1SuJCfEYSTQdJqwGHT4OzNrZlagTJ0689957EHj48OE5c+fY7JZRo7/EoOJgTkq5Fnv12pUr5iyz6pA0HcQ6DPAlqi4lIfHK5VhzplnDqKn3dM7++XwUo1FCQsLs2bOhLSK53r17Hzp0aM6cOTpNMep1dpstNjY2Li4O4wWCBg1q8ONQPUzAE5OTMY47+CQXEQ905aEF/jxIoqOj69evj1jhyPHj23fuLFQw/68Tf0qMi9WpNhgtIZ5VLSsz08hU8ECN9My/Klar9cqVmMTERDg4aOichDtVkxAKDxs27PTp06jstGnTsrLMP/4wFiZGPox7V2OvXbl6Ld1qtzo8rTqr5rAY7F56uy4xKfpSzCWYV9NMDp2nxtrEKRFRNbzCvHnzMF85cOBA165d9+3bt3TpUj54ckNrclJKKs/LjI/WN6DFHUxGclq6VeFhyUMDuqGyL730EnRDyL5x48aw0LDvvvsuJSWZOXOdLjUl9dKlmLT0dIz6GPbR1BKicU3LzLRcuRqblJyMSYkdPZAHW4DXSgczQsjJkydRWcwG4Hi+//579Ft0HnRZZEpLS7fA9TvjWRXxF6YT1+Jir8Rcspoz9aqkaRK3IfqkEUrCjEajcdWqVZCJ3ti6VatNmzeuWLEU7gnHpqYlX75yCW3KFMCBDoSMKno4j7vU5JQUTLAkvSG7TbLhbUAQxL+D4RJDJ8AUUQAPgbESE/xdu3Yq9sy0a4mbN6/u3r7roHdHZSI3m70jbrH+PW9m9RpVg4ODCuTN2+5/nU5eiLGzsQQ4MPfF/BdTXSENZGVllq9Qtny5UjZL2tnTx1o2bxoZFhKVJ6RG9dor1m+xQA3Nlp568a3+/fLnqxgSHFi5UsVvxszIgI9zqukQGmIJaUIsZs2Y83744QeaPX3Jwr/r1X8mNDQkf1RkqxdfPnrmLOa1COas1qzLly9N/31K+QqV95w4z0IehGFMTfgM1OIBRw+oMioO3VCQiBH9fLzPnjiSnhgzuH/vvAUKBASElC9b9tsx31nMqtmG2Cjrj+nTqlevnickBNFAly5dL16JQ6iI+vGQi30sFguzIOInmy0zMxNhMbyjQzafPXn4pZdeDI/MnycktHrNevOXrcxCRCTbUuJTenTunD9/WGCQf5XK1X/+ZXqWzWFD0KU46wvrQZSQCXsuXrwYZhwzZgzS09Izjhw+PPLj9+s1bW22s9AALgHzDDkz+dKF6N8m/lS8ZPnjF2IeZtDF2ka0MgwIoFX7Dp29PL2uXr6Ylpzw1oABJUqWDg7JU6pU6dEjv7HIqk2xK5b0n38aW7lqDb/A0IKF87/xRu8rKRabKrsEotvCjFiKBrp8+TKi9mbNmlmtZmtW0sUL0X9OmliyRJmD564g3ELTqXLmlYunOnfpGJkvH8LnWlUrz5j+TzpK0hCT4h+L+FxmxBLrc//6C05v0i/jZVvGxAk/lC1TKjA4sFCRwq/37JOcnM6iNc1mNSeeO3v655+/r1q1XvTlazK7HHBD1EWui/h/REZGxl+cv+/KmjVrcNIyx3IjSMGJx0cJO85tdkbKcseOHf39/Y8fP3bk0M4yhYpWKFs6wOD1eu+PzZhSsnHVumXr+pDQoB/GfpeUFL93+5aoyHzPt2pvwXSXj7aYzEKaEMtHSCUhMSE0NE+DerVka2rdWlVbtXrh8JHDG1atqlWlalBwvp2HT2L4+fidN4oVL7Nu9d6jx/Z169DOwxDy3aRZ2Ve52IArZAogE1XG5Bdz5+P7t4cFBX72+ReJCXEH9+0sVbxInWebptlkRbYM/+SDIoULlSxWyGTy2Xr8rMwGHoR9cF4YMpxDubBAfHy801J3YO5cjE5/IQi41YZQxqUVbMg22cgrt2jeLE+Af1zM+U/ee7tkiWJrN2w4dvhUlw6veBo8xoz706raVy9flCcwYNKkSakpyRvWrYoKDenco28WawdNsdtkVl0IYzIBNi5duuTn59fqhZYZqfE1K5fv2LHzybPnr1279lKLxnmCC586F2dXMvr2ea1C1Wrrtm/ff2BPu1atjJLfn/NWWthwe916kMbkcvfw66+/ItJAPAe/WKdO3VLFSxQMDS5T7blM9AcchAhGtg/t171IoSKlC+Q1GgOOnL96fUJxC4h6YSveH+/YIbFr69atol7Ow3IgFAPQE0s4hnr1GoaHhackxvd98/XqVatu3Lg5LS
151Jefe5q85sxfaFPt/8z6PW++vKhCQlrm4iV/B/sHdRswwpotHO0lBAIh8+jRo+jenTp1QsKQgX2KFS5UNn+UUe9z8GyMzK6Q2hRbaudXX65Zp/b23bv37NrWtnkTkyFo/rL1Fg06s/7CrsveyHfffYuIfsG82WtWLs4T5D958qTz0ec/++zTYF//Ll16w+3Jqr1X9w5FixQuWigyPLzgqUtXLWxWcIProsc0iP9HYLD47LPPsIJT6k43z5FesGDBd955B2M9ptjOVA6GDxyIYWv9+vU7d+48fPgw5qQxMTGYlu7ctdPDU0q/luobZKpYpHLTV3uN++ULk6bq9Y7klMQtW7a80LK1QTLoVXOHzq/vOX4BLsdL5zDo2KQU5+OVK1dWrFhx8ODBI0eOJCQmXrp4cUCfN99+e9BzTZqvWr8xPCqfySFfOH64Uo1G/d/74MvhQ5vWr/v2iO9aNq3n0AxaVlyjZxoq4bV2rZ8prlEKzwpp69atg8BTp07B2QBs1qpcdtmKtS+0/Z9R0gyqeUD/ASu3HNyxf1+Yt5SSFK9ojh2bN7/SudfGQ4dqlS5iZLfZmUg9uySnh2mEKzp37tz48eMxBvHSbgMbWfT6okWLvv322zfZUByFDElJSdBn7969x46fjIm5HH3+LIbF3Xt3P/tskw+GD2/RrKlJM5ptlxvVe87hX3/z5omWpJSt23e3aNnSaHDoZHOb1i8nqv4rVy70lvQSJgkOvabTnz93FmY8fvw4BtyEhISLFy8iJh7xyQeLFi5s3KK1j4+vQbUuXzqjfadvlqxbWqWisVmDtmMnzqpQq5yvpqbFXar7TPN8ZaotWTjXWw957MlMWBIB64YNG06ePHnixAnYELMfuOQqVarExsb6+Ro+GjJg0zHb7p2LfJha7EaXJSHabghaMWtKr3dH7T99ulzhSP0dbsugXcaOHStG4Dv1RuytWLFi7969sX6TJQG8C8yImRa77nrkyNWrV8+fv1inds01a1bu3bUzPCJvoaLFDXrrpQsXq9d45u2PPx40qH9m3JWdR0+2fLahrDNJWlLL5162BZdbv3SKJ1cAxcH/XbhwYfny5ceOHTt06FBycjLM+O233w4ZPCQlKQ5R2bJ5s/oM/GL3mVPlikU4FIRHmVWq1Jy7YHG5chUMqj01LqZCjdoNW3ScOm2cJ7tUC5FqVpZNmBEyT58+fS02Ni019fy5Uz/99LNOMnz2+ZeS0aRXbO8N/L/2rgQ+iiLdT0/PZDKZXJObHCTcIZBwgwGCQAgS2QiERFiUgCJoAJ94I/gUQYFVVsXlWLlcDkk4IsihnKKAcmjAIAkx5AASgXDkmsxMd093z/tX12TeAIblrf58ujv/TDrV1V1ff/VV1XdUVU+mL1+fd6b8fMvQwLorF9Qeuk0bVs5duOZQ/snoyFADGhhMOuUEwm78/wLqwI3fBtCbxJEmi8vNgriFTS6nI8sFeXl50dHRERER6enpCxYsmDdvHiwc0jbJxkkcWQoXGtoYg7OnzObJ1AscdslGXHhOwlEQi05/HWQ0pj86qUGSSVgjC42NDS+//LKXl1fXrl2feOKJxYsXp6dnaFjtpg1rZZv19Hffwp83kxVxTrZcD/M2TJw8gxNt5wpPV5vJJBpnE2WuelJmSmiH+0mcqADaYejQoXq9fvDgwSC+ZMkSWJGQkJCffqqSeQtvtVo4AfyUnzsVGxk6aNjwG5yNrKzzJslm2fVJDouoqxBRlyhLglIJB1kKiIUGiI7znwMVILWgjqwmIJ/jOHj9YWFhkZGRGRkZC99+Z9bsVz007OMTxnNWa8HZwhtQdTaryKP05QmZQyNbP1AvmEWbwHFkCgtMfvvl7ihfnwlPPd8owheXBM5SV1s35cmpOp1n9+7dJ0+ejCqnpqZqNJodO3aJAoTPkw0LNtFcd+3BYQOCImKLrtzkxdqCb04LVtlEeLLK5srB/bv3Tkqpt+ARkCwH+9ejRw+DwTBkyJBZs2YtX748rEULNH1tbS04Ix1EqHsqa0Rsj7R6ntRUlAULDlwNAo2NHyzUsj6FpVcgxeagEFFCxeaFScg2zbM5slyAaqJZ27RpM2bMGFgXOAoMo3n22WcRh5LNFAKHghJXu/D1mR6sdtvuvWawbDVb0DqcieeFQ/u2Bhp8pr76N55sjiBA1bKysnQ6XWJiYnZ29tKlSwcOHAiTefjwYQgaDY/ofMPyRZ4q3x9KfhJllEMzWX74/nuTlbdCbFZeMNUMSOraZ/AIkyQLuEGST5480bNnT19f32HDhs2ePXvZsmVGf2Orli0Fq+XmteoL5RUC2SVD9hmten8Bev7x4nIzuh6ZCLX8ffEbwSFRBZVXGyAKSKmpN0Ftuk2XG/9BcHT8O2axXOG4Q4EzBxoEgM8YHBw8evRouLfQodAm27dvh+maP38+FCjR9TbJLjTE+gdPyn6VKF547WSvFMyUac/W9VOyxrcIC23dvtepkgqrzGMkipL83vvvwcZAR8CdpzSzssZ7aLVFhYXKM2XYJh5j1mbK/zLPoPZbtDzHLMF4NVhsvMjLIsfXVJfERASN+PMMGDkUAIVBgwZFRUXl5+dbrVbQBLcwt1BGFosFpkMSrQf378qe+Hh0eExEmw6IesxkmyPR75LM7dq2ltX5HisqA/OKooBWFWwQApXFvypDJ6CIwZi3t/fYsWOvXr0KbmGQPlqzlmU1H374dzIFBSUmSRZl219N9aWO4SEZmRPqif6Ge2DZnLvu8YmPGH39evbsV/FTNUfmZEmd5819U6PRrV69BjExqoyajhgxIjAwsLSkjJCUzZdKCmZmP9mna4KHl/HDDTtNUMRyowyLYyMbQGXefLHi++iolo9NetlikzipEc0BibVq1erUqVMwADACVdVXDT7eQ1NSyEyq0iVkW8OUR0fG9kw2QfUS6yNCAxMTJtpy31+oVvsVll3BnY6a3wEqHwpH1h1wXFbgzCG2ThRhTjw8PKZOnYr4ElVGzgcffMCq1Rs3biC2QOQs5ro3Zr8yMnmQ1kubPeMVK49WNqH1cWnLqmWTM9MDAo0JiWmVN01E3gpg/GBjcnNzIUY0DXIGDBiAHMReZM4SbSPym5a+pWWCzp6/rHQ3sncRdeckySwKcBAuFp7x9dI/M2sunDNYtoaGuriOHbt06YJQmC7LIdb08/F5eHQG2Y+I5iP9DEax0dZ4bVzmmKjo+Ms368kuTZsFV9a891ZwcOSZqmrciKe5Thg2E8q64ca/I5gmOM5/Do47FDhzaAKjDmMYGg0GDJnQvO+++y5USe/evaFRyIY85YODXTkh0xuMSq0Ul1m1XcO0btWKYc0VJRVqO6tmyNdJnPj2JIwffFsYMBQ4dOjQ7t2fR0S1hIOv+JWSmpE0drn66vWpM17tN2jQI2P+pCZP8PBgZKsKg9764nMvqfThb7z+DKsUgLkqKCiArerUqRPCDrCHcKGyshJRHU7JezaMmqxt2cX27dqIVnPJuXMSw0gqtZ3srMNlwpXCOHlrR1lOuGVNAUxSOM5/Do47FDiyXFBcXAzTAvsK0
4LT6uqrS5f9DTd27hyvZtT4sIyklVV2S92Lz0+zeobPevN1ss+QUcvKbkzwFtuhfc2N6pIfixXWUGnVqVP5Xl765OTBnp6eyDpw4ADZbhcc3AJiZFCW1XjozILQpn17g7fm/JnvNYJKkj1EFgzKarLFgn/2uVc8dEGvzX4WAlerPBF/lJeXx8bGxsXFIeyAkn77LwsRFHZJ6EIqBck4aoc0WKJQs+QURCFIgF7/GQlQKAQccGTdAcdlBY6sJsCmgiuI0c/PD13owoULCGgQd3bunACxQMmr1RpY3IDwyAhjYHlZ8c2aWkmlEyFeWcWDQ70+OiJCtl4v/7HEJnuAICwJaIJa//79EXghB57ZsWPH2rVrB+tFuwGj9GZSPcKOWmRYpX/btXbO0262NVx5+umnO8R1nfFf2RqyN1Wurav/sbQUXRGhIbofrNeiRYvqTY3de/RSNl+STqdWSaC79uPcXXv2vb1wbqCPjwrDiVQYo4j+kG2mRJLkoU0gnd0NN9xoHlCX0BHA0aNHYWCgJoYOHTp8+PDQ0FCMRujK6upqDHsJzqco2W31Hf0DH582W5DIlkSUJcVtVkkwiaIVpuLhMYM99dEl56vIe0GS9NLMFzBIg4KCECUkJSX5YNwymmF/Gon4i7zVJZAJmfPnCnt16/nIpEk3Giy8YIVPjWhD4rkbNVUZ40YlJg4pLS2zwolVmESI0LFjRzAJfTFy5Mj2UNYGA0b66tWrJfCoTJxZbPhjtVnqZkx53Nvb7+DpQk4URAnRnfWzT9ZodD7Hi8plhIxkNyOiCd416vqFAIf79++H3KAcU1JSUlNTjQH+Wq0mJiYaYiSPIfISa69fmjg6vW+fwd+Xl9UjioGlRfSJmgucjW+01l8fPSwlKCwGgZeyWUCclj2VYcgbSKhyr169qDbPyspSgiEELLjJRl51Em2b1y7Vaw1zF6/h0Di8XRb5a9fL0h5KSxmSVl5+ieOsZKbTZrtx40aHDh0gN0QMo0aNiomOgRhZNbvj00/Bo2i3kyk20Tz50VGxvQY2KlEXMgnvsmSXEHX9BVFXUfmV5ndp/IsgjyURkJSXl4eeo9VqIUN0SPgBqHJkZEtzo4UESGhQhE2CaBXE0jPHo8PD0jKerONFBOyQIyeYkDTfvDmsb++I6A5l1bWgiTBr3LhxoBkWFoYq9+jRw8vLC2Y7Ozsbl0CSzCxI4qalC1l1wNlSRF0yB34QDCH8R9B0ufSBvt0fGv7QxcvX0D9Fm1UWLTdu1gQGhYCx7t27o4dHRkZiBGm0us/3HkAV0CiIsQWu8YMliyNjWm3bvpO8hkZm1gnruPzRe/ODQ0jURZ5+a2zKzpkzRzFhbrjhRrPAeAbCw8P9/f3pC6rwx+fOnQuHFPYGMZNy3U5+ZGHJog8i47s/9GAyBicj25UXaRiJYUVZ0qhkhq3btOnrXn36xsfHMCq2U+dYKBOQhWMLVT5r1qzAwKARI0d2aNdaJdvUantBwZlHxz0yZtz4V+fO88SoZ0S1SoP4o7Hm+pg/Z9p1gR9vXB8ZFiKpRQ3x9xnomvj4eBgwo9EYEhIyffp0qCGojDFjxvj6+jASL6vUEqtRqSQtI6h4fuPmTzvdl9QrIZZFFGiXy4oLc/L2THoqu0WAt8ho4O2iSrCmDj/7FwMygRghQKhFVDkhIWHevDkxrWL690tK6p+EKA/qrL62Zuy4SZwucF3OmlZh/h4QG2MlryQxCA3VjN2uZe21169u2XHgwT8Nj24ZAXWOKlssZqjvgICAtLS0l156CU5Aenp6dEw0CYHsxGm3k20O9rBQ3YZ1WyTJOyPjQdC6evnS+AkTQ1u0XvHhipBQo4YlYQDLqjUaLay+IAggGBERMWPGjNRhqeBzVHo6OCezVQhCZHHXju0Xb1iffGy8hlFLEJHyAhz0+dmTx7ftOzJ1+vQAX0+co11Qgh5/OUhfY5i2bdviCPZQU1jrefPmoblTH0i9774+KrsEQZJ4Bm1tl4OM2v07Pztz4VpW1lhvNCPLyqxGYFgvVt1QXbbls6MPjsxsG0UMTLdu3UwmExwpkM3IyEAIhURmZmZUVBTCU/KSol1z9rsT2/YcnjI9O9jfAMkzkgWXKqpuPpQ+Pq5n4t+WL/UzGtFtNOjyKpWH3tC+Q3sk4EzQ3U/JycmxHePgKxj0OoT4NoF/8835H+du3fTJ9v79EsmLXWTvIJ4F5pmC40f3f5P/yJQpIQY9+jz5zjWHDNxRlxtu3DPg8CruJwH1Up3ACQIekWwiqIs1Bk2a9gpCBDO8R4Hfsm7d9/kFHAkaOImrnTf3WbU2eM/Bb4gjSX7IAoArTd4mcuS7IjgEagf37oyPi129ZiWHTF6Gp8zjFl6oLDs9uF+/GTNfqzebUM4syZwkkq/FUOAkCBDeXFjdu2PTsSNHzQJIgeL15YveYtW6tTsPCDay5C7ZLJ/nrdV4+p8oPC/aODNZyEAMwSF0+BWjLld+lDR8dLINgawW8vxPF0oG9hsw48W3rlhMFlGQzXAGrKJozvnHypLiYvIiFyrHNcycPpnRBR4vKIJgyf565YNgEsaGPkKhLIgiX3zm7Po16xF8kO0JNu7M8b0BvkETps228A1V584k9u713CuzG61mXrKiyZRvRcFDSEO7QqFG/pA1JXCgLNGIgmVKVmbHPoMa0SyCZCGckI06CA1zFr/Nsr5FpZdRCPyQaOwu2+T/VYAmXeXCU2hCEQJnbqxd/O7b9XW1PHqgxDfcKO/ZuXPH3oNqahpXvb/kfPkFK+FWEvm6aRMf9vILP15Y5iQIGk6aAKGppMk+eFsdqpm75K8aD78zpZWQBYlP+ZvFZ0906dp7zsIl9TYJTWWVyPfEkPpKPAmJlVHjSpO3keUtdDausWba5McGDrj/wsVKIjiIXZJBUyA7kHgwv/q9+UFh4acqq8lzEOdSLhW4oy433Pg/AH4u1BBGDhLwUmkCwAVTfe077/x1957d33198qagulh2ziwIHdu1Pnbk8BNPPV1vsnh5avft/nTO3KX33Z/04synPdQa6osrHwdAE+48q4bPKfEWU1L/gYJkr7daN27cuHNjbk7uhkvV5n6JPaY/9vDer77z94v4ZMuWTblrc3Jzdn5+PDN9uJOQMrqVxYkmPxUJOK0Fp7/Ozp5Rfc2s12n279vz+psLE3r0eXHm875a9uhXX65avvLoV4fPFF8ym82nT37TtmMcgkE1g3CCsvorwCk9CqTVahYHxALwuFV2cfKkiQe+/Mrf6L9rU86W3M0btubu+OzwyLS0I/t3PP/CzEYetlSVs2H98uUrktNGP/FElha1AkGyMEcWRyBAPAJHcgo5yFxDTe30p6YfOXnCy8frUmnJ80+/0CDqFiz+S6RRnpAx9lTxRdbTf9uWnJzNa3Nzt548+ePg5IFq1sYo+wAUHgmQwjPIUaVCYKYS7SuXLc/btu3Y14crqmpqrl8/W1DQu28ia5e+OPTFyhWr8g8dKqq8jkDwyOGD9CsunSz9ikBNHSmlxcnueSJbmedM77yzaPVHH+kM3rV11+bPfP2r
08Wz3ngtKa7N53v3PvPCTMamRkD90foVSz/8OGNs1sQJYzzI6qyDDviknUdpHSIHHBHGffXlgRUfbjhx6GDRxcsNjaZvvz7SrVeih8b28IiRZecrPDUeW3LW5m1cn7t587ffFaYMTUZrEIEpmwHBG7ilElBiJxt6+Iplyxa8/V5ky9YHDxzc+vG6jblbPs7Z2q9/f6OP965tn6xZn3PsywPFFVX1Jv6H/BNdu3X10ns6Jfj7Ml1UXm648bsFxh6GMYAETdMELvGCbffu3SaTObZzp7DQYKulMbxFaEJ8fLeePeMTEnbt3Ll69cqCH85NmPj4wgXz/HwMZADTwmSE/y8oSWgLm2SvuHCpdZu2ep0nmV8z4OPTrm3bbl0Tyi9UhUdFe+lZZBm8DT56Q2CgMfWBIU4mASdvQFNaFRsXm9Alft++vatWrcw/ffahUZmL3383NNAILVNScv7Ed/lavXd8Qme1Cg413zcxMdDor0zd/GpwYYaAppVfXFMhPKm4VBkRGeWl9/Ax+BkMngZvrwBjUErKwD59+0RFR23N27p+7brSiqpHsyYtmD/H18sThoTQwG8TVRf68C1UAUHBQ1NTCwoK/rFm9a5duzt17b5k6eLunWJRrvTi5fDIlr5eOm8DpOsLIYe3CBswoC/Lqlnodhc+KchzHGrdvveL/ZWXL4eFR8W2b8dbG6HhhwwerGbZosKiU6dOe/kHdEroZJcEs7kxJSWF7sFxiODXA2GpCbANTSyyOk992oiRtbX169et27gh1+Bj/O+5bzz6cIZGo+57f1Joi6Dtn25ZuXJlVWX1tGnPvPbqSwYd+dZjCkLEhSw9VaA+d+7Hb/NPefn5dUIPsUswzMnJg3U6fVlFVVRMjF7vASEavL31eq+WLSMHJPWj2ypISYUOJQggSRpbpb56vUar0/sbjQb0baUNfAxeQ5IH+fv7kS//PVfk7R8Q1ylOjUhS4Pr36++trNo6mHRbCzfc+OXAOKLxBNLE+VUgKf+D485RRgaeMpId583AGaAgTY/Uy6aZtDhNU3/2nxLE7bIsKF4wnGHcrCZLCgxDFnjucOFBEEdnXX4DQFy0mgBNgAFwhXrhL2IrWVlJkpBkNLhDq2xAuzuoDAEnZaQVgo4WoafOx4miiFMKesNtwD0AbVmNhr4C7sgEkEZBJECQ0gQDuBliRD7N+W1A5uuUqpEolTxXVmwbOES2qExhEvMMltFvkAkGHSWbAQoAtApO4SCHJpTqOmQI0Jy7yJAeaRFXIJM+BQna92gOlaprb/wn7Lrhhhv3DufodZ7SQYgjxh6Asec6/O4OlCLaoklf0JzbEgC9wfnQuwK3aewqluw/J7DDHuDjuHgrKNuOk98EtBauFaFpHCE/KC6oYlJXYm4VzXUP3CllifBJdRUgjXyaxlWcOp+IxL20jpOm4/wWPh3NTfMpXB/xm8HJIXk1gAFvNBsgXgsVCTLxF4zfSzuDIK0X+hvNAWjObVW+l8o6ZeU4vxW3EaS4LafZwm644ca9w3UcOceYq9W5Dc3lu8Jpk3B03u98kGvObffcBc7iClAKOSBFdJzzEhJObXsvNH9FOHmgz8UpcBsP8MJxRB7D3JOZcaRccCdN18fR09tucILeQOG8hxbEKb1K8+nReYme/sa4rQfSNARIoljSxCTP5co/wZ11wSlNADTT2Wmp4WwOzoJOUhR3Erwzh8Jtutxw4z8I1GGGAqBaAb8keav6+B2CzHEqjCKl8EpmDlX3YLrcaGpzQLElVN+TUNu1L/zeO8CduJthdMMNN/6doGgtmbzx06TAlFd//hBqC6ZLIpvVHFC0rfPMjWYBGeHTZL2cEnMk6NU/JNymyw03/qOACOuP52IDt8aGiuly454AQd3poNBM+vlDgqwDO5JuuOGGG2648bsHwzD/A5IOjBVVk0d5AAAAAElFTkSuQmCC)", "_____no_output_____" ], [ 
"![ejemplo.jpg](data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAqgAAAEbCAIAAACZbOFNAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAIL/SURBVHhe7Z0HYBTF98evphESelVqAEV6E0IvKkUEVBRQRBRBUX9gb+jfCgpKExVioRfpvShKD016J6B0EmpIvb7/7+zbbDbJXXKBJCS59zEus7NvZt7UN293704vSZKOYRiGYRjfwKD8yzAMwzCMD8CGn2EYhmF8CDb8DMMwDONDsOFnGIZhGB+CDT/DMAzD+BBs+BmGYRjGh2DDzzAMwzA+BBt+hmEYhvEh2PAzDMMwjA9RmA3/qQlD9DItJ5xSorLDHSZnGIZhmHxIDhn+UxNakpGUaTlkjRKfjjVDNGK5bk+jjkZQIPJoFAWyxR0mZxiGYZh8SA4Z/qijkUpIEBnxpTujfmrClxEasWzaU3VvMWSNlxuGGrUHUyC8dg0KZIs7TM4wDMMw+ZCcvtUfHh4u/omcvyqDdV4zZrhs9hWR2+bwSSWQFWHDpkgy24aFKVHZ4Q6TMwzDMEw+JKcNf52nniLLP3xMutv9a5bKd87Dx494Sj5nGIZhGCbPyfGX+2p2I8uvi1iaxvKn2P2nurm9b37q1JohLVtq3wBoqbmlv2aIXl+D7heIPUUNkqDHCeIanYkslLBcNM7lU/VtgrRvIqh4eCMhQ3I1vUigFgZaDlGfbKhFpH3PITU69RmIXGOKBXJ91WvAc73AqTUTNInTtBXIvDEZhmEYn4buZt8pq5XH4brBq6Wo8cqtfIRTSIkLHx/l5roakwFFRM1ei8gr9VqaBwiUTM1WkfRUjkZPLRmSqzFuH1YouaSWoc02pQKqJm5rlCaNF/VKQ0pSD5UEWoUYhmEYHyXHPX6dLiyjz39q1Xx6vP9UN0+Py8MHj18dpRjYVOOlZNFliiYuxXimffQeGal9vdA9YcO2ySkFquUNH/9OFyXoLaKscKiLbGDB06qq1l57y0O53aHT1akpdD41YUBX5bmHnImcjTjXRXRNf/shY73UdyXU/YhQQimUyLQxGYZhGF8mFwx/Rsufld0XBnnblGFdwlKuplrPwyezcYta9aenZGXKxQcM5ED4+Om38eoeCtoGdREK6/JOOlXdWH7V7g/uKRRLaQ7d4BGUiZyNYp7dWWdP9YqitgkL6zJl27aUaznVmAzDMEyhJDcMv8byyx/rU11Uz/6+eCwtnlqnPJdWH+hng8GrU+4BeCxFAR63kj9Mb1bCWRFWsw4FUj6emMHynzp5WD5V7H7qZx8juirVBZ5rnL5eXXrS3YHI4V1riKf3EzI8vs+BxmQYhmEKKbli+FNtn/hYX4q/69nun5rQskaNrsMjvLldf+dozP7qLG8N3A7pLH+qg092X90H3C5dpkSNT3nAEBkZIdv/NK8X5mFjMgzDMAWM3DH8Wss/IOWmuke7n3JHQKc+8s7kDbU7Ri0tl8w+SLX8h0+eSm/3U1Ef0qfBG6XChk3ZJh7tw/4rzRwxvAa9HZC3jckwDMMUNHLJ8Gssv+J4ZuLvq3fC1UfeuceaIfRaHexitt/p8x7NvmfMGOXGvmr3U58NuPmWo2wQFtYF9n8bvH86l5/g52ljMgzDMAWP3DL8Gq9XJpOH6aohjFhKT6tPrVFvxmvQPksnMfksW2jM/u280+c9qZY/IoJK1Pj7KQ/pdZHDB6Q+oFc+fJ/6OX8PiK8E0Hzm/1TUUe2TAy8bk2EYhvFVcs3wp7X8GW5za1ENYURX+at5anR1a6nSiw3I0kamR327Xv0SIJmsjW32CRs2QtGWSNMAXaakfAhQfkGPqFGja4S3j+Uj0QRKOpFKRKVsZbxsTIZhGMZHyWHDH15bCYCwbqrty9TuC0MoP65WznTh4YNXu/uGG+1LbZCiz8SraItWSPF+b5NsJk+ngGqBQXj6BugyZZvmCb0Maj1+dVTG3wVIX6+wYdPTpkyX0MvGZBiGYXwTvSRJSpBhGIZhmMJO7t3qZxiGYRgm38GGn2EYhmF8CDb8DMMwDONDsOFnGIZhGB+CDT/DMAzD+BBs+BmGYRjGh2DDzzAMwzA+BBt+hmEYhvEh2PAzDMMwjA/Bhj9/s2aIXk+/t3v75EQeTH4CPZobPzCRDxBjVZBPqpfXU4enaq5wh81aGHslZww/TVftZM3NthK/T8fTo7Dg4715G9U/NeHLiEx+7bIAI2oWPj5KkqRtw8KybJk8GDld3hkfHrHUBwcnr7GFnJz0+COHj8knIyV/btEK48bRK3y24rnCqVXzI7P41auCStTRSF26397KLbwck+I3Rgup5edZ6cvknOEfPB6b4y/z4g5d2LBtkjSlUK58PoiP92a2qy/sfnjtGspZoSO1alm2TJ6MnLCadXQ+6PPzGlvIyUGPv+awEYM9OP3yjSMFT5tMdzJiU6o+QEjdoabZq4qTFISsyKdrhPg9ehGjiGWpAASQWM1L+9TCbVpItpywhq6QcOZFZEer1OiWE04qcTIe5NO3QObcXibi8pA1qhASqvlk3lYiJm3F0zcdZS0nB5oc3EZqYrNTcZGDRv/MdSbuXBO3wl5XP40yqaT1iiGPTLyvV/rSvShRCGSz6TSo6YCSlDKUr4KU+ot/MVAih9eApHw55QqhKSsl0qNAaqxb5UWk10uE+HntwydT65xKaooha6iclNjU9GlU9FSEJ7KUp9zV2kFGTZJ5N4kYb2elNnXafBXU8kHqZU/Ke4rXopFRRERMqnSqcsBDhqnR2lU0fTWzmRx4kC9wSDmB+L33waslKWp8OAXUGJAaCeRfhk85SSWNjDihJ30kL4c1GWrCWlENWuFMMtcgy6Rc0CQQQTUnTbYiqCnDk1gaMtFKzo5OtPHaUjwUIaLd1EeLptzbzkSrCYU1J5pmc5N52nDarASaqyIHbSo5mCYyVdOsddYiZ5IiL0rU6Owm87Txt6WJiE2R1dZRBFNzSXMpfaFasRTSFSYncaN0mqw0adKVnkYskxKzbDr3aTUFaxCxqaIipSYXjbjmSvqyKOhJQFOsHO9GeW1a4Cm5IK2kijaJEEk5EfGp4qmJMytCRVNWmnw86KAtl8KaE00LuM0nbZ7arARpr8poMk3FfU08VdabSqVJCxk5mCahJqWnDLWZiOiUE20YZDe5J/mCR84afgqJTk4bo20ed80l2lM7eiCjnso5hmvGQpos5KsZxl2aMjLLXCVNj6bLQEWTk3sBIn2BKaRJkz6DlPN0iT2Uk04Td6Wl4iGPbGWSJg/PJ6lo65FWJH2C1HO3DZc+EuJ0KtJlqrMWkYsbFTxlntOaaJKqZSuknnu+oiFdWWnrpSbJVNVU+fQlpD+XSVuEKuJVWhGZoWU8ZEhhjXTqlfTVUfAsoGbksSxNMJPkAveFp4tNq4qbEjMtQiWNVhrc65BW3PNJKtp80oqkT5AhAxHhRgW38d5V1n2lPMd6VlYhNWm6TFLlPaSU8SJ5GtJJFSxy+uN8
4jXY23vJj+7xEeKW39EoipdzjIwMH/+OuydOXaZIq+soKd3chUrBU+ZeoLm1U2N4pBKZES/FsiST95vcF+FlC6SQI5l44g4bwWPd3XZfDunsNvMc0SSnhsRtcgdjPsfIoT7y6qW/O6xvNpN7pVI6sllETo2fnMhnzZCuEeHjp7v5KImnLvZQ2SyVyW7Dus3Q+0yymzynOuVuk9OGXxc2bMRgXcSXXx5Wzr0m/d5JebPk1IQBwyMHDx4cOXyAh4UDI09GDD9Pi4uHzD1x6uRhessI3VxjeJ2UzZ7Y4rnFSzFvqFE73O0jxcyK8KYFZHIkE0/ceSN4qrvH7rtjnQXuMr9zTXJwSKRBaOYRddwKsjnmc4uc6COP3aHlDuvrMbl7K+CVSunIjoY5NX5yJB/PZl/GbRe7q6w3ymSrYT1l6GUm2U2eU52SD8hxw49hQC66cia/HBPRNeU1CBpC6Z138aEZt/cJyOyvnjJlyupMTD8hekuD2nEeM09PRFdl1IoXp93M9TVjvNriZSqmauWpWcRbxJHzV5EeE1pir+wGD0Wka4EsyJFMPJEhc2/mctqeWjNEfnnGm+7T6Cze+/HevnjKPCc00eDdyPFmptD4SOsquhm33qgKvCrRA9lNm9oy2hflRTo5kCluu0OLl/VNh1dLhNhMucHjPPVQu9vTUMG78ZM1tzUrlb71aPZTUbvYq8p6qJT7vvZmzGgy9Ng7mZDd5DnVKXcJZfdyZ6R/CEJ7odQYcV0h/UZQJd3+SSSWo9RcRCaUOrU4TcYgtUA1MyXKTeZpEQKDV6vZpWqpSRk+frz6rCpVBcKDWFrSa6XRXiufmtfg1aRXumhtEZ5aQINW19vNJE19PZ24z1x7QQimSQ3SnoszhdRITcYC+YJGEKiyItpN44sc0hainrnLXHCnmmhEMxs5ac81ebmphUBkm3qJ6qWm0qZxW6/0pXtRIhWhnKTNIKu0mutAU656Ic0IF9GafNIUpc1LI64RcFtfz8qr4kqE2+QgbSFaUlNQD6RJosSnKd9TEam4VTDt+NGSoVR3Jx7zUS8IwTSpQeq5CGnQCoG0VzUX3VfWi0oBTaapOaqRXqyKaS5o5VOrRWQzuZf6FwD0+F+piE8jbuIcHXG3bocyOQU8hC9rR20rjF9rl0KaocrjNpfxsoEx7rrqVnM/MAWEXLjVzzB3iVMTvjzszT3JAo24F+qDXylzdyjEX5PI+DJs+JnCQ9iwbYXa2SfE+7N58x2ZPo94jst2nyl88K1+hmEYhvEh2PAzTD5EOyv16ole+bcAoqmQlFohoJcrRTHKhQJcTYYpCPCtfobJl0guXWHalKMq9Kecpaue5hrDMLkMe/wMkw+heakndxjAUuKoL7jOsEteZ8i7l2tF6w57HgyT97DhZ5h8h3ZOkq0s8IYSHj6qICqjl9RaSLQH0N7h1wQZhskd2PAzTL5D2HmamMr0lCSX02A06vVG+bQg4pSrhQVH1IluZUgul6im3gj7rzfoDWJjADG9Ts83AhgmF+EJxjD5D2HoHZLT4XI5o6Ojb9yMdUl6F3zmAgtsOhx9cvklyaGTnC6X3eW0o6Jnzp5JSEoWex0SpH8Zhsk12PAzTP5DctmS4s6f/W/WzBldu3bbuGmrZDS5hEtcUJGtPlYbuocvuSR7YmLs2f9OThw/vmOHTufOX9BYfTb8DJO78K1+hknF7XRIfcXOAx5SKf/SP9nCkpTQo1uXo8eOx8UnJlpt039f2OOxx8xGvZ8hK1VyGrVqys35DDXNsnGIlLsVkl5yOV22mOiL3bp0vxJzNeFWXILNuffIsQfur2mUXAblVr9JiMplZSwXMem0ykvcapL3ajDMncAeP8OkgnXcIeN0OimMI8LKZXeQAOFyuZAER4neZbtd59Vk9nvv40+Wrftz6g9jjC4pwRAQaJRMhrtgXVApqqDdbpfrJWqEAKBLJJYlBh2MOv5keckQGlJ6zLdjN2ze8PGwgWZJZ5OUhQiXXWnfY6BSCCqUWpi6RhHKK1Ai2oE0wZF0Q1i5zDAFBDb8DJMe1YEzGAxY1rP058gCQYyQY3DA5LrN+YVyW7duU6duvaAAf/i/kl6x+Xlv+aEJzBsCqBeqSXaOArhEMtkC+RgMxsDAwA4dO1WrFmY2GxEDyymqhv+VfxRQilouqaHG4EgyeQnKNRqNOCJMA+NuacIwd4Jyq4phGADrcvr06f3792NeIFyjRo26dev6+fllbuQsFsvu3bvPnTtnNpsh2bRp03vvvTdDEppo3hkJ8V6f0yEZtyyZ0f3pl8YvXj34sY6SwZj37/STe52cnLx169a4uDgYuSJFirRt2xZHNBHqmKGanlDcYqw3cNth/NG8BqP0/SdvvP/VlI2Hjzd5oIYB3r7OKenMlCPyB3JQd/bs2X/++QcBONyVKlVq0qQJmhqnMMPy9TwCTfHXX39du3bNZDJ16NAhNDSUqu91IzBM/oBmF8P4MljQYYdwvHr1as2aNWVzZoCR8/f33759O+IVOQ2IhFGkVEuWLIELC2MAOwQaNGhgtVrlPHEQOJ2QtOMPYSW9O3CVDK3TYXc47TaXc83cnwP0xp+WrBN3uBWpPAX6gGHDhqE1UDUcUc3hw4fDAANUX5GT90lyTcVRidKASMhT7VTQIGM/fNlfZ9h+OMqOHCSnS7KpOcoC4sZ+fHx8/fr1qUcAOmXVqlXUwopoXnHkyJHixYtDByjz/fff22w2aAiUywxTQOCNKuPr0ExAACt4ZGRkVFQUVnac4gjr8vvvv9OpJ2DPFi9eDKcfyclWHTp06J9/9khwX+HdyrYJYk6Xw2azXLlyJcYz2HYIeyjfQxZAhbvtSkILOLhr1qyRG0m4+DguW7YMxljRMQWofevWLaUm7rh69drN2FjYdmSq/mlzyAi1xp49e9CkOEXpOKLBFy5ciITQRJbKU+RmEJsSdCW1BmnFMAUIvtXP+DrySi5A+Lnnnps7dy4CsCtY3HGsVKnSiRMn4GXKsqkoaSTp8uXLTZo0gWEjM0BX33jjrTGjx4iQ+FY6WH+nQa/bvGXzF5+PIlOHIwW0BAYGjh8/vnLlytg/GIwGl87w14KpPfsMHnf3bvVjH7N27dqePXsiQDqjWeD0w/Y/8sgjFEOSsMfjxo1bv349naZDNIzeUKtWra+/HlWkSFBKpHPix6/Lt/pPNHkgLOOtfuqC//3vfz/88INRrj2pUa5cOWwFSpQooZaeNxw9erRly5axsbEIf/zxx59++ikCognyVg2GuVPktYthfBqYE4fDcf369ZIlS2JSwIQ3bdoUAVrQN27cCAtEKAk0d7anT59OSz/SwmYjgOS1a9e5cf2W/AU88hMBl83uSLZaE2/dio3zDCyK1WoV8k6702W3u5yrfv/NT2+Ysni104HMYAmV0nMbqiyAMs8//zxV8P777y9atChqh/DAgQNxCQJKAvl194SEhJs34dW7qWMsan7r1s3YWzb50wEAzYdm+e6joX46/Y5DJ+wuySHiUHElT8igCORWtWpVKhSdIisigNNP+RCUJLc5cuRIsWL
FxLqp040YMUIdA8plhikg8E0qxtfBNMARtmTNmjUwM7AxcCXHjBkDvxaXED9jxgzVulASgmJ+//13hBHo3LkzbCTCyOHff0/t3BUpiZ+mkd+A05tMRn8/v8CQkFDYTk+Ehob6+fnJj9J1ktOqdyVbTGaH3hXqtDglg8txu58OzCaoC6pGATTI6tWrye4OGzascePGFP/XX39dvnxZFldAcxUpUgR2EbVQ6qMhFDUPCSkWGmIyGuW2dEouh16yJBuMkl7y19lQoOQ0OO0G5Yv7U9i8efOlS5dQInL+7rvvzGYzwoifM2cOKYkjxeQxYvch7wvROBTDMAUFHrIMI4D9WLRoEVmRdu3aNWzYsEGDBognI3fjxg0SU0E81v2YmJiNGzdSzBNPPPHII48EBgbCBbRYLMuXLycZ+SKO9OcdLt2erTvXLlnxz669kl63NXLLknVrdx8+6H0Gd4hqzDZs2EBvHmBH0qtXr+7du5O5hTHesmXL7Vlc5OB0OtavX79y2crjJ/6TDPqNGzctW77qv3NnxRf5p+SJpkN4yZIlNpsNgUaNGmHbER4ejks4VRVT5RmG8RZMG4bxcWA//v3333LlymFGwOGePXu2w+GgJ7gAvuzixYthzu3yl7cQdDp58mQIwESVLVsWm4Bbt241adKErGatWrUQQ3eDlTRe47LZurdsFarXiY+sITOzSecf3PfFV7Kd0W0hG2aF3r17o3bgscceQ3337dtHT9bB008/TSY5WyBzpLp162ZYWDUjamaQX2DU++mMgd9NmGSx2hwpd84hefHixapVq6IsdMoPP/wABXAUXSJvTX799Vd0E7iNFr49tLf6P/74Y5SbZ0UzTA7CL/cxvg6mACzcL7/8MnToUISrVKmyefPm8uXLHzhwoG3btklJSYjs378/zAyMDb1iBmjR79atGzxXBJ577rnffvsN8V988cVnn30GW4XwsmXLunbtilR06j2yebS4RCqXy2B06owund5o0AcYjRkzgnoA9m/KlClnz55VYjMF+vj7+7/66qvY62TUDbmhfKgNt75evXo3b95ErVH9fv36oZSHH35469atEMNeZ9euXRUrVqRUyCdjVm5RtxYGvcEFK29E7cRTDDQT6miEo5/yuYZ58+ahUARKly6NQqtVq3bmzBm4/nFxccjn0UcfXbBgAd38x+aMMleRW0VCqpUrV6I8JTYrULuOHTuiRIRpA6cl3ct96GgESJhhChB8q59hxNfUwKdHAGt9eHg4zCECcNnr1q1LkdgKREdHay0BlvuTJ0/u3bsXYdjFXr16UWSXLl2Cg4NlEd38+fPJ/NCp90gGvS7A3+DvZ/IP8Df5BZnMwUZTIGxihpy0maMWwpx6AQyhVcatbohERXCklx4QLlmyZPv27VF9Pz+/nj17or6IjImJ2bBhA7KCJFASewHSIgdkZTKb/fz9zSaTn9mEf/xMRiM2SSky8O/RKaRJ06ZNK1eujPgKFSqgg6gjdu7ciY1OJqUjHm3ifbMA7POQkCpFmTBMIUSeNQzju2CVP3bsWLFixWBjYI3oPj/x5ZdfYo7AzMBQzZ07V3urH6nGjRtHFqh69epwjmE2bDZbYmJikyZNII9LsJc3btxQTaP3iDffJKcDhUj4TxwQJf4y5ETmCkUACqinmYCqoS4AwkpGGiCAeIvF0q1bN7QJwLaGPm4Ajh8/jp0Naod47HIQA2GAVEr6rFHrI/5c4id6lT9RXadSBTQpHH2UgrImT54MbUkBhKl0QPf/3RZN1SQowyyBGAnTN/MoGWngW/1M4YA9fsZHUWaAzNq1a+HaIhAUFATbv2nTJrj4ICAgAKeIhxlYuHChLKsA13Dp0qWUVZkyZY4ePYpUW7du3bVrF3xTkoHV/+OPPygMSNgb5N+u1+NPL+E/vfh4u15y6l1ShpvKZP8QUPOnU4r3BARoy5IJZ86cgUtN2VatWnXbtm1btmxBm1y+fBnGT66QhJjo6GgyfpQqmyAVKkd/VDnxOQhoiAzRdNevX0fYZDKFhISgrI0bN+IIIfpaBcigC2D4PZUu1zXr1tCCrFAdNM7t1ohhCgIY3wzjg2B9J/cOrm3Lli1p3VdmRQqqZwlKlChx4cIFJCH27NkTGBhIlxRpGZySu0/hJ554glxVsk/eAktKf27O0oBaoAoqKAgxQLnsAZKEX4ujEqUBV6HtpEmTqAqEXDkFnKKCxI8//gjh7NUulZSaQeGUIOkGOnfujPzTFQ1kdYQCCAcHBx87dgxJlPw0oBGQSXJyMo5uBdxCdZfLd9My7PEzhQN+uY/xUbBkUyAqKqpu3bpY6GFOatSoQb/+QsC6xMfHnzt3jqZJREQEfZsNwupLfAEBAdWqVZPFBYiB8KVLl+DuIzm2C/v27StfvjzCiCdzlYOgFigRR5irt99++/jx48qFTIEmQUFB48aNgyuPbYoSmwKuJiUl9erVa/369ci8bNmy9L1G1AgAkSdPnoSNRLkdOnRYs2YN6gUQTwJ3AkpBtmjA2rVr0xP3sLAwbaegFMSfOXOG9BkzZsywYcMyvtyHTMCyZct++uknSHqp27MyCFCNKFKFX+5jCgds+BkfBSOfBv/IkSOxiCNQsWLFnTt3FilShK5iQccRXn7jxo2xLUBkp06dVq9ejXj4yq1bt4bTD3uDrQBsD65SKkoIP3jEiBEIwHhMmTJlwIABZEUy2pI7BMXREWZ45cqVMTExpDaOJOAJaNKjRw96iK5EpYDkp06datCgAewrLO5vv/3WtWtXxKvKo/r9+/f/448/kLZ48eIbN2584IEHEM6Y1W1AdcEe67XXXkOG2HNs3bpV1ZOuXrt2rUWLFrdu3UJko0aNduzY4Xb7Ag4dOhQZGel9s9evX5++tBFkzJMNP1NIoOnBML4G7AfMOWxbw4YNsXaD119/HSaNPEVAHq3VaoW9x0yBQEhISFRUFOL37t0LW4IY2IZt27aRPIGr4NixYxDGVcjAaiYnJyPS7d3jO0cpOJtAHzoquWhA/Ndff03KlylTBvbVLr8GKKcTIEwfbqRmGT16NKqGeCX9nYHM0eYdOnSgFu7Xr1+60lEW6N27NwQA9l4HDx5UEmsgYW1Cb6CK0FHJSEPGr+yl/JXLDFNAyGH/g2EKFidPnjx9+jTsB1zbXr16wdJowQyBXenTpw8EcJqQkPDXX39hoSe/H8lr1KjRuHFjWVZBNkaGKlWqwDVEcsTs37//woULSIUwFZqzyMVmG6oRjkouGqD2unXrcES4R48eQUFBlEQFl9q3b1+hQgWEIQNhmGo5ac5w9uxZ+NZoMTQ+fYMQIuWSBWRo+/btS8rD9K5cuVJOlwYSpmp6D2136KhkpAHlqu1Gv89LkXSVYQoKRvXryRjGp8B6jYUbR3i0991336BBg7p16wZLo9pCdUGvVq1aYGDgPffc07Bhw0cffbRixYr0Llvz5s0//PBD2Hg1iQosR926dWGiHnjgAewMOnXqBPOJ+IyS+RBqk1KlSrVp0+add94JDQ0ltVVbiHoVKVIEm56AgABUs2PHjvQJRrfGMr
ugdOQfGxtbvXp1WHcYfnrATzpQp6AgdEdwcHD58uXr1av32GOPoRdE4lwGw8DhcFSqVAm7umeffVb9ecAcqTjD5Bli4VOCDONLYORjEYc5oVUbpwBhGDD1FAHEUADWSDXbiFHXesRju0BhFQjAE4U8SaIgylnNIT8DnaEwjlCYao0wjqryuEoBVApQHQmKvxPo5rk2N9KEThEmqHkRiTDk/fz8SDhXobJwRBilI0AK0JhhmIICG36GYRiG8SEKgP/BMAzDMExOwYafYRiGYXwINvwMwzAM40Ow4WcYhmEYH4INP8MwDMP4EGz4GYZhGMaHYMPPMAzDMD4EG36GYRiG8SHY8DMMwzCMD8GGn2EYhmF8CDb8DMMwDONDsOFnGIZhGB+CDT/DMAzD+BBs+BmGYRjGh2DDzzAMwzA+BBt+hmEYhvEh2PAzDMMwjA/Bhp9hGIZhfAg2/AzDMAzjQ7DhZxiGYRgfgg0/wzAMw/gQbPgZhmEYxodgw88wDMMwPgQbfoZhGIbxIdjwMwzDMIwPwYafYRiGYXwINvwMwzAM40Ow4WcYhmEYH4INP8MwDMP4EGz4GYZhGMaHYMPPMAzDMD4EG36GYRiG8SHY8DMMwzCMD8GGn2EYhmF8CDb8DMMwDONDsOFnGIZhGB+CDT/DMAzD+BBs+BmGYRjGh2DDzzAMwzA+BBt+hmEYhvEh2PAzDMMwjA/Bhp9hGIZhfAg2/AzDMAzjQ7DhZxiGYRgfgg0/wzAMw/gQbPgZhmEYxodgw88wDMMwPgQbfoZhGIbxIdjwMwzDMIwPwYafYRiGYXwINvwMwzAM40Ow4WcYhmEYH4INP8MwDMP4EGz4GYZhGMaHYMPPMAzDMD4EG36GYRiG8SHY8DMMwzCMD8GGn2EYhmF8CDb8DMMwDONDsOFnGIZhGB+CDT/DMAzD+BBs+BmGYRjGh2DDzzAMwzA+BBt+hmEYhvEh2PAzDMMwjA/Bhp9hGIZhfAg2/AzDMAzjQ7DhZxiGYRgfgg0/wzAMw/gQbPgZhmEYxodgw88wDMMwPgQbfoZhGIbxIdjwMwzDMIwPwYafYRiGYXwINvwMwzAM40PoJUlSggzDZEBMD5oiegQQksS/4h89Jo9e55L0RlxDJMMwTIGAPX6GyQKYeZckOV2SQxztOmey02m3SDqXzq7TJTkUKYZhmIIBG36GcYPw6eWAHr695DLoXUad0yg5hWfvklwuPS7oxfSBu88wDFOQYMPPMJkg6SQYeYdecuokh85lczmlPTt2P9l7oB3evmSUdGb8r+wRGIZhCgL8jJ9h3KC6+zD8ksup0zmPHz707ZhxR4+dOLVvv7VotbMxUcXMkmTQ68XTfvkNAIZhmIIAe/wMkwViEwDrbjSHFC/23LPP1KlVzSmZsGHGtsApDL6LXX6GYQoQ7PEzjBtSPH4RhM+P/3HQwbd3WJ/p1HblvsQL148UM7msBpOfZNXrzTo976EZhikY8GpVsIFFoq0bBQjtJUCnPoJSZ88ocllJ4jJJuPCvXu/SGfQGI6y7XueEi+8yiIzEFfEvPH7GW+RGFb1AAUJ7CdApk3soDa2BItUjU7hhj79gg+67ceNGfHw8AvKjZoHRaKxYsSICFKPG+wJoB6fTee3aNavVSqeovsvlMpvN5cuXR9hgUDa7EMNVXEKMGqkBF5FW/GMwGJFOXhRdenvC0x06rTh47VLMv0UDnAYXNgROnfxRfkrGZA7aMzY29tatW3LzKo2G9seIxSnFqPFMboCWx+yIjo5GO2t7oWTJkgEBAegL0Q3cBYUaNvwFG3Tf999/P2rUKJPJ5HA4MGlhyapWrfr333/7+/tDwNcmMKqPRW3AgAHbtm3Trl/t27f/7bffcErrGkkCBChSlkoFDStPDUouGYzyx/awStrjn+740IoDVy7F/Fc00GlwwvBLfJ/fe9Cq06ZN+/DDD7UjtlSpUrt27YLVgYDaZUwugS7YuXPnE088gcaXB7nAz88P/dKqVSvEuJ0RTGGCDX/BBt339ddfR0REfPDBB3BqabpiAcWsht9PMj61kpIfv2LFCriVSpRON3LkyLp16y5YsABNQSASYhCeMGHCsWPHKEYLpoVT0ptNpjp167788mCTSTH8BnvCUx07rTwQcynmTNFAh8Fp1BnYVmUDNPtPP/2EHoHtDwwMpBGLsfr0009jK0Ay3J65Crrg3Llz69evR7OjqXF65MgRdApmTZs2bXBK8Yo0Uxhhw1+wQfeNHj0aJm3Tpk1BQUEUSZNW7VmfmsNqrbXV79SpU3Bw8OLFi2Fm1NaAADh48ODNmzcpJi16l0iqL168WL16dYW0+MYencGe9FTHjqsORF+OPlskwGbQmVwS7JYh/zcxaqAN4Gi325cuXVqlSpUHH3yQLuXBUEG5kydPnjRp0tatW4sVK0aRVK6qYR6o4cuo7UzgNDIysnPnzitXrmzdujU1Pnv8hRvu3YKNdolEmHB76iModZbvVZKZJ+gSyRA4hUCDBg3au6Ndu3Yd2rVt37ZNg3r1sDvWSS6dy6GTnCKINVGPpVO8HyC+yLeANDDqC4WdTmdsbCwcvlWrVr388ssDBw48ceKEIpEniM7IMGi1YUCnTC6htLKMekqXaMrgSKdMYYU7mGHcgIUQxl3+c4k3+SWHQed0Oqw6lwv7AMTb7S6HyyF+paeAvNYHq4813eFwDBkyJDw8/Mknn5w+fbrFYlEuMwzjM7DhL+RIGrSndBUI11X+o39TT2U3N0UUYfEn/0PCOY8oKx1yqakFUlDEpwTlv/RnOYmcp3jdSTp96uQ9Fe4tWbr8psjdluT4SvfcW7Z0ud27dstNUgAgrw7HVq1aDRo0aP78+T169KBLBQsaGgLsZuhLFuRw2v6nEzUqNayGxElBgOqnRURqa5ESovmiiUm54OaP8WnY8BdylKVC/twavfgGlGvpEItGypfQiQPCDlpH5CTia2sRzKUPrcMTJd2gJyG0Fd+VK8pXhFQoLvVK+vOcQXj9ep3eIOnFd/KXKnPv+ImTJk2JGD99xoxpP0z5ccyPk36qUrWKSVcwbvbTLVx/f/9hw4Z9/vnn3bt3L1WqlHKt4EAjhIaK3eUSv5coXsMUPZ92BKhnFKEZ2wUNmg6oL50igKmReq5B0jnEtNVeUGqNPwTEV1CknDI+DRv+Qg75eUCslDJ0qpJitGg5UByGFNwMj1wycupjRVVhBPR6o/wLeClqpNx8h4T2T0POjefUAsT/aJXg4KJPP92nX79+ffF/v77PPCP+LVOmjCLP5DlG9LekwxARY0DuLM0WjIaNMnjkcQNZEZZFaLQXDOi5OwIw/xSDU9REmQ3iXFRcUuaFCJKYHKm2CAIikXLG+DZiJjCFGywTWDLgJdjtdpvN5pB9a+WavAKmLBYUlhFR8qJKS46IMcifVk9dSHIWUs9qteIIbcXSRlsBVVOxJaE/Okv9kw+0ruUWUAZot00IQ0kc1eWYyRuo2TFOHOI9C4dL4/3SCFHHhPIn/68ZucqAl8d8AYBqhyPmCE0Qh8MB3ZX6iKvqnwy2A3QiZ
rN2vrLhZxTY8BdyYJaOHDnyySeftG3btmTJksHBwZ06dVq7di3iAayW5HJKLofkdIhX1J20Xoj1EwfxbbXi1qAIi0+z6QxwrtKtJXcCFMASdv369V9//fXJJ5+sUqUK1KtVq9aoUaMsFousgVy4UAyrO1Z46AqV5ZufTofQXH4yIF62g+YQc7i9A3qniN2PXk8fblahfQmO6vclMLmHGAUycXFx06dP7927d9WqVQMCAsOq1/i/T/7PZrXIw9ipx0gWW1yMFHloi+GgJNQg4p0O8dhLyT1/Az337ds3fPjw8PDwYjIPP/zw3j175LqIOYE9s5jC4lMnBsllEBXEjHFhMojKyg2A/7H7lxwO0Tb4T8ma8VXY8BdyEhMTBw4cGBERERIS8uKLL3bt2nX79u1Dh75KH+LCooCFUph38ajU4DKYZPdJGHw5HsjWljYD8hqU8pgwB4D5xJo0ZcqU//3vfxcvXsSOZNCgQVjZv/zyy19++QUqyJYfxYlv1ZX9anGUl2vhrmGlc9qsTic2D9i1YP0Tl7CuyXmDnNQTIKC18YgRNwFklCgm15D7F0PRNWPGjJdffvncuXNt2rQZOnRostU6evToSZO+F4ZfgvcvDD96hsYJdoQOGg8YRRCAtZftPXa0NhFfMNzf+Pj4Xr16LV68GCZ/yJAhzZs337ZtW//nBkRHXxYTFDYdG16n3eWwuyQDqka7HTGjXXYcxfwVU1e0gV7MIPkkh6YGU0DhNauQg7WyT58+W7ZsWbly5bhx4+bNm9e+ffvzFy5s3RaJ5UFI4B+n42pM9KFjxxwwp8oNbZdBL+EPq4nNZt2yZatNtahiTcmZVQMFwWpWrFhx2bJlf/7559SpU7///vsvvvgCOi9cuNBut+mwcunhxyDCtX37DqtNLOOSvF2QNdTFxFxeuWLF3xs2Xr8eR7dyyULLVp8pPFC3ot+LFy++dOnSv/76a9asWWMnjP/m29EGk+n3efMwTsWLGA7HoQP7ly1bunv3LrvNjjRICMsnBpLkiIm+tGr1yrXr/rx4OUZv8hP72YKAzWbDXgfGfsWKFZjCmBr16tWLOn16x65/xI9GSZIRc8Llirl4Mer0WYfy3B87Y6cBfr/dunPnzqVLlx04cNiGOSzv4bFVVrJmfBU2/Fkj5oryR1tpYXWE2yDvtW/FJ9AWm2TyCUIjmaJFi7722mvVqlWDt4pTf3//unXrGg16ePQQg73fs2fPTz/82Lplm1m/L3Fi1ZBBBsmJ8ccOH1i6dMnj3Xs8//xL4hPrcsZyJSl8p5DT/Mwzz2AvAj2xrOO0ZcuWAQEBdrsd5ViS448fObzg97ldO3fu02+ATm80in0JNIern/zpiI/rPFDvlaFDn3ryyftqhX3w4ecWq3j2CfXlnspP/cHcMeLWisHw1FNPPfLIIxgtCGMz0PTBB4ODQzBanA7b1ejLj3bp1qH9w4MGDerYsX3V6rWWr1zvwj7RoHPaE78ZObJevYYvDR7cp1+/Bx6o/eab7yZZbLCDyowW8ze3hgzNKUKJ8owip6FEiRLvvvsu9seYvDDdoaGhjRo1cqFODhvEYy5d2L3rnzFfjWzeNHzJsrUuNIoBfr94frd/z+7WrVp37vbokJdead2iebMHm23f+Y9eZ9JhR89zw7dhw58tYOphkGwOR1JcQtzxI/tGffrhc4OHWRziczL5ZyZhsaD7onQPHEskWX0EcPXcuXN6lyOkeDAuj3j/7V69+/485eezFy5bjMGwqGa9y2g04dK2v9c80rHTJ198dnTHLps92GqgL6uHr4CK5ozhB9AqxUcXN9IRPnPmDFyc0qVL+5lNkX+vf6j9w19/9sXebVts+lC7wein15swavXWZSsXjP1x8hsffL5798E1KxY+2KDy2O/G/xgxXXb1hYsHn0d9/4kpBGAkY7RgkFAAYDSeO3s+OclZrnRZP7Pjgw/fPfLflelzluw/8M/kyZ+5pMRBL7194OhZp2T7a/2Cb76LGDz0nb17965ctapNk/umTfpp9LhfLMjEBRsKCyrmby49+oaq+/btg6c+f/78BZkyb9685cuXx8bGCt88BUwKQJMXAUzq06dPBxgNpUsW07ksQ196vnffAfNmzIq5YXHqA0zyW4vYItsSb73/3vuxUsDva1bs27l/zP+9c+bSsQEvDrt8/oZd3BthfBo2/NkGU3HRokVt2rTt3OGhUV99Gx+XoFzIf2CZgLbR0dFYcb744ovevXvDt163bl1I0dCaNe836I2ffPHJ5i2b58ybLi8FkviVeext4AAZjM1btflry8aNGzbcV62aXjwypRzlY86BRTwpKWnDhg0TJ06Eo9a2bds333wTS1vDhg2x1DVv23bD1i1/bdpQp0YVmHID2h6LvXhMq1+2dFX3Hj3eeHtYmfKlmrcM/ynix9KhrsULV9x02F06hwmLuVj+lFKYQoBs/vQWi2Xbtm2TJk16+eUhHdt3HPa/1602S4OGDW7cuLlx87YxY7/p+Ej78uUrPtv3xdFffnHzxr9z58x06fxXrNjS/qGHPxrxbvkK5Zs3bzRj1qTSJQKXLl0VG58Muyr+5DvkGGDyvjbniYyMnD59+qxZs2ZkCgTmzp17/fp11JQSIoBNwNmzZ2fPnv1///d/PXv2bNeu3c6dO4OKBNe6r7ZObxo3ceKWbVu+Gf252YgayDUR00R/Our04cPHf5j8Y5smLcrfU+7FN98d9lKfK6cP/rVzr1Pvz3PDx2HDn00ws1z6sOo13hg+fM6caZXuKQPLKpxMhXy0k4ZNhevw+uuvN27c+KWXXpo5cyaWhdDQUKvVWqZcuXvvrQzLWrFipXvvvScwwIx1z6CzKwuf+NIaY3Bo8arVaxULCfETl3RYVWhRhD+hqe8dAQ2xzDVp0qRXr16fffbZxYsXq1WrFh8fD68OOkPALzi0Ws37ioUW94eT73JBPReUMZkkyTz8zbeHv/kGomHgsc5VqFipwQM1Tv3339XEJHmHgAUQ5KPuYO4QjJZly5Y1a9asW7dun3zyyZkzZ6pVq56ckCC5HI2aNClZsswXX37ZufNDeqPO6dTpneYHGzcJ9nNevnBG0htffX34m+/+z89P3PTXS/bipUo0alL7/Pn/btyKtTtdMJ/yh1dpdOf8mMESMWTIkMWLF8OnX5IV2B9UqVJFTXj58uUXX3wR0wE5wN8oUqQITeFKVaoWL1XaJemqhIVVKF/O389kFO/nCqsvXtLR6++rW3/chO/q1b7PqNe7zJLObH6sfUez3rb7yD7x8IMNv29TmA2/MMkyCKjAm6RIoMhlAz28ZIPR1Khxs/7PPluvQR0/cWeN8hFb9PwzmWDj7Xb7k08+CTfi6aefPnLkyIkTJ7DuvPDCC6h47ftrBxctKjv5QmujQVh88UmglNTyV50YDCaz+DlaSZJ/LVVeEKmSd1ZPanzw66+/vvzyy2XLll2+fHl0dPTatWvHjx9frFgxrG4NGjRAt+lM/g65YY1YyiSnbMzFnU+j0a9J4xYNGzQymsRX/EBDS+KtqPNX761StmSgv0tn
ckJ38QX7Ob+IF2jQ5nAfAQKYC2hJRCJAp4hXwySff4BWGL39+/cPDg6GdYyJiVm7dh1GS8lSJQ0m/YPNmxvNQf36PhMcFIhdq9Gg15vsV6LP2+1ShQqVUcn76zRo0rQxpj6soMkg2ZOl4ydOV6hYLqRoiMFgxLQWu10xrnNqT5sGbGSB2WwOCAgwZQXEIEw39m/cuNG7d29sd4YNG3b69OlDhw7NmjXr8ccfx6VmzZqLHbB4ACc+aGoQs4OeVGBKI2Aw+Rfp3bdvSJC/XmdywfQ7bx07ejrRYK7zQE0/eU7lATScaLyp5MboomzVgtRFXrl8xyBzkGWeWjEtajyOiujdppAbfixttJwRiFS7gVa9bII0Br34rjAxNcUzQWQmzCQBn0F2G/IHJ0+e/Oeff1q0aDFmzJiKFSsixmazwcSiQZo0aYzVUewOdAYjVj1J+PQY0YruaBixfMINQshlEN94hiuptbzDOqI4HKEMnBuEf/nll5YtW9KLS9u3b79w4UL58uXLlStHfonQEVqIoEiLE6ErtJYtPrwcg8tulHSTv5984UrygBf7FQswoT9sQtglp2BSEeNekmA8Rsjs3LkTzbtw4cKPP/74o48+wu6QBICSIN+ApXzq1KkWi2XKlClt27aFgcQgPLDvwH//ni5TtkSFiuV1OhNsuuhx8Xl+R1JS7Jdfjw4pXvHZ/s/IO00xMc2w+VgNnK7fpsw6d+n6032fKFYkULy6Iu/dc3W0yK8iZl0CZFRJ9ALmL/qoe/fu6CzsjxGDWbNq1SoINKhfV1YZfwYxT5W8xbM98UE+g96FHQ7msMhIhzly7fzJsd//Wqtxq4fbtzS5xHOwvEFr9nAKzRGmSzkLFaQWh8XEmwb3EmR76tSp8+fPK+eegbuFI+QBmR41QJDYXQeLaKGFGhpLxsiRIzt16tRBZvjw4YmJiXcwJpBQ/hNdKO/+lKzoiMj80rVWqxUDDub/jz/+uHXrFpb1t9566/fff/fz86tT7wGhPFSmN4ZEdYQrjbkiDwmlLvTgX/6DERUGV+xxyDm6M9AvmA+kIRZ0uPsXL17EPuD1119HZOPGjeHEoESTJMHXR6kOg1ibRcniFoV4FxuZGHQuk2Q1SrZ5U6d9M2Haa2++OqB/H3h7Zp3OLKEuWBCpNEaABscRLb9+/fpvvvlm9OjRhw8fRgwMyddff42Yc+fOQQYCObhi5iCwedDtt99+w2i5dOnS7Dmzhwx+OSExoWmzxnrh+Bp1EsaMy2Sw2qyJr//v450HTv40bXK1GpX8DHoDouWPt2HbuOD3hZ99OfaFl19/7ZVBQSKFHWZCHuT5DswF9Ahs/+bNm2NjY/fv3z9kyBBMZ/RiePOmmBOin8TtChzJ/RDrkvifpjd2x2gQlyXuasyLg99JMAVP/mn8vWVK6fTywpX7oL/AunXr2rdvP2PGDIqhSzkLlpGxY8eqizwWuoSEhBwsKykpCS2/evXqLKcGBNBlGKtbt2799ddf165di7TYzJEy+WhmiZ4ppGA0YP+FXXORIkWU2spNv2zZMuwGgCInGyGAHnJLovhLxtEu7iBBWPxhQ3HrypF6lUu36/x0nM3ulOxO8Q0hdvlanoJVG5aSBjqBuqDi165dq169OuprNpsDAwNDQkIeeOCBoKCgUqVKnT17OslldzqcLnsyjmeO7QvS61/7ZNQtF9whh/jtE/znTMbR6Yjr2uCBKhXqxFkgjRa1yHVUCro90PLICMc33ngDUwIEBAQEBwdXqFDhnnvuwem4ceOEDFrbYXPZHZIlrkPDWqUq1o3DMuh02VxiuyAUdFiTk66O+vT9ikVDfp62MNmWkOywWJyogxWVs0BW7liU2LFjx8cee0ykyvPeyT+g7mhVDIzk5GSMlngZNRAXFwczQ12T2630008/1a5d+8aNG94XhJX0gw8+oHvgGC1FixYtU6ZMtcpVMbz/78tP7BizDpckRqgtJibqiZ5d69d7cNf+Y/FOVxLWAJsDQ8ZqT7YkJ4z9alSp4sV/Gj85yepIFqPQ5rQluhCUv/oHWYgPwuUD0DLQDrt2zAjUEZt1TOHixYuj3dAIpUuXSoi7acdsxATBwencuGRWiMH0xbcznHbEJCRjV41udCJsPX10X3jtGuHtOp+8HOO0WRx2h9VpV9sdBQFYKSySGzZsoFPl2h2DrDCiIiIisPCOGjUKYYARqFz2DKkBlHMPKELyl5MWK1ZMXksM9Lhk5syZKEsrRmHvUVPhOGfOnBo1amBnrOapRRUDENi0aVPdunXRZVAD+lSqVAnuFgZwvlp/CrnHjzmzcuVKGG90AGJwiuPcuXPVMIH+uHnzZv/+/fu4o2+ffv36Pvf0U8+uX79RvZUvvjbDIFnNdp0rGK6lU2cRnwfKle3s7YDaYZmYP3/+l19++f77748ZM2b58uXbtm2bPXv2zz//XL78PWbxfX0O6KuXrORB+4m9i87i1OucDr2w7nqMZeylE3TwlvRGh04si+Ln0IQTfifQ8o3jp59+OnHixI8++uizzz7D7vjAgQNLly6FZ9C7d28xNuHkoR5QziE59H7ie1icepckfwsJ9iY2p91qef/TzyfNmP/jgmUD+/fyM/qb9Ca4fhJ8fqPeT745wKhgSKDNsSzCcGKVx04LqAGYUixV1DXaqZFPgFbvvffejz/++Mknn2DYwJYcPHhw0dIlM2fNfv7ZgRipksuikyxnoo73fXrwjXj98lVL6j9QM0CS/MUjcEmnT5Zs1i+//u6bn36dNHXmcy+/aDaI+0loFZ1BfD7UoJNfHZV/90cp8q4ixr8kVa1adcGCBZ9//vmHH3743XffrVmzJjIyct68edOmTfcLCoGvb5Kf8evtFknv56+TzHB19DrYVZNTMohv7LUfPXy0fY/n7mnUZvHiBZVLl9QbzeI1JfkZXh6AgUQLL6FOfOXcA2QgcQRKlGfI1sKXu3XrlrxmiOQUgwA1I0lmFyQkNbBRHjt27KBBg8qXL+92apAkBf79998BAwacPn2aXsb8v//7v8TExMGDB2PthcxtK5PzQJXCCvoMJr9BgwboLYABR2OudOnSV65coW4gMFAgif11VFTUqQwg7mTU6RMnT127flPes8l/dunWtSP3VS/W/uHnE60Oi5QIdxQ+h7iYt7j1+Gn0Y3MN1ACgMB2t1mSHNQkez+lj+wMN+rdGfHrTKSU4XMJFssNTSnI4LC7rjVYNG1WuUDs+wQ5/OtkWLzyKHHKKoCc0wV5YqyGgSDHlnFbJYnPF32zbtF7ZivffTJLsjiT4LDZLUuLNK2+89nKdevW27dmb7BC3duwOpw1/8t4kxdenI3v8+Yvb8PhpPBPq2BZ3heTbUHDc7ZaEf08cbtWsYd+n+1yOvm61OqxiWEFSCCfFXfnwrf/dX7v2xu27YuH7i9/3ETcCLOK2l0iPEjCpMa7zyfhAy4iqpswLNUxHecpQ1ewOGyZp0vqli0oYjaO++dnqdFjEb11hksTv3L7+/rBab334aXRsnMXhwuzA1h61xp9aTTFDcs3jB8hN9fjpNMv80R9
LliwJDw+HVkqUB5AVGiQ+Pr558+bYUmBTq+4qYKSxdFNxQEmQHZBKbns7nKXq1atHR0cj7DYrRIpBJN9gHjZsGGzNiBEjLBYLugEx06dPh1YPPfQQdgDQVklztynkHj/cgkOHDqEnMCbefvtt9BDi4dyvXbsWV0mMgLuD/XU1d1SvXq1KlUo4hIYG64VvjExcOjj58Cvh9+sscqQJUfLL8PnCW0Lt5N2OeOCEI7lxiKQWQABBRK1fv370N2OmRPyCJS9yy9axY8bPmT1bPBzU6a9euTJ61KjRo8edj74cm3BzzLfgu+uxcXZXTn7fJ/SAbrI+AsRATxwxVXB682rMt6O/GTd27IXL0ZakW2O+/vab0WOvXYs1mJwTv//6l59/CfIv8cN3PwzoJ34tt98z/QcPGXrj5i3UULwXIJ565jvPlbltaOgCGicYNmL/7aRvlNLBbvft1//AwSPYzr45/PX+zz7bt2+/Pn2fGTfxe0yA3yZH/PLT5ACz4afvv39hwMDn+g945tn+A196Oeb6DYdO75IwTsRfvhouqCbNBe2RIsVR3KJwYeasWbX6u2/Hzl+8zCJJG/5e+c033y1dvNqgl6KvnB700qCzZy9ePn/u9SEvozmefeYZTJJZc+dm7UffVVDB69ev7969GxZdifIAbC2EscLT2yrYu7z00ktk+69evbp582YaKrcH0gI4hHD3X3nllZIlS1IXuIUuwdgvWLAAe46BAwequ5BevXo1adIkMjLy/PnzGLSy+N2nMBt+gG5A52HJqFOnzqBBg0JCQhCJ4bJw4UIc6RJAf6CTgKeuNRnRUC5x0Inb3bGx148fP3Lq5Fm7TRcff+3YkaMnT/5rczrE19xkyICKAPJA8hY1CY5KRl6DWpClB2azWY2hp18IY1RTRffs2z9z5pyVq/+oXqPm9StX5s2cuXTJUvHMQq/HrJs9e+6MWfMCi4aUq1B63tzZM2fMvHH9ltFsvv3JlAFoBdDyUAxHhElV+ZI+KfHW9Dmzp8/53T84pGyZ4gt/nz1z5sybN2OhP5a2SlWqxd6K27dnz+H9+w8cPLh///7jJ06gyZA2BzW8PajjMMaoE71EdHkKSkaMDI2KdEexsoq1FdtUA1z/YiVK3Ful6pGjxw4c2Lv/wN7Dhw4cOnTw3Lnz8u87GCreU8mWbDuw95/D+/cdPLB//759x46dSLaJb4YWAx4lUEn5AwwGuWrK7XGgDZvNJjHGMV6wX4/c/tuMmVt37qpco+bZcydmzZi68e9NuOBwWEKLF6tcteo/u7Yf2PfPoQOi1qh3TMzV3BhbUBjeLQ14+LWw3NeuXaPf2AQkoAYgBmJjY0+cOHHs2DEEEEnDnkhOToYkAhBDbgBhUUxa0CBIuGrVqoSEBAjASXv99dfLli2LsYFTLP42+YVQSNIxHYgEVAqFtUAA+SOTuLg47CeoCxAPeUACKnIK6ezZs5cuXapXrx69mYF4pAoKCmrRogW8f3ih6VLdTUjjQglGQ82aNVFH9MFnn32G00cffZROMTgw5uThJwaWksAT6Gbxh8EBWavLZV0wf45Bryti0AdiAprMJoO5XKVKx8+dx8hXkmiQx624d00Tw0tImNIqGXkg463+LEH2DrsVfzab/LKe0+5y2i0OZ6LDmYwaClWtdnuyw57sFPfOxU/24p8kK9KIl4bcVDJXQPXjbRb6zfU4obNDsrtsyaJhkp2OG1APMeKneJ1JaGCreGsJ6kmY6/LdfnHblqZo3t/qpx5UOxFHb8AgASSvZFQYuY1b/ZmDfNDKdrvNYkmWh6rN6bJj0Ngwnm12G7rAmuCyJ7ocVvE+q/yuCoZCslOKtzsxZtDc8iSXB4ySZX4H0xTjxW6z4iB+Z1f+FV6rHaMH1tbisGP4Xbfbr2LGilGP/8UzDQfGlsXpTLDTb3AK0HTgzm/1IxVGO8zbhQsXYCbRv/fdd1/btm0nTJgwZswYuBwjR46kzDG2YdcXLVqEVcvPzw+msVGjRhEREVAbOcydO7dz5861atWC0UV8p06dcPree++5XQaR261bt+rXry+bMt1HH32EHHr06EFGFzU6c+YMikNaSCppNCBeTFH5AYqYfmlBJLYvzZo1+/7775UoORJQKjVP5ZrTOWvWLJQ7dOhQyOCUrkLshx9+gEoffPCBGnnXEbuYwkpkZCS2YGjxgIAAjAZ/f/+uXbviFNW+cuXK5s2bSYxGSaZAQP6jT7S59B07ddp/cN/OAwd37T2098A/Bw7s+2vdn2XLlIIfoqTQgJGNvveilDRAHqOHAhSTo4iP+uAfMQTQIHoDfCaj5PLXucSbTkYDzsSLTgazTv7pL/FLXy6X2SjaTg//SHaR8gADGs9odEh6p/iklugDdIH8CyN+OlewTu8HtZwG7KL90ErQDt2D/ZnoJCgtPux01/bX0IegMEVmCXoD6x0FKIbxErQXxorBaHK4xKdScYohLj7biSmJ9Vlvlgz+Tp0J6y6E5f6QTDrJLLkMklP8toPIIK+G9R0jdBXfpSnqq4wuMdYlg5gENpMJ1cF5UaOxiMGofeYl1i9sc8zejsfsgYUuMTGxb9++U6dOjYqKgtMFx/eLL76AZ4IFUBGSp8O0adMGDhwIGRyHDRsWExPzxhtv/P777xDDxjcpKQkWF5LYHyAMYN0pbTowTY7IIIw9RM+ePZE5fcERAlBm9erVmay9iM9koiHh4sWLsZV58sknlSgZGG8c6QaACpVCzybuvffedFdLly6NgrBXUM7zAWn0KwSgfQm129AfTZo0qV69OjqjQ4cOpUqVov5esmQJ9phIknn3a8DoIdtvLFo0tNb9te+r/cD99z9Q+/777r+vVo3q1QPkm+oZQeYoAtveChUqlPEabDNhA1ALJZccBTWBbUfNjeIbPlAjUTWj3qWXnDCruIStAJZR8UO8QkyuNkTEd+GKOeS1IbtzYNVp5RJKolz5Lq9oUazb8ksVUBFncpTQXzLJc05sUJQc7g7QB323Zs0adHrZsmWVTs2K1157DcuKqKc7kKeKEpWfUDSTUaLyFni2YqiK7Z8wioiBd4mDEaPDYHLpDVib5UEkD3cxSFx+sJtigyh+nY8yKTCI79XCLJDriykiFiYxG1AvSWdHJY2oMrbx4lMxVFs0i5i5ZvENGbkCBvzEiRO3bdsGy7dp0yYYchh4eF9k9sgWInD8+PHPPvsMztjff/+NVW706NFbt26tXLky9gdYkJ955pk///zzpZdeQkd9++23GzZswOn48eMpOSEPMblaev38+fPJQbr//vvr1auHeHh38PVpEs2ePRsxniYUwBr75ZdflitXTpmBaYHvfvr06fr16yvnZcpgLmNbgwy1KzMVgSOUREBZoTSF0unVq1eV83xAamsWGtAlWD2xi8SIoT7o3r07hgIuValSBZsA6gaMtrNnzyISp3I6z4hpo/wZTdhlG0wmP7PRD52MDbWf+PCY0WQy+uOCO48fZQHsbdetW/eX1/Tp04fGFtJSPjmI0Ft+sA7FST2B0Www+Qm3SfaeURkzKitMPTBhOJtkcZ
NJDG4lo9wFDow/diZmox6tCwWwdsm6QXO9wSy+mcxPD42EPqgLFDSKzhHKCedfrHR5o6cboBLGXqtWrdauXbt+/XqlU7Piww8/zGQ04hKGBIFwfoO0oqOicV4hDwCj2SReqMLOzyh2sWJYyC9YiYEhZq1eeP/4wxyFvJi74p0eDGkx7EUK2NK7OGKyA5TEvJVvzIn5INdVTA+TCZFFTMaiYtpi9ojGMKFicn3FDEbboCnMSKnklJPA3501a5bZbIaTg2UWLjgKDQsLe+KJJ9SBgZV54cKFV65c+eijjxo0aECWsmLFih988AFMLC3I6DUSxlVkAvz9/SFGpaijCwH49CtXrkQYV3v16iV3qPgYc6dOnShy7969x44dQ25yivSQVkOGDNGuzLAadHzhhRewmfjjjz+0UxinqI6qjApUxdHt4IcwRVJlKfKu46WzW2BQm37VqlWPP/44xlmxYsXQi40bN8YltPsvv/zy8ssvQwBi2G8iTP2Re12iqpStItR+yXK4fPPNNwsWLMAWmzY3jBY0I1oPCwEaZ8mSJaKb82TuoVzy3RGgRcEbaIgigGPGVLh69OjRixcv4hLVS7mQP0B9g4KCmjVrhpU68ypPnjwZUw87b8zN/FYLX4PWGfjljzzyCIxo27ZtcXobnYJ8Dh8+jOTwiXfs2BEcHKyO5I8//virr74aNWrUO++8A+8cFhqG9sknn5S/lluAgX3p0iVMzzlz5tB9deyA4e5Dn86dO8vZp0IKU+Dvv/9+9NFHbTYb9gq7du2Cx0/x8+fP79evH/YByBlFo1zaE1BCFZqhNOnUq5R/bGxsixYtoEb//v0hhvzpqhY1FZJQqp9//hkGZeTIke+//z5iaBYgsHjxYtRr2LBh48aNUwu6uxQ2w4/qoJ8QGDRo0PTp0xHAit+xY0csRgij0aOjozHKSQYO2caNGxHAsMAxlyCVUAQCXvY6JDEcaeBi9GSeig1/JlCb3xXDT0XjiFMvCyVhCmQckxhFo0ePxqqKQN7UIltAJSz6WN9LlCjBhr+gQEMuRww/1tKuXbs2b94cnjEtXIhHVmT46ZvErFYrJiPGcFhYGJVC/r2ch+7rr7/u3r07svroo488GX6A8Y/MITZ06NApU6YgH2Ty2GOP4RIVev369Q0bNsiyuiZNmmB/gOmfsVI0jwCyUq8iDCZOnDht2rTt27fDcOCSOp5xiapG6zOlQiQdIyIiXnnlFdh+eptPvfrjjz/C6gNUiiLvOoXQ8IOrV682atQIu0i1lbXVpF5EDALYpWIIql2YG2CIoKwbN25cu3bN+1KwhoaGhpJ85qnY8GcCWh6td7cMf0JCwoULF7wvFIawdOnSGDCQd+tk0FiiAexlnnkGllEcSSt1oXQLG/78A4YTjt4YfnlEi9mEQahEae5H4tI///yDiVanTh0YXfJzcBXHTz/9FBYdhv/tt9+Gdw4LDcN//Pjx8uXLU1YQowCyoiJg+L/77rtly5Z16dJFlVELwkhD/leuXHnwwQfPnTuHSIqn5AgjQEUjHBQUtG7dOrjv6iUgEqTMJuwSgDby1q1bcPQHDhzYq1cvNR8SwFWoTR8L16pESqIZ27Vrh90PFuSUD06Lq++88w58/QkTJrz22mtIhci7TiE0/OiDpUuXPvXUUwgEBASg0dH3ymWZ06dP//rrrwhAmPah6KHc6w8MUxSEXp80aZLdbldHWOZ88sknL7zwAgJZ6saGPxPQ8mjAu2L4Mfz+/PNPTHV4OUpsVmChwXpHa41bw184YMOff8BIw9Ebw0/v0BEIk7Ok9XrPnDkDdx8BOMpVqlRBJLnUWMqwRsHpx0qLSfHGG2/AA549e7b2YTnikZC2Czj9+OOP4Rxjwj788MNqKSQMSTmFbvHixX379kURwcHBcP0xligeYsjkv//+++2336AnZtOHH36IzQdNKwioyymtzCgI+tAlAvsAlFiiRAm1LBVk/vnnnz/zzDMIqHUn5RGGw1m1atWKFStu3LgRnhtFYs3H+rNr164tW7Y0bdoUqSiruwyUK0ygL9HQ6ic6OnTogFNEom8IDIWYmJhSpUpR9zdq1Cg5ORnxSvpcgAqNi4uD83fJa7DrRCoon6Vut/E5ft+BWi/vP8ePglBcUlISutL7fr9x4wY6HQlxVDIqjOT45/iZ2wZdALAJw84488/xIx7++u7du//v//5vz549GKXY0SrX5Ks4hbOLdfWtt97CyMcphnFsbGzr1q2x0mKZooG9efPmokWL3nfffXDWKYa+uwLbhejoaATAu+++i3wiIiKQCWQSExNxVAuCAJLAKRcGTKdr3749BITeMpQDMq9UqRKuwgpgkYcayIGgfADEsMDevHlTmX4y+/fvr1Chwi+//HLx4kUlSgPmMq3MAGVRPlQocoZWWGqwZYdjiZxJbNu2bfD+UV9sC0g+P1DYDD/6AB1WvHhxjBuA/qPWFyNCBmEMJtqyAQx3DHrEK+lzARptKNcbK05AjMYNEmaZhA1/JlDr3RXDD6gTvS+UBgmOQInyDipOi3Ih51DyTYtyLZuw4c8/UD96Y/hpcMKVwroKUwq32GKxqMIIQACeblBQEAT69OkTGRn5559/wmWXV2LDyJEj5YVQWMf33nsP1jEsLGzGjBmHDx9evnx5v379kBB7AlyFDFxw2OyGDRuuWbPmq6++euSRR2Bu1YIgcP78+fLly0MGOU+cOBFFC71lxOSRF/kXXniBFnnYXXjbGWcWnSI3HJXELtenn36KnQpZd0VOgyqMAI4UqZ7i+Pfff6Mi5cqVW7FixZUrV+Do169fH5WdNGkSbWIoyV0nXzxvyEFQJYwJjEj0d61atdR3Q2gEAKr2c889h80B4jHO/vvvP5LJJWjcG8VH/lK+bSMrIAZhJEFCL5Mw+Qox1FI60fsepEGCI1CisoKWGyxqhw4dOnLkCBYXGuHK5ZwDeWJaHT9+/MCBA/v27UNZ8OpyoyAmf4K1CEfs2AICAmDSsG9LN0oxGFq1avXdd9/BoZ83b154eHj37t39/f3p+1KRXIxL+X7+iBEj6CfzBwwYULdu3R49eqxfvx7H6tWrQwCDGdsFrM8YaUgLw48ZRN+5QqUAeM9QAPE1a9ZUv4+VID1h7OHdlSlTBjGYIBiuFI9TkYuMPM+UNRbzCOVevnwZzjp0wzaI5NOBSCShUtSs1FPQsmXLDz/8EFV7/PHHmzRp0qZNGxQ9aNAgWBwkJPn8QGF7xo/+w3qE5QkWvUKFCvfeey9iqHdJAPHoHtT61KlTsbGxuIShExoa6rabCwT8jD8T0NHo7rx/xp9nYHijjnBQ4KZgbO/YsYN2tDk+nlHQzJkzsSbSEuzn5/fmm2/Sb+STgPfwM/78A63/Xj7jhzBWV6ycixcvhj9dpUoV1f7hEkYIQPjo0aPw42FuYcgbNGgAaxoTEwNh8tFJEoF///0XkliEMWJryCASueEqtr+0uQwMDETCOnXqIKAtCIMQaWGq4VtXqlQJamgHPGQA5I8dOxYXFwf5+++/H9sRXEonSUAY2iIAd3/jxo1//vlnu
pf5vQT5oCw01O7du2fPnn3t2jVYlp49e3bo0IE+VkabBhK+uxQ2w09diMZFACBAYYwkEqDhi0iE6Qjy1V4su7DhzwTq60Js+FFBDHgY/ubNm2M1hN+PmqKO6oDPKVAKvJldu3ZhNaQphpUaBqNkyZKKhNew4c8/YPzg6I3hJ9OIS1ar9f3338eeD12PYUbCMHgQwNhAgOwljsgcthmrKyLJ5tF8xJHEEKCrogA5E8qBTukS5Y94CiAJFYQAQVfVNZySUDxlRcKUHEcKaMFV5Hnu3Ll27dpFRER07NiRckNCEvAS5IMSCcoBMQjQfKFwdvPMJfKFEjkLNS5QW1m7CKIDcKq9mnEcFCYwBGlY03AEFFZjFLm7BOmA3Rj00U4bUg8xihyTFmorOoqxLi9tiMdgxsCmVUYFjUnNS2kBtTBct5MnT2JpJjGgXNaASAgD+Gf//PMPTRYcQVRUFDwbVcBt8ryBdKBqkjJAjaRTRTT/kVF5tUkBOSqK6F2FVksoQ6423HQspKpuGA+0zOKoXVrh6SLGbDbTEMURkTiqizBFUliNpEwgSUcgF6KgpkJAPSrXUvSkJLgEKAfEU0CWcgP2o7Vq1Wrfvr02t+xC+qDWCKAWACXiSKd3knPOkl/0yCmoawGamEA3KNdkaBwgnsRwirByrTCCGULLx+nTp+nrJ8+cOYNIgKtYa0jsrkC6kTLXrl3bvHkz1IN1SU5OpkVQkWMygBajIw1jigQIy6NeuE2qDIlp2xMxaOTnZGw2myqcEYpH8tmzZ1NM6dKlsYRReN68eSJ3GYq5i1AFk5KSYJnWrVuH0R4TE4PITGqXT1AbMDY2FuN/9erVmAv0pnr+0RzKYFaieceMGUNPrHGqLp5iFGo+4UanRLqr6UAOFFDFKEaNR4DCopgMWdECjgBdJegSxRMIUzwJZOS///6bP3/+m2++CZtNeVKSbKGWRUc1BlAkieUHCrPNYwAm59SpUzt16tSqVavHHnuse/fu4eHh33zzDa0pGOKK3F0COhw/fhzrSNOmTbt06dKtW7eOHTs+9dRTWAHz1TzJb6SzBzhN11xYaBBJqGHlWgoUSUtSxqsquIRdwooVKzCWIDlw4EB0FiIRXrlyJX2chJa5uwh8UDBy5EgM77Zt22KcYyw1b958+fLlWvuUD0HrQfNz584NGzasWbNmmKpQHnOhXbt2J06cyKRf8hh0N4iMjOzQoUOTJk0Qzs+tmi3QyNjX1qlThz556Auw4S/kwG/48ccfLRbL0KFD586d+9lnnyFm1Kivd+3apceAdzl1qT9BrhO/VC6J3+rWiZ+zx59sG1ypv96NCESJnwXNCWgpmTlz5pYtW7DYff/993PmzKlcufKaNeu++uorbE0wJVG0XLpGTYSUWPzvFFoT8lYG/+skB6RdOaZmPoIqCuCmowcXLlyIpoNfS5GKUIoYAjAn6P3333+fPpdMuz0co6Oj6cMvyOfSpUsIX7hwAaeUEFA+gNb3/fv3nzx5Eqfw9dFT2EFiywixGzdurF27FvFyZ90RVC5Qzj2gCMkoUTJQ7MqVK999912FChVQX7TM888/j2oOHjwYuxbxw3ViaDjUUSKOIjP5X2VACQEa9zjIJcnDHeVkodSdYjabd+zYMWvWLOzOR48ePW/evBYtWhw4cODdd99VZwFpJQ4CobYKNBcdIF9GCOJKM5HyOQTywkiA1X/xxRcRph7PyQLuHhjkjz766LfffhsQEKBEFXrQc0yBJvPP8SN+/fr19BFSOBZY6LEgGox+UyZPdtniHcmxyYlxR48cPXRgb0zMNYs1IcFpTXbYXY44pz3RZXPZHBZ7fNzRqPMW8TVIDptklZyOtMvO7SNylL/Q4/Lly3abxemwO+zOlSvW+PsXad6ihcWa7HQmW+zW0+fO7923/99TUdbkZLvDZXFa7VabI8nutCVfi72w79A/R48cu3Ez0Wq12OwuK9Rzxkt2mxVbGHl5QkF5/zn+3ADKo/vIVD/++OP0yBA2o3z58jB1VapUueeee+jjzmhViMH40XvURYsWxdJWpkyZpUuXYgzcvHkTuyukVW+TAojhKvUIjlQiQBhJPvjgAwhAGK4eijh+/Dh9GyYSPvXUU5TE+7bN+Dl+BKhogHDmkBgqqOpJkWiWTZs2qfnExcU9+OCD0HDPnj2SxWG1JV67eWH//t0HjhyLv56AsZIMre1JdrtDSrYmJdvirYkJiTFHz1xIdNocjgTsgUW2kh0Vy6kB7xZq4X379qFVVeXPnDnj7+8fFhYWEx3ttFsxN+IwOyyxp6JOJyZbHUJ31N9mt1udVkuiNcGRmBhz9UpU9A2LM9npSHY5sFvHTBB1UIrxgKikd5/jZwoT7PEXcrCHbdu2LYwEOlte4cXPVmK/Z9Bj0jv37Nrd/bGerdqAdg+1b/XhJ+OSE3UGncvuMkt2R1z8pT37dn7y3rtPPvkcLK6SY865P1AGWsG5ER+3NRqQr96gq1CxnMEA9Vx6yZGccPONoUPat27ZpnXr1m3adu/Ve8+R4w6X0WCEFbLOnTe3TetHWrdu36Zty26d2kyfu9wmSSZxeyLIYRQV1hcGbyQ9MBKvvfbakiVLihUrNmLEiDVr1nz44YfTpk3D5gntSTIIbNy4ceDAgTAh8Pixtfrtt9+Cg4OHDBly7tw5Pz+/r776auTIkYipWrXq999/P1kGxpiSa4EZgAVdvXo1Zd6lSxdYCOwbmjZtilN037Zt2y5evCjL3hHIigJZGh7SRK0sgfjSpUuHh4cjTJcw5mE7ZXmdzpi4ePGiDu07d+rQqV3bdu3ahk/7dabVieFnREqbI/HmzWs7tmx5of8LH33wmezZGhAvD3XhcMsl5Bakbd26datXr66eonPRd7Qt0zmdCXG3Tp44/sPYcY8+1vv8pSuYJwb5Z8LRVEmJiZeiL69YtKRHt0cXLVgu/76wyEEnRv/dfIOHydfIU4kpwGTu8cOHgG8EawGnP17m4YcfNhkNa1Ysvn753xqV72ncpMnqdeuXLlvWrkVjg6Hkq69/arHZkuyOP5YtvqdCqeKlQ4rrDWUr1ouHF+F0yR6/LfWO451Bvo7wYe32RGtSfFLirfj4iIgIqNf/mSedjltvDx1UvmSJCRMm7N6167VXXzH7BdSo2+xsdKzDYd20YXmxkOCn+zy/bfuun3+eWPPe0ia/krMWLHda7fB0bMLhsZM7g4IKjcdvsVhg0WHPQkNDYXGV1rPZ/vzzz8DAQNXjh1irVq3glC9btgxhdD0k//7775CQkNGjR8Oi4PTatWuw+s2aNUtKSqKWQSQupWslhNevX282m7FWIPn27dsRAzH6gVFhIQ2GqVOnIkabKnPcevykFbYpr7zyCjYomTN48OA333wTg1lNjkag4YR8UN/ExEQ4zbVq1YL5v3z50s7I5aHFSvXs2Tty64ap02c+UL1SEf+SE35daLdjdti//urDcuXKlioaEmwwdukxIN7ugkvtEq4+hlKSE25/bo4aqA3lqeWhP/oL9YqMjDQajc2bN09MTDh74njFcmVKli5VzGws
[inline base64 PNG image data truncated])", "_____no_output_____" ] ], [ [ "np.identity(3)", "_____no_output_____" ], [ "# build a matrix that, multiplied by itself, gives the identity matrix\n# everything on the diagonal is 1 and the rest is 0\n", "_____no_output_____" ], [ "v1 = np.array([3, 0, 2])\nv2 = np.array([2, 0, -2])\nv3 = np.array([0, 1, 1])\nM = np.vstack([v1, v2, v3])  # build the matrix from these vectors by stacking them\nprint(np.linalg.inv(M))  # invert the matrix\nprint(np.linalg.det(M))  # compute the determinant of the matrix", "[[ 0.2 0.2 0. ]\n [-0.2 0.3 1. ]\n [ 0.2 -0.3 -0. ]]\n10.000000000000002\n" ], [ "print(np.linalg.inv(M))  # the inverse", "[[ 0.2 0.2 0. ]\n [-0.2 0.3 1. ]\n [ 0.2 -0.3 -0. ]]\n" ], [ "print(np.linalg.det(M))  # the determinant", "10.000000000000002\n" ] ], [ [ "**Defining Variables**", "_____no_output_____" ] ], [ [ "a = np.array([1,1,1])\nb = np.array([2,2,2])", "_____no_output_____" ], [ "# Element-wise multiplication\n# (done element by element)\na*b", "_____no_output_____" ], [ "# Element-wise multiplication method:\nnp.multiply(a,b)\n", "_____no_output_____" ], [ "# Matrix multiplication method\n# 2*1 + 2*1 + 2*1\nnp.matmul(a,b)", "_____no_output_____" ], [ "# Dot product method\n# similar to matrix multiplication\nnp.dot(a,b)", "_____no_output_____" ], [ "# Cross product method\n# since the vectors are parallel, the cross product is the zero vector\nnp.cross(a,b)", "_____no_output_____" ], [ "# Cross product method with orthogonal vectors\n# here the perpendicular result points along the z axis\nnp.cross(np.array([1,0,0]), np.array([0,1,0]))", "_____no_output_____" ] ], [ [ "**Defining Matrices**", "_____no_output_____" ] ], [ [ "a = np.array([[1,2], [2,3]])\nb = np.array([[3,4],[5,6]])", "_____no_output_____" ], [ "print(a)\nprint(b)", "[[1 2]\n [2 3]]\n[[3 4]\n [5 6]]\n" ], [ "# Element-wise multiplication\na*b\n# element-by-element product", "_____no_output_____" ], [ "# Element-wise multiplication method\nnp.multiply(a,b)", "_____no_output_____" ], [ "# Matrix multiplication method\n# 1*3+2*5=13\n# 1*4+2*6=16\n# 2*3+3*5=21\n# 2*4+3*6=26\nnp.matmul(a,b)", "_____no_output_____" ], [ "# Dot product method", "_____no_output_____" ] ], [ [ "**Matrix Inversion**", "_____no_output_____" ] ], [ [ "a = np.array([[1,1,1],[0,2,5],[2,5,-1]])  # this is our matrix\nb = np.linalg.inv(a)  # here we invert it\nb", "_____no_output_____" ], [ "np.matmul(a,b)  # matrix-multiplying a matrix by its inverse gives the identity matrix\n# inverting a matrix and then multiplying by it yields the identity", "_____no_output_____" ], [ "v1 = np.array([3,0,2])\nv2 = np.array([2,0,-2])\nv3 = np.array([0,1,1])\nM = np.vstack([v1,v2,v3])\nM\n", "_____no_output_____" ], [ "M_inv = np.linalg.inv(M)  # invert it\nM_inv", "_____no_output_____" ] ], [ [ "# Eigenvalues and eigenvectors\nAn eigenvalue $\\lambda$ and an eigenvector $\\vec{u}$ satisfy\n\n$$Au = \\lambda u$$\n\nwhere $A$ is a square matrix.\n\nRearranging the equation above gives the system\n\n$$Au - \\lambda u = (A - \\lambda I)u = 0,$$\nwhich has a nontrivial solution if and only if $\\det(A - \\lambda I) = 0$.\n\n1. The eigenvalues are the roots of the characteristic polynomial given by that determinant.\n2. Substituting each eigenvalue back into $Au = \\lambda u$ and solving yields the associated eigenvector.\n", "_____no_output_____" ] ], [ [ "# we have a two-dimensional space, and what the matrix does\n# is distort that dimensional space", "_____no_output_____" ], [ "v1 = np.array([0, 1])\nv2 = np.array([-2, -3])\nM = np.vstack([v1, v2])\neigvals, eigvecs = np.linalg.eig(M)\nprint(eigvals)  # characteristic values of the matrix\nprint(eigvecs)", "[-1. -2.]\n[[ 0.70710678 -0.4472136 ]\n [-0.70710678 0.89442719]]\n" ], [ "# an eigenvalue is a value we can find that makes the system of equations solvable", "_____no_output_____" ], [ "A = np.array([[-81,16],[-420,83]])\nA", "_____no_output_____" ], [ "eigvals, eigvecs = np.linalg.eig(A)", "_____no_output_____" ], [ "eigvals", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ] ]
d029c16ab01d4efeaa90c6ac2fb6ddb1c7892b87
5,355
ipynb
Jupyter Notebook
AdventofCode2020/timings/Timing Day 1.ipynb
evan-freeman/puzzles
234c77df026a6f3f39a57b47c47636d8008a8571
[ "MIT" ]
null
null
null
AdventofCode2020/timings/Timing Day 1.ipynb
evan-freeman/puzzles
234c77df026a6f3f39a57b47c47636d8008a8571
[ "MIT" ]
null
null
null
AdventofCode2020/timings/Timing Day 1.ipynb
evan-freeman/puzzles
234c77df026a6f3f39a57b47c47636d8008a8571
[ "MIT" ]
null
null
null
19.125
84
0.488142
[ [ [ "# Setup", "_____no_output_____" ] ], [ [ "from day1 import puzzle1\nfrom day1 import puzzle1slow\nfrom day1 import puzzle1maybefaster\nfrom day2 import puzzle2\nfrom day2 import puzzle2slow\nfrom day2 import puzzle2maybefaster\n\nsample_input1 = [1721, 979, 366, 299, 675, 1456]\ninput1 = open(\"input1.txt\", \"r\").read()\ninput1_list = [int(x) for x in input1.split(\"\\n\") if x]", "_____no_output_____" ] ], [ [ "# Puzzle Timings", "_____no_output_____" ] ], [ [ "%%timeit\npuzzle1(input1_list)", "26.1 µs ± 2.34 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)\n" ], [ "%%timeit\npuzzle1maybefaster(input1_list)", "26 µs ± 1.54 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)\n" ], [ "%%timeit\npuzzle1slow(input1_list)", "2.38 ms ± 174 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" ], [ "%%timeit\npuzzle2(input1_list)", "1.69 ms ± 229 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)\n" ], [ "%%timeit\npuzzle2maybefaster(input1_list)", "2.14 ms ± 235 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n" ], [ "%%timeit\npuzzle2slow(input1_list)", "393 ms ± 60.4 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)\n" ] ], [ [ "# Sample Input Timings", "_____no_output_____" ] ], [ [ "%%timeit\npuzzle1(sample_input1)", "678 ns ± 33.2 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)\n" ], [ "%%timeit\npuzzle1slow(sample_input1)", "553 ns ± 29.4 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)\n" ], [ "%%timeit\npuzzle1maybefaster(sample_input1)", "645 ns ± 31.4 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)\n" ], [ "%%timeit\npuzzle2(sample_input1)", "2.1 µs ± 97.1 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)\n" ], [ "%%timeit\npuzzle2maybefaster(sample_input1)", "2.51 µs ± 288 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)\n" ], [ "%%timeit\npuzzle2slow(sample_input1)", "6.02 µs ± 166 ns per loop (mean ± std. dev. of 7 runs, 100000 loops each)\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ] ]
d029c4f14b630e689193f9659cfa0f670a250f35
17,798
ipynb
Jupyter Notebook
notebooks/M6-ensemble_sol_01.ipynb
datagistips/scikit-learn-mooc
9eb67c53173218b5cd3061712c827c6a663e425a
[ "CC-BY-4.0" ]
1
2021-07-14T09:41:21.000Z
2021-07-14T09:41:21.000Z
notebooks/M6-ensemble_sol_01.ipynb
datagistips/scikit-learn-mooc
9eb67c53173218b5cd3061712c827c6a663e425a
[ "CC-BY-4.0" ]
null
null
null
notebooks/M6-ensemble_sol_01.ipynb
datagistips/scikit-learn-mooc
9eb67c53173218b5cd3061712c827c6a663e425a
[ "CC-BY-4.0" ]
null
null
null
32.898336
131
0.369761
[ [ [ "# 📃 Solution of Exercise M6.01\n\nThe aim of this notebook is to investigate if we can tune the hyperparameters\nof a bagging regressor and evaluate the gain obtained.\n\nWe will load the California housing dataset and split it into a training and\na testing set.", "_____no_output_____" ] ], [ [ "from sklearn.datasets import fetch_california_housing\nfrom sklearn.model_selection import train_test_split\n\ndata, target = fetch_california_housing(as_frame=True, return_X_y=True)\ntarget *= 100 # rescale the target in k$\ndata_train, data_test, target_train, target_test = train_test_split(\n data, target, random_state=0, test_size=0.5)", "_____no_output_____" ] ], [ [ "<div class=\"admonition note alert alert-info\">\n<p class=\"first admonition-title\" style=\"font-weight: bold;\">Note</p>\n<p class=\"last\">If you want a deeper overview regarding this dataset, you can refer to the\nAppendix - Datasets description section at the end of this MOOC.</p>\n</div>", "_____no_output_____" ], [ "Create a `BaggingRegressor` and provide a `DecisionTreeRegressor`\nto its parameter `base_estimator`. Train the regressor and evaluate its\nstatistical performance on the testing set using the mean absolute error.", "_____no_output_____" ] ], [ [ "from sklearn.metrics import mean_absolute_error\nfrom sklearn.tree import DecisionTreeRegressor\nfrom sklearn.ensemble import BaggingRegressor\n\ntree = DecisionTreeRegressor()\nbagging = BaggingRegressor(base_estimator=tree, n_jobs=-1)\nbagging.fit(data_train, target_train)\ntarget_predicted = bagging.predict(data_test)\nprint(f\"Basic mean absolute error of the bagging regressor:\\n\"\n f\"{mean_absolute_error(target_test, target_predicted):.2f} k$\")", "Basic mean absolute error of the bagging regressor:\n36.65 k$\n" ], [ "abs(target_test - target_predicted).mean()", "_____no_output_____" ] ], [ [ "Now, create a `RandomizedSearchCV` instance using the previous model and\ntune the important parameters of the bagging regressor. 
Find the best\nparameters and check whether you can find a set of parameters that\nimproves on the default regressor, still using the mean absolute error as\nthe metric.\n\n<div class=\"admonition tip alert alert-warning\">\n<p class=\"first admonition-title\" style=\"font-weight: bold;\">Tip</p>\n<p class=\"last\">You can list the bagging regressor's parameters using the <tt class=\"docutils literal\">get_params</tt>\nmethod.</p>\n</div>", "_____no_output_____" ] ], [ [ "for param in bagging.get_params().keys():\n    print(param)", "_____no_output_____" ], [ "from scipy.stats import randint\nfrom sklearn.model_selection import RandomizedSearchCV\n\nparam_grid = {\n    \"n_estimators\": randint(10, 30),\n    \"max_samples\": [0.5, 0.8, 1.0],\n    \"max_features\": [0.5, 0.8, 1.0],\n    \"base_estimator__max_depth\": randint(3, 10),\n}\nsearch = RandomizedSearchCV(\n    bagging, param_grid, n_iter=20, scoring=\"neg_mean_absolute_error\"\n)\n_ = search.fit(data_train, target_train)", "_____no_output_____" ], [ "import pandas as pd\n\ncolumns = [f\"param_{name}\" for name in param_grid.keys()]\ncolumns += [\"mean_test_score\", \"std_test_score\", \"rank_test_score\"]\ncv_results = pd.DataFrame(search.cv_results_)\ncv_results = cv_results[columns].sort_values(by=\"rank_test_score\")\ncv_results[\"mean_test_score\"] = -cv_results[\"mean_test_score\"]\ncv_results", "_____no_output_____" ], [ "target_predicted = search.predict(data_test)\nprint(f\"Mean absolute error after tuning of the bagging regressor:\\n\"\n      f\"{mean_absolute_error(target_test, target_predicted):.2f} k$\")", "Mean absolute error after tuning of the bagging regressor:\n40.29 k$\n" ] ], [ [ "We see that the bagging regressor provides a predictor for which tuning the\nhyperparameters is not as important as it is when fitting a single\ndecision tree.", "_____no_output_____" ] ] ]
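A convenient follow-up is to read the winning configuration directly off the fitted `search` object. The attributes below are standard scikit-learn `*SearchCV` API (with `refit=True`, the default, `best_estimator_` is already retrained on the full training set); a minimal sketch:

```python
# Inspect the tuned search object from the cells above.
print(search.best_params_)  # hyperparameters of the best-ranked candidate
print(f"Best CV MAE: {-search.best_score_:.2f} k$")  # sign flipped from neg_mean_absolute_error
tuned_bagging = search.best_estimator_  # refit model, ready for .predict() on new data
```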
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ] ]
d029c6f972bba00349e1cc8a4bd1a3d5a310069e
36,669
ipynb
Jupyter Notebook
lessons/Recommendations/1_Intro_to_Recommendations/4_Collaborative Filtering - Solution.ipynb
callezenwaka/DSND_Term2
6252cb75f9fbd61043b308a783b1d62cdd217001
[ "MIT" ]
1,030
2018-07-03T19:09:50.000Z
2022-03-25T05:48:57.000Z
lessons/Recommendations/1_Intro_to_Recommendations/4_Collaborative Filtering - Solution.ipynb
callezenwaka/DSND_Term2
6252cb75f9fbd61043b308a783b1d62cdd217001
[ "MIT" ]
21
2018-09-20T14:36:04.000Z
2021-10-11T18:25:31.000Z
lessons/Recommendations/1_Intro_to_Recommendations/4_Collaborative Filtering - Solution.ipynb
callezenwaka/DSND_Term2
6252cb75f9fbd61043b308a783b1d62cdd217001
[ "MIT" ]
1,736
2018-06-27T19:33:46.000Z
2022-03-28T17:52:33.000Z
42.148276
507
0.585754
[ [ [ "## Recommendations with MovieTweetings: Collaborative Filtering\n\nOne of the most popular methods for making recommendations is **collaborative filtering**. In collaborative filtering, you are using the collaboration of user-item recommendations to assist in making new recommendations. \n\nThere are two main methods of performing collaborative filtering:\n\n1. **Neighborhood-Based Collaborative Filtering**, which is based on the idea that we can either correlate items that are similar to provide recommendations or we can correlate users to one another to provide recommendations.\n\n2. **Model Based Collaborative Filtering**, which is based on the idea that we can use machine learning and other mathematical models to understand the relationships that exist amongst items and users to predict ratings and provide ratings.\n\n\nIn this notebook, you will be working on performing **neighborhood-based collaborative filtering**. There are two main methods for performing collaborative filtering:\n\n1. **User-based collaborative filtering:** In this type of recommendation, users related to the user you would like to make recommendations for are used to create a recommendation.\n\n2. **Item-based collaborative filtering:** In this type of recommendation, first you need to find the items that are most related to each other item (based on similar ratings). Then you can use the ratings of an individual on those similar items to understand if a user will like the new item.\n\nIn this notebook you will be implementing **user-based collaborative filtering**. However, it is easy to extend this approach to make recommendations using **item-based collaborative filtering**. First, let's read in our data and necessary libraries.\n\n**NOTE**: Because of the size of the datasets, some of your code cells here will take a while to execute, so be patient!", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport tests as t\nfrom scipy.sparse import csr_matrix\nfrom IPython.display import HTML\n\n%matplotlib inline\n\n# Read in the datasets\nmovies = pd.read_csv('movies_clean.csv')\nreviews = pd.read_csv('reviews_clean.csv')\n\ndel movies['Unnamed: 0']\ndel reviews['Unnamed: 0']\n\nprint(reviews.head())", " user_id movie_id rating timestamp date month_1 \\\n0 1 68646 10 1381620027 2013-10-12 23:20:27 0 \n1 1 113277 10 1379466669 2013-09-18 01:11:09 0 \n2 2 422720 8 1412178746 2014-10-01 15:52:26 0 \n3 2 454876 8 1394818630 2014-03-14 17:37:10 0 \n4 2 790636 7 1389963947 2014-01-17 13:05:47 0 \n\n month_2 month_3 month_4 month_5 ... month_9 month_10 month_11 \\\n0 0 0 0 0 ... 0 1 0 \n1 0 0 0 0 ... 0 0 0 \n2 0 0 0 0 ... 0 1 0 \n3 0 0 0 0 ... 0 0 0 \n4 0 0 0 0 ... 0 0 0 \n\n month_12 year_2013 year_2014 year_2015 year_2016 year_2017 year_2018 \n0 0 1 0 0 0 0 0 \n1 0 1 0 0 0 0 0 \n2 0 0 1 0 0 0 0 \n3 0 0 1 0 0 0 0 \n4 0 0 1 0 0 0 0 \n\n[5 rows x 23 columns]\n" ] ], [ [ "### Measures of Similarity\n\nWhen using **neighborhood** based collaborative filtering, it is important to understand how to measure the similarity of users or items to one another. \n\nThere are a number of ways in which we might measure the similarity between two vectors (which might be two users or two items). In this notebook, we will look specifically at two measures used to compare vectors:\n\n* **Pearson's correlation coefficient**\n\nPearson's correlation coefficient is a measure of the strength and direction of a linear relationship. 
The value for this coefficient is a value between -1 and 1 where -1 indicates a strong, negative linear relationship and 1 indicates a strong, positive linear relationship. \n\nIf we have two vectors x and y, we can define the correlation between the vectors as:\n\n\n$$CORR(x, y) = \\frac{\\text{COV}(x, y)}{\\text{STDEV}(x)\\text{ }\\text{STDEV}(y)}$$\n\nwhere \n\n$$\\text{STDEV}(x) = \\sqrt{\\frac{1}{n-1}\\sum_{i=1}^{n}(x_i - \\bar{x})^2}$$\n\nand \n\n$$\\text{COV}(x, y) = \\frac{1}{n-1}\\sum_{i=1}^{n}(x_i - \\bar{x})(y_i - \\bar{y})$$\n\nwhere n is the length of the vector, which must be the same for both x and y and $\\bar{x}$ is the mean of the observations in the vector. \n\nWe can use the correlation coefficient to indicate how alike two vectors are to one another, where the closer to 1 the coefficient, the more alike the vectors are to one another. There are some potential downsides to using this metric as a measure of similarity. You will see some of these throughout this workbook.\n\n\n* **Euclidean distance**\n\nEuclidean distance is a measure of the straightline distance from one vector to another. Because this is a measure of distance, larger values are an indication that two vectors are different from one another (which is different than Pearson's correlation coefficient).\n\nSpecifically, the euclidean distance between two vectors x and y is measured as:\n\n$$ \\text{EUCL}(x, y) = \\sqrt{\\sum_{i=1}^{n}(x_i - y_i)^2}$$\n\nDifferent from the correlation coefficient, no scaling is performed in the denominator. Therefore, you need to make sure all of your data are on the same scale when using this metric.\n\n**Note:** Because measuring similarity is often based on looking at the distance between vectors, it is important in these cases to scale your data or to have all data be in the same scale. In this case, we will not need to scale data because they are all on a 10 point scale, but it is always something to keep in mind!\n\n------------\n\n### User-Item Matrix\n\nIn order to calculate the similarities, it is common to put values in a matrix. In this matrix, users are identified by each row, and items are represented by columns. \n\n\n![alt text](images/userxitem.png \"User Item Matrix\")\n", "_____no_output_____" ], [ "In the above matrix, you can see that **User 1** and **User 2** both used **Item 1**, and **User 2**, **User 3**, and **User 4** all used **Item 2**. However, there are also a large number of missing values in the matrix for users who haven't used a particular item. A matrix with many missing values (like the one above) is considered **sparse**.\n\nOur first goal for this notebook is to create the above matrix with the **reviews** dataset. However, instead of 1 values in each cell, you should have the actual rating. \n\nThe users will indicate the rows, and the movies will exist across the columns. To create the user-item matrix, we only need the first three columns of the **reviews** dataframe, which you can see by running the cell below.", "_____no_output_____" ] ], [ [ "user_items = reviews[['user_id', 'movie_id', 'rating']]\nuser_items.head()", "_____no_output_____" ] ], [ [ "### Creating the User-Item Matrix\n\nIn order to create the user-items matrix (like the one above), I personally started by using a [pivot table](https://pandas.pydata.org/pandas-docs/stable/generated/pandas.pivot_table.html). \n\nHowever, I quickly ran into a memory error (a common theme throughout this notebook). 
I will help you navigate around many of the errors I had, and achieve useful collaborative filtering results! \n\n_____\n\n`1.` Create a matrix where the users are the rows, the movies are the columns, and the ratings exist in each cell, or a NaN exists in cells where a user hasn't rated a particular movie. If you get a memory error (like I did), [this link here](https://stackoverflow.com/questions/39648991/pandas-dataframe-pivot-memory-error) might help you!", "_____no_output_____" ] ], [ [ "# Create user-by-item matrix\nuser_by_movie = user_items.groupby(['user_id', 'movie_id'])['rating'].max().unstack()", "_____no_output_____" ] ], [ [ "Check your results below to make sure your matrix is ready for the upcoming sections.", "_____no_output_____" ] ], [ [ "assert movies.shape[0] == user_by_movie.shape[1], \"Oh no! Your matrix should have {} columns, and yours has {}!\".format(movies.shape[0], user_by_movie.shape[1])\nassert reviews.user_id.nunique() == user_by_movie.shape[0], \"Oh no! Your matrix should have {} rows, and yours has {}!\".format(reviews.user_id.nunique(), user_by_movie.shape[0])\nprint(\"Looks like you are all set! Proceed!\")\nHTML('<img src=\"images/greatjob.webp\">')", "_____no_output_____" ] ], [ [ "`2.` Now that you have a matrix of users by movies, use this matrix to create a dictionary where the key is each user and the value is an array of the movies each user has rated.", "_____no_output_____" ] ], [ [ "# Create a dictionary with users and corresponding movies seen\n\ndef movies_watched(user_id):\n '''\n INPUT:\n user_id - the user_id of an individual as int\n OUTPUT:\n movies - an array of movies the user has watched\n '''\n movies = user_by_movie.loc[user_id][user_by_movie.loc[user_id].isnull() == False].index.values\n\n return movies\n\n\ndef create_user_movie_dict():\n '''\n INPUT: None\n OUTPUT: movies_seen - a dictionary where each key is a user_id and the value is an array of movie_ids\n \n Creates the movies_seen dictionary\n '''\n n_users = user_by_movie.shape[0]\n movies_seen = dict()\n\n for user1 in range(1, n_users+1):\n \n # assign list of movies to each user key\n movies_seen[user1] = movies_watched(user1)\n \n return movies_seen\n \nmovies_seen = create_user_movie_dict()", "_____no_output_____" ] ], [ [ "`3.` If a user hasn't rated more than 2 movies, we consider these users \"too new\". Create a new dictionary that only contains users who have rated more than 2 movies. 
This dictionary will be used for all the final steps of this workbook.", "_____no_output_____" ] ], [ [ "# Remove individuals who have watched 2 or fewer movies - don't have enough data to make recs\n\ndef create_movies_to_analyze(movies_seen, lower_bound=2):\n    '''\n    INPUT: \n    movies_seen - a dictionary where each key is a user_id and the value is an array of movie_ids\n    lower_bound - (an int) a user must have more movies seen than the lower bound to be added to the movies_to_analyze dictionary\n\n    OUTPUT: \n    movies_to_analyze - a dictionary where each key is a user_id and the value is an array of movie_ids\n    \n    The movies_seen and movies_to_analyze dictionaries should be the same except that the output dictionary has removed any users who have rated lower_bound or fewer movies\n    \n    '''\n    movies_to_analyze = dict()\n\n    for user, movies in movies_seen.items():\n        if len(movies) > lower_bound:\n            movies_to_analyze[user] = movies\n    return movies_to_analyze\n\nmovies_to_analyze = create_movies_to_analyze(movies_seen)", "_____no_output_____" ], [ "# Run the tests below to check that your movies_to_analyze matches the solution\nassert len(movies_to_analyze) == 23512, \"Oops!  It doesn't look like your dictionary has the right number of individuals.\"\nassert len(movies_to_analyze[2]) == 23, \"Oops!  User 2 didn't match the number of movies we thought they would have.\"\nassert len(movies_to_analyze[7]) == 3, \"Oops!  User 7 didn't match the number of movies we thought they would have.\"\nprint(\"If this is all you see, you are good to go!\")", "_____no_output_____" ] ], [ [ "### Calculating User Similarities\n\nNow that you have set up the **movies_to_analyze** dictionary, it is time to take a closer look at the similarities between users. Below is the pseudocode for how I thought about determining the similarity between users:\n\n```\nfor user1 in movies_to_analyze\n    for user2 in movies_to_analyze\n        see how many movies match between the two users\n        if more than two movies in common\n            pull the overlapping movies\n            compute the distance/similarity metric between ratings on the same movies for the two users\n            store the users and the distance metric\n```\n\nHowever, this took a very long time to run, and other methods of performing these operations did not fit on the workspace memory!\n\nTherefore, rather than creating a dataframe with all possible pairings of users in our data, your task for this question is to look at a few specific examples of the correlation between ratings given by two users. For this question, consider that you want to compute the [correlation](https://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.corr.html) between users.\n\n`4.` Using the **movies_to_analyze** dictionary and **user_by_movie** dataframe, create a function that computes the correlation between the ratings of similar movies for two users. Then use your function to compare your results to ours using the tests below. 
", "_____no_output_____" ] ], [ [ "def compute_correlation(user1, user2):\n '''\n INPUT\n user1 - int user_id\n user2 - int user_id\n OUTPUT\n the correlation between the matching ratings between the two users\n '''\n # Pull movies for each user\n movies1 = movies_to_analyze[user1]\n movies2 = movies_to_analyze[user2]\n \n \n # Find Similar Movies\n sim_movs = np.intersect1d(movies1, movies2, assume_unique=True)\n \n # Calculate correlation between the users\n df = user_by_movie.loc[(user1, user2), sim_movs]\n corr = df.transpose().corr().iloc[0,1]\n \n return corr #return the correlation", "_____no_output_____" ], [ "# Test your function against the solution\nassert compute_correlation(2,2) == 1.0, \"Oops! The correlation between a user and itself should be 1.0.\"\nassert round(compute_correlation(2,66), 2) == 0.76, \"Oops! The correlation between user 2 and 66 should be about 0.76.\"\nassert np.isnan(compute_correlation(2,104)), \"Oops! The correlation between user 2 and 104 should be a NaN.\"\n\nprint(\"If this is all you see, then it looks like your function passed all of our tests!\")", "_____no_output_____" ] ], [ [ "### Why the NaN's?\n\nIf the function you wrote passed all of the tests, then you have correctly set up your function to calculate the correlation between any two users. \n\n`5.` But one question is, why are we still obtaining **NaN** values? As you can see in the code cell above, users 2 and 104 have a correlation of **NaN**. Why?", "_____no_output_____" ], [ "Think and write your ideas here about why these NaNs exist, and use the cells below to do some coding to validate your thoughts. You can check other pairs of users and see that there are actually many NaNs in our data - 2,526,710 of them in fact. These NaN's ultimately make the correlation coefficient a less than optimal measure of similarity between two users.\n\n```\nIn the denominator of the correlation coefficient, we calculate the standard deviation for each user's ratings. The ratings for user 2 are all the same rating on the movies that match with user 104. Therefore, the standard deviation is 0. Because a 0 is in the denominator of the correlation coefficient, we end up with a **NaN** correlation coefficient. Therefore, a different approach is likely better for this particular situation.\n```", "_____no_output_____" ] ], [ [ "# Which movies did both user 2 and user 104 see?\nset_2 = set(movies_to_analyze[2])\nset_104 = set(movies_to_analyze[104])\nset_2.intersection(set_104)", "_____no_output_____" ], [ "# What were the ratings for each user on those movies?\nprint(user_by_movie.loc[2, set_2.intersection(set_104)])\nprint(user_by_movie.loc[104, set_2.intersection(set_104)])", "_____no_output_____" ] ], [ [ "`6.` Because the correlation coefficient proved to be less than optimal for relating user ratings to one another, we could instead calculate the euclidean distance between the ratings. I found [this post](https://stackoverflow.com/questions/1401712/how-can-the-euclidean-distance-be-calculated-with-numpy) particularly helpful when I was setting up my function. This function should be very similar to your previous function. 
When you feel confident with your function, test it against our results.", "_____no_output_____" ] ], [ [ "def compute_euclidean_dist(user1, user2):\n '''\n INPUT\n user1 - int user_id\n user2 - int user_id\n OUTPUT\n the euclidean distance between user1 and user2\n '''\n # Pull movies for each user\n movies1 = movies_to_analyze[user1]\n movies2 = movies_to_analyze[user2]\n \n \n # Find Similar Movies\n sim_movs = np.intersect1d(movies1, movies2, assume_unique=True)\n \n # Calculate euclidean distance between the users\n df = user_by_movie.loc[(user1, user2), sim_movs]\n dist = np.linalg.norm(df.loc[user1] - df.loc[user2])\n \n return dist #return the euclidean distance", "_____no_output_____" ], [ "# Read in solution euclidean distances\"\nimport pickle\ndf_dists = pd.read_pickle(\"data/Term2/recommendations/lesson1/data/dists.p\")", "_____no_output_____" ], [ "# Test your function against the solution\nassert compute_euclidean_dist(2,2) == df_dists.query(\"user1 == 2 and user2 == 2\")['eucl_dist'][0], \"Oops! The distance between a user and itself should be 0.0.\"\nassert round(compute_euclidean_dist(2,66), 2) == round(df_dists.query(\"user1 == 2 and user2 == 66\")['eucl_dist'][1], 2), \"Oops! The distance between user 2 and 66 should be about 2.24.\"\nassert np.isnan(compute_euclidean_dist(2,104)) == np.isnan(df_dists.query(\"user1 == 2 and user2 == 104\")['eucl_dist'][4]), \"Oops! The distance between user 2 and 104 should be 2.\"\n\nprint(\"If this is all you see, then it looks like your function passed all of our tests!\")", "_____no_output_____" ] ], [ [ "### Using the Nearest Neighbors to Make Recommendations\n\nIn the previous question, you read in **df_dists**. Therefore, you have a measure of distance between each user and every other user. This dataframe holds every possible pairing of users, as well as the corresponding euclidean distance.\n\nBecause of the **NaN** values that exist within the correlations of the matching ratings for many pairs of users, as we discussed above, we will proceed using **df_dists**. You will want to find the users that are 'nearest' each user. Then you will want to find the movies the closest neighbors have liked to recommend to each user.\n\nI made use of the following objects:\n\n* df_dists (to obtain the neighbors)\n* user_items (to obtain the movies the neighbors and users have rated)\n* movies (to obtain the names of the movies)\n\n`7.` Complete the functions below, which allow you to find the recommendations for any user. 
There are five functions which you will need:\n\n* **find_closest_neighbors** - this returns a list of user_ids from closest neighbor to farthest neighbor using euclidean distance\n\n\n* **movies_liked** - returns an array of movie_ids\n\n\n* **movie_names** - takes the output of movies_liked and returns a list of movie names associated with the movie_ids\n\n\n* **make_recommendations** - takes a user id and goes through closest neighbors to return a list of movie names as recommendations\n\n\n* **all_recommendations** = loops through every user and returns a dictionary of with the key as a user_id and the value as a list of movie recommendations", "_____no_output_____" ] ], [ [ "def find_closest_neighbors(user):\n '''\n INPUT:\n user - (int) the user_id of the individual you want to find the closest users\n OUTPUT:\n closest_neighbors - an array of the id's of the users sorted from closest to farthest away\n '''\n # I treated ties as arbitrary and just kept whichever was easiest to keep using the head method\n # You might choose to do something less hand wavy\n \n closest_users = df_dists[df_dists['user1']==user].sort_values(by='eucl_dist').iloc[1:]['user2']\n closest_neighbors = np.array(closest_users)\n \n return closest_neighbors\n \n \n \ndef movies_liked(user_id, min_rating=7):\n '''\n INPUT:\n user_id - the user_id of an individual as int\n min_rating - the minimum rating considered while still a movie is still a \"like\" and not a \"dislike\"\n OUTPUT:\n movies_liked - an array of movies the user has watched and liked\n '''\n movies_liked = np.array(user_items.query('user_id == @user_id and rating > (@min_rating -1)')['movie_id'])\n \n return movies_liked\n\n\ndef movie_names(movie_ids):\n '''\n INPUT\n movie_ids - a list of movie_ids\n OUTPUT\n movies - a list of movie names associated with the movie_ids\n \n '''\n movie_lst = list(movies[movies['movie_id'].isin(movie_ids)]['movie'])\n \n return movie_lst\n \n \ndef make_recommendations(user, num_recs=10):\n '''\n INPUT:\n user - (int) a user_id of the individual you want to make recommendations for\n num_recs - (int) number of movies to return\n OUTPUT:\n recommendations - a list of movies - if there are \"num_recs\" recommendations return this many\n otherwise return the total number of recommendations available for the \"user\"\n which may just be an empty list\n '''\n # I wanted to make recommendations by pulling different movies than the user has already seen\n # Go in order from closest to farthest to find movies you would recommend\n # I also only considered movies where the closest user rated the movie as a 9 or 10\n \n # movies_seen by user (we don't want to recommend these)\n movies_seen = movies_watched(user)\n closest_neighbors = find_closest_neighbors(user)\n \n # Keep the recommended movies here\n recs = np.array([])\n \n # Go through the neighbors and identify movies they like the user hasn't seen\n for neighbor in closest_neighbors:\n neighbs_likes = movies_liked(neighbor)\n \n #Obtain recommendations for each neighbor\n new_recs = np.setdiff1d(neighbs_likes, movies_seen, assume_unique=True)\n \n # Update recs with new recs\n recs = np.unique(np.concatenate([new_recs, recs], axis=0))\n \n # If we have enough recommendations exit the loop\n if len(recs) > num_recs-1:\n break\n \n # Pull movie titles using movie ids\n recommendations = movie_names(recs)\n \n return recommendations\n\ndef all_recommendations(num_recs=10):\n '''\n INPUT \n num_recs (int) the (max) number of recommendations for each user\n OUTPUT\n 
all_recs - a dictionary where each key is a user_id and the value is an array of recommended movie titles\n '''\n \n # All the users we need to make recommendations for\n users = np.unique(df_dists['user1'])\n n_users = len(users)\n \n #Store all recommendations in this dictionary\n all_recs = dict()\n \n # Make the recommendations for each user\n for user in users:\n all_recs[user] = make_recommendations(user, num_recs)\n \n return all_recs\n\nall_recs = all_recommendations(10)", "_____no_output_____" ], [ "# This loads our solution dictionary so you can compare results - FULL PATH IS \"data/Term2/recommendations/lesson1/data/all_recs.p\"\nall_recs_sol = pd.read_pickle(\"data/Term2/recommendations/lesson1/data/all_recs.p\")", "_____no_output_____" ], [ "assert all_recs[2] == make_recommendations(2), \"Oops! Your recommendations for user 2 didn't match ours.\"\nassert all_recs[26] == make_recommendations(26), \"Oops! It actually wasn't possible to make any recommendations for user 26.\"\nassert all_recs[1503] == make_recommendations(1503), \"Oops! Looks like your solution for user 1503 didn't match ours.\"\nprint(\"If you made it here, you now have recommendations for many users using collaborative filtering!\")\nHTML('<img src=\"images/greatjob.webp\">')", "_____no_output_____" ] ], [ [ "### Now What?\n\nIf you made it this far, you have successfully implemented a solution to making recommendations using collaborative filtering. \n\n`8.` Let's do a quick recap of the steps taken to obtain recommendations using collaborative filtering. ", "_____no_output_____" ] ], [ [ "# Check your understanding of the results by correctly filling in the dictionary below\na = \"pearson's correlation and spearman's correlation\"\nb = 'item based collaborative filtering'\nc = \"there were too many ratings to get a stable metric\"\nd = 'user based collaborative filtering'\ne = \"euclidean distance and pearson's correlation coefficient\"\nf = \"manhattan distance and euclidean distance\"\ng = \"spearman's correlation and euclidean distance\"\nh = \"the spread in some ratings was zero\"\ni = 'content based recommendation'\n\nsol_dict = {\n 'The type of recommendation system implemented here was a ...': d,\n 'The two methods used to estimate user similarity were: ': e,\n 'There was an issue with using the correlation coefficient. What was it?': h\n}\n\nt.test_recs(sol_dict)", "_____no_output_____" ] ], [ [ "Additionally, let's take a closer look at some of the results. 
There are two solution files that you read in to check your results, and you created these objects\n\n* **df_dists** - a dataframe of user1, user2, euclidean distance between the two users\n* **all_recs_sol** - a dictionary of all recommendations (key = user, value = list of recommendations) \n\n`9.` Use these two objects along with the cells below to correctly fill in the dictionary below and complete this notebook!", "_____no_output_____" ] ], [ [ "a = 567\nb = 1503\nc = 1319\nd = 1325\ne = 2526710\nf = 0\ng = 'Use another method to make recommendations - content based, knowledge based, or model based collaborative filtering'\n\nsol_dict2 = {\n 'For how many pairs of users were we not able to obtain a measure of similarity using correlation?': e,\n 'For how many pairs of users were we not able to obtain a measure of similarity using euclidean distance?': f,\n 'For how many users were we unable to make any recommendations for using collaborative filtering?': c,\n 'For how many users were we unable to make 10 recommendations for using collaborative filtering?': d,\n 'What might be a way for us to get 10 recommendations for every user?': g \n}\n\nt.test_recs2(sol_dict2)", "_____no_output_____" ], [ "# Use the cells below for any work you need to do!", "_____no_output_____" ], [ "# Users without recs\nusers_without_recs = []\nfor user, movie_recs in all_recs.items():\n if len(movie_recs) == 0:\n users_without_recs.append(user)\n \nlen(users_without_recs)", "_____no_output_____" ], [ "# NaN euclidean distance values\ndf_dists['eucl_dist'].isnull().sum()", "_____no_output_____" ], [ "# Users with fewer than 10 recs\nusers_with_less_than_10recs = []\nfor user, movie_recs in all_recs.items():\n if len(movie_recs) < 10:\n users_with_less_than_10recs.append(user)\n \nlen(users_with_less_than_10recs)", "_____no_output_____" ] ] ]
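As a compact illustration of the two similarity measures defined earlier in this notebook, the sketch below applies both to two hypothetical users' ratings over the same five matched movies; the rating values are made up purely for illustration:

```python
import numpy as np

# Two users' ratings on the same five movies (illustrative values only).
x = np.array([8.0, 9.0, 10.0, 5.0, 7.0])
y = np.array([7.0, 9.0, 9.0, 6.0, 8.0])

corr = np.corrcoef(x, y)[0, 1]  # Pearson's correlation coefficient
dist = np.linalg.norm(x - y)    # Euclidean distance; note there is no scaling denominator
print(corr, dist)
```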
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d029ca492680dfbfcdd0f6b95260793605262d01
43,126
ipynb
Jupyter Notebook
Feature_Engineering_Toolkit_demo_features_v1.ipynb
jassimran/Feature-Engineering-Toolkit
25609a192c1d1c7fb83c5a2c19439dcb776fbcc3
[ "MIT" ]
null
null
null
Feature_Engineering_Toolkit_demo_features_v1.ipynb
jassimran/Feature-Engineering-Toolkit
25609a192c1d1c7fb83c5a2c19439dcb776fbcc3
[ "MIT" ]
null
null
null
Feature_Engineering_Toolkit_demo_features_v1.ipynb
jassimran/Feature-Engineering-Toolkit
25609a192c1d1c7fb83c5a2c19439dcb776fbcc3
[ "MIT" ]
null
null
null
32.27994
647
0.3538
[ [ [ "### Feature Engineering notebook \n\nThis is a demo notebook to play with feature engineering toolkit. In this notebook we will see some capabilities of the toolkit like filling missing values, PCA, Random Projections, Normalizing values, and etc.", "_____no_output_____" ] ], [ [ "%load_ext autoreload\n%autoreload 1\n%matplotlib inline", "_____no_output_____" ], [ "from Pipeline import Pipeline\nfrom Compare import Compare\nfrom StructuredData.LoadCSV import LoadCSV\nfrom StructuredData.MissingValues import MissingValues\nfrom StructuredData.Normalize import Normalize\nfrom StructuredData.Factorize import Factorize\nfrom StructuredData.PCAFeatures import PCAFeatures\nfrom StructuredData.RandomProjection import RandomProjection", "_____no_output_____" ], [ "csv_path = './DemoData/synthetic_classification.csv'\ndf = LoadCSV(csv_path)()", "_____no_output_____" ], [ "df.head(5)", "_____no_output_____" ] ], [ [ "### Filling missing values\n\nBy default, median of the values of the column is applied for filling out the missing values", "_____no_output_____" ] ], [ [ "pipelineObj = Pipeline([MissingValues()])\nnew_df = pipelineObj(df, '0')\nnew_df.head(5)", "_____no_output_____" ] ], [ [ "However, the imputation type is a configurable parameter to customize it as per needs.", "_____no_output_____" ] ], [ [ "pipelineObj = Pipeline([MissingValues(imputation_type = 'mean')])\nnew_df = pipelineObj(df, '0')\nnew_df.head(5)", "_____no_output_____" ] ], [ [ "### Normalize data\n\nBy default, Min max normalization is applied. Please note that assertion has been set such that normlization cant be applied if there rae missing values in that column. This is part of validation phase", "_____no_output_____" ] ], [ [ "pipelineObj = Pipeline([MissingValues(), Normalize(['1','2', '3'])])\nnew_df = pipelineObj(df, '0')\ndf.head(5)", "_____no_output_____" ] ], [ [ "### Factorize data\n\nEncode the object as an enumerated type or categorical variable for column 4 and 8, but we must remove missing values before Factorizing", "_____no_output_____" ] ], [ [ "pipelineObj = Pipeline([MissingValues(), Factorize(['4','8'])])\nnew_df = pipelineObj(df, '0')\nnew_df.head(5)", "_____no_output_____" ] ], [ [ "### Principal Component Analysis \n\nUse n_components to play around with how many dimensions you want to keep. Please note that assertions will validate if a data frame has any missing values before applying PCA. In the below example, the pipeline first removed missing values before applying PCA.", "_____no_output_____" ] ], [ [ "pipelineObj = Pipeline([MissingValues(imputation_type = 'mean'), PCAFeatures(n_components = 5)])\npca_df = pipelineObj(df, '0')\npca_df.head(5)", "_____no_output_____" ] ], [ [ "### Random Projections\n\nUse n_components to play around with how many dimensions you want to keep. Please note that assertions will validate if a data frame has any missing values before applying Random Projections. Type of projections can be specified as an argument, by default GaussianRandomProjection is applied. In the below example, the pipeline first removed missing values before applying Sparse Random Projection. 
As of now, automatic ('auto') deduction of the number of dimensions sufficient to represent the features with minimal loss of information has not been implemented, so the default number of output columns is 2 (use n_components to specify a custom value).", "_____no_output_____" ] ], [ [ "pipelineObj = Pipeline([MissingValues(imputation_type = 'mean'), RandomProjection(n_components = 6, proj_type = 'Sparse')])\nnew_df = pipelineObj(df, '0')\nnew_df.head()", "_____no_output_____" ] ], [ [ "### Download the modified CSV\nAt any point, the newly transformed features can be saved using the command below.\n", "_____no_output_____" ] ], [ [ "csv_path = './DemoData/synthetic_classification_transformed.csv'\nnew_df.to_csv(csv_path)", "_____no_output_____" ] ] ]
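For readers who do not have this repository's toolkit installed, roughly the same median-impute-then-PCA chain can be sketched with scikit-learn. This is a stand-in for comparison only, not the toolkit's own API, and it assumes, as in the cells above, that column `'0'` holds the target:

```python
# Stand-in using scikit-learn instead of the toolkit's Pipeline.
from sklearn.pipeline import make_pipeline
from sklearn.impute import SimpleImputer
from sklearn.decomposition import PCA

sk_pipe = make_pipeline(SimpleImputer(strategy="median"), PCA(n_components=5))
reduced = sk_pipe.fit_transform(df.drop(columns=["0"]))  # drop the assumed target column
print(reduced.shape)  # (n_rows, 5)
```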
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d029cf1a4352b34888de3488c6aea05fd620f367
18,024
ipynb
Jupyter Notebook
nircam_jdox/nircam_grisms/figure4_sensitivity.ipynb
aliciacanipe/nircam_jdox
fa1c3381283bb08b870162d0dd3bc9d5e94561ea
[ "BSD-3-Clause" ]
1
2022-03-10T06:48:27.000Z
2022-03-10T06:48:27.000Z
nircam_jdox/nircam_grisms/figure4_sensitivity.ipynb
aliciacanipe/nircam_jdox
fa1c3381283bb08b870162d0dd3bc9d5e94561ea
[ "BSD-3-Clause" ]
7
2019-04-05T16:30:32.000Z
2019-05-02T16:30:26.000Z
nircam_jdox/nircam_grisms/figure4_sensitivity.ipynb
aliciacanipe/nircam_jdox
fa1c3381283bb08b870162d0dd3bc9d5e94561ea
[ "BSD-3-Clause" ]
3
2019-03-20T15:14:26.000Z
2019-12-17T20:16:40.000Z
38.76129
167
0.576343
[ [ [ "# Figure 4: NIRCam Grism + Filter Sensitivities ($1^{st}$ order)", "_____no_output_____" ], [ "***\n### Table of Contents\n\n1. [Information](#Information)\n2. [Imports](#Imports)\n3. [Data](#Data)\n4. [Generate the First Order Grism + Filter Sensitivity Plot](#Generate-the-First-Order-Grism-+-Filter-Sensitivity-Plot)\n5. [Issues](#Issues)\n6. [About this Notebook](#About-this-Notebook)\n***", "_____no_output_____" ], [ "## Information", "_____no_output_____" ], [ "#### JDox links: \n* [NIRCam Grisms](https://jwst-docs.stsci.edu/display/JTI/NIRCam+Grisms#NIRCamGrisms-Sensitivity)\n * Figure 4. NIRCam grism + filter sensitivities ($1^{st}$ order)", "_____no_output_____" ], [ "## Imports", "_____no_output_____" ] ], [ [ "import os\nimport pylab\nimport numpy as np\nfrom astropy.io import ascii, fits\nfrom astropy.table import Table\nfrom scipy.optimize import fmin\nfrom scipy.interpolate import interp1d\nimport requests\nimport matplotlib.pyplot as plt\n%matplotlib inline", "_____no_output_____" ] ], [ [ "## Data", "_____no_output_____" ], [ "#### Data Location: \n\nThe data is stored in a NIRCam JDox Box folder here:\n[ST-INS-NIRCAM -> JDox -> nircam_grisms](https://stsci.box.com/s/wu9mo54vi957x50rdirlcg9zkkr3xiaw)", "_____no_output_____" ] ], [ [ "files = [('https://stsci.box.com/shared/static/i0a9dkp02nnuw6w0xcfd7b42ctxfb8es.fits', 'NIRCam.F250M.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/vfnyk9veote92dz1edpbu83un5n20rsw.fits', 'NIRCam.F250M.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/ssvltwzt7f4y5lfvch2o1prdk5hb2gz2.fits', 'NIRCam.F250M.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/56wjvzx1jf2i5yg7l1gg77vtvi01ec5p.fits', 'NIRCam.F250M.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/v1621dcm44be21n381mbgd2hzxxqrb2e.fits', 'NIRCam.F277W.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/8slec91wj6ety6d8qvest09msklpypi8.fits', 'NIRCam.F277W.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/r42hdv64x6skqqszv24qkxohiijitqcf.fits', 'NIRCam.F277W.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/3vye6ni05i3kdqyd5vs1jk2q59yyms2e.fits', 'NIRCam.F277W.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/twcxbe6lxrjckqph980viiijv8fpmm8b.fits', 'NIRCam.F300M.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/bpvluysg3zsl3q4b4l5rj5nue84ydjem.fits', 'NIRCam.F300M.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/15x7rbwngsxiubbexy7zcezxqm3ndq54.fits', 'NIRCam.F300M.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/a7tqdp0feqcttw3d9vaioy7syzfsftz6.fits', 'NIRCam.F300M.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/i76sb53pthieh4kn62fpxhcxn8lreffj.fits', 'NIRCam.F322W2.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/wgbyfi3ofs7i19b7zsf2iceupzkbkokq.fits', 'NIRCam.F322W2.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/jhk3deym5wbc68djtcahy3otk2xfjdb5.fits', 'NIRCam.F322W2.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/zu3xqnicbyfjn54yb4kgzvnglanf13ak.fits', 'NIRCam.F322W2.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/e2srtf52wnh6vvxsy2aiknbcr8kx2xr5.fits', 'NIRCam.F335M.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/bav3tswdd7lemsyd53bnpj4b6yke5bgd.fits', 'NIRCam.F335M.R.A.2nd.sensitivity.fits'),\n 
('https://stsci.box.com/shared/static/81wm768mjemzj84w1ogzqddgmrk3exvt.fits', 'NIRCam.F335M.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/fhopmyongqifibdtwt3qr682lwdjaf7a.fits', 'NIRCam.F335M.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/j9gd8bclethgex40o7qi1e79hgj2hsyt.fits', 'NIRCam.F356W.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/s23novi3p6qwm9f9hj9wutgju08be776.fits', 'NIRCam.F356W.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/41fnmswn1ttnwts6jj5fu73m4hs6icxd.fits', 'NIRCam.F356W.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/wx3rvjt0mvf0hnhv4wvqcmxu61gamwmm.fits', 'NIRCam.F356W.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/e0p6vkiow4jlp49deqkji9kekzdt4oon.fits', 'NIRCam.F360M.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/xbh0rjjvxn0x22k9ktiyikol7c4ep6ka.fits', 'NIRCam.F360M.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/e7artuotyv8l9wfoa3rk1k00o5mv8so8.fits', 'NIRCam.F360M.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/9r5bmick13ti22l6hcsw0uod75vqartw.fits', 'NIRCam.F360M.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/tqd1uqsf8nj12he5qa3hna0zodnlzfea.fits', 'NIRCam.F410M.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/4szffesvswh0h8fjym5m5ht37sj0jzrl.fits', 'NIRCam.F410M.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/iur0tpbts23lc5rn5n0tplzndlkoudel.fits', 'NIRCam.F410M.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/rvz8iznsnl0bsjrqiw7rv74jj24b0otb.fits', 'NIRCam.F410M.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/sv3g82qbb4u2umksgu5zdl7rp569sdi7.fits', 'NIRCam.F430M.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/mmqv1pkuzpj6abtufxxfo960z2v1oygc.fits', 'NIRCam.F430M.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/84q83haic2h6eq5c6p2frkybz551hp8d.fits', 'NIRCam.F430M.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/3osceplhq6kmvmm2a72jsgrg6z1ggw1p.fits', 'NIRCam.F430M.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/kitx7gdo5kool6jus2g19vdy7q7hmxck.fits', 'NIRCam.F444W.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/ug7y93v0en9c84hfp6d3vtjogmjou9u3.fits', 'NIRCam.F444W.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/0p9h9ofayq8q6dbfsccf3tn5lvxxod9i.fits', 'NIRCam.F444W.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/34hbqzibt5h72hm0rj9wylttj7m9wd19.fits', 'NIRCam.F444W.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/vj0rkyebg0afny1khdyiho4mktmtsi1q.fits', 'NIRCam.F460M.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/ky1z1dpewsjqab1o9hstihrec7h52oq4.fits', 'NIRCam.F460M.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/s93cwpcvnxfjwqbulnkh9ts9ln0fu9cz.fits', 'NIRCam.F460M.R.B.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/1178in8zg462es1fkl0mgcbpgp6kgb6t.fits', 'NIRCam.F460M.R.B.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/b855uj293klac8hnoqhrnv8ei0rcvudj.fits', 'NIRCam.F480M.R.A.1st.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/werzjlp3ybxk2ovg6u689zsfpts2t8w3.fits', 'NIRCam.F480M.R.A.2nd.sensitivity.fits'),\n ('https://stsci.box.com/shared/static/yrh5mylru1upbo5rifbz77acn8k1ud6i.fits', 'NIRCam.F480M.R.B.1st.sensitivity.fits'),\n 
('https://stsci.box.com/shared/static/oxu6jsg9cn9yqkh3nh646fx0flhw8rej.fits', 'NIRCam.F480M.R.B.2nd.sensitivity.fits')]\n    ", "_____no_output_____" ], [ "def download_file(url, file_name, output_directory='./', overwrite=False):\n    \"\"\"Download a file from Box given the direct URL\n\n    Parameters\n    ----------\n    url : str\n        URL to the file to be downloaded\n\n    file_name : str\n        The name of the file being downloaded\n\n    output_directory : str\n        Directory to download file_name into\n\n    overwrite : bool\n        If False and the file to download already exists, the download\n        will be skipped. If True, the file will be downloaded regardless\n        of whether it already exists in output_directory\n\n    Returns\n    -------\n    download_filename : str\n        Name of the downloaded file\n    \"\"\"\n    download_filename = os.path.join(output_directory, file_name)\n    if not os.path.isfile(download_filename) or overwrite is True:\n        print(\"Downloading {}\".format(file_name))\n        with requests.get(url, stream=True) as response:\n            if response.status_code != 200:\n                raise RuntimeError(\"Wrong URL - {}\".format(url))\n            with open(download_filename, 'wb') as f:\n                for chunk in response.iter_content(chunk_size=2048):\n                    if chunk:\n                        f.write(chunk)\n    else:\n        print(\"{} already exists. Skipping download.\".format(download_filename))\n    return download_filename", "_____no_output_____" ] ], [ [ "#### Load the data\n\n(The next cell assumes you downloaded the data into your ```Users/$(logname)/``` home directory)", "_____no_output_____" ] ], [ [ "if os.environ.get('LOGNAME') is None:\n    raise ValueError(\"WARNING: LOGNAME environment variable not set!\")", "_____no_output_____" ], [ "box_directory = os.path.join(\"/Users/\", os.environ['LOGNAME'], \"box_data\")\nbox_directory", "_____no_output_____" ], [ "if not os.path.isdir(box_directory):\n    try:\n        os.mkdir(box_directory)\n    except:\n        raise OSError(\"Unable to create {}\".format(box_directory))", "_____no_output_____" ], [ "for file_info in files:\n    file_url, filename = file_info\n    outfile = download_file(file_url, filename, output_directory=box_directory)", "_____no_output_____" ], [ "grism = \"R\"\nmod = \"A\"\nfilters = [\"F250M\",\"F277W\",\"F300M\",\"F322W2\",\"F335M\",\"F356W\",\"F360M\",\"F410M\",\"F430M\",\"F444W\",\"F460M\",\"F480M\"]", "_____no_output_____" ], [ "filenames = []\nfor fil in filters:\n    filenames.append(os.path.join(box_directory, \"NIRCam.%s.%s.%s.1st.sensitivity.fits\" % (fil,grism,mod)))", "_____no_output_____" ], [ "filenames", "_____no_output_____" ] ], [ [ "## Generate the First Order Grism + Filter Sensitivity Plot", "_____no_output_____" ], [ "### Define some convenience functions", "_____no_output_____" ] ], [ [ "def find_nearest(array,value):\n    idx = (np.abs(array-value)).argmin()\n    return array[idx]\n\n\ndef find_mid(w,s,w0,thr=0.05):\n    fct = interp1d(w,s,bounds_error=None,fill_value='extrapolate')\n    def func(x):\n        # print(\"x:\", x)\n        return np.abs(fct(x)-thr)\n\n    res = fmin(func,w0)\n\n    return res[0]", "_____no_output_____" ] ], [ [ "### Create the plots", "_____no_output_____" ] ], [ [ "f, ax1 = plt.subplots(1, figsize=(15, 10))\n\nNUM_COLORS = len(filters)\n\ncm = pylab.get_cmap('tab10')\n\ngrism = \"R\"\nmod = \"A\"\nfor i,fil,fname in zip(range(NUM_COLORS),filters,filenames):\n    color = cm(1.*i/NUM_COLORS)\n    d = fits.open(fname)\n    w = d[1].data[\"WAVELENGTH\"]\n    s = d[1].data[\"SENSITIVITY\"]/(1e17)\n    ax1.plot(w,s,label=fil,lw=4,color=color)\n
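# note: the SENSITIVITY column was divided by 1e17 above, so the curves are in\n# units of 1e17 e-/s per erg/s/cm^2/A; the y-axis label set below states this.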
\nax1.legend(fontsize=16)\nminy,maxy = ax1.get_ylim()\nminx,maxx = ax1.get_xlim()\n\nax1.set_ylim(miny,2.15)\nax1.set_xlim(2.1,maxx)\nax1.tick_params(labelsize=18)\nf.text(0.5, 0.04, 'Wavelength ($\mu m$)', ha='center', fontsize=22)\nf.text(0.03, 0.5, 'Sensitivity ('+r'$1 \times 10^{17}\ \frac{e^{-} s^{-1}}{erg\ s^{-1} cm^{-2} A^{-1}}$'+')', va='center', rotation='vertical', fontsize=22)", "_____no_output_____" ] ], [ [ "### Figure option 2: filter name positions", "_____no_output_____" ] ], [ [ "f, ax1 = plt.subplots(1, figsize=(15, 10))\n\nthr = 0.05 # 5% of peak boundaries\n\nNUM_COLORS = len(filters)\n\ncm = pylab.get_cmap('tab10')\n\nfor i,fil,fname in zip(range(NUM_COLORS),filters,filenames):\n    color = cm(1.*i/NUM_COLORS)\n    d = fits.open(fname)\n    w = d[1].data[\"WAVELENGTH\"]\n    s = d[1].data[\"SENSITIVITY\"]/(1e17)\n\n    wmin,wmax = np.min(w),np.max(w)\n    vg = w<(wmax+wmin)/2.\n    w1 = find_mid(w[vg],s[vg],wmin,thr)\n\n    vg = w>(wmax+wmin)/2.\n    w2 = find_mid(w[vg],s[vg],wmax,thr)\n\n    if fil == 'F356W':\n        ax1.text((w2+w1)/2 -0.04, s[np.where(w == find_nearest(w, (w2+w1)/2))]+0.25, fil, ha='center',color=color,fontsize=16,weight='bold')\n    elif fil == 'F335M':\n        ax1.text((w2+w1)/2 -0.03, s[np.where(w == find_nearest(w, (w2+w1)/2))]+0.22, fil, ha='center',color=color,fontsize=16,weight='bold')\n    elif fil == 'F460M':\n        ax1.text((w2+w1)/2+0.15, s[np.where(w == find_nearest(w, (w2+w1)/2))]+0.12, fil, ha='center',color=color,fontsize=16,weight='bold')\n    elif fil == 'F480M':\n        ax1.text((w2+w1)/2+0.15, s[np.where(w == find_nearest(w, (w2+w1)/2))]+0.1, fil, ha='center',color=color,fontsize=16,weight='bold')\n    else:\n        ax1.text((w2+w1)/2 -0.04, s[np.where(w == find_nearest(w, (w2+w1)/2))]+0.2, fil, ha='center',color=color,fontsize=16,weight='bold')\n    ax1.plot(w,s,label=fil,lw=4,color=color)\n\n\nminy,maxy = ax1.get_ylim()\nminx,maxx = ax1.get_xlim()\n\nax1.set_ylim(miny,2.15)\nax1.set_xlim(2.1,maxx)\nax1.tick_params(labelsize=18)\nf.text(0.5, 0.04, 'Wavelength ($\mu m$)', ha='center', fontsize=22)\nf.text(0.03, 0.5, 'Sensitivity ('+r'$1 \times 10^{17}\ \frac{e^{-} s^{-1}}{erg\ s^{-1} cm^{-2} A^{-1}}$'+')', va='center', rotation='vertical', fontsize=22)", "_____no_output_____" ] ], [ [ "## Issues", "_____no_output_____" ], [ "* None", "_____no_output_____" ], [ "## About this Notebook\n**Authors:** \nNor Pirzkal & Alicia Canipe\n\n**Updated On:** \nApril 10, 2019", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ] ]
d029e39e0fc4ef237aaa6e2bdd33933dd7b51872
147,157
ipynb
Jupyter Notebook
image/2. Flower Classification with TPUs/kaggle/fast-pytorch-xla-for-tpu-with-multiprocessing.ipynb
nishchalnishant/Completed_Kaggle_competitions
fc920af79f09de642e1e590cdc281bfbf5a92db3
[ "MIT" ]
null
null
null
image/2. Flower Classification with TPUs/kaggle/fast-pytorch-xla-for-tpu-with-multiprocessing.ipynb
nishchalnishant/Completed_Kaggle_competitions
fc920af79f09de642e1e590cdc281bfbf5a92db3
[ "MIT" ]
null
null
null
image/2. Flower Classification with TPUs/kaggle/fast-pytorch-xla-for-tpu-with-multiprocessing.ipynb
nishchalnishant/Completed_Kaggle_competitions
fc920af79f09de642e1e590cdc281bfbf5a92db3
[ "MIT" ]
null
null
null
56.381992
146
0.561971
[ [ [ "**Version 2**: disable unfreezing for speed", "_____no_output_____" ], [ "## setup for pytorch/xla on TPU", "_____no_output_____" ] ], [ [ "import os\nimport collections\nfrom datetime import datetime, timedelta\n\nos.environ[\"XRT_TPU_CONFIG\"] = \"tpu_worker;0;10.0.0.2:8470\"\n\n_VersionConfig = collections.namedtuple('_VersionConfig', 'wheels,server')\nVERSION = \"torch_xla==nightly\"\nCONFIG = {\n 'torch_xla==nightly': _VersionConfig('nightly', 'XRT-dev{}'.format(\n (datetime.today() - timedelta(1)).strftime('%Y%m%d')))}[VERSION]\n\nDIST_BUCKET = 'gs://tpu-pytorch/wheels'\nTORCH_WHEEL = 'torch-{}-cp36-cp36m-linux_x86_64.whl'.format(CONFIG.wheels)\nTORCH_XLA_WHEEL = 'torch_xla-{}-cp36-cp36m-linux_x86_64.whl'.format(CONFIG.wheels)\nTORCHVISION_WHEEL = 'torchvision-{}-cp36-cp36m-linux_x86_64.whl'.format(CONFIG.wheels)\n\n!export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH\n!apt-get install libomp5 -y\n!apt-get install libopenblas-dev -y\n\n!pip uninstall -y torch torchvision\n!gsutil cp \"$DIST_BUCKET/$TORCH_WHEEL\" .\n!gsutil cp \"$DIST_BUCKET/$TORCH_XLA_WHEEL\" .\n!gsutil cp \"$DIST_BUCKET/$TORCHVISION_WHEEL\" .\n!pip install \"$TORCH_WHEEL\"\n!pip install \"$TORCH_XLA_WHEEL\"\n!pip install \"$TORCHVISION_WHEEL\"", "\r\n\r\n\r\nThe following NEW packages will be installed:\r\n libomp5\r\n0 upgraded, 1 newly installed, 0 to remove and 32 not upgraded.\r\nNeed to get 228 kB of archives.\r\nAfter this operation, 750 kB of additional disk space will be used.\r\nGet:1 http://deb.debian.org/debian stretch/main amd64 libomp5 amd64 3.9.1-1 [228 kB]\r\nFetched 228 kB in 0s (5208 kB/s)\r\ndebconf: delaying package configuration, since apt-utils is not installed\r\nSelecting previously unselected package libomp5:amd64.\r\n(Reading database ... 59973 files and directories currently installed.)\r\nPreparing to unpack .../libomp5_3.9.1-1_amd64.deb ...\r\nUnpacking libomp5:amd64 (3.9.1-1) ...\r\nSetting up libomp5:amd64 (3.9.1-1) ...\r\nProcessing triggers for libc-bin (2.24-11+deb9u4) ...\r\n\r\n\r\n\r\nThe following additional packages will be installed:\r\n libopenblas-base\r\nThe following NEW packages will be installed:\r\n libopenblas-base libopenblas-dev\r\n0 upgraded, 2 newly installed, 0 to remove and 32 not upgraded.\r\nNeed to get 7602 kB of archives.\r\nAfter this operation, 91.5 MB of additional disk space will be used.\r\nGet:1 http://deb.debian.org/debian stretch/main amd64 libopenblas-base amd64 0.2.19-3 [3793 kB]\r\nGet:2 http://deb.debian.org/debian stretch/main amd64 libopenblas-dev amd64 0.2.19-3 [3809 kB]\r\nFetched 7602 kB in 0s (35.9 MB/s)\r\ndebconf: delaying package configuration, since apt-utils is not installed\r\nSelecting previously unselected package libopenblas-base.\r\n(Reading database ... 
59978 files and directories currently installed.)\r\nPreparing to unpack .../libopenblas-base_0.2.19-3_amd64.deb ...\r\nUnpacking libopenblas-base (0.2.19-3) ...\r\nSelecting previously unselected package libopenblas-dev.\r\nPreparing to unpack .../libopenblas-dev_0.2.19-3_amd64.deb ...\r\nUnpacking libopenblas-dev (0.2.19-3) ...\r\nProcessing triggers for libc-bin (2.24-11+deb9u4) ...\r\nSetting up libopenblas-base (0.2.19-3) ...\r\nupdate-alternatives: using /usr/lib/openblas-base/libblas.so.3 to provide /usr/lib/libblas.so.3 (libblas.so.3) in auto mode\r\nupdate-alternatives: using /usr/lib/openblas-base/liblapack.so.3 to provide /usr/lib/liblapack.so.3 (liblapack.so.3) in auto mode\r\nSetting up libopenblas-dev (0.2.19-3) ...\r\nupdate-alternatives: using /usr/lib/openblas-base/libblas.so to provide /usr/lib/libblas.so (libblas.so) in auto mode\r\nupdate-alternatives: using /usr/lib/openblas-base/liblapack.so to provide /usr/lib/liblapack.so (liblapack.so) in auto mode\r\nProcessing triggers for libc-bin (2.24-11+deb9u4) ...\r\nFound existing installation: torch 1.4.0\r\nUninstalling torch-1.4.0:\r\n Successfully uninstalled torch-1.4.0\r\nFound existing installation: torchvision 0.5.0\r\nUninstalling torchvision-0.5.0:\r\n Successfully uninstalled torchvision-0.5.0\r\nCopying gs://tpu-pytorch/wheels/torch-nightly-cp36-cp36m-linux_x86_64.whl...\r\n\r\nOperation completed over 1 objects/77.8 MiB. \r\nCopying gs://tpu-pytorch/wheels/torch_xla-nightly-cp36-cp36m-linux_x86_64.whl...\r\n\r\nOperation completed over 1 objects/112.7 MiB. \r\nCopying gs://tpu-pytorch/wheels/torchvision-nightly-cp36-cp36m-linux_x86_64.whl...\r\n\r\nOperation completed over 1 objects/2.5 MiB. \r\nProcessing ./torch-nightly-cp36-cp36m-linux_x86_64.whl\r\n\u001b[31mERROR: fastai 1.0.60 requires torchvision, which is not installed.\u001b[0m\r\n\u001b[31mERROR: catalyst 20.2.1 requires torchvision>=0.2.1, which is not installed.\u001b[0m\r\n\u001b[31mERROR: allennlp 0.9.0 has requirement spacy<2.2,>=2.1.0, but you'll have spacy 2.2.3 which is incompatible.\u001b[0m\r\nInstalling collected packages: torch\r\nSuccessfully installed torch-1.5.0a0+e0b90b8\r\nProcessing ./torch_xla-nightly-cp36-cp36m-linux_x86_64.whl\r\nInstalling collected packages: torch-xla\r\nSuccessfully installed torch-xla-0.8+f1455a7\r\nProcessing ./torchvision-nightly-cp36-cp36m-linux_x86_64.whl\r\nRequirement already satisfied: numpy in /opt/conda/lib/python3.6/site-packages (from torchvision==nightly) (1.18.1)\r\nRequirement already satisfied: pillow>=4.1.1 in /opt/conda/lib/python3.6/site-packages (from torchvision==nightly) (5.4.1)\r\nRequirement already satisfied: torch in /opt/conda/lib/python3.6/site-packages (from torchvision==nightly) (1.5.0a0+e0b90b8)\r\nRequirement already satisfied: six in /opt/conda/lib/python3.6/site-packages (from torchvision==nightly) (1.14.0)\r\nInstalling collected packages: torchvision\r\nSuccessfully installed torchvision-0.6.0a0+b2e9565\r\n" ] ], [ [ "## Imports", "_____no_output_____" ] ], [ [ "import os\nimport re\nimport cv2\nimport time\nimport tensorflow\nimport collections\nimport numpy as np\nimport pandas as pd\nfrom tqdm import tqdm\nfrom glob import glob\nfrom PIL import Image\nimport requests, threading\nimport matplotlib.pyplot as plt\nfrom datetime import datetime, timedelta\n\nimport torch\nimport torchvision\nimport torch.nn as nn\nimport torch.optim as optim\nimport torch.nn.functional as F\nfrom torchvision import datasets\nfrom torchvision import transforms\nfrom torch.autograd import 
Variable\nfrom torch.utils.data import Dataset, DataLoader\nfrom torch.optim.lr_scheduler import OneCycleLR\n\nimport torch_xla\nimport torch_xla.utils.utils as xu\nimport torch_xla.core.xla_model as xm\nimport torch_xla.debug.metrics as met\nimport torch_xla.distributed.data_parallel as dp\nimport torch_xla.distributed.parallel_loader as pl\nimport torch_xla.distributed.xla_multiprocessing as xmp\n\nimport warnings\nwarnings.filterwarnings(\"ignore\")\n\ntorch.manual_seed(42)\ntorch.set_default_tensor_type('torch.FloatTensor')", "_____no_output_____" ], [ "# do not uncomment see https://github.com/pytorch/xla/issues/1587\n\n# xm.get_xla_supported_devices()\n# xm.xrt_world_size() # 1", "_____no_output_____" ] ], [ [ "## Dataset", "_____no_output_____" ] ], [ [ "DATASET_DIR = '/kaggle/input/104-flowers-garden-of-eden/jpeg-512x512'\nTRAIN_DIR = DATASET_DIR + '/train'\nVAL_DIR = DATASET_DIR + '/val'\nTEST_DIR = DATASET_DIR + '/test'\nBATCH_SIZE = 16 # per core \nNUM_EPOCH = 25", "_____no_output_____" ], [ "normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],std=[0.229, 0.224, 0.225])\ntrain_transform = transforms.Compose([transforms.RandomResizedCrop(224),\n transforms.RandomHorizontalFlip(0.5),\n transforms.ToTensor(),\n normalize])\n\nvalid_transform = transforms.Compose([transforms.Resize((224,224)),\n transforms.ToTensor(),\n normalize])", "_____no_output_____" ], [ "train = datasets.ImageFolder(TRAIN_DIR, transform=train_transform)\nvalid = datasets.ImageFolder(VAL_DIR, transform=train_transform)\ntrain = torch.utils.data.ConcatDataset([train, valid])\n\n# print out some data stats\nprint('Num training images: ', len(train))\nprint('Num test images: ', len(valid))", "Num training images: 16465\nNum test images: 3712\n" ] ], [ [ "## Model", "_____no_output_____" ] ], [ [ "class MyModel(nn.Module):\n\n def __init__(self):\n super(MyModel, self).__init__()\n \n self.base_model = torchvision.models.densenet201(pretrained=True)\n self.base_model.classifier = nn.Identity()\n self.fc = torch.nn.Sequential(\n torch.nn.Linear(1920, 1024, bias = True),\n torch.nn.BatchNorm1d(1024),\n torch.nn.ReLU(inplace=True),\n torch.nn.Dropout(0.3),\n torch.nn.Linear(1024, 512, bias = True),\n torch.nn.BatchNorm1d(512),\n torch.nn.ReLU(inplace=True),\n torch.nn.Dropout(0.3),\n torch.nn.Linear(512, 104))\n \n def forward(self, inputs):\n x = self.base_model(inputs)\n return self.fc(x)", "_____no_output_____" ], [ "model = MyModel()\nprint(model)\ndel model", "Downloading: \"https://download.pytorch.org/models/densenet201-c1103571.pth\" to /root/.cache/torch/checkpoints/densenet201-c1103571.pth\n" ] ], [ [ "## Training", "_____no_output_____" ] ], [ [ "def train_model():\n train = datasets.ImageFolder(TRAIN_DIR, transform=train_transform)\n valid = datasets.ImageFolder(VAL_DIR, transform=train_transform)\n train = torch.utils.data.ConcatDataset([train, valid])\n \n torch.manual_seed(42)\n \n train_sampler = torch.utils.data.distributed.DistributedSampler(\n train,\n num_replicas=xm.xrt_world_size(),\n rank=xm.get_ordinal(),\n shuffle=True)\n \n train_loader = torch.utils.data.DataLoader(\n train,\n batch_size=BATCH_SIZE,\n sampler=train_sampler,\n num_workers=0,\n drop_last=True) # print(len(train_loader))\n \n \n xm.master_print(f\"Train for {len(train_loader)} steps per epoch\")\n \n # Scale learning rate to num cores\n learning_rate = 0.0001 * xm.xrt_world_size()\n\n # Get loss function, optimizer, and model\n device = xm.xla_device()\n\n model = MyModel()\n \n for param in 
model.base_model.parameters(): # freeze some layers\n        param.requires_grad = False\n\n    model = model.to(device)\n    loss_fn = nn.CrossEntropyLoss()\n    optimizer = optim.Adam(model.parameters(), lr=learning_rate, weight_decay=5e-4)\n    scheduler = OneCycleLR(optimizer,\n                           learning_rate,\n                           div_factor=10.0,\n                           final_div_factor=50.0,\n                           epochs=NUM_EPOCH,\n                           steps_per_epoch=len(train_loader))\n\n\n    def train_loop_fn(loader):\n        tracker = xm.RateTracker()\n        model.train()\n        total_samples, correct = 0, 0\n        for x, (data, target) in enumerate(loader):\n            optimizer.zero_grad()\n            output = model(data)\n            loss = loss_fn(output, target)\n            loss.backward()\n            xm.optimizer_step(optimizer)\n            tracker.add(data.shape[0])\n            pred = output.max(1, keepdim=True)[1]\n            correct += pred.eq(target.view_as(pred)).sum().item()\n            total_samples += data.size()[0]\n            scheduler.step()  # OneCycleLR is stepped once per batch\n            if x % 40 == 0:\n                print('[xla:{}]({})\tLoss={:.3f}\tRate={:.2f}\tGlobalRate={:.2f}'.format(\n                    xm.get_ordinal(), x, loss.item(), tracker.rate(),\n                    tracker.global_rate()), flush=True)\n        accuracy = 100.0 * correct / total_samples\n        print('[xla:{}] Accuracy={:.2f}%'.format(xm.get_ordinal(), accuracy), flush=True)\n        return accuracy\n\n    # Train loops\n    accuracy = []\n    for epoch in range(1, NUM_EPOCH + 1):\n        start = time.time()\n        para_loader = pl.ParallelLoader(train_loader, [device])\n        accuracy.append(train_loop_fn(para_loader.per_device_loader(device)))\n        xm.master_print(\"Finished training epoch {} train-acc {:.2f} in {:.2f} sec\"\\\n                        .format(epoch, accuracy[-1], time.time() - start))\n        xm.save(model.state_dict(), \"./model.pt\")\n\n#         if epoch == 15: #unfreeze\n#             for param in model.base_model.parameters():\n#                 param.requires_grad = True\n\n    return accuracy", "_____no_output_____" ], [ "# Start training processes\ndef _mp_fn(rank, flags):\n    torch.set_default_tensor_type('torch.FloatTensor')\n    train_model()  # per-core metrics are printed inside; xm.save writes from the master\n\nFLAGS={}\nxmp.spawn(_mp_fn, args=(FLAGS,), nprocs=8, start_method='fork')", "Train for 128 steps per 
epoch\n[xla:1](0)\tLoss=4.989\tRate=0.89\tGlobalRate=0.89\n[xla:3](0)\tLoss=4.780\tRate=0.81\tGlobalRate=0.81\n[xla:4](0)\tLoss=4.716\tRate=0.81\tGlobalRate=0.81\n[xla:2](0)\tLoss=4.723\tRate=0.69\tGlobalRate=0.69\n[xla:7](0)\tLoss=4.702\tRate=0.83\tGlobalRate=0.83\n[xla:0](0)\tLoss=4.785\tRate=0.57\tGlobalRate=0.57\n[xla:5](0)\tLoss=4.939\tRate=0.70\tGlobalRate=0.70\n[xla:6](0)\tLoss=4.991\tRate=0.73\tGlobalRate=0.73\n[xla:6](40)\tLoss=4.108\tRate=8.98\tGlobalRate=9.93\n[xla:2](40)\tLoss=3.599\tRate=8.96\tGlobalRate=9.73\n[xla:1](40)\tLoss=3.885\tRate=9.04\tGlobalRate=10.55\n[xla:0](40)\tLoss=3.802\tRate=8.91\tGlobalRate=9.08\n[xla:5](40)\tLoss=4.445\tRate=8.97\tGlobalRate=9.77\n[xla:3](40)\tLoss=4.046\tRate=9.01\tGlobalRate=10.27\n[xla:7](40)\tLoss=4.111\tRate=9.02\tGlobalRate=10.33\n[xla:4](40)\tLoss=4.295\tRate=9.01\tGlobalRate=10.28\n[xla:2](80)\tLoss=3.008\tRate=17.59\tGlobalRate=13.66\n[xla:7](80)\tLoss=3.415\tRate=17.61\tGlobalRate=14.25\n[xla:6](80)\tLoss=3.197\tRate=17.60\tGlobalRate=13.86\n[xla:3](80)\tLoss=3.940\tRate=17.61\tGlobalRate=14.20\n[xla:1](80)\tLoss=3.327\tRate=17.62\tGlobalRate=14.47\n[xla:5](80)\tLoss=2.703\tRate=17.59\tGlobalRate=13.70\n[xla:0](80)\tLoss=3.016\tRate=17.57\tGlobalRate=13.00\n[xla:4](80)\tLoss=3.871\tRate=17.61\tGlobalRate=14.20\n[xla:7](120)\tLoss=3.458\tRate=21.58\tGlobalRate=16.50\n[xla:5](120)\tLoss=3.633\tRate=21.57\tGlobalRate=16.00\n[xla:3](120)\tLoss=2.451\tRate=21.57\tGlobalRate=16.45\n[xla:4](120)\tLoss=2.573\tRate=21.58\tGlobalRate=16.45\n[xla:1](120)\tLoss=2.963\tRate=21.58\tGlobalRate=16.69\n[xla:2](120)\tLoss=2.925\tRate=21.57\tGlobalRate=15.96\n[xla:6](120)\tLoss=2.663\tRate=21.57\tGlobalRate=16.14\n[xla:0](120)\tLoss=2.882\tRate=21.56\tGlobalRate=15.35\n[xla:2] Accuracy=23.83%\n[xla:3] Accuracy=24.27%\n[xla:0] Accuracy=24.17%\n[xla:1] Accuracy=24.12%\n[xla:4] Accuracy=23.34%\n[xla:5] Accuracy=22.66%\nFinished training epoch 1 train-acc 24.17 in 128.63 sec\n[xla:7] Accuracy=22.95%\n[xla:6] 
Accuracy=23.14%\n[xla:7](0)\tLoss=2.414\tRate=5.76\tGlobalRate=5.76\n[xla:0](0)\tLoss=2.922\tRate=5.77\tGlobalRate=5.77\n[xla:6](0)\tLoss=2.563\tRate=5.79\tGlobalRate=5.79\n[xla:1](0)\tLoss=2.955\tRate=5.79\tGlobalRate=5.79\n[xla:4](0)\tLoss=2.739\tRate=5.77\tGlobalRate=5.77\n[xla:3](0)\tLoss=2.165\tRate=5.83\tGlobalRate=5.83\n[xla:5](0)\tLoss=2.761\tRate=5.75\tGlobalRate=5.75\n[xla:2](0)\tLoss=2.325\tRate=5.76\tGlobalRate=5.76\n[xla:3](40)\tLoss=3.028\tRate=16.53\tGlobalRate=22.02\n[xla:0](40)\tLoss=2.728\tRate=16.51\tGlobalRate=22.00\n[xla:6](40)\tLoss=2.971\tRate=16.51\tGlobalRate=22.00\n[xla:4](40)\tLoss=3.325\tRate=16.51\tGlobalRate=22.00\n[xla:7](40)\tLoss=3.123\tRate=16.50\tGlobalRate=21.99\n[xla:2](40)\tLoss=2.441\tRate=16.50\tGlobalRate=21.99\n[xla:5](40)\tLoss=3.490\tRate=16.49\tGlobalRate=21.99\n[xla:1](40)\tLoss=2.592\tRate=16.51\tGlobalRate=22.00\n[xla:6](80)\tLoss=2.163\tRate=17.94\tGlobalRate=20.35\n[xla:1](80)\tLoss=2.347\tRate=17.94\tGlobalRate=20.35\n[xla:7](80)\tLoss=2.023\tRate=17.94\tGlobalRate=20.35\n[xla:4](80)\tLoss=3.082\tRate=17.94\tGlobalRate=20.35\n[xla:2](80)\tLoss=1.798\tRate=17.94\tGlobalRate=20.35\n[xla:3](80)\tLoss=2.853\tRate=17.95\tGlobalRate=20.36\n[xla:5](80)\tLoss=2.171\tRate=17.94\tGlobalRate=20.35\n[xla:0](80)\tLoss=2.048\tRate=17.94\tGlobalRate=20.35\n[xla:5](120)\tLoss=2.252\tRate=21.94\tGlobalRate=21.58\n[xla:1](120)\tLoss=2.023\tRate=21.94\tGlobalRate=21.59\n[xla:4](120)\tLoss=1.703\tRate=21.94\tGlobalRate=21.58\n[xla:2](120)\tLoss=2.222\tRate=21.94\tGlobalRate=21.58\n[xla:3](120)\tLoss=1.383\tRate=21.94\tGlobalRate=21.59\n[xla:6](120)\tLoss=1.686\tRate=21.94\tGlobalRate=21.59\n[xla:0](120)\tLoss=2.221\tRate=21.94\tGlobalRate=21.58\n[xla:7](120)\tLoss=2.730\tRate=21.94\tGlobalRate=21.58\n[xla:5] Accuracy=44.58%\n[xla:1] Accuracy=46.00%\n[xla:7] Accuracy=44.78%\n[xla:6] Accuracy=45.51%\n[xla:3] Accuracy=45.31%\n[xla:0] Accuracy=48.19%\n[xla:2] Accuracy=45.90%\nFinished training epoch 2 train-acc 48.19 in 92.46 sec\n[xla:4] 
Accuracy=46.68%\n[xla:2](0)\tLoss=1.348\tRate=6.10\tGlobalRate=6.10\n[xla:5](0)\tLoss=1.965\tRate=6.05\tGlobalRate=6.05\n[xla:1](0)\tLoss=2.241\tRate=6.02\tGlobalRate=6.02\n[xla:7](0)\tLoss=1.871\tRate=6.05\tGlobalRate=6.05\n[xla:6](0)\tLoss=1.966\tRate=6.06\tGlobalRate=6.06\n[xla:0](0)\tLoss=2.161\tRate=6.07\tGlobalRate=6.07\n[xla:3](0)\tLoss=1.798\tRate=6.09\tGlobalRate=6.09\n[xla:4](0)\tLoss=2.093\tRate=6.01\tGlobalRate=6.01\n[xla:2](40)\tLoss=1.825\tRate=17.35\tGlobalRate=23.12\n[xla:0](40)\tLoss=1.859\tRate=17.34\tGlobalRate=23.11\n[xla:3](40)\tLoss=2.373\tRate=17.35\tGlobalRate=23.12\n[xla:1](40)\tLoss=2.128\tRate=17.32\tGlobalRate=23.09\n[xla:6](40)\tLoss=2.580\tRate=17.34\tGlobalRate=23.10\n[xla:7](40)\tLoss=2.253\tRate=17.33\tGlobalRate=23.10\n[xla:5](40)\tLoss=2.722\tRate=17.33\tGlobalRate=23.10\n[xla:4](40)\tLoss=2.187\tRate=17.32\tGlobalRate=23.09\n[xla:4](80)\tLoss=2.246\tRate=18.03\tGlobalRate=20.57\n[xla:7](80)\tLoss=1.302\tRate=18.04\tGlobalRate=20.58\n[xla:0](80)\tLoss=1.371\tRate=18.04\tGlobalRate=20.58\n[xla:3](80)\tLoss=2.093\tRate=18.04\tGlobalRate=20.58\n[xla:5](80)\tLoss=1.693\tRate=18.04\tGlobalRate=20.58\n[xla:1](80)\tLoss=1.771\tRate=18.03\tGlobalRate=20.57\n[xla:2](80)\tLoss=1.094\tRate=18.04\tGlobalRate=20.58\n[xla:6](80)\tLoss=1.144\tRate=18.04\tGlobalRate=20.58\n[xla:3](120)\tLoss=0.705\tRate=19.41\tGlobalRate=20.50\n[xla:0](120)\tLoss=1.308\tRate=19.41\tGlobalRate=20.50\n[xla:4](120)\tLoss=0.973\tRate=19.41\tGlobalRate=20.49\n[xla:2](120)\tLoss=1.732\tRate=19.42\tGlobalRate=20.50\n[xla:1](120)\tLoss=1.382\tRate=19.41\tGlobalRate=20.49\n[xla:5](120)\tLoss=1.643\tRate=19.41\tGlobalRate=20.50\n[xla:6](120)\tLoss=0.947\tRate=19.41\tGlobalRate=20.50\n[xla:7](120)\tLoss=1.808\tRate=19.41\tGlobalRate=20.50\n[xla:7] Accuracy=62.21%\n[xla:4] Accuracy=64.16%\n[xla:5] Accuracy=62.11%\n[xla:2] Accuracy=63.92%\n[xla:6] Accuracy=62.94%\n[xla:0] Accuracy=65.14%\n[xla:1] Accuracy=63.18%\n[xla:3] Accuracy=62.94%\nFinished training epoch 3 train-acc 65.14 in 97.13 
sec\n[xla:7](0)\tLoss=1.025\tRate=5.97\tGlobalRate=5.97\n[xla:4](0)\tLoss=1.531\tRate=5.98\tGlobalRate=5.98\n[xla:3](0)\tLoss=0.876\tRate=5.92\tGlobalRate=5.92\n[xla:5](0)\tLoss=1.339\tRate=5.86\tGlobalRate=5.86\n[xla:0](0)\tLoss=1.646\tRate=5.88\tGlobalRate=5.88\n[xla:6](0)\tLoss=1.246\tRate=5.90\tGlobalRate=5.90\n[xla:1](0)\tLoss=1.717\tRate=5.85\tGlobalRate=5.85\n[xla:2](0)\tLoss=0.940\tRate=5.79\tGlobalRate=5.79\n[xla:1](40)\tLoss=1.652\tRate=17.00\tGlobalRate=22.68\n[xla:3](40)\tLoss=1.223\tRate=17.01\tGlobalRate=22.68\n[xla:0](40)\tLoss=1.591\tRate=17.01\tGlobalRate=22.69\n[xla:4](40)\tLoss=1.548\tRate=17.04\tGlobalRate=22.70\n[xla:5](40)\tLoss=1.994\tRate=17.00\tGlobalRate=22.68\n[xla:6](40)\tLoss=1.340\tRate=17.02\tGlobalRate=22.69\n[xla:2](40)\tLoss=1.142\tRate=16.99\tGlobalRate=22.67\n[xla:7](40)\tLoss=1.014\tRate=17.03\tGlobalRate=22.70\n[xla:4](80)\tLoss=1.145\tRate=21.72\tGlobalRate=23.71\n[xla:7](80)\tLoss=0.659\tRate=21.71\tGlobalRate=23.71\n[xla:6](80)\tLoss=0.885\tRate=21.71\tGlobalRate=23.70\n[xla:0](80)\tLoss=1.229\tRate=21.71\tGlobalRate=23.70\n[xla:2](80)\tLoss=0.829\tRate=21.70\tGlobalRate=23.69\n[xla:1](80)\tLoss=1.204\tRate=21.70\tGlobalRate=23.69\n[xla:3](80)\tLoss=1.697\tRate=21.70\tGlobalRate=23.69\n[xla:5](80)\tLoss=1.018\tRate=21.70\tGlobalRate=23.69\n[xla:4](120)\tLoss=0.758\tRate=23.00\tGlobalRate=23.76\n[xla:7](120)\tLoss=1.177\tRate=23.00\tGlobalRate=23.76\n[xla:5](120)\tLoss=1.129\tRate=22.99\tGlobalRate=23.75\n[xla:2](120)\tLoss=0.745\tRate=22.99\tGlobalRate=23.75\n[xla:3](120)\tLoss=0.391\tRate=23.00\tGlobalRate=23.75\n[xla:1](120)\tLoss=1.079\tRate=22.99\tGlobalRate=23.75\n[xla:0](120)\tLoss=1.019\tRate=23.00\tGlobalRate=23.75\n[xla:6](120)\tLoss=0.516\tRate=23.00\tGlobalRate=23.75\n[xla:0] Accuracy=74.37%\n[xla:2] Accuracy=74.51%\n[xla:5] Accuracy=74.17%\n[xla:1] Accuracy=74.61%\nFinished training epoch 4 train-acc 74.37 in 83.59 sec\n[xla:7] Accuracy=72.02%\n[xla:6] Accuracy=74.80%\n[xla:4] Accuracy=74.27%\n[xla:3] 
Accuracy=73.63%\n[xla:3](0)\tLoss=0.506\tRate=5.79\tGlobalRate=5.79\n[xla:0](0)\tLoss=0.983\tRate=5.81\tGlobalRate=5.81\n[xla:7](0)\tLoss=0.791\tRate=5.75\tGlobalRate=5.75\n[xla:5](0)\tLoss=1.441\tRate=5.75\tGlobalRate=5.75\n[xla:4](0)\tLoss=1.180\tRate=5.75\tGlobalRate=5.75\n[xla:6](0)\tLoss=1.197\tRate=5.77\tGlobalRate=5.77\n[xla:1](0)\tLoss=1.477\tRate=5.79\tGlobalRate=5.79\n[xla:2](0)\tLoss=0.864\tRate=5.77\tGlobalRate=5.77\n[xla:0](40)\tLoss=1.177\tRate=17.10\tGlobalRate=22.82\n[xla:7](40)\tLoss=0.832\tRate=17.07\tGlobalRate=22.80\n[xla:5](40)\tLoss=1.177\tRate=17.07\tGlobalRate=22.80\n[xla:1](40)\tLoss=1.128\tRate=17.09\tGlobalRate=22.81\n[xla:4](40)\tLoss=0.984\tRate=17.07\tGlobalRate=22.80\n[xla:6](40)\tLoss=0.951\tRate=17.08\tGlobalRate=22.80\n[xla:2](40)\tLoss=0.871\tRate=17.08\tGlobalRate=22.80\n[xla:3](40)\tLoss=0.729\tRate=17.09\tGlobalRate=22.81\n[xla:1](80)\tLoss=0.666\tRate=21.01\tGlobalRate=23.20\n[xla:6](80)\tLoss=0.421\tRate=21.00\tGlobalRate=23.20\n[xla:2](80)\tLoss=0.524\tRate=21.00\tGlobalRate=23.20\n[xla:4](80)\tLoss=0.667\tRate=21.00\tGlobalRate=23.19\n[xla:0](80)\tLoss=0.512\tRate=21.01\tGlobalRate=23.21\n[xla:7](80)\tLoss=0.560\tRate=21.00\tGlobalRate=23.19\n[xla:3](80)\tLoss=1.063\tRate=21.01\tGlobalRate=23.20\n[xla:5](80)\tLoss=0.707\tRate=21.00\tGlobalRate=23.19\n[xla:6](120)\tLoss=0.372\tRate=23.76\tGlobalRate=23.94\n[xla:2](120)\tLoss=0.595\tRate=23.76\tGlobalRate=23.94\n[xla:4](120)\tLoss=0.741\tRate=23.75\tGlobalRate=23.93\n[xla:1](120)\tLoss=0.763\tRate=23.76\tGlobalRate=23.94\n[xla:0](120)\tLoss=0.669\tRate=23.75\tGlobalRate=23.94\n[xla:7](120)\tLoss=0.929\tRate=23.75\tGlobalRate=23.93\n[xla:5](120)\tLoss=0.684\tRate=23.75\tGlobalRate=23.93\n[xla:3](120)\tLoss=0.457\tRate=23.75\tGlobalRate=23.94\n[xla:0] Accuracy=80.27%\n[xla:6] Accuracy=82.57%\n[xla:3] Accuracy=79.88%\n[xla:1] Accuracy=80.81%\nFinished training epoch 5 train-acc 80.27 in 83.07 sec\n[xla:5] Accuracy=79.83%\n[xla:2] Accuracy=82.52%\n[xla:7] Accuracy=80.13%\n[xla:4] 
Accuracy=80.66%\n[xla:5](0)\tLoss=0.858\tRate=5.81\tGlobalRate=5.81\n[xla:4](0)\tLoss=0.991\tRate=5.77\tGlobalRate=5.77\n[xla:2](0)\tLoss=0.447\tRate=5.83\tGlobalRate=5.83\n[xla:1](0)\tLoss=0.617\tRate=5.78\tGlobalRate=5.78\n[xla:7](0)\tLoss=0.436\tRate=5.84\tGlobalRate=5.84\n[xla:6](0)\tLoss=0.554\tRate=5.79\tGlobalRate=5.79\n[xla:3](0)\tLoss=0.664\tRate=5.78\tGlobalRate=5.78\n[xla:0](0)\tLoss=1.198\tRate=5.83\tGlobalRate=5.83\n[xla:5](40)\tLoss=1.213\tRate=16.77\tGlobalRate=22.36\n[xla:7](40)\tLoss=0.537\tRate=16.78\tGlobalRate=22.37\n[xla:0](40)\tLoss=0.681\tRate=16.78\tGlobalRate=22.37\n[xla:1](40)\tLoss=0.809\tRate=16.76\tGlobalRate=22.35\n[xla:3](40)\tLoss=0.404\tRate=16.76\tGlobalRate=22.35\n[xla:4](40)\tLoss=0.570\tRate=16.75\tGlobalRate=22.35\n[xla:6](40)\tLoss=0.691\tRate=16.76\tGlobalRate=22.35\n[xla:2](40)\tLoss=0.829\tRate=16.78\tGlobalRate=22.37\n[xla:2](80)\tLoss=0.537\tRate=21.62\tGlobalRate=23.53\n[xla:7](80)\tLoss=0.295\tRate=21.62\tGlobalRate=23.53\n[xla:0](80)\tLoss=0.694\tRate=21.62\tGlobalRate=23.53\n[xla:5](80)\tLoss=0.900\tRate=21.62\tGlobalRate=23.52\n[xla:1](80)\tLoss=0.749\tRate=21.61\tGlobalRate=23.52\n[xla:6](80)\tLoss=0.401\tRate=21.62\tGlobalRate=23.52\n[xla:4](80)\tLoss=0.697\tRate=21.61\tGlobalRate=23.52\n[xla:3](80)\tLoss=1.074\tRate=21.61\tGlobalRate=23.52\n[xla:0](120)\tLoss=1.010\tRate=23.28\tGlobalRate=23.80\n[xla:2](120)\tLoss=0.535\tRate=23.28\tGlobalRate=23.80\n[xla:5](120)\tLoss=0.798\tRate=23.27\tGlobalRate=23.80\n[xla:3](120)\tLoss=0.266\tRate=23.27\tGlobalRate=23.80\n[xla:6](120)\tLoss=0.806\tRate=23.27\tGlobalRate=23.80\n[xla:1](120)\tLoss=0.530\tRate=23.27\tGlobalRate=23.80\n[xla:4](120)\tLoss=0.620\tRate=23.27\tGlobalRate=23.80\n[xla:7](120)\tLoss=1.000\tRate=23.28\tGlobalRate=23.81\n[xla:3] Accuracy=84.67%\n[xla:5] Accuracy=84.23%\n[xla:4] Accuracy=84.57%\n[xla:2] Accuracy=85.74%\n[xla:6] Accuracy=85.25%\n[xla:7] Accuracy=83.84%\n[xla:0] Accuracy=84.96%\n[xla:1] Accuracy=85.40%\nFinished training epoch 6 train-acc 84.96 in 83.84 
sec\n[xla:3](0)\tLoss=0.341\tRate=5.55\tGlobalRate=5.55\n[xla:6](0)\tLoss=0.545\tRate=5.53\tGlobalRate=5.53\n[xla:5](0)\tLoss=0.869\tRate=5.57\tGlobalRate=5.57\n[xla:1](0)\tLoss=1.031\tRate=5.58\tGlobalRate=5.58\n[xla:7](0)\tLoss=0.610\tRate=5.57\tGlobalRate=5.57\n[xla:0](0)\tLoss=0.322\tRate=5.56\tGlobalRate=5.56\n[xla:4](0)\tLoss=0.806\tRate=5.58\tGlobalRate=5.58\n[xla:2](0)\tLoss=0.249\tRate=5.55\tGlobalRate=5.55\n[xla:0](40)\tLoss=0.440\tRate=17.17\tGlobalRate=22.97\n[xla:2](40)\tLoss=0.619\tRate=17.17\tGlobalRate=22.96\n[xla:6](40)\tLoss=0.728\tRate=17.16\tGlobalRate=22.95\n[xla:7](40)\tLoss=0.419\tRate=17.18\tGlobalRate=22.97\n[xla:1](40)\tLoss=0.726\tRate=17.18\tGlobalRate=22.98\n[xla:3](40)\tLoss=0.267\tRate=17.17\tGlobalRate=22.96\n[xla:5](40)\tLoss=0.845\tRate=17.18\tGlobalRate=22.97\n[xla:4](40)\tLoss=1.157\tRate=17.18\tGlobalRate=22.97\n[xla:2](80)\tLoss=0.830\tRate=21.63\tGlobalRate=23.75\n[xla:5](80)\tLoss=0.778\tRate=21.64\tGlobalRate=23.75\n[xla:0](80)\tLoss=0.318\tRate=21.64\tGlobalRate=23.75\n[xla:1](80)\tLoss=0.354\tRate=21.64\tGlobalRate=23.75\n[xla:6](80)\tLoss=0.075\tRate=21.63\tGlobalRate=23.74\n[xla:4](80)\tLoss=0.335\tRate=21.64\tGlobalRate=23.75\n[xla:3](80)\tLoss=1.132\tRate=21.63\tGlobalRate=23.75\n[xla:7](80)\tLoss=0.797\tRate=21.64\tGlobalRate=23.75\n[xla:2](120)\tLoss=0.302\tRate=23.44\tGlobalRate=24.04\n[xla:7](120)\tLoss=1.040\tRate=23.45\tGlobalRate=24.04\n[xla:3](120)\tLoss=0.175\tRate=23.44\tGlobalRate=24.04\n[xla:4](120)\tLoss=0.538\tRate=23.44\tGlobalRate=24.04\n[xla:0](120)\tLoss=0.725\tRate=23.44\tGlobalRate=24.04\n[xla:6](120)\tLoss=0.337\tRate=23.44\tGlobalRate=24.04\n[xla:5](120)\tLoss=0.414\tRate=23.45\tGlobalRate=24.04\n[xla:1](120)\tLoss=0.748\tRate=23.45\tGlobalRate=24.04\n[xla:6] Accuracy=85.89%\n[xla:5] Accuracy=86.23%\n[xla:4] Accuracy=86.62%\n[xla:2] Accuracy=88.28%\n[xla:0] Accuracy=87.45%\nFinished training epoch 7 train-acc 87.45 in 83.10 sec\n[xla:1] Accuracy=87.40%\n[xla:7] Accuracy=87.30%\n[xla:3] 
Accuracy=86.23%\n[xla:1](0)\tLoss=0.519\tRate=5.73\tGlobalRate=5.73\n[xla:3](0)\tLoss=0.238\tRate=5.72\tGlobalRate=5.72\n[xla:7](0)\tLoss=0.480\tRate=5.75\tGlobalRate=5.75\n[xla:0](0)\tLoss=0.296\tRate=5.80\tGlobalRate=5.80\n[xla:5](0)\tLoss=0.807\tRate=5.73\tGlobalRate=5.73\n[xla:4](0)\tLoss=0.178\tRate=5.76\tGlobalRate=5.76\n[xla:2](0)\tLoss=0.618\tRate=5.79\tGlobalRate=5.79\n[xla:6](0)\tLoss=0.217\tRate=5.76\tGlobalRate=5.76\n[xla:5](40)\tLoss=1.007\tRate=16.91\tGlobalRate=22.57\n[xla:2](40)\tLoss=0.463\tRate=16.93\tGlobalRate=22.60\n[xla:6](40)\tLoss=0.672\tRate=16.92\tGlobalRate=22.58\n[xla:1](40)\tLoss=0.281\tRate=16.91\tGlobalRate=22.57\n[xla:7](40)\tLoss=0.522\tRate=16.92\tGlobalRate=22.58\n[xla:0](40)\tLoss=0.279\tRate=16.94\tGlobalRate=22.60\n[xla:3](40)\tLoss=0.162\tRate=16.91\tGlobalRate=22.57\n[xla:4](40)\tLoss=0.784\tRate=16.92\tGlobalRate=22.58\n[xla:7](80)\tLoss=0.546\tRate=21.57\tGlobalRate=23.57\n[xla:4](80)\tLoss=0.423\tRate=21.57\tGlobalRate=23.57\n[xla:1](80)\tLoss=0.242\tRate=21.56\tGlobalRate=23.56\n[xla:2](80)\tLoss=0.233\tRate=21.58\tGlobalRate=23.58\n[xla:3](80)\tLoss=0.664\tRate=21.56\tGlobalRate=23.56\n[xla:6](80)\tLoss=0.424\tRate=21.57\tGlobalRate=23.57\n[xla:5](80)\tLoss=0.149\tRate=21.56\tGlobalRate=23.56\n[xla:0](80)\tLoss=0.400\tRate=21.57\tGlobalRate=23.57\n[xla:5](120)\tLoss=0.466\tRate=23.62\tGlobalRate=24.01\n[xla:4](120)\tLoss=0.328\tRate=23.62\tGlobalRate=24.02\n[xla:2](120)\tLoss=0.298\tRate=23.62\tGlobalRate=24.02\n[xla:7](120)\tLoss=0.312\tRate=23.62\tGlobalRate=24.02\n[xla:0](120)\tLoss=0.304\tRate=23.63\tGlobalRate=24.03\n[xla:1](120)\tLoss=0.494\tRate=23.62\tGlobalRate=24.02\n[xla:6](120)\tLoss=0.201\tRate=23.62\tGlobalRate=24.02\n[xla:3](120)\tLoss=0.078\tRate=23.62\tGlobalRate=24.01\n[xla:3] Accuracy=89.40%\n[xla:1] Accuracy=89.79%\n[xla:6] Accuracy=88.87%\n[xla:2] Accuracy=90.28%\n[xla:0] Accuracy=90.82%\n[xla:7] Accuracy=89.16%\n[xla:4] Accuracy=89.60%\n[xla:5] Accuracy=88.23%\nFinished training epoch 8 train-acc 90.82 in 82.89 
sec\n[xla:3](0)\tLoss=0.159\tRate=5.55\tGlobalRate=5.55\n[xla:7](0)\tLoss=0.407\tRate=5.50\tGlobalRate=5.50\n[xla:1](0)\tLoss=0.461\tRate=5.55\tGlobalRate=5.55\n[xla:2](0)\tLoss=0.457\tRate=5.50\tGlobalRate=5.50\n[xla:4](0)\tLoss=0.412\tRate=5.47\tGlobalRate=5.47\n[xla:0](0)\tLoss=0.488\tRate=5.51\tGlobalRate=5.51\n[xla:5](0)\tLoss=0.510\tRate=5.49\tGlobalRate=5.49\n[xla:6](0)\tLoss=0.302\tRate=5.50\tGlobalRate=5.50\n[xla:7](40)\tLoss=0.120\tRate=16.97\tGlobalRate=22.69\n[xla:3](40)\tLoss=0.310\tRate=16.99\tGlobalRate=22.71\n[xla:0](40)\tLoss=0.647\tRate=16.98\tGlobalRate=22.70\n[xla:4](40)\tLoss=0.127\tRate=16.96\tGlobalRate=22.68\n[xla:6](40)\tLoss=0.535\tRate=16.96\tGlobalRate=22.68\n[xla:1](40)\tLoss=0.606\tRate=16.99\tGlobalRate=22.71\n[xla:2](40)\tLoss=0.281\tRate=16.97\tGlobalRate=22.69\n[xla:5](40)\tLoss=0.367\tRate=16.96\tGlobalRate=22.69\n[xla:1](80)\tLoss=0.680\tRate=21.49\tGlobalRate=23.56\n[xla:6](80)\tLoss=0.238\tRate=21.48\tGlobalRate=23.54\n[xla:7](80)\tLoss=0.148\tRate=21.48\tGlobalRate=23.55\n[xla:3](80)\tLoss=0.212\tRate=21.49\tGlobalRate=23.56\n[xla:2](80)\tLoss=0.351\tRate=21.48\tGlobalRate=23.55\n[xla:0](80)\tLoss=0.211\tRate=21.48\tGlobalRate=23.55\n[xla:4](80)\tLoss=0.465\tRate=21.48\tGlobalRate=23.54\n[xla:5](80)\tLoss=0.449\tRate=21.48\tGlobalRate=23.54\n[xla:0](120)\tLoss=0.383\tRate=23.29\tGlobalRate=23.85\n[xla:3](120)\tLoss=0.211\tRate=23.29\tGlobalRate=23.86\n[xla:6](120)\tLoss=0.647\tRate=23.28\tGlobalRate=23.85\n[xla:1](120)\tLoss=0.446\tRate=23.28\tGlobalRate=23.86\n[xla:5](120)\tLoss=0.253\tRate=23.28\tGlobalRate=23.84\n[xla:4](120)\tLoss=0.215\tRate=23.28\tGlobalRate=23.84\n[xla:2](120)\tLoss=0.328\tRate=23.28\tGlobalRate=23.85\n[xla:7](120)\tLoss=0.733\tRate=23.28\tGlobalRate=23.84\n[xla:1] Accuracy=91.02%\n[xla:0] Accuracy=91.55%\n[xla:6] Accuracy=90.62%\n[xla:3] Accuracy=92.19%\n[xla:7] Accuracy=91.41%\n[xla:4] Accuracy=91.50%\nFinished training epoch 9 train-acc 91.55 in 83.32 sec\n[xla:2] Accuracy=91.36%\n[xla:5] 
Accuracy=91.70%\n[xla:0](0)\tLoss=0.272\tRate=5.82\tGlobalRate=5.82\n[xla:2](0)\tLoss=0.269\tRate=5.78\tGlobalRate=5.78\n[xla:7](0)\tLoss=0.213\tRate=5.77\tGlobalRate=5.77\n[xla:6](0)\tLoss=0.270\tRate=5.76\tGlobalRate=5.76\n[xla:3](0)\tLoss=0.460\tRate=5.73\tGlobalRate=5.73\n[xla:1](0)\tLoss=0.413\tRate=5.73\tGlobalRate=5.73\n[xla:4](0)\tLoss=0.232\tRate=5.78\tGlobalRate=5.78\n[xla:5](0)\tLoss=0.237\tRate=5.73\tGlobalRate=5.73\n[xla:5](40)\tLoss=0.376\tRate=17.18\tGlobalRate=22.95\n[xla:3](40)\tLoss=0.231\tRate=17.18\tGlobalRate=22.95\n[xla:1](40)\tLoss=0.265\tRate=17.18\tGlobalRate=22.95\n[xla:7](40)\tLoss=0.261\tRate=17.20\tGlobalRate=22.96\n[xla:6](40)\tLoss=0.575\tRate=17.19\tGlobalRate=22.96\n[xla:4](40)\tLoss=0.415\tRate=17.20\tGlobalRate=22.97\n[xla:0](40)\tLoss=0.520\tRate=17.22\tGlobalRate=22.99\n[xla:2](40)\tLoss=0.312\tRate=17.20\tGlobalRate=22.97\n[xla:4](80)\tLoss=0.384\tRate=21.32\tGlobalRate=23.50\n[xla:5](80)\tLoss=0.445\tRate=21.32\tGlobalRate=23.49\n[xla:1](80)\tLoss=0.346\tRate=21.31\tGlobalRate=23.49\n[xla:3](80)\tLoss=0.276\tRate=21.32\tGlobalRate=23.49\n[xla:2](80)\tLoss=0.479\tRate=21.32\tGlobalRate=23.50\n[xla:0](80)\tLoss=0.229\tRate=21.33\tGlobalRate=23.51\n[xla:6](80)\tLoss=0.373\tRate=21.32\tGlobalRate=23.50\n[xla:7](80)\tLoss=0.154\tRate=21.32\tGlobalRate=23.50\n[xla:1](120)\tLoss=0.319\tRate=23.64\tGlobalRate=24.02\n[xla:3](120)\tLoss=0.182\tRate=23.64\tGlobalRate=24.02\n[xla:2](120)\tLoss=0.190\tRate=23.64\tGlobalRate=24.03\n[xla:7](120)\tLoss=0.476\tRate=23.64\tGlobalRate=24.03\n[xla:4](120)\tLoss=0.161\tRate=23.64\tGlobalRate=24.03\n[xla:0](120)\tLoss=0.392\tRate=23.64\tGlobalRate=24.04\n[xla:6](120)\tLoss=0.040\tRate=23.63\tGlobalRate=24.03\n[xla:5](120)\tLoss=0.532\tRate=23.63\tGlobalRate=24.02\n[xla:0] Accuracy=93.46%\nFinished training epoch 10 train-acc 93.46 in 83.27 sec\n[xla:5] Accuracy=91.55%\n[xla:1] Accuracy=93.21%\n[xla:3] Accuracy=92.43%\n[xla:7] Accuracy=92.72%\n[xla:4] Accuracy=91.94%\n[xla:6] Accuracy=92.29%\n[xla:2] 
Accuracy=93.41%\n[xla:5](0)\tLoss=0.165\tRate=5.42\tGlobalRate=5.42\n[xla:1](0)\tLoss=0.285\tRate=5.39\tGlobalRate=5.39\n[xla:7](0)\tLoss=0.352\tRate=5.42\tGlobalRate=5.42\n[xla:3](0)\tLoss=0.137\tRate=5.41\tGlobalRate=5.41\n[xla:6](0)\tLoss=0.152\tRate=5.39\tGlobalRate=5.39\n[xla:2](0)\tLoss=0.672\tRate=5.38\tGlobalRate=5.38\n[xla:0](0)\tLoss=0.170\tRate=5.44\tGlobalRate=5.44\n[xla:4](0)\tLoss=0.203\tRate=5.38\tGlobalRate=5.38\n[xla:4](40)\tLoss=0.222\tRate=16.66\tGlobalRate=22.29\n[xla:3](40)\tLoss=0.124\tRate=16.68\tGlobalRate=22.30\n[xla:6](40)\tLoss=0.514\tRate=16.67\tGlobalRate=22.29\n[xla:5](40)\tLoss=0.522\tRate=16.68\tGlobalRate=22.30\n[xla:2](40)\tLoss=0.198\tRate=16.66\tGlobalRate=22.29\n[xla:7](40)\tLoss=0.150\tRate=16.68\tGlobalRate=22.30\n[xla:0](40)\tLoss=0.242\tRate=16.69\tGlobalRate=22.31\n[xla:1](40)\tLoss=0.296\tRate=16.66\tGlobalRate=22.29\n[xla:5](80)\tLoss=0.168\tRate=21.45\tGlobalRate=23.39\n[xla:4](80)\tLoss=0.244\tRate=21.44\tGlobalRate=23.38\n[xla:6](80)\tLoss=0.265\tRate=21.44\tGlobalRate=23.39\n[xla:1](80)\tLoss=0.130\tRate=21.44\tGlobalRate=23.38\n[xla:0](80)\tLoss=0.122\tRate=21.45\tGlobalRate=23.40\n[xla:2](80)\tLoss=0.102\tRate=21.44\tGlobalRate=23.38\n[xla:7](80)\tLoss=0.088\tRate=21.45\tGlobalRate=23.39\n[xla:3](80)\tLoss=0.719\tRate=21.44\tGlobalRate=23.39\n[xla:5](120)\tLoss=0.888\tRate=23.52\tGlobalRate=23.87\n[xla:1](120)\tLoss=0.283\tRate=23.52\tGlobalRate=23.87\n[xla:7](120)\tLoss=0.422\tRate=23.52\tGlobalRate=23.87\n[xla:4](120)\tLoss=0.406\tRate=23.52\tGlobalRate=23.87\n[xla:2](120)\tLoss=0.657\tRate=23.52\tGlobalRate=23.87\n[xla:6](120)\tLoss=0.396\tRate=23.52\tGlobalRate=23.87\n[xla:0](120)\tLoss=0.082\tRate=23.52\tGlobalRate=23.87\n[xla:3](120)\tLoss=0.254\tRate=23.52\tGlobalRate=23.87\n[xla:6] Accuracy=93.75%\n[xla:2] Accuracy=93.41%\n[xla:0] Accuracy=93.70%\n[xla:1] Accuracy=92.92%\n[xla:7] Accuracy=92.63%\n[xla:3] Accuracy=92.77%\n[xla:5] Accuracy=93.41%\n[xla:4] Accuracy=93.60%\nFinished training epoch 11 train-acc 93.70 in 83.24 
sec\n[xla:2](0)\tLoss=0.042\tRate=5.97\tGlobalRate=5.97\n[xla:6](0)\tLoss=0.213\tRate=5.96\tGlobalRate=5.96\n[xla:0](0)\tLoss=0.338\tRate=5.91\tGlobalRate=5.91\n[xla:7](0)\tLoss=0.230\tRate=5.95\tGlobalRate=5.95\n[xla:4](0)\tLoss=0.089\tRate=5.91\tGlobalRate=5.91\n[xla:5](0)\tLoss=0.827\tRate=5.89\tGlobalRate=5.89\n[xla:3](0)\tLoss=0.143\tRate=5.90\tGlobalRate=5.90\n[xla:1](0)\tLoss=0.449\tRate=5.88\tGlobalRate=5.88\n[xla:0](40)\tLoss=0.683\tRate=17.12\tGlobalRate=22.84\n[xla:4](40)\tLoss=0.071\tRate=17.13\tGlobalRate=22.84\n[xla:3](40)\tLoss=0.630\tRate=17.12\tGlobalRate=22.83\n[xla:1](40)\tLoss=0.419\tRate=17.11\tGlobalRate=22.83\n[xla:6](40)\tLoss=0.439\tRate=17.14\tGlobalRate=22.85\n[xla:5](40)\tLoss=0.272\tRate=17.12\tGlobalRate=22.83\n[xla:2](40)\tLoss=0.203\tRate=17.15\tGlobalRate=22.86\n[xla:7](40)\tLoss=0.171\tRate=17.14\tGlobalRate=22.85\n[xla:4](80)\tLoss=0.119\tRate=21.45\tGlobalRate=23.55\n[xla:2](80)\tLoss=0.497\tRate=21.46\tGlobalRate=23.56\n[xla:0](80)\tLoss=0.122\tRate=21.45\tGlobalRate=23.55\n[xla:5](80)\tLoss=0.287\tRate=21.44\tGlobalRate=23.55\n[xla:1](80)\tLoss=0.269\tRate=21.44\tGlobalRate=23.54\n[xla:7](80)\tLoss=0.197\tRate=21.45\tGlobalRate=23.56\n[xla:6](80)\tLoss=0.076\tRate=21.45\tGlobalRate=23.56\n[xla:3](80)\tLoss=0.349\tRate=21.44\tGlobalRate=23.54\n[xla:1](120)\tLoss=0.359\tRate=23.48\tGlobalRate=23.96\n[xla:5](120)\tLoss=0.278\tRate=23.48\tGlobalRate=23.96\n[xla:6](120)\tLoss=0.156\tRate=23.49\tGlobalRate=23.97\n[xla:7](120)\tLoss=0.143\tRate=23.49\tGlobalRate=23.97\n[xla:4](120)\tLoss=0.218\tRate=23.49\tGlobalRate=23.96\n[xla:0](120)\tLoss=0.071\tRate=23.48\tGlobalRate=23.96\n[xla:3](120)\tLoss=0.410\tRate=23.49\tGlobalRate=23.96\n[xla:2](120)\tLoss=0.018\tRate=23.49\tGlobalRate=23.97\n[xla:7] Accuracy=93.95%\n[xla:3] Accuracy=94.19%\n[xla:4] Accuracy=94.29%\n[xla:0] Accuracy=93.85%\n[xla:2] Accuracy=94.14%\n[xla:1] Accuracy=93.31%\nFinished training epoch 12 train-acc 93.85 in 83.11 sec\n[xla:5] Accuracy=93.31%\n[xla:6] 
Accuracy=92.92%\n[xla:5](0)\tLoss=0.250\tRate=5.57\tGlobalRate=5.57\n[xla:4](0)\tLoss=0.235\tRate=5.53\tGlobalRate=5.53\n[xla:2](0)\tLoss=0.120\tRate=5.55\tGlobalRate=5.55\n[xla:7](0)\tLoss=0.183\tRate=5.53\tGlobalRate=5.53\n[xla:6](0)\tLoss=0.200\tRate=5.57\tGlobalRate=5.57\n[xla:0](0)\tLoss=0.160\tRate=5.61\tGlobalRate=5.61\n[xla:3](0)\tLoss=0.348\tRate=5.55\tGlobalRate=5.55\n[xla:1](0)\tLoss=0.483\tRate=5.52\tGlobalRate=5.52\n[xla:2](40)\tLoss=0.207\tRate=16.49\tGlobalRate=22.01\n[xla:6](40)\tLoss=0.342\tRate=16.49\tGlobalRate=22.02\n[xla:3](40)\tLoss=0.255\tRate=16.49\tGlobalRate=22.02\n[xla:1](40)\tLoss=0.226\tRate=16.48\tGlobalRate=22.01\n[xla:5](40)\tLoss=0.309\tRate=16.49\tGlobalRate=22.02\n[xla:4](40)\tLoss=0.083\tRate=16.48\tGlobalRate=22.01\n[xla:7](40)\tLoss=0.148\tRate=16.48\tGlobalRate=22.01\n[xla:0](40)\tLoss=0.223\tRate=16.51\tGlobalRate=22.03\n[xla:4](80)\tLoss=0.249\tRate=21.22\tGlobalRate=23.12\n[xla:7](80)\tLoss=0.050\tRate=21.22\tGlobalRate=23.12\n[xla:5](80)\tLoss=0.087\tRate=21.23\tGlobalRate=23.13\n[xla:6](80)\tLoss=0.070\tRate=21.23\tGlobalRate=23.13\n[xla:2](80)\tLoss=0.272\tRate=21.22\tGlobalRate=23.12\n[xla:0](80)\tLoss=0.145\tRate=21.23\tGlobalRate=23.13\n[xla:1](80)\tLoss=0.093\tRate=21.22\tGlobalRate=23.12\n[xla:3](80)\tLoss=0.148\tRate=21.22\tGlobalRate=23.12\n[xla:2](120)\tLoss=0.137\tRate=23.55\tGlobalRate=23.74\n[xla:0](120)\tLoss=0.253\tRate=23.55\tGlobalRate=23.75\n[xla:1](120)\tLoss=0.188\tRate=23.55\tGlobalRate=23.74\n[xla:6](120)\tLoss=0.065\tRate=23.55\tGlobalRate=23.74\n[xla:3](120)\tLoss=0.023\tRate=23.55\tGlobalRate=23.74\n[xla:4](120)\tLoss=0.632\tRate=23.54\tGlobalRate=23.74\n[xla:7](120)\tLoss=0.314\tRate=23.54\tGlobalRate=23.74\n[xla:5](120)\tLoss=0.125\tRate=23.55\tGlobalRate=23.74\n[xla:5] Accuracy=95.21%\n[xla:2] Accuracy=93.60%\n[xla:3] Accuracy=93.75%\n[xla:6] Accuracy=94.24%\n[xla:4] Accuracy=95.17%\n[xla:0] Accuracy=93.26%\n[xla:7] Accuracy=95.17%\nFinished training epoch 13 train-acc 93.26 in 83.70 sec\n[xla:1] 
Accuracy=94.53%\n[xla:1](0)\tLoss=0.371\tRate=5.52\tGlobalRate=5.52\n[xla:5](0)\tLoss=0.302\tRate=5.47\tGlobalRate=5.47\n[xla:3](0)\tLoss=0.199\tRate=5.48\tGlobalRate=5.48\n[xla:6](0)\tLoss=0.165\tRate=5.50\tGlobalRate=5.50\n[xla:4](0)\tLoss=0.242\tRate=5.50\tGlobalRate=5.50\n[xla:0](0)\tLoss=0.098\tRate=5.52\tGlobalRate=5.52\n[xla:2](0)\tLoss=0.199\tRate=5.55\tGlobalRate=5.55\n[xla:7](0)\tLoss=0.056\tRate=5.49\tGlobalRate=5.49\n[xla:3](40)\tLoss=0.180\tRate=16.87\tGlobalRate=22.56\n[xla:4](40)\tLoss=0.161\tRate=16.89\tGlobalRate=22.58\n[xla:7](40)\tLoss=0.448\tRate=16.88\tGlobalRate=22.57\n[xla:2](40)\tLoss=0.047\tRate=16.90\tGlobalRate=22.59\n[xla:5](40)\tLoss=0.587\tRate=16.87\tGlobalRate=22.56\n[xla:6](40)\tLoss=0.223\tRate=16.88\tGlobalRate=22.57\n[xla:0](40)\tLoss=0.186\tRate=16.89\tGlobalRate=22.58\n[xla:1](40)\tLoss=0.402\tRate=16.89\tGlobalRate=22.58\n[xla:7](80)\tLoss=0.055\tRate=21.67\tGlobalRate=23.64\n[xla:3](80)\tLoss=0.702\tRate=21.67\tGlobalRate=23.64\n[xla:6](80)\tLoss=0.105\tRate=21.67\tGlobalRate=23.65\n[xla:0](80)\tLoss=0.182\tRate=21.67\tGlobalRate=23.65\n[xla:2](80)\tLoss=0.502\tRate=21.68\tGlobalRate=23.66\n[xla:1](80)\tLoss=0.177\tRate=21.67\tGlobalRate=23.65\n[xla:4](80)\tLoss=0.034\tRate=21.67\tGlobalRate=23.65\n[xla:5](80)\tLoss=0.341\tRate=21.66\tGlobalRate=23.64\n[xla:4](120)\tLoss=0.298\tRate=23.22\tGlobalRate=23.84\n[xla:0](120)\tLoss=0.206\tRate=23.22\tGlobalRate=23.85\n[xla:5](120)\tLoss=0.231\tRate=23.22\tGlobalRate=23.84\n[xla:1](120)\tLoss=0.089\tRate=23.22\tGlobalRate=23.85\n[xla:3](120)\tLoss=0.509\tRate=23.22\tGlobalRate=23.84\n[xla:6](120)\tLoss=0.256\tRate=23.22\tGlobalRate=23.84\n[xla:2](120)\tLoss=0.299\tRate=23.22\tGlobalRate=23.85\n[xla:7](120)\tLoss=0.371\tRate=23.22\tGlobalRate=23.84\n[xla:2] Accuracy=95.75%\n[xla:5] Accuracy=94.38%\n[xla:7] Accuracy=95.65%\n[xla:1] Accuracy=94.97%\n[xla:0] Accuracy=95.17%\n[xla:3] Accuracy=94.53%\n[xla:4] Accuracy=95.17%\nFinished training epoch 14 train-acc 95.17 in 83.63 sec\n[xla:6] 
Accuracy=95.26%\n[xla:4](0)\tLoss=0.085\tRate=5.51\tGlobalRate=5.51\n[xla:1](0)\tLoss=0.050\tRate=5.48\tGlobalRate=5.48\n[xla:0](0)\tLoss=0.109\tRate=5.55\tGlobalRate=5.55\n[xla:2](0)\tLoss=0.422\tRate=5.50\tGlobalRate=5.50\n[xla:5](0)\tLoss=0.232\tRate=5.48\tGlobalRate=5.48\n[xla:6](0)\tLoss=0.117\tRate=5.47\tGlobalRate=5.47\n[xla:7](0)\tLoss=0.047\tRate=5.49\tGlobalRate=5.49\n[xla:3](0)\tLoss=0.021\tRate=5.50\tGlobalRate=5.50\n[xla:2](40)\tLoss=0.157\tRate=16.96\tGlobalRate=22.68\n[xla:1](40)\tLoss=0.784\tRate=16.95\tGlobalRate=22.67\n[xla:6](40)\tLoss=0.197\tRate=16.95\tGlobalRate=22.66\n[xla:3](40)\tLoss=0.172\tRate=16.96\tGlobalRate=22.68\n[xla:4](40)\tLoss=0.180\tRate=16.96\tGlobalRate=22.68\n[xla:0](40)\tLoss=0.257\tRate=16.98\tGlobalRate=22.70\n[xla:7](40)\tLoss=0.029\tRate=16.95\tGlobalRate=22.67\n[xla:5](40)\tLoss=0.133\tRate=16.94\tGlobalRate=22.66\n[xla:6](80)\tLoss=0.169\tRate=21.12\tGlobalRate=23.26\n[xla:3](80)\tLoss=0.149\tRate=21.12\tGlobalRate=23.26\n[xla:5](80)\tLoss=0.164\tRate=21.12\tGlobalRate=23.26\n[xla:0](80)\tLoss=0.106\tRate=21.13\tGlobalRate=23.28\n[xla:2](80)\tLoss=0.060\tRate=21.12\tGlobalRate=23.26\n[xla:7](80)\tLoss=0.103\tRate=21.12\tGlobalRate=23.26\n[xla:4](80)\tLoss=0.025\tRate=21.12\tGlobalRate=23.27\n[xla:1](80)\tLoss=0.269\tRate=21.12\tGlobalRate=23.26\n[xla:7](120)\tLoss=0.256\tRate=23.52\tGlobalRate=23.85\n[xla:4](120)\tLoss=0.021\tRate=23.52\tGlobalRate=23.85\n[xla:0](120)\tLoss=0.266\tRate=23.52\tGlobalRate=23.85\n[xla:5](120)\tLoss=0.136\tRate=23.52\tGlobalRate=23.84\n[xla:1](120)\tLoss=0.372\tRate=23.52\tGlobalRate=23.84\n[xla:2](120)\tLoss=0.401\tRate=23.52\tGlobalRate=23.85\n[xla:3](120)\tLoss=0.020\tRate=23.52\tGlobalRate=23.85\n[xla:6](120)\tLoss=0.237\tRate=23.52\tGlobalRate=23.84\n[xla:3] Accuracy=95.56%\n[xla:6] Accuracy=95.31%\n[xla:7] Accuracy=94.97%\n[xla:4] Accuracy=96.04%\n[xla:5] Accuracy=96.00%\n[xla:0] Accuracy=96.14%\n[xla:2] Accuracy=95.26%\n[xla:1] Accuracy=94.87%\nFinished training epoch 15 train-acc 96.14 in 83.26 
sec\n[xla:7](0)\tLoss=0.048\tRate=5.68\tGlobalRate=5.68\n[xla:0](0)\tLoss=0.341\tRate=5.74\tGlobalRate=5.74\n[xla:5](0)\tLoss=0.374\tRate=5.72\tGlobalRate=5.72\n[xla:1](0)\tLoss=0.201\tRate=5.66\tGlobalRate=5.66\n[xla:3](0)\tLoss=0.069\tRate=5.67\tGlobalRate=5.67\n[xla:6](0)\tLoss=0.181\tRate=5.66\tGlobalRate=5.66\n[xla:4](0)\tLoss=0.106\tRate=5.65\tGlobalRate=5.65\n[xla:2](0)\tLoss=0.047\tRate=5.68\tGlobalRate=5.68\n[xla:0](40)\tLoss=0.045\tRate=16.68\tGlobalRate=22.24\n[xla:7](40)\tLoss=0.033\tRate=16.65\tGlobalRate=22.22\n[xla:1](40)\tLoss=0.084\tRate=16.64\tGlobalRate=22.21\n[xla:5](40)\tLoss=0.116\tRate=16.67\tGlobalRate=22.24\n[xla:2](40)\tLoss=0.107\tRate=16.65\tGlobalRate=22.23\n[xla:3](40)\tLoss=0.181\tRate=16.65\tGlobalRate=22.22\n[xla:4](40)\tLoss=0.109\tRate=16.64\tGlobalRate=22.21\n[xla:6](40)\tLoss=0.036\tRate=16.64\tGlobalRate=22.21\n[xla:7](80)\tLoss=0.086\tRate=21.61\tGlobalRate=23.47\n[xla:5](80)\tLoss=0.081\tRate=21.61\tGlobalRate=23.48\n[xla:0](80)\tLoss=0.087\tRate=21.61\tGlobalRate=23.48\n[xla:3](80)\tLoss=0.203\tRate=21.61\tGlobalRate=23.47\n[xla:2](80)\tLoss=0.114\tRate=21.61\tGlobalRate=23.48\n[xla:4](80)\tLoss=0.097\tRate=21.60\tGlobalRate=23.47\n[xla:6](80)\tLoss=0.378\tRate=21.60\tGlobalRate=23.47\n[xla:1](80)\tLoss=0.128\tRate=21.60\tGlobalRate=23.47\n[xla:0](120)\tLoss=0.036\tRate=23.31\tGlobalRate=23.79\n[xla:6](120)\tLoss=0.152\tRate=23.30\tGlobalRate=23.78\n[xla:4](120)\tLoss=0.024\tRate=23.30\tGlobalRate=23.78\n[xla:5](120)\tLoss=0.149\tRate=23.31\tGlobalRate=23.79\n[xla:3](120)\tLoss=0.224\tRate=23.30\tGlobalRate=23.78\n[xla:1](120)\tLoss=0.048\tRate=23.30\tGlobalRate=23.78\n[xla:2](120)\tLoss=0.096\tRate=23.30\tGlobalRate=23.78\n[xla:7](120)\tLoss=0.676\tRate=23.30\tGlobalRate=23.78\n[xla:4] Accuracy=96.48%\n[xla:3] Accuracy=95.61%\n[xla:1] Accuracy=96.58%\n[xla:0] Accuracy=95.95%\n[xla:5] Accuracy=95.70%\n[xla:7] Accuracy=94.97%\n[xla:6] Accuracy=95.65%\nFinished training epoch 16 train-acc 95.95 in 83.74 sec\n[xla:2] 
[Per-step training log trimmed for readability. In each epoch, all eight TPU cores
(xla:0 ... xla:7) logged lines of the form
"[xla:N](step)\tLoss=...\tRate=...\tGlobalRate=..." at steps 0, 40, 80 and 120,
with each core's GlobalRate converging to roughly 24 samples/sec by step 120,
followed by one "Accuracy=..." line per core. The span opened mid-line with the
last per-core accuracy of the preceding epoch (95.90%). Only the epoch summaries
are kept below; note that the reported train-acc always matches the accuracy of
the master core xla:0. A sketch of the kind of loop that produces such logs
follows at the end of this section.]

Finished training epoch 17 train-acc 96.19 in 82.70 sec (per-core accuracy 96.04% to 96.63%)
Finished training epoch 18 train-acc 96.19 in 83.06 sec (per-core accuracy 96.19% to 96.88%)
Finished training epoch 19 train-acc 96.39 in 83.66 sec (per-core accuracy 96.24% to 97.02%)
Finished training epoch 20 train-acc 97.41 in 82.89 sec (per-core accuracy 96.63% to 97.46%)
Finished training epoch 21 train-acc 97.51 in 85.79 sec (per-core accuracy 96.83% to 97.51%)
Finished training epoch 22 train-acc 97.46 in 82.53 sec (per-core accuracy 96.92% to 97.90%)
Finished training epoch 23 train-acc 97.36 in 83.69 sec (per-core accuracy 96.68% to 97.75%)
Finished training epoch 24 train-acc 97.75 in 82.72 sec (per-core accuracy 97.02% to 97.75%)
Finished training epoch 25 train-acc 97.27 in 82.68 sec (per-core accuracy 97.17% to 98.14%)
] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ] ]