rajesh1729 committed on
Commit
733c188
1 Parent(s): dca9a73

Upload 21 files

Browse files
00_Introduction_Computational_Graphs.ipynb ADDED
@@ -0,0 +1,456 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 01 Introduction to Computational Graphs\n",
9
+ "description: A basic tutorial to learn about computational graphs\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/1eG1AF36Wa0EaANandAhrsbC3j04487SH?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "_MbzfbWoqAaR"
24
+ },
25
+ "source": [
26
+ "## Introduction to Computational Graphs with PyTorch\n",
27
+ "\n",
28
+ "by [Elvis Saravia](https://twitter.com/omarsar0)\n",
29
+ "\n",
30
+ "\n",
31
+ "In this notebook we provide a short introduction and overview of computational graphs using PyTorch.\n",
32
+ "\n",
33
+ "There are several materials online that cover theory on the topic of computational graphs. However, I think it's much easier to learn the concept using code. I attempt to bridge the gap here which should be useful for beginner students. \n",
34
+ "\n",
35
+ "Inspired by Olah's article [\"Calculus on Computational Graphs: Backpropagation\"](https://colah.github.io/posts/2015-08-Backprop/), I've put together a few code snippets to get you started with computationsl graphs with PyTorch. This notebook should complement that article, so refer to it for more comprehensive explanations. In fact, I've tried to simplify the explanations and refer to them here."
36
+ ]
37
+ },
38
+ {
39
+ "cell_type": "markdown",
40
+ "metadata": {
41
+ "id": "IGzBSo7H6xKu"
42
+ },
43
+ "source": [
44
+ "### Why Computational Graphs?"
45
+ ]
46
+ },
47
+ {
48
+ "cell_type": "markdown",
49
+ "metadata": {
50
+ "id": "lkFMbiPDrGIp"
51
+ },
52
+ "source": [
53
+ "When talking about neural networks in any context, [backpropagation](https://en.wikipedia.org/wiki/Backpropagation) is an important topic to understand because it is the algorithm used for training deep neural networks. \n",
54
+ "\n",
55
+ "Backpropagation is used to calculate derivatives which is what you need to keep optimizing parameters of the model and allowing the model to learn on the task at hand. \n",
56
+ "\n",
57
+ "Many of the deep learning frameworks today like PyTorch does the backpropagation out-of-the-box using [**automatic differentiation**](https://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html). \n",
58
+ "\n",
59
+ "To better understand how this is done it's important to talk about **computational graphs** which defines the flow of computations that are carried out throughout the network. Along the way we will use `torch.autograd` to demonstrate in code how this works. "
60
+ ]
61
+ },
62
+ {
63
+ "cell_type": "markdown",
64
+ "metadata": {
65
+ "id": "YXjsI50-sMAa"
66
+ },
67
+ "source": [
68
+ "### Getting Started\n",
69
+ "\n",
70
+ "Inspired by Olah's article on computational graphs, let's look at the following expression $e = (a + b) * (b+1)$. It helps to break it down to the following operations:\n",
71
+ "\n",
72
+ "$$\n",
73
+ "\\begin{aligned}&c=a+b \\\\&d=b+1 \\\\&e=c * d\\end{aligned}\n",
74
+ "$$"
75
+ ]
76
+ },
77
+ {
78
+ "cell_type": "markdown",
79
+ "metadata": {
80
+ "id": "s0EG6DhnsnTm"
81
+ },
82
+ "source": [
83
+ "This is not a neural network of any sort. We are just going through a very simple example of a chain of operations which you can be represented with computational graphs. \n",
84
+ "\n",
85
+ "Let's visualize these operations using a computational graph. Computational graphs contain **nodes** which can represent and input (tensor, matrix, vector, scalar) or **operation** that can be the input to another node. The nodes are connected by **edges**, which represent a function argument, they are pointers to nodes. Note that the computation graphs are directed and acyclic. The computational graph for our example looks as follows:\n",
86
+ "\n",
87
+ "![](https://colah.github.io/posts/2015-08-Backprop/img/tree-def.png)\n",
88
+ "\n",
89
+ "*Source: Christopher Olah (2015)*"
90
+ ]
91
+ },
92
+ {
93
+ "cell_type": "markdown",
94
+ "metadata": {
95
+ "id": "m9VvF4CVtW0s"
96
+ },
97
+ "source": [
98
+ "We can evaluate the expression by setting our input variables as follows: $a=2$ and $b=1$. This will allow us to compute nodes up through the graph as shown in the computational graph above. \n",
99
+ "\n",
100
+ "Rather than doing this by hand, we can use the automatic differentation engine provided by PyTorch. \n",
101
+ "\n",
102
+ "Let's import PyTorch first:"
103
+ ]
104
+ },
105
+ {
106
+ "cell_type": "code",
107
+ "execution_count": null,
108
+ "metadata": {
109
+ "id": "YuD6zdWZp7DP"
110
+ },
111
+ "outputs": [],
112
+ "source": [
113
+ "import torch"
114
+ ]
115
+ },
116
+ {
117
+ "cell_type": "markdown",
118
+ "metadata": {
119
+ "id": "b7EKlMrouClt"
120
+ },
121
+ "source": [
122
+ "Define the inputs like this:"
123
+ ]
124
+ },
125
+ {
126
+ "cell_type": "code",
127
+ "execution_count": null,
128
+ "metadata": {
129
+ "id": "OZ2pB2A3uEQZ"
130
+ },
131
+ "outputs": [],
132
+ "source": [
133
+ "a = torch.tensor([2.], requires_grad=True)\n",
134
+ "b = torch.tensor([1.], requires_grad=True)"
135
+ ]
136
+ },
137
+ {
138
+ "cell_type": "markdown",
139
+ "metadata": {
140
+ "id": "Zm6Xl05quGZL"
141
+ },
142
+ "source": [
143
+ "Note that we used `requires_grad=True` to tell the autograd engine that every operation on them should be tracked. \n",
144
+ "\n",
145
+ "These are the operations in code:"
146
+ ]
147
+ },
148
+ {
149
+ "cell_type": "code",
150
+ "execution_count": null,
151
+ "metadata": {
152
+ "id": "XwXomBUxu1Ib"
153
+ },
154
+ "outputs": [],
155
+ "source": [
156
+ "c = a + b\n",
157
+ "d = b + 1\n",
158
+ "e = c * d\n",
159
+ "\n",
160
+ "# grads populated for non-leaf nodes\n",
161
+ "c.retain_grad()\n",
162
+ "d.retain_grad()\n",
163
+ "e.retain_grad()"
164
+ ]
165
+ },
166
+ {
167
+ "cell_type": "markdown",
168
+ "metadata": {
169
+ "id": "UzCLJvMku46r"
170
+ },
171
+ "source": [
172
+ "Note that we used `.retain_grad()` to allow gradients to be stored for non-leaf nodes as we are interested in inpecting those as well.\n",
173
+ "\n",
174
+ "Now that we have our computational graph, we can check the result when evaluating the expression:"
175
+ ]
176
+ },
177
+ {
178
+ "cell_type": "code",
179
+ "execution_count": null,
180
+ "metadata": {
181
+ "colab": {
182
+ "base_uri": "https://localhost:8080/"
183
+ },
184
+ "id": "4t-uhE6vvH2j",
185
+ "outputId": "e834dbd0-0d8b-4123-d8fe-b9192aeaba9c"
186
+ },
187
+ "outputs": [
188
+ {
189
+ "name": "stdout",
190
+ "output_type": "stream",
191
+ "text": [
192
+ "tensor([6.], grad_fn=<MulBackward0>)\n"
193
+ ]
194
+ }
195
+ ],
196
+ "source": [
197
+ "print(e)"
198
+ ]
199
+ },
200
+ {
201
+ "cell_type": "markdown",
202
+ "metadata": {
203
+ "id": "5eWub17iwi2L"
204
+ },
205
+ "source": [
206
+ "The output is a tensor with the value of `6.`, which verifies the results here: \n",
207
+ "\n",
208
+ "![](https://colah.github.io/posts/2015-08-Backprop/img/tree-eval.png)\n",
209
+ "*Source: Christopher Olah (2015)*"
210
+ ]
211
+ },
212
+ {
213
+ "cell_type": "markdown",
214
+ "metadata": {
215
+ "id": "tjX3LCRmw22a"
216
+ },
217
+ "source": [
218
+ "### Derivatives on Computational Graphs\n",
219
+ "\n",
220
+ "Using the concept of computational graphs we are now interested in evaluating the **partial derivatives** of the edges of the graph. This will help in gathering the gradients of the graph. Remember that gradients are what we use to train the neural network and those calculations can be taken care of by the automatic differentation engine. \n",
221
+ "\n",
222
+ "The intuition is: we want to know, for example, if $a$ directly affects $c$, how does it affect it. In other words, if we change $a$ a little, how does $c$ change. This is referred to as the partial derivative of $c$ with respect to $a$.\n",
223
+ "\n",
224
+ "You can work this by hand, but the easy way to do this with PyTorch is by calling `.backward()` on $e$ and let the engine figure out the values. The `.backward()` signals the autograd engine to calculate the gradients and store them in the respective tensors’ `.grad` attribute.\n",
225
+ "\n",
226
+ "Let's do that now:"
227
+ ]
228
+ },
229
+ {
230
+ "cell_type": "code",
231
+ "execution_count": null,
232
+ "metadata": {
233
+ "id": "Nc6lnO5yy1Cq"
234
+ },
235
+ "outputs": [],
236
+ "source": [
237
+ "e.backward()"
238
+ ]
239
+ },
240
+ {
241
+ "cell_type": "markdown",
242
+ "metadata": {
243
+ "id": "hxbtx6OCy3I8"
244
+ },
245
+ "source": [
246
+ "Now, let’s say we are interested in the derivative of $e$ with respect to $a$, how do we obtain this? In other words, we are looking for $\\frac{\\partial e}{\\partial a}$."
247
+ ]
248
+ },
249
+ {
250
+ "cell_type": "markdown",
251
+ "metadata": {
252
+ "id": "NvQcK9LTzD34"
253
+ },
254
+ "source": [
255
+ "Using PyTorch, we can do this by calling `a.grad`:"
256
+ ]
257
+ },
258
+ {
259
+ "cell_type": "code",
260
+ "execution_count": null,
261
+ "metadata": {
262
+ "colab": {
263
+ "base_uri": "https://localhost:8080/"
264
+ },
265
+ "id": "5NWnWDg4zHDn",
266
+ "outputId": "40cfe57c-23ee-4142-e62f-f7ef4b65fff0"
267
+ },
268
+ "outputs": [
269
+ {
270
+ "name": "stdout",
271
+ "output_type": "stream",
272
+ "text": [
273
+ "tensor([2.])\n"
274
+ ]
275
+ }
276
+ ],
277
+ "source": [
278
+ "print(a.grad)"
279
+ ]
280
+ },
281
+ {
282
+ "cell_type": "markdown",
283
+ "metadata": {
284
+ "id": "c05nEObzzbPn"
285
+ },
286
+ "source": [
287
+ "It is important to understand the intuition behind this. Olah puts it best:\n",
288
+ "\n",
289
+ ">Let’s consider how $e$ is affected by $a$. If we change $a$ at a speed of 1, $c$ also changes at a speed of $1$. In turn, $c$ changing at a speed of $1$ causes $e$ to change at a speed of $2$. So $e$ changes at a rate of $1*2$ with respect to $a$.\n"
290
+ ]
291
+ },
292
+ {
293
+ "cell_type": "markdown",
294
+ "metadata": {
295
+ "id": "8xXLOU37BYOr"
296
+ },
297
+ "source": [
298
+ "In other words, by hand this would be:\n",
299
+ "\n",
300
+ "$$\n",
301
+ "\\frac{\\partial e}{\\partial \\boldsymbol{a}}=\\frac{\\partial e}{\\partial \\boldsymbol{c}} \\frac{\\partial \\boldsymbol{c}}{\\partial \\boldsymbol{a}} = 2 * 1\n",
302
+ "$$"
303
+ ]
304
+ },
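+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As a quick sanity check (a minimal sketch added for illustration, not part of Olah's walkthrough; the names `a2`, `b2`, `e2`, and `de_da` are ad hoc), we can rebuild the same graph with fresh leaf tensors and ask `torch.autograd.grad` for $\\frac{\\partial e}{\\partial a}$ directly, without touching any `.grad` attributes:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# rebuild the same graph with fresh leaf tensors\n",
+ "a2 = torch.tensor([2.], requires_grad=True)\n",
+ "b2 = torch.tensor([1.], requires_grad=True)\n",
+ "e2 = (a2 + b2) * (b2 + 1)\n",
+ "\n",
+ "# functional-style gradient: returns de/da without storing it in .grad\n",
+ "de_da, = torch.autograd.grad(e2, a2)\n",
+ "print(de_da)  # tensor([2.]) = (de/dc) * (dc/da) = 2 * 1"
+ ]
+ },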
305
+ {
306
+ "cell_type": "markdown",
307
+ "metadata": {
308
+ "id": "A2iNJu6jzT5v"
309
+ },
310
+ "source": [
311
+ "You can verify that this is correct by checking the manual calculations by Olah. Since $a$ is not directly connectected to $e$, we can use some special rule which allows to sum over all paths from one node to the other in the computational graph and mulitplying the derivatives on each edge of the path together.\n",
312
+ "\n",
313
+ "![](https://colah.github.io/posts/2015-08-Backprop/img/tree-eval-derivs.png)\n",
314
+ "*Source: Christopher Olah (2015)*"
315
+ ]
316
+ },
317
+ {
318
+ "cell_type": "markdown",
319
+ "metadata": {
320
+ "id": "9uZE-Gl12cnB"
321
+ },
322
+ "source": [
323
+ "To check that this holds, let look at another example. How about caluclating the derivative of $e$ with respect to $b$, i.e., $\\frac{\\partial e}{\\partial b}$?\n",
324
+ "\n",
325
+ "We can get that through `b.grad`:"
326
+ ]
327
+ },
328
+ {
329
+ "cell_type": "code",
330
+ "execution_count": null,
331
+ "metadata": {
332
+ "colab": {
333
+ "base_uri": "https://localhost:8080/"
334
+ },
335
+ "id": "2q11abV90d6i",
336
+ "outputId": "11571cdc-7e55-43a9-931f-ec1ecf140efa"
337
+ },
338
+ "outputs": [
339
+ {
340
+ "name": "stdout",
341
+ "output_type": "stream",
342
+ "text": [
343
+ "tensor([5.])\n"
344
+ ]
345
+ }
346
+ ],
347
+ "source": [
348
+ "print(b.grad)"
349
+ ]
350
+ },
351
+ {
352
+ "cell_type": "markdown",
353
+ "metadata": {
354
+ "id": "2mGP1_iw0_ot"
355
+ },
356
+ "source": [
357
+ "If you work it out by hand, you are basically doing the following:\n",
358
+ "\n",
359
+ "$$\n",
360
+ "\\frac{\\partial e}{\\partial b}=1 * 2+1 * 3\n",
361
+ "$$\n",
362
+ "\n",
363
+ "It indicates how $b$ affects $e$ through $c$ and $d$. We are essentially summing over paths in the computational graph."
364
+ ]
365
+ },
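+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "To make the sum-over-paths rule concrete, here is a minimal sketch (added for illustration; `path_c` and `path_d` are ad hoc names) that multiplies the edge derivatives along each of the two paths from $b$ to $e$ and then sums them:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# sum over paths by hand: de/db = (de/dc)*(dc/db) + (de/dd)*(dd/db)\n",
+ "path_c = d.detach() * 1.0  # de/dc = d = 2, dc/db = 1\n",
+ "path_d = c.detach() * 1.0  # de/dd = c = 3, dd/db = 1\n",
+ "print(path_c + path_d)  # tensor([5.]), matches b.grad"
+ ]
+ },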
366
+ {
367
+ "cell_type": "markdown",
368
+ "metadata": {
369
+ "id": "sbJvhj5m13Zq"
370
+ },
371
+ "source": [
372
+ "Here are all the gradients collected, including non-leaf nodes:"
373
+ ]
374
+ },
375
+ {
376
+ "cell_type": "code",
377
+ "execution_count": null,
378
+ "metadata": {
379
+ "colab": {
380
+ "base_uri": "https://localhost:8080/"
381
+ },
382
+ "id": "vrUxwsrd3-f-",
383
+ "outputId": "cc63c914-b2e4-43b9-8c43-dcd70975e8b0"
384
+ },
385
+ "outputs": [
386
+ {
387
+ "name": "stdout",
388
+ "output_type": "stream",
389
+ "text": [
390
+ "tensor([2.]) tensor([5.]) tensor([2.]) tensor([3.]) tensor([1.])\n"
391
+ ]
392
+ }
393
+ ],
394
+ "source": [
395
+ "print(a.grad, b.grad, c.grad, d.grad, e.grad)"
396
+ ]
397
+ },
398
+ {
399
+ "cell_type": "markdown",
400
+ "metadata": {
401
+ "id": "HftIH5Mx4Pdj"
402
+ },
403
+ "source": [
404
+ "You can use the computational graph above to verify that everything is correct. This is the power of computational graphs and how they are used by automatic differentation engines. It's also a very useful concept to understand when developing neural networks architectures and their correctness."
405
+ ]
406
+ },
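+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "One caveat worth knowing before moving on (a minimal sketch added as a side note; `e3` is an ad hoc name): `.backward()` *accumulates* gradients into `.grad`, so running another backward pass on the same leaves without zeroing them first adds to the stored values:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# gradients accumulate across backward() calls\n",
+ "e3 = (a + b) * (b + 1)  # fresh forward pass on the same leaf tensors\n",
+ "e3.backward()\n",
+ "print(a.grad)  # tensor([4.]): the new 2. was added to the stored 2.\n",
+ "\n",
+ "# reset before any further experiments\n",
+ "a.grad.zero_()\n",
+ "b.grad.zero_()"
+ ]
+ },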
407
+ {
408
+ "cell_type": "markdown",
409
+ "metadata": {
410
+ "id": "DxyJDoMOs1gu"
411
+ },
412
+ "source": [
413
+ "### Next Steps\n",
414
+ "\n",
415
+ "In this notebook, I've provided a simple and intuitive explanation to the concept of computational graphs using PyTorch. I highly recommend to go through [Olah's article](https://colah.github.io/posts/2015-08-Backprop/) for more on the topic.\n",
416
+ "\n",
417
+ "In the next tutorial, I will be applying the concept of computational graphs to more advanced operations you typically see in a neural network. In fact, if you are interested in this, and you are feeling comfortable with the topic now, you can check out these two PyTorch tutorials:\n",
418
+ "\n",
419
+ "- [A gentle introduction to `torch.autograd`](https://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html)\n",
420
+ "- [Automatic differentation with `torch.autograd`](https://pytorch.org/tutorials/beginner/basics/autogradqs_tutorial.html)\n",
421
+ "\n",
422
+ "And here are some other useful references used to put this article together:\n",
423
+ "\n",
424
+ "- [Hacker's guide to Neural Networks\n",
425
+ "](http://karpathy.github.io/neuralnets/)\n",
426
+ "- [Backpropagation calculus](https://www.youtube.com/watch?v=tIeHLnjs5U8&ab_channel=3Blue1Brown)\n",
427
+ "\n"
428
+ ]
429
+ }
430
+ ],
431
+ "metadata": {
432
+ "colab": {
433
+ "name": "Introduction-Computational-Graphs.ipynb",
434
+ "provenance": []
435
+ },
436
+ "kernelspec": {
437
+ "display_name": "Python 3 (ipykernel)",
438
+ "language": "python",
439
+ "name": "python3"
440
+ },
441
+ "language_info": {
442
+ "codemirror_mode": {
443
+ "name": "ipython",
444
+ "version": 3
445
+ },
446
+ "file_extension": ".py",
447
+ "mimetype": "text/x-python",
448
+ "name": "python",
449
+ "nbconvert_exporter": "python",
450
+ "pygments_lexer": "ipython3",
451
+ "version": "3.9.12"
452
+ }
453
+ },
454
+ "nbformat": 4,
455
+ "nbformat_minor": 1
456
+ }
01_PyTorch_Hello_World.ipynb ADDED
@@ -0,0 +1,490 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 02 PyTorch Hello World!\n",
9
+ "description: Build a simple neural network and train it\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/1ac0K9_aa46c77XEeYtaMAfSOfmH1Bl9L?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "H7gQFbUxOQtb"
24
+ },
25
+ "source": [
26
+ "# A First Shot at Deep Learning with PyTorch\n",
27
+ "\n",
28
+ "In this notebook, we are going to take a baby step into the world of deep learning using PyTorch. There are a ton of notebooks out there that teach you the fundamentals of deep learning and PyTorch, so here the idea is to give you some basic introduction to deep learning and PyTorch at a very high level. Therefore, this notebook is targeting beginners but it can also serve as a review for more experienced developers.\n",
29
+ "\n",
30
+ "After completion of this notebook, you are expected to know the basic components of training a basic neural network with PyTorch. I have also left a couple of exercises towards the end with the intention of encouraging more research and practise of your deep learning skills. \n",
31
+ "\n",
32
+ "---\n",
33
+ "\n",
34
+ "**Author:** Elvis Saravia([Twitter](https://twitter.com/omarsar0) | [LinkedIn](https://www.linkedin.com/in/omarsar/))\n",
35
+ "\n",
36
+ "**Complete Code Walkthrough:** [Blog post](https://medium.com/dair-ai/a-first-shot-at-deep-learning-with-pytorch-4a8252d30c75)"
37
+ ]
38
+ },
39
+ {
40
+ "cell_type": "markdown",
41
+ "metadata": {
42
+ "id": "CkzttrQCwaSQ"
43
+ },
44
+ "source": [
45
+ "## Importing the libraries\n",
46
+ "\n",
47
+ "Like with any other programming exercise, the first step is to import the necessary libraries. As we are going to be using Google Colab to program our neural network, we need to install and import the necessary PyTorch libraries."
48
+ ]
49
+ },
50
+ {
51
+ "cell_type": "code",
52
+ "execution_count": null,
53
+ "metadata": {
54
+ "colab": {
55
+ "base_uri": "https://localhost:8080/"
56
+ },
57
+ "id": "FuhJIaeXO2W9",
58
+ "outputId": "bf494471-115e-45a8-c7cb-15a26f12154a"
59
+ },
60
+ "outputs": [
61
+ {
62
+ "name": "stdout",
63
+ "output_type": "stream",
64
+ "text": [
65
+ "1.10.0+cu111\n"
66
+ ]
67
+ }
68
+ ],
69
+ "source": [
70
+ "## The usual imports\n",
71
+ "import torch\n",
72
+ "import torch.nn as nn\n",
73
+ "\n",
74
+ "## print out the pytorch version used\n",
75
+ "print(torch.__version__)"
76
+ ]
77
+ },
78
+ {
79
+ "cell_type": "markdown",
80
+ "metadata": {
81
+ "id": "0a2C_nneO_wp"
82
+ },
83
+ "source": [
84
+ "## The Neural Network\n",
85
+ "\n",
86
+ "![alt text](https://drive.google.com/uc?export=view&id=1Lpi4VPBfAV3JkOLopcsGK4L8dyxmPF1b)\n",
87
+ "\n",
88
+ "Before building and training a neural network the first step is to process and prepare the data. In this notebook, we are going to use syntethic data (i.e., fake data) so we won't be using any real world data. \n",
89
+ "\n",
90
+ "For the sake of simplicity, we are going to use the following input and output pairs converted to tensors, which is how data is typically represented in the world of deep learning. The x values represent the input of dimension `(6,1)` and the y values represent the output of similar dimension. The example is taken from this [tutorial](https://github.com/lmoroney/dlaicourse/blob/master/Course%201%20-%20Part%202%20-%20Lesson%202%20-%20Notebook.ipynb). \n",
91
+ "\n",
92
+ "The objective of the neural network model that we are going to build and train is to automatically learn patterns that better characterize the relationship between the `x` and `y` values. Essentially, the model learns the relationship that exists between inputs and outputs which can then be used to predict the corresponding `y` value for any given input `x`."
93
+ ]
94
+ },
95
+ {
96
+ "cell_type": "code",
97
+ "execution_count": null,
98
+ "metadata": {
99
+ "id": "JWFtgUX85iwO"
100
+ },
101
+ "outputs": [],
102
+ "source": [
103
+ "## our data in tensor form\n",
104
+ "x = torch.tensor([[-1.0], [0.0], [1.0], [2.0], [3.0], [4.0]], dtype=torch.float)\n",
105
+ "y = torch.tensor([[-3.0], [-1.0], [1.0], [3.0], [5.0], [7.0]], dtype=torch.float)"
106
+ ]
107
+ },
108
+ {
109
+ "cell_type": "code",
110
+ "execution_count": null,
111
+ "metadata": {
112
+ "colab": {
113
+ "base_uri": "https://localhost:8080/"
114
+ },
115
+ "id": "NcQUjR_95z5J",
116
+ "outputId": "6db5df38-6f9d-454e-87d6-cee0c29dccb3"
117
+ },
118
+ "outputs": [
119
+ {
120
+ "data": {
121
+ "text/plain": [
122
+ "torch.Size([6, 1])"
123
+ ]
124
+ },
125
+ "execution_count": 3,
126
+ "metadata": {},
127
+ "output_type": "execute_result"
128
+ }
129
+ ],
130
+ "source": [
131
+ "## print size of the input tensor\n",
132
+ "x.size()"
133
+ ]
134
+ },
135
+ {
136
+ "cell_type": "markdown",
137
+ "metadata": {
138
+ "id": "9CJXO5WX1QtQ"
139
+ },
140
+ "source": [
141
+ "## The Neural Network Components\n",
142
+ "As said earlier, we are going to first define and build out the components of our neural network before training the model.\n",
143
+ "\n",
144
+ "### Model\n",
145
+ "\n",
146
+ "Typically, when building a neural network model, we define the layers and weights which form the basic components of the model. Below we show an example of how to define a hidden layer named `layer1` with size `(1, 1)`. For the purpose of this tutorial, we won't explicitly define the `weights` and allow the built-in functions provided by PyTorch to handle that part for us. By the way, the `nn.Linear(...)` function applies a linear transformation ($y = xA^T + b$) to the data that was provided as its input. We ignore the bias for now by setting `bias=False`.\n",
147
+ "\n",
148
+ "\n",
149
+ "\n"
150
+ ]
151
+ },
152
+ {
153
+ "cell_type": "code",
154
+ "execution_count": null,
155
+ "metadata": {
156
+ "id": "N1Ii5JRz3Jud"
157
+ },
158
+ "outputs": [],
159
+ "source": [
160
+ "## Neural network with 1 hidden layer\n",
161
+ "layer1 = nn.Linear(1,1, bias=False)\n",
162
+ "model = nn.Sequential(layer1)"
163
+ ]
164
+ },
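+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Since we let PyTorch initialize the weight for us, we can peek at it before training (a small sketch added for illustration). `nn.Linear` stores the weight as a `(out_features, in_features)` parameter, here `(1, 1)`:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "## peek at the randomly initialized weight\n",
+ "print(layer1.weight)\n",
+ "\n",
+ "## all learnable parameters the optimizer will tune\n",
+ "print(list(model.parameters()))"
+ ]
+ },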
165
+ {
166
+ "cell_type": "markdown",
167
+ "metadata": {
168
+ "id": "9HTWYD4aMBXQ"
169
+ },
170
+ "source": [
171
+ "### Loss and Optimizer\n",
172
+ "The loss function, `nn.MSELoss()`, is in charge of letting the model know how good it has learned the relationship between the input and output. The optimizer (in this case an `SGD`) primary role is to minimize or lower that loss value as it tunes its weights."
173
+ ]
174
+ },
175
+ {
176
+ "cell_type": "code",
177
+ "execution_count": null,
178
+ "metadata": {
179
+ "id": "3hglFpejArxx"
180
+ },
181
+ "outputs": [],
182
+ "source": [
183
+ "## loss function\n",
184
+ "criterion = nn.MSELoss()\n",
185
+ "\n",
186
+ "## optimizer algorithm\n",
187
+ "optimizer = torch.optim.SGD(model.parameters(), lr=0.01)"
188
+ ]
189
+ },
190
+ {
191
+ "cell_type": "markdown",
192
+ "metadata": {
193
+ "id": "FKj6jvZTUtGh"
194
+ },
195
+ "source": [
196
+ "## Training the Neural Network Model\n",
197
+ "We have all the components we need to train our model. Below is the code used to train our model. \n",
198
+ "\n",
199
+ "In simple terms, we train the model by feeding it the input and output pairs for a couple of rounds (i.e., `epoch`). After a series of forward and backward steps, the model somewhat learns the relationship between x and y values. This is notable by the decrease in the computed `loss`. For a more detailed explanation of this code check out this [tutorial](https://medium.com/dair-ai/a-simple-neural-network-from-scratch-with-pytorch-and-google-colab-c7f3830618e0). "
200
+ ]
201
+ },
202
+ {
203
+ "cell_type": "code",
204
+ "execution_count": null,
205
+ "metadata": {
206
+ "colab": {
207
+ "base_uri": "https://localhost:8080/"
208
+ },
209
+ "id": "JeOr9i-aBzRv",
210
+ "outputId": "299a0b60-a64c-46c4-d031-8aaf1cacbff9"
211
+ },
212
+ "outputs": [
213
+ {
214
+ "name": "stdout",
215
+ "output_type": "stream",
216
+ "text": [
217
+ "Epoch: 0 | Loss: 10.1346\n",
218
+ "Epoch: 1 | Loss: 8.2589\n",
219
+ "Epoch: 2 | Loss: 6.7509\n",
220
+ "Epoch: 3 | Loss: 5.5385\n",
221
+ "Epoch: 4 | Loss: 4.5636\n",
222
+ "Epoch: 5 | Loss: 3.7798\n",
223
+ "Epoch: 6 | Loss: 3.1497\n",
224
+ "Epoch: 7 | Loss: 2.6430\n",
225
+ "Epoch: 8 | Loss: 2.2356\n",
226
+ "Epoch: 9 | Loss: 1.9081\n",
227
+ "Epoch: 10 | Loss: 1.6448\n",
228
+ "Epoch: 11 | Loss: 1.4331\n",
229
+ "Epoch: 12 | Loss: 1.2628\n",
230
+ "Epoch: 13 | Loss: 1.1260\n",
231
+ "Epoch: 14 | Loss: 1.0159\n",
232
+ "Epoch: 15 | Loss: 0.9275\n",
233
+ "Epoch: 16 | Loss: 0.8563\n",
234
+ "Epoch: 17 | Loss: 0.7991\n",
235
+ "Epoch: 18 | Loss: 0.7532\n",
236
+ "Epoch: 19 | Loss: 0.7162\n",
237
+ "Epoch: 20 | Loss: 0.6865\n",
238
+ "Epoch: 21 | Loss: 0.6626\n",
239
+ "Epoch: 22 | Loss: 0.6433\n",
240
+ "Epoch: 23 | Loss: 0.6279\n",
241
+ "Epoch: 24 | Loss: 0.6155\n",
242
+ "Epoch: 25 | Loss: 0.6055\n",
243
+ "Epoch: 26 | Loss: 0.5975\n",
244
+ "Epoch: 27 | Loss: 0.5910\n",
245
+ "Epoch: 28 | Loss: 0.5858\n",
246
+ "Epoch: 29 | Loss: 0.5816\n",
247
+ "Epoch: 30 | Loss: 0.5783\n",
248
+ "Epoch: 31 | Loss: 0.5756\n",
249
+ "Epoch: 32 | Loss: 0.5734\n",
250
+ "Epoch: 33 | Loss: 0.5717\n",
251
+ "Epoch: 34 | Loss: 0.5703\n",
252
+ "Epoch: 35 | Loss: 0.5691\n",
253
+ "Epoch: 36 | Loss: 0.5682\n",
254
+ "Epoch: 37 | Loss: 0.5675\n",
255
+ "Epoch: 38 | Loss: 0.5669\n",
256
+ "Epoch: 39 | Loss: 0.5664\n",
257
+ "Epoch: 40 | Loss: 0.5661\n",
258
+ "Epoch: 41 | Loss: 0.5658\n",
259
+ "Epoch: 42 | Loss: 0.5655\n",
260
+ "Epoch: 43 | Loss: 0.5653\n",
261
+ "Epoch: 44 | Loss: 0.5652\n",
262
+ "Epoch: 45 | Loss: 0.5650\n",
263
+ "Epoch: 46 | Loss: 0.5649\n",
264
+ "Epoch: 47 | Loss: 0.5649\n",
265
+ "Epoch: 48 | Loss: 0.5648\n",
266
+ "Epoch: 49 | Loss: 0.5647\n",
267
+ "Epoch: 50 | Loss: 0.5647\n",
268
+ "Epoch: 51 | Loss: 0.5647\n",
269
+ "Epoch: 52 | Loss: 0.5646\n",
270
+ "Epoch: 53 | Loss: 0.5646\n",
271
+ "Epoch: 54 | Loss: 0.5646\n",
272
+ "Epoch: 55 | Loss: 0.5646\n",
273
+ "Epoch: 56 | Loss: 0.5646\n",
274
+ "Epoch: 57 | Loss: 0.5646\n",
275
+ "Epoch: 58 | Loss: 0.5645\n",
276
+ "Epoch: 59 | Loss: 0.5645\n",
277
+ "Epoch: 60 | Loss: 0.5645\n",
278
+ "Epoch: 61 | Loss: 0.5645\n",
279
+ "Epoch: 62 | Loss: 0.5645\n",
280
+ "Epoch: 63 | Loss: 0.5645\n",
281
+ "Epoch: 64 | Loss: 0.5645\n",
282
+ "Epoch: 65 | Loss: 0.5645\n",
283
+ "Epoch: 66 | Loss: 0.5645\n",
284
+ "Epoch: 67 | Loss: 0.5645\n",
285
+ "Epoch: 68 | Loss: 0.5645\n",
286
+ "Epoch: 69 | Loss: 0.5645\n",
287
+ "Epoch: 70 | Loss: 0.5645\n",
288
+ "Epoch: 71 | Loss: 0.5645\n",
289
+ "Epoch: 72 | Loss: 0.5645\n",
290
+ "Epoch: 73 | Loss: 0.5645\n",
291
+ "Epoch: 74 | Loss: 0.5645\n",
292
+ "Epoch: 75 | Loss: 0.5645\n",
293
+ "Epoch: 76 | Loss: 0.5645\n",
294
+ "Epoch: 77 | Loss: 0.5645\n",
295
+ "Epoch: 78 | Loss: 0.5645\n",
296
+ "Epoch: 79 | Loss: 0.5645\n",
297
+ "Epoch: 80 | Loss: 0.5645\n",
298
+ "Epoch: 81 | Loss: 0.5645\n",
299
+ "Epoch: 82 | Loss: 0.5645\n",
300
+ "Epoch: 83 | Loss: 0.5645\n",
301
+ "Epoch: 84 | Loss: 0.5645\n",
302
+ "Epoch: 85 | Loss: 0.5645\n",
303
+ "Epoch: 86 | Loss: 0.5645\n",
304
+ "Epoch: 87 | Loss: 0.5645\n",
305
+ "Epoch: 88 | Loss: 0.5645\n",
306
+ "Epoch: 89 | Loss: 0.5645\n",
307
+ "Epoch: 90 | Loss: 0.5645\n",
308
+ "Epoch: 91 | Loss: 0.5645\n",
309
+ "Epoch: 92 | Loss: 0.5645\n",
310
+ "Epoch: 93 | Loss: 0.5645\n",
311
+ "Epoch: 94 | Loss: 0.5645\n",
312
+ "Epoch: 95 | Loss: 0.5645\n",
313
+ "Epoch: 96 | Loss: 0.5645\n",
314
+ "Epoch: 97 | Loss: 0.5645\n",
315
+ "Epoch: 98 | Loss: 0.5645\n",
316
+ "Epoch: 99 | Loss: 0.5645\n",
317
+ "Epoch: 100 | Loss: 0.5645\n",
318
+ "Epoch: 101 | Loss: 0.5645\n",
319
+ "Epoch: 102 | Loss: 0.5645\n",
320
+ "Epoch: 103 | Loss: 0.5645\n",
321
+ "Epoch: 104 | Loss: 0.5645\n",
322
+ "Epoch: 105 | Loss: 0.5645\n",
323
+ "Epoch: 106 | Loss: 0.5645\n",
324
+ "Epoch: 107 | Loss: 0.5645\n",
325
+ "Epoch: 108 | Loss: 0.5645\n",
326
+ "Epoch: 109 | Loss: 0.5645\n",
327
+ "Epoch: 110 | Loss: 0.5645\n",
328
+ "Epoch: 111 | Loss: 0.5645\n",
329
+ "Epoch: 112 | Loss: 0.5645\n",
330
+ "Epoch: 113 | Loss: 0.5645\n",
331
+ "Epoch: 114 | Loss: 0.5645\n",
332
+ "Epoch: 115 | Loss: 0.5645\n",
333
+ "Epoch: 116 | Loss: 0.5645\n",
334
+ "Epoch: 117 | Loss: 0.5645\n",
335
+ "Epoch: 118 | Loss: 0.5645\n",
336
+ "Epoch: 119 | Loss: 0.5645\n",
337
+ "Epoch: 120 | Loss: 0.5645\n",
338
+ "Epoch: 121 | Loss: 0.5645\n",
339
+ "Epoch: 122 | Loss: 0.5645\n",
340
+ "Epoch: 123 | Loss: 0.5645\n",
341
+ "Epoch: 124 | Loss: 0.5645\n",
342
+ "Epoch: 125 | Loss: 0.5645\n",
343
+ "Epoch: 126 | Loss: 0.5645\n",
344
+ "Epoch: 127 | Loss: 0.5645\n",
345
+ "Epoch: 128 | Loss: 0.5645\n",
346
+ "Epoch: 129 | Loss: 0.5645\n",
347
+ "Epoch: 130 | Loss: 0.5645\n",
348
+ "Epoch: 131 | Loss: 0.5645\n",
349
+ "Epoch: 132 | Loss: 0.5645\n",
350
+ "Epoch: 133 | Loss: 0.5645\n",
351
+ "Epoch: 134 | Loss: 0.5645\n",
352
+ "Epoch: 135 | Loss: 0.5645\n",
353
+ "Epoch: 136 | Loss: 0.5645\n",
354
+ "Epoch: 137 | Loss: 0.5645\n",
355
+ "Epoch: 138 | Loss: 0.5645\n",
356
+ "Epoch: 139 | Loss: 0.5645\n",
357
+ "Epoch: 140 | Loss: 0.5645\n",
358
+ "Epoch: 141 | Loss: 0.5645\n",
359
+ "Epoch: 142 | Loss: 0.5645\n",
360
+ "Epoch: 143 | Loss: 0.5645\n",
361
+ "Epoch: 144 | Loss: 0.5645\n",
362
+ "Epoch: 145 | Loss: 0.5645\n",
363
+ "Epoch: 146 | Loss: 0.5645\n",
364
+ "Epoch: 147 | Loss: 0.5645\n",
365
+ "Epoch: 148 | Loss: 0.5645\n",
366
+ "Epoch: 149 | Loss: 0.5645\n"
367
+ ]
368
+ }
369
+ ],
370
+ "source": [
371
+ "## training\n",
372
+ "for ITER in range(150):\n",
373
+ " model = model.train()\n",
374
+ "\n",
375
+ " ## forward\n",
376
+ " output = model(x)\n",
377
+ " loss = criterion(output, y)\n",
378
+ " optimizer.zero_grad()\n",
379
+ "\n",
380
+ " ## backward + update model params \n",
381
+ " loss.backward()\n",
382
+ " optimizer.step()\n",
383
+ "\n",
384
+ " model.eval()\n",
385
+ " print('Epoch: %d | Loss: %.4f' %(ITER, loss.detach().item()))"
386
+ ]
387
+ },
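+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Because the model has a single weight and no bias, we can sanity-check the training result analytically (a sketch added for illustration; `w_star` is an ad hoc name). The MSE-optimal weight is $w^* = \\frac{\\sum_i x_i y_i}{\\sum_i x_i^2} = \\frac{53}{31} \\approx 1.7097$, which is why the prediction for an input of `10.0` below comes out near `17.097`:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "## closed-form least-squares slope for a bias-free linear model\n",
+ "w_star = (x * y).sum() / (x * x).sum()\n",
+ "print(w_star)  ## tensor(1.7097)\n",
+ "print(layer1.weight.data)  ## learned weight, should be close to w_star"
+ ]
+ },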
388
+ {
389
+ "cell_type": "markdown",
390
+ "metadata": {
391
+ "id": "Bp50Q7J0Xkiw"
392
+ },
393
+ "source": [
394
+ "## Testing the Model\n",
395
+ "After training the model we have the ability to test the model predictive capability by passing it an input. Below is a simple example of how you could achieve this with our model. The result we obtained aligns with the results obtained in this [notebook](https://github.com/lmoroney/dlaicourse/blob/master/Course%201%20-%20Part%202%20-%20Lesson%202%20-%20Notebook.ipynb), which inspired this entire tutorial. "
396
+ ]
397
+ },
398
+ {
399
+ "cell_type": "code",
400
+ "execution_count": null,
401
+ "metadata": {
402
+ "colab": {
403
+ "base_uri": "https://localhost:8080/"
404
+ },
405
+ "id": "V1odfZpGFoBi",
406
+ "outputId": "a447b232-729e-4ccf-adc2-5aeaf79cc2ea"
407
+ },
408
+ "outputs": [
409
+ {
410
+ "name": "stdout",
411
+ "output_type": "stream",
412
+ "text": [
413
+ "17.096769332885742\n"
414
+ ]
415
+ }
416
+ ],
417
+ "source": [
418
+ "## test the model\n",
419
+ "sample = torch.tensor([10.0], dtype=torch.float)\n",
420
+ "predicted = model(sample)\n",
421
+ "print(predicted.detach().item())"
422
+ ]
423
+ },
424
+ {
425
+ "cell_type": "markdown",
426
+ "metadata": {
427
+ "id": "ozX4V1GhPLyr"
428
+ },
429
+ "source": [
430
+ "## Final Words\n",
431
+ "\n",
432
+ "Congratulations! In this tutorial you learned how to train a simple neural network using PyTorch. You also learned about the basic components that make up a neural network model such as the linear transformation layer, optimizer, and loss function. We then trained the model and tested its predictive capabilities. You are well on your way to become more knowledgeable about deep learning and PyTorch. I have provided a bunch of references below if you are interested in practising and learning more. \n",
433
+ "\n",
434
+ "*I would like to thank Laurence Moroney for his excellent [tutorial](https://github.com/lmoroney/dlaicourse/blob/master/Course%201%20-%20Part%202%20-%20Lesson%202%20-%20Notebook.ipynb) which I used as an inspiration for this tutorial.*"
435
+ ]
436
+ },
437
+ {
438
+ "cell_type": "markdown",
439
+ "metadata": {
440
+ "id": "LAABGiMHeDOr"
441
+ },
442
+ "source": [
443
+ "## Exercises\n",
444
+ "- Add more examples in the input and output tensors. In addition, try to change the dimensions of the data, say by adding an extra value in each array. What needs to be changed to successfully train the network with the new data?\n",
445
+ "- The model converged really fast, which means it learned the relationship between x and y values after a couple of iterations. Do you think it makes sense to continue training? How would you automate the process of stopping the training after the model loss doesn't subtantially change?\n",
446
+ "- In our example, we used a single hidden layer. Try to take a look at the PyTorch documentation to figure out what you need to do to get a model with more layers. What happens if you add more hidden layers?\n",
447
+ "- We did not discuss the learning rate (`lr-0.001`) and the optimizer in great detail. Check out the [PyTorch documentation](https://pytorch.org/docs/stable/optim.html) to learn more about what other optimizers you can use.\n"
448
+ ]
449
+ },
450
+ {
451
+ "cell_type": "markdown",
452
+ "metadata": {
453
+ "id": "4-o4w9vpPHZz"
454
+ },
455
+ "source": [
456
+ "## References\n",
457
+ "- [The Hello World of Deep Learning with Neural Networks](https://github.com/lmoroney/dlaicourse/blob/master/Course%201%20-%20Part%202%20-%20Lesson%202%20-%20Notebook.ipynb)\n",
458
+ "- [A Simple Neural Network from Scratch with PyTorch and Google Colab](https://medium.com/dair-ai/a-simple-neural-network-from-scratch-with-pytorch-and-google-colab-c7f3830618e0?source=collection_category---4------1-----------------------)\n",
459
+ "- [PyTorch Official Docs](https://pytorch.org/docs/stable/nn.html)\n",
460
+ "- [PyTorch 1.2 Quickstart with Google Colab](https://medium.com/dair-ai/pytorch-1-2-quickstart-with-google-colab-6690a30c38d)\n",
461
+ "- [A Gentle Intoduction to PyTorch](https://medium.com/dair-ai/pytorch-1-2-introduction-guide-f6fa9bb7597c)"
462
+ ]
463
+ }
464
+ ],
465
+ "metadata": {
466
+ "colab": {
467
+ "name": "PyTorch Hello World.ipynb",
468
+ "provenance": []
469
+ },
470
+ "kernelspec": {
471
+ "display_name": "Python 3 (ipykernel)",
472
+ "language": "python",
473
+ "name": "python3"
474
+ },
475
+ "language_info": {
476
+ "codemirror_mode": {
477
+ "name": "ipython",
478
+ "version": 3
479
+ },
480
+ "file_extension": ".py",
481
+ "mimetype": "text/x-python",
482
+ "name": "python",
483
+ "nbconvert_exporter": "python",
484
+ "pygments_lexer": "ipython3",
485
+ "version": "3.9.12"
486
+ }
487
+ },
488
+ "nbformat": 4,
489
+ "nbformat_minor": 1
490
+ }
02_A_Gentle_Introduction_to_PyTorch.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
03_Pytorch_Logistic_Regression_from_Scratch.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
04_Concise_Logistic_Regression.ipynb ADDED
@@ -0,0 +1,718 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 05 Concise Logistic Regression\n",
9
+ "description: Concise implementation of logistic regression model for binary image classification.\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/14hnFJvHDq9w7FGb8P6pd6-I7F3djTRG9?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "gC6qMkJooFub"
24
+ },
25
+ "source": [
26
+ "## Concise Logistic Regression for Image Classification\n",
27
+ "\n",
28
+ "- Shows a concise implementation of logistic regression for image classification\n",
29
+ "- Uses PyTorch"
30
+ ]
31
+ },
32
+ {
33
+ "cell_type": "code",
34
+ "execution_count": null,
35
+ "metadata": {
36
+ "id": "tI49R1p0n-XM"
37
+ },
38
+ "outputs": [],
39
+ "source": [
40
+ "# imports\n",
41
+ "import torch\n",
42
+ "import torchvision\n",
43
+ "import torch.nn as nn\n",
44
+ "from torchvision import datasets, models, transforms\n",
45
+ "import os\n",
46
+ "import numpy as np\n",
47
+ "import matplotlib.pyplot as plt\n",
48
+ "\n",
49
+ "%matplotlib inline\n",
50
+ "\n",
51
+ "# use gpu if available\n",
52
+ "device = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")"
53
+ ]
54
+ },
55
+ {
56
+ "cell_type": "code",
57
+ "execution_count": null,
58
+ "metadata": {
59
+ "colab": {
60
+ "base_uri": "https://localhost:8080/"
61
+ },
62
+ "id": "O92KeM06pJqc",
63
+ "outputId": "322d8266-f005-4b17-f18e-3d7046cba4b8"
64
+ },
65
+ "outputs": [
66
+ {
67
+ "name": "stdout",
68
+ "output_type": "stream",
69
+ "text": [
70
+ "--2022-04-03 16:17:19-- https://download.pytorch.org/tutorial/hymenoptera_data.zip\n",
71
+ "Resolving download.pytorch.org (download.pytorch.org)... 13.226.230.76, 13.226.230.24, 13.226.230.114, ...\n",
72
+ "Connecting to download.pytorch.org (download.pytorch.org)|13.226.230.76|:443... connected.\n",
73
+ "HTTP request sent, awaiting response... 200 OK\n",
74
+ "Length: 47286322 (45M) [application/zip]\n",
75
+ "Saving to: ‘hymenoptera_data.zip’\n",
76
+ "\n",
77
+ "hymenoptera_data.zi 100%[===================>] 45.10M 25.3MB/s in 1.8s \n",
78
+ "\n",
79
+ "2022-04-03 16:17:21 (25.3 MB/s) - ‘hymenoptera_data.zip’ saved [47286322/47286322]\n",
80
+ "\n",
81
+ "Archive: hymenoptera_data.zip\n",
82
+ " creating: hymenoptera_data/\n",
83
+ " creating: hymenoptera_data/train/\n",
84
+ " creating: hymenoptera_data/train/ants/\n",
85
+ " inflating: hymenoptera_data/train/ants/0013035.jpg \n",
86
+ " inflating: hymenoptera_data/train/ants/1030023514_aad5c608f9.jpg \n",
87
+ " inflating: hymenoptera_data/train/ants/1095476100_3906d8afde.jpg \n",
88
+ " inflating: hymenoptera_data/train/ants/1099452230_d1949d3250.jpg \n",
89
+ " inflating: hymenoptera_data/train/ants/116570827_e9c126745d.jpg \n",
90
+ " inflating: hymenoptera_data/train/ants/1225872729_6f0856588f.jpg \n",
91
+ " inflating: hymenoptera_data/train/ants/1262877379_64fcada201.jpg \n",
92
+ " inflating: hymenoptera_data/train/ants/1269756697_0bce92cdab.jpg \n",
93
+ " inflating: hymenoptera_data/train/ants/1286984635_5119e80de1.jpg \n",
94
+ " inflating: hymenoptera_data/train/ants/132478121_2a430adea2.jpg \n",
95
+ " inflating: hymenoptera_data/train/ants/1360291657_dc248c5eea.jpg \n",
96
+ " inflating: hymenoptera_data/train/ants/1368913450_e146e2fb6d.jpg \n",
97
+ " inflating: hymenoptera_data/train/ants/1473187633_63ccaacea6.jpg \n",
98
+ " inflating: hymenoptera_data/train/ants/148715752_302c84f5a4.jpg \n",
99
+ " inflating: hymenoptera_data/train/ants/1489674356_09d48dde0a.jpg \n",
100
+ " inflating: hymenoptera_data/train/ants/149244013_c529578289.jpg \n",
101
+ " inflating: hymenoptera_data/train/ants/150801003_3390b73135.jpg \n",
102
+ " inflating: hymenoptera_data/train/ants/150801171_cd86f17ed8.jpg \n",
103
+ " inflating: hymenoptera_data/train/ants/154124431_65460430f2.jpg \n",
104
+ " inflating: hymenoptera_data/train/ants/162603798_40b51f1654.jpg \n",
105
+ " inflating: hymenoptera_data/train/ants/1660097129_384bf54490.jpg \n",
106
+ " inflating: hymenoptera_data/train/ants/167890289_dd5ba923f3.jpg \n",
107
+ " inflating: hymenoptera_data/train/ants/1693954099_46d4c20605.jpg \n",
108
+ " inflating: hymenoptera_data/train/ants/175998972.jpg \n",
109
+ " inflating: hymenoptera_data/train/ants/178538489_bec7649292.jpg \n",
110
+ " inflating: hymenoptera_data/train/ants/1804095607_0341701e1c.jpg \n",
111
+ " inflating: hymenoptera_data/train/ants/1808777855_2a895621d7.jpg \n",
112
+ " inflating: hymenoptera_data/train/ants/188552436_605cc9b36b.jpg \n",
113
+ " inflating: hymenoptera_data/train/ants/1917341202_d00a7f9af5.jpg \n",
114
+ " inflating: hymenoptera_data/train/ants/1924473702_daa9aacdbe.jpg \n",
115
+ " inflating: hymenoptera_data/train/ants/196057951_63bf063b92.jpg \n",
116
+ " inflating: hymenoptera_data/train/ants/196757565_326437f5fe.jpg \n",
117
+ " inflating: hymenoptera_data/train/ants/201558278_fe4caecc76.jpg \n",
118
+ " inflating: hymenoptera_data/train/ants/201790779_527f4c0168.jpg \n",
119
+ " inflating: hymenoptera_data/train/ants/2019439677_2db655d361.jpg \n",
120
+ " inflating: hymenoptera_data/train/ants/207947948_3ab29d7207.jpg \n",
121
+ " inflating: hymenoptera_data/train/ants/20935278_9190345f6b.jpg \n",
122
+ " inflating: hymenoptera_data/train/ants/224655713_3956f7d39a.jpg \n",
123
+ " inflating: hymenoptera_data/train/ants/2265824718_2c96f485da.jpg \n",
124
+ " inflating: hymenoptera_data/train/ants/2265825502_fff99cfd2d.jpg \n",
125
+ " inflating: hymenoptera_data/train/ants/226951206_d6bf946504.jpg \n",
126
+ " inflating: hymenoptera_data/train/ants/2278278459_6b99605e50.jpg \n",
127
+ " inflating: hymenoptera_data/train/ants/2288450226_a6e96e8fdf.jpg \n",
128
+ " inflating: hymenoptera_data/train/ants/2288481644_83ff7e4572.jpg \n",
129
+ " inflating: hymenoptera_data/train/ants/2292213964_ca51ce4bef.jpg \n",
130
+ " inflating: hymenoptera_data/train/ants/24335309_c5ea483bb8.jpg \n",
131
+ " inflating: hymenoptera_data/train/ants/245647475_9523dfd13e.jpg \n",
132
+ " inflating: hymenoptera_data/train/ants/255434217_1b2b3fe0a4.jpg \n",
133
+ " inflating: hymenoptera_data/train/ants/258217966_d9d90d18d3.jpg \n",
134
+ " inflating: hymenoptera_data/train/ants/275429470_b2d7d9290b.jpg \n",
135
+ " inflating: hymenoptera_data/train/ants/28847243_e79fe052cd.jpg \n",
136
+ " inflating: hymenoptera_data/train/ants/318052216_84dff3f98a.jpg \n",
137
+ " inflating: hymenoptera_data/train/ants/334167043_cbd1adaeb9.jpg \n",
138
+ " inflating: hymenoptera_data/train/ants/339670531_94b75ae47a.jpg \n",
139
+ " inflating: hymenoptera_data/train/ants/342438950_a3da61deab.jpg \n",
140
+ " inflating: hymenoptera_data/train/ants/36439863_0bec9f554f.jpg \n",
141
+ " inflating: hymenoptera_data/train/ants/374435068_7eee412ec4.jpg \n",
142
+ " inflating: hymenoptera_data/train/ants/382971067_0bfd33afe0.jpg \n",
143
+ " inflating: hymenoptera_data/train/ants/384191229_5779cf591b.jpg \n",
144
+ " inflating: hymenoptera_data/train/ants/386190770_672743c9a7.jpg \n",
145
+ " inflating: hymenoptera_data/train/ants/392382602_1b7bed32fa.jpg \n",
146
+ " inflating: hymenoptera_data/train/ants/403746349_71384f5b58.jpg \n",
147
+ " inflating: hymenoptera_data/train/ants/408393566_b5b694119b.jpg \n",
148
+ " inflating: hymenoptera_data/train/ants/424119020_6d57481dab.jpg \n",
149
+ " inflating: hymenoptera_data/train/ants/424873399_47658a91fb.jpg \n",
150
+ " inflating: hymenoptera_data/train/ants/450057712_771b3bfc91.jpg \n",
151
+ " inflating: hymenoptera_data/train/ants/45472593_bfd624f8dc.jpg \n",
152
+ " inflating: hymenoptera_data/train/ants/459694881_ac657d3187.jpg \n",
153
+ " inflating: hymenoptera_data/train/ants/460372577_f2f6a8c9fc.jpg \n",
154
+ " inflating: hymenoptera_data/train/ants/460874319_0a45ab4d05.jpg \n",
155
+ " inflating: hymenoptera_data/train/ants/466430434_4000737de9.jpg \n",
156
+ " inflating: hymenoptera_data/train/ants/470127037_513711fd21.jpg \n",
157
+ " inflating: hymenoptera_data/train/ants/474806473_ca6caab245.jpg \n",
158
+ " inflating: hymenoptera_data/train/ants/475961153_b8c13fd405.jpg \n",
159
+ " inflating: hymenoptera_data/train/ants/484293231_e53cfc0c89.jpg \n",
160
+ " inflating: hymenoptera_data/train/ants/49375974_e28ba6f17e.jpg \n",
161
+ " inflating: hymenoptera_data/train/ants/506249802_207cd979b4.jpg \n",
162
+ " inflating: hymenoptera_data/train/ants/506249836_717b73f540.jpg \n",
163
+ " inflating: hymenoptera_data/train/ants/512164029_c0a66b8498.jpg \n",
164
+ " inflating: hymenoptera_data/train/ants/512863248_43c8ce579b.jpg \n",
165
+ " inflating: hymenoptera_data/train/ants/518773929_734dbc5ff4.jpg \n",
166
+ " inflating: hymenoptera_data/train/ants/522163566_fec115ca66.jpg \n",
167
+ " inflating: hymenoptera_data/train/ants/522415432_2218f34bf8.jpg \n",
168
+ " inflating: hymenoptera_data/train/ants/531979952_bde12b3bc0.jpg \n",
169
+ " inflating: hymenoptera_data/train/ants/533848102_70a85ad6dd.jpg \n",
170
+ " inflating: hymenoptera_data/train/ants/535522953_308353a07c.jpg \n",
171
+ " inflating: hymenoptera_data/train/ants/540889389_48bb588b21.jpg \n",
172
+ " inflating: hymenoptera_data/train/ants/541630764_dbd285d63c.jpg \n",
173
+ " inflating: hymenoptera_data/train/ants/543417860_b14237f569.jpg \n",
174
+ " inflating: hymenoptera_data/train/ants/560966032_988f4d7bc4.jpg \n",
175
+ " inflating: hymenoptera_data/train/ants/5650366_e22b7e1065.jpg \n",
176
+ " inflating: hymenoptera_data/train/ants/6240329_72c01e663e.jpg \n",
177
+ " inflating: hymenoptera_data/train/ants/6240338_93729615ec.jpg \n",
178
+ " inflating: hymenoptera_data/train/ants/649026570_e58656104b.jpg \n",
179
+ " inflating: hymenoptera_data/train/ants/662541407_ff8db781e7.jpg \n",
180
+ " inflating: hymenoptera_data/train/ants/67270775_e9fdf77e9d.jpg \n",
181
+ " inflating: hymenoptera_data/train/ants/6743948_2b8c096dda.jpg \n",
182
+ " inflating: hymenoptera_data/train/ants/684133190_35b62c0c1d.jpg \n",
183
+ " inflating: hymenoptera_data/train/ants/69639610_95e0de17aa.jpg \n",
184
+ " inflating: hymenoptera_data/train/ants/707895295_009cf23188.jpg \n",
185
+ " inflating: hymenoptera_data/train/ants/7759525_1363d24e88.jpg \n",
186
+ " inflating: hymenoptera_data/train/ants/795000156_a9900a4a71.jpg \n",
187
+ " inflating: hymenoptera_data/train/ants/822537660_caf4ba5514.jpg \n",
188
+ " inflating: hymenoptera_data/train/ants/82852639_52b7f7f5e3.jpg \n",
189
+ " inflating: hymenoptera_data/train/ants/841049277_b28e58ad05.jpg \n",
190
+ " inflating: hymenoptera_data/train/ants/886401651_f878e888cd.jpg \n",
191
+ " inflating: hymenoptera_data/train/ants/892108839_f1aad4ca46.jpg \n",
192
+ " inflating: hymenoptera_data/train/ants/938946700_ca1c669085.jpg \n",
193
+ " inflating: hymenoptera_data/train/ants/957233405_25c1d1187b.jpg \n",
194
+ " inflating: hymenoptera_data/train/ants/9715481_b3cb4114ff.jpg \n",
195
+ " inflating: hymenoptera_data/train/ants/998118368_6ac1d91f81.jpg \n",
196
+ " inflating: hymenoptera_data/train/ants/ant photos.jpg \n",
197
+ " inflating: hymenoptera_data/train/ants/Ant_1.jpg \n",
198
+ " inflating: hymenoptera_data/train/ants/army-ants-red-picture.jpg \n",
199
+ " inflating: hymenoptera_data/train/ants/formica.jpeg \n",
200
+ " inflating: hymenoptera_data/train/ants/hormiga_co_por.jpg \n",
201
+ " inflating: hymenoptera_data/train/ants/imageNotFound.gif \n",
202
+ " inflating: hymenoptera_data/train/ants/kurokusa.jpg \n",
203
+ " inflating: hymenoptera_data/train/ants/MehdiabadiAnt2_600.jpg \n",
204
+ " inflating: hymenoptera_data/train/ants/Nepenthes_rafflesiana_ant.jpg \n",
205
+ " inflating: hymenoptera_data/train/ants/swiss-army-ant.jpg \n",
206
+ " inflating: hymenoptera_data/train/ants/termite-vs-ant.jpg \n",
207
+ " inflating: hymenoptera_data/train/ants/trap-jaw-ant-insect-bg.jpg \n",
208
+ " inflating: hymenoptera_data/train/ants/VietnameseAntMimicSpider.jpg \n",
209
+ " creating: hymenoptera_data/train/bees/\n",
210
+ " inflating: hymenoptera_data/train/bees/1092977343_cb42b38d62.jpg \n",
211
+ " inflating: hymenoptera_data/train/bees/1093831624_fb5fbe2308.jpg \n",
212
+ " inflating: hymenoptera_data/train/bees/1097045929_1753d1c765.jpg \n",
213
+ " inflating: hymenoptera_data/train/bees/1232245714_f862fbe385.jpg \n",
214
+ " inflating: hymenoptera_data/train/bees/129236073_0985e91c7d.jpg \n",
215
+ " inflating: hymenoptera_data/train/bees/1295655112_7813f37d21.jpg \n",
216
+ " inflating: hymenoptera_data/train/bees/132511197_0b86ad0fff.jpg \n",
217
+ " inflating: hymenoptera_data/train/bees/132826773_dbbcb117b9.jpg \n",
218
+ " inflating: hymenoptera_data/train/bees/150013791_969d9a968b.jpg \n",
219
+ " inflating: hymenoptera_data/train/bees/1508176360_2972117c9d.jpg \n",
220
+ " inflating: hymenoptera_data/train/bees/154600396_53e1252e52.jpg \n",
221
+ " inflating: hymenoptera_data/train/bees/16838648_415acd9e3f.jpg \n",
222
+ " inflating: hymenoptera_data/train/bees/1691282715_0addfdf5e8.jpg \n",
223
+ " inflating: hymenoptera_data/train/bees/17209602_fe5a5a746f.jpg \n",
224
+ " inflating: hymenoptera_data/train/bees/174142798_e5ad6d76e0.jpg \n",
225
+ " inflating: hymenoptera_data/train/bees/1799726602_8580867f71.jpg \n",
226
+ " inflating: hymenoptera_data/train/bees/1807583459_4fe92b3133.jpg \n",
227
+ " inflating: hymenoptera_data/train/bees/196430254_46bd129ae7.jpg \n",
228
+ " inflating: hymenoptera_data/train/bees/196658222_3fffd79c67.jpg \n",
229
+ " inflating: hymenoptera_data/train/bees/198508668_97d818b6c4.jpg \n",
230
+ " inflating: hymenoptera_data/train/bees/2031225713_50ed499635.jpg \n",
231
+ " inflating: hymenoptera_data/train/bees/2037437624_2d7bce461f.jpg \n",
232
+ " inflating: hymenoptera_data/train/bees/2053200300_8911ef438a.jpg \n",
233
+ " inflating: hymenoptera_data/train/bees/205835650_e6f2614bee.jpg \n",
234
+ " inflating: hymenoptera_data/train/bees/208702903_42fb4d9748.jpg \n",
235
+ " inflating: hymenoptera_data/train/bees/21399619_3e61e5bb6f.jpg \n",
236
+ " inflating: hymenoptera_data/train/bees/2227611847_ec72d40403.jpg \n",
237
+ " inflating: hymenoptera_data/train/bees/2321139806_d73d899e66.jpg \n",
238
+ " inflating: hymenoptera_data/train/bees/2330918208_8074770c20.jpg \n",
239
+ " inflating: hymenoptera_data/train/bees/2345177635_caf07159b3.jpg \n",
240
+ " inflating: hymenoptera_data/train/bees/2358061370_9daabbd9ac.jpg \n",
241
+ " inflating: hymenoptera_data/train/bees/2364597044_3c3e3fc391.jpg \n",
242
+ " inflating: hymenoptera_data/train/bees/2384149906_2cd8b0b699.jpg \n",
243
+ " inflating: hymenoptera_data/train/bees/2397446847_04ef3cd3e1.jpg \n",
244
+ " inflating: hymenoptera_data/train/bees/2405441001_b06c36fa72.jpg \n",
245
+ " inflating: hymenoptera_data/train/bees/2445215254_51698ff797.jpg \n",
246
+ " inflating: hymenoptera_data/train/bees/2452236943_255bfd9e58.jpg \n",
247
+ " inflating: hymenoptera_data/train/bees/2467959963_a7831e9ff0.jpg \n",
248
+ " inflating: hymenoptera_data/train/bees/2470492904_837e97800d.jpg \n",
249
+ " inflating: hymenoptera_data/train/bees/2477324698_3d4b1b1cab.jpg \n",
250
+ " inflating: hymenoptera_data/train/bees/2477349551_e75c97cf4d.jpg \n",
251
+ " inflating: hymenoptera_data/train/bees/2486729079_62df0920be.jpg \n",
252
+ " inflating: hymenoptera_data/train/bees/2486746709_c43cec0e42.jpg \n",
253
+ " inflating: hymenoptera_data/train/bees/2493379287_4100e1dacc.jpg \n",
254
+ " inflating: hymenoptera_data/train/bees/2495722465_879acf9d85.jpg \n",
255
+ " inflating: hymenoptera_data/train/bees/2528444139_fa728b0f5b.jpg \n",
256
+ " inflating: hymenoptera_data/train/bees/2538361678_9da84b77e3.jpg \n",
257
+ " inflating: hymenoptera_data/train/bees/2551813042_8a070aeb2b.jpg \n",
258
+ " inflating: hymenoptera_data/train/bees/2580598377_a4caecdb54.jpg \n",
259
+ " inflating: hymenoptera_data/train/bees/2601176055_8464e6aa71.jpg \n",
260
+ " inflating: hymenoptera_data/train/bees/2610833167_79bf0bcae5.jpg \n",
261
+ " inflating: hymenoptera_data/train/bees/2610838525_fe8e3cae47.jpg \n",
262
+ " inflating: hymenoptera_data/train/bees/2617161745_fa3ebe85b4.jpg \n",
263
+ " inflating: hymenoptera_data/train/bees/2625499656_e3415e374d.jpg \n",
264
+ " inflating: hymenoptera_data/train/bees/2634617358_f32fd16bea.jpg \n",
265
+ " inflating: hymenoptera_data/train/bees/2638074627_6b3ae746a0.jpg \n",
266
+ " inflating: hymenoptera_data/train/bees/2645107662_b73a8595cc.jpg \n",
267
+ " inflating: hymenoptera_data/train/bees/2651621464_a2fa8722eb.jpg \n",
268
+ " inflating: hymenoptera_data/train/bees/2652877533_a564830cbf.jpg \n",
269
+ " inflating: hymenoptera_data/train/bees/266644509_d30bb16a1b.jpg \n",
270
+ " inflating: hymenoptera_data/train/bees/2683605182_9d2a0c66cf.jpg \n",
271
+ " inflating: hymenoptera_data/train/bees/2704348794_eb5d5178c2.jpg \n",
272
+ " inflating: hymenoptera_data/train/bees/2707440199_cd170bd512.jpg \n",
273
+ " inflating: hymenoptera_data/train/bees/2710368626_cb42882dc8.jpg \n",
274
+ " inflating: hymenoptera_data/train/bees/2722592222_258d473e17.jpg \n",
275
+ " inflating: hymenoptera_data/train/bees/2728759455_ce9bb8cd7a.jpg \n",
276
+ " inflating: hymenoptera_data/train/bees/2756397428_1d82a08807.jpg \n",
277
+ " inflating: hymenoptera_data/train/bees/2765347790_da6cf6cb40.jpg \n",
278
+ " inflating: hymenoptera_data/train/bees/2781170484_5d61835d63.jpg \n",
279
+ " inflating: hymenoptera_data/train/bees/279113587_b4843db199.jpg \n",
280
+ " inflating: hymenoptera_data/train/bees/2792000093_e8ae0718cf.jpg \n",
281
+ " inflating: hymenoptera_data/train/bees/2801728106_833798c909.jpg \n",
282
+ " inflating: hymenoptera_data/train/bees/2822388965_f6dca2a275.jpg \n",
283
+ " inflating: hymenoptera_data/train/bees/2861002136_52c7c6f708.jpg \n",
284
+ " inflating: hymenoptera_data/train/bees/2908916142_a7ac8b57a8.jpg \n",
285
+ " inflating: hymenoptera_data/train/bees/29494643_e3410f0d37.jpg \n",
286
+ " inflating: hymenoptera_data/train/bees/2959730355_416a18c63c.jpg \n",
287
+ " inflating: hymenoptera_data/train/bees/2962405283_22718d9617.jpg \n",
288
+ " inflating: hymenoptera_data/train/bees/3006264892_30e9cced70.jpg \n",
289
+ " inflating: hymenoptera_data/train/bees/3030189811_01d095b793.jpg \n",
290
+ " inflating: hymenoptera_data/train/bees/3030772428_8578335616.jpg \n",
291
+ " inflating: hymenoptera_data/train/bees/3044402684_3853071a87.jpg \n",
292
+ " inflating: hymenoptera_data/train/bees/3074585407_9854eb3153.jpg \n",
293
+ " inflating: hymenoptera_data/train/bees/3079610310_ac2d0ae7bc.jpg \n",
294
+ " inflating: hymenoptera_data/train/bees/3090975720_71f12e6de4.jpg \n",
295
+ " inflating: hymenoptera_data/train/bees/3100226504_c0d4f1e3f1.jpg \n",
296
+ " inflating: hymenoptera_data/train/bees/342758693_c56b89b6b6.jpg \n",
297
+ " inflating: hymenoptera_data/train/bees/354167719_22dca13752.jpg \n",
298
+ " inflating: hymenoptera_data/train/bees/359928878_b3b418c728.jpg \n",
299
+ " inflating: hymenoptera_data/train/bees/365759866_b15700c59b.jpg \n",
300
+ " inflating: hymenoptera_data/train/bees/36900412_92b81831ad.jpg \n",
301
+ " inflating: hymenoptera_data/train/bees/39672681_1302d204d1.jpg \n",
302
+ " inflating: hymenoptera_data/train/bees/39747887_42df2855ee.jpg \n",
303
+ " inflating: hymenoptera_data/train/bees/421515404_e87569fd8b.jpg \n",
304
+ " inflating: hymenoptera_data/train/bees/444532809_9e931e2279.jpg \n",
305
+ " inflating: hymenoptera_data/train/bees/446296270_d9e8b93ecf.jpg \n",
306
+ " inflating: hymenoptera_data/train/bees/452462677_7be43af8ff.jpg \n",
307
+ " inflating: hymenoptera_data/train/bees/452462695_40a4e5b559.jpg \n",
308
+ " inflating: hymenoptera_data/train/bees/457457145_5f86eb7e9c.jpg \n",
309
+ " inflating: hymenoptera_data/train/bees/465133211_80e0c27f60.jpg \n",
310
+ " inflating: hymenoptera_data/train/bees/469333327_358ba8fe8a.jpg \n",
311
+ " inflating: hymenoptera_data/train/bees/472288710_2abee16fa0.jpg \n",
312
+ " inflating: hymenoptera_data/train/bees/473618094_8ffdcab215.jpg \n",
313
+ " inflating: hymenoptera_data/train/bees/476347960_52edd72b06.jpg \n",
314
+ " inflating: hymenoptera_data/train/bees/478701318_bbd5e557b8.jpg \n",
315
+ " inflating: hymenoptera_data/train/bees/507288830_f46e8d4cb2.jpg \n",
316
+ " inflating: hymenoptera_data/train/bees/509247772_2db2d01374.jpg \n",
317
+ " inflating: hymenoptera_data/train/bees/513545352_fd3e7c7c5d.jpg \n",
318
+ " inflating: hymenoptera_data/train/bees/522104315_5d3cb2758e.jpg \n",
319
+ " inflating: hymenoptera_data/train/bees/537309131_532bfa59ea.jpg \n",
320
+ " inflating: hymenoptera_data/train/bees/586041248_3032e277a9.jpg \n",
321
+ " inflating: hymenoptera_data/train/bees/760526046_547e8b381f.jpg \n",
322
+ " inflating: hymenoptera_data/train/bees/760568592_45a52c847f.jpg \n",
323
+ " inflating: hymenoptera_data/train/bees/774440991_63a4aa0cbe.jpg \n",
324
+ " inflating: hymenoptera_data/train/bees/85112639_6e860b0469.jpg \n",
325
+ " inflating: hymenoptera_data/train/bees/873076652_eb098dab2d.jpg \n",
326
+ " inflating: hymenoptera_data/train/bees/90179376_abc234e5f4.jpg \n",
327
+ " inflating: hymenoptera_data/train/bees/92663402_37f379e57a.jpg \n",
328
+ " inflating: hymenoptera_data/train/bees/95238259_98470c5b10.jpg \n",
329
+ " inflating: hymenoptera_data/train/bees/969455125_58c797ef17.jpg \n",
330
+ " inflating: hymenoptera_data/train/bees/98391118_bdb1e80cce.jpg \n",
331
+ " creating: hymenoptera_data/val/\n",
332
+ " creating: hymenoptera_data/val/ants/\n",
333
+ " inflating: hymenoptera_data/val/ants/10308379_1b6c72e180.jpg \n",
334
+ " inflating: hymenoptera_data/val/ants/1053149811_f62a3410d3.jpg \n",
335
+ " inflating: hymenoptera_data/val/ants/1073564163_225a64f170.jpg \n",
336
+ " inflating: hymenoptera_data/val/ants/1119630822_cd325ea21a.jpg \n",
337
+ " inflating: hymenoptera_data/val/ants/1124525276_816a07c17f.jpg \n",
338
+ " inflating: hymenoptera_data/val/ants/11381045_b352a47d8c.jpg \n",
339
+ " inflating: hymenoptera_data/val/ants/119785936_dd428e40c3.jpg \n",
340
+ " inflating: hymenoptera_data/val/ants/1247887232_edcb61246c.jpg \n",
341
+ " inflating: hymenoptera_data/val/ants/1262751255_c56c042b7b.jpg \n",
342
+ " inflating: hymenoptera_data/val/ants/1337725712_2eb53cd742.jpg \n",
343
+ " inflating: hymenoptera_data/val/ants/1358854066_5ad8015f7f.jpg \n",
344
+ " inflating: hymenoptera_data/val/ants/1440002809_b268d9a66a.jpg \n",
345
+ " inflating: hymenoptera_data/val/ants/147542264_79506478c2.jpg \n",
346
+ " inflating: hymenoptera_data/val/ants/152286280_411648ec27.jpg \n",
347
+ " inflating: hymenoptera_data/val/ants/153320619_2aeb5fa0ee.jpg \n",
348
+ " inflating: hymenoptera_data/val/ants/153783656_85f9c3ac70.jpg \n",
349
+ " inflating: hymenoptera_data/val/ants/157401988_d0564a9d02.jpg \n",
350
+ " inflating: hymenoptera_data/val/ants/159515240_d5981e20d1.jpg \n",
351
+ " inflating: hymenoptera_data/val/ants/161076144_124db762d6.jpg \n",
352
+ " inflating: hymenoptera_data/val/ants/161292361_c16e0bf57a.jpg \n",
353
+ " inflating: hymenoptera_data/val/ants/170652283_ecdaff5d1a.jpg \n",
354
+ " inflating: hymenoptera_data/val/ants/17081114_79b9a27724.jpg \n",
355
+ " inflating: hymenoptera_data/val/ants/172772109_d0a8e15fb0.jpg \n",
356
+ " inflating: hymenoptera_data/val/ants/1743840368_b5ccda82b7.jpg \n",
357
+ " inflating: hymenoptera_data/val/ants/181942028_961261ef48.jpg \n",
358
+ " inflating: hymenoptera_data/val/ants/183260961_64ab754c97.jpg \n",
359
+ " inflating: hymenoptera_data/val/ants/2039585088_c6f47c592e.jpg \n",
360
+ " inflating: hymenoptera_data/val/ants/205398178_c395c5e460.jpg \n",
361
+ " inflating: hymenoptera_data/val/ants/208072188_f293096296.jpg \n",
362
+ " inflating: hymenoptera_data/val/ants/209615353_eeb38ba204.jpg \n",
363
+ " inflating: hymenoptera_data/val/ants/2104709400_8831b4fc6f.jpg \n",
364
+ " inflating: hymenoptera_data/val/ants/212100470_b485e7b7b9.jpg \n",
365
+ " inflating: hymenoptera_data/val/ants/2127908701_d49dc83c97.jpg \n",
366
+ " inflating: hymenoptera_data/val/ants/2191997003_379df31291.jpg \n",
367
+ " inflating: hymenoptera_data/val/ants/2211974567_ee4606b493.jpg \n",
368
+ " inflating: hymenoptera_data/val/ants/2219621907_47bc7cc6b0.jpg \n",
369
+ " inflating: hymenoptera_data/val/ants/2238242353_52c82441df.jpg \n",
370
+ " inflating: hymenoptera_data/val/ants/2255445811_dabcdf7258.jpg \n",
371
+ " inflating: hymenoptera_data/val/ants/239161491_86ac23b0a3.jpg \n",
372
+ " inflating: hymenoptera_data/val/ants/263615709_cfb28f6b8e.jpg \n",
373
+ " inflating: hymenoptera_data/val/ants/308196310_1db5ffa01b.jpg \n",
374
+ " inflating: hymenoptera_data/val/ants/319494379_648fb5a1c6.jpg \n",
375
+ " inflating: hymenoptera_data/val/ants/35558229_1fa4608a7a.jpg \n",
376
+ " inflating: hymenoptera_data/val/ants/412436937_4c2378efc2.jpg \n",
377
+ " inflating: hymenoptera_data/val/ants/436944325_d4925a38c7.jpg \n",
378
+ " inflating: hymenoptera_data/val/ants/445356866_6cb3289067.jpg \n",
379
+ " inflating: hymenoptera_data/val/ants/459442412_412fecf3fe.jpg \n",
380
+ " inflating: hymenoptera_data/val/ants/470127071_8b8ee2bd74.jpg \n",
381
+ " inflating: hymenoptera_data/val/ants/477437164_bc3e6e594a.jpg \n",
382
+ " inflating: hymenoptera_data/val/ants/488272201_c5aa281348.jpg \n",
383
+ " inflating: hymenoptera_data/val/ants/502717153_3e4865621a.jpg \n",
384
+ " inflating: hymenoptera_data/val/ants/518746016_bcc28f8b5b.jpg \n",
385
+ " inflating: hymenoptera_data/val/ants/540543309_ddbb193ee5.jpg \n",
386
+ " inflating: hymenoptera_data/val/ants/562589509_7e55469b97.jpg \n",
387
+ " inflating: hymenoptera_data/val/ants/57264437_a19006872f.jpg \n",
388
+ " inflating: hymenoptera_data/val/ants/573151833_ebbc274b77.jpg \n",
389
+ " inflating: hymenoptera_data/val/ants/649407494_9b6bc4949f.jpg \n",
390
+ " inflating: hymenoptera_data/val/ants/751649788_78dd7d16ce.jpg \n",
391
+ " inflating: hymenoptera_data/val/ants/768870506_8f115d3d37.jpg \n",
392
+ " inflating: hymenoptera_data/val/ants/800px-Meat_eater_ant_qeen_excavating_hole.jpg \n",
393
+ " inflating: hymenoptera_data/val/ants/8124241_36b290d372.jpg \n",
394
+ " inflating: hymenoptera_data/val/ants/8398478_50ef10c47a.jpg \n",
395
+ " inflating: hymenoptera_data/val/ants/854534770_31f6156383.jpg \n",
396
+ " inflating: hymenoptera_data/val/ants/892676922_4ab37dce07.jpg \n",
397
+ " inflating: hymenoptera_data/val/ants/94999827_36895faade.jpg \n",
398
+ " inflating: hymenoptera_data/val/ants/Ant-1818.jpg \n",
399
+ " inflating: hymenoptera_data/val/ants/ants-devouring-remains-of-large-dead-insect-on-red-tile-in-Stellenbosch-South-Africa-closeup-1-DHD.jpg \n",
400
+ " inflating: hymenoptera_data/val/ants/desert_ant.jpg \n",
401
+ " inflating: hymenoptera_data/val/ants/F.pergan.28(f).jpg \n",
402
+ " inflating: hymenoptera_data/val/ants/Hormiga.jpg \n",
403
+ " creating: hymenoptera_data/val/bees/\n",
404
+ " inflating: hymenoptera_data/val/bees/1032546534_06907fe3b3.jpg \n",
405
+ " inflating: hymenoptera_data/val/bees/10870992_eebeeb3a12.jpg \n",
406
+ " inflating: hymenoptera_data/val/bees/1181173278_23c36fac71.jpg \n",
407
+ " inflating: hymenoptera_data/val/bees/1297972485_33266a18d9.jpg \n",
408
+ " inflating: hymenoptera_data/val/bees/1328423762_f7a88a8451.jpg \n",
409
+ " inflating: hymenoptera_data/val/bees/1355974687_1341c1face.jpg \n",
410
+ " inflating: hymenoptera_data/val/bees/144098310_a4176fd54d.jpg \n",
411
+ " inflating: hymenoptera_data/val/bees/1486120850_490388f84b.jpg \n",
412
+ " inflating: hymenoptera_data/val/bees/149973093_da3c446268.jpg \n",
413
+ " inflating: hymenoptera_data/val/bees/151594775_ee7dc17b60.jpg \n",
414
+ " inflating: hymenoptera_data/val/bees/151603988_2c6f7d14c7.jpg \n",
415
+ " inflating: hymenoptera_data/val/bees/1519368889_4270261ee3.jpg \n",
416
+ " inflating: hymenoptera_data/val/bees/152789693_220b003452.jpg \n",
417
+ " inflating: hymenoptera_data/val/bees/177677657_a38c97e572.jpg \n",
418
+ " inflating: hymenoptera_data/val/bees/1799729694_0c40101071.jpg \n",
419
+ " inflating: hymenoptera_data/val/bees/181171681_c5a1a82ded.jpg \n",
420
+ " inflating: hymenoptera_data/val/bees/187130242_4593a4c610.jpg \n",
421
+ " inflating: hymenoptera_data/val/bees/203868383_0fcbb48278.jpg \n",
422
+ " inflating: hymenoptera_data/val/bees/2060668999_e11edb10d0.jpg \n",
423
+ " inflating: hymenoptera_data/val/bees/2086294791_6f3789d8a6.jpg \n",
424
+ " inflating: hymenoptera_data/val/bees/2103637821_8d26ee6b90.jpg \n",
425
+ " inflating: hymenoptera_data/val/bees/2104135106_a65eede1de.jpg \n",
426
+ " inflating: hymenoptera_data/val/bees/215512424_687e1e0821.jpg \n",
427
+ " inflating: hymenoptera_data/val/bees/2173503984_9c6aaaa7e2.jpg \n",
428
+ " inflating: hymenoptera_data/val/bees/220376539_20567395d8.jpg \n",
429
+ " inflating: hymenoptera_data/val/bees/224841383_d050f5f510.jpg \n",
430
+ " inflating: hymenoptera_data/val/bees/2321144482_f3785ba7b2.jpg \n",
431
+ " inflating: hymenoptera_data/val/bees/238161922_55fa9a76ae.jpg \n",
432
+ " inflating: hymenoptera_data/val/bees/2407809945_fb525ef54d.jpg \n",
433
+ " inflating: hymenoptera_data/val/bees/2415414155_1916f03b42.jpg \n",
434
+ " inflating: hymenoptera_data/val/bees/2438480600_40a1249879.jpg \n",
435
+ " inflating: hymenoptera_data/val/bees/2444778727_4b781ac424.jpg \n",
436
+ " inflating: hymenoptera_data/val/bees/2457841282_7867f16639.jpg \n",
437
+ " inflating: hymenoptera_data/val/bees/2470492902_3572c90f75.jpg \n",
438
+ " inflating: hymenoptera_data/val/bees/2478216347_535c8fe6d7.jpg \n",
439
+ " inflating: hymenoptera_data/val/bees/2501530886_e20952b97d.jpg \n",
440
+ " inflating: hymenoptera_data/val/bees/2506114833_90a41c5267.jpg \n",
441
+ " inflating: hymenoptera_data/val/bees/2509402554_31821cb0b6.jpg \n",
442
+ " inflating: hymenoptera_data/val/bees/2525379273_dcb26a516d.jpg \n",
443
+ " inflating: hymenoptera_data/val/bees/26589803_5ba7000313.jpg \n",
444
+ " inflating: hymenoptera_data/val/bees/2668391343_45e272cd07.jpg \n",
445
+ " inflating: hymenoptera_data/val/bees/2670536155_c170f49cd0.jpg \n",
446
+ " inflating: hymenoptera_data/val/bees/2685605303_9eed79d59d.jpg \n",
447
+ " inflating: hymenoptera_data/val/bees/2702408468_d9ed795f4f.jpg \n",
448
+ " inflating: hymenoptera_data/val/bees/2709775832_85b4b50a57.jpg \n",
449
+ " inflating: hymenoptera_data/val/bees/2717418782_bd83307d9f.jpg \n",
450
+ " inflating: hymenoptera_data/val/bees/272986700_d4d4bf8c4b.jpg \n",
451
+ " inflating: hymenoptera_data/val/bees/2741763055_9a7bb00802.jpg \n",
452
+ " inflating: hymenoptera_data/val/bees/2745389517_250a397f31.jpg \n",
453
+ " inflating: hymenoptera_data/val/bees/2751836205_6f7b5eff30.jpg \n",
454
+ " inflating: hymenoptera_data/val/bees/2782079948_8d4e94a826.jpg \n",
455
+ " inflating: hymenoptera_data/val/bees/2809496124_5f25b5946a.jpg \n",
456
+ " inflating: hymenoptera_data/val/bees/2815838190_0a9889d995.jpg \n",
457
+ " inflating: hymenoptera_data/val/bees/2841437312_789699c740.jpg \n",
458
+ " inflating: hymenoptera_data/val/bees/2883093452_7e3a1eb53f.jpg \n",
459
+ " inflating: hymenoptera_data/val/bees/290082189_f66cb80bfc.jpg \n",
460
+ " inflating: hymenoptera_data/val/bees/296565463_d07a7bed96.jpg \n",
461
+ " inflating: hymenoptera_data/val/bees/3077452620_548c79fda0.jpg \n",
462
+ " inflating: hymenoptera_data/val/bees/348291597_ee836fbb1a.jpg \n",
463
+ " inflating: hymenoptera_data/val/bees/350436573_41f4ecb6c8.jpg \n",
464
+ " inflating: hymenoptera_data/val/bees/353266603_d3eac7e9a0.jpg \n",
465
+ " inflating: hymenoptera_data/val/bees/372228424_16da1f8884.jpg \n",
466
+ " inflating: hymenoptera_data/val/bees/400262091_701c00031c.jpg \n",
467
+ " inflating: hymenoptera_data/val/bees/416144384_961c326481.jpg \n",
468
+ " inflating: hymenoptera_data/val/bees/44105569_16720a960c.jpg \n",
469
+ " inflating: hymenoptera_data/val/bees/456097971_860949c4fc.jpg \n",
470
+ " inflating: hymenoptera_data/val/bees/464594019_1b24a28bb1.jpg \n",
471
+ " inflating: hymenoptera_data/val/bees/485743562_d8cc6b8f73.jpg \n",
472
+ " inflating: hymenoptera_data/val/bees/540976476_844950623f.jpg \n",
473
+ " inflating: hymenoptera_data/val/bees/54736755_c057723f64.jpg \n",
474
+ " inflating: hymenoptera_data/val/bees/57459255_752774f1b2.jpg \n",
475
+ " inflating: hymenoptera_data/val/bees/576452297_897023f002.jpg \n",
476
+ " inflating: hymenoptera_data/val/bees/586474709_ae436da045.jpg \n",
477
+ " inflating: hymenoptera_data/val/bees/590318879_68cf112861.jpg \n",
478
+ " inflating: hymenoptera_data/val/bees/59798110_2b6a3c8031.jpg \n",
479
+ " inflating: hymenoptera_data/val/bees/603709866_a97c7cfc72.jpg \n",
480
+ " inflating: hymenoptera_data/val/bees/603711658_4c8cd2201e.jpg \n",
481
+ " inflating: hymenoptera_data/val/bees/65038344_52a45d090d.jpg \n",
482
+ " inflating: hymenoptera_data/val/bees/6a00d8341c630a53ef00e553d0beb18834-800wi.jpg \n",
483
+ " inflating: hymenoptera_data/val/bees/72100438_73de9f17af.jpg \n",
484
+ " inflating: hymenoptera_data/val/bees/759745145_e8bc776ec8.jpg \n",
485
+ " inflating: hymenoptera_data/val/bees/936182217_c4caa5222d.jpg \n",
486
+ " inflating: hymenoptera_data/val/bees/abeja.jpg \n"
487
+ ]
488
+ }
489
+ ],
490
+ "source": [
491
+ "# download the data\n",
492
+ "!wget https://download.pytorch.org/tutorial/hymenoptera_data.zip\n",
493
+ "!unzip hymenoptera_data.zip"
494
+ ]
495
+ },
496
+ {
497
+ "cell_type": "code",
498
+ "execution_count": null,
499
+ "metadata": {
500
+ "id": "var371SKtNyx"
501
+ },
502
+ "outputs": [],
503
+ "source": [
504
+ "# create data loaders\n",
505
+ "\n",
506
+ "data_dir = 'hymenoptera_data'\n",
507
+ "\n",
508
+ "# custom transformer to flatten the image tensors\n",
509
+ "class ReshapeTransform:\n",
510
+ " def __init__(self, new_size):\n",
511
+ " self.new_size = new_size\n",
512
+ "\n",
513
+ " def __call__(self, img):\n",
514
+ " result = torch.reshape(img, self.new_size)\n",
515
+ " return result\n",
516
+ "\n",
517
+ "# transformations used to standardize and normalize the datasets\n",
518
+ "data_transforms = {\n",
519
+ " 'train': transforms.Compose([\n",
520
+ " transforms.Resize(224),\n",
521
+ " transforms.CenterCrop(224),\n",
522
+ " transforms.ToTensor(),\n",
523
+ " ReshapeTransform((-1,)) # flattens the data\n",
524
+ " ]),\n",
525
+ " 'val': transforms.Compose([\n",
526
+ " transforms.Resize(224),\n",
527
+ " transforms.CenterCrop(224),\n",
528
+ " transforms.ToTensor(),\n",
529
+ " ReshapeTransform((-1,)) # flattens the data\n",
530
+ " ]),\n",
531
+ "}\n",
532
+ "\n",
533
+ "# load the correspoding folders\n",
534
+ "image_datasets = {x: datasets.ImageFolder(os.path.join(data_dir, x),\n",
535
+ " data_transforms[x])\n",
536
+ " for x in ['train', 'val']}\n",
537
+ "\n",
538
+ "# load the entire dataset; we are not using minibatches here\n",
539
+ "train_dataset = torch.utils.data.DataLoader(image_datasets['train'],\n",
540
+ " batch_size=len(image_datasets['train']),\n",
541
+ " shuffle=True)\n",
542
+ "\n",
543
+ "test_dataset = torch.utils.data.DataLoader(image_datasets['val'],\n",
544
+ " batch_size=len(image_datasets['val']),\n",
545
+ " shuffle=True)"
546
+ ]
547
+ },
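+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As a quick sanity check on the loaders, the minimal sketch below (assuming the cells above ran successfully) pulls one batch and confirms that `ReshapeTransform` flattened each `3 x 224 x 224` image into a single vector of length `3 * 224 * 224 = 150528`:\n",
+ "\n",
+ "```python\n",
+ "x, y = next(iter(train_dataset))\n",
+ "print(x.shape) # expected: torch.Size([244, 150528]), i.e. all 244 training images in one flattened batch\n",
+ "```"
+ ]
+ },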
548
+ {
549
+ "cell_type": "code",
550
+ "execution_count": null,
551
+ "metadata": {
552
+ "id": "gc9G-ZTRulDD"
553
+ },
554
+ "outputs": [],
555
+ "source": [
556
+ "# build the LR model\n",
557
+ "class LR(nn.Module):\n",
558
+ " def __init__(self, dim):\n",
559
+ " super(LR, self).__init__()\n",
560
+ " self.linear = nn.Linear(dim, 1)\n",
561
+ " nn.init.zeros_(self.linear.weight)\n",
562
+ " nn.init.zeros_(self.linear.bias)\n",
563
+ "\n",
564
+ " def forward(self, x):\n",
565
+ " x = self.linear(x)\n",
566
+ " x = torch.sigmoid(x)\n",
567
+ " return x "
568
+ ]
569
+ },
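+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Because both the weight and the bias are initialized to zero, the untrained model outputs `sigmoid(0) = 0.5` for every input. A minimal sketch to verify this, using a toy input dimension of 4:\n",
+ "\n",
+ "```python\n",
+ "toy_model = LR(4)\n",
+ "print(toy_model(torch.randn(2, 4))) # both outputs should be exactly 0.5 before training\n",
+ "```"
+ ]
+ },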
570
+ {
571
+ "cell_type": "code",
572
+ "execution_count": null,
573
+ "metadata": {
574
+ "id": "WfSUxBpL6BV1"
575
+ },
576
+ "outputs": [],
577
+ "source": [
578
+ "# predict function\n",
579
+ "def predict(yhat, y):\n",
580
+ " yhat = yhat.squeeze()\n",
581
+ " y = y.unsqueeze(0) \n",
582
+ " y_prediction = torch.zeros(y.size()[1])\n",
583
+ " for i in range(yhat.shape[0]):\n",
584
+ " if yhat[i] <= 0.5:\n",
585
+ " y_prediction[i] = 0\n",
586
+ " else:\n",
587
+ " y_prediction[i] = 1\n",
588
+ " return 100 - torch.mean(torch.abs(y_prediction - y)) * 100"
589
+ ]
590
+ },
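+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The element-wise loop above works, but the thresholding can also be done in a single vectorized step. A minimal equivalent sketch (`predict_vectorized` is just an illustrative name):\n",
+ "\n",
+ "```python\n",
+ "def predict_vectorized(yhat, y):\n",
+ "    # threshold the probabilities at 0.5, then report accuracy in percent\n",
+ "    y_prediction = (yhat.squeeze() > 0.5).float()\n",
+ "    return 100 - torch.mean(torch.abs(y_prediction - y.squeeze().float())) * 100\n",
+ "```"
+ ]
+ },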
591
+ {
592
+ "cell_type": "code",
593
+ "execution_count": null,
594
+ "metadata": {
595
+ "id": "LL5DrdjqxI7m"
596
+ },
597
+ "outputs": [],
598
+ "source": [
599
+ "# model config\n",
600
+ "dim = train_dataset.dataset[0][0].shape[0]\n",
601
+ "\n",
602
+ "lrmodel = LR(dim).to(device)\n",
603
+ "criterion = nn.BCELoss()\n",
604
+ "optimizer = torch.optim.SGD(lrmodel.parameters(), lr=0.0001)"
605
+ ]
606
+ },
607
+ {
608
+ "cell_type": "code",
609
+ "execution_count": null,
610
+ "metadata": {
611
+ "colab": {
612
+ "base_uri": "https://localhost:8080/"
613
+ },
614
+ "id": "i3s0mxFq6LJ6",
615
+ "outputId": "66126bae-bd85-46d2-b6e4-74f7e332b469"
616
+ },
617
+ "outputs": [
618
+ {
619
+ "name": "stdout",
620
+ "output_type": "stream",
621
+ "text": [
622
+ "Cost after iteration 0: 0.6931472420692444 | Train Acc: 50.40983581542969 | Test Acc: 45.75163269042969\n",
623
+ "Cost after iteration 10: 0.6691471338272095 | Train Acc: 64.3442611694336 | Test Acc: 54.24836730957031\n",
624
+ "Cost after iteration 20: 0.6513183116912842 | Train Acc: 68.44261932373047 | Test Acc: 54.24836730957031\n",
625
+ "Cost after iteration 30: 0.6367825269699097 | Train Acc: 68.03278350830078 | Test Acc: 54.24836730957031\n",
626
+ "Cost after iteration 40: 0.6245337128639221 | Train Acc: 69.67213439941406 | Test Acc: 54.90196228027344\n",
627
+ "Cost after iteration 50: 0.6139225959777832 | Train Acc: 70.90164184570312 | Test Acc: 56.20914840698242\n",
628
+ "Cost after iteration 60: 0.6045235991477966 | Train Acc: 72.54098510742188 | Test Acc: 56.86274337768555\n",
629
+ "Cost after iteration 70: 0.5960512161254883 | Train Acc: 74.18032836914062 | Test Acc: 57.51633834838867\n",
630
+ "Cost after iteration 80: 0.5883085131645203 | Train Acc: 73.77049255371094 | Test Acc: 57.51633834838867\n",
631
+ "Cost after iteration 90: 0.5811558365821838 | Train Acc: 74.59016418457031 | Test Acc: 58.1699333190918\n",
632
+ "Cost after iteration 100: 0.5744911432266235 | Train Acc: 75.0 | Test Acc: 59.47712326049805\n",
633
+ "Cost after iteration 110: 0.5682383179664612 | Train Acc: 75.40983581542969 | Test Acc: 60.13071823120117\n",
634
+ "Cost after iteration 120: 0.5623383522033691 | Train Acc: 75.81967163085938 | Test Acc: 60.13071823120117\n",
635
+ "Cost after iteration 130: 0.5567454099655151 | Train Acc: 75.81967163085938 | Test Acc: 59.47712326049805\n",
636
+ "Cost after iteration 140: 0.5514224767684937 | Train Acc: 75.81967163085938 | Test Acc: 59.47712326049805\n",
637
+ "Cost after iteration 150: 0.5463394522666931 | Train Acc: 76.22950744628906 | Test Acc: 58.82352828979492\n",
638
+ "Cost after iteration 160: 0.5414711833000183 | Train Acc: 76.63934326171875 | Test Acc: 58.82352828979492\n",
639
+ "Cost after iteration 170: 0.5367969274520874 | Train Acc: 77.04917907714844 | Test Acc: 58.82352828979492\n",
640
+ "Cost after iteration 180: 0.5322986841201782 | Train Acc: 77.04917907714844 | Test Acc: 58.82352828979492\n",
641
+ "Cost after iteration 190: 0.5279611349105835 | Train Acc: 77.45901489257812 | Test Acc: 58.82352828979492\n"
642
+ ]
643
+ }
644
+ ],
645
+ "source": [
646
+ "# training the model\n",
647
+ "costs = []\n",
648
+ "\n",
649
+ "for ITER in range(200):\n",
650
+ " lrmodel.train()\n",
651
+ " x, y = next(iter(train_dataset))\n",
652
+ " test_x, test_y = next(iter(test_dataset))\n",
653
+ "\n",
654
+ " # forward\n",
655
+ " yhat = lrmodel.forward(x.to(device))\n",
656
+ "\n",
657
+ " cost = criterion(yhat.squeeze(), y.type(torch.FloatTensor))\n",
658
+ " train_pred = predict(yhat, y)\n",
659
+ "\n",
660
+ " # backward\n",
661
+ " optimizer.zero_grad()\n",
662
+ " cost.backward()\n",
663
+ " optimizer.step()\n",
664
+ " \n",
665
+ " # evaluate\n",
666
+ " lrmodel.eval()\n",
667
+ " with torch.no_grad():\n",
668
+ " yhat_test = lrmodel.forward(test_x.to(device))\n",
669
+ " test_pred = predict(yhat_test, test_y)\n",
670
+ "\n",
671
+ " if ITER % 10 == 0:\n",
672
+ " costs.append(cost)\n",
673
+ "\n",
674
+ " if ITER % 10 == 0:\n",
675
+ " print(\"Cost after iteration {}: {} | Train Acc: {} | Test Acc: {}\".format(ITER, \n",
676
+ " cost, \n",
677
+ " train_pred,\n",
678
+ " test_pred))\n",
679
+ " "
680
+ ]
681
+ },
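+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Since `costs` stores one loss value every 10 iterations, a quick plot shows whether training is converging. A minimal sketch, assuming `matplotlib` is available in the environment:\n",
+ "\n",
+ "```python\n",
+ "import matplotlib.pyplot as plt\n",
+ "\n",
+ "plt.plot(costs)\n",
+ "plt.xlabel(\"iteration (x10)\")\n",
+ "plt.ylabel(\"BCE loss\")\n",
+ "plt.show()\n",
+ "```"
+ ]
+ },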
682
+ {
683
+ "cell_type": "markdown",
684
+ "metadata": {
685
+ "id": "W0Q8WUq9opWB"
686
+ },
687
+ "source": [
688
+ "### References\n",
689
+ "- [A Logistic Regression Model from Scratch](https://colab.research.google.com/drive/1iBoJ0kngkOthy7SgVaVQA1aHEROt5mra?usp=sharing)"
690
+ ]
691
+ }
692
+ ],
693
+ "metadata": {
694
+ "colab": {
695
+ "name": "Concise Logistic Regression.ipynb",
696
+ "provenance": []
697
+ },
698
+ "kernelspec": {
699
+ "display_name": "Python 3 (ipykernel)",
700
+ "language": "python",
701
+ "name": "python3"
702
+ },
703
+ "language_info": {
704
+ "codemirror_mode": {
705
+ "name": "ipython",
706
+ "version": 3
707
+ },
708
+ "file_extension": ".py",
709
+ "mimetype": "text/x-python",
710
+ "name": "python",
711
+ "nbconvert_exporter": "python",
712
+ "pygments_lexer": "ipython3",
713
+ "version": "3.9.12"
714
+ }
715
+ },
716
+ "nbformat": 4,
717
+ "nbformat_minor": 1
718
+ }
05_First_Neural_Network.ipynb ADDED
@@ -0,0 +1,421 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 06 First Neural Network - Image Classifier\n",
9
+ "description: Build a minimal image classifier using MNIST\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/1i94k-n97Z5r1KWV9Vly9IiKnYxf3Tfvu?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "B4QLVt7dOLvR"
24
+ },
25
+ "source": [
26
+ "## First Neural Network: Image Classification \n",
27
+ "\n",
28
+ "Objectives:\n",
29
+ "- Train a minimal image classifier on [MNIST](https://paperswithcode.com/dataset/mnist) using PyTorch\n",
30
+ "- Usese PyTorch and torchvision"
31
+ ]
32
+ },
33
+ {
34
+ "cell_type": "code",
35
+ "execution_count": null,
36
+ "metadata": {
37
+ "id": "GQO-_1VmOKAA"
38
+ },
39
+ "outputs": [],
40
+ "source": [
41
+ "# The usual imports\n",
42
+ "\n",
43
+ "import torch\n",
44
+ "import torch.nn as nn\n",
45
+ "import torchvision\n",
46
+ "import torchvision.transforms as transforms"
47
+ ]
48
+ },
49
+ {
50
+ "cell_type": "code",
51
+ "execution_count": null,
52
+ "metadata": {
53
+ "colab": {
54
+ "base_uri": "https://localhost:8080/",
55
+ "height": 440,
56
+ "referenced_widgets": [
57
+ "1f5bffd15e004f94b4c0c160d9fd8de1",
58
+ "db33884155d845de87219e380dac93e4",
59
+ "f69126416a024d81acba55ed9d55d403",
60
+ "adcb0b32f9d044a7be84399c7c78b0e3",
61
+ "9afc91c828a34f1a9aeeda0ed4ab03f2",
62
+ "7d3f956600f540b99ecb4b34a25406cd",
63
+ "f7a5cf68799f4dd88799c2661023c9cf",
64
+ "b70dd86e5cd346a39d916785e53999f3",
65
+ "10fd4ffeec3b4296accab82cbf669532",
66
+ "b5ce3aa109794b9c8426a2aa6a3db2bb",
67
+ "919a3015ac534f24bd0d429efbd74868",
68
+ "e2c9c5e882014f78808521df5b440fa8",
69
+ "a8e52c852bbc409589950c351dd1ed49",
70
+ "d7c608f697254a309d312b17aa48909e",
71
+ "3837ef6dc033413692d78bee9c886cc7",
72
+ "a87b4fcaebc246b9bb3b9f9e6769c165",
73
+ "2c70b515382e4a78b3d6fcb237f1c1ec",
74
+ "a5e9382e8f7249adaf103f29379eb09a",
75
+ "d953e078e7674e8f82b6942e3903d28c",
76
+ "404e11ae4a0f4c5a9dbd806a71fe86df",
77
+ "a2e9fa143b3c4e418c20b909c0001ede",
78
+ "bf3bd4bd992c4f0f89232bdf1d90e82d",
79
+ "e56e4aec66e246e5a912aeb6e2c02217",
80
+ "82588d321aa746539465eb6678765576",
81
+ "fd74ada4948948e2b5edd7afabd35a2a",
82
+ "c3ce8434db4e40f3ad706c10d2a3d476",
83
+ "907a358a2c0047b980b9b6247ac6a805",
84
+ "39d23aa64be84a3aba5d1f4c50232130",
85
+ "a25da8720eae4b0796f52f492e985109",
86
+ "013b2e36a1e34997bc0fc85efe0d2d03",
87
+ "0cc3973d865d494ea4114aaf8cd0c87b",
88
+ "e22eef5cedc14cf097af701dca4862a8",
89
+ "edb32f5f8ae64662ac5f79cd4c12bc2d",
90
+ "3c9909790988447d80508923c17b0fcb",
91
+ "0fe027d4def3455d8d9f65961e79795e",
92
+ "923f7bcae8d24f6bb87d7f18ca17c943",
93
+ "d12c5cb31ee14b1ca8781fefcf1927e5",
94
+ "538005e3a226493caa1116e5ee0d4083",
95
+ "843bf0d404b0460a99f57a6b7892929a",
96
+ "725dcb166a0847b2bbd297ff134fe584",
97
+ "b781c19fa93645ddb79f988e33abb9c0",
98
+ "d7e49571d5d845eaac58d9af4dcbcba1",
99
+ "1aea37ee14dc40c497266b7a29f06d8e",
100
+ "950ef43921c04627b73414c6d0b11531"
101
+ ]
102
+ },
103
+ "id": "MEfdNU7dORsY",
104
+ "outputId": "d94970ab-6aec-432c-b240-6958711379c5"
105
+ },
106
+ "outputs": [
107
+ {
108
+ "name": "stdout",
109
+ "output_type": "stream",
110
+ "text": [
111
+ "Downloading http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz\n",
112
+ "Downloading http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz to ./data/MNIST/raw/train-images-idx3-ubyte.gz\n"
113
+ ]
114
+ },
115
+ {
116
+ "data": {
117
+ "application/vnd.jupyter.widget-view+json": {
118
+ "model_id": "1f5bffd15e004f94b4c0c160d9fd8de1",
119
+ "version_major": 2,
120
+ "version_minor": 0
121
+ },
122
+ "text/plain": [
123
+ " 0%| | 0/9912422 [00:00<?, ?it/s]"
124
+ ]
125
+ },
126
+ "metadata": {},
127
+ "output_type": "display_data"
128
+ },
129
+ {
130
+ "name": "stdout",
131
+ "output_type": "stream",
132
+ "text": [
133
+ "Extracting ./data/MNIST/raw/train-images-idx3-ubyte.gz to ./data/MNIST/raw\n",
134
+ "\n",
135
+ "Downloading http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz\n",
136
+ "Downloading http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz to ./data/MNIST/raw/train-labels-idx1-ubyte.gz\n"
137
+ ]
138
+ },
139
+ {
140
+ "data": {
141
+ "application/vnd.jupyter.widget-view+json": {
142
+ "model_id": "e2c9c5e882014f78808521df5b440fa8",
143
+ "version_major": 2,
144
+ "version_minor": 0
145
+ },
146
+ "text/plain": [
147
+ " 0%| | 0/28881 [00:00<?, ?it/s]"
148
+ ]
149
+ },
150
+ "metadata": {},
151
+ "output_type": "display_data"
152
+ },
153
+ {
154
+ "name": "stdout",
155
+ "output_type": "stream",
156
+ "text": [
157
+ "Extracting ./data/MNIST/raw/train-labels-idx1-ubyte.gz to ./data/MNIST/raw\n",
158
+ "\n",
159
+ "Downloading http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz\n",
160
+ "Downloading http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz to ./data/MNIST/raw/t10k-images-idx3-ubyte.gz\n"
161
+ ]
162
+ },
163
+ {
164
+ "data": {
165
+ "application/vnd.jupyter.widget-view+json": {
166
+ "model_id": "e56e4aec66e246e5a912aeb6e2c02217",
167
+ "version_major": 2,
168
+ "version_minor": 0
169
+ },
170
+ "text/plain": [
171
+ " 0%| | 0/1648877 [00:00<?, ?it/s]"
172
+ ]
173
+ },
174
+ "metadata": {},
175
+ "output_type": "display_data"
176
+ },
177
+ {
178
+ "name": "stdout",
179
+ "output_type": "stream",
180
+ "text": [
181
+ "Extracting ./data/MNIST/raw/t10k-images-idx3-ubyte.gz to ./data/MNIST/raw\n",
182
+ "\n",
183
+ "Downloading http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz\n",
184
+ "Downloading http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz to ./data/MNIST/raw/t10k-labels-idx1-ubyte.gz\n"
185
+ ]
186
+ },
187
+ {
188
+ "data": {
189
+ "application/vnd.jupyter.widget-view+json": {
190
+ "model_id": "3c9909790988447d80508923c17b0fcb",
191
+ "version_major": 2,
192
+ "version_minor": 0
193
+ },
194
+ "text/plain": [
195
+ " 0%| | 0/4542 [00:00<?, ?it/s]"
196
+ ]
197
+ },
198
+ "metadata": {},
199
+ "output_type": "display_data"
200
+ },
201
+ {
202
+ "name": "stdout",
203
+ "output_type": "stream",
204
+ "text": [
205
+ "Extracting ./data/MNIST/raw/t10k-labels-idx1-ubyte.gz to ./data/MNIST/raw\n",
206
+ "\n"
207
+ ]
208
+ }
209
+ ],
210
+ "source": [
211
+ "# load the data\n",
212
+ "\n",
213
+ "class ReshapeTransform:\n",
214
+ " def __init__(self, new_size):\n",
215
+ " self.new_size = new_size\n",
216
+ "\n",
217
+ " def __call__(self, img):\n",
218
+ " return torch.reshape(img, self.new_size)\n",
219
+ "\n",
220
+ "transformations = transforms.Compose([\n",
221
+ " transforms.ToTensor(),\n",
222
+ " transforms.ConvertImageDtype(torch.float32),\n",
223
+ " ReshapeTransform((-1,))\n",
224
+ " ])\n",
225
+ "\n",
226
+ "trainset = torchvision.datasets.MNIST(root='./data', train=True,\n",
227
+ " download=True, transform=transformations)\n",
228
+ "\n",
229
+ "testset = torchvision.datasets.MNIST(root='./data', train=False,\n",
230
+ " download=True, transform=transformations)"
231
+ ]
232
+ },
233
+ {
234
+ "cell_type": "code",
235
+ "execution_count": null,
236
+ "metadata": {
237
+ "colab": {
238
+ "base_uri": "https://localhost:8080/"
239
+ },
240
+ "id": "3KqEETlaOjzl",
241
+ "outputId": "3cbdaccf-48d9-442a-fe63-f71bc7026a10"
242
+ },
243
+ "outputs": [
244
+ {
245
+ "data": {
246
+ "text/plain": [
247
+ "(torch.Size([60000, 28, 28]), torch.Size([10000, 28, 28]))"
248
+ ]
249
+ },
250
+ "execution_count": 3,
251
+ "metadata": {},
252
+ "output_type": "execute_result"
253
+ }
254
+ ],
255
+ "source": [
256
+ "# check shape of data\n",
257
+ "\n",
258
+ "trainset.data.shape, testset.data.shape"
259
+ ]
260
+ },
261
+ {
262
+ "cell_type": "code",
263
+ "execution_count": null,
264
+ "metadata": {
265
+ "id": "hvv8j6j1OnAG"
266
+ },
267
+ "outputs": [],
268
+ "source": [
269
+ "# data loader\n",
270
+ "\n",
271
+ "BATCH_SIZE = 128\n",
272
+ "train_dataloader = torch.utils.data.DataLoader(trainset, \n",
273
+ " batch_size=BATCH_SIZE,\n",
274
+ " shuffle=True, \n",
275
+ " num_workers=0)\n",
276
+ "\n",
277
+ "test_dataloader = torch.utils.data.DataLoader(testset, \n",
278
+ " batch_size=BATCH_SIZE,\n",
279
+ " shuffle=False, \n",
280
+ " num_workers=0)"
281
+ ]
282
+ },
283
+ {
284
+ "cell_type": "code",
285
+ "execution_count": null,
286
+ "metadata": {
287
+ "id": "M6QNGvdzPEE1"
288
+ },
289
+ "outputs": [],
290
+ "source": [
291
+ "# model\n",
292
+ "\n",
293
+ "model = nn.Sequential(nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10))"
294
+ ]
295
+ },
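+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The input size of `784` matches the flattened `28 x 28` MNIST images. As a quick check, the sketch below counts the learnable parameters: `784 * 512 + 512` for the first layer plus `512 * 10 + 10` for the second, i.e. `407050` in total.\n",
+ "\n",
+ "```python\n",
+ "print(sum(p.numel() for p in model.parameters())) # expected: 407050\n",
+ "```"
+ ]
+ },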
296
+ {
297
+ "cell_type": "code",
298
+ "execution_count": null,
299
+ "metadata": {
300
+ "id": "Z3N2bygHPFnk"
301
+ },
302
+ "outputs": [],
303
+ "source": [
304
+ "# training preparation\n",
305
+ "\n",
306
+ "trainer = torch.optim.RMSprop(model.parameters())\n",
307
+ "loss = nn.CrossEntropyLoss()"
308
+ ]
309
+ },
310
+ {
311
+ "cell_type": "code",
312
+ "execution_count": null,
313
+ "metadata": {
314
+ "id": "QRZq3LkFPHGU"
315
+ },
316
+ "outputs": [],
317
+ "source": [
318
+ "def get_accuracy(output, target, batch_size):\n",
319
+ " # Obtain accuracy for training round\n",
320
+ " corrects = (torch.max(output, 1)[1].view(target.size()).data == target.data).sum()\n",
321
+ " accuracy = 100.0 * corrects/batch_size\n",
322
+ " return accuracy.item()"
323
+ ]
324
+ },
325
+ {
326
+ "cell_type": "code",
327
+ "execution_count": null,
328
+ "metadata": {
329
+ "colab": {
330
+ "base_uri": "https://localhost:8080/"
331
+ },
332
+ "id": "KC1AnsOYPJvc",
333
+ "outputId": "76d6ad83-766e-4f9b-c437-03e69d5c8486"
334
+ },
335
+ "outputs": [
336
+ {
337
+ "name": "stdout",
338
+ "output_type": "stream",
339
+ "text": [
340
+ "Epoch: 1 | Train loss: 1.0415 | Train Accuracy: 91.9010\n",
341
+ "Epoch: 2 | Train loss: 0.1291 | Train Accuracy: 96.0871\n",
342
+ "Epoch: 3 | Train loss: 0.0997 | Train Accuracy: 97.0399\n",
343
+ "Epoch: 4 | Train loss: 0.0865 | Train Accuracy: 97.4913\n",
344
+ "Epoch: 5 | Train loss: 0.0740 | Train Accuracy: 97.8611\n"
345
+ ]
346
+ }
347
+ ],
348
+ "source": [
349
+ "# train\n",
350
+ "\n",
351
+ "for ITER in range(5):\n",
352
+ " train_acc = 0.0\n",
353
+ " train_running_loss = 0.0\n",
354
+ "\n",
355
+ " model.train()\n",
356
+ " for i, (X, y) in enumerate(train_dataloader):\n",
357
+ " output = model(X)\n",
358
+ " l = loss(output, y)\n",
359
+ "\n",
360
+ " # update the parameters\n",
361
+ " l.backward()\n",
362
+ " trainer.step()\n",
363
+ " trainer.zero_grad()\n",
364
+ "\n",
365
+ " # gather metrics\n",
366
+ " train_acc += get_accuracy(output, y, BATCH_SIZE)\n",
367
+ " train_running_loss += l.detach().item()\n",
368
+ "\n",
369
+ " print('Epoch: %d | Train loss: %.4f | Train Accuracy: %.4f' \\\n",
370
+ " %(ITER+1, train_running_loss / (i+1),train_acc/(i+1)))"
371
+ ]
372
+ },
373
+ {
374
+ "cell_type": "markdown",
375
+ "metadata": {
376
+ "id": "22DFwdqAmJ4G"
377
+ },
378
+ "source": [
379
+ "### Other things to try\n",
380
+ "\n",
381
+ "- Evaluate on test set\n",
382
+ "- Plot loss curve\n",
383
+ "- Add more layers to the model"
384
+ ]
385
+ },
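+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As a starting point for the first item, here is a minimal evaluation sketch that reuses `get_accuracy` over the test loader. Note that the last batch may be smaller than `BATCH_SIZE`, so the averaged figure is a close approximation rather than an exact test accuracy:\n",
+ "\n",
+ "```python\n",
+ "model.eval()\n",
+ "test_acc = 0.0\n",
+ "with torch.no_grad():\n",
+ "    for i, (X, y) in enumerate(test_dataloader):\n",
+ "        output = model(X)\n",
+ "        test_acc += get_accuracy(output, y, BATCH_SIZE)\n",
+ "print('Test Accuracy: %.4f' % (test_acc / (i + 1)))\n",
+ "```"
+ ]
+ },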
386
+ {
387
+ "cell_type": "code",
388
+ "execution_count": null,
389
+ "metadata": {
390
+ "id": "jDordzFcPNmU"
391
+ },
392
+ "outputs": [],
393
+ "source": []
394
+ }
395
+ ],
396
+ "metadata": {
397
+ "colab": {
398
+ "name": "First Neural Network.ipynb",
399
+ "provenance": []
400
+ },
401
+ "kernelspec": {
402
+ "display_name": "Python 3 (ipykernel)",
403
+ "language": "python",
404
+ "name": "python3"
405
+ },
406
+ "language_info": {
407
+ "codemirror_mode": {
408
+ "name": "ipython",
409
+ "version": 3
410
+ },
411
+ "file_extension": ".py",
412
+ "mimetype": "text/x-python",
413
+ "name": "python",
414
+ "nbconvert_exporter": "python",
415
+ "pygments_lexer": "ipython3",
416
+ "version": "3.9.12"
417
+ }
418
+ },
419
+ "nbformat": 4,
420
+ "nbformat_minor": 1
421
+ }
06_Neural_Network_from_Scratch.ipynb ADDED
@@ -0,0 +1,432 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 07 Neural Network from Scratch\n",
9
+ "description: An implementation of simple neural network from scratch\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/1YBcEZMUHhJUiwOIwQbqwmAAGrRznpP_E?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "Ee4B4v5tAp1C"
24
+ },
25
+ "source": [
26
+ "# A Simple Neural Network from Scratch with PyTorch and Google Colab\n",
27
+ "\n",
28
+ "In this tutorial we implement a simple neural network from scratch using PyTorch.\n"
29
+ ]
30
+ },
31
+ {
32
+ "cell_type": "markdown",
33
+ "metadata": {
34
+ "id": "w4cEhtf_Ap1E"
35
+ },
36
+ "source": [
37
+ "## About\n",
38
+ "\n",
39
+ "In this tutorial we will implement a simple neural network from scratch using PyTorch. The idea of the tutorial is to teach you the basics of PyTorch and how it can be used to implement a neural network from scratch. I will go over some of the basic functionalities and concepts available in PyTorch that will allow you to build your own neural networks. \n",
40
+ "\n",
41
+ "This tutorial assumes you have prior knowledge of how a neural network works. Don’t worry! Even if you are not so sure, you will be okay. For advanced PyTorch users, this tutorial may still serve as a refresher. This tutorial is heavily inspired by this [Neural Network implementation](https://repl.it/talk/announcements/Build-a-Neural-Network-in-Python/5457) coded purely using Numpy. In fact, I tried re-implementing the code using PyTorch instead and added my own intuitions and explanations. Thanks to [Samay](https://repl.it/@shamdasani) for his phenomenal work, I hope this inspires many others as it did with me."
42
+ ]
43
+ },
44
+ {
45
+ "cell_type": "markdown",
46
+ "metadata": {
47
+ "id": "MP9ewMSlC7JU"
48
+ },
49
+ "source": [
50
+ "\n",
51
+ "The `torch` module provides all the necessary **tensor** operators you will need to implement your first neural network from scratch in PyTorch. That's right! In PyTorch everything is a Tensor, so this is the first thing you will need to get used to. Let's import the libraries we will need for this tutorial."
52
+ ]
53
+ },
54
+ {
55
+ "cell_type": "code",
56
+ "execution_count": null,
57
+ "metadata": {
58
+ "id": "bKmXKSQnAp1G"
59
+ },
60
+ "outputs": [],
61
+ "source": [
62
+ "import torch\n",
63
+ "import torch.nn as nn"
64
+ ]
65
+ },
66
+ {
67
+ "cell_type": "markdown",
68
+ "metadata": {
69
+ "id": "1EWBBl1nAp1M"
70
+ },
71
+ "source": [
72
+ "## Data\n",
73
+ "Let's start by creating some sample data using the `torch.tensor` command. In Numpy, this could be done with `np.array`. Both functions serve the same purpose, but in PyTorch everything is a Tensor as opposed to a vector or matrix. We define types in PyTorch using the `dtype=torch.xxx` command. \n",
74
+ "\n",
75
+ "In the data below, `X` represents the amount of hours studied and how much time students spent sleeping, whereas `y` represent grades. The variable `xPredicted` is a single input for which we want to predict a grade using the parameters learned by the neural network. Remember, the neural network wants to learn a mapping between `X` and `y`, so it will try to take a guess from what it has learned from the training data. "
76
+ ]
77
+ },
78
+ {
79
+ "cell_type": "code",
80
+ "execution_count": null,
81
+ "metadata": {
82
+ "id": "fsAVbHnjAp1P"
83
+ },
84
+ "outputs": [],
85
+ "source": [
86
+ "X = torch.tensor(([2, 9], [1, 5], [3, 6]), dtype=torch.float) # 3 X 2 tensor\n",
87
+ "y = torch.tensor(([92], [100], [89]), dtype=torch.float) # 3 X 1 tensor\n",
88
+ "xPredicted = torch.tensor(([4, 8]), dtype=torch.float) # 1 X 2 tensor"
89
+ ]
90
+ },
91
+ {
92
+ "cell_type": "markdown",
93
+ "metadata": {
94
+ "id": "RC0ru9kCAp1U"
95
+ },
96
+ "source": [
97
+ "You can check the size of the tensors we have just created with the `size` command. This is equivalent to the `shape` command used in tools such as Numpy and Tensorflow. "
98
+ ]
99
+ },
100
+ {
101
+ "cell_type": "code",
102
+ "execution_count": null,
103
+ "metadata": {
104
+ "colab": {
105
+ "base_uri": "https://localhost:8080/"
106
+ },
107
+ "id": "sfC-B1BEAp1W",
108
+ "outputId": "fbe6380d-f76b-4dee-c744-155022ce83d6"
109
+ },
110
+ "outputs": [
111
+ {
112
+ "name": "stdout",
113
+ "output_type": "stream",
114
+ "text": [
115
+ "torch.Size([3, 2])\n",
116
+ "torch.Size([3, 1])\n"
117
+ ]
118
+ }
119
+ ],
120
+ "source": [
121
+ "print(X.size())\n",
122
+ "print(y.size())"
123
+ ]
124
+ },
125
+ {
126
+ "cell_type": "markdown",
127
+ "metadata": {
128
+ "id": "zrND9MS9Ap1f"
129
+ },
130
+ "source": [
131
+ "## Scaling\n",
132
+ "\n",
133
+ "Below we are performing some scaling on the sample data. Notice that the `max` function returns both a tensor and the corresponding indices. So we use `_` to capture the indices which we won't use here because we are only interested in the max values to conduct the scaling. Perfect! Our data is now in a very nice format our neural network will appreciate later on. "
134
+ ]
135
+ },
136
+ {
137
+ "cell_type": "code",
138
+ "execution_count": null,
139
+ "metadata": {
140
+ "colab": {
141
+ "base_uri": "https://localhost:8080/"
142
+ },
143
+ "id": "hlBvtfAmAp1i",
144
+ "outputId": "db5ba4e4-cf29-4761-a058-677b16e5dcef"
145
+ },
146
+ "outputs": [
147
+ {
148
+ "name": "stdout",
149
+ "output_type": "stream",
150
+ "text": [
151
+ "tensor([0.5000, 1.0000])\n"
152
+ ]
153
+ }
154
+ ],
155
+ "source": [
156
+ "# scale units\n",
157
+ "X_max, _ = torch.max(X, 0)\n",
158
+ "xPredicted_max, _ = torch.max(xPredicted, 0)\n",
159
+ "\n",
160
+ "X = torch.div(X, X_max)\n",
161
+ "xPredicted = torch.div(xPredicted, xPredicted_max)\n",
162
+ "y = y / 100 # max test score is 100\n",
163
+ "print(xPredicted)"
164
+ ]
165
+ },
166
+ {
167
+ "cell_type": "markdown",
168
+ "metadata": {
169
+ "id": "R1kTs5S5Ap1m"
170
+ },
171
+ "source": [
172
+ "Notice that there are two functions `max` and `div` that I didn't discuss above. They do exactly what they imply: `max` finds the maximum value in a vector... I mean tensor; and `div` is basically a nice little function to divide two tensors. "
173
+ ]
174
+ },
175
+ {
176
+ "cell_type": "markdown",
177
+ "metadata": {
178
+ "id": "xRvMSpEFAp1n"
179
+ },
180
+ "source": [
181
+ "## Model (Computation Graph)\n",
182
+ "Once the data has been processed and it is in the proper format, all you need to do now is to define your model. Here is where things begin to change a little as compared to how you would build your neural networks using, say, something like Keras or Tensorflow. However, you will realize quickly as you go along that PyTorch doesn't differ much from other deep learning tools. At the end of the day we are constructing a computation graph, which is used to dictate how data should flow and what type of operations are performed on this information. \n",
183
+ "\n",
184
+ "For illustration purposes, we are building the following neural network or computation graph:\n",
185
+ "\n",
186
+ "\n",
187
+ "![alt text](https://drive.google.com/uc?export=view&id=1l-sKpcCJCEUJV1BlAqcVAvLXLpYCInV6)"
188
+ ]
189
+ },
190
+ {
191
+ "cell_type": "code",
192
+ "execution_count": null,
193
+ "metadata": {
194
+ "id": "C7pDC5SfAp1p"
195
+ },
196
+ "outputs": [],
197
+ "source": [
198
+ "class Neural_Network(nn.Module):\n",
199
+ " def __init__(self, ):\n",
200
+ " super(Neural_Network, self).__init__()\n",
201
+ " # parameters\n",
202
+ " # TODO: parameters can be parameterized instead of declaring them here\n",
203
+ " self.inputSize = 2\n",
204
+ " self.outputSize = 1\n",
205
+ " self.hiddenSize = 3\n",
206
+ " \n",
207
+ " # weights\n",
208
+ " self.W1 = torch.randn(self.inputSize, self.hiddenSize) # 3 X 2 tensor\n",
209
+ " self.W2 = torch.randn(self.hiddenSize, self.outputSize) # 3 X 1 tensor\n",
210
+ " \n",
211
+ " def forward(self, X):\n",
212
+ " self.z = torch.matmul(X, self.W1) # 3 X 3 \".dot\" does not broadcast in PyTorch\n",
213
+ " self.z2 = self.sigmoid(self.z) # activation function\n",
214
+ " self.z3 = torch.matmul(self.z2, self.W2)\n",
215
+ " o = self.sigmoid(self.z3) # final activation function\n",
216
+ " return o\n",
217
+ " \n",
218
+ " def sigmoid(self, s):\n",
219
+ " return 1 / (1 + torch.exp(-s))\n",
220
+ " \n",
221
+ " def sigmoidPrime(self, s):\n",
222
+ " # derivative of sigmoid\n",
223
+ " return s * (1 - s)\n",
224
+ " \n",
225
+ " def backward(self, X, y, o):\n",
226
+ " self.o_error = y - o # error in output\n",
227
+ " self.o_delta = self.o_error * self.sigmoidPrime(o) # derivative of sig to error\n",
228
+ " self.z2_error = torch.matmul(self.o_delta, torch.t(self.W2))\n",
229
+ " self.z2_delta = self.z2_error * self.sigmoidPrime(self.z2)\n",
230
+ " self.W1 += torch.matmul(torch.t(X), self.z2_delta)\n",
231
+ " self.W2 += torch.matmul(torch.t(self.z2), self.o_delta)\n",
232
+ " \n",
233
+ " def train(self, X, y):\n",
234
+ " # forward + backward pass for training\n",
235
+ " o = self.forward(X)\n",
236
+ " self.backward(X, y, o)\n",
237
+ " \n",
238
+ " def saveWeights(self, model):\n",
239
+ " # we will use the PyTorch internal storage functions\n",
240
+ " torch.save(model, \"NN\")\n",
241
+ " # you can reload model with all the weights and so forth with:\n",
242
+ " # torch.load(\"NN\")\n",
243
+ " \n",
244
+ " def predict(self):\n",
245
+ " print (\"Predicted data based on trained weights: \")\n",
246
+ " print (\"Input (scaled): \\n\" + str(xPredicted))\n",
247
+ " print (\"Output: \\n\" + str(self.forward(xPredicted)))\n",
248
+ " "
249
+ ]
250
+ },
251
+ {
252
+ "cell_type": "markdown",
253
+ "metadata": {
254
+ "id": "qm5gimnyAp1s"
255
+ },
256
+ "source": [
257
+ "For the purpose of this tutorial, we are not going to be talking math stuff, that's for another day. I just want you to get a gist of what it takes to build a neural network from scratch using PyTorch. Let's break down the model which was declared via the class above. \n",
258
+ "\n",
259
+ "## Class Header\n",
260
+ "First, we defined our model via a class because that is the recommended way to build the computation graph. The class header contains the name of the class `Neural Network` and the parameter `nn.Module` which basically indicates that we are defining our own neural network. \n",
261
+ "\n",
262
+ "```python\n",
263
+ "class Neural_Network(nn.Module):\n",
264
+ "```\n",
265
+ "\n",
266
+ "## Initialization\n",
267
+ "The next step is to define the initializations ( `def __init__(self,)`) that will be performed upon creating an instance of the customized neural network. You can declare the parameters of your model here, but typically, you would declare the structure of your network in this section -- the size of the hidden layers and so forth. Since we are building the neural network from scratch, we explicitly declared the size of the weights matrices: one that stores the parameters from the input to hidden layer; and one that stores the parameter from the hidden to output layer. Both weight matrices are initialized with values randomly chosen from a normal distribution via `torch.randn(...)`. Note that we are not using bias just to keep things as simple as possible. \n",
268
+ "\n",
269
+ "```python\n",
270
+ "def __init__(self, ):\n",
271
+ " super(Neural_Network, self).__init__()\n",
272
+ " # parameters\n",
273
+ " # TODO: parameters can be parameterized instead of declaring them here\n",
274
+ " self.inputSize = 2\n",
275
+ " self.outputSize = 1\n",
276
+ " self.hiddenSize = 3\n",
277
+ "\n",
278
+ " # weights\n",
279
+ " self.W1 = torch.randn(self.inputSize, self.hiddenSize) # 3 X 2 tensor\n",
280
+ " self.W2 = torch.randn(self.hiddenSize, self.outputSize) # 3 X 1 tensor\n",
281
+ "```\n",
282
+ "\n",
283
+ "## The Forward Function\n",
284
+ "The `forward` function is where all the magic happens (see below). This is where the data enters and is fed into the computation graph (i.e., the neural network structure we have built). Since we are building a simple neural network with one hidden layer, our forward function looks very simple:\n",
285
+ "\n",
286
+ "```python\n",
287
+ "def forward(self, X):\n",
288
+ " self.z = torch.matmul(X, self.W1) \n",
289
+ " self.z2 = self.sigmoid(self.z) # activation function\n",
290
+ " self.z3 = torch.matmul(self.z2, self.W2)\n",
291
+ " o = self.sigmoid(self.z3) # final activation function\n",
292
+ " return o\n",
293
+ "```\n",
294
+ "\n",
295
+ "The `forward` function above takes the input `X`and then performs a matrix multiplication (`torch.matmul(...)`) with the first weight matrix `self.W1`. Then the result is applied an activation function, `sigmoid`. The resulting matrix of the activation is then multiplied with the second weight matrix `self.W2`. Then another activation if performed, which renders the output of the neural network or computation graph. The process I described above is simply what's known as a `feedforward pass`. In order for the weights to optimize when training, we need a backpropagation algorithm. \n",
296
+ "\n",
297
+ "## The Backward Function\n",
298
+ "The `backward` function contains the backpropagation algorithm, where the goal is to essentially minimize the loss with respect to our weights. In other words, the weights need to be updated in such a way that the loss decreases while the neural network is training (well, that is what we hope for). All this magic is possible with the gradient descent algorithm which is declared in the `backward` function. Take a minute or two to inspect what is happening in the code below:\n",
299
+ "\n",
300
+ "```python\n",
301
+ "def backward(self, X, y, o):\n",
302
+ " self.o_error = y - o # error in output\n",
303
+ " self.o_delta = self.o_error * self.sigmoidPrime(o) \n",
304
+ " self.z2_error = torch.matmul(self.o_delta, torch.t(self.W2))\n",
305
+ " self.z2_delta = self.z2_error * self.sigmoidPrime(self.z2)\n",
306
+ " self.W1 += torch.matmul(torch.t(X), self.z2_delta)\n",
307
+ " self.W2 += torch.matmul(torch.t(self.z2), self.o_delta)\n",
308
+ "```\n",
309
+ "\n",
310
+ "Notice that we are performing a lot of matrix multiplications along with the transpose operations via the `torch.matmul(...)` and `torch.t(...)` operations, respectively. The rest is simply gradient descent -- there is nothing to it."
311
+ ]
312
+ },
313
+ {
314
+ "cell_type": "markdown",
315
+ "metadata": {
316
+ "id": "9t26Dr5zAp1u"
317
+ },
318
+ "source": [
319
+ "## Training\n",
320
+ "All that is left now is to train the neural network. First we create an instance of the computation graph we have just built:\n",
321
+ "\n",
322
+ "```python\n",
323
+ "NN = Neural_Network()\n",
324
+ "```\n",
325
+ "\n",
326
+ "Then we train the model for `1000` rounds. Notice that in PyTorch `NN(X)` automatically calls the `forward` function so there is no need to explicitly call `NN.forward(X)`. \n",
327
+ "\n",
328
+ "After we have obtained the predicted output for ever round of training, we compute the loss, with the following code:\n",
329
+ "\n",
330
+ "```python\n",
331
+ "torch.mean((y - NN(X))**2).detach().item()\n",
332
+ "```\n",
333
+ "\n",
334
+ "The next step is to start the training (foward + backward) via `NN.train(X, y)`. After we have trained the neural network, we can store the model and output the predicted value of the single instance we declared in the beginning, `xPredicted`. \n",
335
+ "\n",
336
+ "Let's train!"
337
+ ]
338
+ },
339
+ {
340
+ "cell_type": "code",
341
+ "execution_count": null,
342
+ "metadata": {
343
+ "colab": {
344
+ "base_uri": "https://localhost:8080/"
345
+ },
346
+ "id": "9sTddOpLAp1w",
347
+ "outputId": "c2ca53f6-8710-440f-af85-b8cf850340c7"
348
+ },
349
+ "outputs": [
350
+ {
351
+ "name": "stdout",
352
+ "output_type": "stream",
353
+ "text": [
354
+ "#0 Loss: 0.26455259323120117\n",
355
+ "#100 Loss: 0.0024994986597448587\n",
356
+ "#200 Loss: 0.002286414382979274\n",
357
+ "#300 Loss: 0.0021202608477324247\n",
358
+ "#400 Loss: 0.0019605369307100773\n",
359
+ "#500 Loss: 0.0018112537218257785\n",
360
+ "#600 Loss: 0.0016757562989369035\n",
361
+ "#700 Loss: 0.0015555238351225853\n",
362
+ "#800 Loss: 0.0014504743739962578\n",
363
+ "#900 Loss: 0.001359498011879623\n",
364
+ "Predicted data based on trained weights: \n",
365
+ "Input (scaled): \n",
366
+ "tensor([0.5000, 1.0000])\n",
367
+ "Output: \n",
368
+ "tensor([0.9522])\n",
369
+ "Finished training!\n"
370
+ ]
371
+ }
372
+ ],
373
+ "source": [
374
+ "NN = Neural_Network()\n",
375
+ "for i in range(1000): # trains the NN 1,000 times\n",
376
+ " if (i % 100) == 0:\n",
377
+ " print (\"#\" + str(i) + \" Loss: \" + str(torch.mean((y - NN(X))**2).detach().item())) # mean sum squared loss\n",
378
+ " NN.train(X, y)\n",
379
+ "NN.saveWeights(NN)\n",
380
+ "NN.predict()\n",
381
+ "\n",
382
+ "print(\"Finished training!\")"
383
+ ]
384
+ },
385
+ {
386
+ "cell_type": "markdown",
387
+ "metadata": {
388
+ "id": "L9nBzkgdbjcA"
389
+ },
390
+ "source": [
391
+ "The loss keeps decreasing, which means that the neural network is learning something. That's it. Congratulations! You have just learned how to create and train a neural network from scratch using PyTorch. There are so many things you can do with the shallow network we have just implemented. You can add more hidden layers or try to incorporate the bias terms for practice. I would love to see what you will build from here. Reach me out on [Twitter](https://twitter.com/omarsar0) if you have any further questions or leave your comments here. Until next time!"
392
+ ]
393
+ },
394
+ {
395
+ "cell_type": "markdown",
396
+ "metadata": {
397
+ "id": "zcms4BCySKXj"
398
+ },
399
+ "source": [
400
+ "## References:\n",
401
+ "- [PyTorch nn. Modules](https://pytorch.org/tutorials/beginner/pytorch_with_examples.html#pytorch-custom-nn-modules)\n",
402
+ "- [Build a Neural Network with Numpy](https://enlight.nyc/neural-network)\n"
403
+ ]
404
+ }
405
+ ],
406
+ "metadata": {
407
+ "colab": {
408
+ "collapsed_sections": [],
409
+ "name": "Neural Network from Scratch.ipynb",
410
+ "provenance": []
411
+ },
412
+ "kernelspec": {
413
+ "display_name": "Python 3 (ipykernel)",
414
+ "language": "python",
415
+ "name": "python3"
416
+ },
417
+ "language_info": {
418
+ "codemirror_mode": {
419
+ "name": "ipython",
420
+ "version": 3
421
+ },
422
+ "file_extension": ".py",
423
+ "mimetype": "text/x-python",
424
+ "name": "python",
425
+ "nbconvert_exporter": "python",
426
+ "pygments_lexer": "ipython3",
427
+ "version": "3.9.12"
428
+ }
429
+ },
430
+ "nbformat": 4,
431
+ "nbformat_minor": 1
432
+ }
07_bow.ipynb ADDED
@@ -0,0 +1,365 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 08 Bag of Words Text Classifier\n",
9
+ "description: Build a simple bag of words text classifier.\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/19suDts9MNIhx0TeGO26_BIY2Xc0n6DBC?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "OP_uXHGK0Q9d"
24
+ },
25
+ "source": [
26
+ "# Bag of Words Text Classifier\n",
27
+ "\n",
28
+ "The code below implements a simple bag of words text classifier.\n",
29
+ "- We tokenize the text, create a vocabulary and encode each piece of text in the dataset\n",
30
+ "- The lookup allows for extracting an embedding for each tokenized input\n",
31
+ "- The embedding vectors are summed, and a bias vector is added\n",
32
+ "- The resulting vector is referred to as the scores\n",
33
+ "- A softmax is applied to the scores to generate probabilities, which are used for the classification task (see the sketch after this cell)\n",
34
+ "\n",
35
+ "The code used in this notebook was inspired by code from the [official repo](https://github.com/neubig/nn4nlp-code) used in the [CMU Neural Networks for NLP class](http://www.phontron.com/class/nn4nlp2021/schedule.html) by [Graham Neubig](http://www.phontron.com/index.php). \n",
36
+ "\n",
37
+ "![img txt](https://github.com/dair-ai/ML-Notebooks/blob/main/img/bow.png?raw=true)\n"
38
+ ]
39
+ },
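The bullet list above compresses the whole scoring pipeline into a few words. As a hedged aside (not part of the original notebook; the vocabulary and class counts below are made up), here is the same pipeline on a toy example:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
embedding = nn.Embedding(10, 3)  # 10-word vocab; each row = per-class scores for one word
bias = torch.zeros(3)            # one bias per class

tokens = torch.LongTensor([4, 7, 2])          # an encoded 3-word sentence
scores = embedding(tokens).sum(dim=0) + bias  # sum the word scores, add the bias
probs = torch.softmax(scores, dim=0)          # probabilities over the 3 classes
print(probs, probs.sum())                     # the probabilities sum to 1
```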
40
+ {
41
+ "cell_type": "code",
42
+ "execution_count": null,
43
+ "metadata": {
44
+ "id": "rYJ7PiaO2R6Q"
45
+ },
46
+ "outputs": [],
47
+ "source": [
48
+ "import torch\n",
49
+ "import random\n",
50
+ "import torch.nn as nn"
51
+ ]
52
+ },
53
+ {
54
+ "cell_type": "markdown",
55
+ "metadata": {
56
+ "id": "M3eH6PyS1Ykz"
57
+ },
58
+ "source": [
59
+ "### Download the Data"
60
+ ]
61
+ },
62
+ {
63
+ "cell_type": "code",
64
+ "execution_count": null,
65
+ "metadata": {
66
+ "id": "F_lDByee1ddU"
67
+ },
68
+ "outputs": [],
69
+ "source": [
70
+ "%%capture\n",
71
+ "\n",
72
+ "# download the files\n",
73
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/dev.txt\n",
74
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/test.txt\n",
75
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/train.txt\n",
76
+ "\n",
77
+ "# create the data folders\n",
78
+ "!mkdir data data/classes\n",
79
+ "!cp dev.txt data/classes\n",
80
+ "!cp test.txt data/classes\n",
81
+ "!cp train.txt data/classes"
82
+ ]
83
+ },
84
+ {
85
+ "cell_type": "markdown",
86
+ "metadata": {
87
+ "id": "G9gihHeo0dK6"
88
+ },
89
+ "source": [
90
+ "### Read the Data"
91
+ ]
92
+ },
93
+ {
94
+ "cell_type": "code",
95
+ "execution_count": null,
96
+ "metadata": {
97
+ "id": "YOYzmcLdzD8i"
98
+ },
99
+ "outputs": [],
100
+ "source": [
101
+ "# function to read in data, process each line and split columns by \" ||| \"\n",
102
+ "def read_data(filename):\n",
103
+ " data = []\n",
104
+ " with open(filename, 'r') as f:\n",
105
+ " for line in f:\n",
106
+ " line = line.lower().strip()\n",
107
+ " line = line.split(' ||| ')\n",
108
+ " data.append(line)\n",
109
+ " return data\n",
110
+ "\n",
111
+ "train_data = read_data('data/classes/train.txt')\n",
112
+ "test_data = read_data('data/classes/test.txt')"
113
+ ]
114
+ },
115
+ {
116
+ "cell_type": "markdown",
117
+ "metadata": {
118
+ "id": "WEIAf06u2kZz"
119
+ },
120
+ "source": [
121
+ "### Construct the Vocab and Datasets"
122
+ ]
123
+ },
124
+ {
125
+ "cell_type": "code",
126
+ "execution_count": null,
127
+ "metadata": {
128
+ "id": "9MJHDqjT2qDu"
129
+ },
130
+ "outputs": [],
131
+ "source": [
132
+ "# creating the word and tag indices\n",
133
+ "word_to_index = {}\n",
134
+ "word_to_index[\"<unk>\"] = len(word_to_index) # adds <UNK> to dictionary\n",
135
+ "tag_to_index = {}\n",
136
+ "\n",
137
+ "# create word to index dictionary and tag to index dictionary from data\n",
138
+ "def create_dict(data, check_unk=False):\n",
139
+ " for line in data:\n",
140
+ " for word in line[1].split(\" \"):\n",
141
+ " if check_unk == False:\n",
142
+ " if word not in word_to_index:\n",
143
+ " word_to_index[word] = len(word_to_index)\n",
144
+ " else:\n",
145
+ " if word not in word_to_index:\n",
146
+ " word_to_index[word] = word_to_index[\"<unk>\"]\n",
147
+ "\n",
148
+ " if line[0] not in tag_to_index:\n",
149
+ " tag_to_index[line[0]] = len(tag_to_index)\n",
150
+ "\n",
151
+ "create_dict(train_data)\n",
152
+ "create_dict(test_data, check_unk=True)\n",
153
+ "\n",
154
+ "# create word and tag tensors from data\n",
155
+ "def create_tensor(data):\n",
156
+ " for line in data:\n",
157
+ " yield([word_to_index[word] for word in line[1].split(\" \")], tag_to_index[line[0]])\n",
158
+ "\n",
159
+ "train_data = list(create_tensor(train_data))\n",
160
+ "test_data = list(create_tensor(test_data))\n",
161
+ "\n",
162
+ "number_of_words = len(word_to_index)\n",
163
+ "number_of_tags = len(tag_to_index)"
164
+ ]
165
+ },
166
+ {
167
+ "cell_type": "markdown",
168
+ "metadata": {
169
+ "id": "n-4FU9Ab2McP"
170
+ },
171
+ "source": [
172
+ "### Model"
173
+ ]
174
+ },
175
+ {
176
+ "cell_type": "code",
177
+ "execution_count": null,
178
+ "metadata": {
179
+ "id": "Zt76PIzP0jWn"
180
+ },
181
+ "outputs": [],
182
+ "source": [
183
+ "# cpu or gpu\n",
184
+ "device = \"cuda\" if torch.cuda.is_available() else \"cpu\"\n",
185
+ "\n",
186
+ "# create a simple neural network with embedding layer, bias, and xavier initialization\n",
187
+ "class BoW(torch.nn.Module):\n",
188
+ " def __init__(self, nwords, ntags):\n",
189
+ " super(BoW, self).__init__()\n",
190
+ " self.embedding = nn.Embedding(nwords, ntags)\n",
191
+ " nn.init.xavier_uniform_(self.embedding.weight)\n",
192
+ "\n",
193
+ " type = torch.cuda.FloatTensor if torch.cuda.is_available() else torch.FloatTensor\n",
194
+ " self.bias = torch.zeros(ntags, requires_grad=True).type(type)\n",
195
+ "\n",
196
+ " def forward(self, x):\n",
197
+ " emb = self.embedding(x) # seq_len x ntags (for each seq) \n",
198
+ " out = torch.sum(emb, dim=0) + self.bias # ntags\n",
199
+ " out = out.view(1, -1) # reshape to (1, ntags)\n",
200
+ " return out"
201
+ ]
202
+ },
203
+ {
204
+ "cell_type": "markdown",
205
+ "metadata": {
206
+ "id": "Mi4FNOy02Z1t"
207
+ },
208
+ "source": [
209
+ "### Pretest the Model"
210
+ ]
211
+ },
212
+ {
213
+ "cell_type": "code",
214
+ "execution_count": null,
215
+ "metadata": {
216
+ "colab": {
217
+ "base_uri": "https://localhost:8080/"
218
+ },
219
+ "id": "pn_LCZJv2Osz",
220
+ "outputId": "2c83bb22-a7e8-40af-cb1b-c04f3de6bd38"
221
+ },
222
+ "outputs": [
223
+ {
224
+ "data": {
225
+ "text/plain": [
226
+ "tensor([[-0.0108, -0.0067, -0.0260, -0.0255, 0.0119]], device='cuda:0',\n",
227
+ " grad_fn=<ViewBackward0>)"
228
+ ]
229
+ },
230
+ "execution_count": 6,
231
+ "metadata": {},
232
+ "output_type": "execute_result"
233
+ }
234
+ ],
235
+ "source": [
236
+ "# function to convert sentence into tensor using word_to_index dictionary\n",
237
+ "def sentence_to_tensor(sentence):\n",
238
+ " return torch.LongTensor([word_to_index[word] for word in sentence.split(\" \")])\n",
239
+ "\n",
240
+ "# test the sentence_to_tensor function\n",
241
+ "type = torch.cuda.LongTensor if torch.cuda.is_available() else torch.LongTensor\n",
242
+ "out = sentence_to_tensor(\"i love dogs\").type(type)\n",
243
+ "test_model = BoW(number_of_words, number_of_tags).to(device)\n",
244
+ "test_model(out)"
245
+ ]
246
+ },
247
+ {
248
+ "cell_type": "markdown",
249
+ "metadata": {
250
+ "id": "SH5r2Xzs21zB"
251
+ },
252
+ "source": [
253
+ "### Train the Model"
254
+ ]
255
+ },
256
+ {
257
+ "cell_type": "code",
258
+ "execution_count": null,
259
+ "metadata": {
260
+ "colab": {
261
+ "base_uri": "https://localhost:8080/"
262
+ },
263
+ "id": "f86xjDAi2bt8",
264
+ "outputId": "c329b5b2-6d09-405c-bca9-6066e3415c18"
265
+ },
266
+ "outputs": [
267
+ {
268
+ "name": "stdout",
269
+ "output_type": "stream",
270
+ "text": [
271
+ "ITER: 1 | train loss/sent: 1.4746 | train accuracy: 0.3661 | test accuracy: 0.3977\n",
272
+ "ITER: 2 | train loss/sent: 1.1221 | train accuracy: 0.6023 | test accuracy: 0.4149\n",
273
+ "ITER: 3 | train loss/sent: 0.9114 | train accuracy: 0.7124 | test accuracy: 0.4072\n",
274
+ "ITER: 4 | train loss/sent: 0.7681 | train accuracy: 0.7684 | test accuracy: 0.4063\n",
275
+ "ITER: 5 | train loss/sent: 0.6629 | train accuracy: 0.8069 | test accuracy: 0.4081\n",
276
+ "ITER: 6 | train loss/sent: 0.5802 | train accuracy: 0.8331 | test accuracy: 0.4023\n",
277
+ "ITER: 7 | train loss/sent: 0.5167 | train accuracy: 0.8549 | test accuracy: 0.4100\n",
278
+ "ITER: 8 | train loss/sent: 0.4632 | train accuracy: 0.8683 | test accuracy: 0.4072\n",
279
+ "ITER: 9 | train loss/sent: 0.4187 | train accuracy: 0.8838 | test accuracy: 0.3986\n",
280
+ "ITER: 10 | train loss/sent: 0.3802 | train accuracy: 0.8954 | test accuracy: 0.3973\n"
281
+ ]
282
+ }
283
+ ],
284
+ "source": [
285
+ "# train and test the BoW model\n",
286
+ "model = BoW(number_of_words, number_of_tags).to(device)\n",
287
+ "criterion = nn.CrossEntropyLoss()\n",
288
+ "optimizer = torch.optim.Adam(model.parameters())\n",
289
+ "type = torch.LongTensor\n",
290
+ "\n",
291
+ "if torch.cuda.is_available():\n",
292
+ " model.to(device)\n",
293
+ " type = torch.cuda.LongTensor\n",
294
+ "\n",
295
+ "# perform training of the BoW model\n",
296
+ "def train_bow(model, optimizer, criterion, train_data):\n",
297
+ " for ITER in range(10):\n",
298
+ " # perform training\n",
299
+ " model.train()\n",
300
+ " random.shuffle(train_data)\n",
301
+ " total_loss = 0.0\n",
302
+ " train_correct = 0\n",
303
+ " for sentence, tag in train_data:\n",
304
+ " sentence = torch.tensor(sentence).type(type)\n",
305
+ " tag = torch.tensor([tag]).type(type)\n",
306
+ " output = model(sentence)\n",
307
+ " predicted = torch.argmax(output.data.detach()).item()\n",
308
+ " \n",
309
+ " loss = criterion(output, tag)\n",
310
+ " total_loss += loss.item()\n",
311
+ "\n",
312
+ " optimizer.zero_grad()\n",
313
+ " loss.backward()\n",
314
+ " optimizer.step()\n",
315
+ "\n",
316
+ " if predicted == tag: train_correct+=1\n",
317
+ "\n",
318
+ " # perform testing of the model\n",
319
+ " model.eval()\n",
320
+ " test_correct = 0\n",
321
+ " for sentence, tag in test_data:\n",
322
+ " sentence = torch.tensor(sentence).type(type)\n",
323
+ " output = model(sentence)\n",
324
+ " predicted = torch.argmax(output.data.detach()).item()\n",
325
+ " if predicted == tag: test_correct += 1\n",
326
+ " \n",
327
+ " # print model performance results\n",
328
+ " log = f'ITER: {ITER+1} | ' \\\n",
329
+ " f'train loss/sent: {total_loss/len(train_data):.4f} | ' \\\n",
330
+ " f'train accuracy: {train_correct/len(train_data):.4f} | ' \\\n",
331
+ " f'test accuracy: {test_correct/len(test_data):.4f}'\n",
332
+ " print(log)\n",
333
+ "\n",
334
+ "# call the train_bow function\n",
335
+ "train_bow(model, optimizer, criterion, train_data)"
336
+ ]
337
+ }
338
+ ],
339
+ "metadata": {
340
+ "accelerator": "GPU",
341
+ "colab": {
342
+ "name": "bow.ipynb",
343
+ "provenance": []
344
+ },
345
+ "kernelspec": {
346
+ "display_name": "Python 3 (ipykernel)",
347
+ "language": "python",
348
+ "name": "python3"
349
+ },
350
+ "language_info": {
351
+ "codemirror_mode": {
352
+ "name": "ipython",
353
+ "version": 3
354
+ },
355
+ "file_extension": ".py",
356
+ "mimetype": "text/x-python",
357
+ "name": "python",
358
+ "nbconvert_exporter": "python",
359
+ "pygments_lexer": "ipython3",
360
+ "version": "3.9.12"
361
+ }
362
+ },
363
+ "nbformat": 4,
364
+ "nbformat_minor": 1
365
+ }
08_cbow.ipynb ADDED
@@ -0,0 +1,286 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 09 Continuous Bag of Words (CBOW) Text Classifier\n",
9
+ "description: Build a continuous bag of words text classifier.\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/1lqS67-mbCspIKzx6y9wn7CuP96utWzP2?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>\n"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "FXO-zuq0o5cU"
24
+ },
25
+ "source": [
26
+ "# Continuous Bag of Words (CBOW) Text Classifier\n",
27
+ "\n",
28
+ "The code below implements a continuous bag of words text classifier.\n",
29
+ "- We tokenize the text, create a vocabulary and encode each piece of text in the dataset\n",
30
+ "- The lookup allows for extracting an embedding for each tokenized input\n",
31
+ "- The embedding vectors are added together\n",
32
+ "- The resulting vector is multiplied by a weight matrix and a bias vector is added; this yields the scores\n",
33
+ "- A softmax is applied to the scores to generate probabilities, which are used for the final classification (see the sketch after this cell)\n",
34
+ "\n",
35
+ "The code used in this notebook was inspired by code from the [official repo](https://github.com/neubig/nn4nlp-code) used in the [CMU Neural Networks for NLP class](http://www.phontron.com/class/nn4nlp2021/schedule.html) by [Graham Neubig](http://www.phontron.com/index.php). \n",
36
+ "\n",
37
+ "![img txt](https://github.com/dair-ai/ML-Notebooks/blob/main/img/cbow.png?raw=true)"
38
+ ]
39
+ },
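The key difference from the plain BoW classifier is that the summed vector lives in an embedding space whose size is independent of the number of classes, and a linear layer maps it to the scores. A hedged toy sketch follows (the sizes below are made-up assumptions, not the notebook's):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
embedding = nn.Embedding(10, 8)  # 10-word vocab, 8-dim embeddings
linear = nn.Linear(8, 3)         # weight matrix plus bias vector -> 3 class scores

tokens = torch.LongTensor([4, 7, 2])   # an encoded 3-word sentence
summed = embedding(tokens).sum(dim=0)  # continuous sentence representation
scores = linear(summed)                # W @ summed + b
print(torch.softmax(scores, dim=0))    # probabilities over the 3 classes
```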
40
+ {
41
+ "cell_type": "code",
42
+ "execution_count": null,
43
+ "metadata": {
44
+ "id": "ORP_xNj9o3md"
45
+ },
46
+ "outputs": [],
47
+ "source": [
48
+ "import torch\n",
49
+ "import random\n",
50
+ "import torch.nn as nn"
51
+ ]
52
+ },
53
+ {
54
+ "cell_type": "code",
55
+ "execution_count": null,
56
+ "metadata": {
57
+ "id": "NO7P5X0tqr-N"
58
+ },
59
+ "outputs": [],
60
+ "source": [
61
+ "%%capture\n",
62
+ "\n",
63
+ "# download the files\n",
64
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/dev.txt\n",
65
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/test.txt\n",
66
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/train.txt\n",
67
+ "\n",
68
+ "# create the data folders\n",
69
+ "!mkdir data data/classes\n",
70
+ "!cp dev.txt data/classes\n",
71
+ "!cp test.txt data/classes\n",
72
+ "!cp train.txt data/classes"
73
+ ]
74
+ },
75
+ {
76
+ "cell_type": "markdown",
77
+ "metadata": {
78
+ "id": "Wa83HOUIq5EP"
79
+ },
80
+ "source": [
81
+ "## Read and Process Data"
82
+ ]
83
+ },
84
+ {
85
+ "cell_type": "code",
86
+ "execution_count": null,
87
+ "metadata": {
88
+ "id": "D31E3u_UqwTc"
89
+ },
90
+ "outputs": [],
91
+ "source": [
92
+ "# function to read in data, process each line and split columns by \" ||| \"\n",
93
+ "def read_data(filename):\n",
94
+ " data = []\n",
95
+ " with open(filename, 'r') as f:\n",
96
+ " for line in f:\n",
97
+ " line = line.lower().strip()\n",
98
+ " line = line.split(' ||| ')\n",
99
+ " data.append(line)\n",
100
+ " return data\n",
101
+ "\n",
102
+ "train_data = read_data('data/classes/train.txt')\n",
103
+ "test_data = read_data('data/classes/test.txt')\n",
104
+ "\n",
105
+ "# creating the word and tag indices\n",
106
+ "word_to_index = {}\n",
107
+ "word_to_index[\"<unk>\"] = len(word_to_index) # add <UNK> to dictionary\n",
108
+ "tag_to_index = {}\n",
109
+ "\n",
110
+ "# create word to index dictionary and tag to index dictionary from data\n",
111
+ "def create_dict(data, check_unk=False):\n",
112
+ " for line in data:\n",
113
+ " for word in line[1].split(\" \"):\n",
114
+ " if check_unk == False:\n",
115
+ " if word not in word_to_index:\n",
116
+ " word_to_index[word] = len(word_to_index)\n",
117
+ " else:\n",
118
+ " if word not in word_to_index:\n",
119
+ " word_to_index[word] = word_to_index[\"<unk>\"]\n",
120
+ "\n",
121
+ " if line[0] not in tag_to_index:\n",
122
+ " tag_to_index[line[0]] = len(tag_to_index)\n",
123
+ "\n",
124
+ "create_dict(train_data)\n",
125
+ "create_dict(test_data, check_unk=True)\n",
126
+ "\n",
127
+ "# create word and tag tensors from data\n",
128
+ "def create_tensor(data):\n",
129
+ " for line in data:\n",
130
+ " yield([word_to_index[word] for word in line[1].split(\" \")], tag_to_index[line[0]])\n",
131
+ "\n",
132
+ "train_data = list(create_tensor(train_data))\n",
133
+ "test_data = list(create_tensor(test_data))\n",
134
+ "\n",
135
+ "number_of_words = len(word_to_index)\n",
136
+ "number_of_tags = len(tag_to_index)"
137
+ ]
138
+ },
139
+ {
140
+ "cell_type": "markdown",
141
+ "metadata": {
142
+ "id": "cNsjv5misKIi"
143
+ },
144
+ "source": [
145
+ "## Model"
146
+ ]
147
+ },
148
+ {
149
+ "cell_type": "code",
150
+ "execution_count": null,
151
+ "metadata": {
152
+ "id": "7JPQ9OUZrC6z"
153
+ },
154
+ "outputs": [],
155
+ "source": [
156
+ "# cpu or gpu\n",
157
+ "device = \"cuda\" if torch.cuda.is_available() else \"cpu\"\n",
158
+ "\n",
159
+ "# create a simple neural network with embedding layer, bias, and xavier initialization\n",
160
+ "class CBoW(torch.nn.Module):\n",
161
+ " def __init__(self, nwords, ntags, emb_size):\n",
162
+ " super(CBoW, self).__init__()\n",
163
+ "\n",
164
+ " # layers\n",
165
+ " self.embedding = torch.nn.Embedding(nwords, emb_size)\n",
166
+ " self.linear = torch.nn.Linear(emb_size, ntags)\n",
167
+ "\n",
168
+ " # use xavier initialization for weights\n",
169
+ " nn.init.xavier_uniform_(self.embedding.weight)\n",
170
+ " nn.init.xavier_uniform_(self.linear.weight)\n",
171
+ "\n",
172
+ " def forward(self, x):\n",
173
+ " emb = self.embedding(x) # seq x emb_size\n",
174
+ " out = torch.sum(emb, dim=0) # emb_size\n",
175
+ " out = out.view(1, -1) # reshape to (1, emb_size)\n",
176
+ " out = self.linear(out) # 1 x ntags\n",
177
+ " return out\n",
178
+ "\n",
179
+ "EMB_SIZE = 64\n",
180
+ "model = CBoW(number_of_words, number_of_tags, EMB_SIZE)\n",
181
+ "criterion = torch.nn.CrossEntropyLoss()\n",
182
+ "optimizer = torch.optim.Adam(model.parameters())\n",
183
+ "type = torch.LongTensor\n",
184
+ "\n",
185
+ "if torch.cuda.is_available():\n",
186
+ " model.to(device)\n",
187
+ " type = torch.cuda.LongTensor"
188
+ ]
189
+ },
190
+ {
191
+ "cell_type": "code",
192
+ "execution_count": null,
193
+ "metadata": {
194
+ "colab": {
195
+ "base_uri": "https://localhost:8080/"
196
+ },
197
+ "id": "aycOgcVssgZI",
198
+ "outputId": "efe7bc92-5699-43d4-b382-54fd24d06134"
199
+ },
200
+ "outputs": [
201
+ {
202
+ "name": "stdout",
203
+ "output_type": "stream",
204
+ "text": [
205
+ "epoch: 1 | train loss/sent: 1.4089 | train accuracy: 0.3826 | test accuracy: 0.4149\n",
206
+ "epoch: 2 | train loss/sent: 0.9089 | train accuracy: 0.6358 | test accuracy: 0.4104\n",
207
+ "epoch: 3 | train loss/sent: 0.5298 | train accuracy: 0.8076 | test accuracy: 0.3837\n",
208
+ "epoch: 4 | train loss/sent: 0.3289 | train accuracy: 0.8864 | test accuracy: 0.3670\n",
209
+ "epoch: 5 | train loss/sent: 0.2179 | train accuracy: 0.9254 | test accuracy: 0.3851\n",
210
+ "epoch: 6 | train loss/sent: 0.1529 | train accuracy: 0.9467 | test accuracy: 0.3774\n",
211
+ "epoch: 7 | train loss/sent: 0.1131 | train accuracy: 0.9594 | test accuracy: 0.3774\n",
212
+ "epoch: 8 | train loss/sent: 0.0835 | train accuracy: 0.9719 | test accuracy: 0.3643\n",
213
+ "epoch: 9 | train loss/sent: 0.0594 | train accuracy: 0.9795 | test accuracy: 0.3566\n",
214
+ "epoch: 10 | train loss/sent: 0.0477 | train accuracy: 0.9837 | test accuracy: 0.3706\n"
215
+ ]
216
+ }
217
+ ],
218
+ "source": [
219
+ "# perform training of the CBoW model\n",
220
+ "\n",
221
+ "for epoch in range(10):\n",
222
+ " # perform training\n",
223
+ " model.train()\n",
224
+ " random.shuffle(train_data)\n",
225
+ " total_loss = 0.0\n",
226
+ " train_correct = 0\n",
227
+ " for sentence, tag in train_data:\n",
228
+ " sentence = torch.tensor(sentence).type(type)\n",
229
+ " tag = torch.tensor([tag]).type(type)\n",
230
+ " output = model(sentence)\n",
231
+ " predicted = torch.argmax(output.data.detach()).item()\n",
232
+ " \n",
233
+ " loss = criterion(output, tag)\n",
234
+ " total_loss += loss.item()\n",
235
+ "\n",
236
+ " optimizer.zero_grad()\n",
237
+ " loss.backward()\n",
238
+ " optimizer.step()\n",
239
+ "\n",
240
+ " if predicted == tag: train_correct+=1\n",
241
+ "\n",
242
+ " # perform testing of the model\n",
243
+ " model.eval()\n",
244
+ " test_correct = 0\n",
245
+ " for sentence, tag in test_data:\n",
246
+ " sentence = torch.tensor(sentence).type(type)\n",
247
+ " output = model(sentence)\n",
248
+ " predicted = torch.argmax(output.data.detach()).item()\n",
249
+ " if predicted == tag: test_correct += 1\n",
250
+ " \n",
251
+ " # print model performance results\n",
252
+ " log = f'epoch: {epoch+1} | ' \\\n",
253
+ " f'train loss/sent: {total_loss/len(train_data):.4f} | ' \\\n",
254
+ " f'train accuracy: {train_correct/len(train_data):.4f} | ' \\\n",
255
+ " f'test accuracy: {test_correct/len(test_data):.4f}'\n",
256
+ " print(log)"
257
+ ]
258
+ }
259
+ ],
260
+ "metadata": {
261
+ "accelerator": "GPU",
262
+ "colab": {
263
+ "name": "cbow.ipynb",
264
+ "provenance": []
265
+ },
266
+ "kernelspec": {
267
+ "display_name": "Python 3 (ipykernel)",
268
+ "language": "python",
269
+ "name": "python3"
270
+ },
271
+ "language_info": {
272
+ "codemirror_mode": {
273
+ "name": "ipython",
274
+ "version": 3
275
+ },
276
+ "file_extension": ".py",
277
+ "mimetype": "text/x-python",
278
+ "name": "python",
279
+ "nbconvert_exporter": "python",
280
+ "pygments_lexer": "ipython3",
281
+ "version": "3.9.12"
282
+ }
283
+ },
284
+ "nbformat": 4,
285
+ "nbformat_minor": 1
286
+ }
09_deep_cbow.ipynb ADDED
@@ -0,0 +1,285 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 10 Deep Continuous Bag of Words (Deep CBOW) Text Classifier\n",
9
+ "description: Build a deep continuous bag of words text classifier\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/18yz-qvMQYIYZt1BLihSJrKQZXh8zjH8x?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "B8m-hOTiIQdz"
24
+ },
25
+ "source": [
26
+ "# Deep Continuous Bag of Words (Deep CBOW) Text Classifier\n",
27
+ "\n",
28
+ "The code below implements a deep continuous bag of words text classifier.\n",
29
+ "- We tokenize the text, create a vocabulary and encode each piece of text in the dataset\n",
30
+ "- We create embeddings for inputs and sum them together\n",
31
+ "- The resulting vector is fed through a stack of hidden layers, which produces a new vector that is multiplied by a weight matrix\n",
32
+ "- We then add the bias and obtain scores\n",
33
+ "- A softmax is applied to the scores to generate probabilities, which are used for the final classification (see the sketch after this cell)\n",
34
+ "\n",
35
+ "The code used in this notebook was inspired by code from the [official repo](https://github.com/neubig/nn4nlp-code) used in the [CMU Neural Networks for NLP class](http://www.phontron.com/class/nn4nlp2021/schedule.html) by [Graham Neubig](http://www.phontron.com/index.php). \n",
36
+ "\n",
37
+ "![img txt](https://github.com/dair-ai/ML-Notebooks/blob/main/img/deep_cbow.png?raw=true)"
38
+ ]
39
+ },
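What the "deep" variant adds is a stack of nonlinear hidden layers between the summed embedding and the output scores. A hedged toy sketch of that stacking pattern (all sizes are illustrative assumptions, not the notebook's):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
embedding = nn.Embedding(10, 8)                                # toy 10-word vocab
hidden = nn.ModuleList([nn.Linear(8, 16), nn.Linear(16, 16)])  # two hidden layers
output = nn.Linear(16, 3)                                      # maps to 3 class scores

tokens = torch.LongTensor([4, 7, 2])
h = embedding(tokens).sum(dim=0)   # summed sentence vector
for layer in hidden:               # pass it through each hidden layer with tanh
    h = torch.tanh(layer(h))
print(torch.softmax(output(h), dim=0))
```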
40
+ {
41
+ "cell_type": "code",
42
+ "execution_count": null,
43
+ "metadata": {
44
+ "id": "nfqATQzlIJ-k"
45
+ },
46
+ "outputs": [],
47
+ "source": [
48
+ "import torch\n",
49
+ "import random\n",
50
+ "import torch.nn as nn"
51
+ ]
52
+ },
53
+ {
54
+ "cell_type": "code",
55
+ "execution_count": null,
56
+ "metadata": {
57
+ "id": "-4qY1e2XNiri"
58
+ },
59
+ "outputs": [],
60
+ "source": [
61
+ "%%capture\n",
62
+ "\n",
63
+ "# download the files\n",
64
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/dev.txt\n",
65
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/test.txt\n",
66
+ "!wget https://raw.githubusercontent.com/neubig/nn4nlp-code/master/data/classes/train.txt\n",
67
+ "\n",
68
+ "# create the data folders\n",
69
+ "!mkdir data data/classes\n",
70
+ "!cp dev.txt data/classes\n",
71
+ "!cp test.txt data/classes\n",
72
+ "!cp train.txt data/classes"
73
+ ]
74
+ },
75
+ {
76
+ "cell_type": "markdown",
77
+ "metadata": {
78
+ "id": "6Vh6stZfNt7F"
79
+ },
80
+ "source": [
81
+ "## Read and Process the Data"
82
+ ]
83
+ },
84
+ {
85
+ "cell_type": "code",
86
+ "execution_count": null,
87
+ "metadata": {
88
+ "id": "ZjrwnvlyNsG2"
89
+ },
90
+ "outputs": [],
91
+ "source": [
92
+ "# function to read in data, process each line and split columns by \" ||| \"\n",
93
+ "def read_data(filename):\n",
94
+ " data = []\n",
95
+ " with open(filename, 'r') as f:\n",
96
+ " for line in f:\n",
97
+ " line = line.lower().strip()\n",
98
+ " line = line.split(' ||| ')\n",
99
+ " data.append(line)\n",
100
+ " return data\n",
101
+ "\n",
102
+ "train_data = read_data('data/classes/train.txt')\n",
103
+ "test_data = read_data('data/classes/test.txt')\n",
104
+ "\n",
105
+ "# creating the word and tag indices\n",
106
+ "word_to_index = {}\n",
107
+ "word_to_index[\"<unk>\"] = len(word_to_index) # add <UNK> to dictionary\n",
108
+ "tag_to_index = {}\n",
109
+ "\n",
110
+ "# create word to index dictionary and tag to index dictionary from data\n",
111
+ "def create_dict(data, check_unk=False):\n",
112
+ " for line in data:\n",
113
+ " for word in line[1].split(\" \"):\n",
114
+ " if check_unk == False:\n",
115
+ " if word not in word_to_index:\n",
116
+ " word_to_index[word] = len(word_to_index)\n",
117
+ " else:\n",
118
+ " if word not in word_to_index:\n",
119
+ " word_to_index[word] = word_to_index[\"<unk>\"]\n",
120
+ "\n",
121
+ " if line[0] not in tag_to_index:\n",
122
+ " tag_to_index[line[0]] = len(tag_to_index)\n",
123
+ "\n",
124
+ "create_dict(train_data)\n",
125
+ "create_dict(test_data, check_unk=True)\n",
126
+ "\n",
127
+ "# create word and tag tensors from data\n",
128
+ "def create_tensor(data):\n",
129
+ " for line in data:\n",
130
+ " yield([word_to_index[word] for word in line[1].split(\" \")], tag_to_index[line[0]])\n",
131
+ "\n",
132
+ "train_data = list(create_tensor(train_data))\n",
133
+ "test_data = list(create_tensor(test_data))\n",
134
+ "\n",
135
+ "number_of_words = len(word_to_index)\n",
136
+ "number_of_tags = len(tag_to_index)"
137
+ ]
138
+ },
139
+ {
140
+ "cell_type": "markdown",
141
+ "metadata": {
142
+ "id": "sSoomtjuN4HD"
143
+ },
144
+ "source": [
145
+ "## Model"
146
+ ]
147
+ },
148
+ {
149
+ "cell_type": "code",
150
+ "execution_count": null,
151
+ "metadata": {
152
+ "id": "j_-GavImNz6n"
153
+ },
154
+ "outputs": [],
155
+ "source": [
156
+ "device = \"cuda\" if torch.cuda.is_available() else \"cpu\"\n",
157
+ "\n",
158
+ "# create a simple neural network with embedding layer, bias, and xavier initialization\n",
159
+ "class DeepCBoW(nn.Module):\n",
160
+ " def __init__(self, nwords, ntags, hidden_size, num_layers, emb_size):\n",
161
+ " super(DeepCBoW, self).__init__()\n",
162
+ "\n",
163
+ " self.num_layers = num_layers\n",
164
+ "\n",
165
+ " # layers\n",
166
+ " self.embedding = nn.Embedding(nwords, emb_size)\n",
167
+ " self.linears = nn.ModuleList([nn.Linear(emb_size if i ==0 else hidden_size, hidden_size) \\\n",
168
+ " for i in range(num_layers)])\n",
169
+ "\n",
170
+ " # use xavier initialization for weights\n",
171
+ " nn.init.xavier_uniform_(self.embedding.weight)\n",
172
+ " for i in range(self.num_layers):\n",
173
+ " nn.init.xavier_uniform_(self.linears[i].weight)\n",
174
+ "\n",
175
+ " # output layer\n",
176
+ " self.output_layer = nn.Linear(hidden_size, ntags)\n",
177
+ "\n",
178
+ " def forward(self, x):\n",
179
+ " emb = self.embedding(x) # seq x emb_size\n",
180
+ " emb_sum = torch.sum(emb, dim=0) # emb_size\n",
181
+ " h = emb_sum.view(1, -1) # reshape to (1, emb_size)\n",
182
+ " for i in range(self.num_layers):\n",
183
+ " h = torch.tanh(self.linears[i](h))\n",
184
+ " out = self.output_layer(h) # 1 x ntags\n",
185
+ " return out\n",
186
+ "\n",
187
+ "HIDDEN_SIZE = 64\n",
188
+ "NUM_LAYERS = 2 # hidden layers\n",
189
+ "EMB_SIZE = 64\n",
190
+ "model = DeepCBoW(number_of_words, number_of_tags, HIDDEN_SIZE, NUM_LAYERS, EMB_SIZE).to(device)\n",
191
+ "criterion = nn.CrossEntropyLoss()\n",
192
+ "optimizer = torch.optim.Adam(model.parameters())\n",
193
+ "type = torch.LongTensor\n",
194
+ "\n",
195
+ "if torch.cuda.is_available():\n",
196
+ " model.to(device)\n",
197
+ " type = torch.cuda.LongTensor"
198
+ ]
199
+ },
200
+ {
201
+ "cell_type": "markdown",
202
+ "metadata": {
203
+ "id": "tMqill6ZOLPu"
204
+ },
205
+ "source": [
206
+ "## Model Training"
207
+ ]
208
+ },
209
+ {
210
+ "cell_type": "code",
211
+ "execution_count": null,
212
+ "metadata": {
213
+ "id": "BkY11eyXOIOY"
214
+ },
215
+ "outputs": [],
216
+ "source": [
217
+ "# perform training of the deep CBoW model\n",
218
+ "\n",
219
+ "for epoch in range(10):\n",
220
+ " # perform training\n",
221
+ " model.train()\n",
222
+ " random.shuffle(train_data)\n",
223
+ " total_loss = 0.0\n",
224
+ " train_correct = 0\n",
225
+ " for sentence, tag in train_data:\n",
226
+ " sentence = torch.tensor(sentence).type(type)\n",
227
+ " tag = torch.tensor([tag]).type(type)\n",
228
+ " output = model(sentence)\n",
229
+ " predicted = torch.argmax(output.data.detach()).item()\n",
230
+ " \n",
231
+ " loss = criterion(output, tag)\n",
232
+ " total_loss += loss.item()\n",
233
+ "\n",
234
+ " optimizer.zero_grad()\n",
235
+ " loss.backward()\n",
236
+ " optimizer.step()\n",
237
+ "\n",
238
+ " if predicted == tag: train_correct+=1\n",
239
+ "\n",
240
+ " # perform testing of the model\n",
241
+ " model.eval()\n",
242
+ " test_correct = 0\n",
243
+ " for sentence, tag in test_data:\n",
244
+ " sentence = torch.tensor(sentence).type(type)\n",
245
+ " output = model(sentence)\n",
246
+ " predicted = torch.argmax(output.data.detach()).item()\n",
247
+ " if predicted == tag: test_correct += 1\n",
248
+ " \n",
249
+ " # print model performance results\n",
250
+ " log = f'epoch: {epoch+1} | ' \\\n",
251
+ " f'train loss/sent: {total_loss/len(train_data):.4f} | ' \\\n",
252
+ " f'train accuracy: {train_correct/len(train_data):.4f} | ' \\\n",
253
+ " f'test accuracy: {test_correct/len(test_data):.4f}'\n",
254
+ " \n",
255
+ " print(log)"
256
+ ]
257
+ }
258
+ ],
259
+ "metadata": {
260
+ "accelerator": "GPU",
261
+ "colab": {
262
+ "name": "deep-cbow.ipynb",
263
+ "provenance": []
264
+ },
265
+ "kernelspec": {
266
+ "display_name": "Python 3 (ipykernel)",
267
+ "language": "python",
268
+ "name": "python3"
269
+ },
270
+ "language_info": {
271
+ "codemirror_mode": {
272
+ "name": "ipython",
273
+ "version": 3
274
+ },
275
+ "file_extension": ".py",
276
+ "mimetype": "text/x-python",
277
+ "name": "python",
278
+ "nbconvert_exporter": "python",
279
+ "pygments_lexer": "ipython3",
280
+ "version": "3.9.12"
281
+ }
282
+ },
283
+ "nbformat": 4,
284
+ "nbformat_minor": 1
285
+ }
10_Introduction_to_GNNs_with_PyTorch_Geometric.ipynb ADDED
@@ -0,0 +1,1064 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 11 Introduction to GNNs\n",
9
+ "description: Introduction to Graph Neural Networks. Applies basic GCN to Cora dataset for node classification.\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/1d0jLDwgNBtjBVQOFe8lO_1WrqTVeVZx9?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "sjX6CPJFA52R"
24
+ },
25
+ "source": [
26
+ "## Introduction to GNNs with PyTorch Geometric\n",
27
+ "\n",
28
+ "In this short notebook, the goal is to provide an introductory guide to getting started with Graph Neural Networks using the popular [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/index.html) library. PyTorch Geometric is a PyTorch-based library, hence we will be using PyTorch in this tutorial. \n",
29
+ "\n",
30
+ "The code used in this tutorial has been adapted from their official [examples](https://pytorch-geometric.readthedocs.io/en/latest/notes/introduction.html). I have incorporated a bit more beginner-friendly guidance and kept it minimal."
31
+ ]
32
+ },
33
+ {
34
+ "cell_type": "code",
35
+ "execution_count": null,
36
+ "metadata": {
37
+ "colab": {
38
+ "base_uri": "https://localhost:8080/"
39
+ },
40
+ "id": "mwTz9zaHC7YA",
41
+ "outputId": "ce24d6a4-907f-4094-eb98-bc6ea0520e34"
42
+ },
43
+ "outputs": [
44
+ {
45
+ "name": "stdout",
46
+ "output_type": "stream",
47
+ "text": [
48
+ "11.1\n"
49
+ ]
50
+ }
51
+ ],
52
+ "source": [
53
+ "# Find the CUDA version PyTorch was installed with\n",
54
+ "!python -c \"import torch; print(torch.version.cuda)\""
55
+ ]
56
+ },
57
+ {
58
+ "cell_type": "code",
59
+ "execution_count": null,
60
+ "metadata": {
61
+ "colab": {
62
+ "base_uri": "https://localhost:8080/"
63
+ },
64
+ "id": "2jo0YpV0DLDW",
65
+ "outputId": "238637c0-e60b-42fc-e7de-86f47f39ec4f"
66
+ },
67
+ "outputs": [
68
+ {
69
+ "name": "stdout",
70
+ "output_type": "stream",
71
+ "text": [
72
+ "1.10.0+cu111\n"
73
+ ]
74
+ }
75
+ ],
76
+ "source": [
77
+ "# PyTorch version\n",
78
+ "!python -c \"import torch; print(torch.__version__)\""
79
+ ]
80
+ },
81
+ {
82
+ "cell_type": "markdown",
83
+ "metadata": {
84
+ "id": "P-VLTfxzEmLu"
85
+ },
86
+ "source": [
87
+ "Install the following packages, making sure to install the right versions below. Find more instructions [here](https://pytorch-geometric.readthedocs.io/en/latest/notes/installation.html) if you get lost. "
88
+ ]
89
+ },
90
+ {
91
+ "cell_type": "code",
92
+ "execution_count": null,
93
+ "metadata": {
94
+ "colab": {
95
+ "base_uri": "https://localhost:8080/"
96
+ },
97
+ "id": "fLbSOIkaDRe4",
98
+ "outputId": "b196161e-d1bf-4595-ccaa-49747f3ec00c"
99
+ },
100
+ "outputs": [
101
+ {
102
+ "name": "stdout",
103
+ "output_type": "stream",
104
+ "text": [
105
+ "Looking in links: https://data.pyg.org/whl/torch-1.10.0+cu111.html\n",
106
+ "Collecting torch-scatter\n",
107
+ " Downloading https://data.pyg.org/whl/torch-1.10.0%2Bcu113/torch_scatter-2.0.9-cp37-cp37m-linux_x86_64.whl (7.9 MB)\n",
108
+ "\u001b[K |████████████████████████████████| 7.9 MB 2.5 MB/s \n",
109
+ "\u001b[?25hInstalling collected packages: torch-scatter\n",
110
+ "Successfully installed torch-scatter-2.0.9\n"
111
+ ]
112
+ }
113
+ ],
114
+ "source": [
115
+ "!pip install torch-scatter -f https://data.pyg.org/whl/torch-1.10.0+cu111.html"
116
+ ]
117
+ },
118
+ {
119
+ "cell_type": "code",
120
+ "execution_count": null,
121
+ "metadata": {
122
+ "colab": {
123
+ "base_uri": "https://localhost:8080/"
124
+ },
125
+ "id": "Q-wRLXE_DkZF",
126
+ "outputId": "cb249940-3c85-4572-eb41-3d3ef2a5407d"
127
+ },
128
+ "outputs": [
129
+ {
130
+ "name": "stdout",
131
+ "output_type": "stream",
132
+ "text": [
133
+ "Looking in links: https://data.pyg.org/whl/torch-1.10.0+cu111.html\n",
134
+ "Collecting torch-sparse\n",
135
+ " Downloading https://data.pyg.org/whl/torch-1.10.0%2Bcu113/torch_sparse-0.6.12-cp37-cp37m-linux_x86_64.whl (3.5 MB)\n",
136
+ "\u001b[K |████████████████████████████████| 3.5 MB 2.8 MB/s \n",
137
+ "\u001b[?25hRequirement already satisfied: scipy in /usr/local/lib/python3.7/dist-packages (from torch-sparse) (1.4.1)\n",
138
+ "Requirement already satisfied: numpy>=1.13.3 in /usr/local/lib/python3.7/dist-packages (from scipy->torch-sparse) (1.19.5)\n",
139
+ "Installing collected packages: torch-sparse\n",
140
+ "Successfully installed torch-sparse-0.6.12\n"
141
+ ]
142
+ }
143
+ ],
144
+ "source": [
145
+ "!pip install torch-sparse -f https://data.pyg.org/whl/torch-1.10.0+cu111.html"
146
+ ]
147
+ },
148
+ {
149
+ "cell_type": "code",
150
+ "execution_count": null,
151
+ "metadata": {
152
+ "colab": {
153
+ "base_uri": "https://localhost:8080/"
154
+ },
155
+ "id": "lAobCDu6Dppo",
156
+ "outputId": "d7675ad2-6b5f-4162-caa8-13fa658d1793"
157
+ },
158
+ "outputs": [
159
+ {
160
+ "name": "stdout",
161
+ "output_type": "stream",
162
+ "text": [
163
+ "Collecting torch-geometric\n",
164
+ " Downloading torch_geometric-2.0.3.tar.gz (370 kB)\n",
165
+ "\u001b[K |████████████████████████████████| 370 kB 5.3 MB/s \n",
166
+ "\u001b[?25hRequirement already satisfied: numpy in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (1.19.5)\n",
167
+ "Requirement already satisfied: tqdm in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (4.62.3)\n",
168
+ "Requirement already satisfied: scipy in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (1.4.1)\n",
169
+ "Requirement already satisfied: networkx in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (2.6.3)\n",
170
+ "Requirement already satisfied: scikit-learn in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (1.0.2)\n",
171
+ "Requirement already satisfied: requests in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (2.23.0)\n",
172
+ "Requirement already satisfied: pandas in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (1.3.5)\n",
173
+ "Collecting rdflib\n",
174
+ " Downloading rdflib-6.1.1-py3-none-any.whl (482 kB)\n",
175
+ "\u001b[K |████████████████████████████████| 482 kB 48.3 MB/s \n",
176
+ "\u001b[?25hRequirement already satisfied: googledrivedownloader in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (0.4)\n",
177
+ "Requirement already satisfied: jinja2 in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (2.11.3)\n",
178
+ "Requirement already satisfied: pyparsing in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (3.0.7)\n",
179
+ "Collecting yacs\n",
180
+ " Downloading yacs-0.1.8-py3-none-any.whl (14 kB)\n",
181
+ "Requirement already satisfied: PyYAML in /usr/local/lib/python3.7/dist-packages (from torch-geometric) (3.13)\n",
182
+ "Requirement already satisfied: MarkupSafe>=0.23 in /usr/local/lib/python3.7/dist-packages (from jinja2->torch-geometric) (2.0.1)\n",
183
+ "Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.7/dist-packages (from pandas->torch-geometric) (2018.9)\n",
184
+ "Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.7/dist-packages (from pandas->torch-geometric) (2.8.2)\n",
185
+ "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.7/dist-packages (from python-dateutil>=2.7.3->pandas->torch-geometric) (1.15.0)\n",
186
+ "Collecting isodate\n",
187
+ " Downloading isodate-0.6.1-py2.py3-none-any.whl (41 kB)\n",
188
+ "\u001b[K |████████████████████████████████| 41 kB 486 kB/s \n",
189
+ "\u001b[?25hRequirement already satisfied: importlib-metadata in /usr/local/lib/python3.7/dist-packages (from rdflib->torch-geometric) (4.10.1)\n",
190
+ "Requirement already satisfied: setuptools in /usr/local/lib/python3.7/dist-packages (from rdflib->torch-geometric) (57.4.0)\n",
191
+ "Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.7/dist-packages (from importlib-metadata->rdflib->torch-geometric) (3.7.0)\n",
192
+ "Requirement already satisfied: typing-extensions>=3.6.4 in /usr/local/lib/python3.7/dist-packages (from importlib-metadata->rdflib->torch-geometric) (3.10.0.2)\n",
193
+ "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests->torch-geometric) (2021.10.8)\n",
194
+ "Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests->torch-geometric) (2.10)\n",
195
+ "Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests->torch-geometric) (3.0.4)\n",
196
+ "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.7/dist-packages (from requests->torch-geometric) (1.24.3)\n",
197
+ "Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from scikit-learn->torch-geometric) (3.1.0)\n",
198
+ "Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.7/dist-packages (from scikit-learn->torch-geometric) (1.1.0)\n",
199
+ "Building wheels for collected packages: torch-geometric\n",
200
+ " Building wheel for torch-geometric (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
201
+ " Created wheel for torch-geometric: filename=torch_geometric-2.0.3-py3-none-any.whl size=581968 sha256=eee568026f004ea2d960222f33768328b638194617a05429f1e10e5c019857d0\n",
202
+ " Stored in directory: /root/.cache/pip/wheels/c3/2a/58/87ce0508964d4def1aafb92750c4f3ac77038efd1b9a89dcf5\n",
203
+ "Successfully built torch-geometric\n",
204
+ "Installing collected packages: isodate, yacs, rdflib, torch-geometric\n",
205
+ "Successfully installed isodate-0.6.1 rdflib-6.1.1 torch-geometric-2.0.3 yacs-0.1.8\n"
206
+ ]
207
+ }
208
+ ],
209
+ "source": [
210
+ "!pip install torch-geometric"
211
+ ]
212
+ },
213
+ {
214
+ "cell_type": "markdown",
215
+ "metadata": {
216
+ "id": "GNfiSdoUFaoF"
217
+ },
218
+ "source": [
219
+ "## Getting Started"
220
+ ]
221
+ },
222
+ {
223
+ "cell_type": "markdown",
224
+ "metadata": {
225
+ "id": "D6LCvEr7CGF9"
226
+ },
227
+ "source": [
228
+ "Import PyTorch"
229
+ ]
230
+ },
231
+ {
232
+ "cell_type": "code",
233
+ "execution_count": null,
234
+ "metadata": {
235
+ "colab": {
236
+ "base_uri": "https://localhost:8080/"
237
+ },
238
+ "id": "pqrlyFN1AtXI",
239
+ "outputId": "850011fd-9eb1-4afc-8ba2-1416f4801c05"
240
+ },
241
+ "outputs": [
242
+ {
243
+ "name": "stdout",
244
+ "output_type": "stream",
245
+ "text": [
246
+ "1.10.0+cu111\n"
247
+ ]
248
+ }
249
+ ],
250
+ "source": [
251
+ "import torch\n",
252
+ "\n",
253
+ "# print torch version\n",
254
+ "print(torch.__version__)"
255
+ ]
256
+ },
257
+ {
258
+ "cell_type": "markdown",
259
+ "metadata": {
260
+ "id": "V0xfcmMfCFIH"
261
+ },
262
+ "source": [
263
+ "The great thing about PyTorch Geometric is that it contains useful functionality to import and load graph-related data. "
264
+ ]
265
+ },
266
+ {
267
+ "cell_type": "code",
268
+ "execution_count": null,
269
+ "metadata": {
270
+ "id": "gfIc5j5YB2_a"
271
+ },
272
+ "outputs": [],
273
+ "source": [
274
+ "from torch_geometric.data import Data"
275
+ ]
276
+ },
277
+ {
278
+ "cell_type": "markdown",
279
+ "metadata": {
280
+ "id": "4wuBs1NHEFqn"
281
+ },
282
+ "source": [
283
+ "Now let's create an unweighted and undirected graph with three nodes and four total edges (each undirected edge is stored as two directed entries in the edge list)."
284
+ ]
285
+ },
286
+ {
287
+ "cell_type": "code",
288
+ "execution_count": null,
289
+ "metadata": {
290
+ "colab": {
291
+ "base_uri": "https://localhost:8080/"
292
+ },
293
+ "id": "-nLnUIObCTjK",
294
+ "outputId": "e4ac0c52-5b42-40be-8f35-091f04fd7a9c"
295
+ },
296
+ "outputs": [
297
+ {
298
+ "name": "stdout",
299
+ "output_type": "stream",
300
+ "text": [
301
+ "Data(x=[3, 1], edge_index=[2, 4])\n"
302
+ ]
303
+ }
304
+ ],
305
+ "source": [
306
+ "# define edge list\n",
307
+ "edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]], dtype=torch.long)\n",
308
+ "\n",
309
+ "# define node features\n",
310
+ "x = torch.tensor([[-1], [0], [1]])\n",
311
+ "\n",
312
+ "# create graph data object\n",
313
+ "data = Data(x=x, edge_index=edge_index)\n",
314
+ "print(data)"
315
+ ]
316
+ },
317
+ {
318
+ "cell_type": "markdown",
319
+ "metadata": {
320
+ "id": "zV7bQ6tAEQ7H"
321
+ },
322
+ "source": [
323
+ "Our `Data` object has many useful utility functions to check the properties of the graph. "
324
+ ]
325
+ },
326
+ {
327
+ "cell_type": "code",
328
+ "execution_count": null,
329
+ "metadata": {
330
+ "colab": {
331
+ "base_uri": "https://localhost:8080/"
332
+ },
333
+ "id": "idbfqPzoEOPC",
334
+ "outputId": "f22248ff-27e6-4cec-ec15-a0c2bbea463c"
335
+ },
336
+ "outputs": [
337
+ {
338
+ "name": "stdout",
339
+ "output_type": "stream",
340
+ "text": [
341
+ "4\n"
342
+ ]
343
+ }
344
+ ],
345
+ "source": [
346
+ "# check number of edges of the graph\n",
347
+ "print(data.num_edges)"
348
+ ]
349
+ },
350
+ {
351
+ "cell_type": "code",
352
+ "execution_count": null,
353
+ "metadata": {
354
+ "colab": {
355
+ "base_uri": "https://localhost:8080/"
356
+ },
357
+ "id": "r0lcSstME0MP",
358
+ "outputId": "c88ad32a-22aa-4507-8824-ac9961cad74e"
359
+ },
360
+ "outputs": [
361
+ {
362
+ "name": "stdout",
363
+ "output_type": "stream",
364
+ "text": [
365
+ "3\n"
366
+ ]
367
+ }
368
+ ],
369
+ "source": [
370
+ "# check number of nodes of the graph\n",
371
+ "print(data.num_nodes)"
372
+ ]
373
+ },
374
+ {
375
+ "cell_type": "code",
376
+ "execution_count": null,
377
+ "metadata": {
378
+ "colab": {
379
+ "base_uri": "https://localhost:8080/"
380
+ },
381
+ "id": "yiegwHTyE2AO",
382
+ "outputId": "182f22cc-0a52-48b8-d1b6-19c0d926a35f"
383
+ },
384
+ "outputs": [
385
+ {
386
+ "name": "stdout",
387
+ "output_type": "stream",
388
+ "text": [
389
+ "1\n"
390
+ ]
391
+ }
392
+ ],
393
+ "source": [
394
+ "# check number of features of the graph\n",
395
+ "print(data.num_features)"
396
+ ]
397
+ },
398
+ {
399
+ "cell_type": "code",
400
+ "execution_count": null,
401
+ "metadata": {
402
+ "colab": {
403
+ "base_uri": "https://localhost:8080/"
404
+ },
405
+ "id": "JNWjhaSeE3Yk",
406
+ "outputId": "5a08aebe-a590-4a93-f58f-a0d45d70a2b2"
407
+ },
408
+ "outputs": [
409
+ {
410
+ "name": "stdout",
411
+ "output_type": "stream",
412
+ "text": [
413
+ "False\n"
414
+ ]
415
+ }
416
+ ],
417
+ "source": [
418
+ "# check if graph is directed\n",
419
+ "print(data.is_directed())"
420
+ ]
421
+ },
422
+ {
423
+ "cell_type": "markdown",
424
+ "metadata": {
425
+ "id": "HF-jGPyhFeO6"
426
+ },
427
+ "source": [
428
+ "## Loading Data"
429
+ ]
430
+ },
431
+ {
432
+ "cell_type": "markdown",
433
+ "metadata": {
434
+ "id": "KBVX6mZfFBxE"
435
+ },
436
+ "source": [
437
+ "Find more fun functions related to graph data [here](https://pytorch-geometric.readthedocs.io/en/latest/modules/data.html#torch_geometric.data.Data). "
438
+ ]
439
+ },
440
+ {
441
+ "cell_type": "markdown",
442
+ "metadata": {
443
+ "id": "LHvE23mwFJ-3"
444
+ },
445
+ "source": [
446
+ "One of the cool things about the PyTorch Geometric library is that it contains out-of-the-box benchmark datasets that are ready to use and explore. A popular one is the Cora dataset, which is used for supervised graph node classification. (We will talk about these applications in an upcoming tutorial, but for now we will focus on the data itself.)\n",
447
+ "\n",
448
+ "\"The Cora dataset consists of 2708 scientific publications classified into one of seven classes. The citation network consists of 5429 links. Each publication in the dataset is described by a 0/1-valued word vector indicating the absence/presence of the corresponding word from the dictionary. The dictionary consists of 1433 unique words.\" - [Papers with Code](https://paperswithcode.com/dataset/cora)."
449
+ ]
450
+ },
451
+ {
452
+ "cell_type": "markdown",
453
+ "metadata": {
454
+ "id": "_8ganBm_FiaQ"
455
+ },
456
+ "source": [
457
+ "Let's load the Cora dataset:"
458
+ ]
459
+ },
460
+ {
461
+ "cell_type": "code",
462
+ "execution_count": null,
463
+ "metadata": {
464
+ "colab": {
465
+ "base_uri": "https://localhost:8080/"
466
+ },
467
+ "id": "K3bXwOpoE75M",
468
+ "outputId": "ede1ca54-0336-4642-c82f-c1b9c7ed5f3b"
469
+ },
470
+ "outputs": [
471
+ {
472
+ "name": "stderr",
473
+ "output_type": "stream",
474
+ "text": [
475
+ "Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.x\n",
476
+ "Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.tx\n",
477
+ "Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.allx\n",
478
+ "Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.y\n",
479
+ "Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.ty\n",
480
+ "Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.ally\n",
481
+ "Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.graph\n",
482
+ "Downloading https://github.com/kimiyoung/planetoid/raw/master/data/ind.cora.test.index\n",
483
+ "Processing...\n",
484
+ "Done!\n"
485
+ ]
486
+ }
487
+ ],
488
+ "source": [
489
+ "from torch_geometric.datasets import Planetoid\n",
490
+ "\n",
491
+ "dataset = Planetoid(root='tmp/Cora', name='Cora')"
492
+ ]
493
+ },
494
+ {
495
+ "cell_type": "markdown",
496
+ "metadata": {
497
+ "id": "rQTFnBcuFpLv"
498
+ },
499
+ "source": [
500
+ "Let's check some of the properties of the Cora dataset."
501
+ ]
502
+ },
503
+ {
504
+ "cell_type": "code",
505
+ "execution_count": null,
506
+ "metadata": {
507
+ "colab": {
508
+ "base_uri": "https://localhost:8080/"
509
+ },
510
+ "id": "q9jvXbRXFlGG",
511
+ "outputId": "5d22903d-65dd-4e67-b5d4-ccfe682c1157"
512
+ },
513
+ "outputs": [
514
+ {
515
+ "name": "stdout",
516
+ "output_type": "stream",
517
+ "text": [
518
+ "Number of graphs: 1\n",
519
+ "Number of features: 1433\n",
520
+ "Number of classes: 7\n"
521
+ ]
522
+ }
523
+ ],
524
+ "source": [
525
+ "# number of graphs\n",
526
+ "print(\"Number of graphs: \", len(dataset))\n",
527
+ "\n",
528
+ "# number of features\n",
529
+ "print(\"Number of features: \", dataset.num_features)\n",
530
+ "\n",
531
+ "# number of classes\n",
532
+ "print(\"Number of classes: \", dataset.num_classes)"
533
+ ]
534
+ },
535
+ {
536
+ "cell_type": "markdown",
537
+ "metadata": {
538
+ "id": "57SX-idAF02R"
539
+ },
540
+ "source": [
541
+ "We can see that this particular graph dataset only contains one graph. Graph data can be very complex and can include multiple graphs depending on the type of data and application. Let's check more features of the Cora dataset:"
542
+ ]
543
+ },
544
+ {
545
+ "cell_type": "code",
546
+ "execution_count": null,
547
+ "metadata": {
548
+ "colab": {
549
+ "base_uri": "https://localhost:8080/"
550
+ },
551
+ "id": "8aOt8HwfFrH_",
552
+ "outputId": "b7043e70-aafe-4b7d-c877-83feed65c12b"
553
+ },
554
+ "outputs": [
555
+ {
556
+ "name": "stdout",
557
+ "output_type": "stream",
558
+ "text": [
559
+ "Number of nodes: 2708\n",
560
+ "Number of edges: 10556\n",
561
+ "Is directed: False\n"
562
+ ]
563
+ }
564
+ ],
565
+ "source": [
566
+ "# select the first graph\n",
567
+ "data = dataset[0]\n",
568
+ "\n",
569
+ "# number of nodes\n",
570
+ "print(\"Number of nodes: \", data.num_nodes)\n",
571
+ "\n",
572
+ "# number of edges\n",
573
+ "print(\"Number of edges: \", data.num_edges)\n",
574
+ "\n",
575
+ "# check if directed\n",
576
+ "print(\"Is directed: \", data.is_directed())"
577
+ ]
578
+ },
579
+ {
580
+ "cell_type": "markdown",
581
+ "metadata": {
582
+ "id": "3XX2MRY4GEQS"
583
+ },
584
+ "source": [
585
+ "You can sample nodes from the graph this way:"
586
+ ]
587
+ },
588
+ {
589
+ "cell_type": "code",
590
+ "execution_count": null,
591
+ "metadata": {
592
+ "colab": {
593
+ "base_uri": "https://localhost:8080/"
594
+ },
595
+ "id": "qGJKbv-4GAtY",
596
+ "outputId": "8c2a45dc-0b4d-4027-b252-5e48ef787cd6"
597
+ },
598
+ "outputs": [
599
+ {
600
+ "name": "stdout",
601
+ "output_type": "stream",
602
+ "text": [
603
+ "Shape of sample nodes: torch.Size([5, 1433])\n"
604
+ ]
605
+ }
606
+ ],
607
+ "source": [
608
+ "# sample nodes from the graph\n",
609
+ "print(\"Shape of sample nodes: \", data.x[:5].shape)"
610
+ ]
611
+ },
612
+ {
613
+ "cell_type": "markdown",
614
+ "metadata": {
615
+ "id": "wV1yBQSvGM9q"
616
+ },
617
+ "source": [
618
+ "We extracted 5 nodes from the graph and checked their shape. You will see that each node has `1433` features."
619
+ ]
620
+ },
621
+ {
622
+ "cell_type": "markdown",
623
+ "metadata": {
624
+ "id": "oXMY2lU0GWQL"
625
+ },
626
+ "source": [
627
+ "Another great advantage of using PyTorch Geometric to load the Cora data is that it comes pre-processed and ready to use. It also has the splits for training, validation and test which we can directly use for training a GNN.\n",
628
+ "\n",
629
+ "Let's check some stats for the partitions of the data:"
630
+ ]
631
+ },
632
+ {
633
+ "cell_type": "code",
634
+ "execution_count": null,
635
+ "metadata": {
636
+ "colab": {
637
+ "base_uri": "https://localhost:8080/"
638
+ },
639
+ "id": "MzITbLkpGIUP",
640
+ "outputId": "7a95cc3c-67dc-4050-e22a-556928cefae8"
641
+ },
642
+ "outputs": [
643
+ {
644
+ "name": "stdout",
645
+ "output_type": "stream",
646
+ "text": [
647
+ "# of nodes to train on: 140\n",
648
+ "# of nodes to test on: 1000\n",
649
+ "# of nodes to validate on: 500\n"
650
+ ]
651
+ }
652
+ ],
653
+ "source": [
654
+ "# check training nodes\n",
655
+ "print(\"# of nodes to train on: \", data.train_mask.sum().item())\n",
656
+ "\n",
657
+ "# check test nodes\n",
658
+ "print(\"# of nodes to test on: \", data.test_mask.sum().item())\n",
659
+ "\n",
660
+ "# check validation nodes\n",
661
+ "print(\"# of nodes to validate on: \", data.val_mask.sum().item())"
662
+ ]
663
+ },
664
+ {
665
+ "cell_type": "markdown",
666
+ "metadata": {
667
+ "id": "7IGOoSfwHjeD"
668
+ },
669
+ "source": [
670
+ "That information is important as it indicates to our model which nodes to train on, which to test on, and so on.\n",
671
+ "\n",
672
+ "Neural networks are typically trained on batches of data, and PyTorch Geometric provides efficient utilities for loading those batches.\n",
673
+ "\n",
674
+ "In particular, it includes a `DataLoader`, mirroring the popular PyTorch feature for efficiently loading data when training neural networks. "
675
+ ]
676
+ },
677
+ {
678
+ "cell_type": "markdown",
679
+ "metadata": {
680
+ "id": "_SUZUnXzH1zN"
681
+ },
682
+ "source": [
683
+ "So let's try to load the data using the built-in `DataLoader` (a minimal usage sketch follows the import cell below):"
684
+ ]
685
+ },
686
+ {
687
+ "cell_type": "code",
688
+ "execution_count": null,
689
+ "metadata": {
690
+ "id": "9tdHl4oZGw_y"
691
+ },
692
+ "outputs": [],
693
+ "source": [
694
+ "from torch_geometric.datasets import Planetoid\n",
695
+ "from torch_geometric.loader import DataLoader\n",
696
+ "device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')"
697
+ ]
698
+ },
699
+ {
700
+ "cell_type": "code",
701
+ "execution_count": null,
702
+ "metadata": {
703
+ "colab": {
704
+ "base_uri": "https://localhost:8080/"
705
+ },
706
+ "id": "xtJAp4QqIMWw",
707
+ "outputId": "8df69cee-89ab-441b-9526-6029b41b6bb8"
708
+ },
709
+ "outputs": [
710
+ {
711
+ "name": "stdout",
712
+ "output_type": "stream",
713
+ "text": [
714
+ "cpu\n"
715
+ ]
716
+ }
717
+ ],
718
+ "source": [
719
+ "print(device)"
720
+ ]
721
+ },
722
+ {
723
+ "cell_type": "code",
724
+ "execution_count": null,
725
+ "metadata": {
726
+ "id": "06xbeTJcH8C-"
727
+ },
728
+ "outputs": [],
729
+ "source": [
730
+ "dataset = Planetoid(root='tmp/Cora', name='Cora')\n",
731
+ "data = dataset[0].to(device)"
732
+ ]
733
+ },
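+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As an aside: this tutorial trains on the single Cora graph in full-batch fashion, so the `DataLoader` we imported is not strictly needed here. As a rough sketch (not part of the original tutorial, with an illustrative `batch_size`), batching a multi-graph dataset would look something like this:\n",
+ "\n",
+ "```python\n",
+ "# each batch merges several graphs into one large disconnected graph\n",
+ "loader = DataLoader(dataset, batch_size=32, shuffle=True)\n",
+ "for batch in loader:\n",
+ "    print(batch.num_graphs, batch.num_nodes)\n",
+ "```"
+ ]
+ },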
734
+ {
735
+ "cell_type": "markdown",
736
+ "metadata": {
737
+ "id": "8MH0lQsYIV0V"
738
+ },
739
+ "source": [
740
+ "Print some quick statistics about the data:"
741
+ ]
742
+ },
743
+ {
744
+ "cell_type": "code",
745
+ "execution_count": null,
746
+ "metadata": {
747
+ "colab": {
748
+ "base_uri": "https://localhost:8080/"
749
+ },
750
+ "id": "gK7-K6uYH_Iu",
751
+ "outputId": "0613a450-caf0-4137-c00f-a45ed8db412f"
752
+ },
753
+ "outputs": [
754
+ {
755
+ "name": "stdout",
756
+ "output_type": "stream",
757
+ "text": [
758
+ "X shape: torch.Size([2708, 1433])\n",
759
+ "Edge shape: torch.Size([2, 10556])\n",
760
+ "Y shape: torch.Size([2708])\n"
761
+ ]
762
+ }
763
+ ],
764
+ "source": [
765
+ "print(\"X shape: \", data.x.shape)\n",
766
+ "print(\"Edge shape: \", data.edge_index.shape)\n",
767
+ "print(\"Y shape: \", data.y.shape)"
768
+ ]
769
+ },
770
+ {
771
+ "cell_type": "markdown",
772
+ "metadata": {
773
+ "id": "TUGNxhBOIuYe"
774
+ },
775
+ "source": [
776
+ "## Model and Training"
777
+ ]
778
+ },
779
+ {
780
+ "cell_type": "markdown",
781
+ "metadata": {
782
+ "id": "v1pz24blIwnR"
783
+ },
784
+ "source": [
785
+ "Finally, let's define a standard GCN to train on the Cora dataset. The aim is to train a model that gets better at predicting the class of the node."
786
+ ]
787
+ },
788
+ {
789
+ "cell_type": "markdown",
790
+ "metadata": {
791
+ "id": "dhdMehneI2bn"
792
+ },
793
+ "source": [
794
+ "To keep thins simple we will use the same model definition as used in the [tutorial](https://pytorch-geometric.readthedocs.io/en/latest/notes/introduction.html) we adpated the code from. Note that we are using the built-in `GCNConv` model but you could easily implement your own (something we will cover in a future tutorial). \n",
795
+ "\n",
796
+ "The model below uses two `GCNConv` layers. The first layer is followed by a non-linearity `ReLU` and `Dropout`. The result is fed to the second layer on top of which we apply `Softmax` to get distribution over the number of classes."
797
+ ]
798
+ },
799
+ {
800
+ "cell_type": "code",
801
+ "execution_count": null,
802
+ "metadata": {
803
+ "id": "q0AiwyWrJGhj"
804
+ },
805
+ "outputs": [],
806
+ "source": [
807
+ "import torch.nn.functional as F\n",
808
+ "from torch_geometric.nn import GCNConv\n",
809
+ "\n",
810
+ "class GCN(torch.nn.Module):\n",
811
+ " def __init__(self):\n",
812
+ " super().__init__()\n",
813
+ " \"\"\" GCNConv layers \"\"\"\n",
814
+ " self.conv1 = GCNConv(data.num_features, 16)\n",
815
+ " self.conv2 = GCNConv(16, dataset.num_classes)\n",
816
+ "\n",
817
+ " def forward(self, data):\n",
818
+ " x, edge_index = data.x, data.edge_index\n",
819
+ " x = self.conv1(x, edge_index)\n",
820
+ " x = F.relu(x)\n",
821
+ " x = F.dropout(x, training=self.training)\n",
822
+ " x = self.conv2(x, edge_index)\n",
823
+ "\n",
824
+ " return F.log_softmax(x, dim=1)"
825
+ ]
826
+ },
827
+ {
828
+ "cell_type": "markdown",
829
+ "metadata": {
830
+ "id": "Mfob8LS2KezY"
831
+ },
832
+ "source": [
833
+ "Initial model and optimizer"
834
+ ]
835
+ },
836
+ {
837
+ "cell_type": "code",
838
+ "execution_count": null,
839
+ "metadata": {
840
+ "id": "GSZL4HS5Kd55"
841
+ },
842
+ "outputs": [],
843
+ "source": [
844
+ "model = GCN().to(device)\n",
845
+ "optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)"
846
+ ]
847
+ },
848
+ {
849
+ "cell_type": "markdown",
850
+ "metadata": {
851
+ "id": "qpgxKMhmKitV"
852
+ },
853
+ "source": [
854
+ "Define axcuracy function for evaluating performance:"
855
+ ]
856
+ },
857
+ {
858
+ "cell_type": "code",
859
+ "execution_count": null,
860
+ "metadata": {
861
+ "id": "abnE-XTmKl92"
862
+ },
863
+ "outputs": [],
864
+ "source": [
865
+ "# useful function for computing accuracy\n",
866
+ "def compute_accuracy(pred_y, y):\n",
867
+ " return (pred_y == y).sum()"
868
+ ]
869
+ },
870
+ {
871
+ "cell_type": "markdown",
872
+ "metadata": {
873
+ "id": "26W7sxVsKrGM"
874
+ },
875
+ "source": [
876
+ "And finally we train the model on the trainin nodes for 200 epochs:"
877
+ ]
878
+ },
879
+ {
880
+ "cell_type": "code",
881
+ "execution_count": null,
882
+ "metadata": {
883
+ "colab": {
884
+ "base_uri": "https://localhost:8080/"
885
+ },
886
+ "id": "m39ZbE6RKyim",
887
+ "outputId": "2e7a3a02-e654-4c11-9c50-d40eab4e6c45"
888
+ },
889
+ "outputs": [
890
+ {
891
+ "name": "stdout",
892
+ "output_type": "stream",
893
+ "text": [
894
+ "Epoch: 10, Loss: 0.8235, Training Acc: 0.9357\n",
895
+ "Epoch: 20, Loss: 0.2665, Training Acc: 0.9786\n",
896
+ "Epoch: 30, Loss: 0.1056, Training Acc: 0.9857\n",
897
+ "Epoch: 40, Loss: 0.0583, Training Acc: 1.0000\n",
898
+ "Epoch: 50, Loss: 0.0461, Training Acc: 1.0000\n",
899
+ "Epoch: 60, Loss: 0.0388, Training Acc: 0.9929\n",
900
+ "Epoch: 70, Loss: 0.0406, Training Acc: 1.0000\n",
901
+ "Epoch: 80, Loss: 0.0447, Training Acc: 1.0000\n",
902
+ "Epoch: 90, Loss: 0.0571, Training Acc: 0.9929\n",
903
+ "Epoch: 100, Loss: 0.0304, Training Acc: 1.0000\n",
904
+ "Epoch: 110, Loss: 0.0373, Training Acc: 1.0000\n",
905
+ "Epoch: 120, Loss: 0.0268, Training Acc: 1.0000\n",
906
+ "Epoch: 130, Loss: 0.0504, Training Acc: 0.9857\n",
907
+ "Epoch: 140, Loss: 0.0245, Training Acc: 1.0000\n",
908
+ "Epoch: 150, Loss: 0.0294, Training Acc: 1.0000\n",
909
+ "Epoch: 160, Loss: 0.0378, Training Acc: 0.9929\n",
910
+ "Epoch: 170, Loss: 0.0441, Training Acc: 1.0000\n",
911
+ "Epoch: 180, Loss: 0.0223, Training Acc: 1.0000\n",
912
+ "Epoch: 190, Loss: 0.0370, Training Acc: 0.9929\n",
913
+ "Epoch: 200, Loss: 0.0224, Training Acc: 1.0000\n"
914
+ ]
915
+ }
916
+ ],
917
+ "source": [
918
+ "# train the model\n",
919
+ "model.train()\n",
920
+ "losses = []\n",
921
+ "accuracies = []\n",
922
+ "for epoch in range(200):\n",
923
+ " optimizer.zero_grad()\n",
924
+ " out = model(data)\n",
925
+ "\n",
926
+ " loss = F.nll_loss(out[data.train_mask], data.y[data.train_mask])\n",
927
+ " correct = compute_accuracy(out.argmax(dim=1)[data.train_mask], data.y[data.train_mask])\n",
928
+ " acc = int(correct) / int(data.train_mask.sum())\n",
929
+ " losses.append(loss.item())\n",
930
+ " accuracies.append(acc)\n",
931
+ "\n",
932
+ " loss.backward()\n",
933
+ " optimizer.step()\n",
934
+ " if (epoch+1) % 10 == 0:\n",
935
+ " print('Epoch: {}, Loss: {:.4f}, Training Acc: {:.4f}'.format(epoch+1, loss.item(), acc))\n"
936
+ ]
937
+ },
938
+ {
939
+ "cell_type": "code",
940
+ "execution_count": null,
941
+ "metadata": {
942
+ "colab": {
943
+ "base_uri": "https://localhost:8080/",
944
+ "height": 265
945
+ },
946
+ "id": "bUtwTwemLJGs",
947
+ "outputId": "d351364f-f04a-4a75-b3b3-80bd517fe569"
948
+ },
949
+ "outputs": [
950
+ {
951
+ "data": {
952
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXoAAAD4CAYAAADiry33AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3dd3xUZb748c83kzLpHRIIIaGHEoqhrKjIVRF0FV11xcV63Yuua7nrNvzdu67r9uJdV9druXZd0V1srKKuKIqolCBNCCXUJCQkJKTXyTy/P84kDJCQSTJpM9/365VXZk79njNnvueZ5zznPGKMQSmllO8K6OsAlFJK9SxN9Eop5eM00SullI/TRK+UUj5OE71SSvm4wL4OoC0JCQkmLS2tr8NQSqkBY9OmTceMMYltjeuXiT4tLY3s7Oy+DkMppQYMETnU3jitulFKKR/XYaIXkWEislpEdorIDhG5p41pREQeEZFcEdkmItPcxt0kIntdfzd5ewOUUkqdmSdVNw7gh8aYr0QkEtgkIh8aY3a6TbMAGO36mwk8DswUkTjg50AWYFzzrjDGHPfqViillGpXh4neGFMIFLpeV4lIDjAUcE/0C4EXjfU8hXUiEiMiycD5wIfGmDIAEfkQmA8s8+pWKKUGjKamJvLz86mvr+/rUAYku91OSkoKQUFBHs/TqYuxIpIGTAXWnzJqKJDn9j7fNay94W0tewmwBCA1NbUzYSmlBpD8/HwiIyNJS0tDRPo6nAHFGENpaSn5+fmkp6d7PJ/HF2NFJAJ4HfhPY0xlF2I8I2PMU8aYLGNMVmJimy2ElFI+oL6+nvj4eE3yXSAixMfHd/rXkEeJXkSCsJL834wxb7QxSQEwzO19imtYe8OVUn5Mk3zXdWXfedLqRoBngBxjzP+0M9kK4EZX65tZQIWrbv8DYJ6IxIpILDDPNczrHM1OHludy6d7Snpi8UopNWB5Ukc/G7gB2C4iW1zD/h+QCmCMeQJYCVwC5AK1wC2ucWUi8ktgo2u+B1suzHqbLUB4as1+vpmZzJwxWvWjlGpfREQE1dXVfR1Gr/Gk1c1a4Iy/FVytbb7fzrhngWe7FF0niAjpCeEcOFbT06tSSqkBxafujB2hiV4p1UVbtmxh1qxZZGZmcuWVV3L8uHW7zyOPPML48ePJzMxk0aJFAHz66adMmTKFKVOmMHXqVKqqqvoy9A71y2fddFV6QjhvbC6gttFBWLBPbZpSPukX/9zBziPebcQ3fkgUP79sQqfnu/HGG3n00UeZM2cO999/P7/4xS94+OGH+d3vfseBAwcICQmhvLwcgD/96U889thjzJ49m+rqaux2u1e3wdt8qkSfnhgOwMFjtX0ciVJqIKmoqKC8vJw5c+YAcNNNN7FmzRoAMjMzWbx4MS+//DKBgVYBcvbs2dx777088sgjlJeXtw7vr/p3dJ2UnmAl+gPHahg/JKqPo1FKdaQrJe/e9u6777JmzRr++c9/8utf/5rt27ezdOlSLr30UlauXMns2bP54IMPGDduXF+H2i6fKtGnxbckev+5mq6U6r7o6GhiY2P57LPPAHjppZeYM2cOTqeTvLw85s6dy+9//3sqKiqorq5m3759TJo0iZ/+9KdMnz6dXbt29fEWnJlPlejDQwJJirKzXy/IKqXOoLa2lpSUlNb39957Ly+88AK33347tbW1jBgxgueee47m5mauv/56KioqMMZw9913ExMTw89+9jNWr15NQEAAEyZMYMGCBX24NR3zqUQPaBNLpVSHnE5nm8PXrVt32rC1a9eeNuzRRx/1ekw9yaeqbsC6IKuJXimlTvC9RB8fTnltE+W1jX0dilJK9Qs+l+iHxoYCUFBe18eRKKVU/+BziX5IjJXoC8u1UwOllAJfTPTR1h1qRyq0RK+UUuCDiT4hIoQgm3BES/RKKQX4YKIPCBCSou0UaoleKXUGb731FiLS72928gafS/QAydGhWkevlDqjZcuWcc4557Bs2bIeW0dzc3OPLbszfDLRD4m2a6sbpVS7qqurWbt2Lc888wyvvvoqYCXlH/3oR0ycOJHMzMzWm6I2btzI2WefzeTJk5kxYwZVVVU8//zz3Hnnna3L++Y3v8knn3wCWJ2a/PCHP2Ty5Ml8+eWXPPjgg0yfPp2JEyeyZMkSrO47IDc3lwsvvJDJkyczbdo09u3bx4033shbb73VutzFixfz9ttvd3t7O7wzVkSeBb4JFBtjJrYx/sfAYrflZQCJrt6lDgJVQDPgMMZkdTtiDwyJCeVoZSHNToMtQPumVKrfem8pFG337jKTJsGC351xkrfffpv58+czZswY4uPj2bRpExs2bODgwYNs2bKFwMBAysrKaGxs5Nprr+W1115j+vTpVFZWEhoaesZl19TUMHPmTB566CEAxo8fz/333w/ADTfcwDvvvMNll13G4sWLWbp0KVdeeSX19fU4nU5uvfVW/vznP3PFFVdQUVHBF198wQsvvNDtXeJJif55YH57I40xfzTGTDHGTAHuAz49pbvAua7xvZLkAZJjQnE4DceqG3prlUqpAWTZsmWtnYgsWrSIZcuWsWrVKm677bbWRw7HxcWxe/dukpOTmT59OgBRUVEdPpLYZrNx1VVXtb5fvXo1M2fOZNKkSXz88cfs2LGDqqoqCgoKuPLKKwGw2+2EhYUxZ84c9u7dS0lJCcuWLeOqq67yyiOQPelKcI2IpHm4vOuAnqvw8lBLE8uC8joGR/XvDgGU8msdlLx7QllZGR9//DHbt29HRGhubkZEWpO5JwIDA096Xk59/Ylrgna7HZvN1jr8jjvuIDs7m2HDhvHAAw+cNG1bbrzxRl5++WVeffVVnnvuuU5uXdu8VkcvImFYJf/X3QYb4F8isklElnQw/xIRyRaR7JKSkm7FkhytN00ppdq2fPlybrjhBg4dOsTBgwfJy8sjPT2dyZMn8+STT+JwOADrhDB27FgKCwvZuHEjAFVVVTgcDtLS0tiyZUvrY4w3bNjQ5rpaknpCQgLV1dUsX74cgMjISFJSUlrr4xsaGqittTpMuvnmm3n44YcBq9rHG7x5MfYy4PNTqm3OMcZMAxYA3xeR89qb2RjzlDEmyxiTlZiY2K1AhrbcHatNLJVSp1i2bFlrlUmLq666isLCQlJTU8nMzGTy5Mm88sorBAcH89prr3HXXXcxefJkLrroIurr65k9ezbp6emMHz+eu+++m2nTprW5rpiYGP7jP/6DiRMncvHFF5/0q+Gll17ikUceITMzk7PPPpuioiIABg8eTEZGBrfccovXtllargCfcSKr6uadti7Guk3zJvAPY8wr7Yx/AKg2xvypo/VlZWWZ7OzsDuNqjzGGCT//gGunDxsQPdgo5U9ycnLIyMjo6zD6rdraWiZNmsRXX31FdHR0m9O0tQ9FZFN710K9UqIXkWhgDvC227BwEYlseQ3MA772xvo8iIfkaDtFFVp1o5QaOFatWkVGRgZ33XVXu0m+KzxpXrkMOB9IEJF
84OdAEIAx5gnXZFcC/zLGuD8IfjDwpoi0rOcVY8z7Xou8A0nRdooqNdErpQaOCy+8kEOHDnl9uZ60urnOg2mex2qG6T5sPzC5q4F1V1JUKF/sO9ZXq1dKnYExBlchUHWSJ9Xtp/LJO2MBkqPtFFc10Ozs/E5RSvUcu91OaWlplxKWvzPGUFpait3euWbjPtdnbIukaDvNrpumtC29Uv1HSkoK+fn5dLcZtb+y2+0ndWzuCd9N9K7kXlhRr4leqX4kKCiI9PT0vg7Dr/hs1U2S6+7YIm1Lr5Tycz6b6JOjT5TolVLKn/lsoo8LDybYFqBNLJVSfs9nE72IMDg6RG+aUkr5PZ9N9ADJUaFadaOU8ns+neiTou0c1aobpZSf8/lEX1hRrzdmKKX8mm8n+ig7jQ4nx2ub+joUpZTqMz6d6JNb29Jr9Y1Syn/5dKJvvWmqUm+aUkr5L79I9NryRinlz3w60SdGhBAgWnWjlPJvPp3oA20BDIrUnqaUUv7NpxM9aE9TSinVYaIXkWdFpFhE2uzvVUTOF5EKEdni+rvfbdx8EdktIrkistSbgXsqKcqudfRKKb/mSYn+eWB+B9N8ZoyZ4vp7EEBEbMBjwAJgPHCdiIzvTrBdkRRt56gmeqWUH+sw0Rtj1gBlXVj2DCDXGLPfGNMIvAos7MJyuiU52k5Vg4Oqer1pSinln7xVR/8NEdkqIu+JyATXsKFAnts0+a5hbRKRJSKSLSLZ3uxirKWJpT7zRinlr7yR6L8ChhtjJgOPAm91ZSHGmKeMMVnGmKzExEQvhGVp6VKwqKLBa8tUSqmBpNuJ3hhTaYypdr1eCQSJSAJQAAxzmzTFNaxXJUeHAlCoXQoqpfxUtxO9iCSJiLhez3AtsxTYCIwWkXQRCQYWASu6u77OGhQVAuhNU0op/xXY0QQisgw4H0gQkXzg50AQgDHmCeBq4Hsi4gDqgEXGei6wQ0TuBD4AbMCzxpgdPbIVZ2APshEXHkyh1tErpfxUh4neGHNdB+P/Cvy1nXErgZVdC817BkWGUFypdfRKKf/k83fGAsRHBHO8trGvw1BKqT7hF4k+NiyYshpN9Eop/+QXiT4+PJjSaq26UUr5J79I9HHhIVTWO2hqdvZ1KEop1ev8I9FHBANoPb1Syi/5R6IPsxK91tMrpfyRfyT6cFeir9ZEr5TyP36R6ONdVTelWqJXSvkhv0j0rSV6TfRKKT/kF4k+JjQI0ESvlPJPfpHoA20BxIQFaaJXSvklv0j0YFXfaKJXSvkjv0n08eHBlNbo3bFKKf/jN4leS/RKKX/lZ4leOwhXSvkfv0r0x2sbcTpNX4eilFK9qsNELyLPikixiHzdzvjFIrJNRLaLyBciMtlt3EHX8C0iku3NwDsrLjyEZqehsl5L9Uop/+JJif55YP4Zxh8A5hhjJgG/BJ46ZfxcY8wUY0xW10L0jvhwvTtWKeWfOkz0xpg1QNkZxn9hjDnuersOSPFSbF6lnYQrpfyVt+vobwXec3tvgH+JyCYRWXKmGUVkiYhki0h2SUmJl8OC4fHhABwsrfH6spVSqj/rsHNwT4nIXKxEf47b4HOMMQUiMgj4UER2uX4hnMYY8xSuap+srCyvXzFNjrITHBjA4dJaby9aKaX6Na+U6EUkE3gaWGiMKW0ZbowpcP0vBt4EZnhjfV0RECAMiw3lkCZ6pZSf6XaiF5FU4A3gBmPMHrfh4SIS2fIamAe02XKntwyPD+dQmSZ6pZR/6bDqRkSWAecDCSKSD/wcCAIwxjwB3A/EA/8rIgAOVwubwcCbrmGBwCvGmPd7YBs8lhoXxrr9pRhjcMWllFI+r8NEb4y5roPx3wW+28bw/cDk0+foO2nxYdQ2NnOsupHEyJC+DkcppXqF39wZCyda3hwu05Y3Sin/4VeJPjU+DEAvyCql/IpfJfqU2FBE4KAmeqWUH/GrRB8SaGNIdCiH9aYppZQf8atEDzA8PkybWCql/IpfJnq9O1Yp5U/8LtGnxoVTWtNIlT6uWCnlJ/wu0adpyxullJ/xu0Tf0sTysNbTK6X8hN8l+pabprREr5TyF36X6CNCAokPD+aQNrFUSvkJv0v04GpiqSV6pZSf8NNEH6519Eopv+GXiT41LowjFXU0OJr7OhSllOpxfpnoh8eHYQzkldX1dShKKdXj/DLRpyW4Ogo/phdklVK+zy8T/ciECAD2lVT3cSRKKdXzPEr0IvKsiBSLSJt9vorlERHJFZFtIjLNbdxNIrLX9XeTtwLvjuiwIBIiQjTRK6X8QoddCbo8D/wVeLGd8QuA0a6/mcDjwEwRicPqYzYLMMAmEVlhjDnenaC9YWRiOPtK+knVjdMJAV34cdVUB04HBEeACBhj/QUEWP8b3U5kgaFgC3QNr7GmDw4/eXmNNWCcJ5bX4frrwel6ZpDYINi66xhHIzQ3dG5bOlrnqdvTWbZgCHR1H9my39wFhXfuM2ioxjqkAVsIBAafvmz3dTqbIcB2+nIcDdDcaL2WgNM/E/f1tCcgCILsJ947ndBUAwGBEBR6+vq78/m0tez2tqdFoB1sQdbrlmMMTt5v7XE/xrrq1PU4nda2uG+Pu/a2rS2NNdaxGRJxyvBaMB009nBfT+t+k9OX5QUeJXpjzBoRSTvDJAuBF40xBlgnIjEikozVqfiHxpgyABH5EJgPLOtO0N4wclAE724r7LmOwkv2WF+s+JGnjzMGDq+DwROgcAv8/UaYshjm/co6+GqOwaEvrKSx+SUo+AqmfAfSzoGQSEifAxuehA/vt5LKsFlw5ePw5vegqRa+/QK8cy/sX31inaGxMPUGOLDGWifAWbfApQ9Zwz57CA5+Zg0fNB5Gz4Otr1rrm3kbhMVDzgrYtRLGXWJ9ebf9/eSDedSFEDMctvwNHPWd21/xo2Dm7RAxyHpfeQTWPwHNDsi8BnLegdK9nVumu4AgmLwIastg97unj48cArNut+Lf97G1bcPPhsnXWduy8Wk4fhCmXg/5GyFv/Yl57dHw7ZesYZ/89kQiC7TDwsesz+TdH53YbzvehJH/BuEJ1j52T4xp51qfk2mGTc+fvJ72iA0mXQNj50NVEax7HMoPWcMvehASRsM/boHUWRCXDptf7vznEzcSxl9u7ZfKAkBg/EI494eQnAml++Dzv8DWZacn+pAoa5sKsk/enkC7ddyXH4aDa2HS1VB3HHJXwfgrrOXsfOvE/uyqQLu1/tl3WyfTl75l7d+My9y2x43Y4KJfwKw7rM9q7Z+hqtCK9eBnULYfplwPBZsgb501z6RrYOH/WsM+ewhyP/QstrGXQHjiif0WPgh+3I3jvB1i5WYPJrQS/TvGmIltjHsH+J0xZq3r/UfAT7ESvd0Y8yvX8J8BdcaYP7WxjCXAEoDU1NSzDh061IXN8dwzaw/wy3d2sum/LyQ+wksdhW9fbh2w5YesZIFYCTNqiDXeFgQTvgW73oEv/2olCEeDdSDWl1uJMjQOcv4JDleLoMhkGDbTmqelpBiRBNVFMGY+JE+2DkSnwyohBAS6vsQCs++BsDjrxHLoC9
jzHsSNgMnfgYo8+OqFE8uKSIJpN1ol0O3/gJJdMOJ8qC2Fou3WeoMjYczFsOcD64syZTHEDrfG1ZbBVy9a25F5LQzK8Hy/OR3w9esn1tNiaJYVz6HPIWmSte9aSoadVZoLW5ZZJahpN544oYCVSPb8Cw6ttd7bQmDcpdYJsPaYNSxuBCRmwO6VEJ1iJfyW0vfmv8Gx3dZyxi+ElOnW8F3vWid0DCRlQtkBa7+NvcRKZk111sknYbQ1fX2ldWKvKrTeRw87eT3tKT9sJe8m170hQ8+CjMutde95z0pu8aOtz7LLn88bULQNUs92nVCOWrE2VFrHYNH2EyfTlu1pkb8Rdq44fb8V74Jtr1nfgxFzrJN5kB1GXWTtZ7HBtBtOfH+6qmU9GGtdzU3WcVu0/cT2iNuvuYOfW/ut5buRMBZi02DvBxCbbhWE3I+DuuNWoaRl+rB46xgLiz9zXDUlsOkF63ObfJ2134JCYfp3u7SZIrLJGJPV5rj+kujdZWVlmezsbI/i6qpPdhdz83Mb+ftt32BGelzXFlJXDm/dYSWh4WfDiwvBHgX2GKsE0dxoldhaSk+N1Se+jFOvt77YDZVw1bOw8f8g+zlr3KgL4KybrZ/L8SOtZFddAjXFVsJa97j1RV3wR6s6Zt/H8OHPrVJIUBis/DHM+SlkfPPkeGuOWSX7lp/wXzwKW1+DGd+1DrTWaganlRQiEq2TxLG91s/n6GHW9rX8/A6JPHn5TfXWtobGdH5fGmNtW0tp0BZibbsIVBdbpZ7u/vKqr7ROhC1VTKcq228l38hk6wTZVGcNa0mUtkDrhBYSefIJp7YMlv+7dRxc+IsTVUCNtfD2HVa12WUPWwmzZb811lr71B59cgyOBqt0DNYX39MTW125VTJ132/OZnjvp9ZJ/VtPWeO68/nUlJx8gqwrt37ptPxC+cb3ITKp7fnb2m8tywi0Wwne/fNpqMKr1RgV+fDFX60S+cLHrJPTqdvTwtlsfYeO7oCz74Sxl1qfac0x67vd1nGw6XlY/5SV4Kfd2P4xdqr2joMu6I1E/yTwiTFmmev9bqwkfz5wvjHmtrama09vJPq8slrO/cNqfnPlJL4zM/XkkRUF1k/glLNOHl5dbJXSWqorNr1glXLA+hLFpMJta9r/kBtrrJKXMVZ1SE9UGSml/NKZEr2nF2M7sgK4U0RexboYW2GMKRSRD4DfiEisa7p5wH1eWme3DI0JJSQwoO2WN+/9BPatturKirbDjresKot/3mNVy7QICofFr1ulhI1PW6WmM53Jg8OtBK+UUr3Io0QvIsuwSucJIpKP1ZImCMAY8wSwErgEyAVqgVtc48pE5JfARteiHmy5MNvXAgKEEYkR5Ba7En1dufWTLTjcqgppqoXd78GXj8GRr2D949bPtptXnqiDDA63/kZfCP/2312vP1ZKqR7kaaub6zoYb4DvtzPuWeDZzofW89Liw9hTWGHVb2/4P6te9uLfWElebLD6N1C2D87/f9ZFktHzYNC4themSV4p1U95q+pmQBoaE0r97lXw+cOQfp7VyuKdH1hVMlMXw4anrAuis75nXYRUSqkByC8fgdBiaGwoU8wujATAoldgxFyrOd3IuVbTQbBao2iSV0oNYH5foh8ju6mLyyAsJBIuuN8q1Y+/AoZMgWtesEr6Sik1gPl3oo8OJD0gl+KYa0gDGDoN7s050bZ2whV9GJ1SSnmHX1fdDG/YT5g0sM8+6cTAyMHavl0p5VP8OtGHF1utPrdIOy1plFLKB/hn1U3ZAXjuEqT2GEUyiF213n9anFJK9Rf+mejzs6HqCExZzLKCDAqOa5eCSinf5Z9VN8cPWv8v+ROlQ8+noFwTvVLKd/lnoi8/CBGDITiMoTFhVNQ1Ud3g6HA2pZQaiPwz0R8/ZD1fGuumKYAjWqpXSvko/030MVaHGUNjrESfV1bblxEppVSP8b9E39wElfmtJfpRiVaLm91Hq/owKKWU6jn+l+gr8qxeflxd4EWHBTE0JpScQk30Sinf5H+JvqXFjatED5CRHEVOYWWfhKOUUj3NfxO9q44eYHxyJPtLqqlvau6bmJRSqgd5lOhFZL6I7BaRXBFZ2sb4P4vIFtffHhEpdxvX7DZuhTeD75TqYlhxFxxca/VW79azfEZyFE4De7SeXinlgzq8M1ZEbMBjwEVAPrBRRFYYY3a2TGOM+YHb9HcBU90WUWeMmeK9kLsoZwV89aL1Om4kBNhaR2UkW8+b33mkksyUmL6ITimleownJfoZQK4xZr8xphF4FVh4humvA5Z5IzivKs6xeo4KT4RBGSeNSo0LIzzYpvX0Simf5MmzboYCeW7v84GZbU0oIsOBdOBjt8F2EckGHMDvjDFvtTPvEmAJQGpqqgdhddLRnZA0Ca5bdlJpHqyOwscmRWrLG6WUT/L2xdhFwHJjjPtVzeHGmCzgO8DDIjKyrRmNMU8ZY7KMMVmJiYnejcoYKN4Bg8dbHYDbo0+bJCM5ipyiSqx+zpVSynd4kugLgGFu71Ncw9qyiFOqbYwxBa7/+4FPOLn+vndUHoH6Chg0vt1JMpKjqKp3kK9PslRK+RhPEv1GYLSIpItIMFYyP631jIiMA2KBL92GxYpIiOt1AjAb2HnqvD2uOMf630GiB7SeXinlczpM9MYYB3An8AGQA/zdGLNDRB4UkcvdJl0EvGpOrvvIALJFZCuwGquOvg8S/Q7r/+D2E/24pEhE0Hp6pZTP8ajjEWPMSmDlKcPuP+X9A23M9wUw6dThvSZvI6x/wnrsQeQQCI1td9LwkECGx4VpiV4p5XN8t4epA2vglUXQVAsYGHVRh7OMHxLFjiOa6JVSvsV3E/3bd1p3v97wBhxef8ZqmxYZSVGs3F5EVX0TkfagXghSKaV6nu8m+upimH4rxKRafx5ouSC7u6iKrLS4noxOKaV6jW8+1Ky5CRx1bbaXP5OxSZEA7Dla3RNRKaVUn/DNRN/gajkTEtmp2YbEhBJkEw5rb1NKKR/im4m+vsL6HxLVqdlsAUJKbJh2K6iU8im+mei7WKIHGBYXpiV6pZRP8dFE72oiae9ciR4gNS5UE71Syqf4aKLveok+NS6MiromKmqbvByUUkr1Dd9M9PWuEn1I51rdAKTGhQNoqV4p5TN8M9F3q+omDNBEr5TyHb6d6Lt0MTYU0ESvlPIdPproq6wOwAPtnZ410h5EXHiwJnqllM/wzURfX2mV5kW6NHtqnLalV0r5Dt9M9A1VXaqfbzE8Poxt+eVsPFjmxaCUUqpv+Giir+xS/XyL2+eMJDosiG8/+SWrdxV7MTCllOp9Pproq7rUtLJFRnIU799zHvHhIbyxub3ucZVSamDwKNGLyHwR2S0iuSKytI3xN4tIiYhscf19123cTSKy1/V3kzeDb1d990r0YPU4dd6YBD7bW0Kz03Q8g1JK9VMdJnoRsQGPAQuA8cB1ItJWLx6vGWOmuP6eds0bB/wcmAnMAH4uIu335+ctDZXdqqNvcf7YQZTXNrEtv9wLQSmlVN/wpEQ/A8g1xuw3xjQCrwILP
Vz+xcCHxpgyY8xx4ENgftdC7YSGyk4/ubIt545KIEDgk90lXghKKaX6hieJfiiQ5/Y+3zXsVFeJyDYRWS4iwzo5LyKyRESyRSS7pKQbidUYVx1996puAGLDg5k8LIZP9miiV0oNXN66GPtPIM0Yk4lVan+hswswxjxljMkyxmQlJiZ2PZKmOnA6vFJ1AzBvfBJb88r5uqDCK8tTSqne5kmiLwCGub1PcQ1rZYwpNcY0uN4+DZzl6bxe140nV7Zl8axUokODeHjVHq8sTymlepsniX4jMFpE0kUkGFgErHCfQESS3d5eDuS4Xn8AzBORWNdF2HmuYT2noetPrmxLlD2IJeeNYFVOMVvz9KKsUmrg6TDRG2McwJ1YCToH+LsxZoeIPCgil7smu1tEdojIVuBu4GbXvGXAL7FOFhuBB13Dek43HmjWnpvOTiM4MIB3txd6bZlKKdVbAj2ZyBizEkLHhg8AABfqSURBVFh5yrD73V7fB9zXzrzPAs92I8bOqe/6I4rbExESyMjECPYcrfLaMpVSqrf43p2xXq6jbzF2cAR7j1Z7dZlKKdUbfDDRt1TdeK9EDzAmKZKC8jqq6rWLQaXUwOKDib5nSvRjBlnL26OleqXUAKOJ3kNjk1oSvdbTK6UGFt9M9IF2sAV5dbFDY0IJDbJpoldKDTi+l+gbq71emgcICBDGDNaWN0qpgcf3En1DFQRH9MiixwyOZOeRSnKLNdkrpQYOH0z0PVOiB5g/MYnKegcX/s8a/r4xr+MZlFKqH/DBRO+dJ1e25YKMwXx537+RGhfGv3Ye7ZF1KKWUt/leom/suaobgEGRdiYPi2FXUWWPrUMppbzJ9xJ9D5boW4xLiiT/eB2VevOUUmoA8MFEXw0hPVeiBxifbN11u6tQL8oqpfo/H0z0vVCiT7aWr9U3SqmBwLcSfbMDHHUQ3LOJPinKTkxYEDmFmuiVUv2fbyX6xpbHH/Rs1Y2IMC4pkhytulFKDQC+legbXA8c6+GqG4CM5Ch2F1VR39Tc4+tSSqnu8CjRi8h8EdktIrkisrSN8feKyE4R2SYiH4nIcLdxzSKyxfW34tR5varRleh7sHlli7ljB1HvaOb6p9dzvKaxx9enlFJd1WGiFxEb8BiwABgPXCci40+ZbDOQZYzJBJYDf3AbV2eMmeL6u5ye1PrkSu8+i74t541J5K/XTWNbQQX3r9jR4+tTSqmu8qREPwPINcbsN8Y0Aq8CC90nMMasNsbUut6uA1K8G6aHGnqnjr7FpZnJ3DBrOO9tL6S4qr5X1qmUUp3lSaIfCrg/2CXfNaw9twLvub23i0i2iKwTkSvam0lElrimyy4pKfEgrDb00LPoz+Q7M1NxOI0++0Yp1W959WKsiFwPZAF/dBs83BiTBXwHeFhERrY1rzHmKWNMljEmKzExsWsB9GIdfYuRiRGcPTKeV9YfpsGhF2aVUv2PJ4m+ABjm9j7FNewkInIh8F/A5caYhpbhxpgC1//9wCfA1G7Ee2Z9UKIH+I9zR3Ckop4bn9lARa0+FkEp1b94kug3AqNFJF1EgoFFwEmtZ0RkKvAkVpIvdhseKyIhrtcJwGxgp7eCP01D75foAeaOG8RfFk3hq8PHmfXbj/jxP7Zqs0ulVL8R2NEExhiHiNwJfADYgGeNMTtE5EEg2xizAquqJgL4h4gAHHa1sMkAnhQRJ9ZJ5XfGmB5M9JVgC4HA4B5bRXsWThnKyMQInvv8IP/YlM+csYl8M3NIr8ehlFKn6jDRAxhjVgIrTxl2v9vrC9uZ7wtgUncC7JQe6kbQUxOHRvOHqzP5eNdRPsop1kSvlOoXfO/O2F5qWtkeW4Awd+wgVu8uptlp+jQWpZQCn0v0VT3+QDNPXJAxmPLaJtbmHuOjnKOa8JVSfcq3En0fV920OHdMAoEBwi3PbeDWF7J57vMDfR2SUsqP+Vaib6js86obgCh7EFdOHcq01Fiyhsfyl4/2Ulrd0PGMSinVA3ws0fePEj3AH6+ZzPLvnc3vrsqktrGZX72bgzFahaOU6n0+luh7tmPwrhg1KILvzx3Fm5sL+H9vbsfZTn29MYbaRkcvR6eU8ge+lej7SR39qX5w4WjunDuKZRvyeHLNfipqm3hgxQ72HrXu5DXGcM+rW5jzx0+oadBkr5TyLo/a0Q8Y33oKYtP6OorTiAg/nDeG3OJq/rxqD+/vKGJrXjmrco6y4s5zeHfbEVZsPQLAG5sLuGHW8A6WqJRSnpP+WG+clZVlsrOz+zoMryupamDenz/leG0Td5w/kqfXHkCABoeTOWMSKa1poKHJyb9+cB6uO4yVUsojIrLJ9QDJ0/hWib6fS4wM4blbZlBUUc/8iUnMSI/j/a+LGDM4kmuyUnj/6yJ+vHwbn+wuYe64QX0drlLKR2iJvh+pb2pm3p/XUFxVz4OXT+SCjEHER4T0dVhKqQFAS/QDhD3IxuvfO5vbX97ET17fBlitdhZMTOLGb6Txee4x3tl2hKULMhg1qO3WRcYYdhZWMnZwJIE237rWrpTqGi3R90NNzU42Hijj6yMVfLK7hHX7SwkQweE02AKEsCAb912SwaWZyQTZhKfW7Odv6w/zo3ljOHCslic+3ceFGYN49LpphAbb+npzBjRjDP/IzmfuuEEkRvb/X1fGGL2+46fOVKLXRD8A7C+p5sUvDzEiMZy5Ywdx5ytfsTW/4qRpUuPCOFxmdds7Mz2ODQfLmJ4Wx8u3ziQ48ETJvqK2iQOlNUwaGo0tQDDGkFdWx+DoEEICT5wU6puasQfZyCur5S8f7WVqagwXT0gi4QxVSXuOVnG0sp5zR3veQ1iDo5nfvJtDSJCNO84fSUxYMMYYdh+tYlCknYq6JrbmlXPxhKRunbRqGhz85PVtZCRF8v25o7jn1S2IwEPXTD7jL5+1e49x/TPrufqsFP50zWQAiivrOVrZwKSU6Nbpmp2GAOG0JLvpUBnBNttJ03bE0ezkn9uOMHVYLGkJ4R7Pt+nQcf7ztc38ZdFUpqXGAtb+bXA4ibIHebwcNTBpovcxxhi25lfwee4xjDFMGx7LjLQ4/vLRXhxOw4/njeXtrQX84LWtXD8rlRnp8RRX1pMWH879b3/NkYp6BkeFMCQmlMLyeooq6xmXFMnvr8rk0z0l/HPrEQ6V1fLwtVN48tN9bCuowBgItgWwcMoQrj4rhUCb8N72IrbkldPkNKTFh/HOtkKanYZ7LhjNf144GoDP9h7DAFH2QA6V1hIabGNQZAhNzYZj1Q288MVB1h8oI0AgKjSIV5fMYlteRWvVVYsFE5P4xcIJvPzlIdISwslIjqKwoo5xSVEMiQk9bR/tL6nm832lnDsqgdrGZn729tdsOnQcEatHsKfW7AfguhnD+M6M4azZW8KHO49yTVYK35mR2pqwb3p2A5/uKSHIJrx3z3n84f1drMo5itPALxdO4IZvpJFXVsvip9eTmRLNI4umEuA6gT6z9gC/XplDSGAAz98yg1kj4k+KsanZyXtfF5FXVos9yMbcsYnsL6nhLx/tZXtBBaFBNn6xcALfzhpG
TYODjQfLGJkYQUps6GknlKZmJ5c+8hl7jlaTmRLNW3fMpqrBwaKn1lFSVc8b35tNanzYGY+rTYeOMyw2lEFRdo+PRafTsDb3GGcNjyU8pHM1wY0OJz9ZvpWM5ChumzOS9ftLiY8IZtSg7t0Lk1tcxZf7Slk8czgBAdZ+eu7zA7y7rZCymkZevHUG9U3NPLxqL/dfNp5Bkadv758+2M2Gg2U8uHAC45KiOlxndYODbfnlTBkWQ1hw39SIa6L3U798ZyfPrD35gWpDou3cMXcUn+ceo7rBQUxYMBnJkTy+eh9Vrpu1ZqbHUd3gYMeRSgAeXzyNEYkRvLzuEMs35VPn6j0rODCAzKFWSfXrIxVcljkEp4HXv8pnzOAIwkMC2Xy4/IwxBgcG8MerMxmbFMkNz2wgIiSQY9UNjB0cyfyJSYQE2ThW1cBfPtpLSGAADQ7nacuYkR7HXf82irjwYHKLq9l8uJxX1h+msfnEtMG2AH595UQe+tceiirrmTQ0mrNHxvOkK+EDDIsLJa+sjlGDIkiOtjNlWAyPfpzLt7NSWL4pn9AgG/UOJ7fPGcHuoipW5RRzYcZgdh+t5GhlA40OJ7eek87dF4zmN+/m8Fp2HvPGD+bAsRqOlNfxqysncsWUoQCs3l3Mb1buIre4+rTtSYgI4ccXj+GtzUf4cn8pP5k/lg93Hm3dlwkRwWQNj+O756aTlRaHMYY/r9rLIx/t5cqpQ3lzcwG3zE5j8+FydhypwB5kIzEyhEXTh9HUbDDGMGFoNLPS4wkNtlFcVc+v383h7S1HiAgJ5IfzxrBoemrrL6iq+iYKyusoqqhn3f4ywoJtXDl1KCmxofzq3RyeWXuApCg7Dy6cwLwJSYB1Aqioa6Kk2moyPCTGTnxECMYYvtxfSpQ9iOWb8nn+i4MAnD82kU92l5AYGcKKO2fz63dzsAfZ+O23JhEggtMYgtr45VVYUcfL6w5xyaRkJgyJ5qvDx7n52Q1U1jtYPDOVX10xkeWb8vnx8m1MGBLF/pIavjEynpKqBrYXVHDdjGHcd0kGr6w/zNRhMWSlxZFTWMllf12LYD12/H++PYXLJg8hr6yWhIiQ035ZHjhWw3df2Mi+khpCg2xcNjmZG7+RxpjBkZTXNnKkop7ModGtJ50WBeV15JXVMjjKTnonfrm1RxO9n2pqdvLSl4cYlxxJWnw4W/LKmZke12ZLntziKlZuL+KyyUNITwinqr6JH7y2hTGDI/nJ/HGt09U0OPhoVzFOp+HC8YOJcJXiWuqGjTG8/lUBL3xxkNLqBu6+YDTpCeFU1jtIiw+jvslJSXU9QbYA4sNDGBYXSqSrWuHLfaUsfnodwYEBvH/Pea3VFsYYHlixg5zCKn55xURqGh0UHK9jcJSdjQfL+Nu6QxypqG+N0RYgXJaZzHfPHcHGg2WEhwQyd6xVx/5RzlGWvrGdZ27KYuKQaDYeLGuNbWRiBH/bcJiPc45SWFHPrqIq7EEBfLn0Av777a9Zub2Qh6+dwsIpQ2l0OPnD+7v4MOcodY3NPHVjFss35fHyusOtcdw5dxT3XjSGY9UN3PbyJjYfLmdYnPXrI6+sjmFxofz3peOZMyaR4soGPt1TzPD4cGaOiCMk0Iaj2cmdr2zm/R1F2AKEBxdOwGlgy+FyPt1TzLHqRiYNjSY6NIi1ucf4ZmYyjyyayrce/4IteeWEBdv449WTSYgI5t+f30hN48ndW4YEBpCZEs3W/AqMMSw5bwTb8iv4bO8xIu2BzEyPJzBA+Hh3MY2uE2yQzbpWZAykxIaSf7yOK6YMYc/RanYWVvKjeWPYcaSSVTlHaWo+ObekJ4RjD7KRU1jZOuzms9Morqpn5fYiLpmUxKqcYoJtAVS7Ch3fGBHPvpJqahubuXhCktVFp8DklGh2FVXx7rZCGhxOQoNsXJqZzIqtR0iOtnPOqAT+tv4wYwZHcKi0lrOGx/Liv8/guc8P8uuVOQBkpkTzdUEFY5OiWmMaGhNKaLCNsppG3vje2fxk+TY25x1nzphEVuUUMzgqhNmjEliz5xjx4cHEhQez8WAZkfZAli4Yx5a8ct7cXEB9kxMRaEmvZw2PZemCcWQNj6Wp2fDHD3bxf58daD1evzdnJJNSojEG5k9M6vjL3YZuJ3oRmQ/8BasrwaeNMb87ZXwI8CJwFlAKXGuMOegadx9wK9AM3G2M+aCj9Wmi91/vf11ISJCNuWM9v4+gwdHMe9uLCLQJYwZbJzX36xKn8vSC5c4jlTicTjJTYqhucHCotIYJQ06va29ZntNp+Cz3GBsOlJKZYl3TaNHsNLyy4TDr9pfS0NTMgonJXDZ5yBnjBKt64/fv72JmelxraRmgttHBK+sP897XRewrqeaO80dy6zkjsAUI1Q0OyqobGRobis1ViqxvasbhNAQGWKXj7IPH+XRPCesPlDJpaDS3nTeStIRwjDFsOFDGa9l57CiopKKuiYsnDGZGejyx4UFMGRbD8dom3tl6hPUHyhg9OIKfXjyOxmYndy3bzIc7jxIaZOPa6cMYHh9GYmQIQbYADpfW8lnuMYor67lldhoBIhwpr+eOuSMB2F1UxYQhUby07hD3v72DBxdOoK6xmd++t4sZ6XGkxITy4c6jxEcE0+BwUlhRT0xYEPPGD+a6Gak8sGIH2wsquPqsFH588TgSIoJ5/ouDfLqnhNrGZv538TQSIkJwNDtZ/PR6hseHsXRBBnP+sJq6pmYe+vZkRISXvjzIxoPH+cPVmXw7axiV9U1c++Q69hyt4uaz09iSV05OYSVzxw2iqt5BcWU9c8YmcsOs4aTEWlVjx2sa+WhXMXlltUTaAwkODODhVXspq2kkPjyYmkYH9U1OFs9MZf7EJN7cXMAbXxUAEB8ezKafXdThsdmWbiV6EbEBe4CLgHyszsKvc+/7VUTuADKNMbeLyCLgSmPMtSIyHlgGzACGAKuAMcaYM/acrYleqYHH0ezkzc0FnD0qgaFtXDfxVEVtE9Fh1q+8kqoGEiKCTzoxG2MoqWogPiKk9UTW6HBSVtNIUnTH1xfcT/QbXNeHstLiWseXVDWc1MKqpsFBWU0jw+LCTpvfU1X1TXy48yif55YSGxbE7NEJJxVm9h6tar1o3tG1lPZ0N9F/A3jAGHOx6/19AMaY37pN84Frmi9FJBAoAhKBpe7Tuk93pnVqoldKqc45U6L35I6aoUCe2/t817A2pzHGOIAKIN7DeVuCXCIi2SKSXVJS4kFYSimlPNFvbp00xjxljMkyxmQlJnreDlsppdSZeZLoC4Bhbu9TXMPanMZVdRONdVHWk3mVUkr1IE8S/UZgtIiki0gwsAhYcco0K4CbXK+vBj42VuX/CmCRiISISDowGtjgndCVUkp5osNbuIwxDhG5E/gAq3nls8aYHSLyIJBtjFkBPAO8JCK5QBnWyQDXdH8HdgIO4PsdtbhRSinlXXrDlFJK+YDutrpRSik1gGmiV0opH9cvq25EpAQ41MXZE4BjXgzHWzSuzuuvsWlcnaNxdV5XYhtujGmzbXq/TPTdISLZ7dVT9SWNq/P
6a2waV+doXJ3n7di06kYppXycJnqllPJxvpjon+rrANqhcXVef41N4+ocjavzvBqbz9XRK6WUOpkvluiVUkq50USvlFI+zmcSvYjMF5HdIpIrIkv7MI5hIrJaRHaKyA4Rucc1/AERKRCRLa6/S/oovoMist0VQ7ZrWJyIfCgie13/Y3s5prFu+2WLiFSKyH/2xT4TkWdFpFhEvnYb1ub+EcsjrmNum4hM64PY/igiu1zrf1NEYlzD00Skzm3fPdHLcbX72YnIfa59tltELu7luF5zi+mgiGxxDe/N/dVejui548wYM+D/sB62tg8YAQQDW4HxfRRLMjDN9ToSqxvG8cADwI/6wb46CCScMuwPwFLX66XA7/v4sywChvfFPgPOA6YBX3e0f4BLgPcAAWYB6/sgtnlAoOv1791iS3Ofrg/iavOzc30XtgIhQLrre2vrrbhOGf8QcH8f7K/2ckSPHWe+UqKfAeQaY/YbYxqBV4GFfRGIMabQGPOV63UVkEM7vWr1IwuBF1yvXwCu6MNYLgD2GWO6emd0txhj1mA9gdVde/tnIfCisawDYkQkuTdjM8b8y1i9ugGsw+rzoVe1s8/asxB41RjTYIw5AORifX97NS4REeDbWH1a96oz5IgeO858JdF73GVhbxKRNGAqsN416E7XT69ne7t6xI0B/iUim0RkiWvYYGNMoet1ETC4b0IDrEdcu3/5+sM+a2//9Lfj7t+xSn4t0kVks4h8KiLn9kE8bX12/WWfnQscNcbsdRvW6/vrlBzRY8eZryT6fkdEIoDXgf80xlQCjwMjgSlAIdbPxr5wjjFmGrAA+L6InOc+0li/Ffukza1YHdtcDvzDNai/7LNWfbl/zkRE/gurz4e/uQYVAqnGmKnAvcArIhLViyH1u8/uFNdxcoGi1/dXGzmilbePM19J9P2qy0IRCcL6AP9mjHkDwBhz1BjTbIxxAv9HD/1c7YgxpsD1vxh40xXH0Zafgq7/xX0RG9bJ5ytjzFFXjP1in9H+/ukXx52I3Ax8E1jsShC4qkZKXa83YdWFj+mtmM7w2fX5PhOru9NvAa+1DOvt/dVWjqAHjzNfSfSedHfYK1x1f88AOcaY/3Eb7l6ndiXw9anz9kJs4SIS2fIa60Le15zcFeRNwNu9HZvLSaWs/rDPXNrbPyuAG12tImYBFW4/vXuFiMwHfgJcboypdRueKCI21+sRWN147u/FuNr77PpD96IXAruMMfktA3pzf7WXI+jJ46w3rjL3xh/Wlek9WGfi/+rDOM7B+sm1Ddji+rsEeAnY7hq+Akjug9hGYLV42ArsaNlPQDzwEbAXWAXE9UFs4Vgdyke7Dev1fYZ1oikEmrDqQm9tb/9gtYJ4zHXMbQey+iC2XKz625Zj7QnXtFe5PuMtwFfAZb0cV7ufHfBfrn22G1jQm3G5hj8P3H7KtL25v9rLET12nOkjEJRSysf5StWNUkqpdmiiV0opH6eJXimlfJwmeqWU8nGa6JVSysdpoldKKR+niV4ppXzc/werzw3QOTy/pgAAAABJRU5ErkJggg==\n",
953
+ "text/plain": [
954
+ "<Figure size 432x288 with 1 Axes>"
955
+ ]
956
+ },
957
+ "metadata": {
958
+ "needs_background": "light"
959
+ },
960
+ "output_type": "display_data"
961
+ }
962
+ ],
963
+ "source": [
964
+ "# plot the loss and accuracy\n",
965
+ "import matplotlib.pyplot as plt\n",
966
+ "plt.plot(losses)\n",
967
+ "plt.plot(accuracies)\n",
968
+ "plt.legend(['Loss', 'Accuracy'])\n",
969
+ "plt.show()"
970
+ ]
971
+ },
972
+ {
973
+ "cell_type": "markdown",
974
+ "metadata": {
975
+ "id": "30cq1fmqK-my"
976
+ },
977
+ "source": [
978
+ "It looks like the model achieves a very high accuracy and small loss on the training dataset. To see how well it generalizes, let's test on the testing nodes:"
979
+ ]
980
+ },
981
+ {
982
+ "cell_type": "code",
983
+ "execution_count": null,
984
+ "metadata": {
985
+ "colab": {
986
+ "base_uri": "https://localhost:8080/"
987
+ },
988
+ "id": "_6Q354OSLVkG",
989
+ "outputId": "8a86891a-bd2f-4d43-cb4f-d4773c8b6f00"
990
+ },
991
+ "outputs": [
992
+ {
993
+ "name": "stdout",
994
+ "output_type": "stream",
995
+ "text": [
996
+ "Accuracy: 0.7900\n"
997
+ ]
998
+ }
999
+ ],
1000
+ "source": [
1001
+ "# evaluate the model on test set\n",
1002
+ "model.eval()\n",
1003
+ "pred = model(data).argmax(dim=1)\n",
1004
+ "correct = compute_accuracy(pred[data.test_mask], data.y[data.test_mask])\n",
1005
+ "acc = int(correct) / int(data.test_mask.sum())\n",
1006
+ "print(f'Accuracy: {acc:.4f}')"
1007
+ ]
1008
+ },
1009
+ {
1010
+ "cell_type": "markdown",
1011
+ "metadata": {
1012
+ "id": "iwOLBEpgLW3p"
1013
+ },
1014
+ "source": [
1015
+ "Very cool! It seems we got a very nice accuracy for the test as well. Our model is doing okay. There are many ways you can go about trying to improve this model, but we will keep that for another time. Hopefully, with this tutorial you got a glimpse of graph data and how to use PyTorch Geometric to train GNNs on a very popular dataset. "
1016
+ ]
1017
+ },
1018
+ {
1019
+ "cell_type": "markdown",
1020
+ "metadata": {
1021
+ "id": "NapVlc1wH_1Y"
1022
+ },
1023
+ "source": [
1024
+ "Note that I haven't tested if this code works with GPUs. I will leave that as an exercise for the learner. "
1025
+ ]
1026
+ },
1027
+ {
1028
+ "cell_type": "markdown",
1029
+ "metadata": {
1030
+ "id": "ZtRNFO0mL2_b"
1031
+ },
1032
+ "source": [
1033
+ "If you are interested in the full tutorial and more examples, visit the [PyTorch Geomtric documentation](https://pytorch-geometric.readthedocs.io/en/latest/notes/introduction.html) where I adapted the code from. \n",
1034
+ "\n",
1035
+ "Feel free to reach out on [Twitter](https://twitter.com/omarsar0) if you have any further questions."
1036
+ ]
1037
+ }
1038
+ ],
1039
+ "metadata": {
1040
+ "colab": {
1041
+ "name": "Introduction to GNNs with PyTorch Geometric.ipynb",
1042
+ "provenance": []
1043
+ },
1044
+ "kernelspec": {
1045
+ "display_name": "Python 3 (ipykernel)",
1046
+ "language": "python",
1047
+ "name": "python3"
1048
+ },
1049
+ "language_info": {
1050
+ "codemirror_mode": {
1051
+ "name": "ipython",
1052
+ "version": 3
1053
+ },
1054
+ "file_extension": ".py",
1055
+ "mimetype": "text/x-python",
1056
+ "name": "python",
1057
+ "nbconvert_exporter": "python",
1058
+ "pygments_lexer": "ipython3",
1059
+ "version": "3.9.12"
1060
+ }
1061
+ },
1062
+ "nbformat": 4,
1063
+ "nbformat_minor": 1
1064
+ }
11_RoBERTa_Fine_Tuning_Emotion_classification.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
12_Text_Classification_Attention_Positional_Embeddings.ipynb ADDED
@@ -0,0 +1,707 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "id": "432c961b",
6
+ "metadata": {},
7
+ "source": [
8
+ "---\n",
9
+ "title: 13 Text Classification using Transformer\n",
10
+ "description: An implementation of Attention Mechanism and Positional Embeddings on a text classification task\n",
11
+ "---"
12
+ ]
13
+ },
14
+ {
15
+ "cell_type": "markdown",
16
+ "id": "fdfc7c80",
17
+ "metadata": {},
18
+ "source": [
19
+ "<a href=\"https://colab.research.google.com/drive/1Jc-_kcO3xYHDMFYSsIcUimcc38_WFlyh?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
20
+ ]
21
+ },
22
+ {
23
+ "cell_type": "markdown",
24
+ "id": "7noRR_tsvQj6",
25
+ "metadata": {
26
+ "id": "7noRR_tsvQj6"
27
+ },
28
+ "source": [
29
+ "<a href=\"https://www.kaggle.com/code/ritvik1909/text-classification-attention\" target=\"_blank\"><img align=\"left\" alt=\"Kaggle\" title=\"Open in Kaggle\" src=\"https://kaggle.com/static/images/open-in-kaggle.svg\"></a>"
30
+ ]
31
+ },
32
+ {
33
+ "cell_type": "markdown",
34
+ "id": "cd6727ba",
35
+ "metadata": {
36
+ "id": "cd6727ba",
37
+ "papermill": {
38
+ "duration": 0.012957,
39
+ "end_time": "2022-03-31T16:24:21.587262",
40
+ "exception": false,
41
+ "start_time": "2022-03-31T16:24:21.574305",
42
+ "status": "completed"
43
+ },
44
+ "tags": []
45
+ },
46
+ "source": [
47
+ "# Text Classification Using Attention and Positional Embeddings\n",
48
+ "\n",
49
+ "Recently most of the natural language processing tasks are being dominated by the `Transformer` architecture. Transformers were introduced in the paper [Attention Is All You Need](https://arxiv.org/abs/1706.03762), which used a simple mechanism called `Neural Attention` as one of its building blocks. As the title suggests this architecture didn't require any recurrent layer."
50
+ ]
51
+ },
52
+ {
53
+ "cell_type": "code",
54
+ "execution_count": null,
55
+ "id": "a6594f99",
56
+ "metadata": {
57
+ "_cell_guid": "b1076dfc-b9ad-4769-8c92-a6c4dae69d19",
58
+ "_uuid": "8f2839f25d086af736a60e9eeb907d3b93b6e0e5",
59
+ "execution": {
60
+ "iopub.execute_input": "2022-03-31T16:24:21.624487Z",
61
+ "iopub.status.busy": "2022-03-31T16:24:21.623707Z",
62
+ "iopub.status.idle": "2022-03-31T16:24:27.094642Z",
63
+ "shell.execute_reply": "2022-03-31T16:24:27.094004Z",
64
+ "shell.execute_reply.started": "2021-12-19T16:07:29.870867Z"
65
+ },
66
+ "id": "a6594f99",
67
+ "papermill": {
68
+ "duration": 5.494136,
69
+ "end_time": "2022-03-31T16:24:27.094795",
70
+ "exception": false,
71
+ "start_time": "2022-03-31T16:24:21.600659",
72
+ "status": "completed"
73
+ },
74
+ "tags": []
75
+ },
76
+ "outputs": [],
77
+ "source": [
78
+ "import numpy as np\n",
79
+ "import pandas as pd\n",
80
+ "from sklearn.datasets import fetch_20newsgroups\n",
81
+ "from sklearn.model_selection import train_test_split\n",
82
+ "\n",
83
+ "import tensorflow as tf\n",
84
+ "from tensorflow import keras\n",
85
+ "from tensorflow.keras import layers as L\n",
86
+ "from keras.preprocessing import sequence\n",
87
+ "from keras.preprocessing.text import Tokenizer"
88
+ ]
89
+ },
90
+ {
91
+ "cell_type": "markdown",
92
+ "id": "48a57962",
93
+ "metadata": {
94
+ "id": "48a57962",
95
+ "papermill": {
96
+ "duration": 0.011937,
97
+ "end_time": "2022-03-31T16:24:27.118566",
98
+ "exception": false,
99
+ "start_time": "2022-03-31T16:24:27.106629",
100
+ "status": "completed"
101
+ },
102
+ "tags": []
103
+ },
104
+ "source": [
105
+ "We will be using 20 news groups data in our notebooks which comes as a standard dataset in the `scikit-learn` package"
106
+ ]
107
+ },
108
+ {
109
+ "cell_type": "code",
110
+ "execution_count": null,
111
+ "id": "511c1fd2",
112
+ "metadata": {
113
+ "execution": {
114
+ "iopub.execute_input": "2022-03-31T16:24:27.147111Z",
115
+ "iopub.status.busy": "2022-03-31T16:24:27.146301Z",
116
+ "iopub.status.idle": "2022-03-31T16:24:40.349996Z",
117
+ "shell.execute_reply": "2022-03-31T16:24:40.350508Z",
118
+ "shell.execute_reply.started": "2021-12-19T16:07:35.49611Z"
119
+ },
120
+ "id": "511c1fd2",
121
+ "papermill": {
122
+ "duration": 13.220649,
123
+ "end_time": "2022-03-31T16:24:40.350682",
124
+ "exception": false,
125
+ "start_time": "2022-03-31T16:24:27.130033",
126
+ "status": "completed"
127
+ },
128
+ "tags": []
129
+ },
130
+ "outputs": [],
131
+ "source": [
132
+ "dataset = fetch_20newsgroups(subset='all')\n",
133
+ "\n",
134
+ "X = pd.Series(dataset['data'])\n",
135
+ "y = pd.Series(dataset['target'])\n",
136
+ "X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.1, stratify=y, random_state=19)\n",
137
+ "y_train = pd.get_dummies(y_train)\n",
138
+ "y_valid = pd.get_dummies(y_valid)"
139
+ ]
140
+ },
141
+ {
142
+ "cell_type": "markdown",
143
+ "id": "fb0fa502",
144
+ "metadata": {
145
+ "id": "fb0fa502",
146
+ "papermill": {
147
+ "duration": 0.011286,
148
+ "end_time": "2022-03-31T16:24:40.374005",
149
+ "exception": false,
150
+ "start_time": "2022-03-31T16:24:40.362719",
151
+ "status": "completed"
152
+ },
153
+ "tags": []
154
+ },
155
+ "source": [
156
+ "The concept of `Neural Attention` is fairly simple ie not all input information seen by a model is equally important to the task at hand. Although this concept has been utilised at vaious different places as well eg Max Pooling in CNNs, but the kind of attention we are looking for should be `context aware`.\n",
157
+ "\n",
158
+ "The attention mechanism allows output to focus attention on input while producing output while the self-attention model allows inputs to interact with each other i.e calculate attention of all other inputs with respect tt one input.\n",
159
+ "\n",
160
+ "In the paper, the authors proposed another type of attention mechanism called multi-headed attention which refers to the fact that the outer space of the self attention layer gets factored into a set of independent sub-spaces leanred separately, where each subspace is called a \"head\"\n",
161
+ "\n",
162
+ "There is a learnable dense projection present after the multihead attention which enables the layr to actually learn something, as opposed to being a purely stateless transformation.\n",
163
+ "\n"
164
+ ]
165
+ },
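+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "To make the idea concrete, here is a minimal NumPy sketch of (unparameterized) scaled dot-product self-attention. This is purely illustrative: it omits the learned query/key/value projections and the multiple heads used by the Keras layer below.\n",
+ "\n",
+ "```python\n",
+ "import numpy as np\n",
+ "\n",
+ "def self_attention(X):\n",
+ "    # X: (seq_len, dim) token vectors; scores hold pairwise similarities\n",
+ "    scores = X @ X.T / np.sqrt(X.shape[-1])\n",
+ "    weights = np.exp(scores)\n",
+ "    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax\n",
+ "    return weights @ X  # each output row is a context-aware mixture of inputs\n",
+ "```"
+ ]
+ },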
166
+ {
167
+ "cell_type": "code",
168
+ "execution_count": null,
169
+ "id": "cc578b76",
170
+ "metadata": {
171
+ "execution": {
172
+ "iopub.execute_input": "2022-03-31T16:24:40.405951Z",
173
+ "iopub.status.busy": "2022-03-31T16:24:40.399870Z",
174
+ "iopub.status.idle": "2022-03-31T16:24:40.407809Z",
175
+ "shell.execute_reply": "2022-03-31T16:24:40.408219Z",
176
+ "shell.execute_reply.started": "2021-12-19T16:07:46.149062Z"
177
+ },
178
+ "id": "cc578b76",
179
+ "papermill": {
180
+ "duration": 0.022865,
181
+ "end_time": "2022-03-31T16:24:40.408467",
182
+ "exception": false,
183
+ "start_time": "2022-03-31T16:24:40.385602",
184
+ "status": "completed"
185
+ },
186
+ "tags": []
187
+ },
188
+ "outputs": [],
189
+ "source": [
190
+ "class TransformerBlock(L.Layer):\n",
191
+ " def __init__(self, embed_dim, dense_dim, num_heads, **kwargs):\n",
192
+ " super().__init__(**kwargs)\n",
193
+ " self.embed_dim = embed_dim\n",
194
+ " self.dense_dim = dense_dim\n",
195
+ " self.num_heads = num_heads\n",
196
+ " self.attention = L.MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)\n",
197
+ " self.dense_proj = keras.Sequential([L.Dense(dense_dim, activation='relu'), L.Dense(embed_dim)])\n",
198
+ " self.layernorm1 = L.LayerNormalization()\n",
199
+ " self.layernorm2 = L.LayerNormalization()\n",
200
+ " \n",
201
+ " def call(self, inputs, mask=None):\n",
202
+ " if mask is not None:\n",
203
+ " mask = mask[: tf.newaxis, :]\n",
204
+ " attention_output = self.attention(inputs, inputs, attention_mask=mask)\n",
205
+ " proj_input = self.layernorm1(inputs + attention_output)\n",
206
+ " proj_output = self.dense_proj(proj_input)\n",
207
+ " return self.layernorm2(proj_input + proj_output)\n",
208
+ " \n",
209
+ " def get_config(self):\n",
210
+ " config = super().get_confog()\n",
211
+ " config.update({\n",
212
+ " \"embed_dim\": self.embed_dim,\n",
213
+ " \"num_heads\": self.num_heads,\n",
214
+ " \"dense_dim\": self.dense_dim\n",
215
+ " })\n",
216
+ " return config"
217
+ ]
218
+ },
219
+ {
220
+ "cell_type": "markdown",
221
+ "id": "e1cb0ce3",
222
+ "metadata": {
223
+ "id": "e1cb0ce3",
224
+ "papermill": {
225
+ "duration": 0.011173,
226
+ "end_time": "2022-03-31T16:24:40.431117",
227
+ "exception": false,
228
+ "start_time": "2022-03-31T16:24:40.419944",
229
+ "status": "completed"
230
+ },
231
+ "tags": []
232
+ },
233
+ "source": [
234
+ "The idea behind Positional Encoding is fairly simple as well, ie to give the model access to token order information, therefore we are going to add the token's position in the sentence to each word embedding\n",
235
+ "\n",
236
+ "Thus, one input word embedding will have to components: the usual token vector representing the token independent of any specific context, and a position vector representing the position of the token in the current sequence."
237
+ ]
238
+ },
239
+ {
240
+ "cell_type": "code",
241
+ "execution_count": null,
242
+ "id": "2db9b3dd",
243
+ "metadata": {
244
+ "execution": {
245
+ "iopub.execute_input": "2022-03-31T16:24:40.461368Z",
246
+ "iopub.status.busy": "2022-03-31T16:24:40.460599Z",
247
+ "iopub.status.idle": "2022-03-31T16:24:40.462951Z",
248
+ "shell.execute_reply": "2022-03-31T16:24:40.462566Z",
249
+ "shell.execute_reply.started": "2021-12-19T16:07:46.137032Z"
250
+ },
251
+ "id": "2db9b3dd",
252
+ "papermill": {
253
+ "duration": 0.020698,
254
+ "end_time": "2022-03-31T16:24:40.463070",
255
+ "exception": false,
256
+ "start_time": "2022-03-31T16:24:40.442372",
257
+ "status": "completed"
258
+ },
259
+ "tags": []
260
+ },
261
+ "outputs": [],
262
+ "source": [
263
+ "class PositionalEmbedding(L.Layer):\n",
264
+ " def __init__(self, sequence_length, input_dim, output_dim, **kwargs):\n",
265
+ " super().__init__(**kwargs)\n",
266
+ " self.token_embeddings = L.Embedding(input_dim, output_dim)\n",
267
+ " self.position_embeddings = L.Embedding(sequence_length, output_dim)\n",
268
+ " self.sequence_length = sequence_length\n",
269
+ " self.input_dim = input_dim\n",
270
+ " self.output_dim = output_dim\n",
271
+ " \n",
272
+ " def call(self, inputs):\n",
273
+ " length = tf.shape(inputs)[-1]\n",
274
+ " positions = tf.range(start=0, limit=length, delta=1)\n",
275
+ " embedded_tokens = self.token_embeddings(inputs)\n",
276
+ " embedded_positions = self.position_embeddings(positions)\n",
277
+ " return embedded_tokens + embedded_positions\n",
278
+ " \n",
279
+ " def get_config(self):\n",
280
+ " config = super().get_config()\n",
281
+ " config.update({\n",
282
+ " \"output_dim\": self.output_dim,\n",
283
+ " \"sequence_length\": self.sequence_length,\n",
284
+ " \"input_dim\": self.input_dim,\n",
285
+ " })\n",
286
+ " return config"
287
+ ]
288
+ },
289
+ {
290
+ "cell_type": "markdown",
291
+ "id": "afde9c93",
292
+ "metadata": {
293
+ "id": "afde9c93",
294
+ "papermill": {
295
+ "duration": 0.011052,
296
+ "end_time": "2022-03-31T16:24:40.485320",
297
+ "exception": false,
298
+ "start_time": "2022-03-31T16:24:40.474268",
299
+ "status": "completed"
300
+ },
301
+ "tags": []
302
+ },
303
+ "source": [
304
+ "Here we define some contants to parameterize the model"
305
+ ]
306
+ },
307
+ {
308
+ "cell_type": "code",
309
+ "execution_count": null,
310
+ "id": "280fe4c7",
311
+ "metadata": {
312
+ "execution": {
313
+ "iopub.execute_input": "2022-03-31T16:24:40.510644Z",
314
+ "iopub.status.busy": "2022-03-31T16:24:40.509859Z",
315
+ "iopub.status.idle": "2022-03-31T16:24:40.513940Z",
316
+ "shell.execute_reply": "2022-03-31T16:24:40.513540Z",
317
+ "shell.execute_reply.started": "2021-12-19T16:07:46.179928Z"
318
+ },
319
+ "id": "280fe4c7",
320
+ "papermill": {
321
+ "duration": 0.017492,
322
+ "end_time": "2022-03-31T16:24:40.514057",
323
+ "exception": false,
324
+ "start_time": "2022-03-31T16:24:40.496565",
325
+ "status": "completed"
326
+ },
327
+ "tags": []
328
+ },
329
+ "outputs": [],
330
+ "source": [
331
+ "vocab_size = 10_000\n",
332
+ "embed_dim = 256\n",
333
+ "num_heads = 2\n",
334
+ "dense_dim = 32\n",
335
+ "seq_length = 256"
336
+ ]
337
+ },
338
+ {
339
+ "cell_type": "markdown",
340
+ "id": "9ecb3a8d",
341
+ "metadata": {
342
+ "id": "9ecb3a8d",
343
+ "papermill": {
344
+ "duration": 0.011166,
345
+ "end_time": "2022-03-31T16:24:40.536738",
346
+ "exception": false,
347
+ "start_time": "2022-03-31T16:24:40.525572",
348
+ "status": "completed"
349
+ },
350
+ "tags": []
351
+ },
352
+ "source": [
353
+ "The input texts are here tokenized and padded to a uniform sequence length"
354
+ ]
355
+ },
356
+ {
357
+ "cell_type": "code",
358
+ "execution_count": null,
359
+ "id": "17d66cfc",
360
+ "metadata": {
361
+ "execution": {
362
+ "iopub.execute_input": "2022-03-31T16:24:40.571810Z",
363
+ "iopub.status.busy": "2022-03-31T16:24:40.566381Z",
364
+ "iopub.status.idle": "2022-03-31T16:24:49.253838Z",
365
+ "shell.execute_reply": "2022-03-31T16:24:49.254595Z",
366
+ "shell.execute_reply.started": "2021-12-19T16:07:46.189685Z"
367
+ },
368
+ "id": "17d66cfc",
369
+ "papermill": {
370
+ "duration": 8.706763,
371
+ "end_time": "2022-03-31T16:24:49.254778",
372
+ "exception": false,
373
+ "start_time": "2022-03-31T16:24:40.548015",
374
+ "status": "completed"
375
+ },
376
+ "tags": []
377
+ },
378
+ "outputs": [],
379
+ "source": [
380
+ "tokenizer = Tokenizer(num_words=vocab_size, oov_token='<unw>')\n",
381
+ "tokenizer.fit_on_texts(X_train)\n",
382
+ "X_train = tokenizer.texts_to_sequences(X_train)\n",
383
+ "X_train = sequence.pad_sequences(X_train, maxlen=seq_length)\n",
384
+ "X_valid = tokenizer.texts_to_sequences(X_valid)\n",
385
+ "X_valid = sequence.pad_sequences(X_valid, maxlen=seq_length)"
386
+ ]
387
+ },
388
+ {
389
+ "cell_type": "markdown",
390
+ "id": "6f65487a",
391
+ "metadata": {
392
+ "id": "6f65487a",
393
+ "papermill": {
394
+ "duration": 0.01132,
395
+ "end_time": "2022-03-31T16:24:49.277897",
396
+ "exception": false,
397
+ "start_time": "2022-03-31T16:24:49.266577",
398
+ "status": "completed"
399
+ },
400
+ "tags": []
401
+ },
402
+ "source": [
403
+ "**Defining the model**\n",
404
+ "The model architecture is fairly simple ie,:\n",
405
+ "* Input Layer\n",
406
+ "* Positional Embeddings\n",
407
+ "* Transformer Block\n",
408
+ "* Pooling\n",
409
+ "* Dropout\n",
410
+ "* Output Layer"
411
+ ]
412
+ },
413
+ {
414
+ "cell_type": "code",
415
+ "execution_count": null,
416
+ "id": "141d4968",
417
+ "metadata": {
418
+ "execution": {
419
+ "iopub.execute_input": "2022-03-31T16:24:49.307177Z",
420
+ "iopub.status.busy": "2022-03-31T16:24:49.306557Z",
421
+ "iopub.status.idle": "2022-03-31T16:24:52.014778Z",
422
+ "shell.execute_reply": "2022-03-31T16:24:52.015656Z",
423
+ "shell.execute_reply.started": "2021-12-19T16:07:54.105332Z"
424
+ },
425
+ "id": "141d4968",
426
+ "papermill": {
427
+ "duration": 2.726304,
428
+ "end_time": "2022-03-31T16:24:52.015828",
429
+ "exception": false,
430
+ "start_time": "2022-03-31T16:24:49.289524",
431
+ "status": "completed"
432
+ },
433
+ "tags": []
434
+ },
435
+ "outputs": [],
436
+ "source": [
437
+ "inputs = keras.Input(shape=(None, ), dtype=\"int64\")\n",
438
+ "x = PositionalEmbedding(seq_length, vocab_size, embed_dim)(inputs)\n",
439
+ "x = TransformerBlock(embed_dim, dense_dim, num_heads)(x)\n",
440
+ "x = L.GlobalMaxPooling1D()(x)\n",
441
+ "x = L.Dropout(0.5)(x)\n",
442
+ "outputs = L.Dense(20, activation='softmax')(x)\n",
443
+ "\n",
444
+ "model = keras.Model(inputs, outputs)\n",
445
+ "model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])"
446
+ ]
447
+ },
448
+ {
449
+ "cell_type": "code",
450
+ "execution_count": null,
451
+ "id": "d1c00f20",
452
+ "metadata": {
453
+ "colab": {
454
+ "base_uri": "https://localhost:8080/"
455
+ },
456
+ "execution": {
457
+ "iopub.execute_input": "2022-03-31T16:24:52.047519Z",
458
+ "iopub.status.busy": "2022-03-31T16:24:52.046116Z",
459
+ "iopub.status.idle": "2022-03-31T16:24:52.050660Z",
460
+ "shell.execute_reply": "2022-03-31T16:24:52.051090Z",
461
+ "shell.execute_reply.started": "2021-12-19T16:07:57.059649Z"
462
+ },
463
+ "id": "d1c00f20",
464
+ "outputId": "bd62e4b1-9855-417c-ba4c-3421498659be",
465
+ "papermill": {
466
+ "duration": 0.022404,
467
+ "end_time": "2022-03-31T16:24:52.051216",
468
+ "exception": false,
469
+ "start_time": "2022-03-31T16:24:52.028812",
470
+ "status": "completed"
471
+ },
472
+ "tags": []
473
+ },
474
+ "outputs": [
475
+ {
476
+ "name": "stdout",
477
+ "output_type": "stream",
478
+ "text": [
479
+ "Model: \"model\"\n",
480
+ "_________________________________________________________________\n",
481
+ " Layer (type) Output Shape Param # \n",
482
+ "=================================================================\n",
483
+ " input_1 (InputLayer) [(None, None)] 0 \n",
484
+ " \n",
485
+ " positional_embedding (Posit (None, None, 256) 2625536 \n",
486
+ " ionalEmbedding) \n",
487
+ " \n",
488
+ " transformer_block (Transfor (None, None, 256) 543776 \n",
489
+ " merBlock) \n",
490
+ " \n",
491
+ " global_max_pooling1d (Globa (None, 256) 0 \n",
492
+ " lMaxPooling1D) \n",
493
+ " \n",
494
+ " dropout (Dropout) (None, 256) 0 \n",
495
+ " \n",
496
+ " dense_2 (Dense) (None, 20) 5140 \n",
497
+ " \n",
498
+ "=================================================================\n",
499
+ "Total params: 3,174,452\n",
500
+ "Trainable params: 3,174,452\n",
501
+ "Non-trainable params: 0\n",
502
+ "_________________________________________________________________\n"
503
+ ]
504
+ }
505
+ ],
506
+ "source": [
507
+ "model.summary()"
508
+ ]
509
+ },
510
+ {
511
+ "cell_type": "code",
512
+ "execution_count": null,
513
+ "id": "220efe0c",
514
+ "metadata": {
515
+ "execution": {
516
+ "iopub.execute_input": "2022-03-31T16:24:52.079868Z",
517
+ "iopub.status.busy": "2022-03-31T16:24:52.079195Z",
518
+ "iopub.status.idle": "2022-03-31T16:24:52.081785Z",
519
+ "shell.execute_reply": "2022-03-31T16:24:52.081377Z",
520
+ "shell.execute_reply.started": "2021-12-19T16:07:57.071427Z"
521
+ },
522
+ "id": "220efe0c",
523
+ "papermill": {
524
+ "duration": 0.018413,
525
+ "end_time": "2022-03-31T16:24:52.081887",
526
+ "exception": false,
527
+ "start_time": "2022-03-31T16:24:52.063474",
528
+ "status": "completed"
529
+ },
530
+ "tags": []
531
+ },
532
+ "outputs": [],
533
+ "source": [
534
+ "es = keras.callbacks.EarlyStopping(verbose=1, patience=5, restore_best_weights=True)\n",
535
+ "rlp = keras.callbacks.ReduceLROnPlateau(patience=3, verbose=1)"
536
+ ]
537
+ },
538
+ {
539
+ "cell_type": "code",
540
+ "execution_count": null,
541
+ "id": "1c8d14d0",
542
+ "metadata": {
543
+ "colab": {
544
+ "base_uri": "https://localhost:8080/"
545
+ },
546
+ "execution": {
547
+ "iopub.execute_input": "2022-03-31T16:24:52.108981Z",
548
+ "iopub.status.busy": "2022-03-31T16:24:52.108220Z",
549
+ "iopub.status.idle": "2022-03-31T16:26:11.847579Z",
550
+ "shell.execute_reply": "2022-03-31T16:26:11.846932Z",
551
+ "shell.execute_reply.started": "2021-12-19T16:07:57.083924Z"
552
+ },
553
+ "id": "1c8d14d0",
554
+ "outputId": "89557c96-d6e9-46b1-acb3-f62258084fe2",
555
+ "papermill": {
556
+ "duration": 79.753721,
557
+ "end_time": "2022-03-31T16:26:11.847749",
558
+ "exception": false,
559
+ "start_time": "2022-03-31T16:24:52.094028",
560
+ "status": "completed"
561
+ },
562
+ "tags": []
563
+ },
564
+ "outputs": [
565
+ {
566
+ "name": "stdout",
567
+ "output_type": "stream",
568
+ "text": [
569
+ "Epoch 1/100\n",
570
+ "531/531 [==============================] - 38s 61ms/step - loss: 2.4192 - accuracy: 0.3540 - val_loss: 0.8035 - val_accuracy: 0.7676 - lr: 0.0010\n",
571
+ "Epoch 2/100\n",
572
+ "531/531 [==============================] - 31s 59ms/step - loss: 0.6457 - accuracy: 0.8065 - val_loss: 0.4755 - val_accuracy: 0.8653 - lr: 0.0010\n",
573
+ "Epoch 3/100\n",
574
+ "531/531 [==============================] - 32s 60ms/step - loss: 0.2459 - accuracy: 0.9263 - val_loss: 0.4804 - val_accuracy: 0.8679 - lr: 0.0010\n",
575
+ "Epoch 4/100\n",
576
+ "531/531 [==============================] - 32s 59ms/step - loss: 0.1157 - accuracy: 0.9647 - val_loss: 0.5709 - val_accuracy: 0.8706 - lr: 0.0010\n",
577
+ "Epoch 5/100\n",
578
+ "530/531 [============================>.] - ETA: 0s - loss: 0.0670 - accuracy: 0.9793\n",
579
+ "Epoch 5: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
580
+ "531/531 [==============================] - 31s 59ms/step - loss: 0.0670 - accuracy: 0.9793 - val_loss: 0.6382 - val_accuracy: 0.8573 - lr: 0.0010\n",
581
+ "Epoch 6/100\n",
582
+ "531/531 [==============================] - 32s 59ms/step - loss: 0.0261 - accuracy: 0.9927 - val_loss: 0.5705 - val_accuracy: 0.8785 - lr: 1.0000e-04\n",
583
+ "Epoch 7/100\n",
584
+ "530/531 [============================>.] - ETA: 0s - loss: 0.0144 - accuracy: 0.9968Restoring model weights from the end of the best epoch: 2.\n",
585
+ "531/531 [==============================] - 32s 59ms/step - loss: 0.0144 - accuracy: 0.9968 - val_loss: 0.5800 - val_accuracy: 0.8801 - lr: 1.0000e-04\n",
586
+ "Epoch 7: early stopping\n"
587
+ ]
588
+ }
589
+ ],
590
+ "source": [
591
+ "history = model.fit(X_train, y_train, validation_data=(X_valid, y_valid),\n",
592
+ " callbacks=[es, rlp], epochs=100\n",
593
+ ")"
594
+ ]
595
+ },
596
+ {
597
+ "cell_type": "code",
598
+ "execution_count": null,
599
+ "id": "20fdf465",
600
+ "metadata": {
601
+ "colab": {
602
+ "base_uri": "https://localhost:8080/",
603
+ "height": 786
604
+ },
605
+ "execution": {
606
+ "iopub.execute_input": "2022-03-31T16:26:12.630165Z",
607
+ "iopub.status.busy": "2022-03-31T16:26:12.629441Z",
608
+ "iopub.status.idle": "2022-03-31T16:26:15.973619Z",
609
+ "shell.execute_reply": "2022-03-31T16:26:15.974083Z",
610
+ "shell.execute_reply.started": "2021-12-19T16:09:17.294329Z"
611
+ },
612
+ "id": "20fdf465",
613
+ "outputId": "83f8ec40-9ce5-455e-d0c4-132492c2e930",
614
+ "papermill": {
615
+ "duration": 3.759336,
616
+ "end_time": "2022-03-31T16:26:15.974233",
617
+ "exception": false,
618
+ "start_time": "2022-03-31T16:26:12.214897",
619
+ "status": "completed"
620
+ },
621
+ "tags": []
622
+ },
623
+ "outputs": [
624
+ {
625
+ "data": {
626
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAABIEAAAMCCAYAAAD+tdY5AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOzdd3SUddr/8c+dSe8JpEEIkRYgIGVZkAVBQlOkBBUF1rpiWQUeKYoIigqooICCDXt7xP1ZACkqCIiiCAIBJLSAtAAJARIgfTKZ3x9IHrMQAqTck5n365w9ZzNzzz3XnVzHo5/z/V5fw2632wUAAAAAAACn5mZ2AQAAAAAAAKh6hEAAAAAAAAAugBAIAAAAAADABRACAQAAAAAAuABCIAAAAAAAABdACAQAAAAAAOACCIEAAIDpHn/8ccXFxWnOnDlmlwIAAOC03M0uAAAAV/X4449r/vz5at++vT7++GOzy0EFpKen6/PPP9cvv/yigwcPKisrS15eXoqKilKrVq104403qmPHjjIMw+xSAQCACyMEAgAApgsLC9NVV12lkJAQs0u5LHa7XW+88YbefPNNFRQUlLweGBio/Px8paSkKCUlRV988YVatmypV199VZGRkSZWDAAAXBkhEAAAMN2YMWM0ZswYs8u4bBMmTNCXX34pSercubPuuecetWvXTt7e3pKko0ePatWqVXr//ff1+++/68CBA4RAAADANIRAAAAAV+Czzz4rCYBGjBih4cOHn3dNVFSUhg4dqltvvVWzZ8+WmxvjGAEAgHn4NxEAAGqg3bt3a/z48UpISFDLli3Vrl07DR48WPPmzZPVar3gZ/bt26dXX31Vd955Z6nP3XrrrXrvvfeUn59/wc999dVXiouL0x133CFJ+vrrr3X77berQ4cOiouL0/fffy9JSkhIUFxcnNatW6esrCw9//zzSkhIUIsWLXTttddq4sSJOnbs2AW/o6zB0KmpqYqLi1NcXFzJc48aNUqdOnVSy5Ytdf311+u1115TYWFhmb+r/Px8zZkzR71791bLli3VuXNnjRo1Srt37z7v/peqoKBAs2fPliR169btggHQX7m7u2v06NFq165dyWtz5sxRXFycHn/88TI/V9bvZd26dYqLi1NCQoIkafXq1Ro2bJg6duyopk2b6oMPPtDEiRMVFxenkSNHXrS2uXPnKi4uTomJiRd8f+XKlfr3v/+tTp06qUWLFurYsaMefPBB/fTTTxe9LwAAcDysBAIAoIb55JNPNHXqVBUXF0uSfH19lZubq6SkJCUlJWnp0qV666235OPjU+pzY8aMUXJysiTJy8tLvr6+OnXqlLZs2aItW7ZoyZIl+vDDD+Xv71/md0+ZMkUff/yx3NzcFBAQcMGVLWlpaRo/frwOHz4sHx8fGYahY8eOlQxOnj9/voKCgi77udesWaOHH35Y+fn5CggIUFFRkfbt26fZs2crOTlZr7/++nmfOXPmjO66666S5/bw8FBeXp6WLl2qH374Qc8+++xl1yFJy5Yt04kTJyRJDz300CV/rioGQ7/33nuaNm2aDMMo9Tfp27evPv/8c/3www/Kzs4u8++6ePHikuv/ymq1avz48Vq0aFHJa/7+/jp58qRWrVqlVatWadiwYXr00Ucr/ZkAAEDVYCUQAAA1yPfff6/JkyfLx8dHjz76qNauXaukpCRt3rxZ77zzjmJjY7V+/Xo9//zz5322VatWmjJlilauXKmtW7dq3bp12rp1q9544w3FxsZq27ZtmjFjRpnfvW3bNn3yyScaMWKE1q1bp/Xr1+u3335TmzZtSl03ZcoUBQYG6rPPPtPmzZuVlJSk119/XYGBgTp8+LDmzp17Rc8+atQodevWTStWrNCGDRu0ceNGjRkzRoZhaMWKFVq9evV5n5kyZYqSk5Pl6+ur6dOnKykpSRs3btTixYvVpEmTKw6B1q1bJ0mqXbu2rr766iu6R2U4fvy4XnrpJQ0dOlRr1qzRb7/9pqSkJF1//fVq3769IiIiVFBQoOXLl1/w8ykpKdq9e7cMwzgvBHrxxRe1aNEi1a9fXy+//HLJ727jxo2aNGmS/Pz89M4775SESAAAwPERAgEAUEPYbDY999xzkqRXXnlFw4YNU2hoqCTJ09NT1157rd5++235+Pjoyy+/PG/r1aRJkzRo0CDVrVu35DVPT08lJCTonXfekbu7u+bPn6+8vLwLfn9ubq7uv/9+DR8+XIGBgZLOrgypVatWqes8PT31/vvvl4RD7u7u6t69u/79739Lkr777rsrev6WLVtq1qxZio6OlnR2BdT999+v6667TpL07bfflrr+0KFDWrhwoSTp2Wef1YABA+Th4SFJaty4sd555x15eXldUS179+6VJDVt2vSKPl9ZCgoKdMMNN2jSpEmqXbu2pLOrvCIjI+Xm5qY+ffpIUplBzbnX27VrV2pg9f79+/XRRx8pNDRUH374oW644Qb5+vpKOvs3Hzp0qCZPnixJevPNN6vs+QAAQOUiBAIAoIZYv369Dh8+rCZNmujaa6+94DUxMTFq1aqVioqKtH79+ku+d7169dSoUSPl5eVpx44dF7zGYrHo7rvvLvdet9566wWPeu/Ro4eks3N+cnNzL7m2c+67774Lbqfq3r27pLOrWv5q+fLlstvtioqKOm+ViyQFBARo8ODBl12HJGVlZUnSFW1rq2z33ntvme+de+61a9eWbF/7q6VLl5a67pwFCxbIbrerT58+ioqKuuC9e/fuLU9PT6WkpJQ56wkAADgWZgIBAFBDbNq0SdLZVRqdOnUq87ozZ85IOns8+X/7+eef9eWXX2rr1q3KyMi44DDosv6DPiYmpmTl0cW0bNnygq9HRESUqvHcypJLVd59T58+Xer17du3S5Latm1b5iyevw5qrom8vb0vuhqpRYsWuuqqq7Rv3z598803uv3220ve27Jliw4ePCgPDw9df/31pT6XlJQkSZo/f/55K6z+qqioSNLZOVDh4eEVeRQAAFANCIEAAKghMjIyJEmFhYU6fvx4udf/d8BzbqjzOR4eHgoODpa7+9l/HTh16pSsVmuZ28EuJQCSJD8/vwu+/tetV2WdYHYxZQ02Pnffc4HEOZmZmZKksLCwMu95pcFFcHCwpLO/MzMFBweXe+x83759NWfOHC1evLhUCHRuK1jnzp1Lnuecc72Wk5OjnJyccusoq2cAAIBjIQQCAKCGOHcaWPfu3S94EtbFrF69Wh9//LEsFoseeugh9e/fX/Xq1Su1Qmbo0KHauHGj7Hb7Be9hsViuvHgn07BhQ23atEk7d+40tY5L+ZucC4E2b96s1NRURUdHq7i4WN98803J+//tXK+NHz/+krYAAgCAmoGZQAAA1BDnBv9eaJtXec5t6bnllls0fPhwxcTEnLdF6kIzY2qyc3OJzq1quZCLvXcxHTp0kHT2dK6tW7de0T3OBTgFBQVlXnNua19FxMbGqkWLFrLb7SUzgNatW6eMjAz5+vqWzFT6q4r
0GgAAcFyEQAAA1BCtW7eWJO3atUvp6emX9dlz1zdv3vyC7x8+fFgHDhyoWIEO5tyzbtq0qczVTRs2bLiie/fs2bNke9zlrMr6ax3nTlhLS0sr89rk5OQrqu+/9evXT5K0aNEiSdKSJUskSQkJCfLx8Tnv+nO99tNPP1XK9wMAAMdACAQAQA3RsWNHRUVFyWazafr06Re99r9n1Zybp7N79+4LXj9z5swyg5KaqkePHjIMQ0ePHi3Z+vRX2dnZ+uyzz67o3t7e3hoxYoQkadWqVXrttdcuen1RUZFmzpxZKnRq0qSJJOn333+/4DDur7/+utJW4vTp00dubm7avXu3tm/frmXLlkn6v3DovyUmJsowDO3du7fc35HZc5EAAMClIwQCAMBkVqtVJ0+evOj/rFarPDw89OSTT8owDC1evFgPPfRQqePcrVarfv/9d02fPv28LT7nThP7z3/+oy+++EKFhYWSpCNHjmjcuHFasmSJQxx3XpliYmJKQo4JEyZo0aJFJcOj9+zZo2HDhl3wdLRLNXToUCUmJkqSZs+erXvvvVc///xzqe1daWlpmjdvnvr06aO5c+eWzNqRzp5aFh4eLqvVqjFjxujQoUOSzg5Z/uyzz/Tkk09W2t8kPDxc7du3lyRNnDhRp06dUnBwcJmnzDVq1KhkFtAzzzyjGTNmlFqxlJ2drTVr1mjs2LH6n//5n0qpEQAAVD0GQwMAYLKkpCR17Njxotd89NFH6tChg7p3766pU6dq0qRJWrFihVasWCFvb295e3vrzJkzstlsF/z8wIED9dVXX2nz5s2aMGGCnnrqKfn5+ZUcqz5y5Ej9+uuvWr9+faU/n5mefPJJpaSkaMeOHRo7dqyeeOIJeXl5lRxRP3nyZI0ZM0YeHh5XdP8XXnhB9erV09y5c7VmzRqtWbNGhmEoMDBQ+fn5pQKhtm3bKjY2tuRnd3d3PfXUUxo5cqTWr1+vHj16yN/fX/n5+SoqKtLNN9+s4uJizZ8/v6K/BklnB0D/+uuvJVvMrr/++os+96OPPqr8/HzNmzdPb731lt566y35+/vLMAxlZ2eXrBw7Fy4BAADHx0ogAABqmJtvvlnffvut7rrrLjVu3Fhubm7Kzs5WcHCw2rdvrxEjRpQMgj7H09NT77//vu6//37Vq1dPbm5uslgs6tSpk9588009/PDDJj1N1QoMDNS8efP00EMPqX79+rLb7fLy8lLfvn31xRdfqGHDhiXXXQnDMDR8+HAtX75cw4cPV5s2bRQaGqrc3Fy5u7urcePGuvXWW/XRRx9p3rx5ioiIKPX5nj176r333lOHDh3k5+en4uJiNW3aVFOnTtVzzz1X4ef/q969e8vT07Pk57K2gp1jsVj09NNP69NPP1X//v1Vt25dFRYWqqCgQHXq1FFCQoKeeuopzZ49u1LrBAAAVcewO9sAAAAAgEv0+eefa+LEiWrfvr0+/vhjs8sBAACoUqwEAgAALqmwsFAfffSRJOkf//iHydUAAABUPUIgAADgtI4cOaLx48drw4YNys3NlSQVFxdr69atuvfee7V7924FBARo0KBBJlcKAABQ9dgOBgAAnNaBAwfUq1evkp8DAwNVUFBQMrDZy8tLr7zyirp162ZWiQAAANWGEAgAADitc8etr1mzRvv27dPJkydlt9sVGRmpDh066F//+lepE7sAAACcGSEQAAAAAACAC2AmEAAAAAAAgAsgBAIAAAAAAHABhEAAAAAAAAAugBAIAAAAAADABRACAQAAAAAAuABCIAAAAAAAABdACAQAAAAAAOACCIEAAAAAAABcACEQAAAAAACACyAEAgAAAAAAcAGEQAAAAAAAAC6AEAgAAAAAAMAFEAIBAAAAAAC4AEIgAAAAAAAAF0AIBAAAAAAA4AIIgQAAAAAAAFwAIRAAAAAAAIALIAQCAAAAAABwAYRAAAAAAAAALoAQCAAAAAAAwAUQAgEAAAAAALgAQiAAAAAAAAAXQAgEAAAAAADgAgiBAAAAAAAAXAAhEAAAAAAAgAsgBAIAAAAAAHABhEAAAAAAAAAugBAIAAAAAADABRACAQAAAAAAuABCIAAAAAAAABdACAQAAAAAAOACCIEAAAAAAABcACEQAAAAAACACyAEAgAAAAAAcAGEQAAAAAAAAC6AEAgAAAAAAMAFEAIBAAAAAAC4AEIgAAAAAAAAF0AIBAAAAAAA4AIIgQAAAAAAAFwAIRAAAAAAAIALIAQCAAAAAABwAYRAAAAAAAAALoAQCAAAOK1169YpLi5O7777rtmlAAAAmI4QCAAAAAAAwAUQAgEAAAAAALgAd7MLAAAAMNtvv/2m119/XVu3bpXValXDhg01dOhQDRo0qNR1KSkpmjNnjpKSkpSZmamgoCA1aNBA9957r6677jpJUkFBgd566y0tXrxYaWlp8vDwUFRUlDp37qxx48aZ8HQAAABnEQIBAACXtnLlSg0fPly1a9fWPffcI39/fy1ZskQTJ05UamqqRo0aJUnKzMzUXXfdJUkaPHiw6tSpo8zMTG3btk1btmwpCYGeeeYZffnll0pMTFSbNm1ks9m0f/9+rVu3zqxHBAAAkEQIBAAAXJjNZtPkyZPl6+urzz//XBEREZKkoUOH6s4779Rbb72lgQMHKjY2Vps2bdKJEyc0a9Ys9enTp8x7fv/99+rSpYumTZtWXY8BAABwSZgJBAAAXFZycrKOHDmim2++uSQAkiRPT08NGzZMxcXFWrFihSQpICBAkvTTTz8pOzu7zHv6+/trz5492r17d9UWDwAAcJkIgQAAgMtKTU2VJDVq1Oi89xo3bixJOnTokCSpffv2SkxM1FdffaVrrrlGgwcP1uzZs7Vnz55Sn3viiSd06tQp9evXTz169NCECRP0/fffq7i4uIqfBgAA4OIIgQAAAC7RtGnTtGjRIj3yyCMKDg7W+++/r/79++uTTz4puaZHjx5auXKlpk+frmuuuUZr167Vww8/rDvuuEOFhYUmVg8AAFwdIRAAAHBZ0dHRknTeap6/vlavXr1Srzdp0kTDhg3Tm2++qdWrV6tevXqaMWOG7HZ7yTXBwcEaMGCApkyZohUrVmjYsGHasGFDydYyAAAAMxACAQAAlxUfH686deroq6++UkZGRsnrVqtV7777rgzDUPfu3SVJWVlZ523pCgwMVHR0tPLy8lRQUCCbzabTp0+XusYwDDVv3lySdOrUqSp+IgAAgLJxOhgAAHB6a9euVUFBwXmvh4SE6Mknn9Tw4cN1yy236NZbb5Wfn5+++eYbbd68WQ8++KBiY2MlSQsWLNCHH36oHj16qH79+nJ3d9dvv/2mNWvW6IYbbpC3t7dOnz6tzp07KyEhQc2bN1doaKhSU1M1b948BQUFqVu3btX85AAAAP/HsP917TIAAIATWbdune68884y37/qqqv07bffav369XrjjTe0ZcsWWa1WNWzYUP/85z81aNCgkmt37NihDz74QJs2bVJGRobc3NwUHR2txMRE3X777fL09FRhYaHmzJmjtWvX6tChQ8rJyVF4eL
g6dOigBx54oCRQAgAAMAMhEAAAAAAAgAtgJhAAAAAAAIALIAQCAAAAAABwAYRAAAAAAAAALoAQCAAAAAAAwAWYdkR8cXGxbDbnmEltsRhO8yxwTvQoHB09CkdHj8LR0aNwdPQoHJ0z9aiHh6XM98oNgY4eParHHntMJ06ckGEYuvXWW3XXXXeVumbdunV66KGHFB0dLUnq2bOnhg8fftH72mx2ZWXlXkr9Di842NdpngXOiR6Fo6NH4ejoUTg6ehSOjh6Fo3OmHg0LCyjzvXJDIIvFoscff1zx8fHKzs7WzTffrE6dOqlRo0alrmvXrp3mzp1b8WoBAAAAAABQ6cqdCRQeHq74+HhJkr+/vxo0aKD09PQqLwwAAAAAAACV57IGQ6empmrHjh1q1arVee9t3rxZ/fv317Bhw5SSklJpBQIAAAAAAKDiDLvdfkmTj3JycnTHHXfowQcfVK9evUq9l52dLcMw5Ofnp9WrV2vq1KlatmzZRe/nXIOh3WSzFZtdBlAmehSOjh6Fo6NH4ejoUTg6ehSXo6jIqsOHD6ugoECXGFlUmGEY1fZdlcEwDHl5ealu3bpyd/co9d7FBkNfUghktVr14IMPqnPnzrrnnnvKLSYhIUFffPGFQkNDL3JPm9MMXXKmAVJwTvQoHB09CkdHj8LR0aNwdPQoLsfx40fl7e0rP79AGYZRLd9Z04JKu92unJzTys/PVe3aUaXeu9hg6HK3g9ntdk2YMEENGjQoMwDKyMgoScy2bt2q4uJihYSEXE79AAAAAAAAKioqrNYAqCY6uxsrUEVFhZf1uXJPB9u4caMWLlyoJk2aaMCAAZKk0aNH68iRI5KkIUOG6LvvvtO8efNksVjk7e2tmTNn8scCAAAAAABXhEyhfFfyOyo3BGrXrp127dp10Wtuv/123X777Zf95QAAAAAAAKgel3U6GAAAAAAAgLPr2fNas0uoEoRAAAAAAAAALqDc7WAAAAAAAACuyG636/XXZ+vXX3+WYRi666571b17Lx0/flyTJo1XTk6ObLYijR07Xi1aXK0XXpisnTu3yzAM3Xhjf9122z/NfoRSCIEq6OPfDinQ30sDmoWbXQoAAAAAAE5lSXK6vt6WVqn37N8iUjfGR1zStatXr1RKyi598ME8nTqVpWHD7lSrVm21fPm3at/+Gt11172y2WwqKMhXSspuZWQc08cf/z9J0pkzZyq17srAdrAKysqzasrSnVpUyU0JAAAAAADMtXXrZvXo0VsWi0WhobXUpk1b7dyZrGbNmmvp0kV69925+uOPPfL19VOdOnV15MhhzZo1Xb/++ov8/PzMLv88rASqoH93itWeE3l6bnmK6gR562/1gs0uCQAAAAAAp3BjfMQlr9qpTq1bt9Vrr72tX35Zo6lTn9Fttw3VDTf01QcfzNP69Wu1cOGXWrlyuZ54YpLZpZbCSqAKcre4ac7g1ooO9tZjX2/XgZO5ZpcEAAAAAAAqQatWbbRy5XLZbDZlZmZq8+YkNWsWr7S0owoJCVX//gPVr98A7d69S1lZWbLbi3Xddd11333/1u7du8wu/zysBKoEgT4emjWwhe75dLNGzd+m94a2UbCPh9llAQAAAACACujSpZu2bftdd989RIZh6KGHRqpWrdr65pvF+vTTj+Tu7i4fH19NnPiMMjKO6fnnn1FxsV2S9MADD5tc/fkMu91uN+OLrVabsrKcY9VMcLCvsrJyteXwKf37861qERWo125pKQ8LC63gGM71KOCo6FE4OnoUjo4ehaOjR3E50tIOKDKyfrV+p8XiJputuFq/szJc6HcVFhZQ5vWkFJWoVd0gPdU7TkmppzR1eYpMytcAAAAAAADOw3awSnZ9s3AdyszTW2sPqH6Ij+7pEGN2SQAAAAAAAIRAVWFYxxgdyMzV62v2KzrYRz3jwswuCQAAAAAAuDi2g1UBwzD0ZO84XV0nUM98u0vbjp42uyQAAAAAAODiCIGqiJe7m14a0Fy1/Dw1ZkGyjpzKN7skAAAAAADgwgiBqlCIr6deHthChbZijZq/TdkFRWaXBAAAAAAAXBQhUBW7qpavpvVrrgOZeRq/eIeKijkxDAAAAAAAVD9CoGrQvn6IxnVvpF/3Z2rGyj0cHQ8AAAAAgJPo2fPaMt87evSI7rjj1mqs5uI4HayaDLw6Sgcz8/TJhlTVD/XV4LZ1zS4JAAAAAAC4EEKgajT82quUmpWnWT/sVd0gb13bsJbZJQEAAAAA4LC8dn4h7x2fVeo985sNVkHTW8p8/4035ig8PEI333x2Bc+7786VxWJRUtJGnTlzWkVFRbrvvn/r2muvu6zvLSgo0IwZL2jnzu2yWCwaMWK02rZtpz/+2Kvnn39GVmuR7PZiTZkyXbVrh+mppx7XsWPHVFxs0913D1P37r0q8tiSCIGqlcXN0LN9mur+z7ZowpIdemdwazUJ9ze7LAAAAAAA8Kfu3Xtq9uyZJSHQqlXfa8aMORo0aLD8/PyVlZWlBx64W507d5VhGJd836+++lyS9NFH/9GBA/s1atTDmjfvKy1c+KUGDRqiXr1ukNVqVXGxTWvX/qzatcP04ouvSJKys7Mr5dkIgaqZj4dFMwfG6+7/TdKo+dv04T/bqLa/l9llAQAAAADgcAqa3nLRVTtVoUmTpsrMPKnjxzOUmZmpgIAA1apVW7Nnz9CWLUkyDDdlZGTo5MkTqlWr9iXfd+vWzbrlltskSfXrxyoyMkqHDh1UfPzV+uij93TsWLq6dk1QvXoxatCgkV599WW9/vpsdep0rVq1alMpz8ZgaBOE+XtpZmILnSko0ugFycqz2swuCQAAAAAA/Klbtx5atWqFVq5croSEXlq27BtlZWXp3Xc/0QcffKrQ0FAVFhZWynf16nW9pk2bKS8vbz366P9o48bfFBNTX++994kaNmykt99+Q++//3alfBchkEniIvw15cZm2pmeraeW7lQxJ4YBAAAAAOAQEhJ6asWKZVq1aoW6deuh7OxshYSEyN3dXZs2bVBa2tHLvmerVq21bNk3kqSDBw8oPT1NMTH1dfhwqurUqatBgwarc+eu2rs3RcePZ8jLy1u9e/fRkCF3aPfunZXyXGwHM1GXhrX0yHUNNOuHP/TaT/s0oksDs0sCAAAAAMDlNWjQULm5OQoLC1Pt2rXVq9cNGjdulO688zY1bdpc9evHXvY9Bw4cpBkzXtCdd94mi8WiCROelqenp1au/F7ffbdU7u7uCg2tpTvvvEc7dmzX66+/IsNwk7u7u8aOfbxSnsuw281ZgmK12pSVlWvGV1e64GDfK34Wu92uaSv26MstRzWxV2MNaBlVydUBFetRoDrQo3B09CgcHT0KR0eP4nKkpR1QZGT9av1Oi8VNNltxtX5nZbjQ7yosLKDM61kJZDLDMDS2W0MdzsrX89/vUZ0gb/09JsTssgAAAAAAgJMhBHIA7hY3Pd+vmf41b7PGfb1D7w1prdhavmaXBQAAAAAALsHevXs0efJTpV7z8PDQ229/aFJFF0YI5CD8vdz18sAWuufTJD0yf5s+G
NpGwb4eZpcFAAAAAEC1s9vtMgzD7DIuWcOGjfTBB59W63deyXQfTgdzIHWCvPXigHhlZBfo0a+TVVhU8/YjAgAAAABQEe7unsrJOX1FIYersNvtysk5LXd3z8v6HCuBHMzVdQI16fo4TViyU1OW7dYzN8TVqPQTAAAAAICKCAkJU2ZmhrKzs6rtOw3DqHGhk7u7p0JCwi7vM1VUCyqgV9NwHcrK05s/H1BMiI+GdazeqegAAAAAAJjFYnFX7drVe3K2q5xgRwjkoP7VIUYHM/M095ezQVCvpuFmlwQAAAAAAGowZgI5KMMwNKFnE7WpG6hnvt2lrUdOm10SAAAAAACowQiBHJinu5um949XeICXxi5I1uFTeWaXBAAAAAAAaihCIAcX7OuhWQNbqKjYrlFfJetMfpHZJQEAAAAAgBqIEKgGiA311fT+zXUwK0/jF29XkY2j4wEAAAAAwOUhBJzfArYAACAASURBVKoh2sUE64kejbXuQJZeXLm3xh1dBwAAAAAAzMXpYDVI/5aROpCZp49+O6T6oT4a+rdos0sCAAAAAAA1BCFQDfPwtbE6lJWnl3/4Q3WDfNS1US2zSwIAAAAAADUA28FqGDfD0LM3xKlphL8mLtmhXenZZpcEAAAAAABqAEKgGsjbw6KZifEK8vHQ6AXbdOxMgdklAQAAAAAAB0cIVEPV9vfSzMR4ZRfYNHpBsnILbWaXBAAAAAAAHBghUA3WJNxfU/s2VUpGtp5culO2Yk4MAwAAAAAAF0YIVMN1blBLo69rqB/3ntCcH/eZXQ4AAAAAAHBQnA7mBG5rW1cHMvP0vxtTFRPqo5uujjK7JAAAAAAA4GAIgZzE6G4NlZqVp+nfp6hukLc61A8xuyQAAAAAAOBA2A7mJNzdDD3Xt5lia/nq8UXb9ceJHLNLAgAAAAAADqTcEOjo0aO644471KdPH91444368MMPz7vGbrdrypQp6tmzp/r166fk5OQqKRYX5+/lrlkDW8jT4qZR85N1MrfQ7JIAAAAAAICDKDcEslgsevzxx7V06VL95z//0aeffqo9e/aUuubHH3/U/v37tWzZMk2ePFlPP/10VdWLckQFemtmYrxO5BRq7ILtKigqNrskAAAAAADgAMoNgcLDwxUfHy9J8vf3V4MGDZSenl7qmhUrVigxMVGGYah169Y6ffq0jh07VjUVo1zxUYF6+vo4/X70tCZ/t0t2O0fHAwAAAADg6i5rJlBqaqp27NihVq1alXo9PT1dkZGRJT9HRkaeFxShevWIC9NDnWP13c4Mvb32gNnlAAAAAAAAk13y6WA5OTkaOXKknnjiCfn7+1f4iy0WQ8HBvhW+jyOwWNwc8lke6RWntByr3l57UHF1gzWgVR2zS4JJHLVHgXPoUTg6ehSOjh6Fo6NH4ehcpUcvKQSyWq0aOXKk+vXrp169ep33fkREhNLS0kp+TktLU0RExEXvabPZlZWVe5nlOqbgYF+HfZaxXa/S/oxsjZ//u4IshlpHB5ldEkzgyD0KSPQoHB89CkdHj8LR0aNwdM7Uo2FhAWW+V+52MLvdrgkTJqhBgwa65557LnhNQkKCFixYILvdrs2bNysgIEDh4eFXXjEqjYfFTdP6N1dUoLce/Xq7UrPyzC4JAAAAAACYoNyVQBs3btTChQvVpEkTDRgwQJI0evRoHTlyRJI0ZMgQde3aVatXr1bPnj3l4+Oj5557rmqrxmUJ9vHQrIEt9K9PkzRq/ja9O6S1Ar09zC4LAAAAAABUI8Nu0tFRVqvNaZZa1ZRlYxsPZWn4F7+rdXSQ5tzUQu6Wy5oLjhqspvQoXBc9CkdHj8LR0aNwdPQoHJ0z9WiFtoPBefytXrAm9GqsDQez9MKKPRwdDwAAAACAC7nk08HgHPrGR+pgZp7eX3dI9UN8dMff65ldEgAAAAAAqAaEQC7owU6xOpSZpzk/7lN0sI+6Na5tdkkAAAAAAKCKsR3MBbkZhiZdH6f4qAA9uXSndqSfMbskAAAAAABQxQiBXJS3h0UvDYhXiI+HRs9PVvqZArNLAgAAAAAAVYgQyIXV8vPUrIEtlGe1adT8bcottJldEgAAAAAAqCKEQC6uUZifnuvbTHuP52jCkh2yFXNiGAAAAAAAzogQCPrHVaEam9BIa/44qVdW/2F2OQAAAAAAoApwOhgkSYNa19GBk7mat+mwYkJ8dEvrOmaXBAAAAAAAKhEhEEqMuq6hDp/K10sr96husLc6xoaaXRIAAAAAAKgkbAdDCYuboSk3NlWD2n4av2iH9hzPMbskAAAAAABQSQiBUIqfp7tmJsbL28Oi0fO36UROodklAQAAAACASkAIhPNEBnprZmK8TuZaNXZhsvKtHB0PAAAAAEBNRwiEC2oeGaBn+zTVtqNn9Ox3u1Vs5+h4AAAAAABqMkIglCmhcW2NuPYqLd+Vobm/HDC7HAAAAAAAUAGcDoaLuuPv0TqYmaf3fj2omGAf3RgfYXZJAAAAAADgChAC4aIMw9C4Ho10+HS+pizbraggL7WNDja7LAAAAAAAcJnYDoZyeVjcNK1fM9UN8tZjC7frUGae2SUBAAAAAIDLRAiESxLo7aFZA1tIkh6Zv02n8qwmVwQAAAAAAC4HIRAuWb0QH704IF5HT+dr3KLtstqKzS4JAAAAAABcIkIgXJY20UGa2KuJNh46peeXp8jO0fEAAAAAANQIDIbGZevTPEIHM/P07q8HVT/UV3e1r2d2SQAAAAAAoByEQLgiD/yjvg5l5unVn/YpOthb3ZuEmV0SAAAAAAC4CLaD4YoYhqGnro9Ty6hATfpml5KPnja7JAAAAAAAcBGEQLhiXu5ueimxuWr5emj0gmSlnc43uyQAAAAAAFAGQiBUSKivp2bd1EIFRcUaNT9Z2QVFZpcEAAAAAAAugBAIFdaglp9e6NdM+07kaMKSHSoq5sQwAAAAAAAcDSEQKsU1saF6rHsj/bIvUy//sNfscgAAAAAAwH/hdDBUmpta1dGBzDx9uvGwYkJ8dGubumaXBAAAAAAA/kQIhEo1sksDpWbla8aqvaob5KNODULNLgkAAAAAAIjtYKhkFjdDk/s0VaPafnpi8Q6lZGSbXRIAAAAAABAhEKqAr6dFMwe2kJ+XRaPmJ+t4TqHZJQEAAAAA4PIIgVAlIgK8NDMxXqfyrBqzIFn5VpvZJQEAAAAA4NIIgVBlmkYEaMqNTbUj7Yye/naXiu0cHQ8AAAAAgFkIgVClujaqrZFdG2jF7uN6Y81+s8sBAAAAAMBlcToYqtw//1ZXBzNz9cH6Q6oX4qP+LSLNLgkAAAAAAJfDSiBUOcMw9FhCI7WPCdZzy1O08VCW2SUBAAAAAOByCIFQLdwtbnqhX3PFBPvosa+368DJXLNLAgAAAADApRACodoEeLtr5sB4uRmGRs3fpqw8q9klAQAAAADgMgiBUK2ig3300oDmSj9ToMcWJquwqNjskgAAAAAAcAmEQKh2reoG6anecUo6fFrPLd8tO0fHAwAAAABQ5TgdDKbo3SxcBzPz9NbaA6of6qt7OsSY
XRIAAAAAAE6NEAimGdYxRgcyc/X6mv2KDvZRz7gws0sCAAAAAMBpsR0MpjEMQ0/2jlOrOoF6+pud+v3IabNLAgAAAADAaRECwVRe7m56cUBzhfl7aezCZB05lW92SQAAAAAAOCVCIJguxNdTLw9soUJbsUbN36bsgiKzSwIAAAAAwOkQAsEhxNby1bR+zXUgM0/jF+9QUTEnhgEAAAAAUJnKDYHGjx+vjh07qm/fvhd8f926dfrb3/6mAQMGaMCAAXr11VcrvUi4hvb1QzSueyP9uj9TL63cw9HxAAAAAABUonJPB7vpppt0++23a9y4cWVe065dO82dO7dSC4NrGnh1lA5l5unjDamqH+qrIW3rml0SAAAAAABOodyVQH//+98VFBRUHbUAkqThXa7SdY1q6eUf9uqnvSfMLgcAAAAAAKdQKTOBNm/erP79+2vYsGFKSUmpjFvChbkZhp7t01RNwvw1YckO7T6WbXZJAAAAAADUeIb9EgavpKam6sEHH9TixYvPey87O1uGYcjPz0+rV6/W1KlTtWzZsnK/uLi4WDabc8x8sVjcZLMVm12G00k/na+b566VIUNfPHCNIgK9zS6pxqJH4ejoUTg6ehSOjh6Fo6NH4eicqUc9PCxlvlfuTKDy+Pv7l/z/rl276plnntHJkycVGhp60c/ZbHZlZeVW9OsdQnCwr9M8iyPxkjRjQLzu+2yz7vtog+be1ko+F2lmlI0ehaOjR+Ho6FE4OnoUjo4ehaNzph4NCwso870KbwfLyMgoOcVp69atKi4uVkhISEVvC0iS4sL9NeXGZtqZnq2nlu5UMSeGAQAAAABwRcpdCTR69GitX79emZmZ6tKli0aMGKGioiJJ0pAhQ/Tdd99p3rx5slgs8vb21syZM2UYRpUXDtfRpWEtPXJdA8364Q+99tM+jejSwOySAAAAAACocS5pJlBVsFptTrPUypmWjTkqu92uaSv26MstRzWhZ2MlXh1ldkk1Cj0KR0ePwtHRo3B09CgcHT0KR+dMPXqx7WAVngkEVAfDMDQ2oZEOZ+XrhRV7VCfIW+3rs+0QAAAAAIBLVSlHxAPVwd3N0PP9mikmxEfjFm3X/hPOkdICAAAAAFAdCIFQo/h7uevlgS3kaXHTI/O3KSvXanZJAAAAAADUCIRAqHHqBHnrpQHxysgu0NiFySosKja7JAAAAAAAHB4hEGqklnUCNen6OG05clqTl+2WSfPNAQAAAACoMRgMjRqrV9NwHcrK05s/H1BMiI/u61jf7JIAAAAAAHBYhECo0f7VIUYHM/P01i8HFBPso97Nws0uCQAAAAAAh8R2MNRohmFoQs8malM3UM9+t0tbDp8yuyQAAAAAABwSIRBqPE93N00fEK+IAC89unC7UrPyzC4JAAAAAACHQwgEpxDs46GZA1uoqNiu0fOTdSa/yOySAAAAAABwKIRAcBqxob6a3r+5Dmblafzi7SqycXQ8AAAAAADnEALBqbSLCdYTPRtr3YEsvbhyL0fHAwAAAADwJ04Hg9Pp3yJSBzPz9OH6Q4oJ8dE/20WbXRIAAAAAAKYjBIJTeqhzrA5l5umV1X8oOthbXRvVNrskAAAAAABMxXYwOCU3w9AzN8SpaYS/Ji7ZqZ3pZ8wuCQAAAAAAUxECwWl5e1g0MzFeQT4eGr0gWcfOFJhdEgAAAAAApiEEglOr7e+lWQPjlVNg0+gFycottJldEgAAAAAApiAEgtNrHOav5/o2U0pGtp5culO2Yk4MAwAAAAC4HkIguIRODUI1+rqG+nHvCc35cZ/Z5QAAAAAAUO04HQwu47a2dXUwM0//uzFVMSHeuqlVHbNLAgAAAACg2hACwaWM6tZQqafyNH3FHtUN8lGH2BCzSwIAAAAAoFqwHQwuxd3N0NQbmym2lq/GLdquP07kmF0SAAAAAADVghAILsffy12zBraQl7ubRs1P1sncQrNLAgAAAACgyhECwSVFBXprZmK8TuQUauyC7SooKja7JAAAAAAAqhQhEFxWfFSgnrkhTr8fPa1nv90lu52j4wEAAAAAzosQCC6te5MwPdQ5Vst2ZeitXw6YXQ4AAAAAAFWG08Hg8u5uX08HM/P0zq8HFRPqoxuaRZhdEgAAAAAAlY6VQHB5hmHoiZ6N1TY6SJO/263NqafMLgkAAAAAgEpHCARI8rC4aXr/5ooK9NbYhclKzcozuyQAAAAAACoVIRDwpyAfD80a2EKS9MhX23Q632pyRQAAAAAAVB5CIOAvYkJ8NH1Acx0+la9xi3aoyMbR8QAAAAAA50AIBPyXttHBmtCrsTYczNIL3+/h6HgAAAAAgFPgdDDgAvrGR+pQZp7eW3dI9UN9dMff65ldEgAAAAAAFUIIBJThgU6xOpiZrzk/7lN0sI+6Na5tdkkAAAAAAFwxtoMBZXAzDE26voniowL05NKd2pF+xuySAAAAAAC4YoRAwEV4e1j00oB4hfp6aPT8ZKWdzje7JAAAAAAArgghEFCOWn6emjmwhfKsNo1ekKycwiKzSwIAAAAA4LIRAgGXoFFtPz3fr5n+OJ6jiUt2ylbMiWEAAAAAgJqFEAi4RB1jQzUmoZHW/HFSr6z+w+xyAAAAAAC4LJwOBlyGQa3r6GBmnuZtOqx6IT4a1LqO2SUBAAAAAHBJCIGAy/RI1wZKzcrTjJV7VDfIW/+4KtTskgAAAAAAKBfbwYDLZHEzNOXGpmpQ209PLN6hPcdzzC4JAAAAAIByEQIBV8DP010zE+Pl42HR6PnbdCKn0OySAAAAAAC4KEIg4ApFBnprRmK8TuZaNXZhsvKtNrNLAgAAAACgTIRAQAU0jwzQ5D5NlXz0jJ75dpeK7RwdDwAAAABwTIRAQAV1a1xbI7pcpe93H9fcn/ebXQ4AAAAAABfE6WBAJbi9XbQOnMzTe+sOKSbEVzfGR5hdEgAAAAAApZS7Emj8+PHq2LGj+vbte8H37Xa7pkyZop49e6pfv35KTk6u9CIBR2cYhsb1aKR2McGasmy3NqVmmV0SAAAAAACllBsC3XTTTXrnnXfKfP/HH3/U/v37tWzZMk2ePFlPP/10ZdYH1BgeFjdN69dMdYO89djC7TqYmWd2SQAAAAAAlCg3BPr73/+uoKCgMt9fsWKFEhMTZRiGWrdurdOnT+vYsWOVWiRQUwR6e+jlm1pIkkbN36ZTeVaTKwIAAAAA4KwKD4ZOT09XZGRkyc+RkZFKT0+v6G2BGis62EcvDYjX0dP5Grdou6y2YrNLAgAAAADAvMHQFouh4GBfs76+Ulksbk7zLKgc1wX76vkiu8Z+uVUzftyn5xNbyDAM0+qhR+Ho6FE4OnoUjo4ehaOjR+HoXKVHKxwCRUREKC0treTntLQ0RUSUfzKSzWZXVlZuRb/eIQQH+zrNs6DydI0N1rBrYvTOrwcV6euhuzvEmFYLPQpHR4/C0dGjcHT0KBwdPQpH50w9GhYWUOZ7Fd4OlpCQoAULFshut2vz5s0KCAhQeHh4RW8LOIX7/1FfvZuG6bU1+7Vid4bZ5QAAAAAAXFi5K4FGjx6
t9evXKzMzU126dNGIESNUVFQkSRoyZIi6du2q1atXq2fPnvLx8dFzzz1X5UUDNYVhGHqyd5yOnCrQpG92KTLAS/FRgWaXBQAAAABwQYbdbreb8cVWq81pllo507IxVI2TuYW653+TlF9UrA//2UaRgd7V+v30KBwdPQpHR4/C0dGjcHT0KBydM/VolW4HA1C+UF9PzbqphQqKijVqfrKyC4rMLgkAAAAA4GIIgYBq0qCWn6b1a659J3I0YckOFRWbsggPAAAAAOCiCIGAatQhNkSPdW+kX/ZlataqvWaXAwAAAABwIRU+Ih7A5bmpVR0dyMzTpxsPKybER7e1rWt2SQAAAAAAF0AIBJhgZJcGSs3K18wf9io62EedGoSaXRIAAAAAwMmxHQwwgcXN0OQ+TdU4zF9PLN6hlIxss0sCAAAAADg5QiDAJL6eFs1MjJefl0Wj5ifreHaB2SUBAAAAAJwYIRBgovAAL81MjNepPKvGLNyufKvN7JIAAAAAAE6KEAgwWdOIAE25sal2pJ3RpG92qdjO0fEAAAAAgMpHCAQ4gK6Naut/ujbQypTjen3NfrPLAQAAAAA4IU4HAxzE0L/V1YHMXH24/pBiQnzUv0Wk2SUBAAAAAJwIIRDgIAzD0GMJjXQ4K1/PLU9RnUBvtYsJNrssAAAAAICTYDsY4EDcLW56oV9zxQT7aNyi7dp/MtfskgAAAAAAToIQCHAwAd7umnVTvCyGoVHztykr12p2SQAAAAAAJ0AIBDigukE+enFAcx07U6DHvk5WYVGx2SUBAAAAAGo4QiDAQbWqG6Snescp6fBpPbd8t+wcHQ8AAAAAqAAGQ1eQ187P5Zb5u7xCWqoooq1swVdJBtkaKkfvZuE6mJWnt345oJgQX/3rmhizSwIAAAAA1FCEQBVkyT4qt+TPFVjwniSp2CtIRRGtZY1oK2tEWxVFtJbdO8TkKlGTDbsmRocy8/TGz/sVHeytXk3DzS4JAAAAAFADEQJVUG67kfLs/pjO7Nsqj7RNck9Pkkf6JvlueEWG/ewcl6LgBiqKaCtrRBsVRbZVUWhTyeJhcuWoKQzD0MReTXTkVL6e+XaXogK91bJOoNllAQAAAABqGMNu0qARq9WmrCznOP46ONj3/GcpzJFHxha5p22SR3qSPNI2yS0vQ5Jkd/dWUdjVska0kTWyrYoi2qjYv44JlaMmycwt1D2fblae1ab3h7ZRnSDvS/7sBXsUcCD0KBwdPQpHR4/C0dGjcHTO1KNhYQFlvkcIVAkuqVnsdrmdOSyP9E1yTz8bDLkf+11GcaEkyeYXqaLItrKGn10tZA27WvLwqYbqUZPsP5Gre+YlKdzfS+8OaS1/r0tbzOdM/0CDc6JH4ejoUTg6ehSOjh6Fo3OmHr1YCMR2sOpiGCoOjFZBYLQKGvc/+5qtQO7Ht5/dQvbniiGvvUslSXbDoqJazc4GQhFnVwsxdBqxtXw1rV9zjfxqm8Yv2qFZN7WQu5thdlkAAAAAgBqAlUCVoDITQyPvxNlVQn+GQu7pSXKzZks6N3S6zdltZAyddmkLth7V1OUpurlVlMZ1byTDuHgQ5EypNpwTPQpHR4/C0dGjcHT0KBydM/UoK4FqELtPLRXG9lBhbI+zLxTbZMnc82cgtKnsodN/zhZi6LRrSLw6Sgcz8/TxhlTVD/XVkLZ1zS4JAAAAAODgCIEcnZtFtlpxstWKk5oPliQZhdlyP7blz5PIkuR5cLW8d30h6ezQaWtYq7PH1Ee2VVFEWxX7R5n5BKgiw7tcpUNZeZq1aq/qBnmrS8NaZpcEAAAAAHBgbAerBKYvG7Pb5XYm9f9WC6VtknvGtvOHTv85W4ih084jz2rTA//Zov0nc/X24NaKC/e/4HWm9yhQDnoUjo4ehaOjR+Ho6FE4OmfqUU4Hq2IO2Sx/Dp32SNtUsmLIcvqApD+HTtdu/ud8obYqimwrW9BVUjlzZeCYMrILdPf/JkmSPvhnG4X5e513jUP2KPAX9CgcHT0KR0ePwtHRo3B0ztSjhEBVrKY0i5F7XB7HNv85dHqT3NM3X2DodFtZI9owdLqG2XUsW/d9tlmxob6ae1sr+XhYSr1fU3oUrosehaOjR+Ho6FE4OnoUjs6ZepTB0JAk2X1rlzF0etOfQ6eT5PvbLBk6mwsWBTc8Gwz9OVuoqFZTyY2WcURx4f6aemMzjV2YrKeW7tS0/s3lxsouAAAAAMBf8F/0rqzU0Okhkv5r6HTaJnke/OH8odORbf5cLcTQaUdybcNaeuS6hpq5aq9e/XGfRnZtYHZJAAAAAAAHQgiEUuye/rJGd5I1upPypL8Mnd705zayJPlseU++54ZO+0f9ZRtZWxWFtWTotIkGt6mjAydz9fGGVMWE+CjxakI6AAAAAMBZhEC4OMNQcWA9FQTWU0HjAWdfsxXIPSP5/04jS0+S196lkv46dPrP2UIMna5WhmFobEIjHc7K1wsr9qhOkLfa12e2EwAAAACAwdCVwpkGSF0p4/+zd+fhUZV3/8c/55yZyZ5MEkKCLEE2tYIKCrgvUERFKxXw0Wqrrdvzu7S21epT0aq1oleronazat272Ke4L1Xc+uBSGxQVi1pUZBMTluzrbOf3xyyZNQmQMJPJ+3VdXJmZs8w9k9uB+fi9v6d9e0woFNt02h1aPhbqLzT8INm57jSPOLu1dvl03l/e19bWLt1/5lRNHT9syM9RZDY+R5HpmKPIdMxRZDrmKDJdNs1Rrg42wLJpsvSbgF9Ww6fdwVDtKln1a2ObTldNi/QWoul0/9vS1Knv/vk95TktPfb/Dpfl9aV7SEBKfI4i0zFHkemYo8h0zFFkumyao4RAAyybJstAMjwtcmxdLWftqmDj6bpVMju2S5JsR568FQeEmk5Pk69yKk2n+8GHW5r13//7gYpynZoxxq3pY9yaUV2qyqKcdA8NiMHnKDIdcxQZx7YlX6cMb5sMX7uKCyw1dxiyHfmynQWS5Uz3CIEYfI4i02XTHCUEGmDZNFn2KNuW2bIpWC1UuyrYfHrbGhkxTaeD1ULeymnyDZ8iOWg6vbPe3dSoZz/epjc/266GDq8kqbo0TzOqSzVjjFsHj3arKJcqLKQXn6PIdMxR7LKAT4a3XYavPfjT2y7D2yYleSxy2xf1mK8jdpu3XQpvV+p/xtumU7YzP/SnIBQOJd5X+H54m6Pn+7Jy6PWIXcLnKDJdNs1RQqABlk2TJe0iTafD1ULvyWreKEmyTYd85V8L9RYKLiOj6XTfuN35qm9o0+fb21SzoVE1Gxu0alOTOn0BmYb0taoizQhVCU0ZUSyXw0z3kDHE8DmKTMcczXK2Lfm7goGLpy0UwoSCl5gQpk2GN3hf0QGOrz0hqIkEOP6unRtKTFiTKpRJDHTyiwrU0dSYMF6Fxts9xuj7occCfV8ybhtmQqgkZ4FsZ17U4/H3kwdQ0fflzJMM/v2RzfgcRabLpjlKCDTAsmmyZCKjfVuot9B7waVkW9+X6W2TFNV0OtxfiKbTSSWbo15/QB9+1ayaDY1aubFRa75qlt+Wch
ympo4s0Yxqt2aMKdXE4QUyCdowwPgcRaZjjmYIOxAXakQHLu1JQphkQU2HlKzqxvb3fRiGFRvMRAUZKato4oOPhFCnQHLk7nIQsltz1O/pNdDqc+gVHZjtdADWw/sXcz96v4LIccmrmvLpO5kh+BxFpsumOUoINMCyabIMCpGm06tCy8jei206XTohWC1UOU3eymnyl+8z5P/y78scbe3yadXmJtVsaFDNxkZ9sSO4f0muQ9PHlGp6tVszxrg1ys2SPPQ/PkeR6ZijOykcKiRZ7qT4oCBZpU18yBC+7+vcqWHYVk7SsKCnShs5ooOaZNUt+ZLpyrhK5IycowF/TMVRyqqkVHMhyVK48PE7IzgPeg+RusO4uO0pqpxkuQbojctOGTlHgSjZNEcJgQZYNk2WwcrwtMhR90HUZepXyezYISnUdHr4AZH+Qr6qaQoUVKV5xHvWrszRba1dWrmxUTUbG1WzoUHbWoO9mvYqyY0sHZs+2i13Po0nsfv4HEWmy8o5Gm4sHBO4JFsulFj5oZjHkuwf8PZ9GDISQxhX+At3siVFyatoEnrcOPIk0xrANzCzZOUcTcUOhJpiJ6sCSxIe9jSP44/fmYow05E8XAw3504aNqZYNhddUebIzbiQsT8MqTmKQSmb5igh0ADLpsmSNcJNp2tXhUKh9+TY9u/IP0r9hXt1VwtVTZOvYnJWN53e3Tlq27Y21HeoZmODajY06p1NjWrzBP+RtM/wFh/jRAAAIABJREFUwuCVx6rdmjqyRLnOofMPbvQfPkeR6dI6RyPVFL18ee2xmiLu+D40Fo5nm64kvV0Sv/ymWpKTrOLCduZJVnZ+4d3T+BztB0l7Q/VQsRZZJpciDI0OoEIXPunTMGT0sSopVTCaaqlcflr7LjFHkemyaY4SAg2wbJosWc3XKcf2NTH9hayWTZKimk5XTQ1WC2VZ0+n+nqO+gK1P6loiTaY/+LJZvoAtp2XogL2KNWNMqWZUu7VvZZEcZna8hxhYfI4i0/U6R6O/PMZ8+evp6k4prgYV/2VylxsLh74w9thXpYcrRUV/yXTkc8nxDMfnaIbze1NW1yllb6s+VDXt7BJJR27yEKkPoZJS7d/HzwfmaBzblmQHK9vsQNRtSXZAhgLBfSLbum8bMfuHtikQuW9E7x+9rxQ8Nub4QGg84WMDceft3s+IG4fixmFExhHeFvV6Eo6JPd6Iea7o4+24bbGvJ+HcMe+HHfVeKsnrDp07dNtZUqH6gy4PNqof5AiBBhgfaINXuOl0sGLovR6aTk+Tr/Ig2TklaR7xrhnoOdrh9ev9L5uCodCGBq3dFnwPC3MsHTLareljgk2mq8vyZGRJsIb+xeco0sK2JX+njK4WmZ4WGV1NMjwtMjwtMuNu56hT3vbmSEiz28tIMrCxMAY3PkeHqIBfhi/6Mym+ErC3UKl7f8Xvu8uVguHPrNjPM1d+kTweb1Q4ER1cKDb0iA4KwqFHivCgO7hItS11eNAdPqQID6L3TThPdOgRkOJCjIRgIioAiZwXu8U2zNDfeUbwp2FIMoKPxz3WfdsMHafQsaHHDVNG4TDVn/hwVlxoiBBogPGXbhYJ+GU1rA1WC6VsOh3sLTSYmk7v6Tna0O7Ryo3Bq47VbGjQlubg/8UeXujS9OrSYE+hMW4NK8zZY2NCZuNzFLsk4AsGNV3NMj3NMrqaY++Hbhue5lDIE36sKXS/pdclGrZhynYVycgtlt8Kf6npafnF4G0sjMGNz1H0q3BI3l+9lrxtMgNdwSwk9AXdjv+CHvmiHgyyI1/wQ1/cg1/Uw1/io4+JP96QHbd/5DzRIUH0MVLkue1U4YFhhs4rJY43eO6Y8CF6bD2NO/r9iBpH0mAjcnzsOOy4ccSPzY4bh+LGYSe8Hin2vUz+fvT8ezSi3svoxxX1Xqb+Pdoxj8f9HiOvs39l0+coIdAAy6bJgkTdTaeD1UIpm05XTZOvcmpGNp1O9xzd3Nihmo2NWrmhQSs3Nqqp0ydJ2rs8P9JketqoEhXmZH6ghoGR7jmKNLADwS8JXS0yPE0J1ThmKLxJGep0tfTpCkEBZ4FsV5HsnBLZriIFXEWyc4plu4pl5xQp4CoO3S+S7SpWIHw7vI+zQDIM5igyHnMUmY45ikyXTXOUEGiAZdNkQR/Ytszmjd1XIqtdJcf2NTFNp4PLx6ZmTNPpTJqjAdvWp1vbIk2m3/uySV2+gCxD2n9EcSQUmjyiSE6LJQ1DRSbNUfRB6P8Qm11JgpmoipyY6hxPc/f+4WN6WWpgWznB4CYqpLFzioNBTlR4E4gKdWKCHFdRv1VrMkeR6ZijyHTMUWS6bJqjhEADLJsmC3ZRdNPp8DKypE2ng+HQnm46nclztMsX0IdbmlWzMVgl9FFtiwK2lOc0NXVUSaTJ9PhhBTJZOpG1MnmOZiW/tzuYiQpxgtU4fVxW1cslwMPLqOycklBo05dqnOiQpyjY3yZDMEeR6ZijyHTMUWS6bJqjux0CrVixQkuWLFEgENCiRYt04YUXxmx//PHH9ctf/lKVlZWSpLPPPluLFi3q8ZyEQMh2RtvWYNPpUMWQY+sH3U2nc0tDVyGbukeaTg+mOdrS6dO7mxpVE+ontKGhQ5JUlu+MNJieXu3WiOLM+XKI3TeY5mja2QEZntbE6pqU1ThNoSVXUaGOr6PXpwk4C7qDmVTVODHbS0JBT5ECrhLJmZ9VPW+Yo8h0zFFkOuYoMl02zdGeQqBea6T9fr9uuOEGPfDAA6qsrNTChQs1a9YsTZgwIWa/k046Sddee+3ujxbIEnbBcHnGzZVn3NzgA+Gm07Xh3kLvybXhtSRNp4ONpwdL0+n+VpTr0LETh+nYicMkSbXNnd1Npjc26sVPtkmSRrtzNSPUZPrg0W6V5HHpYgwCti35OqOqa8L9b7r74gSrbZojtxOqczwtfVxGVaxATncFTqBwRCi4KYksleoObopD1ThFkVBHprWH3hQAAADsKb1+w1y9erWqq6s1evRoSdK8efP0yiuvJIRAAHphWvKX7yd/+X7S/mdJkoyuZjm2fhCpFnKtf1m5n/yvpHDT6QO7ewtVTlOgoDKdryAtqopzdcrkKp0yuUq2bWvdjvZIldDfP9qqxz74SoakfSsLI6HQgSNLlOOgnxAGQHgZVfjqUtFBTvQlxrtiK3CiLzfet2VUsdU2/uIxsdU4McuqEoMcWVx5DwAAAIl6DYHq6upUVdV9taPKykqtXr06Yb/ly5dr5cqV2nvvvXXVVVdpxIgR/TtSIAvZOcXyjj5K3tFHhR4IN51eFektlPfBH5T/XlzT6apgtVAmNJ3ekwzD0PhhBRo/rEBnThspnz+gNbUtqtnQqJUbG/THdzbroZpNclmGDhxZEmkyvc/wQllm9ixLwS4KL6NKuGR4c+zVqHqqxunTMqrC7oqanGIF8ofJ7x6X2MQ4OriJup1ty6gAAACQOXrtCfTCCy/o9ddf15IlSyRJTz75pFavXh2z9KuhoUEFBQVyuVx69NFH9fzzz+vhhx/u8YkDgYD8/rT0pO53lmXK7w+ke
xjIVr5OGbWrZWx5V8aX7wT/NHU3nbYrp8geeYjsvQ6WPfIQqTSx6fRQmaNtXT6t3NCgtz7frrc+36H/1LVKkkrynDp07zIdPr5ch48vV3VZvgy+ZGeUXueobUu+DqmzSeoKLo0K31Zn+H5zaFtT6HZT1ONNUldr78uoHLlSTrGUWxKsqAndVk4w0OnttnJYRpWthsrnKAYv5igyHXMUmS6b5qjTmfrfo72GQO+9955+85vf6L777pMk3X333ZKkiy66KOn+fr9fM2bM0LvvvtvjoGgMDey67qbT4f5C78vwBedgd9PpULVQ5UEqqRwxsHPUtiXZcT8VvB3zuLoft+3QF/K4/cP7Rs6rhHN3f5GPP3f0cbYa2z36cEuTVm9p0vtfNmtHW5cM2aoocOnAvYp0wIgiTR5RLHeeI+65o8ek3scaPea49yP5WJPsm+S1GwnvZZL3OunjkpHydxD/uBIeN5K+D9HvcfL3wejhfUgYZ9z7kOcMqKtpR2iJVVRT46hKHSPgU09sw4q9GlVOHy4pzjIq9BF/1yPTMUeR6ZijyHTZNEd3qzH0lClTtH79em3atEmVlZV67rnndNttt8Xss3XrVg0fPlyS9Oqrr2r8+PG7OWQAPUnadLr+P5HeQs7a2KbTdl6pygOhVDvJF3IjaaAg9RQ09FZRkQnKJY2XND/8QPj7vU/SxtAfZIxcZ2EkpAkuoxouv3t85ApUgZy4ICfuNsuoAAAAgJ71GgI5HA5de+21Ov/88+X3+7VgwQJNnDhRd955pyZPnqzZs2frkUce0auvvirLslRSUqKbb755T4wdQJhpyT/sa/IP+1qSptOrlOtrkMfjky0j9CXZiPqyHHc/+vHwvjJkJzye6nb0cYo9Nu7xhP1jvsCHj0vxHDK6T9XrWMP7dj8esKUtLR59vr1Nn21v14b6DnkDwb5DY0rzNL6iUBMqCjXGnSfTMuPepySvxzCC72/049FjjX5vk7yu5L+bZO9T7GN2/P4x70+q9zoo8Tl7ed8SnjP160n2PiS+P93P5S4tVmNzlwAAAAAMnF6Xgw0UloMBew5ztHedXr8+2NIcaTL9SV2rbEkFLkvTRpVoeujKY+PK6Sc0EJijyHTMUWQ65igyHXMUmS6b5uhuLQcDgKEg12lpZnWpZlaXStpbTR1evbupMXI5+tfX1UuSygtcoauOuTV9TKkqi+ghAwAAAGBwIAQCgCRK8pyaNalCsyZVSJK2NHVq5cYG1Wxo1NvrG/T3j7dKksaW5WnGmFJNH+PWwaPdKsrlYxUAAABAZuLbCgD0wV4luTp1ygidOmWEAratz7e3qWZDo2o2Nujpf9fqf9/fItOQvlZVFKoUKtWUEcVyOcx0Dx0AAAAAJBECAcBOMw1DEysKNbGiUGcdMkpef0AffhXsJ1SzoVEP1WzS/f/apByHqamjSoKh0JhSTRxeIJN+QgAAAADShBAIAHaT0zI1bZRb00a59d9HSK1dPq3a3KSaDQ2q2dioX634QtIXcuc5dcjoYD+hGdVujSzJS/fQAQAAAAwhhEAA0M8Kcxw6eny5jh5fLkna1tqllRu7m0y/vHabJGlkSW6kwfT00W65853pHDYAAACALEcIBAADrKIwRyd9rVInfa1Stm1rQ32HakJNppd/sk1PrK6VJO0zvDBy5bGDRpYo12mleeQAAAAAsgkhEADsQYZhaGx5vsaW5+v0qSPlC9j6pK4l0mT6L6u+1CPvbJbTMnTgXsWaPqZUM6rd2reySA6TfkIAAAAAdh0hEACkkcM0NHlEsSaPKNb3Dh2jDq9f73/ZFGoy3aC73lyvu96UCnMsHTLaHQmFqkvzZNBkGgAAAMBOIAQCgAyS57R02NgyHTa2TJLU0O7Ryo2NwZ5CGxr0j892SJKGF7o0vbo0dOUxt4YV5qRz2AAAAAAGAUIgAMhgpfkuHb/vcB2/73BJ0ubGDtVsbNTKDQ164/Mdem5NnSRpXHm+ZoRCoWmjS1Tg4uMdAAAAQCy+JQDAIDLKnadR7jyddsAIBWxbn25tizSZfmL1V3p01ZeyDGn/EcWhJtOlmjyiSE7LTPfQAQAAAKQZIRAADFKmYWifykLtU1mob08frS5fQB9uaVbNxgat3Nio+/+1UX94e6PynKamjQpedWzGmFKNH5ZPPyEAAABgCCIEAoAskeMwdcgYtw4Z45YktXT69O6mRtWE+gm9+UW9JKks36npY4KB0Ixqt6qKc9M5bAAAAAB7CCEQAGSpolyHjp04TMdOHCZJqm3uDDaYDjWafvGTbZKkMaV5wVCoulSHjC5Rca4zncMGAAAAMEAIgQBgiKgqztUpk6t0yuQq2batdTvaI1VCf/9oqx774CsZkvatLIw0mT5wZIlyHPQTAgAAALIBIRAADEGGYWj8sAKNH1agM6eNlM8f0JraFtVsaNTKjQ364zub9VDNJuU4TB24V3EwFKp2a1JFoSyTfkIAAADAYEQIBACQwzJ14MgSHTiyRBccXq12j1/vbW6KXHnsN69/Ib0uFec6dMjo7ibTo9y5NJkGAAAABglCIABAgnyXpSPGlemIcWWSpB1tHr2zsVE1Gxv0rw2NevXT7ZKkEcU5mjGmVNPHuDW92q2yfFc6hw0AAACgB4RAAIBelRe4NHe/4Zq733DZtq1NjZ2q2dCgmo3BQOipf9dKkiZWFESaTE8dWaJ8l5XmkQMAAAAIIwQCAOwUwzA0pjRPY0rztPCgveQP2Ppka6tWhkKhZe9v0Z/f/VIO09CUvYo1Y4xb08e4tX9VkRwWTaYBAACAdCEEAgDsFss0tH9VkfavKtK5M8eo0+vXB1uaI02m73lrg+5+a4MKXJamjSqJNJneuyyffkIAAADAHkQIBADoV7lOSzOrSzWzulTS3mrq8OrdTY2Ry9G/vq5ekjSswBVaOubWYZOGy+n3qyjHQTAEAAAADBBCIADAgCrJc2rWpArNmlQhSdrS1KmVoauOvb2+QX//eKv0wlpJksM0VJrvVFm+K/TTqdI8V/Bn/OP5LuU4WF4GAAAA9BUhEABgj9qrJFenThmhU6eMUMC29dm2Nn3V4dOX21tV3+5VQ7sn9NOrjfXt2tHuVZcvkPRcBS4rEgiFg6LSfJfK8hJDo+JcpyyTKiMAAAAMXYRAAIC0MQ1Dk4YXaoY7X42N7Sn36/D6Vd/uUUO7VzvagkFRQ4c3JjTa3Nip1Vua1djhVcBO9lySOy+xmihYbRQbJJXlu5TnNFmaBgAAgKxCCAQAyHh5TksjS/I0siSv130Dtq3mDp/qO4KhUX27V/VtHtV3hMKj0GMf1baovt2rNo8/6XlyHGZMUBRfcVSW1x0mufOcXPkMAAAAGY8QCACQVUzDkDvfKXe+Uyrvff8uXyCxsqgtdLsjWGW0rdWjtVuDy9V8ycqMJJXkOiLL0cpDP2MqjqKWqBXmWFQZAQAAYI8jBAIADGk5DlNVxbmqKs7tdV/bttXa1b00LVxdFAyNuoOkz7a3qaG9UU2dvqTncZhGJBwqzXdGQqOYvkZRy9RogA0AAID+QAgEAEAfGYah
olyHinIdqi7rfX+fP6DGSIWRN2GJWjg02lDfrvqdbIBdFrkdu0StOM8hkyojAAAAJEEIBADAAHFYpoYV5mhYYU6v+9q2rQ5voLvKqD22AXa4r9Gmxo4eG2BbhlSSogF2siApz2kNwCsHAABAJiIEAgAgAxiGoXyXpXxXnka5e2+A7Q/Yau6MqjKKW6IWvpLamtoWNfTQADs3qgF2YmgU+5g7zymHSZURAADAYEUIBADAIGSZRii4cfVp/06vP3ZpWnTFUVQD7P+EGmD7e2iAHR8OJfY1Cv4scNEAGwAAIJMQAgEAMATkOi1VOa0+N8Bu6fJFAqOGdk9seBTVALu+vVHNKRpgOy1DpSmXpsUFSXlOuWiADQAAMKAIgQAAQAzDMFSc61RxrlNj+9AA2xvTADs6MIrta7S+lwbYhTlWMBzK6+5ZFA6Kum8Hfxbn0gAbAABgZxECAQCA3eK0TFUU5qiijw2w273+mJAoWV+jvjTAdoeXnyUJjUrzXTFL1HJpgA0AAEAIBAAA9hzDMFTgcqjA5ehzA+ymzqiqoqThkVdbemmAnec0u3sWRS1Riw6PykM/S2iADQAAshQhEAAAyFiWaags36WyfJekgl737/T6I8vPklUZNbR7VdfSpU96aIBtSCqJVBc5VZrnUkVJroxAQC7LlMthKtdhRm7nhP5E7lumcpzB+9HbchzB7SxjAwAA6UIIBAAAskau09IIp6URO9kAuz5FlVFDu0efbmvVqi+b1OX1q8sXkC/FldP6ymkZCQFRfJgUHxz1ti3mfD3sSwAFAMDQRggEAACGpNgG2Pk97ut256uxsV1ScImaxx9Qly8gjy8gjz+gzvBtX0BdO7Gty9d9P7ytudOXsC18fyACqOjwKFX1UsK+0RVRKc4TvY0ACgCAzEAIBAAAsBMs01CeaSkvDc2mfQFb3iThkSfF/b5si77f3OkL7hsVVoW3JVs6tzNclpE0IOophOopaMqxTOU4LLkc0cFW8H5OpLrKkssyZBBAAQAgiRAIAABg0HCYhhzpDqC8qcOkzlCAlFD1lGybN/Z+Y4evu3rK65fHb0e29UcAFQyITOWEwqhgQGQqx2HEBEY5KbfF3g8GTUbU7ST3CaAAABmGEAgAAAC9SncAlWxJXfC2PxRIhZfphe6HQySfP3Fb3P1gAOUJ7RuQx29Hbu9m/pTY98mKqmxymlFVS6m3xVRPxe2bE6qCKrelzrYuOa3Q81mGLJMQCgAQixAIAAAAGc1hGnK4LOW70hdARQdEkfvRFVFRlU6RPlD+YODU5fN3Vz3F7Vvf5Uu6rT8CKEOSy2FGekE5Q+FQOChyhqqZnGZoH0eKfVJuC503VPXkNKNuR4VRzlBg5bRMOUxCKQBIJ0IgAAAAIIW0BlD+QMqgqSuub5PlcqippVNef7DKqftnMLjy+ruDqfhtrZ7gUrxkx4Rv9xfTUEKIFB9UOZOFSNbOB1Wx4Vfi8eFtFsEUgCGEEAgAAADIQA7LlMMyVeDqfd/oK9j1N9u2gxVR/oC8oaV0iWFS97ZwcBQTKPkC8ga6b/cUVHn9AbV4fSm2BY/Z3SvlRbPCwVSKaqeUFVTxYVaPQVWqMCt50MXV9AAMFEIgAAAAACkZhhEJMdSHQGpPCNi2fMlCpISgqqdt3cd7o/ZJFVS1e30xQZUn7pjdbWAezTKNmGoll2XIkWIZXo9L/ZIEVY7oYxypju++ml+4qopgCsgOfQqBVqxYoSVLligQCGjRokW68MILY7Z7PB5deeWVWrNmjdxut26//XaNGjVqQAYMAAAAYGgzDSMYYDjMdA8lImDbkWolbyCQcoldcElfVDCVotrJExVOJQudvL6A2rp6rqjqx5V8cphGbCWTGdvvqad+US7LVGG+S16PT4ZhyDSC4aKp4O/SMCTDCN2Oeiz6Z+SYJMf2tl/4fvd+SR5TD+cI/Ywfm5lk3MmOjRlniv2APaXXEMjv9+uGG27QAw88oMrKSi1cuFCzZs3ShAkTIvv87W9/U3FxsV566SU999xzuvXWW3XHHXcM6MABAAAAIFOYhqFcp6VcZ7pH0s0fiAuO4pfhpQqqfHEVVCmCKm/cPl2+gFq7fN1VVFEVVb6ArYAd/hNcZtiPxVODXkJopF7CLUMpA6U+BWkKP1fywGunxtHXUE6JIdyuB3o9hG19CPTMuP0MQxrjCajClTnB8kDpNQRavXq1qqurNXr0aEnSvHnz9Morr8SEQK+++qouueQSSdLcuXN1ww03yLZtEk0AAAAASBPLNGSZlnKde76xebxUfavsuFAoYNuyJdnh29E/FbtvzDF92C9hf3UfF7tf3GNJxmmnPDb8HKmPTRjvTuxnq6fnDO8nSUnO0cOx/khIF3xt/TOO7v3i36vg45kXBj5/0UxVFOakexgDqtcQqK6uTlVVVZH7lZWVWr16dcI+I0aMCJ7Q4VBRUZEaGhpUVlbWz8MFAAAAAGQLwzBkGVKwrgVDVaowMFmA1PN+KUKrPoSIIysKVZGX/W2T0/YKLcuQ252frqfvV5ZlZs1rQXZijiLTMUeR6ZijyHTMUWQ65igynWWZ8vsD6R7GgOs1BKqsrFRtbW3kfl1dnSorKxP2+eqrr1RVVSWfz6eWlhaVlpb2eF6/3x6wy1juaQN5SU6gPzBHkemYo8h0zFFkOuYoMh1zFJkum+ZoRUVRym29dj2aMmWK1q9fr02bNsnj8ei5557TrFmzYvaZNWuWnnjiCUnSiy++qEMPPZR+QAAAAAAAABmk10ogh8Oha6+9Vueff778fr8WLFigiRMn6s4779TkyZM1e/ZsLVy4UFdccYXmzJmjkpIS3X777Xti7AAAAAAAAOgjw7bttPTi9nr9WVNqlU1lY8hOzFFkOuYoMh1zFJmOOYpMxxxFpsumObpby8EAAAAAAAAw+BECAQAAAAAADAGEQAAAAAAAAEMAIRAAAAAAAMAQQAgEAAAAAAAwBBACAQAAAAAADAGEQAAAAAAAAEOAYdu2ne5BAAAAAAAAYGBRCQQAAAAAADAEEAIBAAAAAAAMAYRAAAAAAAAAQwAhEAAAAAAAwBBACAQAAAAAADAEEAIBAAAAAAAMAY50D2AwW7FihZYsWaJAIKBFixbpwgsvTPeQgBhXXXWV/vGPf6i8vFzPPvtsuocDJPjqq6905ZVXaseOHTIMQ6effrrOOeecdA8LiOjq6tJZZ50lj8cjv9+vuXPn6tJLL033sIAEfr9fCxYsUGVlpe6+++50DweIMWvWLBUUFMg0TVmWpccffzzdQwJiNDc365prrtHatWtlGIZuuukmTZ06Nd3DGhCEQLvI7/frhhtu0AMPPKDKykotXLhQs2bN0oQJE9I9NCDitNNO09lnn63/+Z//SfdQgKQsy9JPfvIT7b///mptbdWCBQt0xBFH8FmKjOFyufTQQw+poKBAXq9X3/rWt3T00UfroIMOSvfQgBgPP/ywxo8fr9bW1nQPBUjqoYceUllZWbqHASS1ZMkSHXXUUfrVr34lj8ejzs7OdA9pwLAcbBetXr1a1dXVGj16tFwul+b
Nm6dXXnkl3cMCYkyfPl0lJSXpHgaQ0vDhw7X//vtLkgoLCzVu3DjV1dWleVRAN8MwVFBQIEny+Xzy+XwyDCPNowJi1dbW6h//+IcWLlyY7qEAwKDT0tKilStXRj5DXS6XiouL0zyqgUMItIvq6upUVVUVuV9ZWckXFwDYDZs3b9bHH3+sAw88MN1DAWL4/X6deuqpOvzww3X44YczR5FxbrrpJl1xxRUyTf5pj8x13nnn6bTTTtNf//rXdA8FiLF582aVlZXpqquu0vz583X11Vervb093cMaMPxNAQBIu7a2Nl166aVavHixCgsL0z0cIIZlWXrqqaf0f//3f1q9erXWrl2b7iEBEa+99prKyso0efLkdA8FSOkvf/mLnnjiCd17773605/+pJUrV6Z7SECEz+fTRx99pDPPPFNPPvmk8vLydM8996R7WAOGEGgXVVZWqra2NnK/rq5OlZWVaRwRAAxOXq9Xl156qU455RQdf/zx6R4OkFJxcbFmzpyp119/Pd1DASJWrVqlV199VbNmzdJll12mt99+Wz/+8Y/TPSwgRvh7Unl5uebMmaPVq1eneURAt6qqKlVVVUUqfU844QR99NFHaR7VwCEE2kVTpkzR+vXrtWnTJnk8Hj333HOaNWtWuocFAIOKbdu6+uqrNW7cOH33u99N93CABPX19WpubpYkdXZ26q233tK4cePSPCqg2+WXX64VK1bo1Vdf1dKlS3XooYfq1ltvTfewgIj29vZIw/L29na9+eabmjhxYppHBXSrqKhQVVWV1q1bJ0n65z//qfHjx6d5VAOHq4PtIofDoWuvvVbnn39+5JKcfJgh01x22WWqqalRQ0ODjj76aH3/+9/XokWL0j0sIOLdd9/VU089pUmTJunUU0+VFJy3xxxzTJpHBgRt3bpVP/nJT+T3+2Xbtk444QQdd9xx6R4WAAwaO3bs0MUXXywp2GPt5JNP1tFHH53mUQGxfvrTn+rHP/6xvF6vRo8erZtvvjndQxogSIVEAAAgAElEQVQwhm3bdroHAQAAAAAAgIHFcjAAAAAAAIAhgBAIAAAAAABgCCAEAgAAAAAAGAIIgQAAAAAAAIYAQiAAAAAAAIAhgBAIAAAAAABgCCAEAgAAAAAAGAIIgQAAAAAAAIYAQiAAAAAAAIAhgBAIAAAAAABgCCAEAgAAAAAAGAIIgQAAAAAAAIYAQiAAAAAAAIAhgBAIAAAAAABgCCAEAgAAAAAAGAIIgQAAAAAAAIYAQiAAAAAAAIAhgBAIAAAAAABgCCAEAgAAAAAAGAIIgQAAAAAAAIYAQiAAAAAAAIAhgBAIAAAAAABgCCAEAgAAAAAAGAIIgQAAAAAAAIYAQiAAAJBRmpqadMABB2ifffbRk08+me7hAAAAZA1CIAAAkFGeeeYZeTwejRo1So899li6hwMAAJA1CIEAAEBGWbZsmWbOnKlzzjlHK1eu1KZNm9I9pH7V2tqa7iEAAIAhihAIAABkjDVr1ujjjz/WN7/5TZ188slyOBxatmxZwn4ej0f33nuvTj31VB144IE6+OCDddppp+mPf/xjzH6tra26/fbbdeKJJ2rKlCmaOXOmzjzzTD333HORfb797W9r1qxZCc+xefNm7bPPPvr1r38deSwQCOiuu+7SWWedpSOOOEKTJ0/Wscceq+uuu04NDQ0pj3/++ed12mmn6YADDtCNN94Y2eftt9/WhRdeqJkzZ2rKlCmaPXu2Fi9erPr6eu3YsUOTJ0/W5ZdfnvS9+tnPfqZ9991Xmzdv7tubCwAAhjxHugcAAAAQtmzZMuXn5+v4449Xfn6+jj32WD355JP6wQ9+INMM/r8rj8ej8847TzU1NTryyCP1jW98Qzk5OVq7dq2WL1+us88+W5LU3Nysb33rW/r00081d+5cnXnmmQoEAvroo4/02muvad68eTs9Pq/Xq/vuu0/HH3+8Zs+erby8PH344Yd67LHHtGrVKj322GNyuVwxx7z88st65JFHdOaZZ+qMM85QYWGhJOnRRx/V9ddfr8rKSp1xxhkaOXKktmzZotdee011dXXab7/9NGvWLL300ktqbm5WcXFx5JxdXV169tlndfjhh2vUqFG7+nYDAIAhhhAIAABkhHCwMXfuXOXn50uS5s+fr5deekmvv/66jjnmGEnSQw89pJqaGl100UW67LLLYs4RCAQit5cuXapPP/1UN9xwg/7rv/4r5X47w+Vy6Y033lBubm7ksTPPPFNTp07VNddco5dfflknnXRSzDGfffaZnn76aY0fPz7yWG1trW688UaNGzdOjz76aEzA88Mf/jAyvtNPP10vvviinnnmGZ111lmRfV588UU1Nzdr4cKFu/Q6AADA0MRyMAAAkBGWL1+u5uZmzZ8/P/LYMccco7KyspgG0c8884xKSkp08cUXJ5wjXC0UCAT0/PPPa/z48QkBUPR+O8swjEgA5Pf71dzcrPr6eh166KGSpNWrVyccc8wxx8QEQJL0wgsvyOv16pJLLokJgOLHd8QRRyRtkL1s2TK53W59/etf36XXAQAAhiYqgQAAQEZYtmyZysrKVFVVpQ0bNkQeP+KII/TCCy+ovr5eZWVl2rBhg/bbbz/l5OSkPFdDQ4Oampp01FFH9fs4n3/+eT3wwAP6+OOP5fV6Y7Y1NTUl7D927NiEx9avXy9J2m+//Xp8LsMwtGjRIt1+++36+OOPtd9++2nTpk2qqanRd77znYSlZwAAAD0hBAIAAGm3adMm/etf/5Jt25o7d27SfZ5++mmde+65e2xMfr8/4bHly5frRz/6kQ444AAtXrxYI0aMUE5Ojvx+v84//3zZtp1wTF5e3m6NY8GCBfr1r3+tZcuW6ac//amWLVsm27a1aNGi3TovAAAYegiBAABA2j3++OOybVs33nijioqKErbfcccdeuyxx3Tuuedq7NixWrdunTweT8pKmNLSUpWUlOiTTz7p9bndbrfWrFmT8HiyS9M/9dRTysnJ0cMPPxwT7nz++ee9Pk+0cHXQxx9/rL333rvHfSsqKnTcccfpmWee0eWXX64nnnhCBx54oCZOnLhTzwkAAEBPIAAAkFaBQEBPPPGEJk2apEWLFumEE05I+HPyySdr7dq1Wr16tU455RQ1NTXpd7/7XcK5wpU4pmlq3rx5+uyzz/S3v/0t5X5SMJBpa2uL6ecTCAT04IMPJhxnWZYMw4hpLG3btu66666des0nnHCCnE6nfvvb36q1tbXH8UnSokWL1NTUpOuuu051dXVUAQEAgF1CJRAAAEirN954Q1999VWPV7o6/vjjI0uirrnmGr322mu666679OGHH+rII4+Uy+XSZ599pi+++CIS3vzwhz/U22+/rWuuuUZvvvmmDj74YNm2rY8//lg+n0+33HKLpOAVuB544AFdfPHF+s53viOn06kXX3wx6XKwuXPn6sUXX9Q555yj+fPny+fz6eWXX1ZHR8dOveaqqiotXrxYN9xwg0455RSdeuqpGjlypOrq6vTKK6/opptuiukXdNRRR2nkyJF6+umnlZ+fn3
AFMgAAgL4gBAIAAGm1bNkySdKcOXNS7jNp0iSNHTtWzz//vBYvXqz7779f999/v5599lktXbpUOTk5qq6u1mmnnRY5pqSkRH/961/1+9//Xi+99JJefvllFRQUaPz48Tr77LMj+40ePVq//e1vtXTpUt15551yu9069dRTtWDBAp144okx45g3b57a2tr04IMP6he/+IVKSkp03HHH6fLLL9fMmTN36nV/61vf0pgxY3TffffpkUcekcfj0fDhw3XYYYepqqoqZl/TNLVgwQL96le/0oknnqiCgoKdei4AAABJMuxkHQwBAACQUe69917deuutevTRRzV16tR0DwcAAAxChEAAAAAZzufz6YQTTlBeXp6eeeaZdA8HAAAMUiwHAwAAyFCbNm3S+++/r1deeUWbNm3S0qVL0z0kAAAwiBECAQAAZKiVK1fqqquuUmlpqS6++GLNmzcv3UMCAACDGMvBAAAAAAAAhgAz3QMAAAAAAADAwEvbcrBAICC/PzuKkCzLyJrXguzEHEWmY44i0zFHkemYo8h0zFFkumyao06nlXJb2kIgv99WY2N7up6+X7nd+VnzWpCdmKPIdMxRZDrmKDIdcxSZjjmKTJdNc7SioijlNpaDAQAAAAAADAGEQAAAAAAAAEMAIRAAAAAAAMAQkLaeQMn4/T41NGyTz+dJ91B2Sl2dIdvOjgZS0RwOl0pLK2RZGTVNAAAAAADALsiob/cNDduUm5uvgoIqGYaR7uH0mWWZ8vsD6R5Gv7JtW21tzWpo2KZhw0akezgAAAAAAGA3ZdRyMJ/Po4KC4kEVAGUrwzBUUFA86KqyAAAAAABAchkVAkkiAMog/C4AAAAAAMgevYZAV111lQ477DCdfPLJSbfbtq0bb7xRc+bM0SmnnKI1a9b0+yABAAAAAACwe3oNgU477TT94Q9/SLl9xYoVWr9+vZYvX66f//znuv766/tzfFnL5/OlewgAAAAAAGAI6bUx9PTp07V58+aU21955RXNnz9fhmHooIMOUnNzs7Zu3arhw4f360D3pKuuulx1dXXyeDxatOgMnXrqaXr77bd0zz2/ld8fkNvt1p133qX29nbdccct+uSTj2UY0ne/e4GOPXa25sw5Si+99Lok6bXXXtZbb72hq6++XkuWXC+Xy6W1a/+jAw44ULNnH68777xNHk+XcnJytXjxtRozZqz8fr/uuuvX+te/3pJpmjrllPnae+/xWrbsUd18822SpJUr39bjjy/TzTffms63CgAAAAAADBK7fXWwuro6VVVVRe5XVVWprq6u1xDIsgy53flx5zJkWelvU3T11derpKREnZ2dOu+8b+vYY4/TL3+5RHfd9QfttddINTU1ybJMPfzwfSosLNSf/vS/kqTm5ubI+MM/TdOUYQRfl2EY2rZtq+6990FZlqW2tlb9/vf3yeFwqKbmX7rnnt/p5ptv1VNPPaa6uq/08MOPyuFwqKmpScXFxbrttl+oublJpaWlev75Z3XKKacO+PtlGIm/Jww+lmXye0RGY44i0zFHkemYo8h0zFFkuqEyR9N2iXi/31ZjY3vMY7ZtRy61/tyaOj3979p+fc5vTK7SvP0re93vr3/9s1as+Ickqa6uVk888ZgOPHCqKitHyO8PqLCwSH5/QDU1/9LPfnZT6PUEVFBQGBl/+GcgEIi8Ltu2ddxxX5dkyO8PqKmpWXfcca02b94owzDk8/lC531b8+cvkGGYkecLBGzNnXui/v73Z3XSSd/Qv/+9Wldfff2AX5rethN/Txh83O58fo/IaMxRZDrmKDIdcxSZjjnas4BtK2BLgYDdfdsO3Q5IAdkKBGz57dD3ZtuWbUv+QOinHX9cb+dS6BxR50x2roCtgMLnijpP3PkTz9U9zkD0+ZVizMnOHz3m0O2Y8yv1mMPP06f3JHTbnefSn789Te58Z5pnw+6rqChKuW23Q6DKykrV1naHNbW1taqs7D1oyVSrVr2jd96p0d13P6Dc3FxdcsmFmjBhkjZsWL8TZ+m+qpbHE3uJ9dzc3MjtP/zh95o27RDdfPOt+uqrLfr+9y/q8azz5n1DV175I7lcOTruuNlyONKW4QEAAABAhM8fUJc/oC5fQB5fQJ2hn12+gDz+gJzb2tTc0pkkWOjDF/6oECFpEJH0XL2FFFHhRJKQIvm5UocUSQOVyLmSBDdxz5MtDEmmacg0JNPo/mmZhgwp+NMwZBmK/AzuH9w3+Jghw5Asw0h+LsuUy4p/nuDt4PMYskzFPY8h04zdN+a2aWhkeYEKcqx0v4UDbrdThFmzZumPf/yj5s2bpw8++EBFRUX90g9o3v6Vfara6W9tba0qKipWbm6uNmxYr48++rc8Ho8++OA9bdnypfbaa6Sam5tUXFyi6dNn6vHH/6bLLrtCUnA5WHFxscrKyrR+/RcaM6ZaK1a8pvz8gqTP1draqoqKCknS888/E3l8+vSZeuqpxzV16iFyOByR5xs2rELDhlXooYfu0x13/G7g3wwAAAAAg4ovYKvL508ZxsTcD2/3ByLHdEVtS3qMPxCzX5fPL48vIH8agwwzOlAIf7FP+oU/9LOPIUV0QGA5zNhAoYdzmaYhU7EhRSTYSBJS9Djm3s4VCkuiz2UYPYQgMpKeK3Wgkiy4Sf3+GobR8y8rgw2VarVeQ6DLLrtMNTU1amho0NFHH63vf//7kStbnXnmmTrmmGP0f//3f5ozZ47y8vJ00003DfigB9LMmYfryScf11lnLdSYMdX62tcmy+1264orFuvqq69QIGCrtLRUd9zxO51zznlauvQXOuusRTIMU9/73gU65phZ+u//vkRXXvlDud2l2nff/dTR0ZH0uc466zu68cbr9dBD9+mww46MPH7yyfO1adNGnXvumbIsh77xjflasOC/JEnHH3+CGhsbNHbs3nvg3QAAAACwK8JhjMcXG5qEQ5Sewphk4U3CMXFhjCcUyOxOGGMaUo7DVI7DkssylOu05LLM0GOminIdynWYkcdcju5tOZHHLeU4jOA5Qo/nOkyVleSpva0rFGoEQ4TEQCUxWOgx8AiFFIM5eAD2NMO27bRktl6vPyFlq63doKqq6nQMZ7dYljngvXnCli79hSZN2kcnnzx/jzzfYP2dINZQSbUxeDFHkemYo8h0zNHUegtj4itf+hLGpDrHngpjXKFgZVfCmOhjcqOPtUw5BvCiM8xRZLpsmqMD2hMIe873vne28vLydMklP0r3UAAAAICd4gvYkYCkL2FMdH+ZLp9/p8KY7sBnYMOYwhyHhhUMrjAGwNBGCDSI3H//H9M9BAAAAAxyycKY1BUtiWFMly9UWZPsmB4CHf9udL/tKYxx7UYYk5OsuoYwBkAWIwQCAAAA0sS2bXn8tjq9sdUxnaGAptMXUJc3WAUT2ZZk34BhqrXD06dAhzAGAIYuQiAAAAAgii90NaT4QKbLmxjQdHpjK2o6oypsOr3dS5K6t3WHOOHlTbsayYSXFeU4TOW5HHKaRkwYU54fG6zsTBgTs2yJMAYAsgYhEAAAADJawLYjDXpTVcx0R
QUyndGBTNxj4VCmO6BJPGZXK2UcoRAm12lFwpbc0J8Cl0Nl+aZynWakkiY6xMlxWpHbuaHt3fuaynVYMSFOjsOMuSJSNjU0BQAMHEIgAAAA7BTbtuUL2MlDl6iqmWRVMZ2+xFCm52DHL88udvY1pIRQJhKoOLsvdx0TykSFNLkxoUswlMmNO0f0dofJZaoBAJmNEAgAACAL+AN27NKjJEuX+lIFEwxv/L0GNLvaVsZpGQlVLeGQpijXoYq4ypekoYwz9n5uXCVN+HinZcRUywAAMNQRAu2GOXOO0ksvvZ7uYQAAgAwUbvibKnRJFcrsVIPgqNDHu4vVMqahlKFMjsNUSa4jWCUTUwWTuFwpvnImfI7cqPsuy5RFtQwAAGlDCJQFfD6fHA5+lQAA7C7bttXS5dP2No92tHm0o82r7W0eddlSU2tXTIVMfIPfZAHOrjb8DV91KVk/GHeeM6HnTDiASRrKOBPPkRsV0jhMqmUAABgqSA6i3HXXrzV8eKUWLDhdknTffXfLsiy99967amlpls/n0wUX/D8dddSxvZ6rvb1dV111edLj/v73Z/Xoo3+UZGjChAn66U9/rvr6Hbrllpu1ZcuXkqQf//gnGjasQlde+UM98sj/SpL+/OdH1NHRrvPOu0iXXHKhJk7cR6tXv6+vf32uRo8eo4ceuk8+n1fFxW5dd93PVVZWrvb2dt1xxy365JOPZBiGvvvdC9Ta2qrPP/9MP/jB5ZKkp59+QuvXr9Oll17e7+8pAACZwOsPhEIdj7a3ebWjrSsS8Oxo82hHu0fbW4M/k1XUWKaRsNQofDvPaak035UQyuxK5Uz4SkwmoQwAABgAGRsC5XyyTLkfP9qv5+zc7wx17bsw5fbZs+foV79aGgmBXnvtZd1226+1aNEZKigoVGNjoy666FwdeeQxvf4fM5fLpZtuuiXhuC++WKeHHrpfv//9/XK73WpubpIk3XHHrZo6dZpuvvlW+f1+dXR0qKWlucfn8Hq9uu++RyRJzc3NuueeB2UYhp555kn96U8P6/vf/5EefPAPKigo1MMP/zWyn8Ph0MMP36+LL/6BHA6Hnn/+GV1xxeI+v48AAGQC27bV3OnTjvZwuBOs3Om+HfxZ3+ZRU6cv6TnceU6VFzg1rMClMaNLVJ7v0rBCl8rzXSovcGlYQfDnqMoiNTV17OFXCAAA0L8yNgRKh0mT9lVDQ722b9+mhoYGFRUVqbx8mH71q9v0wQfvyTBMbdu2TfX1O1RePqzX8919928Tjlu1aqWOO2623G63JKm4uESStGrVSl1zzc8kSZZlqbCwsNcQaPbsOZHb27Zt1XXXXaUdO7bL6/VqxIiRkqR33qnRz352U2S/4uJiSdLBB0/Xm2++rrFj95bP59P48RN24p0CAGDgeHwB1ccEO56Yqp3o6p1kVTsuy4iEN9WleZo2qiRyPzrYKct3ymmZfRoTy6UAAEA2yNgQqGvfhT1W7QyU4477ul577RXV1+/QrFnHa/nyv6uxsVH33fdHORwOLVx4ijweT6/n2dXjolmWJdvu/setx9MVsz0vLy9y+/bbf6kzzjhLRx55jFatekf3339Pj+c++eT5euSR+zVmzFiddNIpOzUuAAB2VnTVTnjZ1Y42b9TtnavaqS6LCnbiqncKcyxCGwAAgCQyNgRKl1mz5uiXv1yixsZG/eY39+jVV19SaWmpHA6HVq16R7W1X/XpPK2trUmPmzZtuhYvvkJnnHGWSkqCy8GKi0t08MHT9eSTy3T66d+KLAcrKytXQ0O9mpoalZeXr7feekMzZx6W9Pna2lo1bNhwSdILLzwXeXz69Jl6/PG/Rfr/NDc3q7i4WPvvP1lbt9Zp7dr/6MEH/7I7bxkAYAgLV+10V+wkX5bVe9VOTkLVTnT1Tnm+U44+Vu0AAAAgOUKgOOPGjVd7e5sqKio0bNgwHX/8ifqf//mRvvOd/9K++35N1dVj+3SeVMeNGzde55zzPV1yyYUyTUuTJu2jq6++Xj/4wY/1y18u0bPPPiXTtPTjH/9EkycfoHPPvUAXXHCOKiqG9/jc3/vehfrpT3+ioqIiHXzw9EiD6XPOOU9Ll/5C3/726TJNS9/73gU65phZkqTjjpujzz77T2SJGAAAUnfVTkLT5DZv8Hbo8d6qdoIhjjOmaie+eqfARdUOAADAnmLY0euN9iCv16/GxvaYx2prN6iqqjodw9ktlmXK7w+kexg77corf6jTT/+WDjlkRsp9BuvvBLHc7vyE/96ATMIc3TPiq3aS9dvpqWonx2FGApzwsqz4PjvDQr12sq1qhzmKTMccRaZjjiLTZdMcragoSrmNSqAhqKWlRRdccI4mTJjYYwAEAMh88VU7scFOl3a0eyPhTnOKqp3SPGcozHFqbFlJQrATvk3VDgAAwOBGCLSbPv/8M91447WKrqdyOp26996H0jeoXhQVFenRRx9P9zAAAD3w+AKRhsnxlzyP77fjC/RctTO2LF8HjyqJrdgJNVLOxqodAAAAJEcItJvGj5+ghx9+dFAuBwMA7Fm2baup05ci2PEEq3ZCV8vqrWpnWIFLY8vyqNoBAABAn2VcCGTbNv9ozRBpahcFAINOV6jXzo42T8Ilz+P77fRUtTOswKWx5fk6eHRJzCXPqdoBAABAf8ioEMjhcKmtrVkFBcUEQWlm27ba2prlcLjSPRQASIvoqp3Ey593993prWonHODsXZan8oKcpA2VqdoBAADAnpBRIVBpaYUaGraptbUx3UPZKYZhZGXVjMPhUmlpRbqHAQD9Kly1E7zkedTlz9t3rWrnkDFulRc4I5c8D/fhoWoHAAAAmSajQiDLcmjYsBHpHsZOy6ZLyQHAYBSu2omt0kms2tne5lFLV2LVjiHJHV21U56f8jLoVO0AAABgsMqoEAgAgGhdvkDyUCdSvePV9tYu1bd7U1bthEOcvaOqduKDndI8qnYAAACQ/QiBAABp4/UHtGpzkzb+u06bt7f2uWqnNN8ZCXHCVTvBKh5nzJWyqNoBAAAAuhECAQD2qB1tHr31Rb3eWFevt9c3qN3rl5S8aid4PzbYoWoHAAAA2DWEQACAAWXbttZua9Mb63bojXX1WvNVi2xJFYUuzd2vQkeOK9exX6uSv9ND1Q4AAAAwgAiBAAD9rtPr18qNjXpjXb3eWLdDW1s9kqT9q4p04eHVOmpcuSYNL4iEPsV5TjV2edM5ZAAAACDrEQIBAPpFbXOn3gwt81q5sVFdvoDynZZmji3VRePKdPjeZRpW4Er3MAEAAIAhixAIALBL/AFbH9W26I11O/T6unp9uq1NkrRXSa7mT6nSUePKNXVUiVwO+vcAAAAAmYAQCADQZ61dPv1rQ4NeX1evt9bVq6HDK9OQDhxZokuP3ltHjivX2LI8evsAAAAAGYgQCADQo82NHXp9Xb3e+HyHVm1uki9gqyjHocP3LtWR48p12NhSleQ50z1MAAAAAL3oUwi0YsUKLVmyRIFAQIsWLdKFF14Ys/3LL7/U4sWLVV9fL7fbrVtuuUVVVVUDMmAAwMDy+QP6
YEtzpKnz+voOSdLeZfk6c9pIHTm+TAfsVSKHSbUPAAAAMJj0GgL5/X7dcMMNeuCBB1RZWamFCxdq1qxZmjBhQmSfX/ziF5o/f76++c1v6p///Kduu+023XLLLQM6cABA/2ns8Oqf6+v1xuf1+uf6BrV0+eQwDR08ukQLDtxLR44r0yh3XrqHCQAAAGA39BoCrV69WtXV1Ro9erQkad68eXrllVdiQqDPP/9cV111lSTp0EMP1cUXXzxAwwUA9AfbtrVuR3uk2mf1lmYFbKks36ljJ5TryPHlmlntVoGLVcMAAABAtuj1X/d1dXUxS7sqKyu1evXqmH323XdfLV++XOecc45eeukltbW1qaGhQaWlpf0/YgDALvH4Alq1uVGvfx4MfrY0d0mS9hleqO/OHKOjxpVpv6oimTR1BgAAALJSv/wv3iuvvFI///nP9cQTT+iQQw5RZWWlLMvq8RjLMuR25/fH06edZZlZ81qQnZijQ9e2li79Y+02vfafbXrz8+1q9/iV4zB1+Phy/fexFTp20nCNKMlN9zCZo8h4zFFkOuYoMh1zFJluqMzRXkOgyspK1dbWRu7X1dWpsrIyYZ/f/OY3kqS2tjYtX75cxcXFPZ7X77fV2Ni+K2POOG53fta8FmQn5ujQYdu2/rO1NXg1r3X1+qi2RZI0vNClE/cbriPHlemQ0W7lOkNBvR3IiLnBHEWmY44i0zFHkemYo8h02TRHKyqKUm7rNQSaMmWK1q9fr02bNqmyslLPPfecbrvttph9wlcFM01T99xzjxYsWLD7owYA9EmH16+aDY16Y90OvflFvba1emRImjyiSP/viLE6clyZJlYUyGCZFwAAADCk9RoCORwOXXvttTr//PPl9/u1YMECTZw4UXfeeacmT56s2bNnq6amRkuXLpVhGDrkkEN03XXX7YmxA8CQ9VVzZ6Sp8zsbG+Xx2ypwWTp0bKmOHFemw/cuU1m+K93DBAAAAJBBDNu27XQ8sdfrz5pSq2wqG0N2Yo4Ofv6ArTW1LXpj3Q69/nm9PtveJkka5c7VUePKdeS4Mk0dVSKnZaZ5pLuGOYpMxxxFpmOOItMxR5HpsmmO7tZyMABAerR2+fT2+obQMq8GNXZ4ZRnSQaNK9INjxunIcWWqLs1jmRcAAACAPiEEAoAMsrGhI1jts65e721ukj9gqyTXof/f3p3HR1Uf6h9/zmxZyUqYJBICgbAmIC5AFVyCiAKKsthWu9xW6r39tcXivpW23KL2Vtti77221Htp9fZ2uRXFiq22YAVcQFWcb8IAACAASURBVFwawiqEACHJAFnInsxyfn8EQxISAmQ5k5nP+/XylUzOmZlnJl9Gvg/fc85nRiRpRlaSpg1PVFyk0+qYAAAAAAYgSiAAsJDPH9DHR6u1ubBcWwordLiyQZKUlRytOy4dqhlZScpJj5PDxmofAAAA4LyYpiTzzK/tfqaWr/7wqEfC41UCQBCpqvfqnaIKbT5QofcOVai2yS+n3dClGQn67OR0XZmVpIvio6yOCQAAcH78XhneWhnNdTK8NS1fm2tkeOtkOH2KrGtUlxPwTibqRuuEXecwmVeb/TvZr8vH7rhNnTx2y1ej9WfqJkvLPkZ3WTq8rs73ty5P5+9T13na7X8OpctZf189zGJ8+hjnwYwYJOP2zTKjB5/3fQcSSiAA6GOmaepAeb22HGg5zKugtFoBU0qKdmpmdoqmZyVpSmaiol12q6MCAIBwc5bixvZpgdNcI6O5tuU/b9uvHfbxN531qbo+VW3/MWVIhk0yDEmn/vv0/IptfmYabbe12eeMn3XcX2dsO3N/dfH4vZTn1FezbR7bqdud7q82+59DnnaP31ke29mznO29ucAsZ75HZ39vTu9/+rEik9NkRiZ0OXZCBSUQAPSBJl9AHxypar2Me2l1y1+Kxrljdee0YZqelayx7ljZOKkzAAA4X/1Y3HzKdETKdMYq4IqV6YyV6YpVIDZVfucoma6W25/+vHU/V6xM1yCZzhgNSk5WdXXTGZP9rguFs0/mzU6Kg7OXC/ydC2cXkRAthcjVwc6GEggAesnx2ia9XVihLYUV2nqoUo2+gCIdNk3JTNRXpw7TlVlJSomNsDomgLMxTckMqGU5eeD09z5byzYmEQAulGXFzSAFXDEXVNyc/hor2Xo4dUyIVkChP8EGgh0lEABcoIBpao+nVltOndR5t6dWkpQ6KELzJrg1fWSyLh0ar0gnh3n1qq4m6aYp44yf6/T3pilDp79vOW687TazzbbAqW1mm/ud+XwyAx22me2e32iz3znlbXP/9ts6PG672+3zGh1e15nPd455Ozy/cUb+0/c3OnlfPr1tdPI8p1+j2eXrPzNvh99Xx+dr+/vq4v3u9n3t5vwBKZJMe0TLpMoeIZ36ajoipXY/j5Bpj5R56mvL7QiZjqgO+316/64e6/R+svE5AljC6uLmVAljWXEDIOTwqQCg93Ux4Ws/oe5ukq4OE7vuJqhnm6R3UQq0naCe4yS92RdQ4fEa7T9eo0+O16q+2Se7ApoXH6llo6M0KjlaKTGOluesNqWdZ5ukd/V8F1YqGJ39/GyT9DPKhrbP2VnZcK6T9LaP2/b97nqSbhhScqDj76WL3xfO2+nzH9jUuiT+1Pfmpz9vXS5v67Dt0yX07fc7vc12atvp/dpva/N8rY/bcts0zsxltn5/Zi6zk/yf7md28XMZRtevv2PetucxOPW6zFOvKyrCocbaU5M4X6MMf1PLf75G6dRXw98ko+mkDF9Tm/0aW7f16Hdoc7QplFoKps4Lpcgz9/t0W1fFVYcS6/TtCMnmYvUTBp6gKW7SKG4ABB0+YYBQ5G+S41i+nCVb5SzZKkfjMSX6fOcwST/bv+63L0vCeZI+WNKUT284T32tl3T41H8XoP1JCttPoDufpLedvKrDRLz9RLvzSXrbbeowQba1nljPtNm7n6S3uf8ZJ1tszX62SbrkinCpqTnQ4fV3nIjb2r/+DpP0dq//VC6zw+vq9H3tJJfZxftyxuN2UiqYHcqSts9rdvg9tN/W/jW3fw/OIW8nJc7pHOipiIRo1ffkPAGmKQWaW0ojX9Opcqjlq3xtC6U2pZGvsbVQ6lg2nf6+sWW/xirZznisU/uY/guPLaND8RTZRfF0biubThdPHVdJddjPEXH68wXhIeiKm0Ey2xxCRXEDIFTwaQWEAm+9nGUfylnynpylW+Us+7D1L0C+xNEyU0bJ7zN7MEE12mw7jwlqx9UEnfxr/Ln9q39XqxQ6FicdVyl0LFU6vs7O8/pNQ5+cqNcHR2v0YfFJHapsVEA2pcVF6pJhibpsWJLGuwfJ7nCcPe8ZpULXKxXCXUJCtOrC4ER8CGOG0VJy2COkCF3AhWt7IOBrUwo1nS6OziiU2qxyOlVAnVFKtSuYmmT46mU0VraWWu1WRfV49ZOrTfHUSdnUcWVTa1nV9cqm9qukIjt9LNmd3YdDi2Atbj4tZyhuAOAMfPIBA5DRdFLO0u0tpU/JVjmO58sI+GQaNvkGT1BDzhflTZ8qb9oUmVH
JSkiIVjUT7LOqafTp3aKWkzq/c7BCJxt9stucmnxRlm6emKwrRyQpMym63X1Cf80TgJBgc7RMeBXbv+WTGehQMHVeNsnfdPqwudbiqamL4un0Y9m8Fafu03FbY7fndzprbMPeYcVSx5VNnRVPbcqqTs8TFdX1Kqk23/fLPwoEeXETaFPSUNwAQO/jUxQYAIz6Ey0rfD49vOvELhkyZdqc8g2ZpIaL/7ml9Em9TGZEnNVxBwTTNHWosqH1Eu4fF5+U35TiIx26MitJ07OSNS0zUYMi+ZgEgAti2FrKD0dUP5dPphTwnnFI3RnnaGotmNoWT43ty6o2+7eWTc01svmOd76CKuDtWfQzzuvUWaF0+rC7tiubbFERiqmpsri4GXTq0CmKGwAIVnwiA0HIVlPSusrHWbpVjsr9klr+QuZ1X6r6y5e1lD7uSyRnlMVpBw6vP6CPik+2Fj9HqholSaMGx+iLl2doelaSctLiZLdxeBYADFiGIdldMu0uyTWonw+9859a2dQkw9fQ4fxPXRxS1/HwvLOeeLy6w37tTzwe1aPiJrZNgUNxAwChik93wGqmKfvJg62rfJwlW2WvOSJJCrgGyZt2uRrHLpY3fZp8KbmS3WVx4IGlsr5Z7xys1JbCcr1bVKm6Zr9cdkOXDUvQ5y8dqulZSUqLi7Q6JgAgFNjski1apjNaphL773lNUwnxkaqq7tl5mAAAoY8SCOhvZkD2ir3tS5/6Y5KkQFSyvGlT1DDpzpbSJ3lcy18occ5M09T+E3XaUlihzQcqVFBaLVPS4BiXZo1J0fSsZE3JTFCUk/cVABAiDIO/LwAAzgklENDX/F45ThTIWbKt9fAuW9PJlk2xafIOvVL16VPlTZ8mf8JIrhR1ARq9fn1w5KQ2F5ZrS2GFPDUt/xI6PnWQvnZFpmZkJWn0kFjZeG8BAAAAhDFKIKC3+RrlPPbx6ZU+pdtl+FquzOWLH6GmrBvlTZ8mb/pUBQYNpfS5QMdqmrTlYIW2HCjXtsNVavIFFOW0aWpmou76TKauGJGowbERVscEAAAAgKBBCQT0VHOdnGXbT5c+no9kBJolSb7ksWoct1jetGnypk9RIMZtcdiBK2Ca2l1Wo82FLZdx33usVpKUHheh+Tmpmj4ySZcMTVCEw2ZxUgAAAAAITpRAwHkyGivlLH3/VOnznhzHC2SYfpmGXb6UHDVM/ErLlbvSLpcZ2Y8nhQxBdc0+bTtUpc0HyvX2wQpV1HtlM6SJ6XH65owRmp6VpKzkaBmspgIAAACAblECAd2w1XlazudT2nLJdkf5HkmSaXPJ656s+ku+IW/6VPlSL225pCp6pLiqQW+fWu3zQXGVvH5TsRF2XTE8SdNHJukzw5OUEOW0OiYAAAAADDiUQEBbpilbTXHrKh9nyVY5Th5s2eSIljftMtWNuqllpc+QiyUHlxbvKV/A1I6Sam0pLNfmwgodLG85f9LwpCh9dvJFmp6VpEnpcXLYOcwLAAAAAHqCEgjhzTRlrzrQWvg4S7bKXlsiSQpExMubNkWNE+5oWekzOEeyswKlN1Q3evXuwUptLizXu0WVqm70yWEzdMnQeN06MU3TRyQpIzHK6pgAAAAAEFIogRBeAn7Zy/fIVfKenKUtpY+tobxlU1SKmtOnqn7y1+VNnyp/8ljJYPVJbzBNU0UVDa2rffKPnpTflBKjnJoxMlkzspI0NTNRsRF8JAEAAABAX2HGhdDm98pxPL/N5drfl625umXToAw1D7u25dCu9Knyx4/gcu29yOsP6MPik9pSWKHNB8p19GSjJCk7JUZfnpKh6VnJGp86SHYb7zkAAAAA9AdKIIQWX4Ocno9Olz5lH8jwNbRsShylpk/P55M+VYFBF1kcNvRU1De3ntR566FK1TX7FeGw6fJhCfri5UN15YgkpcZxHiUAAAAAsAIlEAY0o7lGjtLtcpVslbN0qxyej2UEvDJlyDd4vBrGf/7U5dqnyIxOsTpuyDFNU/uO12lLYbm2FFZoZ2mNTElDYl2aPXaIpmcl6fJhCYp02q2OCgAAAABhjxIIA4rRUNF6Lh9nyVY5TuyUYQZk2hzypUxUw6Ql8qZPkzftMpkR8VbHDUmNXr/eP1ylLYUV2lJYrmO1zTIkTUgbpH++MlPTs5I1OiVGBofWAQAAAEBQoQRCULPVlp4+tKtkqxyV+yRJpj1C3tRLVH/p0pbSJ/USyRltcdrQVVbdqLcPthzm9f7hKjX5Aop22jVteKKmZyXpihFJSo5xWR0TAAAAAHAWlEAIHqYpW/UhOUu2thzeVbJV9upDkqSAM1a+tMtUO2ZBy+Xah0yU7BEWBw5dAdPUrrIabS6s0JYD5dp3vE6SlB4fqVtyUzUjK1mTh8bL5eDqaQAAAAAwUFACwTpmQPaKT9oc3vWe7HUeSVIgMlHetClqyP2nltJn8HjJxnDtS7VNPm07VKnNhRV652CFKuq9shvSxIvitfSqEZqelazhSVEc5gUAAAAAAxSzavSfgE+OE7taCx9n6TbZGislSf4Yt7zp01SfPlXetKnyJ2VLBqtM+lpxVUPrap8Pi0/KFzAVF+nQZ4YnakZWsqYNT1R8lNPqmAAAAACAXkAJhL7jb5LjWP6pw7vek6N0u2ze2pZNcZlqGn796cu1xw2TWGHS53wBU/84erL1pM5FFQ2SpBHJ0br90os0PStZuelxctj4XQAAAABAqDmnEmjTpk1auXKlAoGAFi9erLvuuqvd9pKSEj344IOqqamR3+/Xfffdp6uvvrpPAiOIeevlLPvw1CqfrXKWfSjD3yRJ8iWNUdOp8/l406YoEJtmcdjwUdfs06Z/lOiNglK9c7BSNU0+OWyGLstI0KJJ6boyK0lDE6KsjgkAAAAA6GPdlkB+v18rVqzQmjVr5Ha7tWjRIuXl5WnUqFGt+zz77LO68cYbdfvtt2v//v266667tHHjxj4NDusZTSflLN3eUvqUbJXjeL6MgE+mYZNvcI4acr7UWvqYUUlWxw1LVfVe3fm7j3W4skFJ0U5dMypZM0Yma0pmgmJcLAQEAAAAgHDS7SwwPz9fmZmZysjIkCTNnTtXGzZsaFcCGYah2tqWw3xqamo0ZMiQPooLKxn1x1vO51O6raX0ObFLhkyZNqd87ovVcPG/qDl9qnxpl8l0DbI6bthr9Pq17OUCeWqatPoLl2jSkBjZOOQOAAAAAMJWtyWQx+NRampq62232638/Px2+3zzm9/UnXfeqf/5n/9RQ0OD1qxZ0/tJ0e9sNUdbV/k4S7bKUXVAkmQ6IuVNvUz1U+5pWenjniw5OJwomPgCph55dbd2ltbohzeP17Vjhqiqqt7qWAAAAAAAC/XK8SDr16/Xrbfeqq9+9av66KOP9MADD+jVV1+Vzdb11Z3sdkMJCdG98fSWs9ttA/+1mKZUcUDG4XdkO/KujMPvyDh5pGVTRJzMjGnyX/JFmcM+IzN1kgy7Sy5JLmtToxOmaeo7r+zU5sIKfXfeON16+bDQGKMIaYxRBDvGKIIdYxTBjjGKYBcuY7TbEsjtdqusrKz1ts
fjkdvtbrfPH//4Rz333HOSpMmTJ6upqUmVlZVKTk7u8nH9fjNkViYkJEQPvNdiBmQv39O6ysdVslW2huOSpEBUsprTp8qbu0TN6dPkTx4r2eyn71vjk+SzJje69dy7h/T77cX6pykZmjcmRVVV9QNzjCKsMEYR7BijCHaMUQQ7xiiCXSiN0ZSUrk/P0m0JlJubq6KiIh05ckRut1vr16/X008/3W6ftLQ0vfvuu1qwYIEOHDigpqYmJSVxIuCg4vfKcaKgtfRxlm6Trelky6bYdDVnzDh1ufZp8idkcbn2AeqVHWX6xTuHNHf8EP2/6cOtjgMAAAAACCLdlkAOh0PLly/XkiVL5Pf7tXDhQmVnZ2vVqlXKycnRzJkz9dBDD+mxxx7Tr371KxmGoSeffFIGJYK1fI1yHvu4TemzXYavpdX0JWSpaeScU1fumqZA3FCLw6I3vF1Yocf/uk/ThifqsetH82cQAAAAANCOYZqmacUTe73+kFlqFQzLxozmWjnKPjhd+ng+khFoliT5ksfJmz5VzenTWi7XHsPV20LNztJq/csf8jU8KVo//+zEMy7/HgxjFDgbxiiCHWMUwY4ximDHGEWwC6Ux2qPDwRCcjMZKOUvfP1X6vCfH8QIZpl+mYZcvJVcNE78ib/o0edMukxmZaHVc9KEjlQ369ks7lRTj0k8W5JxRAAEAAAAAIFECDRi2Oo+cJdvkLG25ZLujfI8kybRHyOu+WPWXfvPU5dovlVwxFqdFfymva9a3Xtwh0zT1zIIcDY7hem0AAAAAgM5RAgUj05Stprh1lY+zZKscJw+2bHJEy5t2uepG3dxS+gyZJDkiLQ4MK9Q3+7XspQKdqGvWs4snKjMp9C9nCAAAAAC4cJRAwcA0Za/cf7r0Kd0qe22pJCkQES9v2lQ1TviCvOlT5UvJkWz82sKdzx/Qw6/u0t5jtfrR/AnKTY+zOhIAAAAAIMjRJlgh4JejfHfrKh9n6TbZGsolSf7oIfKmT1V9+lR506fKnzRGMmwWB0YwMU1Tj//1E71zsFIPz8rWVSOTrY4EAAAAABgAKIH6g79ZjuM72pQ+78vWXNOyKW6YmjPz5E2bKm/6FPnjR0hc2htn8Yt3DulPOz1aMm2YFkxMszoOAAAAAGCAoATqC94GOT0ftrlc+wcyfI2SJF9itpqy57eczydtqgKD0i0Oi4Fk7T9K9F/vHdb8nFTddUWm1XEAAAAAAAMIJVBvaKyW69Cm1tLHcewfMgJemTLkGzxBDePvOFX6TJEZPdjqtBig3tp/Qj/csF9XjkjSQ7OyZbBiDAAAAABwHiiBeijmnR/I8fFqxZsBmTaHfEMmqeHir7Uc3pV2mcyIeKsjIgTkl1Tr0fV7NNY9SE/cNE4OGwUQAAAAAOD8UAL1kDf1MkVMj1NN0qXyuidLTi7Tjd5VVFGve14q0JBYl35y6wRFOe1WRwIAAAAADECUQD3UnHWDAgkL5K2qtzoKQtCJ2ibd/eIO2QxDzyzMVVK0y+pIAAAAAIABihIICFK1TT7dvbZAlQ1e/fy2SRqaEGV1JAAAAADAAGazOgCAM3n9AT34yi4dOFGnJ24ar/Gpg6yOBAAAAAAY4CiBgCBjmqb+9fV92na4So9eP1pXjkiyOhIAAAAAIARQAgFB5j+2FOnPu4/p61cO1005qVbHAQAAAACECEogIIj84aOj+vW2I1o4KU1fmZphdRwAAAAAQAihBAKCxMZ9x/XUxgO6emSy7s8bJcMwrI4EAAAAAAghlEBAEPio+KS+89oe5aTF6Qdzx8puowACAAAAAPQuSiDAYoXldbr35Z1Ki4vUj2+doEin3epIAAAAAIAQRAkEWOhYTZOWvlggl8OmZxbmKiHKaXUkAAAAAECIogQCLFLb5NPdawtU2+TTqltzlB4faXUkAAAAAEAIc1gdAAhHzb6A7l+3Uwcr6rVqQY7GuGOtjgQAAAAACHGsBAL6WcA09f2/7NX2Iye1fPZoTc1MtDoSAAAAACAMUAIB/WzVW4V6Y+9xfWvGCM0Z77Y6DgAAAAAgTFACAf3oN9uL9b8fHNVnJ6fri5cPtToOAAAAACCMUAIB/eSNPcf007cKlZc9WMuuGSnDMKyOBAAAAAAII5RAQD/YfrhK3/vLXk2+KE4r5oyV3UYBBAAAAADoX5RAQB/bf7xO963bqaEJUXrqlgmKcPDHDgAAAADQ/5iNAn2orLpRS9fuUIzLrmcW5Cgu0ml1JAAAAABAmHJYHQAIVdWNXi1dW6D6Zr+e+9zFSo2LtDoSAAAAACCMUQIBfaDJF9B9L+9UcVWDnlmQq1EpMVZHAgAAAACEOUogoJf5A6aWv7ZHHx2t1sq5Y3XZsASrIwEAAAAAwDmBgN5kmqZ+/OYBbfzkhJZdk6Xrxw6xOhIAAAAAAJLOcSXQpk2btHLlSgUCAS1evFh33XVXu+2PP/64tm7dKklqbGxUeXm5tm/f3vtpgSD3wvvF+sPHJbr90ot0+6VDrY4DAAAAAECrbksgv9+vFStWaM2aNXK73Vq0aJHy8vI0atSo1n0eeeSR1u9feOEF7dq1q2/SAkHstV0e/WzzQc0ak6K7r86yOg4AAAAAAO10ezhYfn6+MjMzlZGRIZfLpblz52rDhg1d7r9+/XrNmzevV0MCwW5rUaVWvL5Pl2bE63s3jJHNMKyOBAAAAABAO92WQB6PR6mpqa233W63PB5Pp/sePXpUxcXFmjZtWu8lBILcXk+tHnhll0YkReup+RPkcnCqLQAAAABA8OnVq4OtX79es2fPlt1u73Zfu91QQkJ0bz69Zex2W8i8Fpyf4sp6LXt5p+KjnVrzlcuVGhdpdaROMUYR7BijCHaMUQQ7xiiCHWMUwS5cxmi3JZDb7VZZWVnrbY/HI7fb3em+r732mpYvX35OT+z3m6qqqj/HmMEtISE6ZF4Lzl1Vg1dLfvuxGr1+PbdokiIDgaAdB4xRBDvGKIIdYxTBjjGKYMcYRbALpTGakjKoy23dHreSm5uroqIiHTlyRM3NzVq/fr3y8vLO2O/AgQOqrq7W5MmTe5YWGAAavX7d89JOlVY36ulbJigrOcbqSAAAAAAAnFW3K4EcDoeWL1+uJUuWyO/3a+HChcrOztaqVauUk5OjmTNnSmpZBTRnzhwZnBAXIc4XMPXo+j0qKK3WkzeN0+Sh8VZHAgAAAACgW4ZpmqYVT+z1+kNmqVUoLRvD2ZmmqSf/tl9r80t1f95I3Tb5IqsjnRPGKIIdYxTBjjGKYMcYRbBjjCLYhdIY7dHhYABOW7P1iNbml+pLl2cMmAIIAAAAAACJEgg4Z38qKNOzbxfpxnFD9I0Zw62OAwAAAADAeaEEAs7B2wcrtPKNfZqamaDvzB4tG+e+AgAAAAAMMJRAQDd2ldXo4T/t0qiUWP3w5vFy2vljAwAAAAAYeJjNAmdRXNWgZS8VKDHKqZ8uyFGMq9sL6gEAAAAAEJQogYAuVNQ361sv7pA/YGrVwlwNjnFZHQkAAAAAgAvGsgagEw1ev5a9tFPHa5v1n4snanhStNWRAAAAA
ADoEVYCAR34AqYe/tNu7fHUaOXcsZqYHmd1JAAAAAAAeowSCGjDNE09+ddP9PbBCj04c5SuHjXY6kgAAAAAAPQKSiCgjV++e0jrCsr01WnDtGBSutVxAAAAAADoNZRAwCkv5Zfql+8e1k0T3PqXKzKtjgMAAAAAQK+iBAIkbT5Qrif/9omuGJGoR2ZlyzAMqyMBAAAAANCrKIEQ9naUVOvhV3drzJBYPTFvvBx2/lgAAAAAAEIPs12EtUMV9Vr2UoFSYl366YIcRbvsVkcCAAAAAKBPUAIhbJ2oa9bStQWyGYaeWZCrpGiX1ZEAAAAAAOgzDqsDAFaoa/Zp2doCVdQ16+e3TVRGYpTVkQAAAAAA6FOsBELY8fkDeuiV3frkeK2euGmcJqTFWR0JAAAAAIA+RwmEsGKapn7wxj69d6hSj8warelZyVZHAgAAAACgX1ACIaw8+3aR1u86pn++IlM356ZaHQcAAAAAgH5DCYSw8X8fl2jN1iO6dWKq7pw2zOo4AAAAAAD0K0oghIU3PzmhH23YrxlZSXpgZrYMw7A6EgAAAAAA/YoSCCHv4+KTemz9bk1IG6TH542Tw0YBBAAAAAAIP5RACGkHy+t177qdSo2L1E9uyVGk0251JAAAAAAALEEJhJB1vLZJS1/cIYfN0KoFOUqIdlodCQAAAAAAyzisDgD0hdomn+5eW6DqRp9+8dmJGpoQZXUkAAAAAAAsRQmEkOP1B3T/K7tUWF6vn946QWPdg6yOBAAAAACA5TgcDCElYJr6/l/2avvhKi2fPVrThidZHQkAAAAAgKBACYSQ8u+bDur1Pcf1jenDNWe82+o4AAAAAAAEDUoghIzffnhUL2wv1uKL0/XlKRlWxwEAAAAAIKhQAiEk/HXvcf3kzQO6ZlSy7r12pAzDsDoSAAAAAABBhRIIA94HR6r03T/v0cT0OP3rnLGy2yiAAAAAAADoiBIIA9r+E3W6b91OXRQfqadvmaBIp93qSAAAAAAABCVKIAxYZdWNuvvFHYpy2vXMwlzFRzmtjgQAAAAAQNA6pxJo06ZNmj17tmbNmqXVq1d3us9rr72mOXPmaO7cubr33nt7NSTQUU2jT3evLVBds1+rFuQoLS7S6kgAAAAAAAQ1R3c7+P1+rVixQmvWrJHb7daiRYuUl5enUaNGte5TVFSk1atX67e//a3i4+NVXl7ep6ER3pp8Ad23bqcOVzbomYU5yk6JtToSAAAAAABBr9uVQPn5+crMzFRGRoZcLpfmzp2rDRs2tNvnD3/4g+644w7Fx8dLkpKTk/smLcJewDT1vT/v0YfFJ/W9G8bo8mGJVkcCAAAAAGBA6LYE8ng8Sk1Nbb3tdrvl8Xja7VNUVKSDBw/qc5/7nG677TZt2rSp95Mi7JmmqZ/8vVB/23dCd1+dpdnjhlgdCQAAAACAAaPbw8HOhd/v16FDh/TCCy+orKxMX/jCF/SnP/1JcXFxXd7HbjeUkBDdG09vObvdFjKvJZg9t+WgfvfhUf3TZzL1jZnZMgwuBX+uGKMIdoxRBDvGKIIdYxTBjjGKYBcuVAWFrgAAFXxJREFUY7TbEsjtdqusrKz1tsfjkdvtPmOfSZMmyel0KiMjQ8OHD1dRUZEmTpzY5eP6/aaqqup7ED14JCREh8xrCVZ/2X1MP3x9r64bnaKvf2aYTp5ssDrSgMIYRbBjjCLYMUYR7BijCHaMUQS7UBqjKSmDutzW7eFgubm5Kioq0pEjR9Tc3Kz169crLy+v3T7XXXedtm3bJkmqqKhQUVGRMjIyehgbaLHtUKW+/5e9umRovL534xjZWAEEAAAAAMB563YlkMPh0PLly7VkyRL5/X4tXLhQ2dnZWrVqlXJycjRz5kzNmDFDb7/9tubMmSO73a4HHnhAiYmcsBc9t+9YrR54ZZcyk6L01PwJinB021sCAAAAAIBOGKZpmlY8sdfrD5mlVqG0bCyYlFY36qv/+7FshvTft0+We1CE1ZEGLMYogh1jFMGOMYpgxxhFsGOMItiF0hg92+FgvXJiaKC3nWzwaumLO9TkC+iXn5tEAQQAAAAAQA9RAiHoNHr9uvflnTp6slH/vihXIwfHWB0JAAAAAIABjxOsIKj4A6a+89oe5ZdUa8WNY3XJ0ASrIwEAAAAAEBIogRA0TNPUUxv36+/7y3XPtSN13ZgUqyMBAAAAABAyKIEQNH617Yj++I9SffGyofrcJRdZHQcAAAAAgJBCCYSg8OrOMv3nliLNHpuib141wuo4AAAAAACEHEogWO7dogr94I1PdPmwBH33hjGyGYbVkQAAAAAACDmUQLDUbk+NHnxll7KSo/VvN4+X086QBAAAAACgLzDjhmWKqxr07bUFSohyatWCHMVGOKyOBAAAAABAyKIEgiWq6r26e22B/AFTzyzIVUpshNWRAAAAAAAIaSy9QL9r8Pq17OUCeWqa9B+LcjU8OdrqSAAAAAAAhDxWAqFf+QKmHnl1t3aV1egHc8Zq0kXxVkcCAAAAACAsUAKh35imqR/+7RNtKazQ/XmjdE32YKsjAQAAAAAQNiiB0G+ee++wXt5Rpq9MzdCii9OtjgMAAAAAQFihBEK/WLejVKvfOaS5E9z6+pXDrY4DAAAAAEDYoQRCn9tSWK4n/vqJpg1P1GOzsmUYhtWRAAAAAAAIO5RA6FM7S6v18J92a/SQWP3wpvFy2BlyAAAAAABYgRk5+syRygZ9+6WdSopx6Se35ijaZbc6EgAAAAAAYYsSCH2ivK5Z33pxhyTpZwtzlRzjsjgRAAAAAADhzWF1AISe+ma/lr1UoBN1zfr5bRM1LDHK6kgAAAAAAIQ9VgKhV/n8AT30p13ae6xWT8wbp5y0OKsjAQAAAAAAUQKhF5mmqZV//UTvFlXq4euyNWNkstWRAAAAAADAKZRA6DU/f+eQXt3p0V2fydQtE9OsjgMAAAAAANqgBEKvePEfJfrv9w5rfm6qlnxmmNVxAAAAAABAB5RA6LG39p/Qv23Yr+lZSXroumwZhmF1JAAAAAAA0AElEHokv6Raj67fo3HuQXp83jg5bBRAAAAAAAAEI0ogXLCi8nrd81KBhsS69JNbJyjKabc6EgAAAAAA6AIlEC7IidomLV27Q3aboWcW5iox2mV1JAAAAAAAcBYOqwNg4Klt8unutQWqavDq57dN0tCEKKsjAQAAAACAblAC4bx4/QE9+MouHSiv149vmaDxqYOsjgQAAAAAAM4Bh4PhnAVMUyte36dth6v02PXZumJEktWRAAAAAADAOaIEwjn7j81F+svuY/p/04dr3oRUq+MAAAAAAIDzQAmEc/L7D4/q+fePaOGkNP3TlAyr4wAAAAAAgPN0TiXQpk2bNHv2bM2aNUurV68+Y/vatWs1bdo0zZ8/X/Pnz9f//d//9XpQWGfDvuN6+s0DumZUsu7PGyXDMKyOBAAAAAAAzlO3J4b2+/1asWKF1qxZI7fbrUWLFikvL0+jRo1qt9+cOXO0fPnyPgsKa3xYXKXlr+1Rbnqc/nXOWNltFEAAAAAAAAxE3a4Eys/P
[base64-encoded PNG omitted: 'Learning Curve' figure with Loss and Accuracy panels]\n",
627
+ "text/plain": [
628
+ "<Figure size 1440x864 with 2 Axes>"
629
+ ]
630
+ },
631
+ "metadata": {},
632
+ "output_type": "display_data"
633
+ }
634
+ ],
635
+ "source": [
636
+ "import matplotlib.pyplot as plt\n",
637
+ "import seaborn as sns\n",
638
+ "sns.set_style('darkgrid')\n",
639
+ "\n",
640
+ "history = pd.DataFrame(history.history)\n",
641
+ "fig, ax = plt.subplots(2, 1, figsize=(20, 12))\n",
642
+ "fig.suptitle('Learning Curve', fontsize=24)\n",
643
+ "history[['loss', 'val_loss']].plot(ax=ax[0])\n",
644
+ "history[['accuracy', 'val_accuracy']].plot(ax=ax[1])\n",
645
+ "ax[0].set_title('Loss', fontsize=18)\n",
646
+ "ax[1].set_title('Accuarcy', fontsize=18);"
647
+ ]
648
+ },
649
+ {
650
+ "cell_type": "code",
651
+ "execution_count": null,
652
+ "id": "0b5ad997",
653
+ "metadata": {
654
+ "id": "0b5ad997",
655
+ "papermill": {
656
+ "duration": 0.370269,
657
+ "end_time": "2022-03-31T16:26:16.713170",
658
+ "exception": false,
659
+ "start_time": "2022-03-31T16:26:16.342901",
660
+ "status": "completed"
661
+ },
662
+ "tags": []
663
+ },
664
+ "outputs": [],
665
+ "source": []
666
+ }
667
+ ],
668
+ "metadata": {
669
+ "accelerator": "GPU",
670
+ "colab": {
671
+ "collapsed_sections": [],
672
+ "name": "Text-Classification-Attention-Positional Embeddings.ipynb",
673
+ "provenance": []
674
+ },
675
+ "kernelspec": {
676
+ "display_name": "Python 3 (ipykernel)",
677
+ "language": "python",
678
+ "name": "python3"
679
+ },
680
+ "language_info": {
681
+ "codemirror_mode": {
682
+ "name": "ipython",
683
+ "version": 3
684
+ },
685
+ "file_extension": ".py",
686
+ "mimetype": "text/x-python",
687
+ "name": "python",
688
+ "nbconvert_exporter": "python",
689
+ "pygments_lexer": "ipython3",
690
+ "version": "3.9.7"
691
+ },
692
+ "papermill": {
693
+ "default_parameters": {},
694
+ "duration": 126.733939,
695
+ "end_time": "2022-03-31T16:26:20.094746",
696
+ "environment_variables": {},
697
+ "exception": null,
698
+ "input_path": "__notebook__.ipynb",
699
+ "output_path": "__notebook__.ipynb",
700
+ "parameters": {},
701
+ "start_time": "2022-03-31T16:24:13.360807",
702
+ "version": "2.3.3"
703
+ }
704
+ },
705
+ "nbformat": 4,
706
+ "nbformat_minor": 5
707
+ }
13_Siamese_Network.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
14_Variational_Auto_Encoder.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
15_Object_Detection_Sliding_Window.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
16_Object_Detection_Selective_Search.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
17_Attention_Is_All_You_Need.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
18_Feature_Tokenizer_Transformer.ipynb ADDED
The diff for this file is too large to render. See raw diff
 
19_counterfactual_explanations.ipynb ADDED
@@ -0,0 +1,618 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "raw",
5
+ "metadata": {},
6
+ "source": [
7
+ "---\n",
8
+ "title: 20 Counterfactual Explanations\n",
9
+ "description: A basic tutorial to learn about counterfactual explanations for explainable AI\n",
10
+ "---"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "markdown",
15
+ "metadata": {},
16
+ "source": [
17
+ "<a href=\"https://colab.research.google.com/drive/1mTSRjqki3VsH9MVPfNtJ5nJxcCHvL8B6?usp=sharing\" target=\"_blank\"><img align=\"left\" alt=\"Colab\" title=\"Open in Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"></a>"
18
+ ]
19
+ },
20
+ {
21
+ "cell_type": "markdown",
22
+ "metadata": {
23
+ "id": "J4c6-k2cuM_6"
24
+ },
25
+ "source": [
26
+ "# Counterfactual explanations\n",
27
+ "Counterfactual explanations (CEs) are an important tool from the field of explainable artificial intelligence (XAI). \n",
28
+ "This notebook teaches what CEs are, why they are important, and provides how they can be discovered.\n",
29
+ "\n",
30
+ "## To begin with: What is *XAI*?\n",
31
+ "XAI is a subfield of AI concerned with developing methods to help us use AI systems in a fair, safe, and responsible manner.\n",
32
+ "To do that, XAI aims at *explaining* why an AI system (typically, actually a machine learning model) behaves the way it does.\n",
33
+ "There are two main categories of XAI methods:\n",
34
+ "\n",
35
+ "1 - Methods to understand why very large and complex models, like deep neural nets and large ensembles of decision trees, come to certain decisions/predictions. \n",
36
+ "These models are typically called *black-box* models.\n",
37
+ "\n",
38
+ "2 - Methods to generate models that are so simple that they can be interpreted directly. Models of this type are, e.g., decision trees, rule sets, and equations found by symbolic regression.\n",
39
+ "These models are typically called *glass-box* models.\n",
40
+ "\n",
41
+ "## A brief intro to CEs\n",
42
+ "CEs belong to the first category mentioned above: methods to explain black-box models.\n",
43
+ "Let us consider the case in which we have a model that is a classifier, i.e., our model is a function $$f : \\Omega^d → \\mathbb{C},$$\n",
44
+ "where $\\Omega^d$ is our space of $d$ features (some of which are numerical and thus in $\\mathbb{R}$, some of which are categorical) while $\\mathbb{C}$ is the space of classes (for example for a classifier of credit risk, $\\mathbb{C} = \\{ \\textit{High risk}, \\textit{Low risk} \\}$).\n",
45
+ "\n",
46
+ "Say $\\mathbf{x} \\in \\Omega^d$ is a possible input for our classifier $f$.\n",
47
+ "$\\mathbf{x}$ represents a user. For example, $\\mathbf{x}$ can be the:\n",
48
+ "$$\\mathbf{x} = ( \\textit{ age : 22, gender : Female, savings : 5.000\\$, job : student, } \\dots ). $$\n",
49
+ "For a given $\\mathbf{x}$, $f$ will predict a certain class $c$ (e.g., \"$\\textit{High risk}$\").\n",
50
+ "Now, a CE aims to answer the question:\n",
51
+ "\"What **small change** is needed to $\\mathbf{x}$ such that the new input $\\mathbf{x}^\\prime$ will cause $f$ to produce the desired class $c^\\star$? (e.g., $f(\\mathbf{x}^\\prime) = \\textit{Low risk}$).\n",
52
+ "\n",
53
+ "A CE is a possible answer to the question above. \n",
54
+ "For example, an answer could be that the user needs to increase their savings ($\\textit{5.000\\$} → {8.000\\$}$) and change occupation ($ \\textit{student} \\rightarrow \\textit{part-time employed}$).\n",
55
+ "However, a CE may also reveal that $f$ changes its prediction based on ethnicity or gender (all other features remaining the same), meaning that $f$ learned harmful biases (e.g., from historical data) that perpetuate a discrimination against minorities (unfairness).\n",
56
+ "\n",
57
+ "Here's a simplified depiction in a 2D feature space:\n",
58
+ "![](https://drive.google.com/uc?export=view&id=1eQTEExQhIgi-2sEoCcyMELfKXACTrxAW)\n",
59
+ "\n",
60
+ "\n",
61
+ "### Seeking *small* changes to $x$\n",
62
+ "\n",
63
+ "We seek *small* changes to $x$ to observe how $f$ behaves in the neighborhood of an input to gain information on what the decision boundary looks like in that area. \n",
64
+ "Moreover, a very interesting property of CEs is that they prescribe a possible intervention that the user may actually want to pursue!\n",
65
+ "Thus, we wish that the cost of intervention is small for the user. \n",
66
+ "This means that $\\mathbf{x}^\\prime$ needs to be as close as possible to $\\mathbf{x}$, under some meaningful distance function $\\delta$ that captures the cost of intervention.\n",
67
+ "\n",
68
+ "## Additional reading material\n",
69
+ "An excellent and beginner-friendly starting point is the book by Christoph Molnar: \"Inteprable ML Book\".\n",
70
+ "Here's a direct link to his chapter on CEs (co-written by Susanne Dandl): https://christophm.github.io/interpretable-ml-book/counterfactual.html\n",
71
+ "\n",
72
+ "## Note: CEs vs adversarial examples\n",
73
+ "CEs are similar to adversarial examples (AEs). In both cases, one searches for changes to the input $x$ that trigger a change to the prediction made by $f$. However, CEs are intended to explain $f$ and not to fool it!"
74
+ ]
75
+ },
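+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Before using a dedicated library, here is a minimal sketch of the idea behind CE search (an addition for illustration, not part of CoGS): among candidate inputs that the model assigns the desired class, pick the one closest to $\\mathbf{x}$ under a distance $\\delta$. The toy model `f_toy`, the candidate grid, and the distance function below are all assumptions made up for this example."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import numpy as np\n",
+ "\n",
+ "# Toy black box (made up for illustration): flags high risk when savings are low\n",
+ "def f_toy(z): # z = (savings, age)\n",
+ "    return 'low_risk' if z[0] >= 7000 else 'high_risk'\n",
+ "\n",
+ "x0 = np.array([5000.0, 22.0]) # the user we want to explain\n",
+ "ranges = np.array([20000.0, 60.0]) # feature ranges, used to normalize the distance\n",
+ "\n",
+ "def delta(a, b):\n",
+ "    # normalized L1 distance as a stand-in for the cost of intervention\n",
+ "    return np.sum(np.abs(a - b) / ranges)\n",
+ "\n",
+ "# Brute-force search over a small grid of candidates (age may only increase)\n",
+ "candidates = [np.array([s, a]) for s in range(0, 20001, 500) for a in range(22, 61)]\n",
+ "valid = [c for c in candidates if f_toy(c) == 'low_risk' and c[1] >= x0[1]]\n",
+ "x_prime = min(valid, key=lambda c: delta(c, x0))\n",
+ "print('counterfactual example:', x_prime)\n",
+ "print('explanation (x_prime - x0):', x_prime - x0)"
+ ]
+ },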
76
+ {
77
+ "cell_type": "markdown",
78
+ "metadata": {
79
+ "id": "IGvCL3iUogUv"
80
+ },
81
+ "source": [
82
+ "## Let's get started\n",
83
+ "In this notebook we simulate a financial credit risk situation, in which a black-box model (we will be using a random forest) has been trained to tell which users are at high or low risk of default (i.e., become unable of paying back the credit given by the bank).\n",
84
+ "We will then use a CE discovery algorithm to see how an user can change their (unfavorable) situation (i.e., f(x)=high risk)."
85
+ ]
86
+ },
87
+ {
88
+ "cell_type": "markdown",
89
+ "metadata": {
90
+ "id": "yGrHkZmHp9J2"
91
+ },
92
+ "source": [
93
+ "### Set up libraries & random seed"
94
+ ]
95
+ },
96
+ {
97
+ "cell_type": "code",
98
+ "execution_count": null,
99
+ "metadata": {
100
+ "id": "TnthnOL5p8-K"
101
+ },
102
+ "outputs": [],
103
+ "source": [
104
+ "import numpy as np\n",
105
+ "import pandas as pd\n",
106
+ "from sklearn.ensemble import RandomForestClassifier\n",
107
+ "from sklearn.model_selection import train_test_split\n",
108
+ "from sklearn.metrics import accuracy_score, balanced_accuracy_score\n",
109
+ "\n",
110
+ "SEED = 42\n",
111
+ "np.random.seed(SEED) # for reproducibility"
112
+ ]
113
+ },
114
+ {
115
+ "cell_type": "markdown",
116
+ "metadata": {
117
+ "id": "Y7QNj2ykp_lT"
118
+ },
119
+ "source": [
120
+ "### Load data\n",
121
+ "We load the data set \"South German Credit\", which concerns learning a model of whether providing a financial credit to a user may be safe or risky.\n",
122
+ "See https://archive.ics.uci.edu/ml/datasets/South+German+Credit+%28UPDATE%29 for more info.\n",
123
+ "\n",
124
+ "We get this data from the repo of CoGS, a baseline algorithm for the discovery of CEs (more details later). "
125
+ ]
126
+ },
127
+ {
128
+ "cell_type": "code",
129
+ "execution_count": null,
130
+ "metadata": {
131
+ "colab": {
132
+ "base_uri": "https://localhost:8080/"
133
+ },
134
+ "id": "7oqiuCzXtvFS",
135
+ "outputId": "8910b0a3-7ba3-4c19-dbf1-117d6e001750"
136
+ },
137
+ "outputs": [
138
+ {
139
+ "name": "stdout",
140
+ "output_type": "stream",
141
+ "text": [
142
+ "Cloning into 'cogs'...\n",
143
+ "remote: Enumerating objects: 37, done.\u001b[K\n",
144
+ "remote: Counting objects: 100% (37/37), done.\u001b[K\n",
145
+ "remote: Compressing objects: 100% (33/33), done.\u001b[K\n",
146
+ "remote: Total 37 (delta 12), reused 20 (delta 3), pack-reused 0\u001b[K\n",
147
+ "Unpacking objects: 100% (37/37), done.\n",
148
+ "/content/cogs\n",
149
+ "Processing /content/cogs\n",
150
+ "\u001b[33m DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.\n",
151
+ " pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.\u001b[0m\n",
152
+ "Requirement already satisfied: numpy>=1.16.1 in /usr/local/lib/python3.7/dist-packages (from cogs==1.0.0) (1.21.6)\n",
153
+ "Building wheels for collected packages: cogs\n",
154
+ " Building wheel for cogs (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
155
+ " Created wheel for cogs: filename=cogs-1.0.0-py3-none-any.whl size=23323 sha256=bfbcbe6a8d379e20ad7023d11586ef90bafcb3d1558be0c07e1b581cfe64165a\n",
156
+ " Stored in directory: /tmp/pip-ephem-wheel-cache-uojmy6do/wheels/7a/25/e6/fef7467ff3dd1da42831774b151adc77e68a0d3ca439f9a2a4\n",
157
+ "Successfully built cogs\n",
158
+ "Installing collected packages: cogs\n",
159
+ "Successfully installed cogs-1.0.0\n"
160
+ ]
161
+ }
162
+ ],
163
+ "source": [
164
+ "# clone repo, access, and install repo\n",
165
+ "! git clone https://github.com/marcovirgolin/cogs\n",
166
+ "% cd /content/cogs\n",
167
+ "! pip install ."
168
+ ]
169
+ },
170
+ {
171
+ "cell_type": "markdown",
172
+ "metadata": {
173
+ "id": "JR2I2y6ptwBJ"
174
+ },
175
+ "source": [
176
+ "Load the data and preprocess it a bit"
177
+ ]
178
+ },
179
+ {
180
+ "cell_type": "code",
181
+ "execution_count": null,
182
+ "metadata": {
183
+ "colab": {
184
+ "base_uri": "https://localhost:8080/"
185
+ },
186
+ "id": "-jkZaa82p2sV",
187
+ "outputId": "83e3ec70-fe4c-439a-86e8-1ad67008c3c3"
188
+ },
189
+ "outputs": [
190
+ {
191
+ "name": "stdout",
192
+ "output_type": "stream",
193
+ "text": [
194
+ "Num. features: 19, feature names: ['duration_in_month', 'credit_history', 'purpose', 'credit_amount', 'savings', 'present_emp_since', 'installment_as_income_perc', 'personal_status_sex', 'other_debtors', 'present_res_since', 'property', 'age', 'other_installment_plans', 'housing', 'credits_this_bank', 'job', 'people_under_maintenance', 'telephone', 'foreign_worker']\n"
195
+ ]
196
+ }
197
+ ],
198
+ "source": [
199
+ "# Load data set & do some pre-processing\n",
200
+ "df = pd.read_csv(\"south_german_credit.csv\")\n",
201
+ "df.drop(\"account_check_status\",axis=1,inplace=True)\n",
202
+ "categorical_feature_names = ['purpose', 'personal_status_sex',\n",
203
+ " 'other_debtors', 'other_installment_plans', 'telephone', 'foreign_worker']\n",
204
+ "# Note: some other features are indices (categories in which the order matters), treated as numerical here for simplicity\n",
205
+ "label_name = 'credit_risk'\n",
206
+ "desired_class = 1 # this means \"low risk\"\n",
207
+ "\n",
208
+ "for feat in categorical_feature_names: # convert categorical features into integer codes\n",
209
+ " df[feat] = pd.Categorical(df[feat])\n",
210
+ " df[feat] = df[feat].cat.codes \n",
211
+ "feature_names = list(df.columns)\n",
212
+ "feature_names.remove(label_name)\n",
213
+ "\n",
214
+ "print(\"Num. features: {}, feature names: {}\".format(len(feature_names), feature_names))\n",
215
+ "\n",
216
+ "# Prepare data to be in numpy format, as typically used to train a scikit-learn model\n",
217
+ "X = df[feature_names].to_numpy()\n",
218
+ "y = df[label_name].to_numpy().astype(int)\n",
219
+ "# Assume we have a specific train & test split\n",
220
+ "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=SEED)"
221
+ ]
222
+ },
223
+ {
224
+ "cell_type": "markdown",
225
+ "metadata": {
226
+ "id": "SWXjBOl1qSGe"
227
+ },
228
+ "source": [
229
+ "### Train the model\n",
230
+ "Here we train the model, but in a practical situation we may assume that the model has already been trained (and is, e.g., property of the bank that assesses to whether to award the credit or not).\n",
231
+ "\n",
232
+ "We use random forest because it is quick and easy. However, you can use any model you like, such as a deep neural net. \n",
233
+ "As classicly done in XAI litereature, we call this model a *black-box model*. "
234
+ ]
235
+ },
236
+ {
237
+ "cell_type": "code",
238
+ "execution_count": null,
239
+ "metadata": {
240
+ "colab": {
241
+ "base_uri": "https://localhost:8080/"
242
+ },
243
+ "id": "h88N_NNhqg8e",
244
+ "outputId": "f64f2eb7-72a0-401e-ddd0-31424aaaa233"
245
+ },
246
+ "outputs": [
247
+ {
248
+ "data": {
249
+ "text/plain": [
250
+ "RandomForestClassifier(class_weight='balanced', min_samples_leaf=25,\n",
251
+ " random_state=42)"
252
+ ]
253
+ },
254
+ "execution_count": 4,
255
+ "metadata": {},
256
+ "output_type": "execute_result"
257
+ }
258
+ ],
259
+ "source": [
260
+ "# Train black-box model (bbm)\n",
261
+ "bbm = RandomForestClassifier(random_state=SEED, class_weight=\"balanced\", min_samples_leaf=25)\n",
262
+ "bbm.fit(X_train, y_train)\n",
263
+ "# note: we do not one-hot encode multi-category features here for simplicity"
264
+ ]
265
+ },
266
+ {
267
+ "cell_type": "markdown",
268
+ "metadata": {
269
+ "id": "ye5fAPULv5uR"
270
+ },
271
+ "source": [
272
+ "Let's check that the model has a decent accuracy \n",
273
+ "(Note: not really needed for the purpose of CEs)"
274
+ ]
275
+ },
276
+ {
277
+ "cell_type": "code",
278
+ "execution_count": null,
279
+ "metadata": {
280
+ "colab": {
281
+ "base_uri": "https://localhost:8080/"
282
+ },
283
+ "id": "kCtSTtm-v8OK",
284
+ "outputId": "3b0e73e9-80f4-4b29-fd5d-9a6db3d26185"
285
+ },
286
+ "outputs": [
287
+ {
288
+ "name": "stdout",
289
+ "output_type": "stream",
290
+ "text": [
291
+ "acc:0.760, bal.-acc:0.691\n"
292
+ ]
293
+ }
294
+ ],
295
+ "source": [
296
+ "print(\"acc:{:.3f}, bal.-acc:{:.3f}\".format(accuracy_score(y_test, bbm.predict(X_test)), balanced_accuracy_score(y_test, bbm.predict(X_test))))"
297
+ ]
298
+ },
299
+ {
300
+ "cell_type": "markdown",
301
+ "metadata": {
302
+ "id": "nKS172JXuNAC"
303
+ },
304
+ "source": [
305
+ "### Pick the user\n",
306
+ "Next, we simulate to have a user for whom the decision of the black-box model is the undesired one. \n",
307
+ "For example, let's pick the last point in the test set for which the prediction is unfavourable."
308
+ ]
309
+ },
310
+ {
311
+ "cell_type": "code",
312
+ "execution_count": null,
313
+ "metadata": {
314
+ "colab": {
315
+ "base_uri": "https://localhost:8080/"
316
+ },
317
+ "id": "3k7vjJZXuNAD",
318
+ "outputId": "680b07f2-41ab-4492-ed3d-ec66286d703f"
319
+ },
320
+ "outputs": [
321
+ {
322
+ "name": "stdout",
323
+ "output_type": "stream",
324
+ "text": [
325
+ "Description of x:\n",
326
+ " duration_in_month 48\n",
327
+ " credit_history 0\n",
328
+ " purpose 8\n",
329
+ " credit_amount 3844\n",
330
+ " savings 2\n",
331
+ " present_emp_since 4\n",
332
+ " installment_as_income_perc 4\n",
333
+ " personal_status_sex 2\n",
334
+ " other_debtors 0\n",
335
+ " present_res_since 4\n",
336
+ " property 4\n",
337
+ " age 34\n",
338
+ " other_installment_plans 2\n",
339
+ " housing 3\n",
340
+ " credits_this_bank 1\n",
341
+ " job 2\n",
342
+ " people_under_maintenance 1\n",
343
+ " telephone 0\n",
344
+ " foreign_worker 1\n"
345
+ ]
346
+ }
347
+ ],
348
+ "source": [
349
+ "# Let's consider, e.g., the last test sample for which an undesired decision is given\n",
350
+ "p = bbm.predict(X_test)\n",
351
+ "idx = np.argwhere(p != desired_class).squeeze()[-1]\n",
352
+ "x = X_test[idx] # this is our unhappy user!\n",
353
+ "\n",
354
+ "# show features of this user\n",
355
+ "print(\"Description of x:\")\n",
356
+ "for i, feat_name in enumerate(feature_names):\n",
357
+ " print(\" \", feat_name+\" \"*(30-len(feat_name)), x[i])"
358
+ ]
359
+ },
360
+ {
361
+ "cell_type": "markdown",
362
+ "metadata": {
363
+ "id": "cinV3YS8uNAD"
364
+ },
365
+ "source": [
366
+ "### CE discovery algorithm\n",
367
+ "We use the library CoGS to find a CE.\n",
368
+ "CoGS (Counterfactual Genetic Search) is a relatively quick to run and easy to use library that makes no assumptions on the black-box model $f$ (e.g., it does not require linearity nor gradients to work). \n",
369
+ "Moreover, CoGS can handle both numerical and categorical features.\n"
370
+ ]
371
+ },
372
+ {
373
+ "cell_type": "markdown",
374
+ "metadata": {
375
+ "id": "0GtnJ2fhryAa"
376
+ },
377
+ "source": [
378
+ "### Setting up the search space\n",
379
+ "To set up the space in which CoGS searches, we must provide:\n",
380
+ "1) Intervals within which the search takes place (for categorical features, which categories are possible)\n",
381
+ "2) The indices of categorical features (for CoGS to know which are categorical and which are numerical)\n",
382
+ "3) Optional plausibility constraints to ensure that the discovered CE can be realized (e.g., the age of a person cannot decrease)\n",
383
+ "\n",
384
+ "All of these three must be provided as lists that have the same order, in particular the order used to list the feature in `X_train` and `X_test`."
385
+ ]
386
+ },
387
+ {
388
+ "cell_type": "code",
389
+ "execution_count": null,
390
+ "metadata": {
391
+ "id": "HqwFrDnYuNAE"
392
+ },
393
+ "outputs": [],
394
+ "source": [
395
+ "# Set up search bounds\n",
396
+ "feature_intervals = list()\n",
397
+ "for i, feat in enumerate(feature_names):\n",
398
+ " if feat in categorical_feature_names:\n",
399
+ " interval_i = np.unique(X_train[:,i])\n",
400
+ " else:\n",
401
+ " interval_i = (np.min(X_train[:,i]), np.max(X_train[:,i]))\n",
402
+ " feature_intervals.append(interval_i)\n",
403
+ "\n",
404
+ "# Set up which feature indices are categorical\n",
405
+ "indices_categorical_features = [i for i, feat in enumerate(feature_names) if feat in categorical_feature_names]\n",
406
+ "\n",
407
+ "# Let's also set up a plausibility constraint for the feature \"age\" (can only increase)\n",
408
+ "# and one for foreign worker (cannot change, must stay equal to what it is)\n",
409
+ "pcs = ['>=' if feat=='age' else ('=' if feat=='foreign_worker' else None) for feat in feature_names]"
410
+ ]
411
+ },
412
+ {
413
+ "cell_type": "markdown",
414
+ "metadata": {
415
+ "id": "0yztFC6buNAE"
416
+ },
417
+ "source": [
418
+ "## Hyper parameters\n",
419
+ "We can now setup the hyper-parameters of CoGS, and then run the search!\n",
420
+ "We put some comments to explain what they mean in the code below.\n",
421
+ "\n",
422
+ "As distance $\\delta$, here we use Gower's distance, which takes into account both numerical differences and categorical mismatches (see https://christophm.github.io/interpretable-ml-book/counterfactual.html#method-by-dandl-et-al.).\n",
423
+ "In a genetic algorithm, the quality of solutions is measured in terms of *fitness*, where normally higher is better.\n",
424
+ "Thus the fitness used here is set to be the opposite of Gower's distance."
425
+ ]
426
+ },
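+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As an aside (an addition for illustration), here is a small sketch of Gower's distance under its textbook definition: numerical features contribute a range-normalized absolute difference, categorical features a 0/1 mismatch, and the result is averaged over features. CoGS's `gower_fitness_function` may differ in details, so treat this as a sketch rather than the library's implementation."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def gower_distance(a, b, feature_intervals, indices_categorical_features):\n",
+ "    # Range-normalized |a-b| for numerical features, 0/1 mismatch for categorical ones\n",
+ "    total = 0.0\n",
+ "    for i, interval in enumerate(feature_intervals):\n",
+ "        if i in indices_categorical_features:\n",
+ "            total += float(a[i] != b[i])\n",
+ "        else:\n",
+ "            lo, hi = interval\n",
+ "            total += abs(a[i] - b[i]) / max(hi - lo, 1e-12) # guard against zero range\n",
+ "    return total / len(feature_intervals)\n",
+ "\n",
+ "# e.g., distance between our unhappy user and the first training sample\n",
+ "print(gower_distance(x, X_train[0], feature_intervals, indices_categorical_features))"
+ ]
+ },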
427
+ {
428
+ "cell_type": "code",
429
+ "execution_count": null,
430
+ "metadata": {
431
+ "id": "I5TqNa9NuNAE"
432
+ },
433
+ "outputs": [],
434
+ "source": [
435
+ "from cogs.evolution import Evolution\n",
436
+ "from cogs.fitness import gower_fitness_function\n",
437
+ "\n",
438
+ "cogs = Evolution(\n",
439
+ " ### hyper-parameters of the problem (required!) ###\n",
440
+ " x=x, # the starting point aka unhappy user\n",
441
+ " fitness_function=gower_fitness_function, # a classic fitness function for counterfactual explanations\n",
442
+ " fitness_function_kwargs={'blackbox':bbm,'desired_class': desired_class}, # these must be passed for the fitness function to work\n",
443
+ " feature_intervals=feature_intervals, # intervals within which the search operates\n",
444
+ " indices_categorical_features=indices_categorical_features, # the indices of the features that are categorical\n",
445
+ " plausibility_constraints=pcs, # can be \"None\" if no constraints need to be set\n",
446
+ " ### hyper-parameters of the evolution (all optional) ###\n",
447
+ " evolution_type='classic', # the type of evolution, classic works quite well\n",
448
+ " population_size=1000, # how many candidate counterfactual examples to evolve simultaneously\n",
449
+ " n_generations=25, # number of iterations for the evolution\n",
450
+ " selection_name='tournament_4', # selection pressure\n",
451
+ " init_temperature=0.8, # how \"far\" from x we initialize\n",
452
+ " num_features_mutation_strength=0.25, # strength of random mutations for numerical features\n",
453
+ " num_features_mutation_strength_decay=0.5, # decay for the hyper-param. above\n",
454
+ " num_features_mutation_strength_decay_generations=[10,15,20], # when to apply the decay\n",
455
+ " ### other settings ###\n",
456
+ " verbose=True # logs progress at every generation \n",
457
+ ")"
458
+ ]
459
+ },
460
+ {
461
+ "cell_type": "markdown",
462
+ "metadata": {
463
+ "id": "PfjKfkb9uNAF"
464
+ },
465
+ "source": [
466
+ "Ready to run!"
467
+ ]
468
+ },
469
+ {
470
+ "cell_type": "code",
471
+ "execution_count": null,
472
+ "metadata": {
473
+ "colab": {
474
+ "base_uri": "https://localhost:8080/"
475
+ },
476
+ "id": "Af-wLnsPuNAG",
477
+ "outputId": "fe1c51c0-bee8-4408-ebe9-c1317128f282"
478
+ },
479
+ "outputs": [
480
+ {
481
+ "name": "stdout",
482
+ "output_type": "stream",
483
+ "text": [
484
+ "generation: 1 best fitness: -0.23960241235590057 avg. fitness: -0.5655859030567522\n",
485
+ "generation: 2 best fitness: -0.23960241235590057 avg. fitness: -0.45144422711053095\n",
486
+ "generation: 3 best fitness: -0.23960241235590057 avg. fitness: -0.3932625759375893\n",
487
+ "generation: 4 best fitness: -0.18999766508484883 avg. fitness: -0.3424200187317699\n",
488
+ "generation: 5 best fitness: -0.18999766508484883 avg. fitness: -0.29326014109826054\n",
489
+ "generation: 6 best fitness: -0.10306974074439322 avg. fitness: -0.24382975966224887\n",
490
+ "generation: 7 best fitness: -0.06996051106514789 avg. fitness: -0.1966318789656929\n",
491
+ "generation: 8 best fitness: -0.059808815825190084 avg. fitness: -0.15648801037922425\n",
492
+ "generation: 9 best fitness: -0.03346219281443785 avg. fitness: -0.12237121200700284\n",
493
+ "generation: 10 best fitness: -0.029642499013702472 avg. fitness: -0.09555415482021212\n",
494
+ "generation: 11 best fitness: -0.029642499013702472 avg. fitness: -0.0736235690606997\n",
495
+ "generation: 12 best fitness: -0.029642499013702472 avg. fitness: -0.05854284736818094\n",
496
+ "generation: 13 best fitness: -0.029642499013702472 avg. fitness: -0.046547369048175184\n",
497
+ "generation: 14 best fitness: -0.029642499013702472 avg. fitness: -0.03997231228968589\n",
498
+ "generation: 15 best fitness: -0.029642499013702472 avg. fitness: -0.035952275049771346\n",
499
+ "generation: 16 best fitness: -0.029637374050308057 avg. fitness: -0.03372073074558145\n",
500
+ "generation: 17 best fitness: -0.029637374050308057 avg. fitness: -0.03255322048135062\n",
501
+ "generation: 18 best fitness: -0.029616365796527388 avg. fitness: -0.03200569502030889\n",
502
+ "generation: 19 best fitness: -0.02961122525464162 avg. fitness: -0.03211885944499849\n",
503
+ "generation: 20 best fitness: -0.02961122525464162 avg. fitness: -0.03209647434482432\n",
504
+ "generation: 21 best fitness: -0.02961122525464162 avg. fitness: -0.031716957592190904\n",
505
+ "generation: 22 best fitness: -0.02961122525464162 avg. fitness: -0.03144263058972952\n",
506
+ "generation: 23 best fitness: -0.02961122525464162 avg. fitness: -0.03189248282843857\n",
507
+ "generation: 24 best fitness: -0.029607941498829213 avg. fitness: -0.031491601489145066\n",
508
+ "generation: 25 best fitness: -0.029607941498829213 avg. fitness: -0.03188161227227084\n"
509
+ ]
510
+ }
511
+ ],
512
+ "source": [
513
+ "cogs.run()"
514
+ ]
515
+ },
516
+ {
517
+ "cell_type": "markdown",
518
+ "metadata": {
519
+ "id": "o89rjGA8uNAG"
520
+ },
521
+ "source": [
522
+ "## Counterfactual explanation\n",
523
+ "Now that CoGS has terminated, we can look at its result.\n",
524
+ "The field `cogs.elite` contains the best-found counterfactual example, i.e., a point `x'` for which `bbm(x')=desired_class`.\n",
525
+ "The respective counterfactual explanation is simply `x'-x` (there exist more involved definitions of counterfactual explanations, here we use this simple one).\n",
526
+ "Let's take a look at what the user needs to do to obtain the desired class, i.e., be granted the loan."
527
+ ]
528
+ },
529
+ {
530
+ "cell_type": "code",
531
+ "execution_count": null,
532
+ "metadata": {
533
+ "colab": {
534
+ "base_uri": "https://localhost:8080/"
535
+ },
536
+ "id": "AqOxGWuAuNAG",
537
+ "outputId": "1f733e26-7d77-4d3a-e799-821f32ee0fea"
538
+ },
539
+ "outputs": [
540
+ {
541
+ "name": "stdout",
542
+ "output_type": "stream",
543
+ "text": [
544
+ "Success! Here's the explanation:\n",
545
+ " Feature 'savings' should change from '2' to '2.5'\n"
546
+ ]
547
+ }
548
+ ],
549
+ "source": [
550
+ "from pandas.core.arrays import categorical\n",
551
+ "# Get the best-found counterfactual example (called elite)\n",
552
+ "cf_example = cogs.elite\n",
553
+ "cf_explanation = cogs.elite - x\n",
554
+ "\n",
555
+ "# Show counterfactual explanation\n",
556
+ "if bbm.predict([cf_example])[0] == desired_class:\n",
557
+ " print(\"Success! Here's the explanation:\")\n",
558
+ " for i, feat in enumerate(feature_names):\n",
559
+ " if cf_explanation[i] != 0:\n",
560
+ " print(\" Feature '{}' should change from '{}' to '{}'\".format(feat, np.round(x[i],3), np.round(cf_example[i],3)))\n",
561
+ "else:\n",
562
+ " print(\"Failed to find a counterfactual explanation for the desired class :(\")"
563
+ ]
564
+ },
565
+ {
566
+ "cell_type": "markdown",
567
+ "metadata": {
568
+ "id": "Nc8OEQUMy15q"
569
+ },
570
+ "source": [
571
+ "# Exercise idea\n",
572
+ "Here's an idea for an exercise.\n",
573
+ "One of the features is called `foreign_worker`. This may be considered a sensitive feature: should $f$ be allowed to discriminate based only on that?\n",
574
+ "\n",
575
+ "Try to use CoGS to search whether a CE can be found for (one or more) `x` who is a foreign worker and for whom `bbm` says `high risk`, that recommends not to be a foreign worker.\n",
576
+ "To do that, you can set the plausibility constraints to \"`=`\" for all features except for `foreign_worker`."
577
+ ]
578
+ },
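+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If you want a head start, the sketch below (an addition, not a full solution) shows one way to freeze all features except `foreign_worker` and to pick a suitable user; it assumes that category code 1 encodes a foreign worker, so verify the codes on your data. Running the actual search is left to you."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Freeze every feature except 'foreign_worker' via plausibility constraints\n",
+ "pcs_exercise = ['=' if feat != 'foreign_worker' else None for feat in feature_names]\n",
+ "\n",
+ "# Pick a test user who is a foreign worker (assuming code 1) with the undesired prediction\n",
+ "fw_idx = feature_names.index('foreign_worker')\n",
+ "mask = (bbm.predict(X_test) != desired_class) & (X_test[:, fw_idx] == 1)\n",
+ "x_fw = X_test[np.where(mask)[0][-1]]\n",
+ "# Next: build a new Evolution(...) with x=x_fw and plausibility_constraints=pcs_exercise, then run it"
+ ]
+ },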
579
+ {
580
+ "cell_type": "code",
581
+ "execution_count": null,
582
+ "metadata": {
583
+ "id": "E-I15CXc0A2t"
584
+ },
585
+ "outputs": [],
586
+ "source": []
587
+ }
588
+ ],
589
+ "metadata": {
590
+ "colab": {
591
+ "collapsed_sections": [],
592
+ "name": "counterfactual explanations.ipynb",
593
+ "provenance": []
594
+ },
595
+ "interpreter": {
596
+ "hash": "82b2f7e49a54dfc9e19a85f649bd0ef29fcdbc801e6c42932c693ea93cc5c6ab"
597
+ },
598
+ "kernelspec": {
599
+ "display_name": "Python 3 (ipykernel)",
600
+ "language": "python",
601
+ "name": "python3"
602
+ },
603
+ "language_info": {
604
+ "codemirror_mode": {
605
+ "name": "ipython",
606
+ "version": 3
607
+ },
608
+ "file_extension": ".py",
609
+ "mimetype": "text/x-python",
610
+ "name": "python",
611
+ "nbconvert_exporter": "python",
612
+ "pygments_lexer": "ipython3",
613
+ "version": "3.9.12"
614
+ }
615
+ },
616
+ "nbformat": 4,
617
+ "nbformat_minor": 1
618
+ }
20_Named_Emtity_Recognition_Transformers.ipynb ADDED
The diff for this file is too large to render. See raw diff