ahmedabdelwahed committed
Commit 6b03211 (1 parent: f0eb78a)
README.md CHANGED
@@ -17,14 +17,14 @@ should probably proofread and complete it, then remove this comment. -->
  This model is a fine-tuned version of [ahmedabdelwahed/Mojiz-sft](https://huggingface.co/ahmedabdelwahed/Mojiz-sft) on the None dataset.
  It achieves the following results on the evaluation set:
  - Loss: 0.0000
- - Rewards/chosen: 17.3685
- - Rewards/rejected: -9.0879
+ - Rewards/chosen: 20.7508
+ - Rewards/rejected: -10.7382
  - Rewards/accuracies: 1.0
- - Rewards/margins: 26.4564
- - Logps/rejected: -89.2152
- - Logps/chosen: -291.5760
- - Logits/rejected: -11.3625
- - Logits/chosen: -12.3502
+ - Rewards/margins: 31.4890
+ - Logps/rejected: -92.5158
+ - Logps/chosen: -284.8114
+ - Logits/rejected: -11.6194
+ - Logits/chosen: -12.6924
 
  ## Model description
 
@@ -50,32 +50,92 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_steps: 150
- - training_steps: 2000
+ - training_steps: 8000
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
  |:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
  | 0.0017 | 0.41 | 100 | 0.0000 | 9.9359 | -3.7597 | 1.0 | 13.6956 | -78.5589 | -306.4413 | -11.4127 | -12.4541 |
- | 0.0002 | 0.82 | 200 | 0.0000 | 14.2220 | -5.8111 | 1.0 | 20.0331 | -82.6616 | -297.8690 | -11.3000 | -12.2683 |
- | 0.0036 | 1.22 | 300 | 0.0000 | 14.9129 | -6.8369 | 1.0 | 21.7499 | -84.7133 | -296.4872 | -11.2647 | -12.2239 |
- | 0.0 | 1.63 | 400 | 0.0000 | 15.5788 | -7.9152 | 1.0 | 23.4939 | -86.8698 | -295.1555 | -11.2175 | -12.1437 |
- | 0.0 | 2.04 | 500 | 0.0000 | 15.9537 | -8.1483 | 1.0 | 24.1020 | -87.3360 | -294.4057 | -11.2256 | -12.1568 |
- | 0.0 | 2.45 | 600 | 0.0000 | 16.3724 | -8.1224 | 1.0 | 24.4948 | -87.2842 | -293.5682 | -11.2611 | -12.2092 |
- | 0.0 | 2.86 | 700 | 0.0000 | 16.2464 | -8.7026 | 1.0 | 24.9490 | -88.4446 | -293.8203 | -11.2351 | -12.1506 |
- | 0.0 | 3.27 | 800 | 0.0000 | 17.0144 | -7.9818 | 1.0 | 24.9962 | -87.0030 | -292.2843 | -11.3359 | -12.3118 |
- | 0.0 | 3.67 | 900 | 0.0000 | 17.0212 | -8.0028 | 1.0 | 25.0241 | -87.0451 | -292.2705 | -11.3356 | -12.3114 |
- | 0.0 | 4.08 | 1000 | 0.0000 | 17.1508 | -8.0673 | 1.0 | 25.2182 | -87.1741 | -292.0114 | -11.3467 | -12.3293 |
- | 0.0 | 4.49 | 1100 | 0.0000 | 17.1035 | -8.6586 | 1.0 | 25.7621 | -88.3567 | -292.1060 | -11.3427 | -12.3196 |
- | 0.0 | 4.9 | 1200 | 0.0000 | 17.1191 | -8.6910 | 1.0 | 25.8101 | -88.4214 | -292.0748 | -11.3428 | -12.3192 |
- | 0.0 | 5.31 | 1300 | 0.0000 | 17.2084 | -8.9388 | 1.0 | 26.1472 | -88.9170 | -291.8962 | -11.3530 | -12.3370 |
- | 0.0 | 5.71 | 1400 | 0.0000 | 17.2160 | -9.1194 | 1.0 | 26.3354 | -89.2783 | -291.8811 | -11.3506 | -12.3312 |
- | 0.0 | 6.12 | 1500 | 0.0000 | 17.2326 | -9.1376 | 1.0 | 26.3702 | -89.3146 | -291.8478 | -11.3494 | -12.3291 |
- | 0.0002 | 6.53 | 1600 | 0.0000 | 17.2830 | -9.1192 | 1.0 | 26.4022 | -89.2778 | -291.7470 | -11.3555 | -12.3384 |
- | 0.0 | 6.94 | 1700 | 0.0000 | 17.3062 | -9.1044 | 1.0 | 26.4107 | -89.2482 | -291.7006 | -11.3582 | -12.3429 |
- | 0.0 | 7.35 | 1800 | 0.0000 | 17.3229 | -9.1143 | 1.0 | 26.4372 | -89.2681 | -291.6673 | -11.3586 | -12.3435 |
- | 0.0 | 7.76 | 1900 | 0.0000 | 17.3411 | -9.1107 | 1.0 | 26.4518 | -89.2609 | -291.6309 | -11.3600 | -12.3457 |
- | 0.0 | 8.16 | 2000 | 0.0000 | 17.3685 | -9.0879 | 1.0 | 26.4564 | -89.2152 | -291.5760 | -11.3625 | -12.3502 |
+ | 0.0002 | 0.82 | 200 | 0.0000 | 14.2382 | -5.8180 | 1.0 | 20.0562 | -82.6754 | -297.8366 | -11.3000 | -12.2684 |
+ | 0.0035 | 1.22 | 300 | 0.0000 | 14.9451 | -6.8831 | 1.0 | 21.8282 | -84.8057 | -296.4229 | -11.2631 | -12.2221 |
+ | 0.0 | 1.63 | 400 | 0.0000 | 15.6239 | -8.0940 | 1.0 | 23.7178 | -87.2274 | -295.0653 | -11.2114 | -12.1338 |
+ | 0.0 | 2.04 | 500 | 0.0000 | 15.9950 | -8.3192 | 1.0 | 24.3142 | -87.6779 | -294.3232 | -11.2217 | -12.1499 |
+ | 0.0 | 2.45 | 600 | 0.0000 | 16.4967 | -8.2808 | 1.0 | 24.7775 | -87.6010 | -293.3195 | -11.2633 | -12.2118 |
+ | 0.0 | 2.86 | 700 | 0.0000 | 16.2905 | -9.0144 | 1.0 | 25.3049 | -89.0682 | -293.7320 | -11.2314 | -12.1373 |
+ | 0.0 | 3.27 | 800 | 0.0000 | 17.3895 | -7.9312 | 1.0 | 25.3208 | -86.9019 | -291.5340 | -11.3726 | -12.3633 |
+ | 0.0 | 3.67 | 900 | 0.0000 | 17.3977 | -7.9560 | 1.0 | 25.3537 | -86.9514 | -291.5177 | -11.3723 | -12.3628 |
+ | 0.0 | 4.08 | 1000 | 0.0000 | 17.4673 | -8.1543 | 1.0 | 25.6216 | -87.3481 | -291.3784 | -11.3750 | -12.3654 |
+ | 0.0 | 4.49 | 1100 | 0.0000 | 17.3363 | -8.9657 | 1.0 | 26.3020 | -88.9709 | -291.6405 | -11.3670 | -12.3470 |
+ | 0.0 | 4.9 | 1200 | 0.0000 | 17.3540 | -9.0028 | 1.0 | 26.3568 | -89.0451 | -291.6051 | -11.3671 | -12.3466 |
+ | 0.0 | 5.31 | 1300 | 0.0000 | 17.4850 | -9.3043 | 1.0 | 26.7893 | -89.6480 | -291.3430 | -11.3838 | -12.3759 |
+ | 0.0 | 5.71 | 1400 | 0.0000 | 17.6089 | -9.3554 | 1.0 | 26.9643 | -89.7502 | -291.0953 | -11.3893 | -12.3826 |
+ | 0.0 | 6.12 | 1500 | 0.0000 | 17.6418 | -9.3848 | 1.0 | 27.0266 | -89.8090 | -291.0294 | -11.3872 | -12.3788 |
+ | 0.0001 | 6.53 | 1600 | 0.0000 | 17.7200 | -9.3570 | 1.0 | 27.0770 | -89.7534 | -290.8731 | -11.3975 | -12.3941 |
+ | 0.0 | 6.94 | 1700 | 0.0000 | 17.7617 | -9.3377 | 1.0 | 27.0994 | -89.7148 | -290.7896 | -11.4020 | -12.4017 |
+ | 0.0 | 7.35 | 1800 | 0.0000 | 17.8247 | -9.3772 | 1.0 | 27.2019 | -89.7938 | -290.6637 | -11.4033 | -12.4039 |
+ | 0.0 | 7.76 | 1900 | 0.0000 | 17.8638 | -9.3928 | 1.0 | 27.2566 | -89.8251 | -290.5855 | -11.4046 | -12.4052 |
+ | 0.0 | 8.16 | 2000 | 0.0000 | 18.1144 | -9.2188 | 1.0 | 27.3332 | -89.4771 | -290.0843 | -11.4242 | -12.4400 |
+ | 0.0 | 8.57 | 2100 | 0.0000 | 18.1229 | -9.2243 | 1.0 | 27.3472 | -89.4881 | -290.0672 | -11.4242 | -12.4401 |
+ | 0.0 | 8.98 | 2200 | 0.0000 | 18.1432 | -9.2739 | 1.0 | 27.4171 | -89.5872 | -290.0266 | -11.4281 | -12.4420 |
+ | 0.0 | 9.39 | 2300 | 0.0000 | 18.2729 | -9.3131 | 1.0 | 27.5860 | -89.6657 | -289.7673 | -11.4278 | -12.4441 |
+ | 0.0 | 9.8 | 2400 | 0.0000 | 18.2914 | -9.3532 | 1.0 | 27.6446 | -89.7459 | -289.7303 | -11.4279 | -12.4436 |
+ | 0.0 | 10.2 | 2500 | 0.0000 | 18.3550 | -9.3675 | 1.0 | 27.7225 | -89.7745 | -289.6031 | -11.4324 | -12.4488 |
+ | 0.0 | 10.61 | 2600 | 0.0000 | 18.5092 | -9.4395 | 1.0 | 27.9487 | -89.9185 | -289.2947 | -11.4477 | -12.4716 |
+ | 0.0 | 11.02 | 2700 | 0.0000 | 18.5278 | -9.4387 | 1.0 | 27.9666 | -89.9169 | -289.2574 | -11.4484 | -12.4728 |
+ | 0.0 | 11.43 | 2800 | 0.0000 | 18.9266 | -9.3672 | 1.0 | 28.2938 | -89.7738 | -288.4599 | -11.4894 | -12.5273 |
+ | 0.0 | 11.84 | 2900 | 0.0000 | 18.9978 | -9.4237 | 1.0 | 28.4215 | -89.8868 | -288.3174 | -11.5000 | -12.5400 |
+ | 0.0 | 12.24 | 3000 | 0.0000 | 19.0186 | -9.4479 | 1.0 | 28.4665 | -89.9352 | -288.2759 | -11.4983 | -12.5375 |
+ | 0.0 | 12.65 | 3100 | 0.0000 | 19.0213 | -9.4485 | 1.0 | 28.4698 | -89.9365 | -288.2705 | -11.4994 | -12.5392 |
+ | 0.0 | 13.06 | 3200 | 0.0000 | 19.0656 | -9.5104 | 1.0 | 28.5759 | -90.0602 | -288.1819 | -11.4988 | -12.5380 |
+ | 0.0 | 13.47 | 3300 | 0.0000 | 19.0811 | -9.5638 | 1.0 | 28.6449 | -90.1670 | -288.1508 | -11.4994 | -12.5412 |
+ | 0.0 | 13.88 | 3400 | 0.0000 | 19.0755 | -9.6303 | 1.0 | 28.7058 | -90.3000 | -288.1620 | -11.4984 | -12.5391 |
+ | 0.0 | 14.29 | 3500 | 0.0000 | 19.0764 | -9.6361 | 1.0 | 28.7124 | -90.3116 | -288.1603 | -11.4984 | -12.5390 |
+ | 0.0 | 14.69 | 3600 | 0.0000 | 19.7645 | -9.6207 | 1.0 | 29.3852 | -90.2808 | -286.7841 | -11.5674 | -12.6283 |
+ | 0.0 | 15.1 | 3700 | 0.0000 | 19.7594 | -9.7019 | 1.0 | 29.4613 | -90.4432 | -286.7942 | -11.5659 | -12.6252 |
+ | 0.0 | 15.51 | 3800 | 0.0000 | 19.8213 | -9.7241 | 1.0 | 29.5454 | -90.4877 | -286.6704 | -11.5693 | -12.6319 |
+ | 0.0 | 15.92 | 3900 | 0.0000 | 19.8591 | -9.7267 | 1.0 | 29.5857 | -90.4928 | -286.5949 | -11.5754 | -12.6423 |
+ | 0.0 | 16.33 | 4000 | 0.0000 | 20.1637 | -10.0565 | 1.0 | 30.2202 | -91.1524 | -285.9856 | -11.6035 | -12.6809 |
+ | 0.0 | 16.73 | 4100 | 0.0000 | 20.1671 | -10.0572 | 1.0 | 30.2244 | -91.1539 | -285.9789 | -11.6039 | -12.6816 |
+ | 0.0 | 17.14 | 4200 | 0.0000 | 20.1791 | -10.1186 | 1.0 | 30.2977 | -91.2767 | -285.9549 | -11.6032 | -12.6803 |
+ | 0.0 | 17.55 | 4300 | 0.0000 | 20.1786 | -10.1726 | 1.0 | 30.3512 | -91.3847 | -285.9559 | -11.6026 | -12.6788 |
+ | 0.0 | 17.96 | 4400 | 0.0000 | 20.1663 | -10.2017 | 1.0 | 30.3680 | -91.4428 | -285.9804 | -11.6022 | -12.6778 |
+ | 0.0 | 18.37 | 4500 | 0.0000 | 20.1651 | -10.2076 | 1.0 | 30.3727 | -91.4546 | -285.9829 | -11.6021 | -12.6777 |
+ | 0.0 | 18.78 | 4600 | 0.0000 | 20.1509 | -10.2578 | 1.0 | 30.4087 | -91.5550 | -286.0112 | -11.6017 | -12.6762 |
+ | 0.0 | 19.18 | 4700 | 0.0000 | 20.1784 | -10.2457 | 1.0 | 30.4241 | -91.5308 | -285.9563 | -11.6037 | -12.6793 |
+ | 0.0 | 19.59 | 4800 | 0.0000 | 20.1812 | -10.2503 | 1.0 | 30.4315 | -91.5400 | -285.9507 | -11.6040 | -12.6798 |
+ | 0.0 | 20.0 | 4900 | 0.0000 | 20.1823 | -10.2604 | 1.0 | 30.4428 | -91.5603 | -285.9484 | -11.6041 | -12.6798 |
+ | 0.0 | 20.41 | 5000 | 0.0000 | 20.1883 | -10.2616 | 1.0 | 30.4499 | -91.5626 | -285.9364 | -11.6051 | -12.6818 |
+ | 0.0 | 20.82 | 5100 | 0.0000 | 20.1896 | -10.2675 | 1.0 | 30.4571 | -91.5745 | -285.9339 | -11.6051 | -12.6819 |
+ | 0.0 | 21.22 | 5200 | 0.0000 | 20.1736 | -10.3226 | 1.0 | 30.4962 | -91.6847 | -285.9659 | -11.6057 | -12.6823 |
+ | 0.0 | 21.63 | 5300 | 0.0000 | 20.1824 | -10.3241 | 1.0 | 30.5065 | -91.6877 | -285.9483 | -11.6061 | -12.6830 |
+ | 0.0 | 22.04 | 5400 | 0.0000 | 20.1732 | -10.3699 | 1.0 | 30.5431 | -91.7793 | -285.9666 | -11.6051 | -12.6797 |
+ | 0.0 | 22.45 | 5500 | 0.0000 | 20.5647 | -10.3381 | 1.0 | 30.9027 | -91.7156 | -285.1837 | -11.6065 | -12.6773 |
+ | 0.0 | 22.86 | 5600 | 0.0000 | 20.5540 | -10.3886 | 1.0 | 30.9426 | -91.8166 | -285.2050 | -11.6059 | -12.6761 |
+ | 0.0 | 23.27 | 5700 | 0.0000 | 20.5442 | -10.3824 | 1.0 | 30.9267 | -91.8043 | -285.2246 | -11.6076 | -12.6788 |
+ | 0.0 | 23.67 | 5800 | 0.0000 | 20.5517 | -10.4140 | 1.0 | 30.9657 | -91.8675 | -285.2097 | -11.6099 | -12.6809 |
+ | 0.0 | 24.08 | 5900 | 0.0000 | 20.5647 | -10.4280 | 1.0 | 30.9927 | -91.8955 | -285.1837 | -11.6096 | -12.6804 |
+ | 0.0 | 24.49 | 6000 | 0.0000 | 20.6521 | -10.4626 | 1.0 | 31.1147 | -91.9646 | -285.0089 | -11.6107 | -12.6823 |
+ | 0.0 | 24.9 | 6100 | 0.0000 | 20.6569 | -10.4643 | 1.0 | 31.1212 | -91.9680 | -284.9993 | -11.6109 | -12.6826 |
+ | 0.0 | 25.31 | 6200 | 0.0000 | 20.6600 | -10.4637 | 1.0 | 31.1238 | -91.9669 | -284.9930 | -11.6118 | -12.6838 |
+ | 0.0 | 25.71 | 6300 | 0.0000 | 20.6544 | -10.4876 | 1.0 | 31.1420 | -92.0146 | -285.0042 | -11.6117 | -12.6833 |
+ | 0.0 | 26.12 | 6400 | 0.0000 | 20.6428 | -10.5264 | 1.0 | 31.1692 | -92.0923 | -285.0274 | -11.6141 | -12.6869 |
+ | 0.0 | 26.53 | 6500 | 0.0000 | 20.6443 | -10.5316 | 1.0 | 31.1758 | -92.1026 | -285.0245 | -11.6142 | -12.6869 |
+ | 0.0 | 26.94 | 6600 | 0.0000 | 20.6314 | -10.5251 | 1.0 | 31.1566 | -92.0897 | -285.0502 | -11.6162 | -12.6900 |
+ | 0.0 | 27.35 | 6700 | 0.0000 | 20.6378 | -10.5259 | 1.0 | 31.1637 | -92.0912 | -285.0375 | -11.6175 | -12.6919 |
+ | 0.0 | 27.76 | 6800 | 0.0000 | 20.6497 | -10.5256 | 1.0 | 31.1754 | -92.0907 | -285.0136 | -11.6195 | -12.6951 |
+ | 0.0 | 28.16 | 6900 | 0.0000 | 20.6415 | -10.5752 | 1.0 | 31.2167 | -92.1899 | -285.0301 | -11.6187 | -12.6923 |
+ | 0.0 | 28.57 | 7000 | 0.0000 | 20.7394 | -10.6843 | 1.0 | 31.4237 | -92.4081 | -284.8342 | -11.6178 | -12.6906 |
+ | 0.0 | 28.98 | 7100 | 0.0000 | 20.7446 | -10.6882 | 1.0 | 31.4328 | -92.4159 | -284.8239 | -11.6186 | -12.6916 |
+ | 0.0 | 29.39 | 7200 | 0.0000 | 20.7502 | -10.6915 | 1.0 | 31.4417 | -92.4224 | -284.8127 | -11.6190 | -12.6923 |
+ | 0.0 | 29.8 | 7300 | 0.0000 | 20.7515 | -10.6967 | 1.0 | 31.4482 | -92.4328 | -284.8100 | -11.6190 | -12.6923 |
+ | 0.0 | 30.2 | 7400 | 0.0000 | 20.7524 | -10.7011 | 1.0 | 31.4535 | -92.4416 | -284.8083 | -11.6192 | -12.6925 |
+ | 0.0 | 30.61 | 7500 | 0.0000 | 20.7499 | -10.7111 | 1.0 | 31.4610 | -92.4616 | -284.8133 | -11.6191 | -12.6922 |
+ | 0.0 | 31.02 | 7600 | 0.0000 | 20.7487 | -10.7160 | 1.0 | 31.4647 | -92.4715 | -284.8157 | -11.6192 | -12.6922 |
+ | 0.0 | 31.43 | 7700 | 0.0000 | 20.7477 | -10.7229 | 1.0 | 31.4705 | -92.4852 | -284.8177 | -11.6191 | -12.6919 |
+ | 0.0 | 31.84 | 7800 | 0.0000 | 20.7512 | -10.7255 | 1.0 | 31.4766 | -92.4904 | -284.8107 | -11.6191 | -12.6921 |
+ | 0.0 | 32.24 | 7900 | 0.0000 | 20.7510 | -10.7372 | 1.0 | 31.4881 | -92.5138 | -284.8111 | -11.6195 | -12.6924 |
+ | 0.0 | 32.65 | 8000 | 0.0000 | 20.7508 | -10.7382 | 1.0 | 31.4890 | -92.5158 | -284.8114 | -11.6194 | -12.6924 |
 
 
  ### Framework versions
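
The reward and log-probability columns above (rewards/*, logps/*, logits/*) are the metrics TRL's DPOTrainer logs, so this commit most likely extends a DPO (direct preference optimization) run on top of the SFT checkpoint from 2000 to 8000 steps; the ever-widening margins with a loss pinned at 0.0000 are consistent with a saturated preference objective. The sketch below is hedged, not the author's script: the learning rate, batch size, DPO beta, LoRA rank/alpha, and the dataset are not visible in this diff, and TRL argument names vary across releases.

```python
# Hedged DPO sketch matching only the hyperparameters visible in the card.
# Placeholders (NOT in the diff): learning_rate, r, lora_alpha, the dataset.
from datasets import Dataset
from peft import LoraConfig
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base_id = "ahmedabdelwahed/Mojiz-sft"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForSeq2SeqLM.from_pretrained(base_id)

# DPOTrainer consumes preference pairs with these three columns.
pairs = Dataset.from_dict({
    "prompt":   ["..."],
    "chosen":   ["..."],
    "rejected": ["..."],
})

args = DPOConfig(
    output_dir="mojiz-dpo",
    max_steps=8000,              # training_steps after this commit
    warmup_steps=150,            # lr_scheduler_warmup_steps from the card
    lr_scheduler_type="linear",  # from the card
    adam_beta1=0.9,              # Adam betas/epsilon from the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    learning_rate=5e-5,          # placeholder: not shown in this hunk
)

trainer = DPOTrainer(
    model=model,
    args=args,
    train_dataset=pairs,
    processing_class=tokenizer,  # called `tokenizer` in older TRL releases
    peft_config=LoraConfig(
        r=8, lora_alpha=16,          # placeholders
        target_modules=["q", "v"],   # from adapter_config.json below
        task_type="SEQ_2_SEQ_LM",    # from adapter_config.json below
    ),
)
trainer.train()
```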
adapter_config.json CHANGED
@@ -19,8 +19,8 @@
    "rank_pattern": {},
    "revision": null,
    "target_modules": [
-     "v",
-     "q"
+     "q",
+     "v"
    ],
    "task_type": "SEQ_2_SEQ_LM"
  }
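
The only substantive change in adapter_config.json is the serialization order of target_modules. PEFT matches target modules as a set of names, so ["v", "q"] and ["q", "v"] select the same projections (bare "q"/"v" module names are characteristic of T5-family attention blocks), and adapter placement is unchanged:

```python
# The reordering is cosmetic: module matching is set-based, not order-based.
old_order = {"v", "q"}
new_order = {"q", "v"}
assert old_order == new_order  # same modules targeted either way
```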
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:cee3e1bfa215df1a3e688744448a3a01bed605da4d393df476ecddf79c902fc5
+ oid sha256:cf688099d883d2006cc879009f62f9c8207a9950ffd18be19933ccd425426184
  size 7098016
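
The weights pointer gets a new hash but an identical size (7098016 bytes): same tensor shapes, retrained values, as expected when only the step count changes. A hedged usage sketch follows; the adapter repo id is a placeholder for wherever this commit lives, and the input text is illustrative:

```python
# Hedged sketch: load the base checkpoint and apply this LoRA adapter.
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

base_id = "ahmedabdelwahed/Mojiz-sft"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForSeq2SeqLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, "ahmedabdelwahed/Mojiz-dpo")  # placeholder repo id

inputs = tokenizer("...", return_tensors="pt")  # illustrative input
print(tokenizer.decode(model.generate(**inputs)[0], skip_special_tokens=True))
```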
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:fe74c66fc54071506a581583a53f3e4402f016002b22d95b06e97f597f7273e5
+ oid sha256:c0d7edfc77632a0a0266c5f66cb96963c7a1b158f40b17aeac6a895951e04415
  size 4219
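
training_args.bin is the pickled arguments object the HF Trainer writes next to its checkpoints; the identical 4219-byte size with a new hash matches a config-value change such as the step-count bump above. A quick inspection sketch, assuming the matching trl/transformers versions are installed (newer torch defaults to weights_only=True, which rejects pickled objects):

```python
# Hedged sketch: read back the recorded hyperparameters.
import torch

# Pickled object: only load files from sources you trust.
args = torch.load("training_args.bin", weights_only=False)
print(args.max_steps, args.warmup_steps, args.lr_scheduler_type)
```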