haryoaw committed on
Commit 5bdb518
1 Parent(s): 73b107f

Initial Commit

Files changed (3):
  1. README.md +115 -115
  2. pytorch_model.bin +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -20,9 +20,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [FacebookAI/xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the massive dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.8062
- - Accuracy: 0.8268
- - F1: 0.8025
+ - Loss: 1.8147
+ - Accuracy: 0.8277
+ - F1: 0.8039
 
 ## Model description
 
@@ -53,118 +53,118 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
 |:-------------:|:-----:|:------:|:---------------:|:--------:|:------:|
- | 5.8616 | 0.27 | 5000 | 5.7378 | 0.3131 | 0.1357 |
- | 4.0354 | 0.53 | 10000 | 4.0456 | 0.5639 | 0.4108 |
- | 3.214 | 0.8 | 15000 | 3.2320 | 0.6631 | 0.5563 |
- | 2.5727 | 1.07 | 20000 | 2.8403 | 0.7076 | 0.6163 |
- | 2.3415 | 1.34 | 25000 | 2.6493 | 0.7307 | 0.6653 |
- | 2.2015 | 1.6 | 30000 | 2.4508 | 0.7523 | 0.6823 |
- | 2.1388 | 1.87 | 35000 | 2.3180 | 0.7656 | 0.7033 |
- | 1.6356 | 2.14 | 40000 | 2.3088 | 0.7689 | 0.7156 |
- | 1.6347 | 2.41 | 45000 | 2.1922 | 0.7797 | 0.7354 |
- | 1.6499 | 2.67 | 50000 | 2.1674 | 0.7868 | 0.7428 |
- | 1.5928 | 2.94 | 55000 | 2.0982 | 0.7911 | 0.7513 |
- | 1.2806 | 3.21 | 60000 | 2.1371 | 0.7923 | 0.7537 |
- | 1.3074 | 3.47 | 65000 | 2.1245 | 0.7915 | 0.7541 |
- | 1.2884 | 3.74 | 70000 | 2.0867 | 0.7953 | 0.7601 |
- | 1.2463 | 4.01 | 75000 | 2.0390 | 0.8025 | 0.7662 |
- | 1.0794 | 4.28 | 80000 | 2.0816 | 0.8009 | 0.7686 |
- | 1.0835 | 4.54 | 85000 | 2.0316 | 0.8034 | 0.7710 |
- | 1.1207 | 4.81 | 90000 | 2.0470 | 0.8004 | 0.7688 |
- | 0.9022 | 5.08 | 95000 | 2.0423 | 0.8030 | 0.7701 |
- | 0.9286 | 5.34 | 100000 | 2.0434 | 0.8041 | 0.7715 |
- | 0.9315 | 5.61 | 105000 | 2.0235 | 0.8076 | 0.7726 |
- | 0.9399 | 5.88 | 110000 | 2.0008 | 0.8079 | 0.7754 |
- | 0.81 | 6.15 | 115000 | 2.0305 | 0.8084 | 0.7779 |
- | 0.806 | 6.41 | 120000 | 2.0511 | 0.8074 | 0.7814 |
- | 0.8236 | 6.68 | 125000 | 2.0400 | 0.8093 | 0.7814 |
- | 0.8242 | 6.95 | 130000 | 2.0178 | 0.8108 | 0.7823 |
- | 0.7284 | 7.22 | 135000 | 2.0302 | 0.8100 | 0.7804 |
- | 0.7307 | 7.48 | 140000 | 2.0083 | 0.8107 | 0.7835 |
- | 0.7513 | 7.75 | 145000 | 2.0581 | 0.8091 | 0.7788 |
- | 0.7044 | 8.02 | 150000 | 2.0368 | 0.8100 | 0.7853 |
- | 0.652 | 8.28 | 155000 | 1.9963 | 0.8134 | 0.7884 |
- | 0.6853 | 8.55 | 160000 | 1.9990 | 0.8123 | 0.7877 |
- | 0.6805 | 8.82 | 165000 | 2.0179 | 0.8111 | 0.7821 |
- | 0.6164 | 9.09 | 170000 | 2.0073 | 0.8140 | 0.7860 |
- | 0.6112 | 9.35 | 175000 | 2.0204 | 0.8139 | 0.7901 |
- | 0.6364 | 9.62 | 180000 | 1.9777 | 0.8152 | 0.7889 |
- | 0.6379 | 9.89 | 185000 | 1.9776 | 0.8152 | 0.7875 |
- | 0.5741 | 10.15 | 190000 | 2.0021 | 0.8161 | 0.7898 |
- | 0.5912 | 10.42 | 195000 | 1.9818 | 0.8151 | 0.7898 |
- | 0.5889 | 10.69 | 200000 | 2.0130 | 0.8156 | 0.7937 |
- | 0.5912 | 10.96 | 205000 | 1.9661 | 0.8168 | 0.7914 |
- | 0.5541 | 11.22 | 210000 | 1.9663 | 0.8183 | 0.7928 |
- | 0.5668 | 11.49 | 215000 | 1.9864 | 0.8160 | 0.7895 |
- | 0.5706 | 11.76 | 220000 | 1.9790 | 0.8163 | 0.7916 |
- | 0.5272 | 12.03 | 225000 | 1.9643 | 0.8168 | 0.7948 |
- | 0.5188 | 12.29 | 230000 | 1.9595 | 0.8176 | 0.7944 |
- | 0.5248 | 12.56 | 235000 | 1.9694 | 0.8163 | 0.7915 |
- | 0.5115 | 12.83 | 240000 | 1.9617 | 0.8179 | 0.7913 |
- | 0.4857 | 13.09 | 245000 | 1.9211 | 0.8200 | 0.7931 |
- | 0.4943 | 13.36 | 250000 | 1.9530 | 0.8193 | 0.7965 |
- | 0.5122 | 13.63 | 255000 | 1.9413 | 0.8182 | 0.7892 |
- | 0.5008 | 13.9 | 260000 | 1.9578 | 0.8170 | 0.7951 |
- | 0.4694 | 14.16 | 265000 | 1.9340 | 0.8193 | 0.7958 |
- | 0.4718 | 14.43 | 270000 | 1.9197 | 0.8210 | 0.7957 |
- | 0.4785 | 14.7 | 275000 | 1.9179 | 0.8209 | 0.7942 |
- | 0.49 | 14.96 | 280000 | 1.9358 | 0.8194 | 0.7925 |
- | 0.4561 | 15.23 | 285000 | 1.9124 | 0.8210 | 0.7958 |
- | 0.4702 | 15.5 | 290000 | 1.9025 | 0.8213 | 0.7961 |
- | 0.4637 | 15.77 | 295000 | 1.9314 | 0.8189 | 0.7944 |
- | 0.4363 | 16.03 | 300000 | 1.8834 | 0.8230 | 0.7979 |
- | 0.4513 | 16.3 | 305000 | 1.8924 | 0.8228 | 0.8007 |
- | 0.4608 | 16.57 | 310000 | 1.8992 | 0.8221 | 0.7997 |
- | 0.4416 | 16.84 | 315000 | 1.8989 | 0.8221 | 0.7984 |
- | 0.4273 | 17.1 | 320000 | 1.8973 | 0.8221 | 0.7997 |
- | 0.432 | 17.37 | 325000 | 1.8992 | 0.8219 | 0.7990 |
- | 0.4287 | 17.64 | 330000 | 1.8878 | 0.8225 | 0.7993 |
- | 0.4366 | 17.9 | 335000 | 1.9000 | 0.8214 | 0.7964 |
- | 0.4143 | 18.17 | 340000 | 1.8816 | 0.8225 | 0.7973 |
- | 0.417 | 18.44 | 345000 | 1.8695 | 0.8225 | 0.7972 |
- | 0.4156 | 18.71 | 350000 | 1.8747 | 0.8239 | 0.7990 |
- | 0.419 | 18.97 | 355000 | 1.8639 | 0.8240 | 0.7985 |
- | 0.4101 | 19.24 | 360000 | 1.8648 | 0.8239 | 0.8002 |
- | 0.4058 | 19.51 | 365000 | 1.8534 | 0.8251 | 0.8001 |
- | 0.4095 | 19.77 | 370000 | 1.8595 | 0.8246 | 0.8001 |
- | 0.4034 | 20.04 | 375000 | 1.8520 | 0.8242 | 0.7981 |
- | 0.4045 | 20.31 | 380000 | 1.8533 | 0.8246 | 0.8001 |
- | 0.3922 | 20.58 | 385000 | 1.8548 | 0.8239 | 0.7988 |
- | 0.4046 | 20.84 | 390000 | 1.8570 | 0.8246 | 0.8013 |
- | 0.3971 | 21.11 | 395000 | 1.8472 | 0.8238 | 0.7999 |
- | 0.3929 | 21.38 | 400000 | 1.8406 | 0.8244 | 0.8001 |
- | 0.3881 | 21.65 | 405000 | 1.8515 | 0.8245 | 0.8008 |
- | 0.3937 | 21.91 | 410000 | 1.8364 | 0.8240 | 0.8003 |
- | 0.3807 | 22.18 | 415000 | 1.8360 | 0.8254 | 0.7999 |
- | 0.3762 | 22.45 | 420000 | 1.8352 | 0.8253 | 0.7998 |
- | 0.389 | 22.71 | 425000 | 1.8429 | 0.8245 | 0.7991 |
- | 0.3751 | 22.98 | 430000 | 1.8262 | 0.8256 | 0.8016 |
- | 0.3637 | 23.25 | 435000 | 1.8275 | 0.8252 | 0.8014 |
- | 0.3758 | 23.52 | 440000 | 1.8359 | 0.8252 | 0.8004 |
- | 0.3773 | 23.78 | 445000 | 1.8230 | 0.8240 | 0.7982 |
- | 0.3573 | 24.05 | 450000 | 1.8289 | 0.8251 | 0.8000 |
- | 0.3689 | 24.32 | 455000 | 1.8260 | 0.8269 | 0.8035 |
- | 0.3661 | 24.58 | 460000 | 1.8287 | 0.8255 | 0.8009 |
- | 0.3648 | 24.85 | 465000 | 1.8294 | 0.8259 | 0.8024 |
- | 0.3521 | 25.12 | 470000 | 1.8233 | 0.8261 | 0.8031 |
- | 0.359 | 25.39 | 475000 | 1.8137 | 0.8264 | 0.8006 |
- | 0.3616 | 25.65 | 480000 | 1.8337 | 0.8245 | 0.8004 |
- | 0.361 | 25.92 | 485000 | 1.8157 | 0.8262 | 0.8008 |
- | 0.3472 | 26.19 | 490000 | 1.8244 | 0.8263 | 0.8019 |
- | 0.3532 | 26.46 | 495000 | 1.8270 | 0.8246 | 0.8011 |
- | 0.3497 | 26.72 | 500000 | 1.8150 | 0.8266 | 0.8020 |
- | 0.3539 | 26.99 | 505000 | 1.8159 | 0.8262 | 0.8016 |
- | 0.3508 | 27.26 | 510000 | 1.8135 | 0.8256 | 0.8014 |
- | 0.3397 | 27.52 | 515000 | 1.8092 | 0.8255 | 0.8004 |
- | 0.3475 | 27.79 | 520000 | 1.8093 | 0.8259 | 0.8005 |
- | 0.3431 | 28.06 | 525000 | 1.8102 | 0.8263 | 0.8008 |
- | 0.3473 | 28.33 | 530000 | 1.8088 | 0.8265 | 0.8011 |
- | 0.3435 | 28.59 | 535000 | 1.8042 | 0.8263 | 0.8017 |
- | 0.3425 | 28.86 | 540000 | 1.8117 | 0.8260 | 0.8012 |
- | 0.3347 | 29.13 | 545000 | 1.8131 | 0.8264 | 0.8017 |
- | 0.3322 | 29.39 | 550000 | 1.8088 | 0.8257 | 0.8007 |
- | 0.3375 | 29.66 | 555000 | 1.8070 | 0.8255 | 0.8006 |
- | 0.3346 | 29.93 | 560000 | 1.8062 | 0.8268 | 0.8025 |
+ | 6.0249 | 0.27 | 5000 | 5.8081 | 0.3147 | 0.1329 |
+ | 4.3373 | 0.53 | 10000 | 4.3107 | 0.5396 | 0.3628 |
+ | 3.4678 | 0.8 | 15000 | 3.4302 | 0.6411 | 0.5106 |
+ | 2.7173 | 1.07 | 20000 | 3.0007 | 0.6913 | 0.5899 |
+ | 2.4577 | 1.34 | 25000 | 2.7295 | 0.7203 | 0.6347 |
+ | 2.2593 | 1.6 | 30000 | 2.5136 | 0.7454 | 0.6683 |
+ | 2.1902 | 1.87 | 35000 | 2.3803 | 0.7600 | 0.7085 |
+ | 1.6969 | 2.14 | 40000 | 2.2853 | 0.7723 | 0.7110 |
+ | 1.6768 | 2.41 | 45000 | 2.2101 | 0.7813 | 0.7304 |
+ | 1.6905 | 2.67 | 50000 | 2.1416 | 0.7879 | 0.7439 |
+ | 1.6272 | 2.94 | 55000 | 2.0825 | 0.7935 | 0.7525 |
+ | 1.3056 | 3.21 | 60000 | 2.1342 | 0.7916 | 0.7485 |
+ | 1.3281 | 3.47 | 65000 | 2.0752 | 0.7961 | 0.7578 |
+ | 1.3173 | 3.74 | 70000 | 2.0450 | 0.7997 | 0.7627 |
+ | 1.25 | 4.01 | 75000 | 2.0475 | 0.8050 | 0.7647 |
+ | 1.1003 | 4.28 | 80000 | 2.0769 | 0.8008 | 0.7647 |
+ | 1.085 | 4.54 | 85000 | 2.0143 | 0.8055 | 0.7729 |
+ | 1.1138 | 4.81 | 90000 | 2.0097 | 0.8062 | 0.7741 |
+ | 0.8998 | 5.08 | 95000 | 2.0293 | 0.8042 | 0.7687 |
+ | 0.941 | 5.34 | 100000 | 2.0463 | 0.8035 | 0.7730 |
+ | 0.9522 | 5.61 | 105000 | 2.0034 | 0.8080 | 0.7740 |
+ | 0.9612 | 5.88 | 110000 | 1.9783 | 0.8129 | 0.7818 |
+ | 0.8171 | 6.15 | 115000 | 2.0325 | 0.8105 | 0.7782 |
+ | 0.8311 | 6.41 | 120000 | 2.0537 | 0.8083 | 0.7822 |
+ | 0.8211 | 6.68 | 125000 | 2.0400 | 0.8117 | 0.7812 |
+ | 0.8373 | 6.95 | 130000 | 2.0087 | 0.8089 | 0.7817 |
+ | 0.7282 | 7.22 | 135000 | 2.0047 | 0.8129 | 0.7851 |
+ | 0.7229 | 7.48 | 140000 | 2.0254 | 0.8092 | 0.7847 |
+ | 0.7496 | 7.75 | 145000 | 1.9874 | 0.8159 | 0.7890 |
+ | 0.712 | 8.02 | 150000 | 2.0405 | 0.8122 | 0.7851 |
+ | 0.671 | 8.28 | 155000 | 2.0360 | 0.8128 | 0.7891 |
+ | 0.6774 | 8.55 | 160000 | 2.0099 | 0.8136 | 0.7899 |
+ | 0.7075 | 8.82 | 165000 | 1.9979 | 0.8131 | 0.7852 |
+ | 0.6209 | 9.09 | 170000 | 1.9955 | 0.8152 | 0.7908 |
+ | 0.6128 | 9.35 | 175000 | 1.9814 | 0.8161 | 0.7927 |
+ | 0.6485 | 9.62 | 180000 | 1.9914 | 0.8159 | 0.7925 |
+ | 0.6385 | 9.89 | 185000 | 1.9978 | 0.8169 | 0.7939 |
+ | 0.5789 | 10.15 | 190000 | 2.0151 | 0.8157 | 0.7938 |
+ | 0.5933 | 10.42 | 195000 | 1.9624 | 0.8191 | 0.7952 |
+ | 0.602 | 10.69 | 200000 | 1.9850 | 0.8185 | 0.7931 |
+ | 0.5901 | 10.96 | 205000 | 1.9757 | 0.8175 | 0.7913 |
+ | 0.5686 | 11.22 | 210000 | 1.9776 | 0.8168 | 0.7928 |
+ | 0.5642 | 11.49 | 215000 | 2.0007 | 0.8155 | 0.7918 |
+ | 0.5734 | 11.76 | 220000 | 1.9678 | 0.8183 | 0.7939 |
+ | 0.5281 | 12.03 | 225000 | 1.9652 | 0.8172 | 0.7948 |
+ | 0.5051 | 12.29 | 230000 | 1.9726 | 0.8188 | 0.7933 |
+ | 0.5247 | 12.56 | 235000 | 1.9615 | 0.8188 | 0.7937 |
+ | 0.5239 | 12.83 | 240000 | 1.9451 | 0.8188 | 0.7964 |
+ | 0.4843 | 13.09 | 245000 | 1.9342 | 0.8203 | 0.7939 |
+ | 0.5033 | 13.36 | 250000 | 1.9629 | 0.8182 | 0.7964 |
+ | 0.5131 | 13.63 | 255000 | 1.9466 | 0.8181 | 0.7959 |
+ | 0.5116 | 13.9 | 260000 | 1.9256 | 0.8206 | 0.7966 |
+ | 0.4832 | 14.16 | 265000 | 1.9252 | 0.8206 | 0.7993 |
+ | 0.4746 | 14.43 | 270000 | 1.9285 | 0.8197 | 0.7956 |
+ | 0.483 | 14.7 | 275000 | 1.9409 | 0.8188 | 0.7933 |
+ | 0.4955 | 14.96 | 280000 | 1.9275 | 0.8211 | 0.7963 |
+ | 0.4609 | 15.23 | 285000 | 1.9160 | 0.8213 | 0.7963 |
+ | 0.477 | 15.5 | 290000 | 1.9189 | 0.8218 | 0.7997 |
+ | 0.4685 | 15.77 | 295000 | 1.9135 | 0.8216 | 0.7970 |
+ | 0.4449 | 16.03 | 300000 | 1.8993 | 0.8232 | 0.7964 |
+ | 0.444 | 16.3 | 305000 | 1.8900 | 0.8231 | 0.7979 |
+ | 0.4584 | 16.57 | 310000 | 1.9016 | 0.8220 | 0.7991 |
+ | 0.4401 | 16.84 | 315000 | 1.8982 | 0.8213 | 0.7951 |
+ | 0.4252 | 17.1 | 320000 | 1.8930 | 0.8228 | 0.7997 |
+ | 0.438 | 17.37 | 325000 | 1.8836 | 0.8222 | 0.7984 |
+ | 0.4391 | 17.64 | 330000 | 1.8935 | 0.8220 | 0.7991 |
+ | 0.4471 | 17.9 | 335000 | 1.8970 | 0.8212 | 0.7988 |
+ | 0.4159 | 18.17 | 340000 | 1.8877 | 0.8235 | 0.8001 |
+ | 0.4227 | 18.44 | 345000 | 1.8950 | 0.8226 | 0.7995 |
+ | 0.4178 | 18.71 | 350000 | 1.8888 | 0.8233 | 0.7992 |
+ | 0.4214 | 18.97 | 355000 | 1.8758 | 0.8222 | 0.7974 |
+ | 0.4113 | 19.24 | 360000 | 1.8703 | 0.8235 | 0.7975 |
+ | 0.4066 | 19.51 | 365000 | 1.8790 | 0.8243 | 0.7994 |
+ | 0.413 | 19.77 | 370000 | 1.8561 | 0.8248 | 0.8013 |
+ | 0.4099 | 20.04 | 375000 | 1.8576 | 0.8240 | 0.7993 |
+ | 0.4092 | 20.31 | 380000 | 1.8591 | 0.8252 | 0.8030 |
+ | 0.3952 | 20.58 | 385000 | 1.8563 | 0.8258 | 0.8023 |
+ | 0.4017 | 20.84 | 390000 | 1.8603 | 0.8253 | 0.8009 |
+ | 0.3964 | 21.11 | 395000 | 1.8472 | 0.8251 | 0.8028 |
+ | 0.3891 | 21.38 | 400000 | 1.8540 | 0.8251 | 0.8014 |
+ | 0.3881 | 21.65 | 405000 | 1.8558 | 0.8247 | 0.8010 |
+ | 0.3953 | 21.91 | 410000 | 1.8594 | 0.8244 | 0.8005 |
+ | 0.3872 | 22.18 | 415000 | 1.8576 | 0.8249 | 0.7995 |
+ | 0.3754 | 22.45 | 420000 | 1.8486 | 0.8255 | 0.8008 |
+ | 0.3798 | 22.71 | 425000 | 1.8511 | 0.8252 | 0.7998 |
+ | 0.3744 | 22.98 | 430000 | 1.8491 | 0.8242 | 0.7987 |
+ | 0.3691 | 23.25 | 435000 | 1.8343 | 0.8247 | 0.7995 |
+ | 0.3718 | 23.52 | 440000 | 1.8496 | 0.8247 | 0.7997 |
+ | 0.3761 | 23.78 | 445000 | 1.8433 | 0.8251 | 0.8016 |
+ | 0.3634 | 24.05 | 450000 | 1.8397 | 0.8247 | 0.8000 |
+ | 0.3704 | 24.32 | 455000 | 1.8334 | 0.8254 | 0.8007 |
+ | 0.3651 | 24.58 | 460000 | 1.8367 | 0.8255 | 0.8022 |
+ | 0.3649 | 24.85 | 465000 | 1.8321 | 0.8258 | 0.8022 |
+ | 0.3573 | 25.12 | 470000 | 1.8358 | 0.8255 | 0.8020 |
+ | 0.355 | 25.39 | 475000 | 1.8301 | 0.8259 | 0.8023 |
+ | 0.3595 | 25.65 | 480000 | 1.8257 | 0.8263 | 0.8042 |
+ | 0.3625 | 25.92 | 485000 | 1.8244 | 0.8265 | 0.8025 |
+ | 0.3508 | 26.19 | 490000 | 1.8303 | 0.8254 | 0.8021 |
+ | 0.3578 | 26.46 | 495000 | 1.8261 | 0.8264 | 0.8035 |
+ | 0.3506 | 26.72 | 500000 | 1.8200 | 0.8267 | 0.8029 |
+ | 0.3514 | 26.99 | 505000 | 1.8247 | 0.8266 | 0.8032 |
+ | 0.3496 | 27.26 | 510000 | 1.8255 | 0.8269 | 0.8046 |
+ | 0.3435 | 27.52 | 515000 | 1.8173 | 0.8262 | 0.8019 |
+ | 0.3502 | 27.79 | 520000 | 1.8137 | 0.8269 | 0.8035 |
+ | 0.3463 | 28.06 | 525000 | 1.8167 | 0.8265 | 0.8024 |
+ | 0.344 | 28.33 | 530000 | 1.8144 | 0.8275 | 0.8023 |
+ | 0.3469 | 28.59 | 535000 | 1.8133 | 0.8268 | 0.8021 |
+ | 0.3458 | 28.86 | 540000 | 1.8153 | 0.8254 | 0.8013 |
+ | 0.3387 | 29.13 | 545000 | 1.8111 | 0.8269 | 0.8019 |
+ | 0.3346 | 29.39 | 550000 | 1.8097 | 0.8271 | 0.8036 |
+ | 0.3393 | 29.66 | 555000 | 1.8179 | 0.8268 | 0.8033 |
+ | 0.3398 | 29.93 | 560000 | 1.8147 | 0.8277 | 0.8039 |
 
 
 ### Framework versions
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:37d3003635b4704dd77bf04c27f1243b1a9962487db9eb2ab8b71654a73074b9
+ oid sha256:d7a40dbc7bb0d2e66bbcfc7e473c22b5aa9b8e5b3ced8ce83da0d5ce30aabbba
 size 429287542
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:99284134701a1f4db59dc3bc1a3bba315919e7e9c091657c472b71b4467e92eb
+ oid sha256:bdce10ed0481c617e83668c5e6abf015ee42a56a060b50d4b3d48fd159d20651
 size 4600
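As a quick sanity check on what this commit changes, the evaluation-set deltas implied by the README metric lines can be computed directly. This is just a sketch over the numbers quoted in the diff above; the variable names are illustrative, not part of the repository:

```python
# Evaluation-set metrics before and after this commit,
# copied from the "-" / "+" metric lines in the README diff.
old = {"loss": 1.8062, "accuracy": 0.8268, "f1": 0.8025}
new = {"loss": 1.8147, "accuracy": 0.8277, "f1": 0.8039}

# Signed change per metric (note: for loss, lower is better).
delta = {k: round(new[k] - old[k], 4) for k in old}
print(delta)  # {'loss': 0.0085, 'accuracy': 0.0009, 'f1': 0.0014}
```

So the replacement checkpoint trades a slightly higher evaluation loss (+0.0085) for small gains in accuracy and F1, consistent with the final table row on each side of the diff.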