LecJackS committed
Commit d9f2004
1 Parent(s): cf9b827

update model card README.md

Files changed (1)
  1. README.md +107 -13
README.md CHANGED
@@ -21,7 +21,7 @@ model-index:
   metrics:
   - name: F1
     type: f1
-    value: 0.9525165396532953
+    value: 0.948444966049124
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
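The `f1` value bumped in the hunk above is, in the usual `Trainer` token-classification setup, the entity-level F1 computed by seqeval; that this run used exactly that implementation is an assumption, but a minimal sketch of the metric looks like this:

```python
# Sketch, assuming the card's F1 is seqeval's entity-level F1 (the usual
# metric for conll2003 token classification). Label sequences are made up.
import evaluate

seqeval = evaluate.load("seqeval")
predictions = [["B-PER", "I-PER", "O", "B-LOC", "O"]]
references = [["B-PER", "I-PER", "O", "B-ORG", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_f1"])  # 0.5: the PER span matches, LOC vs ORG does not
```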
@@ -31,8 +31,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the conll2003 dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.0409
-- F1: 0.9525
+- Loss: 0.0898
+- F1: 0.9484
 
 ## Model description
 
@@ -57,23 +57,117 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 6
+- num_epochs: 100
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | F1     |
-|:-------------:|:-----:|:----:|:---------------:|:------:|
-| 0.1394 | 1.0 | 439 | 0.0482 | 0.9274 |
-| 0.0399 | 2.0 | 878 | 0.0393 | 0.9407 |
-| 0.0221 | 3.0 | 1317 | 0.0357 | 0.9488 |
-| 0.013 | 4.0 | 1756 | 0.0355 | 0.9518 |
-| 0.0077 | 5.0 | 2195 | 0.0404 | 0.9524 |
-| 0.0046 | 6.0 | 2634 | 0.0409 | 0.9525 |
+| Training Loss | Epoch | Step  | Validation Loss | F1     |
+|:-------------:|:-----:|:-----:|:---------------:|:------:|
+| 0.1415 | 1.0 | 439 | 0.0447 | 0.9367 |
+| 0.0429 | 2.0 | 878 | 0.0437 | 0.9310 |
+| 0.0259 | 3.0 | 1317 | 0.0534 | 0.9328 |
+| 0.0195 | 4.0 | 1756 | 0.0449 | 0.9429 |
+| 0.0146 | 5.0 | 2195 | 0.0484 | 0.9421 |
+| 0.0121 | 6.0 | 2634 | 0.0523 | 0.9392 |
+| 0.0099 | 7.0 | 3073 | 0.0500 | 0.9428 |
+| 0.0077 | 8.0 | 3512 | 0.0536 | 0.9423 |
+| 0.008 | 9.0 | 3951 | 0.0672 | 0.9254 |
+| 0.0079 | 10.0 | 4390 | 0.0589 | 0.9442 |
+| 0.007 | 11.0 | 4829 | 0.0669 | 0.9400 |
+| 0.0051 | 12.0 | 5268 | 0.0602 | 0.9409 |
+| 0.0052 | 13.0 | 5707 | 0.0639 | 0.9441 |
+| 0.0036 | 14.0 | 6146 | 0.0635 | 0.9431 |
+| 0.0033 | 15.0 | 6585 | 0.0858 | 0.9328 |
+| 0.0038 | 16.0 | 7024 | 0.0653 | 0.9478 |
+| 0.0047 | 17.0 | 7463 | 0.0689 | 0.9431 |
+| 0.0039 | 18.0 | 7902 | 0.0687 | 0.9442 |
+| 0.0031 | 19.0 | 8341 | 0.0687 | 0.9459 |
+| 0.0027 | 20.0 | 8780 | 0.0785 | 0.9424 |
+| 0.0047 | 21.0 | 9219 | 0.0654 | 0.9444 |
+| 0.0035 | 22.0 | 9658 | 0.0748 | 0.9454 |
+| 0.0021 | 23.0 | 10097 | 0.0714 | 0.9423 |
+| 0.003 | 24.0 | 10536 | 0.0730 | 0.9433 |
+| 0.0031 | 25.0 | 10975 | 0.0682 | 0.9417 |
+| 0.0021 | 26.0 | 11414 | 0.0762 | 0.9407 |
+| 0.0025 | 27.0 | 11853 | 0.0773 | 0.9391 |
+| 0.0019 | 28.0 | 12292 | 0.0739 | 0.9420 |
+| 0.0032 | 29.0 | 12731 | 0.0755 | 0.9413 |
+| 0.0023 | 30.0 | 13170 | 0.0755 | 0.9439 |
+| 0.0024 | 31.0 | 13609 | 0.0747 | 0.9456 |
+| 0.0018 | 32.0 | 14048 | 0.0730 | 0.9430 |
+| 0.0017 | 33.0 | 14487 | 0.0866 | 0.9385 |
+| 0.0019 | 34.0 | 14926 | 0.0695 | 0.9440 |
+| 0.0016 | 35.0 | 15365 | 0.0818 | 0.9442 |
+| 0.0034 | 36.0 | 15804 | 0.0750 | 0.9459 |
+| 0.0019 | 37.0 | 16243 | 0.0808 | 0.9414 |
+| 0.0013 | 38.0 | 16682 | 0.0797 | 0.9422 |
+| 0.0015 | 39.0 | 17121 | 0.0814 | 0.9394 |
+| 0.0019 | 40.0 | 17560 | 0.0757 | 0.9415 |
+| 0.0011 | 41.0 | 17999 | 0.0778 | 0.9453 |
+| 0.0011 | 42.0 | 18438 | 0.0825 | 0.9407 |
+| 0.0012 | 43.0 | 18877 | 0.0767 | 0.9458 |
+| 0.0022 | 44.0 | 19316 | 0.0865 | 0.9396 |
+| 0.0009 | 45.0 | 19755 | 0.0826 | 0.9459 |
+| 0.0008 | 46.0 | 20194 | 0.0819 | 0.9473 |
+| 0.0017 | 47.0 | 20633 | 0.0844 | 0.9420 |
+| 0.0015 | 48.0 | 21072 | 0.0827 | 0.9448 |
+| 0.0014 | 49.0 | 21511 | 0.0800 | 0.9464 |
+| 0.0008 | 50.0 | 21950 | 0.0770 | 0.9474 |
+| 0.0011 | 51.0 | 22389 | 0.0766 | 0.9471 |
+| 0.0006 | 52.0 | 22828 | 0.0896 | 0.9424 |
+| 0.0011 | 53.0 | 23267 | 0.0866 | 0.9425 |
+| 0.001 | 54.0 | 23706 | 0.0853 | 0.9426 |
+| 0.0007 | 55.0 | 24145 | 0.0831 | 0.9462 |
+| 0.0008 | 56.0 | 24584 | 0.0805 | 0.9457 |
+| 0.0008 | 57.0 | 25023 | 0.0866 | 0.9438 |
+| 0.0008 | 58.0 | 25462 | 0.0822 | 0.9421 |
+| 0.0011 | 59.0 | 25901 | 0.0837 | 0.9417 |
+| 0.0007 | 60.0 | 26340 | 0.0823 | 0.9466 |
+| 0.0008 | 61.0 | 26779 | 0.0825 | 0.9425 |
+| 0.0004 | 62.0 | 27218 | 0.0825 | 0.9433 |
+| 0.0005 | 63.0 | 27657 | 0.0826 | 0.9435 |
+| 0.0004 | 64.0 | 28096 | 0.0838 | 0.9437 |
+| 0.0008 | 65.0 | 28535 | 0.0909 | 0.9424 |
+| 0.0004 | 66.0 | 28974 | 0.0825 | 0.9464 |
+| 0.0004 | 67.0 | 29413 | 0.0917 | 0.9454 |
+| 0.0004 | 68.0 | 29852 | 0.0843 | 0.9487 |
+| 0.0005 | 69.0 | 30291 | 0.0825 | 0.9481 |
+| 0.0003 | 70.0 | 30730 | 0.0825 | 0.9456 |
+| 0.0005 | 71.0 | 31169 | 0.0835 | 0.9460 |
+| 0.0003 | 72.0 | 31608 | 0.0906 | 0.9481 |
+| 0.0001 | 73.0 | 32047 | 0.0916 | 0.9471 |
+| 0.0007 | 74.0 | 32486 | 0.0885 | 0.9460 |
+| 0.0003 | 75.0 | 32925 | 0.0879 | 0.9481 |
+| 0.0001 | 76.0 | 33364 | 0.0871 | 0.9505 |
+| 0.0002 | 77.0 | 33803 | 0.0906 | 0.9486 |
+| 0.0003 | 78.0 | 34242 | 0.0934 | 0.9469 |
+| 0.0002 | 79.0 | 34681 | 0.0911 | 0.9466 |
+| 0.0003 | 80.0 | 35120 | 0.0871 | 0.9489 |
+| 0.0003 | 81.0 | 35559 | 0.0876 | 0.9494 |
+| 0.0002 | 82.0 | 35998 | 0.0884 | 0.9482 |
+| 0.0001 | 83.0 | 36437 | 0.0910 | 0.9469 |
+| 0.0002 | 84.0 | 36876 | 0.0874 | 0.9473 |
+| 0.0002 | 85.0 | 37315 | 0.0864 | 0.9463 |
+| 0.0001 | 86.0 | 37754 | 0.0878 | 0.9472 |
+| 0.0002 | 87.0 | 38193 | 0.0836 | 0.9500 |
+| 0.0001 | 88.0 | 38632 | 0.0861 | 0.9495 |
+| 0.0001 | 89.0 | 39071 | 0.0869 | 0.9503 |
+| 0.0001 | 90.0 | 39510 | 0.0878 | 0.9480 |
+| 0.0001 | 91.0 | 39949 | 0.0878 | 0.9501 |
+| 0.0 | 92.0 | 40388 | 0.0886 | 0.9477 |
+| 0.0001 | 93.0 | 40827 | 0.0884 | 0.9497 |
+| 0.0001 | 94.0 | 41266 | 0.0897 | 0.9487 |
+| 0.0001 | 95.0 | 41705 | 0.0896 | 0.9490 |
+| 0.0001 | 96.0 | 42144 | 0.0879 | 0.9499 |
+| 0.0001 | 97.0 | 42583 | 0.0884 | 0.9490 |
+| 0.0001 | 98.0 | 43022 | 0.0899 | 0.9486 |
+| 0.0001 | 99.0 | 43461 | 0.0897 | 0.9488 |
+| 0.0001 | 100.0 | 43900 | 0.0898 | 0.9484 |
 
 
 ### Framework versions
 
-- Transformers 4.29.0
+- Transformers 4.29.1
 - Pytorch 2.0.0+cu118
 - Datasets 2.12.0
 - Tokenizers 0.13.3
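For readers reconstructing this run from the hyperparameter list above, it maps onto `transformers.TrainingArguments` roughly as in the sketch below. Only the seed, Adam betas and epsilon, scheduler type, and epoch count are visible in this hunk; the learning rate, batch size, and output directory are placeholders, and `load_best_model_at_end` is a suggestion rather than something the card confirms.

```python
# Sketch only: the hyperparameters listed in the diff, expressed as
# TrainingArguments. Values marked "placeholder" are not in the diff.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlmr-conll2003-ner",  # placeholder
    num_train_epochs=100,             # num_epochs: 100 (was 6)
    lr_scheduler_type="linear",       # lr_scheduler_type: linear
    seed=42,                          # seed: 42
    adam_beta1=0.9,                   # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # ...and epsilon=1e-08
    learning_rate=2e-5,               # placeholder; not shown in this hunk
    per_device_train_batch_size=32,   # placeholder; not shown in this hunk
    evaluation_strategy="epoch",      # matches the per-epoch results table
    save_strategy="epoch",            # required by load_best_model_at_end
    load_best_model_at_end=True,      # suggestion, not confirmed by the card
    metric_for_best_model="f1",
)
```

Worth noting when reading the new table: validation loss bottoms out at epoch 2 (0.0437) and climbs steadily afterwards, and the best F1 of the 100-epoch run (0.9505 at epoch 76) never reaches the 0.9525 of the old 6-epoch run, so the final checkpoint reported in the header (F1 0.9484) is slightly overfit; `load_best_model_at_end` above is one way to keep the best checkpoint instead.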
 
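Finally, a minimal sketch of exercising the resulting checkpoint as a NER tagger; the Hub id below is a hypothetical placeholder for this model's actual repo id.

```python
# Minimal sketch: run the fine-tuned checkpoint as a CoNLL-2003-style
# NER tagger. The model id is a hypothetical placeholder.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="LecJackS/xlm-roberta-base-finetuned-ner",  # placeholder id
    aggregation_strategy="simple",  # merge sub-word pieces into entities
)

for entity in ner("Angela Merkel met Tim Cook at the Apple office in Berlin."):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
# Expected: PER, ORG and LOC spans with confidence scores.
```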