---
tags:
- generated_from_trainer
datasets:
- conll2003
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: twitter-roberta-base-dec2021-CoNLL
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: conll2003
      type: conll2003
      args: conll2003
    metrics:
    - name: Precision
      type: precision
      value: 0.9557004346372451
    - name: Recall
      type: recall
      value: 0.9621339616290812
    - name: F1
      type: f1
      value: 0.9589064072458906
    - name: Accuracy
      type: accuracy
      value: 0.9925041859740664
---

# twitter-roberta-base-dec2021-CoNLL

This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-dec2021](https://huggingface.co/cardiffnlp/twitter-roberta-base-dec2021) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0413
- Precision: 0.9557
- Recall: 0.9621
- F1: 0.9589
- Accuracy: 0.9925

## Model description

This checkpoint adapts [cardiffnlp/twitter-roberta-base-dec2021](https://huggingface.co/cardiffnlp/twitter-roberta-base-dec2021), a RoBERTa-base model pretrained on tweets posted through December 2021, to token classification by fine-tuning it on the CoNLL-2003 named entity recognition dataset.

## Intended uses & limitations

The model performs named entity recognition over the four CoNLL-2003 entity types: person (PER), organization (ORG), location (LOC), and miscellaneous (MISC). Note the domain shift between pretraining (tweets) and fine-tuning (CoNLL-2003 newswire): performance on text that resembles neither should be validated before deployment. A usage sketch follows.
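
As a quick check, the model can be loaded with the `transformers` token-classification pipeline. This is a minimal sketch: the repo id `emilys/twitter-roberta-base-dec2021-CoNLL` is inferred from this card's title and the example sentence is invented.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an NER pipeline.
# aggregation_strategy="simple" merges sub-word pieces into entity spans.
# NOTE: the repo id below is assumed from this card's title.
ner = pipeline(
    "token-classification",
    model="emilys/twitter-roberta-base-dec2021-CoNLL",
    aggregation_strategy="simple",
)

print(ner("Hugging Face is based in New York City."))
```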

## Training and evaluation data

Fine-tuning used the [conll2003](https://huggingface.co/datasets/conll2003) dataset (English newswire annotated with PER/ORG/LOC/MISC entities); the metrics above were computed on its evaluation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 1024
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
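
The original training script is not included in the card, but under the hyperparameters above the run can be approximated with the `transformers` `Trainer` roughly as follows. This is a minimal sketch, not the author's exact code: `output_dir`, the label-alignment scheme, and the 25-step evaluation cadence (read off the step column in the results table below) are assumptions.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("conll2003")
label_list = dataset["train"].features["ner_tags"].feature.names

# RoBERTa tokenizers need add_prefix_space=True to accept pre-tokenized input.
tokenizer = AutoTokenizer.from_pretrained(
    "cardiffnlp/twitter-roberta-base-dec2021", add_prefix_space=True
)
model = AutoModelForTokenClassification.from_pretrained(
    "cardiffnlp/twitter-roberta-base-dec2021", num_labels=len(label_list)
)

def tokenize_and_align_labels(examples):
    """Tokenize pre-split words and align NER tags to sub-word pieces."""
    tokenized = tokenizer(examples["tokens"], truncation=True, is_split_into_words=True)
    all_labels = []
    for i, tags in enumerate(examples["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        labels, prev = [], None
        for wid in word_ids:
            if wid is None:
                labels.append(-100)       # special tokens: ignored by the loss
            elif wid != prev:
                labels.append(tags[wid])  # label only the first sub-word piece
            else:
                labels.append(-100)       # later pieces of the same word: ignored
            prev = wid
        all_labels.append(labels)
    tokenized["labels"] = all_labels
    return tokenized

tokenized_ds = dataset.map(tokenize_and_align_labels, batched=True)

# Hyperparameters from the list above; output_dir and the 25-step eval
# cadence are assumptions. Adam betas/epsilon are the Trainer defaults.
args = TrainingArguments(
    output_dir="twitter-roberta-base-dec2021-CoNLL",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=1024,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="steps",
    eval_steps=25,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized_ds["train"],
    eval_dataset=tokenized_ds["validation"],
    tokenizer=tokenizer,
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```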

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 0.11 | 25 | 0.2126 | 0.5639 | 0.6067 | 0.5845 | 0.9349 |
| No log | 0.23 | 50 | 0.0849 | 0.8259 | 0.8612 | 0.8431 | 0.9765 |
| No log | 0.34 | 75 | 0.0640 | 0.8752 | 0.8957 | 0.8853 | 0.9820 |
| No log | 0.45 | 100 | 0.0572 | 0.8848 | 0.9034 | 0.8940 | 0.9832 |
| No log | 0.57 | 125 | 0.0469 | 0.9071 | 0.9239 | 0.9155 | 0.9866 |
| No log | 0.68 | 150 | 0.0442 | 0.9198 | 0.9278 | 0.9238 | 0.9877 |
| No log | 0.8 | 175 | 0.0424 | 0.9192 | 0.9322 | 0.9256 | 0.9881 |
| No log | 0.91 | 200 | 0.0407 | 0.9170 | 0.9414 | 0.9291 | 0.9891 |
| No log | 1.02 | 225 | 0.0402 | 0.9264 | 0.9403 | 0.9333 | 0.9894 |
| No log | 1.14 | 250 | 0.0399 | 0.9329 | 0.9446 | 0.9387 | 0.9897 |
| No log | 1.25 | 275 | 0.0384 | 0.9278 | 0.9413 | 0.9345 | 0.9897 |
| No log | 1.36 | 300 | 0.0363 | 0.9379 | 0.9477 | 0.9427 | 0.9906 |
| No log | 1.48 | 325 | 0.0362 | 0.9380 | 0.9493 | 0.9436 | 0.9905 |
| No log | 1.59 | 350 | 0.0364 | 0.9397 | 0.9497 | 0.9447 | 0.9905 |
| No log | 1.7 | 375 | 0.0367 | 0.9324 | 0.9475 | 0.9399 | 0.9899 |
| No log | 1.82 | 400 | 0.0372 | 0.9350 | 0.9460 | 0.9404 | 0.9899 |
| No log | 1.93 | 425 | 0.0339 | 0.9411 | 0.9514 | 0.9462 | 0.9909 |
| No log | 2.05 | 450 | 0.0336 | 0.9419 | 0.9529 | 0.9474 | 0.9911 |
| No log | 2.16 | 475 | 0.0336 | 0.9447 | 0.9537 | 0.9492 | 0.9914 |
| 0.079 | 2.27 | 500 | 0.0345 | 0.9420 | 0.9566 | 0.9492 | 0.9914 |
| 0.079 | 2.39 | 525 | 0.0364 | 0.9436 | 0.9522 | 0.9479 | 0.9913 |
| 0.079 | 2.5 | 550 | 0.0340 | 0.9479 | 0.9514 | 0.9496 | 0.9916 |
| 0.079 | 2.61 | 575 | 0.0339 | 0.9481 | 0.9559 | 0.9520 | 0.9917 |
| 0.079 | 2.73 | 600 | 0.0396 | 0.9326 | 0.9504 | 0.9414 | 0.9902 |
| 0.079 | 2.84 | 625 | 0.0348 | 0.9461 | 0.9544 | 0.9502 | 0.9915 |
| 0.079 | 2.95 | 650 | 0.0359 | 0.9419 | 0.9527 | 0.9473 | 0.9908 |
| 0.079 | 3.07 | 675 | 0.0347 | 0.9434 | 0.9573 | 0.9503 | 0.9916 |
| 0.079 | 3.18 | 700 | 0.0351 | 0.9464 | 0.9566 | 0.9515 | 0.9918 |
| 0.079 | 3.3 | 725 | 0.0370 | 0.9446 | 0.9536 | 0.9491 | 0.9911 |
| 0.079 | 3.41 | 750 | 0.0358 | 0.9462 | 0.9583 | 0.9522 | 0.9917 |
| 0.079 | 3.52 | 775 | 0.0353 | 0.9483 | 0.9564 | 0.9523 | 0.9920 |
| 0.079 | 3.64 | 800 | 0.0351 | 0.9469 | 0.9564 | 0.9516 | 0.9916 |
| 0.079 | 3.75 | 825 | 0.0361 | 0.9479 | 0.9579 | 0.9529 | 0.9919 |
| 0.079 | 3.86 | 850 | 0.0370 | 0.9498 | 0.9581 | 0.9539 | 0.9918 |
| 0.079 | 3.98 | 875 | 0.0374 | 0.9460 | 0.9574 | 0.9517 | 0.9915 |
| 0.079 | 4.09 | 900 | 0.0381 | 0.9506 | 0.9594 | 0.9550 | 0.9922 |
| 0.079 | 4.2 | 925 | 0.0415 | 0.9460 | 0.9557 | 0.9509 | 0.9912 |
| 0.079 | 4.32 | 950 | 0.0390 | 0.9493 | 0.9556 | 0.9524 | 0.9917 |
| 0.079 | 4.43 | 975 | 0.0389 | 0.9483 | 0.9591 | 0.9536 | 0.9919 |
| 0.0123 | 4.55 | 1000 | 0.0379 | 0.9464 | 0.9569 | 0.9516 | 0.9918 |
| 0.0123 | 4.66 | 1025 | 0.0376 | 0.9463 | 0.9579 | 0.9521 | 0.9920 |
| 0.0123 | 4.77 | 1050 | 0.0373 | 0.9499 | 0.9571 | 0.9535 | 0.9917 |
| 0.0123 | 4.89 | 1075 | 0.0366 | 0.9520 | 0.9584 | 0.9552 | 0.9923 |
| 0.0123 | 5.0 | 1100 | 0.0374 | 0.9488 | 0.9606 | 0.9547 | 0.9923 |
| 0.0123 | 5.11 | 1125 | 0.0393 | 0.9516 | 0.9589 | 0.9552 | 0.9920 |
| 0.0123 | 5.23 | 1150 | 0.0389 | 0.9539 | 0.9603 | 0.9571 | 0.9925 |
| 0.0123 | 5.34 | 1175 | 0.0397 | 0.9486 | 0.9576 | 0.9531 | 0.9917 |
| 0.0123 | 5.45 | 1200 | 0.0397 | 0.9478 | 0.9569 | 0.9523 | 0.9919 |
| 0.0123 | 5.57 | 1225 | 0.0388 | 0.9483 | 0.9593 | 0.9537 | 0.9920 |
| 0.0123 | 5.68 | 1250 | 0.0389 | 0.9502 | 0.9606 | 0.9554 | 0.9923 |
| 0.0123 | 5.8 | 1275 | 0.0380 | 0.9547 | 0.9616 | 0.9582 | 0.9925 |
| 0.0123 | 5.91 | 1300 | 0.0391 | 0.9496 | 0.9603 | 0.9549 | 0.9924 |
| 0.0123 | 6.02 | 1325 | 0.0381 | 0.9548 | 0.9603 | 0.9575 | 0.9924 |
| 0.0123 | 6.14 | 1350 | 0.0400 | 0.9529 | 0.9596 | 0.9562 | 0.9922 |
| 0.0123 | 6.25 | 1375 | 0.0393 | 0.9544 | 0.9616 | 0.9580 | 0.9927 |
| 0.0123 | 6.36 | 1400 | 0.0419 | 0.9514 | 0.9621 | 0.9567 | 0.9924 |
| 0.0123 | 6.48 | 1425 | 0.0415 | 0.9532 | 0.9626 | 0.9579 | 0.9925 |
| 0.0123 | 6.59 | 1450 | 0.0415 | 0.9520 | 0.9613 | 0.9566 | 0.9923 |
| 0.0123 | 6.7 | 1475 | 0.0399 | 0.9542 | 0.9611 | 0.9577 | 0.9925 |
| 0.0052 | 6.82 | 1500 | 0.0416 | 0.9522 | 0.9591 | 0.9556 | 0.9921 |
| 0.0052 | 6.93 | 1525 | 0.0410 | 0.9502 | 0.9599 | 0.9550 | 0.9919 |
| 0.0052 | 7.05 | 1550 | 0.0406 | 0.9507 | 0.9613 | 0.9560 | 0.9921 |
| 0.0052 | 7.16 | 1575 | 0.0400 | 0.9508 | 0.9603 | 0.9555 | 0.9923 |
| 0.0052 | 7.27 | 1600 | 0.0402 | 0.9525 | 0.9618 | 0.9571 | 0.9924 |
| 0.0052 | 7.39 | 1625 | 0.0401 | 0.9550 | 0.9633 | 0.9591 | 0.9925 |
| 0.0052 | 7.5 | 1650 | 0.0397 | 0.9555 | 0.9647 | 0.9601 | 0.9927 |
| 0.0052 | 7.61 | 1675 | 0.0412 | 0.9526 | 0.9610 | 0.9568 | 0.9922 |
| 0.0052 | 7.73 | 1700 | 0.0419 | 0.9531 | 0.9616 | 0.9574 | 0.9923 |
| 0.0052 | 7.84 | 1725 | 0.0407 | 0.9555 | 0.9621 | 0.9588 | 0.9927 |
| 0.0052 | 7.95 | 1750 | 0.0409 | 0.9551 | 0.9628 | 0.9589 | 0.9927 |
| 0.0052 | 8.07 | 1775 | 0.0413 | 0.9520 | 0.9616 | 0.9568 | 0.9924 |
| 0.0052 | 8.18 | 1800 | 0.0414 | 0.9505 | 0.9605 | 0.9555 | 0.9923 |
| 0.0052 | 8.3 | 1825 | 0.0410 | 0.9542 | 0.9605 | 0.9573 | 0.9924 |
| 0.0052 | 8.41 | 1850 | 0.0417 | 0.9553 | 0.9599 | 0.9576 | 0.9924 |
| 0.0052 | 8.52 | 1875 | 0.0418 | 0.9545 | 0.9606 | 0.9576 | 0.9923 |
| 0.0052 | 8.64 | 1900 | 0.0414 | 0.9544 | 0.9616 | 0.9580 | 0.9924 |
| 0.0052 | 8.75 | 1925 | 0.0419 | 0.9555 | 0.9620 | 0.9587 | 0.9925 |
| 0.0052 | 8.86 | 1950 | 0.0415 | 0.9544 | 0.9611 | 0.9577 | 0.9926 |
| 0.0052 | 8.98 | 1975 | 0.0413 | 0.9542 | 0.9611 | 0.9577 | 0.9926 |
| 0.0027 | 9.09 | 2000 | 0.0412 | 0.9553 | 0.9628 | 0.9590 | 0.9927 |
| 0.0027 | 9.2 | 2025 | 0.0408 | 0.9554 | 0.9630 | 0.9592 | 0.9927 |
| 0.0027 | 9.32 | 2050 | 0.0404 | 0.9545 | 0.9613 | 0.9579 | 0.9926 |
| 0.0027 | 9.43 | 2075 | 0.0407 | 0.9557 | 0.9618 | 0.9587 | 0.9926 |
| 0.0027 | 9.55 | 2100 | 0.0410 | 0.9552 | 0.9618 | 0.9585 | 0.9926 |
| 0.0027 | 9.66 | 2125 | 0.0412 | 0.9552 | 0.9620 | 0.9586 | 0.9925 |
| 0.0027 | 9.77 | 2150 | 0.0413 | 0.9557 | 0.9621 | 0.9589 | 0.9925 |
| 0.0027 | 9.89 | 2175 | 0.0413 | 0.9557 | 0.9621 | 0.9589 | 0.9925 |
| 0.0027 | 10.0 | 2200 | 0.0413 | 0.9557 | 0.9621 | 0.9589 | 0.9925 |
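
The precision, recall, and F1 columns are entity-level scores and accuracy is token-level, which is what the `seqeval` metric produces when plugged into the `Trainer` via `compute_metrics`. The card does not record the exact metric code, so the following is a hedged sketch of the usual computation; `label_list` is the CoNLL-2003 tag names from the training sketch above.

```python
import numpy as np
from datasets import load_metric

metric = load_metric("seqeval")  # entity-level P/R/F1 + token-level accuracy

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Keep only real word positions (label != -100), mapped back to tag strings.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = metric.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```

Passing this function as `compute_metrics=compute_metrics` to the `Trainer` above would yield the four metric columns logged in the table.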

### Framework versions

- Transformers 4.20.1
- PyTorch 1.12.0
- Datasets 2.3.2
- Tokenizers 0.12.1