davidliu1110 committed
Commit 7503f50
1 Parent(s): 948c7b0

update model card README.md

Files changed (1)
  1. README.md +114 -0

README.md ADDED
---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: xlm-roberta-base-david
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# xlm-roberta-base-david

This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0697
- Precision: 0.9497
- Recall: 0.9544
- F1: 0.9520
- Accuracy: 0.9864

## Model description

More information needed

## Intended uses & limitations

More information needed
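
The card does not state the task, dataset, or label set. The seqeval-style metrics above (precision/recall/F1/accuracy) suggest a token-classification head (e.g. NER) on top of `xlm-roberta-base`, so here is a minimal inference sketch under that assumption; the repository id and the example sentence are placeholders, not documented facts about this model.

```python
# Minimal inference sketch, ASSUMING this checkpoint is a token-classification model.
# The repo id below is a guess derived from the model name; replace it with the real path.
from transformers import pipeline

model_id = "davidliu1110/xlm-roberta-base-david"  # assumed Hub path
token_classifier = pipeline(
    "token-classification",
    model=model_id,
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

print(token_classifier("My name is David and I live in Taipei."))
```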

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
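
The sketch below shows how these hyperparameters map onto `TrainingArguments`/`Trainer` in 🤗 Transformers. It is a reconstruction, not the original training script: `model`, `train_dataset`, `eval_dataset`, and `compute_metrics` are placeholders, since the card does not document the data or label set, and the 100-step evaluation cadence is inferred from the results table.

```python
# Hedged reconstruction of the listed hyperparameters; not the author's original script.
# `model`, `train_dataset`, `eval_dataset`, and `compute_metrics` are placeholders.
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-base-david",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
    evaluation_strategy="steps",  # the table below logs an eval every 100 steps (inferred)
    eval_steps=100,
    logging_steps=100,
)

trainer = Trainer(
    model=model,                      # placeholder: e.g. a token-classification model (assumed task)
    args=training_args,
    train_dataset=train_dataset,      # placeholder: tokenized training split (undocumented)
    eval_dataset=eval_dataset,        # placeholder: tokenized validation split (undocumented)
    compute_metrics=compute_metrics,  # placeholder: see the seqeval sketch under "Training results"
)
trainer.train()
```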

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 1.2104 | 0.1 | 100 | 0.6752 | 0.1279 | 0.0587 | 0.0804 | 0.7761 |
| 0.5384 | 0.2 | 200 | 0.3366 | 0.2616 | 0.2119 | 0.2341 | 0.8771 |
| 0.3168 | 0.3 | 300 | 0.2264 | 0.5493 | 0.4996 | 0.5233 | 0.9211 |
| 0.2345 | 0.39 | 400 | 0.1796 | 0.6662 | 0.8297 | 0.7390 | 0.9395 |
| 0.1883 | 0.49 | 500 | 0.1687 | 0.7203 | 0.8207 | 0.7672 | 0.9413 |
| 0.1587 | 0.59 | 600 | 0.1414 | 0.7661 | 0.8354 | 0.7992 | 0.9525 |
| 0.1605 | 0.69 | 700 | 0.1209 | 0.7946 | 0.8672 | 0.8293 | 0.9609 |
| 0.1365 | 0.79 | 800 | 0.1120 | 0.8304 | 0.8696 | 0.8495 | 0.9657 |
| 0.1205 | 0.89 | 900 | 0.1098 | 0.8659 | 0.8786 | 0.8722 | 0.9683 |
| 0.1353 | 0.99 | 1000 | 0.1239 | 0.8436 | 0.8794 | 0.8611 | 0.9643 |
| 0.1083 | 1.09 | 1100 | 0.1243 | 0.8537 | 0.8892 | 0.8711 | 0.9657 |
| 0.0961 | 1.18 | 1200 | 0.1078 | 0.8689 | 0.8965 | 0.8825 | 0.9696 |
| 0.0798 | 1.28 | 1300 | 0.0995 | 0.8774 | 0.9038 | 0.8904 | 0.9724 |
| 0.0843 | 1.38 | 1400 | 0.0965 | 0.8793 | 0.9144 | 0.8965 | 0.9733 |
| 0.0923 | 1.48 | 1500 | 0.0957 | 0.8815 | 0.9030 | 0.8921 | 0.9730 |
| 0.0847 | 1.58 | 1600 | 0.0959 | 0.8617 | 0.8941 | 0.8776 | 0.9709 |
| 0.0890 | 1.68 | 1700 | 0.0844 | 0.8982 | 0.9201 | 0.9090 | 0.9760 |
| 0.0721 | 1.78 | 1800 | 0.0767 | 0.9000 | 0.9095 | 0.9047 | 0.9782 |
| 0.0803 | 1.88 | 1900 | 0.0776 | 0.8981 | 0.9340 | 0.9157 | 0.9774 |
| 0.0766 | 1.97 | 2000 | 0.0611 | 0.9166 | 0.9315 | 0.9240 | 0.9816 |
| 0.0651 | 2.07 | 2100 | 0.0771 | 0.9127 | 0.9454 | 0.9287 | 0.9817 |
| 0.0562 | 2.17 | 2200 | 0.0908 | 0.9031 | 0.9112 | 0.9071 | 0.9771 |
| 0.0629 | 2.27 | 2300 | 0.0656 | 0.9184 | 0.9356 | 0.9269 | 0.9817 |
| 0.0504 | 2.37 | 2400 | 0.0836 | 0.8998 | 0.9299 | 0.9146 | 0.9775 |
| 0.0464 | 2.47 | 2500 | 0.0791 | 0.9310 | 0.9340 | 0.9325 | 0.9816 |
| 0.0396 | 2.57 | 2600 | 0.0763 | 0.9167 | 0.9234 | 0.9200 | 0.9816 |
| 0.0582 | 2.67 | 2700 | 0.0705 | 0.9198 | 0.9446 | 0.9320 | 0.9833 |
| 0.0561 | 2.76 | 2800 | 0.0635 | 0.9274 | 0.9470 | 0.9371 | 0.9835 |
| 0.0446 | 2.86 | 2900 | 0.0679 | 0.9301 | 0.9438 | 0.9369 | 0.9828 |
| 0.0429 | 2.96 | 3000 | 0.0663 | 0.9209 | 0.9397 | 0.9302 | 0.9820 |
| 0.0323 | 3.06 | 3100 | 0.0771 | 0.9303 | 0.9462 | 0.9382 | 0.9825 |
| 0.0228 | 3.16 | 3200 | 0.0839 | 0.9279 | 0.9446 | 0.9362 | 0.9830 |
| 0.0332 | 3.26 | 3300 | 0.0717 | 0.9365 | 0.9495 | 0.9429 | 0.9839 |
| 0.0351 | 3.36 | 3400 | 0.0668 | 0.9358 | 0.9381 | 0.9369 | 0.9840 |
| 0.0425 | 3.46 | 3500 | 0.0688 | 0.9363 | 0.9462 | 0.9412 | 0.9838 |
| 0.0431 | 3.55 | 3600 | 0.0710 | 0.9321 | 0.9503 | 0.9411 | 0.9840 |
| 0.0228 | 3.65 | 3700 | 0.0748 | 0.9343 | 0.9511 | 0.9426 | 0.9838 |
| 0.0334 | 3.75 | 3800 | 0.0770 | 0.9401 | 0.9462 | 0.9431 | 0.9834 |
| 0.0373 | 3.85 | 3900 | 0.0713 | 0.9294 | 0.9446 | 0.9369 | 0.9832 |
| 0.0368 | 3.95 | 4000 | 0.0668 | 0.9380 | 0.9495 | 0.9437 | 0.9845 |
| 0.0295 | 4.05 | 4100 | 0.0706 | 0.9364 | 0.9487 | 0.9425 | 0.9843 |
| 0.0169 | 4.15 | 4200 | 0.0675 | 0.9426 | 0.9503 | 0.9464 | 0.9863 |
| 0.0234 | 4.24 | 4300 | 0.0697 | 0.9497 | 0.9544 | 0.9520 | 0.9864 |
| 0.0235 | 4.34 | 4400 | 0.0713 | 0.9392 | 0.9576 | 0.9483 | 0.9857 |
| 0.0233 | 4.44 | 4500 | 0.0689 | 0.9428 | 0.9544 | 0.9486 | 0.9857 |
| 0.0150 | 4.54 | 4600 | 0.0744 | 0.9404 | 0.9511 | 0.9457 | 0.9846 |
| 0.0154 | 4.64 | 4700 | 0.0753 | 0.9406 | 0.9552 | 0.9478 | 0.9860 |
| 0.0235 | 4.74 | 4800 | 0.0733 | 0.9431 | 0.9584 | 0.9507 | 0.9859 |
| 0.0239 | 4.84 | 4900 | 0.0728 | 0.9438 | 0.9576 | 0.9506 | 0.9864 |
| 0.0237 | 4.94 | 5000 | 0.0727 | 0.9437 | 0.9560 | 0.9498 | 0.9862 |
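
The per-step precision/recall/F1/accuracy values above are the aggregate scores that `seqeval` reports for token classification, which is the assumption used throughout this card's sketches. A minimal `compute_metrics` in that style is sketched below; `label_list` (the id-to-label mapping) is a placeholder, since the label set is not documented here.

```python
# Hedged sketch of how metrics like those in the table are typically computed
# for token classification with the Trainer. `label_list` is a placeholder.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Drop positions labelled -100 (special tokens / non-first sub-word pieces).
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```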

### Framework versions

- Transformers 4.29.0.dev0
- Pytorch 1.10.1+cu113
- Datasets 2.11.0
- Tokenizers 0.13.3
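
These are the versions the Trainer reported at training time; `4.29.0.dev0` is a development snapshot, so nearby released versions are likely close, though exact reproduction may require the same build. A quick way to compare your local environment against the list above:

```python
# Print the locally installed versions for comparison with the ones listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card reports 4.29.0.dev0
print("PyTorch:", torch.__version__)              # card reports 1.10.1+cu113
print("Datasets:", datasets.__version__)          # card reports 2.11.0
print("Tokenizers:", tokenizers.__version__)      # card reports 0.13.3
```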