davidliu1110 committed
Commit d52ce80
Parent: 0832799

update model card README.md

Files changed (1)
  1. README.md +46 -36
README.md CHANGED
@@ -18,11 +18,11 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [bert-base-chinese](https://huggingface.co/bert-base-chinese) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0506
- - Precision: 0.9438
- - Recall: 0.9606
- - F1: 0.9521
- - Accuracy: 0.9894
+ - Loss: 0.0530
+ - Precision: 0.9486
+ - Recall: 0.9619
+ - F1: 0.9552
+ - Accuracy: 0.9892
 
  ## Model description
 
@@ -48,42 +48,52 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_ratio: 0.1
- - num_epochs: 3
+ - num_epochs: 4
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
  |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | 0.978 | 0.1 | 100 | 0.3192 | 0.5125 | 0.5222 | 0.5173 | 0.8965 |
- | 0.2074 | 0.2 | 200 | 0.1329 | 0.7333 | 0.8386 | 0.7825 | 0.9568 |
- | 0.1558 | 0.3 | 300 | 0.1216 | 0.7877 | 0.8297 | 0.8082 | 0.9617 |
- | 0.1269 | 0.39 | 400 | 0.1003 | 0.8036 | 0.8628 | 0.8321 | 0.9684 |
- | 0.1027 | 0.49 | 500 | 0.0807 | 0.8463 | 0.8958 | 0.8704 | 0.9765 |
- | 0.0953 | 0.59 | 600 | 0.0869 | 0.8440 | 0.9072 | 0.8745 | 0.9743 |
- | 0.0952 | 0.69 | 700 | 0.0701 | 0.8960 | 0.9085 | 0.9022 | 0.9804 |
- | 0.0868 | 0.79 | 800 | 0.0717 | 0.8863 | 0.9212 | 0.9034 | 0.9808 |
- | 0.0791 | 0.89 | 900 | 0.0668 | 0.8928 | 0.9314 | 0.9117 | 0.9814 |
- | 0.0768 | 0.99 | 1000 | 0.0631 | 0.8958 | 0.9288 | 0.9120 | 0.9812 |
- | 0.0529 | 1.09 | 1100 | 0.0731 | 0.8980 | 0.9288 | 0.9132 | 0.9813 |
- | 0.0594 | 1.18 | 1200 | 0.0644 | 0.8998 | 0.9238 | 0.9116 | 0.9821 |
- | 0.0403 | 1.28 | 1300 | 0.0643 | 0.9186 | 0.9327 | 0.9256 | 0.9838 |
- | 0.0504 | 1.38 | 1400 | 0.0594 | 0.9167 | 0.9365 | 0.9265 | 0.9845 |
- | 0.0536 | 1.48 | 1500 | 0.0594 | 0.9184 | 0.9441 | 0.9311 | 0.9854 |
- | 0.0465 | 1.58 | 1600 | 0.0692 | 0.9025 | 0.9288 | 0.9155 | 0.9829 |
- | 0.0409 | 1.68 | 1700 | 0.0621 | 0.9073 | 0.9327 | 0.9198 | 0.9835 |
- | 0.0466 | 1.78 | 1800 | 0.0533 | 0.9218 | 0.9441 | 0.9328 | 0.9869 |
- | 0.0499 | 1.88 | 1900 | 0.0522 | 0.9280 | 0.9492 | 0.9384 | 0.9868 |
- | 0.0405 | 1.97 | 2000 | 0.0478 | 0.9186 | 0.9466 | 0.9324 | 0.9875 |
- | 0.0313 | 2.07 | 2100 | 0.0557 | 0.9232 | 0.9466 | 0.9348 | 0.9870 |
- | 0.02 | 2.17 | 2200 | 0.0680 | 0.9196 | 0.9441 | 0.9317 | 0.9853 |
- | 0.0267 | 2.27 | 2300 | 0.0599 | 0.9292 | 0.9504 | 0.9397 | 0.9880 |
- | 0.0275 | 2.37 | 2400 | 0.0527 | 0.9413 | 0.9568 | 0.9490 | 0.9893 |
- | 0.0228 | 2.47 | 2500 | 0.0538 | 0.9363 | 0.9530 | 0.9446 | 0.9876 |
- | 0.0212 | 2.57 | 2600 | 0.0538 | 0.9364 | 0.9543 | 0.9452 | 0.9878 |
- | 0.023 | 2.67 | 2700 | 0.0519 | 0.9413 | 0.9581 | 0.9496 | 0.9884 |
- | 0.0293 | 2.76 | 2800 | 0.0513 | 0.9426 | 0.9606 | 0.9515 | 0.9894 |
- | 0.0185 | 2.86 | 2900 | 0.0506 | 0.9438 | 0.9606 | 0.9521 | 0.9894 |
- | 0.025 | 2.96 | 3000 | 0.0499 | 0.9365 | 0.9555 | 0.9459 | 0.9881 |
+ | 0.9951 | 0.1 | 100 | 0.3761 | 0.4443 | 0.4460 | 0.4451 | 0.8659 |
+ | 0.2378 | 0.2 | 200 | 0.1500 | 0.7185 | 0.7980 | 0.7562 | 0.9512 |
+ | 0.1625 | 0.3 | 300 | 0.1324 | 0.7662 | 0.8247 | 0.7944 | 0.9546 |
+ | 0.1351 | 0.39 | 400 | 0.1014 | 0.8024 | 0.8615 | 0.8309 | 0.9665 |
+ | 0.1083 | 0.49 | 500 | 0.0814 | 0.8529 | 0.9136 | 0.8822 | 0.9771 |
+ | 0.0944 | 0.59 | 600 | 0.1071 | 0.8282 | 0.8945 | 0.8601 | 0.9683 |
+ | 0.106 | 0.69 | 700 | 0.0696 | 0.8784 | 0.9085 | 0.8932 | 0.9802 |
+ | 0.0897 | 0.79 | 800 | 0.0743 | 0.8764 | 0.9098 | 0.8928 | 0.9790 |
+ | 0.0809 | 0.89 | 900 | 0.0677 | 0.8933 | 0.9250 | 0.9089 | 0.9813 |
+ | 0.0856 | 0.99 | 1000 | 0.0646 | 0.8966 | 0.9250 | 0.9106 | 0.9808 |
+ | 0.057 | 1.09 | 1100 | 0.0719 | 0.8851 | 0.9199 | 0.9022 | 0.9800 |
+ | 0.0592 | 1.18 | 1200 | 0.0670 | 0.9064 | 0.9225 | 0.9144 | 0.9816 |
+ | 0.0461 | 1.28 | 1300 | 0.0593 | 0.9081 | 0.9288 | 0.9183 | 0.9836 |
+ | 0.0555 | 1.38 | 1400 | 0.0576 | 0.9151 | 0.9314 | 0.9232 | 0.9848 |
+ | 0.0582 | 1.48 | 1500 | 0.0592 | 0.9167 | 0.9365 | 0.9265 | 0.9854 |
+ | 0.0511 | 1.58 | 1600 | 0.0538 | 0.9203 | 0.9390 | 0.9296 | 0.9857 |
+ | 0.0438 | 1.68 | 1700 | 0.0568 | 0.9081 | 0.9288 | 0.9183 | 0.9848 |
+ | 0.0486 | 1.78 | 1800 | 0.0546 | 0.9160 | 0.9428 | 0.9292 | 0.9859 |
+ | 0.0534 | 1.88 | 1900 | 0.0472 | 0.9232 | 0.9466 | 0.9348 | 0.9863 |
+ | 0.0432 | 1.97 | 2000 | 0.0457 | 0.9283 | 0.9543 | 0.9411 | 0.9877 |
+ | 0.0355 | 2.07 | 2100 | 0.0530 | 0.9376 | 0.9543 | 0.9458 | 0.9882 |
+ | 0.0234 | 2.17 | 2200 | 0.0595 | 0.9327 | 0.9504 | 0.9415 | 0.9867 |
+ | 0.0306 | 2.27 | 2300 | 0.0485 | 0.9377 | 0.9568 | 0.9472 | 0.9891 |
+ | 0.0317 | 2.37 | 2400 | 0.0464 | 0.9437 | 0.9593 | 0.9515 | 0.9897 |
+ | 0.0234 | 2.47 | 2500 | 0.0510 | 0.9410 | 0.9530 | 0.9470 | 0.9884 |
+ | 0.0276 | 2.57 | 2600 | 0.0507 | 0.9342 | 0.9568 | 0.9454 | 0.9883 |
+ | 0.0279 | 2.67 | 2700 | 0.0495 | 0.9384 | 0.9492 | 0.9438 | 0.9875 |
+ | 0.0292 | 2.76 | 2800 | 0.0491 | 0.9388 | 0.9555 | 0.9471 | 0.9880 |
+ | 0.0209 | 2.86 | 2900 | 0.0502 | 0.9451 | 0.9619 | 0.9534 | 0.9879 |
+ | 0.0257 | 2.96 | 3000 | 0.0510 | 0.9438 | 0.9606 | 0.9521 | 0.9878 |
+ | 0.0205 | 3.06 | 3100 | 0.0512 | 0.9391 | 0.9593 | 0.9491 | 0.9884 |
+ | 0.0112 | 3.16 | 3200 | 0.0529 | 0.9391 | 0.9606 | 0.9497 | 0.9887 |
+ | 0.017 | 3.26 | 3300 | 0.0614 | 0.9414 | 0.9593 | 0.9503 | 0.9874 |
+ | 0.0178 | 3.36 | 3400 | 0.0490 | 0.9353 | 0.9555 | 0.9453 | 0.9889 |
+ | 0.0189 | 3.46 | 3500 | 0.0512 | 0.9426 | 0.9606 | 0.9515 | 0.9889 |
+ | 0.0175 | 3.55 | 3600 | 0.0539 | 0.9426 | 0.9593 | 0.9509 | 0.9884 |
+ | 0.012 | 3.65 | 3700 | 0.0521 | 0.945 | 0.9606 | 0.9527 | 0.9894 |
+ | 0.0179 | 3.75 | 3800 | 0.0533 | 0.9438 | 0.9606 | 0.9521 | 0.9886 |
+ | 0.0185 | 3.85 | 3900 | 0.0530 | 0.9486 | 0.9619 | 0.9552 | 0.9892 |
+ | 0.0175 | 3.95 | 4000 | 0.0522 | 0.9462 | 0.9606 | 0.9533 | 0.9890 |
 
 
  ### Framework versions
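As a quick sanity check on the updated card, the F1 column is the standard harmonic mean of the precision and recall columns; a minimal sketch (the `f1_score` helper below is illustrative, not from the card) recomputes the final reported value:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (standard F1 definition)."""
    return 2 * precision * recall / (precision + recall)

# Final evaluation metrics from the updated README: P=0.9486, R=0.9619
print(round(f1_score(0.9486, 0.9619), 4))  # matches the reported F1 of 0.9552
```

The same relation holds for the table rows, e.g. the old card's final metrics (P=0.9438, R=0.9606) give the reported F1 of 0.9521 to four decimal places.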