Commit c6c7a1c committed by cgt
1 Parent(s): 1add129

update model card README.md

Files changed (1)
  1. README.md +9 -14
README.md CHANGED
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [hfl/chinese-pert-large](https://huggingface.co/hfl/chinese-pert-large) on the cmrc2018 dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.5678
+- Loss: 0.8522
 
 ## Model description
 
@@ -41,22 +41,17 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 10
+- num_epochs: 5
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss |
-|:-------------:|:-----:|:-----:|:---------------:|
-| 1.1216 | 1.0 | 1200 | 0.7522 |
-| 0.7144 | 2.0 | 2400 | 0.6930 |
-| 0.5018 | 3.0 | 3600 | 0.7647 |
-| 0.3669 | 4.0 | 4800 | 0.8131 |
-| 0.278 | 5.0 | 6000 | 0.9423 |
-| 0.2087 | 6.0 | 7200 | 1.0350 |
-| 0.1477 | 7.0 | 8400 | 1.1962 |
-| 0.1235 | 8.0 | 9600 | 1.3345 |
-| 0.0937 | 9.0 | 10800 | 1.4887 |
-| 0.0705 | 10.0 | 12000 | 1.5678 |
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:-----:|:----:|:---------------:|
+| 1.0891 | 1.0 | 1200 | 0.7374 |
+| 0.712 | 2.0 | 2400 | 0.6467 |
+| 0.5068 | 3.0 | 3600 | 0.7374 |
+| 0.3865 | 4.0 | 4800 | 0.7852 |
+| 0.3197 | 5.0 | 6000 | 0.8522 |
 
 
 ### Framework versions
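
The commit does not include the training script, only the updated card, but the hyperparameters visible in this diff map directly onto `transformers.TrainingArguments`. Below is a minimal, hypothetical sketch of that mapping, assuming the standard Hugging Face `Trainer` question-answering setup; the output directory and the commented-out dataset wiring are placeholders, and values not shown in this diff (learning rate, batch sizes) are left out.

```python
# Hypothetical sketch, not the author's script: how the hyperparameters listed
# in this model card map onto transformers.TrainingArguments.
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "hfl/chinese-pert-large"          # base checkpoint named in the card
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

args = TrainingArguments(
    output_dir="chinese-pert-large-cmrc2018",  # assumed output path
    num_train_epochs=5,                        # num_epochs: 5
    lr_scheduler_type="linear",                # lr_scheduler_type: linear
    adam_beta1=0.9,                            # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,                        # and epsilon=1e-08
    seed=42,                                   # seed: 42
    evaluation_strategy="epoch",               # matches the per-epoch validation-loss table
    # learning_rate and batch sizes are listed elsewhere in the card and are
    # not shown in this diff, so they are omitted here.
)

# train_dataset / eval_dataset would be cmrc2018 tokenized into SQuAD-style
# start/end-position features (preprocessing not reproduced here):
# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_features, eval_dataset=eval_features)
# trainer.train()
# trainer.evaluate()  # reports eval_loss (0.8522 for the run described in this card)
```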