gongliyu committed on
Commit b550be8
1 Parent(s): 76d3c94

update model card README.md

Files changed (1)
  1. README.md +55 -25
README.md CHANGED
@@ -18,10 +18,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 5.6128
- - Precision: 0.7943
- - Recall: 0.7156
- - F1: 0.7529
+ - Loss: 4.5422
+ - Precision: nan
+ - Recall: 0.7117
+ - F1: 0.5635
 - Hashcode: roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2)
 - Gen Len: 19.0
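The Precision/Recall/F1 figures above are BERTScore values; the hashcode `roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2)` encodes the scoring configuration (roberta-large embeddings, layer 17, IDF weighting, bert-score 0.3.12, transformers 4.30.2). As an illustration only (not the card's actual evaluation code), a minimal sketch of how such scores are typically computed with the `bert_score` package; the candidate and reference strings are placeholders:

```python
# Minimal sketch (placeholder data): computing BERTScore precision/recall/F1
# with the bert-score package.
from bert_score import score

candidates = ["a summary generated by the fine-tuned model"]   # placeholder outputs
references = ["the reference summary from the evaluation set"]  # placeholder targets

# lang="en" selects roberta-large (layer 17 by default); idf=True matches the
# "_idf" flag in the hashcode roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2).
P, R, F1 = score(candidates, references, lang="en", idf=True)

print(
    f"Precision: {P.mean().item():.4f}  "
    f"Recall: {R.mean().item():.4f}  "
    f"F1: {F1.mean().item():.4f}"
)
```

(A `nan` precision, as reported above, typically shows up when empty or degenerate candidate strings are scored.)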
 
@@ -48,32 +48,62 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 20
+ - num_epochs: 50
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Hashcode | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:------------------------------------------------------:|:-------:|
- | No log | 1.0 | 1 | 9.4237 | 0.8002 | 0.7174 | 0.7565 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 2.0 | 2 | 8.7381 | 0.8002 | 0.7174 | 0.7565 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 3.0 | 3 | 8.1554 | 0.8002 | 0.7174 | 0.7565 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 4.0 | 4 | 7.6577 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 5.0 | 5 | 7.2423 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 6.0 | 6 | 6.8994 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 7.0 | 7 | 6.6213 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 8.0 | 8 | 6.3973 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 9.0 | 9 | 6.2378 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 10.0 | 10 | 6.0923 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 11.0 | 11 | 5.9803 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 12.0 | 12 | 5.8914 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 13.0 | 13 | 5.8197 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 14.0 | 14 | 5.7621 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 15.0 | 15 | 5.7159 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 16.0 | 16 | 5.6797 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 17.0 | 17 | 5.6521 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 18.0 | 18 | 5.6321 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 19.0 | 19 | 5.6192 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
- | No log | 20.0 | 20 | 5.6128 | 0.7943 | 0.7156 | 0.7529 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 1.0 | 1 | 12.9679 | 0.7745 | 0.7227 | 0.7474 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 2.0 | 2 | 12.1426 | 0.7811 | 0.7221 | 0.7503 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 3.0 | 3 | 11.2809 | 0.7811 | 0.7221 | 0.7503 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 4.0 | 4 | 10.4669 | 0.7821 | 0.7273 | 0.7536 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 5.0 | 5 | 9.7061 | 0.7821 | 0.7273 | 0.7536 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 6.0 | 6 | 9.0054 | 0.7821 | 0.7273 | 0.7536 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 7.0 | 7 | 8.3875 | 0.7821 | 0.7273 | 0.7536 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 8.0 | 8 | 7.8287 | 0.7772 | 0.7278 | 0.7515 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 9.0 | 9 | 7.3385 | 0.7772 | 0.7278 | 0.7515 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 10.0 | 10 | 6.9141 | 0.7772 | 0.7278 | 0.7515 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 11.0 | 11 | 6.5516 | 0.7801 | 0.7240 | 0.7509 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 12.0 | 12 | 6.2399 | 0.7801 | 0.7240 | 0.7509 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 13.0 | 13 | 5.9851 | 0.7801 | 0.7240 | 0.7509 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 14.0 | 14 | 5.7744 | 0.7801 | 0.7240 | 0.7509 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 15.0 | 15 | 5.5976 | 0.7801 | 0.7240 | 0.7509 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 16.0 | 16 | 5.4546 | 0.7873 | 0.7158 | 0.7497 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 17.0 | 17 | 5.3403 | 0.7873 | 0.7158 | 0.7497 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 18.0 | 18 | 5.2461 | 0.7873 | 0.7158 | 0.7497 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 19.0 | 19 | 5.1688 | 0.7873 | 0.7158 | 0.7497 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 20.0 | 20 | 5.1052 | 0.7922 | 0.7169 | 0.7525 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 21.0 | 21 | 5.0489 | 0.7922 | 0.7169 | 0.7525 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 22.0 | 22 | 5.0025 | 0.7941 | 0.7122 | 0.7508 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 23.0 | 23 | 4.9621 | 0.7941 | 0.7122 | 0.7508 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 24.0 | 24 | 4.9263 | 0.7941 | 0.7122 | 0.7508 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 25.0 | 25 | 4.8933 | 0.7941 | 0.7122 | 0.7508 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 26.0 | 26 | 4.8623 | 0.7941 | 0.7122 | 0.7508 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 27.0 | 27 | 4.8327 | 0.7941 | 0.7122 | 0.7508 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 28.0 | 28 | 4.8060 | 0.7941 | 0.7122 | 0.7508 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 29.0 | 29 | 4.7811 | 0.7941 | 0.7122 | 0.7508 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 30.0 | 30 | 4.7583 | 0.7712 | 0.7105 | 0.7392 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 31.0 | 31 | 4.7361 | 0.7712 | 0.7105 | 0.7392 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 32.0 | 32 | 4.7152 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 33.0 | 33 | 4.6964 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 34.0 | 34 | 4.6789 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 35.0 | 35 | 4.6627 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 36.0 | 36 | 4.6475 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 37.0 | 37 | 4.6330 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 38.0 | 38 | 4.6192 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 39.0 | 39 | 4.6066 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 40.0 | 40 | 4.5957 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 41.0 | 41 | 4.5859 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 42.0 | 42 | 4.5771 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 43.0 | 43 | 4.5693 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 44.0 | 44 | 4.5625 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 45.0 | 45 | 4.5567 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 46.0 | 46 | 4.5518 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 47.0 | 47 | 4.5480 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 48.0 | 48 | 4.5451 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 49.0 | 49 | 4.5432 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
+ | No log | 50.0 | 50 | 4.5422 | nan | 0.7117 | 0.5635 | roberta-large_L17_idf_version=0.3.12(hug_trans=4.30.2) | 19.0 |
 
 
 ### Framework versions
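
The hyperparameters recorded above (seed 42, Adam with betas=(0.9,0.999) and epsilon=1e-08, a linear LR schedule, and num_epochs raised from 20 to 50 in this commit) correspond to a standard `transformers` `Seq2SeqTrainingArguments` configuration. A minimal, non-authoritative sketch follows; the learning rate, batch size, dataset, and metric wiring are not shown in this excerpt and are assumed:

```python
# Sketch only: a Seq2SeqTrainingArguments setup consistent with the hyperparameters
# listed in the diff above. Values marked "assumed" do not appear in this excerpt.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, Seq2SeqTrainingArguments

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned",    # assumed output path
    seed=42,                            # from the card
    adam_beta1=0.9,                     # Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,                 # epsilon=1e-08
    lr_scheduler_type="linear",         # from the card
    num_train_epochs=50,                # updated in this commit (previously 20)
    learning_rate=2e-5,                 # assumed; not listed in this excerpt
    per_device_train_batch_size=8,      # assumed; not listed in this excerpt
    evaluation_strategy="epoch",        # matches the per-epoch rows in the results table
    predict_with_generate=True,         # generation is needed for BERTScore and Gen Len
)

# These arguments would then be passed to transformers.Seq2SeqTrainer together with
# the tokenized train/eval datasets and a compute_metrics function that wraps BERTScore.
```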