dfanswerrocket committed commit aea722a (parent: 2fa8eaa)

update model card README.md

Files changed (1):
  1. README.md (+101 −101)
README.md CHANGED
@@ -14,7 +14,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.9992
+ - Loss: 1.7230
 
 ## Model description
 
@@ -45,106 +45,106 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
- | No log | 1.0 | 67 | 8.8964 |
- | No log | 2.0 | 134 | 8.8550 |
- | No log | 3.0 | 201 | 8.7423 |
- | No log | 4.0 | 268 | 8.5343 |
- | No log | 5.0 | 335 | 8.3243 |
- | No log | 6.0 | 402 | 8.1392 |
- | No log | 7.0 | 469 | 7.9616 |
- | 8.6114 | 8.0 | 536 | 7.7924 |
- | 8.6114 | 9.0 | 603 | 7.6305 |
- | 8.6114 | 10.0 | 670 | 7.4707 |
- | 8.6114 | 11.0 | 737 | 7.3065 |
- | 8.6114 | 12.0 | 804 | 7.1600 |
- | 8.6114 | 13.0 | 871 | 7.0228 |
- | 8.6114 | 14.0 | 938 | 6.8804 |
- | 7.5409 | 15.0 | 1005 | 6.7334 |
- | 7.5409 | 16.0 | 1072 | 6.6021 |
- | 7.5409 | 17.0 | 1139 | 6.4789 |
- | 7.5409 | 18.0 | 1206 | 6.3473 |
- | 7.5409 | 19.0 | 1273 | 6.2252 |
- | 7.5409 | 20.0 | 1340 | 6.1058 |
- | 7.5409 | 21.0 | 1407 | 5.9892 |
- | 7.5409 | 22.0 | 1474 | 5.8674 |
- | 6.6051 | 23.0 | 1541 | 5.7496 |
- | 6.6051 | 24.0 | 1608 | 5.6393 |
- | 6.6051 | 25.0 | 1675 | 5.5244 |
- | 6.6051 | 26.0 | 1742 | 5.4279 |
- | 6.6051 | 27.0 | 1809 | 5.3221 |
- | 6.6051 | 28.0 | 1876 | 5.2126 |
- | 6.6051 | 29.0 | 1943 | 5.1221 |
- | 5.8003 | 30.0 | 2010 | 5.0178 |
- | 5.8003 | 31.0 | 2077 | 4.9183 |
- | 5.8003 | 32.0 | 2144 | 4.8303 |
- | 5.8003 | 33.0 | 2211 | 4.7328 |
- | 5.8003 | 34.0 | 2278 | 4.6467 |
- | 5.8003 | 35.0 | 2345 | 4.5548 |
- | 5.8003 | 36.0 | 2412 | 4.4697 |
- | 5.8003 | 37.0 | 2479 | 4.3860 |
- | 5.0905 | 38.0 | 2546 | 4.2980 |
- | 5.0905 | 39.0 | 2613 | 4.2172 |
- | 5.0905 | 40.0 | 2680 | 4.1351 |
- | 5.0905 | 41.0 | 2747 | 4.0635 |
- | 5.0905 | 42.0 | 2814 | 3.9834 |
- | 5.0905 | 43.0 | 2881 | 3.9091 |
- | 5.0905 | 44.0 | 2948 | 3.8376 |
- | 4.481 | 45.0 | 3015 | 3.7662 |
- | 4.481 | 46.0 | 3082 | 3.7011 |
- | 4.481 | 47.0 | 3149 | 3.6335 |
- | 4.481 | 48.0 | 3216 | 3.5671 |
- | 4.481 | 49.0 | 3283 | 3.5011 |
- | 4.481 | 50.0 | 3350 | 3.4388 |
- | 4.481 | 51.0 | 3417 | 3.3746 |
- | 4.481 | 52.0 | 3484 | 3.3151 |
- | 3.9521 | 53.0 | 3551 | 3.2551 |
- | 3.9521 | 54.0 | 3618 | 3.1943 |
- | 3.9521 | 55.0 | 3685 | 3.1410 |
- | 3.9521 | 56.0 | 3752 | 3.0885 |
- | 3.9521 | 57.0 | 3819 | 3.0384 |
- | 3.9521 | 58.0 | 3886 | 2.9890 |
- | 3.9521 | 59.0 | 3953 | 2.9376 |
- | 3.5177 | 60.0 | 4020 | 2.8906 |
- | 3.5177 | 61.0 | 4087 | 2.8406 |
- | 3.5177 | 62.0 | 4154 | 2.7951 |
- | 3.5177 | 63.0 | 4221 | 2.7590 |
- | 3.5177 | 64.0 | 4288 | 2.7136 |
- | 3.5177 | 65.0 | 4355 | 2.6725 |
- | 3.5177 | 66.0 | 4422 | 2.6343 |
- | 3.5177 | 67.0 | 4489 | 2.5941 |
- | 3.1601 | 68.0 | 4556 | 2.5563 |
- | 3.1601 | 69.0 | 4623 | 2.5241 |
- | 3.1601 | 70.0 | 4690 | 2.4894 |
- | 3.1601 | 71.0 | 4757 | 2.4552 |
- | 3.1601 | 72.0 | 4824 | 2.4227 |
- | 3.1601 | 73.0 | 4891 | 2.3942 |
- | 3.1601 | 74.0 | 4958 | 2.3678 |
- | 2.8815 | 75.0 | 5025 | 2.3362 |
- | 2.8815 | 76.0 | 5092 | 2.3100 |
- | 2.8815 | 77.0 | 5159 | 2.2851 |
- | 2.8815 | 78.0 | 5226 | 2.2570 |
- | 2.8815 | 79.0 | 5293 | 2.2346 |
- | 2.8815 | 80.0 | 5360 | 2.2155 |
- | 2.8815 | 81.0 | 5427 | 2.1933 |
- | 2.8815 | 82.0 | 5494 | 2.1714 |
- | 2.6556 | 83.0 | 5561 | 2.1551 |
- | 2.6556 | 84.0 | 5628 | 2.1381 |
- | 2.6556 | 85.0 | 5695 | 2.1203 |
- | 2.6556 | 86.0 | 5762 | 2.1049 |
- | 2.6556 | 87.0 | 5829 | 2.0899 |
- | 2.6556 | 88.0 | 5896 | 2.0796 |
- | 2.6556 | 89.0 | 5963 | 2.0649 |
- | 2.5131 | 90.0 | 6030 | 2.0534 |
- | 2.5131 | 91.0 | 6097 | 2.0443 |
- | 2.5131 | 92.0 | 6164 | 2.0360 |
- | 2.5131 | 93.0 | 6231 | 2.0258 |
- | 2.5131 | 94.0 | 6298 | 2.0190 |
- | 2.5131 | 95.0 | 6365 | 2.0111 |
- | 2.5131 | 96.0 | 6432 | 2.0100 |
- | 2.5131 | 97.0 | 6499 | 2.0040 |
- | 2.4077 | 98.0 | 6566 | 2.0005 |
- | 2.4077 | 99.0 | 6633 | 1.9997 |
- | 2.4077 | 100.0 | 6700 | 1.9992 |
+ | No log | 1.0 | 68 | 7.5711 |
+ | No log | 2.0 | 136 | 7.5137 |
+ | No log | 3.0 | 204 | 7.4045 |
+ | No log | 4.0 | 272 | 7.2699 |
+ | No log | 5.0 | 340 | 7.1041 |
+ | No log | 6.0 | 408 | 6.9418 |
+ | No log | 7.0 | 476 | 6.7958 |
+ | 7.5401 | 8.0 | 544 | 6.6459 |
+ | 7.5401 | 9.0 | 612 | 6.5085 |
+ | 7.5401 | 10.0 | 680 | 6.3777 |
+ | 7.5401 | 11.0 | 748 | 6.2365 |
+ | 7.5401 | 12.0 | 816 | 6.1106 |
+ | 7.5401 | 13.0 | 884 | 5.9833 |
+ | 7.5401 | 14.0 | 952 | 5.8674 |
+ | 6.5715 | 15.0 | 1020 | 5.7555 |
+ | 6.5715 | 16.0 | 1088 | 5.6403 |
+ | 6.5715 | 17.0 | 1156 | 5.5376 |
+ | 6.5715 | 18.0 | 1224 | 5.4137 |
+ | 6.5715 | 19.0 | 1292 | 5.3225 |
+ | 6.5715 | 20.0 | 1360 | 5.2182 |
+ | 6.5715 | 21.0 | 1428 | 5.1122 |
+ | 6.5715 | 22.0 | 1496 | 5.0065 |
+ | 5.7874 | 23.0 | 1564 | 4.9041 |
+ | 5.7874 | 24.0 | 1632 | 4.8166 |
+ | 5.7874 | 25.0 | 1700 | 4.7134 |
+ | 5.7874 | 26.0 | 1768 | 4.6366 |
+ | 5.7874 | 27.0 | 1836 | 4.5368 |
+ | 5.7874 | 28.0 | 1904 | 4.4495 |
+ | 5.7874 | 29.0 | 1972 | 4.3610 |
+ | 5.0922 | 30.0 | 2040 | 4.2840 |
+ | 5.0922 | 31.0 | 2108 | 4.1986 |
+ | 5.0922 | 32.0 | 2176 | 4.1160 |
+ | 5.0922 | 33.0 | 2244 | 4.0367 |
+ | 5.0922 | 34.0 | 2312 | 3.9648 |
+ | 5.0922 | 35.0 | 2380 | 3.8908 |
+ | 5.0922 | 36.0 | 2448 | 3.8100 |
+ | 4.4927 | 37.0 | 2516 | 3.7385 |
+ | 4.4927 | 38.0 | 2584 | 3.6692 |
+ | 4.4927 | 39.0 | 2652 | 3.6037 |
+ | 4.4927 | 40.0 | 2720 | 3.5427 |
+ | 4.4927 | 41.0 | 2788 | 3.4718 |
+ | 4.4927 | 42.0 | 2856 | 3.4000 |
+ | 4.4927 | 43.0 | 2924 | 3.3363 |
+ | 4.4927 | 44.0 | 2992 | 3.2797 |
+ | 3.9767 | 45.0 | 3060 | 3.2366 |
+ | 3.9767 | 46.0 | 3128 | 3.1579 |
+ | 3.9767 | 47.0 | 3196 | 3.0965 |
+ | 3.9767 | 48.0 | 3264 | 3.0387 |
+ | 3.9767 | 49.0 | 3332 | 2.9887 |
+ | 3.9767 | 50.0 | 3400 | 2.9314 |
+ | 3.9767 | 51.0 | 3468 | 2.8779 |
+ | 3.5181 | 52.0 | 3536 | 2.8385 |
+ | 3.5181 | 53.0 | 3604 | 2.7807 |
+ | 3.5181 | 54.0 | 3672 | 2.7384 |
+ | 3.5181 | 55.0 | 3740 | 2.6938 |
+ | 3.5181 | 56.0 | 3808 | 2.6386 |
+ | 3.5181 | 57.0 | 3876 | 2.6043 |
+ | 3.5181 | 58.0 | 3944 | 2.5500 |
+ | 3.1415 | 59.0 | 4012 | 2.5146 |
+ | 3.1415 | 60.0 | 4080 | 2.4785 |
+ | 3.1415 | 61.0 | 4148 | 2.4321 |
+ | 3.1415 | 62.0 | 4216 | 2.3939 |
+ | 3.1415 | 63.0 | 4284 | 2.3641 |
+ | 3.1415 | 64.0 | 4352 | 2.3193 |
+ | 3.1415 | 65.0 | 4420 | 2.2894 |
+ | 3.1415 | 66.0 | 4488 | 2.2563 |
+ | 2.8316 | 67.0 | 4556 | 2.2242 |
+ | 2.8316 | 68.0 | 4624 | 2.1952 |
+ | 2.8316 | 69.0 | 4692 | 2.1640 |
+ | 2.8316 | 70.0 | 4760 | 2.1346 |
+ | 2.8316 | 71.0 | 4828 | 2.1069 |
+ | 2.8316 | 72.0 | 4896 | 2.0837 |
+ | 2.8316 | 73.0 | 4964 | 2.0536 |
+ | 2.5874 | 74.0 | 5032 | 2.0310 |
+ | 2.5874 | 75.0 | 5100 | 2.0053 |
+ | 2.5874 | 76.0 | 5168 | 1.9829 |
+ | 2.5874 | 77.0 | 5236 | 1.9605 |
+ | 2.5874 | 78.0 | 5304 | 1.9421 |
+ | 2.5874 | 79.0 | 5372 | 1.9192 |
+ | 2.5874 | 80.0 | 5440 | 1.9045 |
+ | 2.3824 | 81.0 | 5508 | 1.8918 |
+ | 2.3824 | 82.0 | 5576 | 1.8708 |
+ | 2.3824 | 83.0 | 5644 | 1.8547 |
+ | 2.3824 | 84.0 | 5712 | 1.8397 |
+ | 2.3824 | 85.0 | 5780 | 1.8275 |
+ | 2.3824 | 86.0 | 5848 | 1.8078 |
+ | 2.3824 | 87.0 | 5916 | 1.8017 |
+ | 2.3824 | 88.0 | 5984 | 1.7901 |
+ | 2.2537 | 89.0 | 6052 | 1.7802 |
+ | 2.2537 | 90.0 | 6120 | 1.7678 |
+ | 2.2537 | 91.0 | 6188 | 1.7610 |
+ | 2.2537 | 92.0 | 6256 | 1.7523 |
+ | 2.2537 | 93.0 | 6324 | 1.7447 |
+ | 2.2537 | 94.0 | 6392 | 1.7385 |
+ | 2.2537 | 95.0 | 6460 | 1.7343 |
+ | 2.1756 | 96.0 | 6528 | 1.7286 |
+ | 2.1756 | 97.0 | 6596 | 1.7267 |
+ | 2.1756 | 98.0 | 6664 | 1.7239 |
+ | 2.1756 | 99.0 | 6732 | 1.7233 |
+ | 2.1756 | 100.0 | 6800 | 1.7230 |
 
 
 ### Framework versions
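
In both tables the Step column grows linearly with epoch: the old run logged 67 optimizer steps per epoch (ending at step 6700 after 100 epochs), the new run 68 (ending at 6800), which is why every step value shifts slightly between the two versions. A minimal sketch of that spot-check in plain Python (the function name is illustrative, not part of the training code):

```python
# Spot-check the Step column of the training tables: with a fixed number of
# optimizer steps per epoch, the cumulative step at the end of epoch e is
# simply steps_per_epoch * e.
def expected_step(steps_per_epoch: int, epoch: int) -> int:
    return steps_per_epoch * epoch

# Old run: 67 steps/epoch; new run: 68 steps/epoch.
assert expected_step(67, 100) == 6700  # last row of the old table
assert expected_step(68, 100) == 6800  # last row of the new table
assert expected_step(68, 8) == 544     # first new row with a logged training loss
```

The same arithmetic explains the "No log" entries: the trainer only reports the running training loss every 500 steps, so epochs 1-7 (fewer than 500 cumulative steps per logging window) show no value.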