OnePoint16 committed
Commit 85d557e
1 Parent(s): bf881f6

update model card README.md

Files changed (1)
  1. README.md +103 -23
README.md CHANGED
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 3.5173
+ - Loss: 5.6100
 
 ## Model description
 
@@ -40,37 +40,117 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 20
+ - num_epochs: 100
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
- | No log | 1.0 | 21 | 3.9834 |
- | No log | 2.0 | 42 | 3.3933 |
- | No log | 3.0 | 63 | 3.3457 |
- | No log | 4.0 | 84 | 3.2997 |
- | No log | 5.0 | 105 | 3.2746 |
- | No log | 6.0 | 126 | 3.2334 |
- | No log | 7.0 | 147 | 3.2655 |
- | No log | 8.0 | 168 | 3.3428 |
- | No log | 9.0 | 189 | 3.3693 |
- | No log | 10.0 | 210 | 3.3065 |
- | No log | 11.0 | 231 | 3.3817 |
- | No log | 12.0 | 252 | 3.4270 |
- | No log | 13.0 | 273 | 3.4121 |
- | No log | 14.0 | 294 | 3.4588 |
- | No log | 15.0 | 315 | 3.4182 |
- | No log | 16.0 | 336 | 3.4626 |
- | No log | 17.0 | 357 | 3.4975 |
- | No log | 18.0 | 378 | 3.4880 |
- | No log | 19.0 | 399 | 3.5034 |
- | No log | 20.0 | 420 | 3.5173 |
+ | No log | 1.0 | 21 | 4.2930 |
+ | No log | 2.0 | 42 | 3.3634 |
+ | No log | 3.0 | 63 | 3.2834 |
+ | No log | 4.0 | 84 | 3.2596 |
+ | No log | 5.0 | 105 | 3.2594 |
+ | No log | 6.0 | 126 | 3.2574 |
+ | No log | 7.0 | 147 | 3.2845 |
+ | No log | 8.0 | 168 | 3.2187 |
+ | No log | 9.0 | 189 | 3.3233 |
+ | No log | 10.0 | 210 | 3.3231 |
+ | No log | 11.0 | 231 | 3.3505 |
+ | No log | 12.0 | 252 | 3.5721 |
+ | No log | 13.0 | 273 | 3.5463 |
+ | No log | 14.0 | 294 | 3.5413 |
+ | No log | 15.0 | 315 | 3.6203 |
+ | No log | 16.0 | 336 | 3.6025 |
+ | No log | 17.0 | 357 | 3.6301 |
+ | No log | 18.0 | 378 | 3.8150 |
+ | No log | 19.0 | 399 | 4.0084 |
+ | No log | 20.0 | 420 | 3.9528 |
+ | No log | 21.0 | 441 | 4.0350 |
+ | No log | 22.0 | 462 | 3.9436 |
+ | No log | 23.0 | 483 | 4.0115 |
+ | 1.7508 | 24.0 | 504 | 4.0571 |
+ | 1.7508 | 25.0 | 525 | 4.0290 |
+ | 1.7508 | 26.0 | 546 | 4.0609 |
+ | 1.7508 | 27.0 | 567 | 4.2875 |
+ | 1.7508 | 28.0 | 588 | 4.0578 |
+ | 1.7508 | 29.0 | 609 | 4.1743 |
+ | 1.7508 | 30.0 | 630 | 4.1155 |
+ | 1.7508 | 31.0 | 651 | 4.2136 |
+ | 1.7508 | 32.0 | 672 | 4.3880 |
+ | 1.7508 | 33.0 | 693 | 4.4454 |
+ | 1.7508 | 34.0 | 714 | 4.3621 |
+ | 1.7508 | 35.0 | 735 | 4.1829 |
+ | 1.7508 | 36.0 | 756 | 4.2985 |
+ | 1.7508 | 37.0 | 777 | 4.5783 |
+ | 1.7508 | 38.0 | 798 | 4.4504 |
+ | 1.7508 | 39.0 | 819 | 4.6955 |
+ | 1.7508 | 40.0 | 840 | 4.5165 |
+ | 1.7508 | 41.0 | 861 | 4.3018 |
+ | 1.7508 | 42.0 | 882 | 4.5299 |
+ | 1.7508 | 43.0 | 903 | 4.6147 |
+ | 1.7508 | 44.0 | 924 | 4.4756 |
+ | 1.7508 | 45.0 | 945 | 4.6782 |
+ | 1.7508 | 46.0 | 966 | 4.6168 |
+ | 1.7508 | 47.0 | 987 | 4.7553 |
+ | 0.2318 | 48.0 | 1008 | 4.8580 |
+ | 0.2318 | 49.0 | 1029 | 4.8970 |
+ | 0.2318 | 50.0 | 1050 | 4.8502 |
+ | 0.2318 | 51.0 | 1071 | 4.7219 |
+ | 0.2318 | 52.0 | 1092 | 4.9355 |
+ | 0.2318 | 53.0 | 1113 | 5.0003 |
+ | 0.2318 | 54.0 | 1134 | 5.1603 |
+ | 0.2318 | 55.0 | 1155 | 5.0398 |
+ | 0.2318 | 56.0 | 1176 | 5.1349 |
+ | 0.2318 | 57.0 | 1197 | 5.1403 |
+ | 0.2318 | 58.0 | 1218 | 5.0170 |
+ | 0.2318 | 59.0 | 1239 | 5.0553 |
+ | 0.2318 | 60.0 | 1260 | 5.2331 |
+ | 0.2318 | 61.0 | 1281 | 5.0543 |
+ | 0.2318 | 62.0 | 1302 | 5.1769 |
+ | 0.2318 | 63.0 | 1323 | 5.4024 |
+ | 0.2318 | 64.0 | 1344 | 5.2960 |
+ | 0.2318 | 65.0 | 1365 | 5.2071 |
+ | 0.2318 | 66.0 | 1386 | 5.1635 |
+ | 0.2318 | 67.0 | 1407 | 5.2613 |
+ | 0.2318 | 68.0 | 1428 | 5.3370 |
+ | 0.2318 | 69.0 | 1449 | 5.3725 |
+ | 0.2318 | 70.0 | 1470 | 5.2739 |
+ | 0.2318 | 71.0 | 1491 | 5.2887 |
+ | 0.0363 | 72.0 | 1512 | 5.4713 |
+ | 0.0363 | 73.0 | 1533 | 5.4102 |
+ | 0.0363 | 74.0 | 1554 | 5.3190 |
+ | 0.0363 | 75.0 | 1575 | 5.3406 |
+ | 0.0363 | 76.0 | 1596 | 5.4775 |
+ | 0.0363 | 77.0 | 1617 | 5.4636 |
+ | 0.0363 | 78.0 | 1638 | 5.4894 |
+ | 0.0363 | 79.0 | 1659 | 5.5111 |
+ | 0.0363 | 80.0 | 1680 | 5.5769 |
+ | 0.0363 | 81.0 | 1701 | 5.5069 |
+ | 0.0363 | 82.0 | 1722 | 5.5296 |
+ | 0.0363 | 83.0 | 1743 | 5.5471 |
+ | 0.0363 | 84.0 | 1764 | 5.5630 |
+ | 0.0363 | 85.0 | 1785 | 5.5563 |
+ | 0.0363 | 86.0 | 1806 | 5.5700 |
+ | 0.0363 | 87.0 | 1827 | 5.6082 |
+ | 0.0363 | 88.0 | 1848 | 5.5808 |
+ | 0.0363 | 89.0 | 1869 | 5.5351 |
+ | 0.0363 | 90.0 | 1890 | 5.4856 |
+ | 0.0363 | 91.0 | 1911 | 5.5007 |
+ | 0.0363 | 92.0 | 1932 | 5.5076 |
+ | 0.0363 | 93.0 | 1953 | 5.5377 |
+ | 0.0363 | 94.0 | 1974 | 5.5612 |
+ | 0.0363 | 95.0 | 1995 | 5.5754 |
+ | 0.0067 | 96.0 | 2016 | 5.5861 |
+ | 0.0067 | 97.0 | 2037 | 5.5973 |
+ | 0.0067 | 98.0 | 2058 | 5.6035 |
+ | 0.0067 | 99.0 | 2079 | 5.6073 |
+ | 0.0067 | 100.0 | 2100 | 5.6100 |
 
 
 ### Framework versions
 
 - Transformers 4.31.0
 - Pytorch 2.0.1+cu118
- - Datasets 2.14.3
+ - Datasets 2.14.4
 - Tokenizers 0.13.3
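
For readers who want to relate the hyperparameter bullets in the card to code, the sketch below is a minimal, hypothetical reconstruction of the `Trainer` setup implied by this diff. Only the values visible here (seed 42, Adam with betas (0.9, 0.999) and epsilon 1e-08, a linear LR schedule, and the new `num_epochs: 100`) come from the card; the learning rate, batch size, dataset, and training objective are not shown in this hunk and appear below only as labeled placeholders.

```python
# Hypothetical reconstruction of the training configuration described in the card.
# Values marked "from the card" are visible in this diff; everything else is a
# placeholder (learning rate, batch size, dataset, and objective are not shown here).
from transformers import AutoModelForMaskedLM, TrainingArguments

# Assumption: a masked-language-modeling head on the distilbert-base-uncased base;
# the card does not actually state the task.
model = AutoModelForMaskedLM.from_pretrained("distilbert-base-uncased")

args = TrainingArguments(
    output_dir="distilbert-finetuned",  # placeholder
    num_train_epochs=100,               # from the card (raised from 20 in this commit)
    seed=42,                            # from the card
    adam_beta1=0.9,                     # from the card
    adam_beta2=0.999,                   # from the card
    adam_epsilon=1e-08,                 # from the card
    lr_scheduler_type="linear",         # from the card
    evaluation_strategy="epoch",        # assumed: the results table logs one eval per epoch
)

# With a dataset in hand, the rest is the standard Trainer loop:
# from transformers import Trainer
# trainer = Trainer(
#     model=model,
#     args=args,
#     train_dataset=train_dataset,  # the card's dataset is unknown
#     eval_dataset=eval_dataset,
# )
# trainer.train()
```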
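The "Framework versions" section at the end of the card pins the environment the run used. A minimal sketch of reproducing that environment and loading the resulting checkpoint might look like the following; the fine-tuned model's repo id or output path is not shown on this page, so the path used here is a placeholder.

```python
# Environment pins taken from the "Framework versions" section of the card:
#   pip install transformers==4.31.0 datasets==2.14.4 tokenizers==0.13.3
#   pip install torch==2.0.1   # the card used the +cu118 CUDA build
from transformers import AutoModel, AutoTokenizer

checkpoint = "./distilbert-finetuned"  # placeholder: the actual repo id/path is not shown in this diff

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)  # task head unknown from the card, so load the bare encoder
```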