henryscheible committed
Commit 30f5bff
1 Parent(s): 0054c9f

update model card README.md

Files changed (1)
  1. README.md +99 -53

README.md CHANGED
@@ -21,7 +21,7 @@ model-index:
   metrics:
   - name: Accuracy
     type: accuracy
- value: 0.5604395604395604
+ value: 0.543171114599686
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -31,8 +31,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [bert-large-uncased](https://huggingface.co/bert-large-uncased) on the stereoset dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.6811
- - Accuracy: 0.5604
+ - Loss: 0.6847
+ - Accuracy: 0.5432
 
 ## Model description
 
@@ -51,8 +51,8 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - learning_rate: 0.01
- - train_batch_size: 128
+ - learning_rate: 5e-05
+ - train_batch_size: 64
 - eval_batch_size: 64
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
@@ -63,54 +63,100 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | No log | 0.42 | 10 | 0.7271 | 0.4874 |
- | No log | 0.83 | 20 | 1.0411 | 0.5338 |
- | No log | 1.25 | 30 | 0.7747 | 0.4922 |
- | No log | 1.67 | 40 | 0.7764 | 0.5039 |
- | No log | 2.08 | 50 | 0.7071 | 0.5283 |
- | No log | 2.5 | 60 | 0.9678 | 0.5275 |
- | No log | 2.92 | 70 | 0.9379 | 0.4765 |
- | No log | 3.33 | 80 | 0.7202 | 0.5604 |
- | No log | 3.75 | 90 | 0.6890 | 0.5659 |
- | No log | 4.17 | 100 | 0.9567 | 0.5314 |
- | No log | 4.58 | 110 | 0.8345 | 0.4772 |
- | No log | 5.0 | 120 | 0.8105 | 0.4749 |
- | No log | 5.42 | 130 | 1.0642 | 0.5275 |
- | No log | 5.83 | 140 | 0.7281 | 0.4906 |
- | No log | 6.25 | 150 | 0.7931 | 0.5008 |
- | No log | 6.67 | 160 | 1.0145 | 0.5283 |
- | No log | 7.08 | 170 | 0.7188 | 0.5039 |
- | No log | 7.5 | 180 | 0.7695 | 0.4765 |
- | No log | 7.92 | 190 | 0.9310 | 0.5314 |
- | No log | 8.33 | 200 | 0.7231 | 0.4882 |
- | No log | 8.75 | 210 | 0.7542 | 0.4757 |
- | No log | 9.17 | 220 | 0.8772 | 0.5267 |
- | No log | 9.58 | 230 | 0.7258 | 0.5039 |
- | No log | 10.0 | 240 | 0.7100 | 0.5063 |
- | No log | 10.42 | 250 | 0.7352 | 0.5432 |
- | No log | 10.83 | 260 | 0.7694 | 0.5165 |
- | No log | 11.25 | 270 | 0.7787 | 0.5149 |
- | No log | 11.67 | 280 | 0.6976 | 0.5353 |
- | No log | 12.08 | 290 | 0.7041 | 0.5667 |
- | No log | 12.5 | 300 | 0.8013 | 0.5290 |
- | No log | 12.92 | 310 | 0.8125 | 0.4765 |
- | No log | 13.33 | 320 | 0.7458 | 0.5455 |
- | No log | 13.75 | 330 | 0.7402 | 0.4757 |
- | No log | 14.17 | 340 | 0.7438 | 0.4906 |
- | No log | 14.58 | 350 | 0.6891 | 0.5463 |
- | No log | 15.0 | 360 | 0.7108 | 0.5 |
- | No log | 15.42 | 370 | 0.7005 | 0.5455 |
- | No log | 15.83 | 380 | 0.6970 | 0.5345 |
- | No log | 16.25 | 390 | 0.6849 | 0.5597 |
- | No log | 16.67 | 400 | 0.6998 | 0.5447 |
- | No log | 17.08 | 410 | 0.6845 | 0.5589 |
- | No log | 17.5 | 420 | 0.6819 | 0.5487 |
- | No log | 17.92 | 430 | 0.6990 | 0.5290 |
- | No log | 18.33 | 440 | 0.6820 | 0.5471 |
- | No log | 18.75 | 450 | 0.6852 | 0.5487 |
- | No log | 19.17 | 460 | 0.6817 | 0.5549 |
- | No log | 19.58 | 470 | 0.6830 | 0.5581 |
- | No log | 20.0 | 480 | 0.6811 | 0.5604 |
+ | 0.8199 | 0.21 | 10 | 0.7324 | 0.5416 |
+ | 0.7127 | 0.43 | 20 | 0.6844 | 0.5369 |
+ | 0.6952 | 0.64 | 30 | 0.6932 | 0.5353 |
+ | 0.6928 | 0.85 | 40 | 0.6842 | 0.5471 |
+ | 0.7036 | 1.06 | 50 | 0.6845 | 0.5440 |
+ | 0.6865 | 1.28 | 60 | 0.6873 | 0.5275 |
+ | 0.6941 | 1.49 | 70 | 0.6857 | 0.5400 |
+ | 0.7049 | 1.7 | 80 | 0.6850 | 0.5447 |
+ | 0.6964 | 1.91 | 90 | 0.6851 | 0.5455 |
+ | 0.6956 | 2.13 | 100 | 0.6857 | 0.5345 |
+ | 0.6967 | 2.34 | 110 | 0.6847 | 0.5440 |
+ | 0.7033 | 2.55 | 120 | 0.6851 | 0.5440 |
+ | 0.6888 | 2.77 | 130 | 0.6847 | 0.5400 |
+ | 0.6925 | 2.98 | 140 | 0.6847 | 0.5440 |
+ | 0.6868 | 3.19 | 150 | 0.6847 | 0.5400 |
+ | 0.7036 | 3.4 | 160 | 0.6844 | 0.5463 |
+ | 0.6945 | 3.62 | 170 | 0.6843 | 0.5440 |
+ | 0.6929 | 3.83 | 180 | 0.6845 | 0.5487 |
+ | 0.6905 | 4.04 | 190 | 0.6846 | 0.5463 |
+ | 0.693 | 4.26 | 200 | 0.6851 | 0.5424 |
+ | 0.6958 | 4.47 | 210 | 0.6855 | 0.5463 |
+ | 0.6973 | 4.68 | 220 | 0.6849 | 0.5455 |
+ | 0.6836 | 4.89 | 230 | 0.6854 | 0.5400 |
+ | 0.6921 | 5.11 | 240 | 0.6878 | 0.5283 |
+ | 0.7023 | 5.32 | 250 | 0.6851 | 0.5440 |
+ | 0.6952 | 5.53 | 260 | 0.6849 | 0.5440 |
+ | 0.705 | 5.74 | 270 | 0.6843 | 0.5471 |
+ | 0.694 | 5.96 | 280 | 0.6846 | 0.5424 |
+ | 0.6932 | 6.17 | 290 | 0.6850 | 0.5447 |
+ | 0.6903 | 6.38 | 300 | 0.6848 | 0.5432 |
+ | 0.6893 | 6.6 | 310 | 0.6844 | 0.5455 |
+ | 0.6934 | 6.81 | 320 | 0.6845 | 0.5495 |
+ | 0.6996 | 7.02 | 330 | 0.6847 | 0.5502 |
+ | 0.6819 | 7.23 | 340 | 0.6848 | 0.5447 |
+ | 0.6927 | 7.45 | 350 | 0.6851 | 0.5432 |
+ | 0.703 | 7.66 | 360 | 0.6849 | 0.5479 |
+ | 0.6922 | 7.87 | 370 | 0.6848 | 0.5463 |
+ | 0.7008 | 8.09 | 380 | 0.6846 | 0.5440 |
+ | 0.7052 | 8.3 | 390 | 0.6844 | 0.5487 |
+ | 0.701 | 8.51 | 400 | 0.6841 | 0.5447 |
+ | 0.7164 | 8.72 | 410 | 0.6851 | 0.5447 |
+ | 0.6947 | 8.94 | 420 | 0.6849 | 0.5424 |
+ | 0.6904 | 9.15 | 430 | 0.6840 | 0.5463 |
+ | 0.6874 | 9.36 | 440 | 0.6842 | 0.5455 |
+ | 0.709 | 9.57 | 450 | 0.6846 | 0.5455 |
+ | 0.7024 | 9.79 | 460 | 0.6845 | 0.5502 |
+ | 0.6916 | 10.0 | 470 | 0.6847 | 0.5440 |
+ | 0.6971 | 10.21 | 480 | 0.6844 | 0.5471 |
+ | 0.6903 | 10.43 | 490 | 0.6845 | 0.5463 |
+ | 0.6923 | 10.64 | 500 | 0.6850 | 0.5440 |
+ | 0.6948 | 10.85 | 510 | 0.6854 | 0.5424 |
+ | 0.6914 | 11.06 | 520 | 0.6862 | 0.5330 |
+ | 0.6915 | 11.28 | 530 | 0.6860 | 0.5353 |
+ | 0.6918 | 11.49 | 540 | 0.6847 | 0.5471 |
+ | 0.6936 | 11.7 | 550 | 0.6850 | 0.5455 |
+ | 0.6993 | 11.91 | 560 | 0.6847 | 0.5447 |
+ | 0.704 | 12.13 | 570 | 0.6852 | 0.5440 |
+ | 0.6934 | 12.34 | 580 | 0.6848 | 0.5455 |
+ | 0.6969 | 12.55 | 590 | 0.6849 | 0.5455 |
+ | 0.695 | 12.77 | 600 | 0.6850 | 0.5495 |
+ | 0.7044 | 12.98 | 610 | 0.6849 | 0.5463 |
+ | 0.7066 | 13.19 | 620 | 0.6863 | 0.5322 |
+ | 0.6799 | 13.4 | 630 | 0.6860 | 0.5338 |
+ | 0.6886 | 13.62 | 640 | 0.6849 | 0.5479 |
+ | 0.697 | 13.83 | 650 | 0.6847 | 0.5432 |
+ | 0.6849 | 14.04 | 660 | 0.6847 | 0.5416 |
+ | 0.7028 | 14.26 | 670 | 0.6847 | 0.5432 |
+ | 0.6992 | 14.47 | 680 | 0.6849 | 0.5471 |
+ | 0.7016 | 14.68 | 690 | 0.6854 | 0.5416 |
+ | 0.6918 | 14.89 | 700 | 0.6846 | 0.5471 |
+ | 0.6899 | 15.11 | 710 | 0.6846 | 0.5440 |
+ | 0.6933 | 15.32 | 720 | 0.6846 | 0.5440 |
+ | 0.6841 | 15.53 | 730 | 0.6846 | 0.5416 |
+ | 0.6891 | 15.74 | 740 | 0.6846 | 0.5424 |
+ | 0.6935 | 15.96 | 750 | 0.6846 | 0.5424 |
+ | 0.6868 | 16.17 | 760 | 0.6847 | 0.5440 |
+ | 0.6973 | 16.38 | 770 | 0.6850 | 0.5471 |
+ | 0.6792 | 16.6 | 780 | 0.6850 | 0.5471 |
+ | 0.6787 | 16.81 | 790 | 0.6849 | 0.5440 |
+ | 0.6976 | 17.02 | 800 | 0.6847 | 0.5463 |
+ | 0.6841 | 17.23 | 810 | 0.6848 | 0.5455 |
+ | 0.6883 | 17.45 | 820 | 0.6848 | 0.5479 |
+ | 0.6899 | 17.66 | 830 | 0.6847 | 0.5432 |
+ | 0.6987 | 17.87 | 840 | 0.6847 | 0.5455 |
+ | 0.6956 | 18.09 | 850 | 0.6847 | 0.5455 |
+ | 0.6843 | 18.3 | 860 | 0.6847 | 0.5455 |
+ | 0.6781 | 18.51 | 870 | 0.6847 | 0.5455 |
+ | 0.6837 | 18.72 | 880 | 0.6847 | 0.5432 |
+ | 0.7108 | 18.94 | 890 | 0.6847 | 0.5432 |
+ | 0.7048 | 19.15 | 900 | 0.6847 | 0.5432 |
+ | 0.6912 | 19.36 | 910 | 0.6847 | 0.5432 |
+ | 0.707 | 19.57 | 920 | 0.6847 | 0.5424 |
+ | 0.697 | 19.79 | 930 | 0.6847 | 0.5424 |
+ | 0.6922 | 20.0 | 940 | 0.6847 | 0.5432 |
 
 
 ### Framework versions
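
For reference, here is a minimal sketch of how the hyperparameters recorded in the updated card could be expressed with `transformers.TrainingArguments`. The `output_dir` is a placeholder, and the evaluation cadence and epoch count are read off the results table above (evaluation every 10 steps, training to epoch 20.0) rather than stated explicitly in this diff.

```python
from transformers import TrainingArguments

# Sketch only: values mirror the hyperparameters listed in the updated card
# (learning_rate 5e-05, train/eval batch size 64, seed 42, Adam betas and epsilon).
# output_dir is a hypothetical path; eval cadence and epoch count are inferred
# from the results table (eval every 10 steps, up to epoch 20.0).
training_args = TrainingArguments(
    output_dir="bert-large-uncased-stereoset",  # placeholder, not the actual repo name
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    num_train_epochs=20,
    evaluation_strategy="steps",
    eval_steps=10,
)
```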
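
Similarly, a hedged sketch of loading a checkpoint like this one for scoring, assuming it exposes a sequence-classification head (suggested by the accuracy metric); the repository id below is a placeholder, since the card excerpt does not name it.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Sketch only: "henryscheible/<model-repo>" is a placeholder repository id.
model_id = "henryscheible/<model-repo>"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Score one sentence; label meanings are an assumption, check the repo's config.
inputs = tokenizer("Example sentence to score.", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```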