MayBashendy committed on
Commit 51804c6
1 Parent(s): cb7d57c

End of training

Files changed (1):
  1. README.md +88 -93

README.md CHANGED
@@ -4,21 +4,21 @@ base_model: aubmindlab/bert-base-arabertv02
 tags:
 - generated_from_trainer
 model-index:
-- name: Arabic_FineTuningAraBERT_AugV0_k1_task1_organization_fold0
+- name: Arabic_FineTuningAraBERT_AugV0_k1_task1_organization_fold1
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# Arabic_FineTuningAraBERT_AugV0_k1_task1_organization_fold0
+# Arabic_FineTuningAraBERT_AugV0_k1_task1_organization_fold1
 
 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.9449
-- Qwk: 0.5896
-- Mse: 0.9449
-- Rmse: 0.9721
+- Loss: 0.6040
+- Qwk: 0.6847
+- Mse: 0.6040
+- Rmse: 0.7772
 
 ## Model description
 
@@ -47,93 +47,88 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
-|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
-| No log | 0.1176 | 2 | 5.1330 | 0.0904 | 5.1330 | 2.2656 |
-| No log | 0.2353 | 4 | 3.4675 | 0.0 | 3.4675 | 1.8621 |
-| No log | 0.3529 | 6 | 2.3648 | 0.0696 | 2.3648 | 1.5378 |
-| No log | 0.4706 | 8 | 1.6993 | 0.0 | 1.6993 | 1.3036 |
-| No log | 0.5882 | 10 | 1.4000 | 0.1973 | 1.4000 | 1.1832 |
-| No log | 0.7059 | 12 | 1.3257 | 0.1933 | 1.3257 | 1.1514 |
-| No log | 0.8235 | 14 | 1.2645 | 0.2939 | 1.2645 | 1.1245 |
-| No log | 0.9412 | 16 | 1.2126 | 0.2668 | 1.2126 | 1.1012 |
-| No log | 1.0588 | 18 | 0.9516 | 0.3259 | 0.9516 | 0.9755 |
-| No log | 1.1765 | 20 | 0.8599 | 0.4590 | 0.8599 | 0.9273 |
-| No log | 1.2941 | 22 | 0.8162 | 0.4085 | 0.8162 | 0.9035 |
-| No log | 1.4118 | 24 | 0.9496 | 0.3478 | 0.9496 | 0.9745 |
-| No log | 1.5294 | 26 | 1.1079 | 0.3209 | 1.1079 | 1.0526 |
-| No log | 1.6471 | 28 | 1.1311 | 0.3209 | 1.1311 | 1.0635 |
-| No log | 1.7647 | 30 | 1.0768 | 0.3494 | 1.0768 | 1.0377 |
-| No log | 1.8824 | 32 | 1.0150 | 0.5698 | 1.0150 | 1.0075 |
-| No log | 2.0 | 34 | 1.0474 | 0.5468 | 1.0474 | 1.0234 |
-| No log | 2.1176 | 36 | 0.9894 | 0.5653 | 0.9894 | 0.9947 |
-| No log | 2.2353 | 38 | 0.8745 | 0.5830 | 0.8745 | 0.9351 |
-| No log | 2.3529 | 40 | 0.8798 | 0.6738 | 0.8798 | 0.9380 |
-| No log | 2.4706 | 42 | 1.0482 | 0.5830 | 1.0482 | 1.0238 |
-| No log | 2.5882 | 44 | 1.1713 | 0.5660 | 1.1713 | 1.0823 |
-| No log | 2.7059 | 46 | 1.0797 | 0.5686 | 1.0797 | 1.0391 |
-| No log | 2.8235 | 48 | 0.9651 | 0.5686 | 0.9651 | 0.9824 |
-| No log | 2.9412 | 50 | 0.8528 | 0.6338 | 0.8528 | 0.9235 |
-| No log | 3.0588 | 52 | 0.8694 | 0.5312 | 0.8694 | 0.9324 |
-| No log | 3.1765 | 54 | 0.9500 | 0.5312 | 0.9500 | 0.9747 |
-| No log | 3.2941 | 56 | 0.9084 | 0.5312 | 0.9084 | 0.9531 |
-| No log | 3.4118 | 58 | 0.8059 | 0.5312 | 0.8059 | 0.8977 |
-| No log | 3.5294 | 60 | 0.8138 | 0.6338 | 0.8138 | 0.9021 |
-| No log | 3.6471 | 62 | 0.9146 | 0.5422 | 0.9146 | 0.9564 |
-| No log | 3.7647 | 64 | 0.9550 | 0.5365 | 0.9550 | 0.9773 |
-| No log | 3.8824 | 66 | 0.9336 | 0.5365 | 0.9336 | 0.9662 |
-| No log | 4.0 | 68 | 0.9319 | 0.5625 | 0.9319 | 0.9654 |
-| No log | 4.1176 | 70 | 0.8851 | 0.6260 | 0.8851 | 0.9408 |
-| No log | 4.2353 | 72 | 0.8516 | 0.6182 | 0.8516 | 0.9228 |
-| No log | 4.3529 | 74 | 0.8250 | 0.6188 | 0.8250 | 0.9083 |
-| No log | 4.4706 | 76 | 0.8158 | 0.6690 | 0.8158 | 0.9032 |
-| No log | 4.5882 | 78 | 0.8102 | 0.5882 | 0.8102 | 0.9001 |
-| No log | 4.7059 | 80 | 0.8191 | 0.5882 | 0.8191 | 0.9051 |
-| No log | 4.8235 | 82 | 0.8353 | 0.6441 | 0.8353 | 0.9139 |
-| No log | 4.9412 | 84 | 0.8613 | 0.6213 | 0.8613 | 0.9281 |
-| No log | 5.0588 | 86 | 0.9234 | 0.6038 | 0.9234 | 0.9610 |
-| No log | 5.1765 | 88 | 0.9213 | 0.5714 | 0.9213 | 0.9598 |
-| No log | 5.2941 | 90 | 0.8488 | 0.5563 | 0.8488 | 0.9213 |
-| No log | 5.4118 | 92 | 0.8263 | 0.5563 | 0.8263 | 0.9090 |
-| No log | 5.5294 | 94 | 0.8231 | 0.5896 | 0.8231 | 0.9073 |
-| No log | 5.6471 | 96 | 0.8229 | 0.6213 | 0.8229 | 0.9071 |
-| No log | 5.7647 | 98 | 0.8101 | 0.6213 | 0.8101 | 0.9000 |
-| No log | 5.8824 | 100 | 0.7902 | 0.6732 | 0.7902 | 0.8889 |
-| No log | 6.0 | 102 | 0.7975 | 0.6732 | 0.7975 | 0.8930 |
-| No log | 6.1176 | 104 | 0.7976 | 0.6732 | 0.7976 | 0.8931 |
-| No log | 6.2353 | 106 | 0.8341 | 0.6732 | 0.8341 | 0.9133 |
-| No log | 6.3529 | 108 | 0.8567 | 0.6677 | 0.8567 | 0.9256 |
-| No log | 6.4706 | 110 | 0.9204 | 0.6213 | 0.9204 | 0.9594 |
-| No log | 6.5882 | 112 | 0.9584 | 0.6038 | 0.9584 | 0.9790 |
-| No log | 6.7059 | 114 | 0.9535 | 0.6213 | 0.9535 | 0.9765 |
-| No log | 6.8235 | 116 | 0.9361 | 0.6213 | 0.9361 | 0.9675 |
-| No log | 6.9412 | 118 | 0.8850 | 0.6213 | 0.8850 | 0.9407 |
-| No log | 7.0588 | 120 | 0.8731 | 0.6213 | 0.8731 | 0.9344 |
-| No log | 7.1765 | 122 | 0.8714 | 0.6213 | 0.8714 | 0.9335 |
-| No log | 7.2941 | 124 | 0.8738 | 0.6213 | 0.8738 | 0.9348 |
-| No log | 7.4118 | 126 | 0.8868 | 0.6213 | 0.8868 | 0.9417 |
-| No log | 7.5294 | 128 | 0.9237 | 0.6213 | 0.9237 | 0.9611 |
-| No log | 7.6471 | 130 | 0.9781 | 0.5714 | 0.9781 | 0.9890 |
-| No log | 7.7647 | 132 | 0.9926 | 0.5686 | 0.9926 | 0.9963 |
-| No log | 7.8824 | 134 | 0.9841 | 0.5686 | 0.9841 | 0.9920 |
-| No log | 8.0 | 136 | 0.9469 | 0.5896 | 0.9469 | 0.9731 |
-| No log | 8.1176 | 138 | 0.9149 | 0.5896 | 0.9149 | 0.9565 |
-| No log | 8.2353 | 140 | 0.8707 | 0.6677 | 0.8707 | 0.9331 |
-| No log | 8.3529 | 142 | 0.8480 | 0.7178 | 0.8480 | 0.9209 |
-| No log | 8.4706 | 144 | 0.8500 | 0.6677 | 0.8500 | 0.9219 |
-| No log | 8.5882 | 146 | 0.8616 | 0.6677 | 0.8616 | 0.9282 |
-| No log | 8.7059 | 148 | 0.8742 | 0.6677 | 0.8742 | 0.9350 |
-| No log | 8.8235 | 150 | 0.8966 | 0.5896 | 0.8966 | 0.9469 |
-| No log | 8.9412 | 152 | 0.9197 | 0.5896 | 0.9197 | 0.9590 |
-| No log | 9.0588 | 154 | 0.9430 | 0.5896 | 0.9430 | 0.9711 |
-| No log | 9.1765 | 156 | 0.9552 | 0.5714 | 0.9552 | 0.9774 |
-| No log | 9.2941 | 158 | 0.9605 | 0.5714 | 0.9605 | 0.9800 |
-| No log | 9.4118 | 160 | 0.9609 | 0.5714 | 0.9609 | 0.9803 |
-| No log | 9.5294 | 162 | 0.9624 | 0.5714 | 0.9624 | 0.9810 |
-| No log | 9.6471 | 164 | 0.9603 | 0.5896 | 0.9603 | 0.9799 |
-| No log | 9.7647 | 166 | 0.9534 | 0.5896 | 0.9534 | 0.9764 |
-| No log | 9.8824 | 168 | 0.9476 | 0.5896 | 0.9476 | 0.9735 |
-| No log | 10.0 | 170 | 0.9449 | 0.5896 | 0.9449 | 0.9721 |
+| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
+|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
+| No log | 0.125 | 2 | 3.7867 | -0.0180 | 3.7867 | 1.9459 |
+| No log | 0.25 | 4 | 1.9998 | 0.2320 | 1.9998 | 1.4141 |
+| No log | 0.375 | 6 | 1.0711 | -0.0396 | 1.0711 | 1.0350 |
+| No log | 0.5 | 8 | 0.9376 | 0.0982 | 0.9376 | 0.9683 |
+| No log | 0.625 | 10 | 0.7792 | 0.2125 | 0.7792 | 0.8827 |
+| No log | 0.75 | 12 | 0.5459 | 0.4731 | 0.5459 | 0.7389 |
+| No log | 0.875 | 14 | 0.5198 | 0.51 | 0.5198 | 0.7210 |
+| No log | 1.0 | 16 | 0.5273 | 0.5517 | 0.5273 | 0.7262 |
+| No log | 1.125 | 18 | 0.5321 | 0.6224 | 0.5321 | 0.7295 |
+| No log | 1.25 | 20 | 0.5241 | 0.6667 | 0.5241 | 0.7240 |
+| No log | 1.375 | 22 | 0.4927 | 0.6486 | 0.4927 | 0.7020 |
+| No log | 1.5 | 24 | 0.5528 | 0.5692 | 0.5528 | 0.7435 |
+| No log | 1.625 | 26 | 0.4789 | 0.6889 | 0.4789 | 0.6920 |
+| No log | 1.75 | 28 | 0.6811 | 0.5395 | 0.6811 | 0.8253 |
+| No log | 1.875 | 30 | 1.0119 | 0.3772 | 1.0119 | 1.0059 |
+| No log | 2.0 | 32 | 0.8998 | 0.3784 | 0.8998 | 0.9486 |
+| No log | 2.125 | 34 | 0.5753 | 0.5070 | 0.5753 | 0.7585 |
+| No log | 2.25 | 36 | 0.4745 | 0.5767 | 0.4745 | 0.6888 |
+| No log | 2.375 | 38 | 0.5193 | 0.6147 | 0.5193 | 0.7206 |
+| No log | 2.5 | 40 | 0.4825 | 0.5767 | 0.4825 | 0.6946 |
+| No log | 2.625 | 42 | 0.4648 | 0.6182 | 0.4648 | 0.6818 |
+| No log | 2.75 | 44 | 0.5393 | 0.7072 | 0.5393 | 0.7343 |
+| No log | 2.875 | 46 | 0.5297 | 0.7220 | 0.5297 | 0.7278 |
+| No log | 3.0 | 48 | 0.4946 | 0.7220 | 0.4946 | 0.7033 |
+| No log | 3.125 | 50 | 0.4299 | 0.7 | 0.4299 | 0.6557 |
+| No log | 3.25 | 52 | 0.4385 | 0.6667 | 0.4385 | 0.6622 |
+| No log | 3.375 | 54 | 0.4424 | 0.6667 | 0.4424 | 0.6651 |
+| No log | 3.5 | 56 | 0.4762 | 0.7 | 0.4762 | 0.6901 |
+| No log | 3.625 | 58 | 0.5567 | 0.7 | 0.5567 | 0.7462 |
+| No log | 3.75 | 60 | 0.5485 | 0.7 | 0.5485 | 0.7406 |
+| No log | 3.875 | 62 | 0.5083 | 0.6290 | 0.5083 | 0.7130 |
+| No log | 4.0 | 64 | 0.4976 | 0.6084 | 0.4976 | 0.7054 |
+| No log | 4.125 | 66 | 0.5228 | 0.7 | 0.5228 | 0.7231 |
+| No log | 4.25 | 68 | 0.4997 | 0.7 | 0.4997 | 0.7069 |
+| No log | 4.375 | 70 | 0.5183 | 0.7 | 0.5183 | 0.7200 |
+| No log | 4.5 | 72 | 0.6043 | 0.7016 | 0.6043 | 0.7774 |
+| No log | 4.625 | 74 | 0.6707 | 0.7016 | 0.6707 | 0.8190 |
+| No log | 4.75 | 76 | 0.6478 | 0.7016 | 0.6478 | 0.8048 |
+| No log | 4.875 | 78 | 0.5690 | 0.6769 | 0.5690 | 0.7543 |
+| No log | 5.0 | 80 | 0.5403 | 0.6263 | 0.5403 | 0.7350 |
+| No log | 5.125 | 82 | 0.5245 | 0.7050 | 0.5245 | 0.7242 |
+| No log | 5.25 | 84 | 0.5001 | 0.6978 | 0.5001 | 0.7072 |
+| No log | 5.375 | 86 | 0.4801 | 0.6978 | 0.4801 | 0.6929 |
+| No log | 5.5 | 88 | 0.4978 | 0.7287 | 0.4978 | 0.7055 |
+| No log | 5.625 | 90 | 0.5501 | 0.7016 | 0.5501 | 0.7417 |
+| No log | 5.75 | 92 | 0.6042 | 0.7219 | 0.6042 | 0.7773 |
+| No log | 5.875 | 94 | 0.5626 | 0.7016 | 0.5626 | 0.7501 |
+| No log | 6.0 | 96 | 0.5092 | 0.7050 | 0.5092 | 0.7136 |
+| No log | 6.125 | 98 | 0.5001 | 0.6288 | 0.5001 | 0.7072 |
+| No log | 6.25 | 100 | 0.4953 | 0.6288 | 0.4953 | 0.7038 |
+| No log | 6.375 | 102 | 0.5378 | 0.7287 | 0.5378 | 0.7333 |
+| No log | 6.5 | 104 | 0.6240 | 0.6789 | 0.6240 | 0.7899 |
+| No log | 6.625 | 106 | 0.7492 | 0.6991 | 0.7492 | 0.8656 |
+| No log | 6.75 | 108 | 0.8331 | 0.6415 | 0.8331 | 0.9127 |
+| No log | 6.875 | 110 | 0.8239 | 0.6415 | 0.8239 | 0.9077 |
+| No log | 7.0 | 112 | 0.7363 | 0.6991 | 0.7363 | 0.8581 |
+| No log | 7.125 | 114 | 0.6138 | 0.6606 | 0.6138 | 0.7835 |
+| No log | 7.25 | 116 | 0.5381 | 0.6573 | 0.5381 | 0.7336 |
+| No log | 7.375 | 118 | 0.5289 | 0.6978 | 0.5289 | 0.7272 |
+| No log | 7.5 | 120 | 0.5491 | 0.6784 | 0.5491 | 0.7410 |
+| No log | 7.625 | 122 | 0.6250 | 0.6789 | 0.6250 | 0.7906 |
+| No log | 7.75 | 124 | 0.7445 | 0.6991 | 0.7445 | 0.8629 |
+| No log | 7.875 | 126 | 0.7655 | 0.6991 | 0.7655 | 0.8749 |
+| No log | 8.0 | 128 | 0.7507 | 0.6991 | 0.7507 | 0.8664 |
+| No log | 8.125 | 130 | 0.7196 | 0.6991 | 0.7196 | 0.8483 |
+| No log | 8.25 | 132 | 0.6492 | 0.6975 | 0.6492 | 0.8057 |
+| No log | 8.375 | 134 | 0.6000 | 0.6899 | 0.6000 | 0.7746 |
+| No log | 8.5 | 136 | 0.5639 | 0.6899 | 0.5639 | 0.7509 |
+| No log | 8.625 | 138 | 0.5275 | 0.7 | 0.5275 | 0.7263 |
+| No log | 8.75 | 140 | 0.5147 | 0.7050 | 0.5147 | 0.7174 |
+| No log | 8.875 | 142 | 0.5215 | 0.7050 | 0.5215 | 0.7221 |
+| No log | 9.0 | 144 | 0.5328 | 0.6733 | 0.5328 | 0.7300 |
+| No log | 9.125 | 146 | 0.5525 | 0.7016 | 0.5525 | 0.7433 |
+| No log | 9.25 | 148 | 0.5649 | 0.6847 | 0.5649 | 0.7516 |
+| No log | 9.375 | 150 | 0.5844 | 0.6847 | 0.5844 | 0.7644 |
+| No log | 9.5 | 152 | 0.5954 | 0.6847 | 0.5954 | 0.7716 |
+| No log | 9.625 | 154 | 0.5976 | 0.6847 | 0.5976 | 0.7730 |
+| No log | 9.75 | 156 | 0.5980 | 0.6847 | 0.5980 | 0.7733 |
+| No log | 9.875 | 158 | 0.6022 | 0.6847 | 0.6022 | 0.7760 |
+| No log | 10.0 | 160 | 0.6040 | 0.6847 | 0.6040 | 0.7772 |
 
 
 ### Framework versions
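
The evaluation columns in both versions of the card are related: Loss and Mse match in every row (which suggests, though the card does not state it, that the Trainer optimizes a mean-squared-error objective here), Rmse is the square root of Mse (e.g. 0.7772 ≈ √0.6040), and Qwk is Cohen's kappa with quadratic weights, the standard agreement metric for ordinal labels. A minimal pure-Python sketch of the three metrics on toy data; the helper names and the 0–4 label range are illustrative assumptions, not taken from the training code:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk column)."""
    # Observed confusion matrix: rows = true class, cols = predicted class.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms for the expected (chance) matrix.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root (the Mse and Rmse columns)."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)

# Toy ordinal labels on a hypothetical 0-4 scale.
y_true = [0, 1, 2, 3, 4, 2, 1]
y_pred = [0, 2, 2, 3, 3, 2, 0]
mse, rmse = mse_rmse(y_true, y_pred)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=5)
print(round(mse, 4), round(rmse, 4), round(qwk, 4))
```

This is only a sanity-check sketch: the fold0 → fold1 change above improves all three numbers (Qwk 0.5896 → 0.6847, Rmse 0.9721 → 0.7772), but since the folds are different data splits, the rows are not directly comparable the way two runs on one split would be.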