arabert_baseline_augmented_more_organization_task1_fold0

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not named in this card. It achieves the following results on the evaluation set (a sketch for recomputing these metrics follows the list):

  • Loss: 0.7158
  • Qwk (quadratic weighted kappa): 0.7172
  • Mse (mean squared error): 0.7158
  • Rmse (root mean squared error): 0.8461
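
As a reference for how these numbers are defined, the three quality metrics can be recomputed from evaluation-set predictions with scikit-learn. The sketch below uses hypothetical `y_true`/`y_pred` arrays, since the card does not ship the evaluation data:

```python
# Minimal sketch of the reported metrics. `y_true` and `y_pred` are
# hypothetical placeholders; QWK (quadratic weighted kappa) expects
# discrete labels, so continuous outputs would be rounded first.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 1, 2, 4, 2])   # hypothetical gold scores
y_pred = np.array([3, 2, 2, 4, 1])   # hypothetical model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)                  # RMSE is the square root of MSE
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```

Note that Loss and Mse coincide in every row of this card, which suggests (though the card does not state it) that the model was trained with an MSE objective, i.e. a regression head.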

Model description

More information needed

Intended uses & limitations

More information needed
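
Although no intended uses are documented, the checkpoint can be loaded like any transformers sequence-classification model. The sketch below assumes a single-output regression head, which is inferred from the MSE/RMSE metrics above rather than stated in the card; the Arabic input string is a placeholder:

```python
# Hedged loading sketch: assumes a single-logit regression head
# (inferred from the MSE/RMSE metrics, not stated in the card).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/arabert_baseline_augmented_more_organization_task1_fold0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt")  # placeholder input
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # predicted score, if the head is 1-dimensional
```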

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
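
These settings map one-to-one onto transformers' `TrainingArguments`. The sketch below is an approximation, not the authors' script: the `output_dir` is a placeholder, and the Trainer's default AdamW optimizer already uses the listed betas and epsilon.

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# The output_dir is a hypothetical placeholder; model and datasets
# are not shown because the card does not name the training data.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_fold0",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,      # betas=(0.9, 0.999) from the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,   # epsilon=1e-08 from the card
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```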

Training results

In the table below, the training-loss column reads "No log" for every row, most likely because evaluation ran every 2 steps while the Trainer's default logging interval (500 steps) was never reached in this 480-step run, so no training loss was recorded.

| Training Loss | Epoch  | Step | Validation Loss | Qwk    | Mse    | Rmse   |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|
| No log        | 0.0417 | 2    | 1.7016          | 0.1642 | 1.7016 | 1.3044 |
| No log        | 0.0833 | 4    | 1.2738          | 0.1750 | 1.2738 | 1.1286 |
| No log        | 0.125  | 6    | 1.2622          | 0.1736 | 1.2622 | 1.1235 |
| No log        | 0.1667 | 8    | 1.4336          | 0.0742 | 1.4336 | 1.1973 |
| No log        | 0.2083 | 10   | 1.6089          | 0.2006 | 1.6089 | 1.2684 |
| No log        | 0.25   | 12   | 1.8224          | 0.0533 | 1.8224 | 1.3500 |
| No log        | 0.2917 | 14   | 1.8011          | 0.0385 | 1.8011 | 1.3420 |
| No log        | 0.3333 | 16   | 1.5946          | 0.1582 | 1.5946 | 1.2628 |
| No log        | 0.375  | 18   | 1.4496          | 0.0742 | 1.4496 | 1.2040 |
| No log        | 0.4167 | 20   | 1.3713          | 0.1736 | 1.3713 | 1.1710 |
| No log        | 0.4583 | 22   | 1.2801          | 0.1736 | 1.2801 | 1.1314 |
| No log        | 0.5    | 24   | 1.1589          | 0.1463 | 1.1589 | 1.0765 |
| No log        | 0.5417 | 26   | 1.1683          | 0.3636 | 1.1683 | 1.0809 |
| No log        | 0.5833 | 28   | 1.1922          | 0.2910 | 1.1922 | 1.0919 |
| No log        | 0.625  | 30   | 1.1586          | 0.1463 | 1.1586 | 1.0764 |
| No log        | 0.6667 | 32   | 1.2494          | 0.1413 | 1.2494 | 1.1178 |
| No log        | 0.7083 | 34   | 1.2074          | 0.1555 | 1.2074 | 1.0988 |
| No log        | 0.75   | 36   | 1.1404          | 0.2478 | 1.1404 | 1.0679 |
| No log        | 0.7917 | 38   | 1.2618          | 0.2125 | 1.2618 | 1.1233 |
| No log        | 0.8333 | 40   | 1.5388          | 0.2254 | 1.5388 | 1.2405 |
| No log        | 0.875  | 42   | 1.6423          | 0.1778 | 1.6423 | 1.2815 |
| No log        | 0.9167 | 44   | 1.7744          | 0.1600 | 1.7744 | 1.3321 |
| No log        | 0.9583 | 46   | 1.5268          | 0.2990 | 1.5268 | 1.2356 |
| No log        | 1.0    | 48   | 1.2378          | 0.2958 | 1.2378 | 1.1126 |
| No log        | 1.0417 | 50   | 1.1977          | 0.1876 | 1.1977 | 1.0944 |
| No log        | 1.0833 | 52   | 1.1358          | 0.1876 | 1.1358 | 1.0657 |
| No log        | 1.125  | 54   | 1.0778          | 0.2102 | 1.0778 | 1.0382 |
| No log        | 1.1667 | 56   | 1.0462          | 0.4173 | 1.0462 | 1.0229 |
| No log        | 1.2083 | 58   | 1.0130          | 0.4867 | 1.0130 | 1.0065 |
| No log        | 1.25   | 60   | 0.9262          | 0.4639 | 0.9262 | 0.9624 |
| No log        | 1.2917 | 62   | 0.8585          | 0.5107 | 0.8585 | 0.9265 |
| No log        | 1.3333 | 64   | 0.8122          | 0.4324 | 0.8122 | 0.9012 |
| No log        | 1.375  | 66   | 0.7755          | 0.5108 | 0.7755 | 0.8806 |
| No log        | 1.4167 | 68   | 0.7536          | 0.5108 | 0.7536 | 0.8681 |
| No log        | 1.4583 | 70   | 0.7520          | 0.5097 | 0.7520 | 0.8672 |
| No log        | 1.5    | 72   | 0.8661          | 0.5970 | 0.8661 | 0.9306 |
| No log        | 1.5417 | 74   | 0.8993          | 0.6011 | 0.8993 | 0.9483 |
| No log        | 1.5833 | 76   | 0.9214          | 0.5772 | 0.9214 | 0.9599 |
| No log        | 1.625  | 78   | 0.9799          | 0.5772 | 0.9799 | 0.9899 |
| No log        | 1.6667 | 80   | 0.9062          | 0.6100 | 0.9062 | 0.9519 |
| No log        | 1.7083 | 82   | 0.9675          | 0.3121 | 0.9675 | 0.9836 |
| No log        | 1.75   | 84   | 0.9513          | 0.3721 | 0.9513 | 0.9753 |
| No log        | 1.7917 | 86   | 0.9122          | 0.3541 | 0.9122 | 0.9551 |
| No log        | 1.8333 | 88   | 0.9007          | 0.4106 | 0.9007 | 0.9490 |
| No log        | 1.875  | 90   | 0.8584          | 0.4607 | 0.8584 | 0.9265 |
| No log        | 1.9167 | 92   | 0.7580          | 0.5629 | 0.7580 | 0.8706 |
| No log        | 1.9583 | 94   | 0.7066          | 0.5098 | 0.7066 | 0.8406 |
| No log        | 2.0    | 96   | 0.7098          | 0.5290 | 0.7098 | 0.8425 |
| No log        | 2.0417 | 98   | 0.7480          | 0.5473 | 0.7480 | 0.8649 |
| No log        | 2.0833 | 100  | 0.8287          | 0.6143 | 0.8287 | 0.9104 |
| No log        | 2.125  | 102  | 0.7862          | 0.5488 | 0.7862 | 0.8867 |
| No log        | 2.1667 | 104  | 0.7394          | 0.5281 | 0.7394 | 0.8599 |
| No log        | 2.2083 | 106  | 0.6726          | 0.5283 | 0.6726 | 0.8201 |
| No log        | 2.25   | 108  | 0.6606          | 0.5283 | 0.6606 | 0.8128 |
| No log        | 2.2917 | 110  | 0.6648          | 0.5283 | 0.6648 | 0.8154 |
| No log        | 2.3333 | 112  | 0.6614          | 0.5468 | 0.6614 | 0.8133 |
| No log        | 2.375  | 114  | 0.6454          | 0.5640 | 0.6454 | 0.8034 |
| No log        | 2.4167 | 116  | 0.6557          | 0.5640 | 0.6557 | 0.8098 |
| No log        | 2.4583 | 118  | 0.6659          | 0.5818 | 0.6659 | 0.8160 |
| No log        | 2.5    | 120  | 0.6556          | 0.5818 | 0.6556 | 0.8097 |
| No log        | 2.5417 | 122  | 0.6791          | 0.5468 | 0.6791 | 0.8241 |
| No log        | 2.5833 | 124  | 0.7349          | 0.7342 | 0.7349 | 0.8573 |
| No log        | 2.625  | 126  | 0.7328          | 0.7986 | 0.7328 | 0.8560 |
| No log        | 2.6667 | 128  | 0.8162          | 0.7823 | 0.8162 | 0.9035 |
| No log        | 2.7083 | 130  | 0.9252          | 0.7221 | 0.9252 | 0.9619 |
| No log        | 2.75   | 132  | 0.9043          | 0.7139 | 0.9043 | 0.9510 |
| No log        | 2.7917 | 134  | 0.8027          | 0.7623 | 0.8027 | 0.8959 |
| No log        | 2.8333 | 136  | 0.7367          | 0.7801 | 0.7367 | 0.8583 |
| No log        | 2.875  | 138  | 0.6459          | 0.6775 | 0.6459 | 0.8037 |
| No log        | 2.9167 | 140  | 0.6098          | 0.6774 | 0.6098 | 0.7809 |
| No log        | 2.9583 | 142  | 0.5942          | 0.6316 | 0.5942 | 0.7709 |
| No log        | 3.0    | 144  | 0.6023          | 0.6316 | 0.6023 | 0.7761 |
| No log        | 3.0417 | 146  | 0.6150          | 0.6488 | 0.6150 | 0.7842 |
| No log        | 3.0833 | 148  | 0.6189          | 0.6488 | 0.6189 | 0.7867 |
| No log        | 3.125  | 150  | 0.6059          | 0.6488 | 0.6059 | 0.7784 |
| No log        | 3.1667 | 152  | 0.6189          | 0.6602 | 0.6189 | 0.7867 |
| No log        | 3.2083 | 154  | 0.6219          | 0.6602 | 0.6219 | 0.7886 |
| No log        | 3.25   | 156  | 0.6274          | 0.6310 | 0.6274 | 0.7921 |
| No log        | 3.2917 | 158  | 0.6464          | 0.6310 | 0.6464 | 0.8040 |
| No log        | 3.3333 | 160  | 0.6717          | 0.6898 | 0.6717 | 0.8195 |
| No log        | 3.375  | 162  | 0.7372          | 0.6909 | 0.7372 | 0.8586 |
| No log        | 3.4167 | 164  | 0.7792          | 0.6909 | 0.7792 | 0.8827 |
| No log        | 3.4583 | 166  | 0.9314          | 0.7221 | 0.9314 | 0.9651 |
| No log        | 3.5    | 168  | 1.0842          | 0.7221 | 1.0842 | 1.0412 |
| No log        | 3.5417 | 170  | 1.0153          | 0.7221 | 1.0153 | 1.0076 |
| No log        | 3.5833 | 172  | 0.8811          | 0.7221 | 0.8811 | 0.9387 |
| No log        | 3.625  | 174  | 0.7166          | 0.5906 | 0.7166 | 0.8465 |
| No log        | 3.6667 | 176  | 0.6945          | 0.5291 | 0.6945 | 0.8334 |
| No log        | 3.7083 | 178  | 0.6979          | 0.4745 | 0.6979 | 0.8354 |
| No log        | 3.75   | 180  | 0.6615          | 0.5285 | 0.6615 | 0.8133 |
| No log        | 3.7917 | 182  | 0.6267          | 0.5462 | 0.6267 | 0.7917 |
| No log        | 3.8333 | 184  | 0.6119          | 0.5660 | 0.6119 | 0.7823 |
| No log        | 3.875  | 186  | 0.6251          | 0.6781 | 0.6251 | 0.7906 |
| No log        | 3.9167 | 188  | 0.6571          | 0.7360 | 0.6571 | 0.8106 |
| No log        | 3.9583 | 190  | 0.6702          | 0.6588 | 0.6702 | 0.8187 |
| No log        | 4.0    | 192  | 0.7479          | 0.7717 | 0.7479 | 0.8648 |
| No log        | 4.0417 | 194  | 0.7891          | 0.7623 | 0.7891 | 0.8883 |
| No log        | 4.0833 | 196  | 0.7511          | 0.7623 | 0.7511 | 0.8666 |
| No log        | 4.125  | 198  | 0.6949          | 0.7525 | 0.6949 | 0.8336 |
| No log        | 4.1667 | 200  | 0.6489          | 0.5898 | 0.6489 | 0.8055 |
| No log        | 4.2083 | 202  | 0.6396          | 0.6216 | 0.6396 | 0.7997 |
| No log        | 4.25   | 204  | 0.6271          | 0.6175 | 0.6271 | 0.7919 |
| No log        | 4.2917 | 206  | 0.6576          | 0.6898 | 0.6576 | 0.8109 |
| No log        | 4.3333 | 208  | 0.7410          | 0.7139 | 0.7410 | 0.8608 |
| No log        | 4.375  | 210  | 0.8531          | 0.7139 | 0.8531 | 0.9236 |
| No log        | 4.4167 | 212  | 0.8413          | 0.7063 | 0.8413 | 0.9172 |
| No log        | 4.4583 | 214  | 0.7067          | 0.7289 | 0.7067 | 0.8407 |
| No log        | 4.5    | 216  | 0.6722          | 0.7225 | 0.6722 | 0.8199 |
| No log        | 4.5417 | 218  | 0.6839          | 0.6885 | 0.6839 | 0.8270 |
| No log        | 4.5833 | 220  | 0.7297          | 0.5786 | 0.7297 | 0.8542 |
| No log        | 4.625  | 222  | 0.8156          | 0.5266 | 0.8156 | 0.9031 |
| No log        | 4.6667 | 224  | 0.8218          | 0.5266 | 0.8218 | 0.9066 |
| No log        | 4.7083 | 226  | 0.7851          | 0.4887 | 0.7851 | 0.8860 |
| No log        | 4.75   | 228  | 0.7643          | 0.6418 | 0.7643 | 0.8743 |
| No log        | 4.7917 | 230  | 0.7446          | 0.6992 | 0.7446 | 0.8629 |
| No log        | 4.8333 | 232  | 0.7711          | 0.7063 | 0.7711 | 0.8781 |
| No log        | 4.875  | 234  | 0.7149          | 0.7447 | 0.7149 | 0.8455 |
| No log        | 4.9167 | 236  | 0.6441          | 0.6975 | 0.6441 | 0.8026 |
| No log        | 4.9583 | 238  | 0.6134          | 0.7209 | 0.6134 | 0.7832 |
| No log        | 5.0    | 240  | 0.6157          | 0.7209 | 0.6157 | 0.7847 |
| No log        | 5.0417 | 242  | 0.6495          | 0.6975 | 0.6495 | 0.8059 |
| No log        | 5.0833 | 244  | 0.6703          | 0.6975 | 0.6703 | 0.8187 |
| No log        | 5.125  | 246  | 0.6821          | 0.6813 | 0.6821 | 0.8259 |
| No log        | 5.1667 | 248  | 0.6653          | 0.6818 | 0.6653 | 0.8156 |
| No log        | 5.2083 | 250  | 0.6940          | 0.6813 | 0.6940 | 0.8331 |
| No log        | 5.25   | 252  | 0.7257          | 0.6813 | 0.7257 | 0.8519 |
| No log        | 5.2917 | 254  | 0.8274          | 0.7522 | 0.8274 | 0.9096 |
| No log        | 5.3333 | 256  | 0.9021          | 0.7264 | 0.9021 | 0.9498 |
| No log        | 5.375  | 258  | 0.9698          | 0.7426 | 0.9698 | 0.9848 |
| No log        | 5.4167 | 260  | 1.0117          | 0.7516 | 1.0117 | 1.0058 |
| No log        | 5.4583 | 262  | 1.0087          | 0.7221 | 1.0087 | 1.0043 |
| No log        | 5.5    | 264  | 0.9675          | 0.7418 | 0.9675 | 0.9836 |
| No log        | 5.5417 | 266  | 0.8617          | 0.6564 | 0.8617 | 0.9283 |
| No log        | 5.5833 | 268  | 0.7616          | 0.5522 | 0.7616 | 0.8727 |
| No log        | 5.625  | 270  | 0.7146          | 0.4878 | 0.7146 | 0.8453 |
| No log        | 5.6667 | 272  | 0.6756          | 0.5476 | 0.6756 | 0.8219 |
| No log        | 5.7083 | 274  | 0.6580          | 0.58   | 0.6580 | 0.8112 |
| No log        | 5.75   | 276  | 0.6531          | 0.6182 | 0.6531 | 0.8081 |
| No log        | 5.7917 | 278  | 0.6787          | 0.6261 | 0.6787 | 0.8238 |
| No log        | 5.8333 | 280  | 0.7087          | 0.6301 | 0.7087 | 0.8419 |
| No log        | 5.875  | 282  | 0.7417          | 0.6642 | 0.7417 | 0.8612 |
| No log        | 5.9167 | 284  | 0.7604          | 0.6642 | 0.7604 | 0.8720 |
| No log        | 5.9583 | 286  | 0.8006          | 0.6951 | 0.8006 | 0.8947 |
| No log        | 6.0    | 288  | 0.8069          | 0.7139 | 0.8069 | 0.8983 |
| No log        | 6.0417 | 290  | 0.8402          | 0.7139 | 0.8402 | 0.9166 |
| No log        | 6.0833 | 292  | 0.8419          | 0.7139 | 0.8419 | 0.9175 |
| No log        | 6.125  | 294  | 0.8049          | 0.6951 | 0.8049 | 0.8972 |
| No log        | 6.1667 | 296  | 0.7841          | 0.6882 | 0.7841 | 0.8855 |
| No log        | 6.2083 | 298  | 0.8166          | 0.7063 | 0.8166 | 0.9036 |
| No log        | 6.25   | 300  | 0.8454          | 0.7063 | 0.8454 | 0.9194 |
| No log        | 6.2917 | 302  | 0.8753          | 0.7063 | 0.8753 | 0.9356 |
| No log        | 6.3333 | 304  | 0.8271          | 0.6882 | 0.8271 | 0.9095 |
| No log        | 6.375  | 306  | 0.7625          | 0.6871 | 0.7625 | 0.8732 |
| No log        | 6.4167 | 308  | 0.7187          | 0.6975 | 0.7187 | 0.8477 |
| No log        | 6.4583 | 310  | 0.6845          | 0.6662 | 0.6845 | 0.8273 |
| No log        | 6.5    | 312  | 0.6827          | 0.6662 | 0.6827 | 0.8262 |
| No log        | 6.5417 | 314  | 0.7055          | 0.6714 | 0.7055 | 0.8400 |
| No log        | 6.5833 | 316  | 0.7257          | 0.5823 | 0.7257 | 0.8519 |
| No log        | 6.625  | 318  | 0.7211          | 0.6254 | 0.7211 | 0.8492 |
| No log        | 6.6667 | 320  | 0.6997          | 0.5867 | 0.6997 | 0.8365 |
| No log        | 6.7083 | 322  | 0.7039          | 0.5898 | 0.7039 | 0.8390 |
| No log        | 6.75   | 324  | 0.7067          | 0.6107 | 0.7067 | 0.8407 |
| No log        | 6.7917 | 326  | 0.7139          | 0.6719 | 0.7139 | 0.8449 |
| No log        | 6.8333 | 328  | 0.7371          | 0.6143 | 0.7371 | 0.8586 |
| No log        | 6.875  | 330  | 0.7933          | 0.6353 | 0.7933 | 0.8907 |
| No log        | 6.9167 | 332  | 0.8818          | 0.6903 | 0.8818 | 0.9391 |
| No log        | 6.9583 | 334  | 0.9674          | 0.7221 | 0.9674 | 0.9836 |
| No log        | 7.0    | 336  | 1.0250          | 0.7221 | 1.0250 | 1.0124 |
| No log        | 7.0417 | 338  | 1.0158          | 0.7221 | 1.0158 | 1.0079 |
| No log        | 7.0833 | 340  | 0.9518          | 0.7221 | 0.9518 | 0.9756 |
| No log        | 7.125  | 342  | 0.8565          | 0.7139 | 0.8565 | 0.9255 |
| No log        | 7.1667 | 344  | 0.8147          | 0.7139 | 0.8147 | 0.9026 |
| No log        | 7.2083 | 346  | 0.7584          | 0.6882 | 0.7584 | 0.8708 |
| No log        | 7.25   | 348  | 0.7248          | 0.6702 | 0.7248 | 0.8513 |
| No log        | 7.2917 | 350  | 0.7168          | 0.6882 | 0.7168 | 0.8466 |
| No log        | 7.3333 | 352  | 0.7216          | 0.6882 | 0.7216 | 0.8494 |
| No log        | 7.375  | 354  | 0.7302          | 0.6882 | 0.7302 | 0.8545 |
| No log        | 7.4167 | 356  | 0.7388          | 0.6882 | 0.7388 | 0.8596 |
| No log        | 7.4583 | 358  | 0.7314          | 0.6882 | 0.7314 | 0.8552 |
| No log        | 7.5    | 360  | 0.7059          | 0.6888 | 0.7059 | 0.8402 |
| No log        | 7.5417 | 362  | 0.7043          | 0.7342 | 0.7043 | 0.8392 |
| No log        | 7.5833 | 364  | 0.7178          | 0.6702 | 0.7178 | 0.8472 |
| No log        | 7.625  | 366  | 0.7557          | 0.6882 | 0.7557 | 0.8693 |
| No log        | 7.6667 | 368  | 0.7766          | 0.6882 | 0.7766 | 0.8813 |
| No log        | 7.7083 | 370  | 0.7843          | 0.6951 | 0.7843 | 0.8856 |
| No log        | 7.75   | 372  | 0.7778          | 0.6951 | 0.7778 | 0.8819 |
| No log        | 7.7917 | 374  | 0.8034          | 0.7139 | 0.8034 | 0.8963 |
| No log        | 7.8333 | 376  | 0.8331          | 0.7139 | 0.8331 | 0.9127 |
| No log        | 7.875  | 378  | 0.8335          | 0.7139 | 0.8335 | 0.9130 |
| No log        | 7.9167 | 380  | 0.8116          | 0.7139 | 0.8116 | 0.9009 |
| No log        | 7.9583 | 382  | 0.8013          | 0.7139 | 0.8013 | 0.8952 |
| No log        | 8.0    | 384  | 0.7915          | 0.7139 | 0.7915 | 0.8897 |
| No log        | 8.0417 | 386  | 0.7726          | 0.6951 | 0.7726 | 0.8790 |
| No log        | 8.0833 | 388  | 0.7600          | 0.6951 | 0.7600 | 0.8718 |
| No log        | 8.125  | 390  | 0.7533          | 0.7427 | 0.7533 | 0.8679 |
| No log        | 8.1667 | 392  | 0.7511          | 0.7427 | 0.7511 | 0.8667 |
| No log        | 8.2083 | 394  | 0.7375          | 0.7427 | 0.7375 | 0.8588 |
| No log        | 8.25   | 396  | 0.7214          | 0.7337 | 0.7214 | 0.8494 |
| No log        | 8.2917 | 398  | 0.7239          | 0.7172 | 0.7239 | 0.8508 |
| No log        | 8.3333 | 400  | 0.7236          | 0.7172 | 0.7236 | 0.8506 |
| No log        | 8.375  | 402  | 0.7197          | 0.7172 | 0.7197 | 0.8483 |
| No log        | 8.4167 | 404  | 0.7017          | 0.7172 | 0.7017 | 0.8377 |
| No log        | 8.4583 | 406  | 0.6943          | 0.7172 | 0.6943 | 0.8333 |
| No log        | 8.5    | 408  | 0.6936          | 0.7172 | 0.6936 | 0.8328 |
| No log        | 8.5417 | 410  | 0.6889          | 0.7172 | 0.6889 | 0.8300 |
| No log        | 8.5833 | 412  | 0.6949          | 0.7172 | 0.6949 | 0.8336 |
| No log        | 8.625  | 414  | 0.7042          | 0.7172 | 0.7042 | 0.8392 |
| No log        | 8.6667 | 416  | 0.7194          | 0.7172 | 0.7194 | 0.8482 |
| No log        | 8.7083 | 418  | 0.7559          | 0.7172 | 0.7559 | 0.8694 |
| No log        | 8.75   | 420  | 0.7810          | 0.7346 | 0.7810 | 0.8838 |
| No log        | 8.7917 | 422  | 0.8087          | 0.7346 | 0.8087 | 0.8993 |
| No log        | 8.8333 | 424  | 0.8240          | 0.7346 | 0.8240 | 0.9078 |
| No log        | 8.875  | 426  | 0.8176          | 0.7346 | 0.8176 | 0.9042 |
| No log        | 8.9167 | 428  | 0.8155          | 0.7346 | 0.8155 | 0.9030 |
| No log        | 8.9583 | 430  | 0.8001          | 0.7346 | 0.8001 | 0.8945 |
| No log        | 9.0    | 432  | 0.7936          | 0.7346 | 0.7936 | 0.8908 |
| No log        | 9.0417 | 434  | 0.7872          | 0.7346 | 0.7872 | 0.8873 |
| No log        | 9.0833 | 436  | 0.7933          | 0.7346 | 0.7933 | 0.8907 |
| No log        | 9.125  | 438  | 0.8034          | 0.7346 | 0.8034 | 0.8964 |
| No log        | 9.1667 | 440  | 0.8099          | 0.7346 | 0.8099 | 0.8999 |
| No log        | 9.2083 | 442  | 0.8032          | 0.7346 | 0.8032 | 0.8962 |
| No log        | 9.25   | 444  | 0.7911          | 0.7346 | 0.7911 | 0.8894 |
| No log        | 9.2917 | 446  | 0.7743          | 0.7346 | 0.7743 | 0.8800 |
| No log        | 9.3333 | 448  | 0.7545          | 0.7346 | 0.7545 | 0.8686 |
| No log        | 9.375  | 450  | 0.7459          | 0.7172 | 0.7459 | 0.8637 |
| No log        | 9.4167 | 452  | 0.7456          | 0.7172 | 0.7456 | 0.8635 |
| No log        | 9.4583 | 454  | 0.7395          | 0.7172 | 0.7395 | 0.8600 |
| No log        | 9.5    | 456  | 0.7343          | 0.7172 | 0.7343 | 0.8569 |
| No log        | 9.5417 | 458  | 0.7290          | 0.7172 | 0.7290 | 0.8538 |
| No log        | 9.5833 | 460  | 0.7256          | 0.7172 | 0.7256 | 0.8518 |
| No log        | 9.625  | 462  | 0.7242          | 0.7172 | 0.7242 | 0.8510 |
| No log        | 9.6667 | 464  | 0.7226          | 0.7172 | 0.7226 | 0.8501 |
| No log        | 9.7083 | 466  | 0.7208          | 0.7172 | 0.7208 | 0.8490 |
| No log        | 9.75   | 468  | 0.7213          | 0.7172 | 0.7213 | 0.8493 |
| No log        | 9.7917 | 470  | 0.7198          | 0.7172 | 0.7198 | 0.8484 |
| No log        | 9.8333 | 472  | 0.7181          | 0.7172 | 0.7181 | 0.8474 |
| No log        | 9.875  | 474  | 0.7164          | 0.7172 | 0.7164 | 0.8464 |
| No log        | 9.9167 | 476  | 0.7156          | 0.7172 | 0.7156 | 0.8459 |
| No log        | 9.9583 | 478  | 0.7158          | 0.7172 | 0.7158 | 0.8461 |
| No log        | 10.0   | 480  | 0.7158          | 0.7172 | 0.7158 | 0.8461 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
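
The pins above can be checked against a local environment with a quick script (a convenience sketch, not part of the original card):

```python
# Sketch: compare installed library versions against the card's pins.
import datasets
import tokenizers
import torch
import transformers

pinned = {
    "transformers": "4.44.2",
    "torch": "2.4.0+cu118",   # "+cu118" marks the CUDA 11.8 build
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in pinned.items():
    got = installed[name]
    note = "" if got == want else "  <-- mismatch"
    print(f"{name}: pinned {want}, installed {got}{note}")
```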