arabic-hs-target-prediction

This model is a fine-tuned version of aubmindlab/bert-base-arabert; the training dataset is not documented in this card. It achieves the following results on the evaluation set:

  • Loss: 0.3406
  • F1: 0.7151
  • ROC AUC: 0.8021
  • Accuracy: 0.6699

Model description

More information needed

Intended uses & limitations

More information needed
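Usage is not yet documented, but as a sketch: the model can presumably be loaded with `transformers` for sequence classification under the repo id shown in this card. The reported F1/ROC-AUC/accuracy combination is typical of a multi-label setup, so the snippet below shows sigmoid-and-threshold post-processing of logits under that assumption; the loading lines are left as comments (they need network access), and the helper and example logits are purely illustrative.

```python
import numpy as np

# Hypothetical loading (repo id taken from this card; requires network access):
# from transformers import AutoTokenizer, AutoModelForSequenceClassification
# tokenizer = AutoTokenizer.from_pretrained("HrantDinkFoundation/arabic-hs-target-prediction")
# model = AutoModelForSequenceClassification.from_pretrained("HrantDinkFoundation/arabic-hs-target-prediction")

def logits_to_labels(logits, threshold=0.5):
    """Sigmoid + threshold, as used by multi-label heads (an assumption here)."""
    probs = 1.0 / (1.0 + np.exp(-np.asarray(logits, dtype=float)))
    return (probs >= threshold).astype(int)

# Made-up logits for three hypothetical target classes:
print(logits_to_labels([2.0, -1.0, 0.5]))  # -> [1 0 1]
```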

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-06
  • train_batch_size: 48
  • eval_batch_size: 20
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 10
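The hyperparameters above map onto a `transformers` `TrainingArguments` configuration roughly as follows. This is a reconstruction, not the exact training script; `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters; "output" is a placeholder path.
args = TrainingArguments(
    output_dir="output",
    learning_rate=5e-6,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=20,
    seed=42,
    optim="adamw_torch",        # AdamW with betas=(0.9, 0.999), eps=1e-8 (torch defaults)
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```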

Training results

| Training Loss | Epoch  | Step | Validation Loss | F1     | ROC AUC | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:------:|:-------:|:--------:|
| 0.4935        | 0.3215 | 100  | 0.4566          | 0.2686 | 0.5717  | 0.1646   |
| 0.4381        | 0.6431 | 200  | 0.4188          | 0.5871 | 0.7142  | 0.5256   |
| 0.4178        | 0.9646 | 300  | 0.4103          | 0.6120 | 0.7308  | 0.5495   |
| 0.3953        | 1.2862 | 400  | 0.3874          | 0.6449 | 0.7519  | 0.5982   |
| 0.3708        | 1.6077 | 500  | 0.3765          | 0.6535 | 0.7575  | 0.5959   |
| 0.3658        | 1.9293 | 600  | 0.3709          | 0.6488 | 0.7541  | 0.5837   |
| 0.3431        | 2.2508 | 700  | 0.3502          | 0.6810 | 0.7774  | 0.6245   |
| 0.3211        | 2.5723 | 800  | 0.3561          | 0.6817 | 0.7781  | 0.6315   |
| 0.325         | 2.8939 | 900  | 0.3527          | 0.6876 | 0.7813  | 0.6367   |
| 0.3162        | 3.2154 | 1000 | 0.3405          | 0.6966 | 0.7879  | 0.6442   |
| 0.3037        | 3.5370 | 1100 | 0.3444          | 0.6984 | 0.7892  | 0.6470   |
| 0.304         | 3.8585 | 1200 | 0.3364          | 0.7033 | 0.7926  | 0.6573   |
| 0.2898        | 4.1801 | 1300 | 0.3412          | 0.7020 | 0.7919  | 0.6460   |
| 0.2881        | 4.5016 | 1400 | 0.3376          | 0.7119 | 0.7989  | 0.6629   |
| 0.2871        | 4.8232 | 1500 | 0.3446          | 0.7016 | 0.7924  | 0.6517   |
| 0.2865        | 5.1447 | 1600 | 0.3353          | 0.7084 | 0.7973  | 0.6564   |
| 0.2738        | 5.4662 | 1700 | 0.3363          | 0.7102 | 0.7982  | 0.6634   |
| 0.2638        | 5.7878 | 1800 | 0.3368          | 0.7167 | 0.8023  | 0.6746   |
| 0.2656        | 6.1093 | 1900 | 0.3419          | 0.7121 | 0.7993  | 0.6643   |
| 0.2646        | 6.4309 | 2000 | 0.3340          | 0.7183 | 0.8041  | 0.6765   |
| 0.2664        | 6.7524 | 2100 | 0.3334          | 0.7164 | 0.8034  | 0.6751   |
| 0.2468        | 7.0740 | 2200 | 0.3373          | 0.7182 | 0.8044  | 0.6737   |
| 0.2488        | 7.3955 | 2300 | 0.3386          | 0.7114 | 0.7994  | 0.6671   |
| 0.2484        | 7.7170 | 2400 | 0.3395          | 0.7157 | 0.8025  | 0.6742   |
| 0.2515        | 8.0386 | 2500 | 0.3355          | 0.7130 | 0.8006  | 0.6709   |
| 0.2357        | 8.3601 | 2600 | 0.3410          | 0.7131 | 0.8005  | 0.6671   |
| 0.2475        | 8.6817 | 2700 | 0.3409          | 0.7171 | 0.8037  | 0.6704   |
| 0.2484        | 9.0032 | 2800 | 0.3409          | 0.7144 | 0.8015  | 0.6667   |
| 0.237         | 9.3248 | 2900 | 0.3403          | 0.7130 | 0.8007  | 0.6676   |
| 0.2386        | 9.6463 | 3000 | 0.3400          | 0.7153 | 0.8024  | 0.6699   |
| 0.2355        | 9.9678 | 3100 | 0.3406          | 0.7151 | 0.8021  | 0.6699   |
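For reference, the reported metric combination (F1, ROC AUC, accuracy) can be computed with scikit-learn as sketched below. The card does not state the exact metric settings, so micro averaging and subset (exact-match) accuracy are assumptions, and the toy arrays are illustrative only.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

# Toy multi-label example (2 samples, 3 labels) - values are illustrative only.
y_true  = np.array([[1, 0, 1], [0, 1, 0]])
y_score = np.array([[0.9, 0.1, 0.4], [0.2, 0.8, 0.3]])  # e.g. sigmoid(logits)
y_pred  = (y_score >= 0.5).astype(int)

f1  = f1_score(y_true, y_pred, average="micro")      # assumed averaging
auc = roc_auc_score(y_true, y_score, average="micro")
acc = accuracy_score(y_true, y_pred)                 # subset (exact-match) accuracy

print(f1, auc, acc)
```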

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.5.1+cu124
  • Datasets 3.3.2
  • Tokenizers 0.21.0