
arabert_cross_vocabulary_task5_fold6

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the autogenerated card lists it as "None"). It achieves the following results on the evaluation set (a loading and metric-computation sketch follows the list):

  • Loss: 0.4885
  • Qwk (quadratic weighted kappa): 0.6405
  • Mse (mean squared error): 0.4871
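
The card does not document the downstream task or label scheme, so the snippet below is only a minimal sketch: it loads the checkpoint under the assumption of a sequence-classification head and illustrates how the two reported metrics are typically computed, using scikit-learn's quadratic weighted kappa and mean squared error. The gold labels and predictions here are hypothetical.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned checkpoint; the head configuration (number of
# labels, regression vs. classification) is not stated on this card.
model_id = "salbatarni/arabert_cross_vocabulary_task5_fold6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Hypothetical integer scores, purely to illustrate the metrics.
y_true = np.array([0, 1, 2, 2, 3])
y_pred = np.array([0, 1, 1, 2, 3])

# Qwk = quadratic weighted kappa; Mse = mean squared error.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}")
```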

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
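
Assuming single-device training (so that train_batch_size maps to per_device_train_batch_size), these settings correspond roughly to the transformers TrainingArguments sketch below; output_dir is a placeholder. The Adam betas, epsilon, and linear schedule listed above are the Trainer defaults for this configuration, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Minimal sketch of the listed hyperparameters; output_dir is a
# placeholder. Adam betas=(0.9, 0.999), epsilon=1e-08, and the linear
# learning-rate schedule match the Trainer defaults for these settings.
args = TrainingArguments(
    output_dir="arabert_cross_vocabulary_task5_fold6",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```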

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk    | Mse    |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| No log        | 0.0328 | 2    | 1.7653          | 0.0854 | 1.7539 |
| No log        | 0.0656 | 4    | 0.9960          | 0.2694 | 0.9869 |
| No log        | 0.0984 | 6    | 1.0209          | 0.4489 | 1.0210 |
| No log        | 0.1311 | 8    | 1.2656          | 0.2397 | 1.2659 |
| No log        | 0.1639 | 10   | 1.1746          | 0.1146 | 1.1733 |
| No log        | 0.1967 | 12   | 1.0616          | 0.1286 | 1.0603 |
| No log        | 0.2295 | 14   | 0.9704          | 0.1900 | 0.9690 |
| No log        | 0.2623 | 16   | 0.8921          | 0.2622 | 0.8910 |
| No log        | 0.2951 | 18   | 0.8267          | 0.4004 | 0.8262 |
| No log        | 0.3279 | 20   | 0.7715          | 0.4447 | 0.7714 |
| No log        | 0.3607 | 22   | 0.7094          | 0.4821 | 0.7095 |
| No log        | 0.3934 | 24   | 0.6609          | 0.5270 | 0.6611 |
| No log        | 0.4262 | 26   | 0.6404          | 0.5798 | 0.6403 |
| No log        | 0.4590 | 28   | 0.6536          | 0.6218 | 0.6535 |
| No log        | 0.4918 | 30   | 0.6622          | 0.6382 | 0.6618 |
| No log        | 0.5246 | 32   | 0.6200          | 0.6334 | 0.6191 |
| No log        | 0.5574 | 34   | 0.6047          | 0.6457 | 0.6038 |
| No log        | 0.5902 | 36   | 0.6135          | 0.6481 | 0.6127 |
| No log        | 0.6230 | 38   | 0.6735          | 0.6781 | 0.6730 |
| No log        | 0.6557 | 40   | 0.6993          | 0.6904 | 0.6989 |
| No log        | 0.6885 | 42   | 0.6637          | 0.6940 | 0.6631 |
| No log        | 0.7213 | 44   | 0.6006          | 0.7091 | 0.5995 |
| No log        | 0.7541 | 46   | 0.5531          | 0.7057 | 0.5517 |
| No log        | 0.7869 | 48   | 0.5246          | 0.6740 | 0.5231 |
| No log        | 0.8197 | 50   | 0.5072          | 0.6225 | 0.5054 |
| No log        | 0.8525 | 52   | 0.5048          | 0.5882 | 0.5030 |
| No log        | 0.8852 | 54   | 0.5019          | 0.5965 | 0.5002 |
| No log        | 0.9180 | 56   | 0.4958          | 0.6015 | 0.4942 |
| No log        | 0.9508 | 58   | 0.4908          | 0.6094 | 0.4893 |
| No log        | 0.9836 | 60   | 0.4885          | 0.6405 | 0.4871 |

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1