---
base_model: aubmindlab/bert-base-arabertv02
tags:
  - generated_from_trainer
model-index:
  - name: arabert_baseline_development_task7_fold0
    results: []
---

arabert_baseline_development_task7_fold0

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3255
  • Qwk (quadratic weighted kappa): 0.6
  • Mse (mean squared error): 0.3255
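Quadratic weighted kappa is a standard agreement metric for ordinal scoring tasks such as automated essay assessment. The exact evaluation code is not part of this card, but as a rough illustration, the two reported metrics can be computed as follows (assuming integer ordinal labels; this is a sketch, not the card's actual evaluation script):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes=None):
    """Quadratic weighted kappa for ordinal labels 0..n_classes-1."""
    y_true = np.asarray(y_true, dtype=int)
    y_pred = np.asarray(y_pred, dtype=int)
    if n_classes is None:
        n_classes = int(max(y_true.max(), y_pred.max())) + 1
    # Observed agreement (confusion) matrix.
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Expected matrix under independence of the two label sources.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0)) / len(y_true)
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    idx = np.arange(n_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

def mse(y_true, y_pred):
    """Mean squared error on raw (possibly continuous) predictions."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean((y_true - y_pred) ** 2))
```

For a regression-style scoring head, predictions are typically rounded to the nearest valid score before computing QWK, while MSE is computed on the raw outputs.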

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
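These settings match the standard Hugging Face `Trainer` setup. A minimal sketch of the equivalent `TrainingArguments` configuration (the `output_dir` value is an assumption; everything not listed above is left at its default):

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration above; only the
# hyperparameters listed in this card are known.
training_args = TrainingArguments(
    output_dir="arabert_baseline_development_task7_fold0",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    # The Adam betas and epsilon listed above are the Trainer defaults.
)
```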

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk    | Mse    |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| No log        | 0.3333 | 2    | 1.1202          | 0.0777 | 1.1202 |
| No log        | 0.6667 | 4    | 0.6041          | 0.4521 | 0.6041 |
| No log        | 1.0    | 6    | 0.5933          | 0.4048 | 0.5933 |
| No log        | 1.3333 | 8    | 0.6033          | 0.4186 | 0.6033 |
| No log        | 1.6667 | 10   | 0.4045          | 0.5380 | 0.4045 |
| No log        | 2.0    | 12   | 0.4962          | 0.5370 | 0.4962 |
| No log        | 2.3333 | 14   | 0.4729          | 0.5380 | 0.4729 |
| No log        | 2.6667 | 16   | 0.4672          | 0.4643 | 0.4672 |
| No log        | 3.0    | 18   | 0.5466          | 0.4421 | 0.5466 |
| No log        | 3.3333 | 20   | 0.6361          | 0.4508 | 0.6361 |
| No log        | 3.6667 | 22   | 0.4635          | 0.4421 | 0.4635 |
| No log        | 4.0    | 24   | 0.3643          | 0.6    | 0.3643 |
| No log        | 4.3333 | 26   | 0.3664          | 0.6237 | 0.3664 |
| No log        | 4.6667 | 28   | 0.3535          | 0.6    | 0.3535 |
| No log        | 5.0    | 30   | 0.3681          | 0.5545 | 0.3681 |
| No log        | 5.3333 | 32   | 0.3906          | 0.5327 | 0.3906 |
| No log        | 5.6667 | 34   | 0.3676          | 0.5327 | 0.3676 |
| No log        | 6.0    | 36   | 0.3373          | 0.6    | 0.3373 |
| No log        | 6.3333 | 38   | 0.3425          | 0.6324 | 0.3425 |
| No log        | 6.6667 | 40   | 0.3594          | 0.5960 | 0.3594 |
| No log        | 7.0    | 42   | 0.3550          | 0.6    | 0.3550 |
| No log        | 7.3333 | 44   | 0.3547          | 0.5874 | 0.3547 |
| No log        | 7.6667 | 46   | 0.3761          | 0.5726 | 0.3761 |
| No log        | 8.0    | 48   | 0.3915          | 0.5726 | 0.3915 |
| No log        | 8.3333 | 50   | 0.3777          | 0.5726 | 0.3777 |
| No log        | 8.6667 | 52   | 0.3558          | 0.5642 | 0.3558 |
| No log        | 9.0    | 54   | 0.3383          | 0.5874 | 0.3383 |
| No log        | 9.3333 | 56   | 0.3286          | 0.6    | 0.3286 |
| No log        | 9.6667 | 58   | 0.3263          | 0.6    | 0.3263 |
| No log        | 10.0   | 60   | 0.3255          | 0.6    | 0.3255 |

Framework versions

  • Transformers 4.44.0
  • PyTorch 2.4.0
  • Datasets 2.21.0
  • Tokenizers 0.19.1
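To reproduce this environment, the listed versions can be pinned at install time (note the PyTorch pip package is named `torch`):

```shell
pip install transformers==4.44.0 torch==2.4.0 datasets==2.21.0 tokenizers==0.19.1
```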