---
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: arabert_baseline_development_task2_fold0
  results: []
---

# arabert_baseline_development_task2_fold0

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.4592
- Qwk (quadratic weighted kappa): 0.2794
- Mse: 0.4600
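Qwk here is the quadratic weighted kappa between predicted and reference scores, which penalizes disagreements by their squared distance. As a minimal sketch (the toy labels below are illustrative, not taken from this model's evaluation set), both metrics can be computed with scikit-learn:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative integer scores only -- not the actual evaluation data.
y_true = np.array([0, 1, 2, 2])
y_pred = np.array([0, 1, 1, 2])

# Quadratic weighting penalizes a prediction that is off by 2
# four times as heavily as one that is off by 1.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
print(qwk, mse)  # 0.8 0.25 for this toy example
```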

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
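With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-05 down to 0 over the training run (60 optimizer steps here, per the results table). A minimal pure-Python sketch of that schedule, assuming zero warmup steps as in the `Trainer` default:

```python
def linear_lr(step, base_lr=2e-5, total_steps=60, warmup_steps=0):
    """Linear schedule with optional warmup, mirroring the shape of
    transformers' get_linear_schedule_with_warmup."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Then decay linearly from base_lr to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0), linear_lr(30), linear_lr(60))  # 2e-05 1e-05 0.0
```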

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk     | Mse    |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
| No log        | 0.3333 | 2    | 5.4721          | -0.0129 | 5.4748 |
| No log        | 0.6667 | 4    | 2.5151          | 0.0507  | 2.5239 |
| No log        | 1.0    | 6    | 1.3235          | 0.0123  | 1.3253 |
| No log        | 1.3333 | 8    | 0.6015          | 0.0645  | 0.6018 |
| No log        | 1.6667 | 10   | 0.5523          | 0.1695  | 0.5523 |
| No log        | 2.0    | 12   | 0.6138          | -0.0138 | 0.6123 |
| No log        | 2.3333 | 14   | 0.6765          | 0.0331  | 0.6760 |
| No log        | 2.6667 | 16   | 0.8075          | 0.0645  | 0.8118 |
| No log        | 3.0    | 18   | 1.2170          | 0.0000  | 1.2213 |
| No log        | 3.3333 | 20   | 0.9705          | 0.1348  | 0.9751 |
| No log        | 3.6667 | 22   | 0.5286          | 0.2674  | 0.5292 |
| No log        | 4.0    | 24   | 0.5438          | 0.1600  | 0.5431 |
| No log        | 4.3333 | 26   | 0.5876          | 0.0000  | 0.5872 |
| No log        | 4.6667 | 28   | 0.5812          | 0.0000  | 0.5809 |
| No log        | 5.0    | 30   | 0.5243          | 0.1600  | 0.5239 |
| No log        | 5.3333 | 32   | 0.4585          | 0.2794  | 0.4579 |
| No log        | 5.6667 | 34   | 0.4400          | 0.2699  | 0.4394 |
| No log        | 6.0    | 36   | 0.5014          | 0.2581  | 0.5003 |
| No log        | 6.3333 | 38   | 0.5619          | 0.1947  | 0.5601 |
| No log        | 6.6667 | 40   | 0.5416          | 0.1947  | 0.5393 |
| No log        | 7.0    | 42   | 0.5359          | 0.1947  | 0.5336 |
| No log        | 7.3333 | 44   | 0.5271          | 0.1947  | 0.5250 |
| No log        | 7.6667 | 46   | 0.4913          | 0.2596  | 0.4901 |
| No log        | 8.0    | 48   | 0.4604          | 0.2674  | 0.4603 |
| No log        | 8.3333 | 50   | 0.4564          | 0.2794  | 0.4567 |
| No log        | 8.6667 | 52   | 0.4600          | 0.2794  | 0.4606 |
| No log        | 9.0    | 54   | 0.4613          | 0.2794  | 0.4620 |
| No log        | 9.3333 | 56   | 0.4606          | 0.2794  | 0.4614 |
| No log        | 9.6667 | 58   | 0.4598          | 0.2794  | 0.4606 |
| No log        | 10.0   | 60   | 0.4592          | 0.2794  | 0.4600 |
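Note that the lowest validation loss (0.4400, with Qwk 0.2699) occurs at epoch 5.6667, not at the final step. A quick way to pick the best row out of logs like these (a few triples transcribed from the table above for illustration):

```python
# (epoch, validation_loss, qwk) triples transcribed from the results table.
rows = [
    (5.3333, 0.4585, 0.2794),
    (5.6667, 0.4400, 0.2699),
    (10.0,   0.4592, 0.2794),
]

# Select the checkpoint with the lowest validation loss.
best = min(rows, key=lambda r: r[1])
print(best)  # (5.6667, 0.44, 0.2699)
```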

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1