
roberta_classification

This model is a fine-tuned version of roberta-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2731
  • Accuracy: 0.8466
  • F1: 0.8396

Model description

More information needed

Intended uses & limitations

More information needed
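
Pending a fuller description, the sketch below shows one way to run inference with this checkpoint. It is a minimal sketch, assuming the model is published on the Hugging Face Hub as Ahmed235/roberta_classification; the label names it returns depend on the undocumented training dataset.

```python
# Minimal inference sketch. Assumption: the checkpoint is available on the Hub
# as "Ahmed235/roberta_classification"; label names depend on the training data.
from transformers import pipeline

classifier = pipeline("text-classification", model="Ahmed235/roberta_classification")
print(classifier("Replace this with a sentence you want to classify."))
# e.g. [{'label': 'LABEL_0', 'score': 0.97}]
```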

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the Trainer sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
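
As a hedged reproduction aid, the sketch below maps the hyperparameters above onto a TrainingArguments/Trainer setup for the Transformers 4.35 API. The dataset, preprocessing, number of labels, and metric function are not documented in this card, so they appear only as placeholders.

```python
# Hedged sketch mirroring the hyperparameters listed above (Transformers 4.35 API).
# Dataset, preprocessing, num_labels, and compute_metrics are undocumented here
# and shown only as placeholders.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base",
    num_labels=2,  # assumption: the actual label count is not documented
)

training_args = TrainingArguments(
    output_dir="roberta_classification",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",   # linear schedule, as listed above
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)

# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=...,    # undocumented
#     eval_dataset=...,     # undocumented
#     tokenizer=tokenizer,
#     compute_metrics=...,  # see the sketch under "Training results"
# )
# trainer.train()
```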

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|---------------|-------|------|-----------------|----------|--------|
| No log        | 1.0   | 263  | 1.1741          | 0.6364   | 0.6203 |
| 1.181         | 2.0   | 526  | 0.9322          | 0.7386   | 0.7177 |
| 1.181         | 3.0   | 789  | 0.7835          | 0.7727   | 0.7658 |
| 0.3689        | 4.0   | 1052 | 0.8597          | 0.7727   | 0.7684 |
| 0.3689        | 5.0   | 1315 | 0.7560          | 0.8125   | 0.8032 |
| 0.165         | 6.0   | 1578 | 0.7579          | 0.8201   | 0.8143 |
| 0.165         | 7.0   | 1841 | 0.8900          | 0.8352   | 0.8316 |
| 0.0778        | 8.0   | 2104 | 0.9315          | 0.8295   | 0.8253 |
| 0.0778        | 9.0   | 2367 | 1.1370          | 0.8182   | 0.8091 |
| 0.0335        | 10.0  | 2630 | 1.0799          | 0.8466   | 0.8417 |
| 0.0335        | 11.0  | 2893 | 1.2487          | 0.8314   | 0.8270 |
| 0.0162        | 12.0  | 3156 | 1.2194          | 0.8295   | 0.8244 |
| 0.0162        | 13.0  | 3419 | 1.2592          | 0.8333   | 0.8313 |
| 0.0073        | 14.0  | 3682 | 1.2885          | 0.8258   | 0.8198 |
| 0.0073        | 15.0  | 3945 | 1.2133          | 0.8352   | 0.8292 |
| 0.0046        | 16.0  | 4208 | 1.2625          | 0.8409   | 0.8343 |
| 0.0046        | 17.0  | 4471 | 1.2498          | 0.8409   | 0.8356 |
| 0.0032        | 18.0  | 4734 | 1.3041          | 0.8390   | 0.8308 |
| 0.0032        | 19.0  | 4997 | 1.2544          | 0.8447   | 0.8389 |
| 0.0022        | 20.0  | 5260 | 1.2731          | 0.8466   | 0.8396 |
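
The accuracy and F1 values above are consistent with a compute_metrics callback built on the evaluate library; the sketch below is one such formulation, with the F1 averaging mode ("weighted" here) an explicit assumption since the card does not state it.

```python
# Hedged sketch of a compute_metrics function producing accuracy/F1 values like
# those reported above. Assumptions: the `evaluate` library is used, and F1 is
# averaged with "weighted" (the averaging mode is not documented).
import numpy as np
import evaluate

accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_metric.compute(predictions=predictions, references=labels)["accuracy"],
        "f1": f1_metric.compute(predictions=predictions, references=labels, average="weighted")["f1"],
    }
```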

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.1