---
base_model: dccuchile/bert-base-spanish-wwm-cased
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: FNST_trad_j
    results: []
---

# FNST_trad_j

This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-cased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 2.6540
- Accuracy: 0.6525
- F1: 0.6178
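
Since the training data and label set are not documented in this card, the snippet below is only a minimal loading sketch: the repository id `mrovejaxd/FNST_trad_j` is inferred from this card, and the meaning of the predicted labels is not documented.

```python
from transformers import pipeline

# Assumed repository id, inferred from this card; adjust if the model is hosted elsewhere.
classifier = pipeline("text-classification", model="mrovejaxd/FNST_trad_j")

# The card does not document the label set, so interpret the returned labels accordingly.
print(classifier("Este es un ejemplo de frase en español."))
```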

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 32
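
As a rough reconstruction, these settings map onto a `transformers` `TrainingArguments` object as sketched below. The output directory and the per-epoch evaluation strategy are assumptions, and the Adam betas/epsilon listed above are the `Trainer` defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Sketch of the configuration above; "FNST_trad_j" as output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="FNST_trad_j",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=32,
    evaluation_strategy="epoch",  # assumption: the results table reports per-epoch evaluation
)
```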

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| 1.1058        | 1.0   | 1500  | 1.0564          | 0.5442   | 0.3843 |
| 0.9559        | 2.0   | 3000  | 0.9522          | 0.585    | 0.5503 |
| 0.8789        | 3.0   | 4500  | 0.8843          | 0.61     | 0.5733 |
| 0.8292        | 4.0   | 6000  | 0.8614          | 0.6167   | 0.5734 |
| 0.7807        | 5.0   | 7500  | 0.8519          | 0.62     | 0.5896 |
| 0.7559        | 6.0   | 9000  | 0.8648          | 0.6283   | 0.5965 |
| 0.7098        | 7.0   | 10500 | 0.8579          | 0.63     | 0.5961 |
| 0.6703        | 8.0   | 12000 | 0.8536          | 0.6417   | 0.6029 |
| 0.6114        | 9.0   | 13500 | 0.8686          | 0.6358   | 0.5997 |
| 0.611         | 10.0  | 15000 | 0.8948          | 0.6342   | 0.6045 |
| 0.5614        | 11.0  | 16500 | 0.9173          | 0.6342   | 0.6046 |
| 0.515         | 12.0  | 18000 | 0.9289          | 0.6425   | 0.6089 |
| 0.5107        | 13.0  | 19500 | 0.9581          | 0.64     | 0.6052 |
| 0.4691        | 14.0  | 21000 | 1.0099          | 0.6433   | 0.6091 |
| 0.4476        | 15.0  | 22500 | 1.0543          | 0.6458   | 0.6108 |
| 0.398         | 16.0  | 24000 | 1.1170          | 0.6425   | 0.6051 |
| 0.3828        | 17.0  | 25500 | 1.1585          | 0.6517   | 0.6102 |
| 0.3567        | 18.0  | 27000 | 1.2252          | 0.6475   | 0.6114 |
| 0.3334        | 19.0  | 28500 | 1.2827          | 0.6675   | 0.6317 |
| 0.2982        | 20.0  | 30000 | 1.4256          | 0.6517   | 0.6257 |
| 0.2734        | 21.0  | 31500 | 1.4591          | 0.6583   | 0.6305 |
| 0.2556        | 22.0  | 33000 | 1.5516          | 0.66     | 0.6263 |
| 0.2409        | 23.0  | 34500 | 1.6793          | 0.6592   | 0.6219 |
| 0.2226        | 24.0  | 36000 | 1.8157          | 0.66     | 0.6218 |
| 0.1971        | 25.0  | 37500 | 1.9089          | 0.6575   | 0.6241 |
| 0.1832        | 26.0  | 39000 | 2.0406          | 0.6558   | 0.6300 |
| 0.1921        | 27.0  | 40500 | 2.1448          | 0.6583   | 0.6254 |
| 0.1496        | 28.0  | 42000 | 2.2888          | 0.6458   | 0.6136 |
| 0.1538        | 29.0  | 43500 | 2.3520          | 0.66     | 0.6241 |
| 0.1558        | 30.0  | 45000 | 2.4748          | 0.6492   | 0.6207 |
| 0.1409        | 31.0  | 46500 | 2.5126          | 0.6542   | 0.6175 |
| 0.119         | 32.0  | 48000 | 2.6540          | 0.6525   | 0.6178 |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1