# ABL_trad_k

This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-cased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 2.8641
- Accuracy: 0.6842
- F1: 0.6826
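
The accuracy/F1 metrics suggest a sequence-classification fine-tune of BETO, but the card does not document the task or the label set. Under that assumption, a minimal loading sketch (any returned `LABEL_i` names are generic placeholders):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Assumption: text classification. The model card does not document the task,
# the dataset, or the labels, so generic LABEL_i names may be returned.
model_id = "mrovejaxd/ABL_trad_k"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Texto de ejemplo en español."))
# e.g. [{'label': 'LABEL_0', 'score': ...}]
```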

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-06
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 36
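
Assuming a standard `transformers` `Trainer` run (the training script is not included in the card), the list above maps onto `TrainingArguments` roughly as follows; the Adam betas and epsilon listed are the library defaults:

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters. output_dir and the
# per-epoch evaluation strategy are assumptions, not documented in the card.
training_args = TrainingArguments(
    output_dir="ABL_trad_k",
    learning_rate=1e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=36,
    adam_beta1=0.9,      # optimizer defaults, matching the list above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: the results table reports one eval per epoch
)
```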

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
| 0.9411        | 1.0   | 1500  | 0.9017          | 0.5792   | 0.5764 |
| 0.8328        | 2.0   | 3000  | 0.8458          | 0.62     | 0.6188 |
| 0.8006        | 3.0   | 4500  | 0.8183          | 0.64     | 0.6391 |
| 0.7283        | 4.0   | 6000  | 0.8154          | 0.6442   | 0.6430 |
| 0.7006        | 5.0   | 7500  | 0.7978          | 0.6492   | 0.6487 |
| 0.6555        | 6.0   | 9000  | 0.8009          | 0.6542   | 0.6536 |
| 0.6263        | 7.0   | 10500 | 0.8033          | 0.6617   | 0.6612 |
| 0.5805        | 8.0   | 12000 | 0.8155          | 0.6658   | 0.6657 |
| 0.5385        | 9.0   | 13500 | 0.8608          | 0.675    | 0.6729 |
| 0.5108        | 10.0  | 15000 | 0.8545          | 0.6733   | 0.6732 |
| 0.4791        | 11.0  | 16500 | 0.8950          | 0.6758   | 0.6750 |
| 0.4423        | 12.0  | 18000 | 0.9145          | 0.6792   | 0.6790 |
| 0.4295        | 13.0  | 19500 | 0.9497          | 0.6708   | 0.6707 |
| 0.3782        | 14.0  | 21000 | 1.0309          | 0.6742   | 0.6734 |
| 0.3656        | 15.0  | 22500 | 1.0706          | 0.6783   | 0.6774 |
| 0.3312        | 16.0  | 24000 | 1.1327          | 0.6733   | 0.6730 |
| 0.3008        | 17.0  | 25500 | 1.1870          | 0.6825   | 0.6822 |
| 0.2851        | 18.0  | 27000 | 1.3284          | 0.685    | 0.6847 |
| 0.2636        | 19.0  | 28500 | 1.4260          | 0.6858   | 0.6851 |
| 0.2752        | 20.0  | 30000 | 1.4733          | 0.6833   | 0.6833 |
| 0.2231        | 21.0  | 31500 | 1.6163          | 0.68     | 0.6800 |
| 0.2052        | 22.0  | 33000 | 1.7674          | 0.6792   | 0.6786 |
| 0.198         | 23.0  | 34500 | 1.8474          | 0.6833   | 0.6827 |
| 0.1854        | 24.0  | 36000 | 1.9509          | 0.6775   | 0.6766 |
| 0.1944        | 25.0  | 37500 | 2.0660          | 0.68     | 0.6790 |
| 0.1649        | 26.0  | 39000 | 2.1718          | 0.6825   | 0.6812 |
| 0.1443        | 27.0  | 40500 | 2.3664          | 0.68     | 0.6791 |
| 0.1251        | 28.0  | 42000 | 2.4144          | 0.6833   | 0.6827 |
| 0.1357        | 29.0  | 43500 | 2.4407          | 0.6875   | 0.6875 |
| 0.1279        | 30.0  | 45000 | 2.4419          | 0.6933   | 0.6932 |
| 0.1112        | 31.0  | 46500 | 2.5989          | 0.6833   | 0.6828 |
| 0.098         | 32.0  | 48000 | 2.6390          | 0.68     | 0.6792 |
| 0.0864        | 33.0  | 49500 | 2.7293          | 0.6792   | 0.6780 |
| 0.0897        | 34.0  | 51000 | 2.7814          | 0.6833   | 0.6820 |
| 0.0869        | 35.0  | 52500 | 2.8468          | 0.68     | 0.6787 |
| 0.0834        | 36.0  | 54000 | 2.8641          | 0.6842   | 0.6826 |

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1
