
emotions_bert

This model is a fine-tuned version of bert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the results list):

  • Loss: 0.5151
  • F1 Micro: 0.6887
  • F1 Macro: 0.6024
  • Accuracy: 0.1929
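
The exact-match accuracy is much lower than the F1 scores, which is consistent with a multi-label classification setup in which a prediction only counts as correct when every label matches. The snippet below is a minimal usage sketch under that assumption; the checkpoint path, the 0.5 sigmoid threshold, and the multi-label head are assumptions, since the card does not document the label set.

```python
# Minimal inference sketch, assuming a multi-label sequence-classification head.
# The checkpoint path and the 0.5 threshold are assumptions, not from this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "emotions_bert"  # replace with the actual Hub id or local path
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)
model.eval()

inputs = tokenizer("I can't believe how well this turned out!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label: apply a sigmoid per label and keep everything above the threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```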

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
  • mixed_precision_training: Native AMP
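
For reference, these settings map onto a Hugging Face TrainingArguments configuration roughly as follows. This is a reconstruction, not the original training script: the output directory is a placeholder, the single-device batch-size assumption is mine, and only the values listed above come from the card.

```python
# Sketch of a TrainingArguments setup matching the listed hyperparameters.
# output_dir is a placeholder; anything not listed above keeps its default.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="emotions_bert",        # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=128,   # assumes single-device training
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                         # "Native AMP" mixed-precision training
)
```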

Training results

Training Loss | Epoch  | Step | Validation Loss | F1 Micro | F1 Macro | Accuracy
0.7549        | 0.4082 | 20   | 0.6455          | 0.6125   | 0.4264   | 0.1243
0.6144        | 0.8163 | 40   | 0.5675          | 0.6510   | 0.5188   | 0.1670
0.5496        | 1.2245 | 60   | 0.5414          | 0.6747   | 0.5570   | 0.1883
0.4878        | 1.6327 | 80   | 0.5191          | 0.6849   | 0.5894   | 0.2104
0.4754        | 2.0408 | 100  | 0.5140          | 0.6810   | 0.5909   | 0.2013
0.4027        | 2.4490 | 120  | 0.5169          | 0.6849   | 0.5880   | 0.2207
0.3986        | 2.8571 | 140  | 0.5151          | 0.6887   | 0.6024   | 0.1929
0.3711        | 3.2653 | 160  | 0.5187          | 0.6820   | 0.5991   | 0.2188
0.325         | 3.6735 | 180  | 0.5263          | 0.6753   | 0.5928   | 0.1942
0.3303        | 4.0816 | 200  | 0.5294          | 0.6900   | 0.5949   | 0.2149
0.2801        | 4.4898 | 220  | 0.5420          | 0.6840   | 0.5953   | 0.2097
0.2748        | 4.8980 | 240  | 0.5583          | 0.6797   | 0.5861   | 0.2162
0.2452        | 5.3061 | 260  | 0.5781          | 0.6758   | 0.5871   | 0.1981
0.2253        | 5.7143 | 280  | 0.5889          | 0.6715   | 0.5812   | 0.1929
0.226         | 6.1224 | 300  | 0.5955          | 0.6793   | 0.5852   | 0.2207
0.1958        | 6.5306 | 320  | 0.6120          | 0.6734   | 0.5861   | 0.2032
0.1952        | 6.9388 | 340  | 0.6209          | 0.6744   | 0.5806   | 0.2084
0.1758        | 7.3469 | 360  | 0.6339          | 0.6756   | 0.5789   | 0.2136
0.1691        | 7.7551 | 380  | 0.6412          | 0.6773   | 0.5779   | 0.2188
0.1613        | 8.1633 | 400  | 0.6431          | 0.6761   | 0.5794   | 0.2142
0.1486        | 8.5714 | 420  | 0.6532          | 0.6718   | 0.5763   | 0.2104
0.1529        | 8.9796 | 440  | 0.6577          | 0.6737   | 0.5747   | 0.2136
0.1436        | 9.3878 | 460  | 0.6658          | 0.6734   | 0.5744   | 0.2194
0.1399        | 9.7959 | 480  | 0.6640          | 0.6735   | 0.5745   | 0.2188
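
To make the reported metrics concrete, the sketch below computes micro F1, macro F1, and exact-match (subset) accuracy on toy multi-label predictions. The label arrays are invented for illustration; only the metric definitions correspond to the columns in the table above.

```python
# Toy illustration of micro F1, macro F1, and exact-match accuracy for
# multi-label predictions. The arrays are made up; only the metric
# definitions match what the results table reports.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 1, 1]])

print("F1 micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 macro:", f1_score(y_true, y_pred, average="macro"))
# accuracy_score on multi-label indicator arrays is exact-match accuracy:
# a sample counts only if every label is predicted correctly, which is why
# accuracy can sit far below F1.
print("Accuracy:", accuracy_score(y_true, y_pred))
```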

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1