---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_keras_callback
model-index:
  - name: NabeelShar/emotions_classifier
    results: []
---

# NabeelShar/emotions_classifier

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 1.1146
- Validation Loss: 1.6637
- Train Accuracy: 0.3625
- Epoch: 49
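
Since the intended use is not documented yet, the snippet below is only a minimal sketch of how the checkpoint could be queried for single-image emotion classification with the TensorFlow classes from `transformers`. The image path is a placeholder, and the image processor is loaded from the base checkpoint on the assumption that this repository may not ship its own preprocessing config.

```python
import tensorflow as tf
from PIL import Image
from transformers import TFViTForImageClassification, ViTImageProcessor

# Processor taken from the base checkpoint (assumption: this repo may not
# include a preprocessing config of its own).
processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")
model = TFViTForImageClassification.from_pretrained("NabeelShar/emotions_classifier")

image = Image.open("face.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits
pred = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[pred])  # label names come from the model's config
```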

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 0.0003, 'decay_steps': 32000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
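
For readers who want to reproduce this setup, the serialized optimizer above maps onto the `AdamWeightDecay` class shipped with `transformers` and a Keras `PolynomialDecay` schedule. The sketch below rebuilds it from the listed values; it is not code from the original training run.

```python
import tensorflow as tf
from transformers import AdamWeightDecay

# Linear (power=1.0) decay from 3e-4 to 0.0 over 32,000 steps, as in the config above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=3e-4,
    decay_steps=32_000,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Adam with decoupled weight decay (rate 0.01) and the betas/epsilon listed above.
optimizer = AdamWeightDecay(
    learning_rate=lr_schedule,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-8,
)
```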

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 2.0910 | 2.0947 | 0.1062 | 0 |
| 2.0049 | 1.9103 | 0.2062 | 1 |
| 2.0473 | 1.9654 | 0.175 | 2 |
| 1.9824 | 2.1773 | 0.125 | 3 |
| 2.0538 | 2.0144 | 0.1875 | 4 |
| 1.9921 | 2.0826 | 0.1437 | 5 |
| 2.0904 | 2.0995 | 0.1812 | 6 |
| 2.0866 | 2.0908 | 0.1313 | 7 |
| 2.0718 | 2.0800 | 0.125 | 8 |
| 2.0511 | 2.0358 | 0.1938 | 9 |
| 1.9794 | 1.9049 | 0.2313 | 10 |
| 1.9289 | 1.8717 | 0.1875 | 11 |
| 1.8696 | 1.8451 | 0.2062 | 12 |
| 1.8361 | 1.8010 | 0.2062 | 13 |
| 1.8122 | 1.7457 | 0.225 | 14 |
| 1.7571 | 1.7331 | 0.25 | 15 |
| 1.6846 | 1.8783 | 0.25 | 16 |
| 1.6954 | 1.8015 | 0.25 | 17 |
| 1.7414 | 1.7329 | 0.1625 | 18 |
| 1.6662 | 1.6900 | 0.2625 | 19 |
| 1.6322 | 1.7607 | 0.25 | 20 |
| 1.5822 | 1.6670 | 0.3063 | 21 |
| 1.6279 | 1.6800 | 0.3 | 22 |
| 1.5737 | 1.7843 | 0.25 | 23 |
| 1.5851 | 1.6927 | 0.2875 | 24 |
| 1.4926 | 1.6640 | 0.2687 | 25 |
| 1.4879 | 1.7408 | 0.2812 | 26 |
| 1.5564 | 1.6668 | 0.275 | 27 |
| 1.5093 | 1.6259 | 0.3187 | 28 |
| 1.4428 | 1.6973 | 0.2437 | 29 |
| 1.4328 | 1.6412 | 0.2812 | 30 |
| 1.3778 | 1.6470 | 0.3187 | 31 |
| 1.4635 | 1.6411 | 0.325 | 32 |
| 1.4044 | 1.6643 | 0.2938 | 33 |
| 1.2991 | 1.6864 | 0.2875 | 34 |
| 1.3467 | 1.6124 | 0.2687 | 35 |
| 1.3422 | 1.6517 | 0.2687 | 36 |
| 1.3998 | 1.5634 | 0.325 | 37 |
| 1.2963 | 1.7403 | 0.3 | 38 |
| 1.3050 | 1.7550 | 0.3187 | 39 |
| 1.2988 | 1.6917 | 0.3438 | 40 |
| 1.2601 | 1.6739 | 0.3125 | 41 |
| 1.1943 | 1.7200 | 0.35 | 42 |
| 1.2663 | 1.6505 | 0.3312 | 43 |
| 1.2228 | 1.7337 | 0.3312 | 44 |
| 1.1413 | 1.7777 | 0.2812 | 45 |
| 1.1429 | 1.7138 | 0.3375 | 46 |
| 1.0760 | 1.7160 | 0.3187 | 47 |
| 1.1625 | 1.8049 | 0.3063 | 48 |
| 1.1146 | 1.6637 | 0.3625 | 49 |
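
The per-epoch values above are the kind of metrics logged by the Keras callbacks implied by the `generated_from_keras_callback` tag. The sketch below shows one plausible training loop under that assumption; the dataset objects, batch layout, and number of emotion labels are not documented here and are placeholders only.

```python
import tensorflow as tf
from transformers import AdamWeightDecay, TFViTForImageClassification
from transformers.keras_callbacks import PushToHubCallback

model = TFViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k",
    num_labels=8,  # assumption: the real number of emotion classes is undocumented
)

# Optimizer rebuilt from the hyperparameters section above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(3e-4, 32_000, end_learning_rate=0.0)
optimizer = AdamWeightDecay(learning_rate=lr_schedule, weight_decay_rate=0.01)

# Compiling without an explicit loss makes the transformers model fall back to
# its built-in classification loss.
model.compile(optimizer=optimizer, metrics=["accuracy"])

# train_set / val_set: assumed tf.data.Dataset objects yielding
# {"pixel_values": ..., "labels": ...} batches; not part of this card.
model.fit(
    train_set,
    validation_data=val_set,
    epochs=50,  # epochs 0-49, matching the table above
    callbacks=[PushToHubCallback(output_dir="emotions_classifier")],
)
```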

### Framework versions

- Transformers 4.33.1
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3