
Hierarchical_Agent_Action

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the agent_action_class dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5942
  • Accuracy: 0.8403
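
The snippet below is a minimal inference sketch using the standard Transformers image-classification API. The repository id ("your-username/Hierarchical_Agent_Action") and the image path are placeholders, not values taken from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder hub id -- replace with the actual repository id of this model.
model_id = "your-username/Hierarchical_Agent_Action"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# Load any RGB image of an agent action (path is a placeholder).
image = Image.open("example_action.jpg").convert("RGB")

# Preprocess to the 224x224 ViT input format and run a forward pass.
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```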

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
  • mixed_precision_training: Native AMP
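
For reference, these values map onto the Transformers Trainer API roughly as sketched below. This is a reconstruction from the hyperparameter list above, not the original training script; the output directory name is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
training_args = TrainingArguments(
    output_dir="hierarchical_agent_action",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,  # Native AMP mixed-precision training
)
```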

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
------------- | ----- | ---- | --------------- | --------
2.4407        | 0.81  | 100  | 2.2716          | 0.6058
1.7756        | 1.61  | 200  | 1.6162          | 0.7065
1.3948        | 2.42  | 300  | 1.2200          | 0.7698
1.131         | 3.23  | 400  | 1.0012          | 0.7856
0.9239        | 4.03  | 500  | 0.9055          | 0.7827
0.8699        | 4.84  | 600  | 0.8103          | 0.7827
0.6707        | 5.65  | 700  | 0.7610          | 0.7842
0.6206        | 6.45  | 800  | 0.7312          | 0.7885
0.5795        | 7.26  | 900  | 0.6989          | 0.8101
0.4914        | 8.06  | 1000 | 0.7066          | 0.7813
0.5087        | 8.87  | 1100 | 0.6398          | 0.8187
0.4373        | 9.68  | 1200 | 0.6293          | 0.8043
0.4365        | 10.48 | 1300 | 0.6726          | 0.7971
0.4517        | 11.29 | 1400 | 0.6047          | 0.8245
0.4114        | 12.1  | 1500 | 0.6088          | 0.8230
0.426         | 12.9  | 1600 | 0.6165          | 0.8201
0.3456        | 13.71 | 1700 | 0.6133          | 0.8259
0.332         | 14.52 | 1800 | 0.6736          | 0.8201
0.3646        | 15.32 | 1900 | 0.6406          | 0.8173
0.3287        | 16.13 | 2000 | 0.6978          | 0.7971
0.2793        | 16.94 | 2100 | 0.6433          | 0.8173
0.2924        | 17.74 | 2200 | 0.6474          | 0.8144
0.2605        | 18.55 | 2300 | 0.6279          | 0.8288
0.2016        | 19.35 | 2400 | 0.6361          | 0.8216
0.2524        | 20.16 | 2500 | 0.6394          | 0.8259
0.2017        | 20.97 | 2600 | 0.6683          | 0.8158
0.2082        | 21.77 | 2700 | 0.6389          | 0.8345
0.2751        | 22.58 | 2800 | 0.6141          | 0.8374
0.207         | 23.39 | 2900 | 0.6052          | 0.8259
0.1791        | 24.19 | 3000 | 0.6332          | 0.8230
0.1719        | 25.0  | 3100 | 0.5942          | 0.8403
0.1685        | 25.81 | 3200 | 0.6121          | 0.8360
0.1557        | 26.61 | 3300 | 0.6237          | 0.8345
0.1694        | 27.42 | 3400 | 0.6372          | 0.8317
0.1927        | 28.23 | 3500 | 0.6378          | 0.8273
0.1375        | 29.03 | 3600 | 0.6258          | 0.8331
0.1653        | 29.84 | 3700 | 0.6262          | 0.8331
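
The accuracy column above was presumably computed over argmax predictions on the validation set; a typical compute_metrics hook for the Trainer looks like the sketch below. This is a reconstruction, not the original evaluation code.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair produced by the Trainer.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```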

Framework versions

  • Transformers 4.35.0
  • PyTorch 2.0.0
  • Datasets 2.1.0
  • Tokenizers 0.14.1
