CTMAE-P2-V4-S1

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7784
  • Accuracy: 0.8696
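The card does not include a usage snippet; the following is a minimal, hedged sketch of how a VideoMAE classification checkpoint like this one is typically used with the `transformers` API. The repository id `beingbatman/CTMAE-P2-V4-S1` and the two-label head are assumptions not confirmed by this card. To stay runnable offline, the sketch instantiates a tiny randomly initialized VideoMAE purely to illustrate the expected input/output shapes; real inference would load the fine-tuned weights with `from_pretrained`.

```python
import torch
from transformers import VideoMAEConfig, VideoMAEForVideoClassification

# Tiny randomly initialized config, only to illustrate tensor shapes offline.
# For real predictions you would instead load the fine-tuned checkpoint, e.g.:
#   model = VideoMAEForVideoClassification.from_pretrained("beingbatman/CTMAE-P2-V4-S1")
config = VideoMAEConfig(
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    num_labels=2,  # assumed binary head; the card does not list the label set
)
model = VideoMAEForVideoClassification(config)
model.eval()

# VideoMAE expects a clip of 16 frames of 3x224x224 pixels, laid out as
# (batch, num_frames, channels, height, width).
video = torch.randn(1, config.num_frames, 3, 224, 224)

with torch.no_grad():
    logits = model(pixel_values=video).logits

print(logits.shape)  # torch.Size([1, 2])
predicted = logits.argmax(-1).item()
```

In practice the frames would come from a decoded video and be normalized by the matching image processor (`VideoMAEImageProcessor.from_pretrained` on the same repo) rather than sampled at random.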

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6500
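The warmup ratio and step count above imply a warmup window of 0.1 × 6500 = 650 steps. As a sketch, here is the learning-rate curve these hyperparameters describe, assuming the standard linear warmup-then-linear-decay schedule (as in Hugging Face's `get_linear_schedule_with_warmup`); this is an illustration, not the training code itself.

```python
# Linear warmup + linear decay schedule implied by the hyperparameters above.
BASE_LR = 1e-05
TRAINING_STEPS = 6500
WARMUP_RATIO = 0.1
WARMUP_STEPS = int(TRAINING_STEPS * WARMUP_RATIO)  # 650 steps

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps."""
    if step < WARMUP_STEPS:
        # Linear ramp from 0 up to BASE_LR over the warmup window.
        return BASE_LR * step / WARMUP_STEPS
    # Linear decay from BASE_LR back down to 0 at TRAINING_STEPS.
    return BASE_LR * max(0.0, (TRAINING_STEPS - step) / (TRAINING_STEPS - WARMUP_STEPS))

print(lr_at(0))     # 0.0
print(lr_at(650))   # 1e-05 (peak, end of warmup)
print(lr_at(6500))  # 0.0
```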

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.6464        | 0.0202  | 131  | 0.8455          | 0.5435   |
| 0.4219        | 1.0202  | 262  | 2.1245          | 0.5435   |
| 1.1348        | 2.0202  | 393  | 2.1033          | 0.5435   |
| 0.7407        | 3.0202  | 524  | 0.8223          | 0.5435   |
| 1.2791        | 4.0202  | 655  | 1.5891          | 0.5435   |
| 0.8313        | 5.0202  | 786  | 1.4843          | 0.5435   |
| 1.0405        | 6.0202  | 917  | 1.5994          | 0.5435   |
| 1.5845        | 7.0202  | 1048 | 1.0656          | 0.5435   |
| 0.7645        | 8.0202  | 1179 | 1.7169          | 0.5435   |
| 0.9242        | 9.0202  | 1310 | 1.1991          | 0.5435   |
| 0.9458        | 10.0202 | 1441 | 1.6688          | 0.5435   |
| 0.7963        | 11.0202 | 1572 | 1.4274          | 0.5435   |
| 0.6725        | 12.0202 | 1703 | 0.6368          | 0.5652   |
| 0.6568        | 13.0202 | 1834 | 1.1090          | 0.5435   |
| 1.0769        | 14.0202 | 1965 | 1.7989          | 0.5435   |
| 0.2657        | 15.0202 | 2096 | 0.7522          | 0.6304   |
| 0.6413        | 16.0202 | 2227 | 1.5062          | 0.5435   |
| 0.7469        | 17.0202 | 2358 | 0.9307          | 0.5870   |
| 0.6873        | 18.0202 | 2489 | 0.4671          | 0.7174   |
| 0.8322        | 19.0202 | 2620 | 1.0945          | 0.6522   |
| 0.3899        | 20.0202 | 2751 | 0.8506          | 0.6739   |
| 0.8377        | 21.0202 | 2882 | 0.8038          | 0.6957   |
| 0.7089        | 22.0202 | 3013 | 0.8173          | 0.6522   |
| 0.6477        | 23.0202 | 3144 | 0.5239          | 0.7826   |
| 0.7071        | 24.0202 | 3275 | 0.7577          | 0.7826   |
| 0.468         | 25.0202 | 3406 | 1.1352          | 0.7174   |
| 0.6235        | 26.0202 | 3537 | 0.7960          | 0.7609   |
| 0.2239        | 27.0202 | 3668 | 1.2848          | 0.6957   |
| 0.3065        | 28.0202 | 3799 | 1.0995          | 0.7391   |
| 0.5603        | 29.0202 | 3930 | 0.8299          | 0.7391   |
| 0.4514        | 30.0202 | 4061 | 0.5079          | 0.8261   |
| 0.3525        | 31.0202 | 4192 | 0.6861          | 0.7174   |
| 0.0011        | 32.0202 | 4323 | 1.1933          | 0.7391   |
| 0.4758        | 33.0202 | 4454 | 0.8103          | 0.7826   |
| 0.5178        | 34.0202 | 4585 | 0.8117          | 0.7826   |
| 0.3839        | 35.0202 | 4716 | 0.7044          | 0.8261   |
| 0.0296        | 36.0202 | 4847 | 0.7980          | 0.8043   |
| 0.6077        | 37.0202 | 4978 | 0.8048          | 0.8478   |
| 0.0009        | 38.0202 | 5109 | 0.7784          | 0.8696   |
| 0.6204        | 39.0202 | 5240 | 0.8538          | 0.8261   |
| 0.0006        | 40.0202 | 5371 | 1.0553          | 0.7609   |
| 0.2712        | 41.0202 | 5502 | 0.9511          | 0.8261   |
| 0.0579        | 42.0202 | 5633 | 0.9046          | 0.8478   |
| 0.0003        | 43.0202 | 5764 | 1.5329          | 0.6957   |
| 0.0002        | 44.0202 | 5895 | 1.0590          | 0.7826   |
| 0.4781        | 45.0202 | 6026 | 0.8769          | 0.8478   |
| 0.0706        | 46.0202 | 6157 | 0.9025          | 0.8261   |
| 0.0024        | 47.0202 | 6288 | 0.8750          | 0.8261   |
| 0.0153        | 48.0202 | 6419 | 0.8643          | 0.8696   |
| 0.0004        | 49.0125 | 6500 | 0.8715          | 0.8478   |

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0

Model size

  • 304M params (Safetensors, F32 tensors)
Model tree for beingbatman/CTMAE-P2-V4-S1

Fine-tuned from MCG-NJU/videomae-large-finetuned-kinetics.