
MAE-CT-CPC-Dicotomized-v7-tricot

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 5.4223
  • Accuracy: 0.3077

Model description

More information needed

Intended uses & limitations

More information needed
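In lieu of official usage notes, the sketch below shows the standard VideoMAE video-classification inference path this checkpoint would follow. It is a hypothetical, self-contained sketch: instead of downloading the real weights (`VideoMAEForVideoClassification.from_pretrained(ckpt)`), it builds a tiny randomly initialized model just to demonstrate the expected input/output shapes, and the `num_labels=3` value is an inference from the initial training loss (≈ 1.0989 ≈ ln 3), not documented by the authors.

```python
import torch
from transformers import VideoMAEConfig, VideoMAEForVideoClassification

ckpt = "beingbatman/MAE-CT-CPC-Dicotomized-v7-tricot"
# Real inference would load the fine-tuned weights instead:
#   model = VideoMAEForVideoClassification.from_pretrained(ckpt)
# A small randomly initialized config keeps this sketch self-contained.
config = VideoMAEConfig(
    image_size=224,
    num_frames=16,
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=3,  # assumption: initial train loss ~ ln(3) suggests 3 classes
)
model = VideoMAEForVideoClassification(config)
model.eval()

# VideoMAE expects pixel_values of shape (batch, frames, channels, height, width)
pixel_values = torch.randn(1, 16, 3, 224, 224)
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits
print(logits.shape)  # torch.Size([1, 3])
```

With the real checkpoint loaded, `model.config.id2label[logits.argmax(-1).item()]` would map the top logit back to a class name.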

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 7900
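The schedule implied by these hyperparameters (linear decay with a 10% warmup ratio over 7,900 steps, so 790 warmup steps) can be traced in isolation. The sketch below is an illustration, not the original training script: it uses a single dummy parameter and PyTorch's AdamW (the `Trainer` default, which matches the listed betas and epsilon) purely to plot the learning-rate curve.

```python
import torch
from transformers import get_linear_schedule_with_warmup

total_steps = 7900
warmup_steps = int(0.1 * total_steps)  # 790, from lr_scheduler_warmup_ratio

# One dummy parameter; we only care about the scheduler's lr trajectory.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.AdamW(params, lr=1e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)

# LR ramps linearly 0 -> 1e-5 over the first 790 steps, then decays to 0.
lrs = []
for _ in range(total_steps):
    optimizer.step()
    scheduler.step()
    lrs.append(scheduler.get_last_lr()[0])
print(max(lrs))  # 1e-05, reached at the end of warmup
```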

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.0989        | 0.0101  | 80   | 1.1031          | 0.3191   |
| 1.0889        | 1.0101  | 160  | 1.1058          | 0.3404   |
| 1.0739        | 2.0101  | 240  | 1.1233          | 0.4043   |
| 1.0036        | 3.0101  | 320  | 1.1596          | 0.2766   |
| 1.0706        | 4.0101  | 400  | 1.1731          | 0.2553   |
| 0.9669        | 5.0101  | 480  | 1.1228          | 0.3191   |
| 1.0233        | 6.0101  | 560  | 1.1490          | 0.4043   |
| 0.8492        | 7.0101  | 640  | 1.2636          | 0.3830   |
| 0.8842        | 8.0101  | 720  | 1.4061          | 0.3617   |
| 0.6599        | 9.0101  | 800  | 1.3445          | 0.2979   |
| 0.6723        | 10.0101 | 880  | 1.4072          | 0.3617   |
| 0.604         | 11.0101 | 960  | 1.4199          | 0.3617   |
| 0.4959        | 12.0101 | 1040 | 1.5689          | 0.3617   |
| 0.3758        | 13.0101 | 1120 | 1.7867          | 0.3617   |
| 0.6257        | 14.0101 | 1200 | 1.9218          | 0.3617   |
| 0.3693        | 15.0101 | 1280 | 2.0988          | 0.3191   |
| 0.5933        | 16.0101 | 1360 | 1.8413          | 0.4043   |
| 0.202         | 17.0101 | 1440 | 2.7537          | 0.3191   |
| 0.1454        | 18.0101 | 1520 | 2.4612          | 0.4255   |
| 0.1332        | 19.0101 | 1600 | 3.0944          | 0.3404   |
| 0.9193        | 20.0101 | 1680 | 2.8691          | 0.4043   |
| 0.1201        | 21.0101 | 1760 | 3.0564          | 0.4255   |
| 0.1716        | 22.0101 | 1840 | 3.3907          | 0.3404   |
| 0.0402        | 23.0101 | 1920 | 3.7917          | 0.3191   |
| 0.0709        | 24.0101 | 2000 | 3.5487          | 0.4043   |
| 0.1021        | 25.0101 | 2080 | 3.9004          | 0.4043   |
| 0.0029        | 26.0101 | 2160 | 4.1949          | 0.3617   |
| 0.1352        | 27.0101 | 2240 | 4.5038          | 0.3617   |
| 0.0173        | 28.0101 | 2320 | 3.9352          | 0.3830   |
| 0.0012        | 29.0101 | 2400 | 4.3234          | 0.4043   |
| 0.0007        | 30.0101 | 2480 | 4.2877          | 0.3830   |
| 0.2292        | 31.0101 | 2560 | 4.7297          | 0.3191   |
| 0.0004        | 32.0101 | 2640 | 4.4710          | 0.3830   |
| 0.0361        | 33.0101 | 2720 | 4.2391          | 0.4255   |
| 0.0002        | 34.0101 | 2800 | 4.2256          | 0.4043   |
| 0.0082        | 35.0101 | 2880 | 5.0734          | 0.3404   |
| 0.0318        | 36.0101 | 2960 | 4.0735          | 0.4255   |
| 0.0002        | 37.0101 | 3040 | 5.1464          | 0.2553   |
| 0.0003        | 38.0101 | 3120 | 4.6340          | 0.4043   |
| 0.48          | 39.0101 | 3200 | 4.3370          | 0.4255   |
| 0.0002        | 40.0101 | 3280 | 4.5820          | 0.3617   |
| 0.0002        | 41.0101 | 3360 | 5.0157          | 0.3191   |
| 0.1209        | 42.0101 | 3440 | 4.3109          | 0.3830   |
| 0.0001        | 43.0101 | 3520 | 4.4596          | 0.4043   |
| 0.0109        | 44.0101 | 3600 | 4.4251          | 0.3830   |
| 0.0001        | 45.0101 | 3680 | 5.2962          | 0.2979   |
| 0.1516        | 46.0101 | 3760 | 4.2314          | 0.4043   |
| 0.0001        | 47.0101 | 3840 | 4.0705          | 0.5319   |
| 0.0001        | 48.0101 | 3920 | 4.5586          | 0.4255   |
| 0.0266        | 49.0101 | 4000 | 4.9479          | 0.4043   |
| 0.0001        | 50.0101 | 4080 | 4.3270          | 0.4468   |
| 0.1307        | 51.0101 | 4160 | 4.7948          | 0.3830   |
| 0.0019        | 52.0101 | 4240 | 4.3638          | 0.3617   |
| 0.0001        | 53.0101 | 4320 | 4.5863          | 0.4255   |
| 0.0001        | 54.0101 | 4400 | 4.7373          | 0.4255   |
| 0.0006        | 55.0101 | 4480 | 3.9066          | 0.4468   |
| 0.0001        | 56.0101 | 4560 | 4.0314          | 0.4681   |
| 0.0001        | 57.0101 | 4640 | 4.0581          | 0.5106   |
| 0.0001        | 58.0101 | 4720 | 5.0045          | 0.3830   |
| 0.0001        | 59.0101 | 4800 | 4.0895          | 0.4255   |
| 0.0713        | 60.0101 | 4880 | 5.0429          | 0.4255   |
| 0.0017        | 61.0101 | 4960 | 4.7870          | 0.4255   |
| 0.0676        | 62.0101 | 5040 | 5.0957          | 0.3830   |
| 0.0           | 63.0101 | 5120 | 4.6062          | 0.4043   |
| 0.0045        | 64.0101 | 5200 | 5.2459          | 0.3830   |
| 0.0943        | 65.0101 | 5280 | 5.0856          | 0.3617   |
| 0.0002        | 66.0101 | 5360 | 4.4492          | 0.4894   |
| 0.0002        | 67.0101 | 5440 | 5.1795          | 0.4043   |
| 0.0007        | 68.0101 | 5520 | 4.3202          | 0.4681   |
| 0.1678        | 69.0101 | 5600 | 4.8688          | 0.4043   |
| 0.0001        | 70.0101 | 5680 | 5.2880          | 0.4043   |
| 0.0           | 71.0101 | 5760 | 5.1151          | 0.4255   |
| 0.0005        | 72.0101 | 5840 | 4.5667          | 0.4255   |
| 0.0           | 73.0101 | 5920 | 4.2883          | 0.4681   |
| 0.0           | 74.0101 | 6000 | 4.6848          | 0.4255   |
| 0.0           | 75.0101 | 6080 | 4.8157          | 0.4468   |
| 0.0           | 76.0101 | 6160 | 4.8248          | 0.4468   |
| 0.0           | 77.0101 | 6240 | 4.5636          | 0.4894   |
| 0.0           | 78.0101 | 6320 | 4.5817          | 0.4255   |
| 0.0001        | 79.0101 | 6400 | 4.7743          | 0.3830   |
| 0.0001        | 80.0101 | 6480 | 4.9000          | 0.4043   |
| 0.0002        | 81.0101 | 6560 | 4.7669          | 0.4255   |
| 0.0           | 82.0101 | 6640 | 4.8225          | 0.4468   |
| 0.0           | 83.0101 | 6720 | 4.8331          | 0.4468   |
| 0.0           | 84.0101 | 6800 | 4.7154          | 0.4468   |
| 0.0           | 85.0101 | 6880 | 4.7169          | 0.4468   |
| 0.0           | 86.0101 | 6960 | 4.9004          | 0.4255   |
| 0.0           | 87.0101 | 7040 | 4.9092          | 0.4255   |
| 0.0           | 88.0101 | 7120 | 4.8941          | 0.4255   |
| 0.0           | 89.0101 | 7200 | 4.7898          | 0.4255   |
| 0.0           | 90.0101 | 7280 | 4.8271          | 0.4468   |
| 0.0           | 91.0101 | 7360 | 4.8320          | 0.4468   |
| 0.0           | 92.0101 | 7440 | 4.8274          | 0.4468   |
| 0.0           | 93.0101 | 7520 | 4.8269          | 0.4468   |
| 0.0           | 94.0101 | 7600 | 4.8785          | 0.3830   |
| 0.0           | 95.0101 | 7680 | 4.9640          | 0.4255   |
| 0.0           | 96.0101 | 7760 | 4.9480          | 0.4255   |
| 0.0           | 97.0101 | 7840 | 4.9404          | 0.4255   |
| 0.0           | 98.0076 | 7900 | 4.9351          | 0.4255   |

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0

Model size

  • 304M parameters (Safetensors, F32 tensors)