
# videomae-base-finetuned-engine-subset-20230313

This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.8913
- Accuracy: 0.6745
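
The checkpoint can be loaded for inference through the standard Hugging Face Transformers video-classification classes. Below is a minimal sketch on a dummy clip; the checkpoint path, the 16-frame clip length, and the 224×224 resolution are assumptions based on the VideoMAE base configuration and should be adjusted to the actual checkpoint and data.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Assumed checkpoint id; replace with the actual Hub repo or local directory.
ckpt = "videomae-base-finetuned-engine-subset-20230313"

processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)

# VideoMAE base expects 16 RGB frames per clip; a random 224x224 stand-in here.
video = list(np.random.randint(0, 256, (16, 224, 224, 3), dtype=np.uint8))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```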

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 1110
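
As a reference point, these values map onto Hugging Face `TrainingArguments` roughly as sketched below. The `output_dir` and `evaluation_strategy` are assumptions (the per-epoch rows in the results table suggest epoch-level evaluation), and the listed Adam betas and epsilon are the optimizer defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="videomae-base-finetuned-engine-subset-20230313",  # assumed
    learning_rate=5e-05,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=1110,
    evaluation_strategy="epoch",  # assumed from the per-epoch results below
)
```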

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.6212        | 0.03  | 38   | 2.3629          | 0.3774   |
| 2.455         | 1.03  | 76   | 2.3674          | 0.2170   |
| 2.4311        | 2.03  | 114  | 2.2191          | 0.3231   |
| 2.2768        | 3.03  | 152  | 2.1227          | 0.3608   |
| 1.7528        | 4.03  | 190  | 1.7296          | 0.4363   |
| 1.5381        | 5.03  | 228  | 1.5016          | 0.4340   |
| 1.407         | 6.03  | 266  | 1.2878          | 0.5448   |
| 1.1053        | 7.03  | 304  | 1.5210          | 0.4009   |
| 1.0893        | 8.03  | 342  | 1.3902          | 0.4623   |
| 0.8136        | 9.03  | 380  | 1.6456          | 0.4033   |
| 0.9565        | 10.03 | 418  | 1.1826          | 0.5613   |
| 1.0147        | 11.03 | 456  | 1.2099          | 0.5118   |
| 0.9125        | 12.03 | 494  | 1.1850          | 0.5495   |
| 0.7091        | 13.03 | 532  | 1.2324          | 0.5354   |
| 0.7361        | 14.03 | 570  | 1.0225          | 0.6226   |
| 0.6979        | 15.03 | 608  | 1.0738          | 0.5590   |
| 0.5265        | 16.03 | 646  | 1.1062          | 0.5873   |
| 0.5651        | 17.03 | 684  | 1.1402          | 0.5802   |
| 0.7182        | 18.03 | 722  | 1.0974          | 0.5802   |
| 0.6582        | 19.03 | 760  | 1.0529          | 0.6179   |
| 0.5709        | 20.03 | 798  | 0.9655          | 0.6344   |
| 0.4808        | 21.03 | 836  | 1.0441          | 0.6226   |
| 0.5816        | 22.03 | 874  | 0.9445          | 0.6439   |
| 0.5057        | 23.03 | 912  | 1.0248          | 0.6321   |
| 0.6253        | 24.03 | 950  | 0.9518          | 0.6604   |
| 0.6841        | 25.03 | 988  | 0.8913          | 0.6745   |
| 0.5933        | 26.03 | 1026 | 0.9013          | 0.6439   |
| 0.389         | 27.03 | 1064 | 0.9090          | 0.6627   |
| 0.3705        | 28.03 | 1102 | 0.8936          | 0.6722   |
| 0.6043        | 29.01 | 1110 | 0.8942          | 0.6722   |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.12.1+cu113
- Datasets 2.10.1
- Tokenizers 0.13.2