ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-mdb-enst-2

This model is a fine-tuned version of gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-mdb-2 on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-MDB-ENST-2 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6379
  • Wer: 0.3792
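
Since the usage sections of this card are still placeholders, here is a minimal, hedged sketch of how a checkpoint like this is typically loaded and scored with the standard transformers CTC classes and the evaluate WER metric. The audio file and the reference transcription are illustrative placeholders, and the exact preprocessing and label vocabulary used during fine-tuning are not documented on this card.

```python
# Hedged sketch (not the verified training/eval pipeline): load the checkpoint
# as a CTC model and score one clip with WER. The file path and reference
# string are placeholders.
import torch
import librosa
import evaluate
from transformers import AutoProcessor, AutoModelForCTC

model_id = "gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-mdb-enst-2"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-base expects 16 kHz mono audio.
audio, _ = librosa.load("example_drums.wav", sr=16000, mono=True)  # placeholder file
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = torch.argmax(logits, dim=-1)
prediction = processor.batch_decode(pred_ids)[0]

# WER as reported on this card, computed against a reference transcription.
wer_metric = evaluate.load("wer")
reference = "placeholder reference transcription"
print(prediction, wer_metric.compute(predictions=[prediction], references=[reference]))
```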

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 30
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
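
For orientation, the hyperparameters above correspond roughly to the transformers TrainingArguments shown below. This is a hedged sketch, not the actual training script: the dataset preparation, CTC data collator, and metric function are not shown, and the per-epoch evaluation/save strategies are assumptions based on the results table. The total train batch size of 16 follows from 4 (per-device batch size) × 4 (gradient accumulation steps).

```python
# Hedged sketch: how the hyperparameters listed above would typically be
# expressed as transformers.TrainingArguments. Not the exact training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-mdb-enst-2",
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=4,   # 4 * 4 = total train batch size of 16
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=30,
    num_train_epochs=100.0,
    fp16=True,                       # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",     # assumption: the results table has one row per epoch
    save_strategy="epoch",           # assumption
)
```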

Training results

Training Loss | Epoch | Step | Validation Loss | Wer
1.4226 | 0.99 | 35 | 2.0435 | 0.4154
0.8744 | 1.99 | 70 | 1.7193 | 0.4382
0.9474 | 2.99 | 105 | 1.7853 | 0.4374
0.8316 | 3.99 | 140 | 1.2827 | 0.4306
0.8336 | 4.99 | 175 | 1.0676 | 0.4040
0.7345 | 5.99 | 210 | 1.5364 | 0.4264
0.6666 | 6.99 | 245 | 1.4284 | 0.4585
0.6677 | 7.99 | 280 | 0.9475 | 0.4003
0.6779 | 8.99 | 315 | 1.1172 | 0.4209
0.6503 | 9.99 | 350 | 0.8999 | 0.3834
0.6159 | 10.99 | 385 | 1.1501 | 0.4386
0.6831 | 11.99 | 420 | 1.0860 | 0.3825
0.5959 | 12.99 | 455 | 0.9410 | 0.4045
0.7154 | 13.99 | 490 | 1.0463 | 0.3821
0.6094 | 14.99 | 525 | 0.8598 | 0.3965
0.6929 | 15.99 | 560 | 0.9494 | 0.3931
0.7627 | 16.99 | 595 | 0.8060 | 0.3948
0.601 | 17.99 | 630 | 0.9890 | 0.3965
0.546 | 18.99 | 665 | 0.8059 | 0.3990
0.5222 | 19.99 | 700 | 0.6379 | 0.3792
0.5802 | 20.99 | 735 | 0.6995 | 0.3661
0.5731 | 21.99 | 770 | 0.8405 | 0.3606
0.5462 | 22.99 | 805 | 0.6667 | 0.3965
0.6057 | 23.99 | 840 | 0.8396 | 0.3762
0.5323 | 24.99 | 875 | 0.9054 | 0.3952
0.683 | 25.99 | 910 | 0.6898 | 0.4062
0.525 | 26.99 | 945 | 0.7245 | 0.3884
0.4885 | 27.99 | 980 | 0.8076 | 0.4049
0.4653 | 28.99 | 1015 | 0.8100 | 0.3838
0.4827 | 29.99 | 1050 | 0.7247 | 0.3863
0.4839 | 30.99 | 1085 | 0.7009 | 0.3817
0.4982 | 31.99 | 1120 | 0.7637 | 0.3914
0.6105 | 32.99 | 1155 | 0.7343 | 0.3914
0.4936 | 33.99 | 1190 | 0.7390 | 0.3762
0.4674 | 34.99 | 1225 | 0.6724 | 0.3581
0.4677 | 35.99 | 1260 | 0.6730 | 0.3488
0.516 | 36.99 | 1295 | 0.6956 | 0.3728
0.4507 | 37.99 | 1330 | 0.6483 | 0.3615
0.4207 | 38.99 | 1365 | 0.7718 | 0.3484
0.4803 | 39.99 | 1400 | 0.8316 | 0.3775
0.3946 | 40.99 | 1435 | 0.8322 | 0.3568
0.411 | 41.99 | 1470 | 0.9933 | 0.3707
0.4405 | 42.99 | 1505 | 0.8789 | 0.3943
0.5124 | 43.99 | 1540 | 0.9030 | 0.3707
0.5959 | 44.99 | 1575 | 0.7809 | 0.3948
0.3841 | 45.99 | 1610 | 0.7716 | 0.3965
0.3975 | 46.99 | 1645 | 0.7064 | 0.3931
1.4983 | 47.99 | 1680 | 3.2866 | 0.3627
0.3962 | 48.99 | 1715 | 0.6486 | 0.3648
0.4422 | 49.99 | 1750 | 0.8450 | 0.3779
0.4198 | 50.99 | 1785 | 0.7628 | 0.3564
0.3577 | 51.99 | 1820 | 0.7553 | 0.3678
0.4425 | 52.99 | 1855 | 0.7566 | 0.3716
0.3492 | 53.99 | 1890 | 0.7710 | 0.3631
0.3731 | 54.99 | 1925 | 0.7737 | 0.3627
0.3868 | 55.99 | 1960 | 0.7021 | 0.3572
0.3311 | 56.99 | 1995 | 0.6603 | 0.3518
0.3993 | 57.99 | 2030 | 0.6664 | 0.3581
0.4226 | 58.99 | 2065 | 0.6813 | 0.3551
0.4143 | 59.99 | 2100 | 0.6567 | 0.3568
0.3623 | 60.99 | 2135 | 0.6568 | 0.3454
0.3228 | 61.99 | 2170 | 0.7326 | 0.3568
0.3204 | 62.99 | 2205 | 0.7277 | 0.3640
0.377 | 63.99 | 2240 | 0.7145 | 0.3585
0.3487 | 64.99 | 2275 | 0.6943 | 0.3505
0.343 | 65.99 | 2310 | 0.7461 | 0.3395
0.3251 | 66.99 | 2345 | 0.7442 | 0.3564
0.3135 | 67.99 | 2380 | 0.7331 | 0.3530
0.381 | 68.99 | 2415 | 0.7306 | 0.3513
0.3319 | 69.99 | 2450 | 0.8495 | 0.3484
0.3552 | 70.99 | 2485 | 0.7546 | 0.3551
0.3292 | 71.99 | 2520 | 0.7483 | 0.3450
0.3041 | 72.99 | 2555 | 0.7305 | 0.3522
0.3606 | 73.99 | 2590 | 0.7358 | 0.3484
0.3629 | 74.99 | 2625 | 0.7709 | 0.3446
0.3409 | 75.99 | 2660 | 0.7568 | 0.3585
0.3315 | 76.99 | 2695 | 0.7466 | 0.3475
0.2934 | 77.99 | 2730 | 0.7351 | 0.3496
0.3366 | 78.99 | 2765 | 0.8014 | 0.3484
0.3176 | 79.99 | 2800 | 0.8014 | 0.3420
0.3319 | 80.99 | 2835 | 0.7996 | 0.3437
0.2967 | 81.99 | 2870 | 0.8156 | 0.3412
0.3137 | 82.99 | 2905 | 0.8025 | 0.3361
0.3133 | 83.99 | 2940 | 0.7784 | 0.3416
0.3134 | 84.99 | 2975 | 0.7894 | 0.3336
0.3216 | 85.99 | 3010 | 0.8331 | 0.3395
0.365 | 86.99 | 3045 | 0.7980 | 0.3353
0.2962 | 87.99 | 3080 | 0.7965 | 0.3404
0.3126 | 88.99 | 3115 | 0.7470 | 0.3420
0.2843 | 89.99 | 3150 | 0.7788 | 0.3404
0.2967 | 90.99 | 3185 | 0.7902 | 0.3374
0.3171 | 91.99 | 3220 | 0.8022 | 0.3404
0.3069 | 92.99 | 3255 | 0.7999 | 0.3345
0.3571 | 93.99 | 3290 | 0.7896 | 0.3404
0.2805 | 94.99 | 3325 | 0.7831 | 0.3391
0.3099 | 95.99 | 3360 | 0.7909 | 0.3366
0.2868 | 96.99 | 3395 | 0.7918 | 0.3395
0.2626 | 97.99 | 3430 | 0.7766 | 0.3429
0.2634 | 98.99 | 3465 | 0.7770 | 0.3437
0.288 | 99.99 | 3500 | 0.7762 | 0.3429

Framework versions

  • Transformers 4.25.0.dev0
  • Pytorch 1.8.1+cu111
  • Datasets 2.7.1.dev0
  • Tokenizers 0.13.2
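
The Transformers and Datasets entries are dev builds, which suggests they were installed from source rather than from PyPI releases. A quick, purely illustrative way to compare a local environment against these versions:

```python
# Hedged sketch: print local library versions to compare with the ones listed
# above. The *.dev0 builds were presumably source installs, so an exact match
# may not be achievable from PyPI packages alone.
import transformers, torch, datasets, tokenizers

for name, module in [("Transformers", transformers), ("PyTorch", torch),
                     ("Datasets", datasets), ("Tokenizers", tokenizers)]:
    print(f"{name}: {module.__version__}")
```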