
ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v3

This model is a fine-tuned version of gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v3 (i.e., continued fine-tuning of the same checkpoint) on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-SMT-DRUMS-V2+MDBDRUMS dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5550
  • Wer: 0.3147
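The reported Wer is the word error rate on the evaluation set. As a reference point, WER is the word-level edit distance between a reference and a hypothesis transcription divided by the number of reference words; a minimal sketch of that computation (the training run itself presumably used a metrics library such as `jiwer` or `evaluate` — that is an assumption, but the metric reduces to this):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            deletion = dp[i - 1][j] + 1
            insertion = dp[i][j - 1] + 1
            dp[i][j] = min(substitution, deletion, insertion)
    return dp[len(ref)][len(hyp)] / len(ref)
```

So a Wer of 0.3147 means roughly one word-level error for every three reference tokens.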

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 4
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
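The schedule implied by these settings (linear scheduler, 100 warmup steps, peak learning rate 1e-4) can be sketched as a plain function. This mirrors the behavior of Transformers' `get_linear_schedule_with_warmup`; the total step count of 4500 is taken from the results table below (100 epochs × 45 steps per epoch):

```python
def lr_at_step(step: int, peak_lr: float = 1e-4,
               warmup_steps: int = 100, total_steps: int = 4500) -> float:
    """Linear warmup to peak_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        # Warmup: ramp linearly from 0 to peak_lr over the first 100 steps.
        return peak_lr * step / warmup_steps
    # Decay: ramp linearly from peak_lr back down to 0.
    remaining = max(0, total_steps - step)
    return peak_lr * remaining / (total_steps - warmup_steps)
```

Note also that with train_batch_size 2 and gradient_accumulation_steps 2, the effective (total) train batch size is 4, as listed.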

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.1747        | 1.0   | 45   | 0.5638          | 0.3337 |
| 0.2339        | 2.0   | 90   | 0.5785          | 0.3254 |
| 0.2849        | 3.0   | 135  | 0.5586          | 0.3397 |
| 0.2396        | 4.0   | 180  | 0.5868          | 0.3266 |
| 0.2272        | 5.0   | 225  | 0.6052          | 0.3230 |
| 0.2497        | 6.0   | 270  | 0.5913          | 0.3278 |
| 0.2218        | 7.0   | 315  | 0.5926          | 0.3349 |
| 0.2584        | 8.0   | 360  | 0.5617          | 0.3218 |
| 0.2741        | 9.0   | 405  | 0.5901          | 0.3230 |
| 0.2481        | 10.0  | 450  | 0.5860          | 0.3278 |
| 0.2504        | 11.0  | 495  | 0.5991          | 0.3123 |
| 0.2125        | 12.0  | 540  | 0.5992          | 0.3218 |
| 0.2482        | 13.0  | 585  | 0.5756          | 0.3194 |
| 0.2135        | 14.0  | 630  | 0.5836          | 0.3302 |
| 0.2345        | 15.0  | 675  | 0.6347          | 0.3254 |
| 0.1912        | 16.0  | 720  | 0.6160          | 0.3206 |
| 0.2117        | 17.0  | 765  | 0.6268          | 0.3099 |
| 0.2217        | 18.0  | 810  | 0.6873          | 0.3182 |
| 0.2165        | 19.0  | 855  | 0.6721          | 0.3159 |
| 0.207         | 20.0  | 900  | 0.6312          | 0.3206 |
| 0.2263        | 21.0  | 945  | 0.6223          | 0.3290 |
| 0.2015        | 22.0  | 990  | 0.6319          | 0.3182 |
| 0.1997        | 23.0  | 1035 | 0.6527          | 0.3135 |
| 0.2318        | 24.0  | 1080 | 0.5987          | 0.3278 |
| 0.2196        | 25.0  | 1125 | 0.6269          | 0.3242 |
| 0.2298        | 26.0  | 1170 | 0.5774          | 0.3254 |
| 0.2117        | 27.0  | 1215 | 0.5938          | 0.3027 |
| 0.2553        | 28.0  | 1260 | 0.5831          | 0.3123 |
| 0.226         | 29.0  | 1305 | 0.6151          | 0.3099 |
| 0.1635        | 30.0  | 1350 | 0.5622          | 0.3230 |
| 0.5734        | 31.0  | 1395 | 0.6198          | 0.2920 |
| 0.2196        | 32.0  | 1440 | 0.5779          | 0.3039 |
| 0.2019        | 33.0  | 1485 | 0.5866          | 0.3111 |
| 0.2222        | 34.0  | 1530 | 0.5557          | 0.3063 |
| 0.2167        | 35.0  | 1575 | 0.5740          | 0.3206 |
| 0.2011        | 36.0  | 1620 | 0.5598          | 0.3004 |
| 0.2032        | 37.0  | 1665 | 0.5550          | 0.3147 |
| 0.225         | 38.0  | 1710 | 0.5794          | 0.3099 |
| 0.2068        | 39.0  | 1755 | 0.6223          | 0.3063 |
| 0.2105        | 40.0  | 1800 | 0.5797          | 0.3039 |
| 0.1968        | 41.0  | 1845 | 0.5681          | 0.2968 |
| 0.224         | 42.0  | 1890 | 0.5742          | 0.3170 |
| 0.2351        | 43.0  | 1935 | 0.5567          | 0.3111 |
| 0.2121        | 44.0  | 1980 | 0.5893          | 0.3039 |
| 0.1913        | 45.0  | 2025 | 0.6030          | 0.3027 |
| 0.1636        | 46.0  | 2070 | 0.5812          | 0.3004 |
| 0.2062        | 47.0  | 2115 | 0.6081          | 0.3004 |
| 0.2031        | 48.0  | 2160 | 0.5610          | 0.3159 |
| 0.1892        | 49.0  | 2205 | 0.5863          | 0.3147 |
| 0.1712        | 50.0  | 2250 | 0.5943          | 0.3159 |
| 0.1886        | 51.0  | 2295 | 0.5953          | 0.3051 |
| 0.1748        | 52.0  | 2340 | 0.5761          | 0.3087 |
| 0.1705        | 53.0  | 2385 | 0.6045          | 0.2872 |
| 0.1794        | 54.0  | 2430 | 0.5731          | 0.3075 |
| 0.1815        | 55.0  | 2475 | 0.5949          | 0.2849 |
| 0.1571        | 56.0  | 2520 | 0.5663          | 0.2884 |
| 0.1902        | 57.0  | 2565 | 0.5903          | 0.2956 |
| 0.2057        | 58.0  | 2610 | 0.5820          | 0.2872 |
| 0.1904        | 59.0  | 2655 | 0.5923          | 0.2896 |
| 0.1677        | 60.0  | 2700 | 0.5769          | 0.3075 |
| 0.1859        | 61.0  | 2745 | 0.5566          | 0.3147 |
| 0.2382        | 62.0  | 2790 | 0.5849          | 0.3051 |
| 0.1753        | 63.0  | 2835 | 0.5773          | 0.3075 |
| 0.1651        | 64.0  | 2880 | 0.5877          | 0.3039 |
| 0.1781        | 65.0  | 2925 | 0.5905          | 0.3027 |
| 0.1582        | 66.0  | 2970 | 0.5800          | 0.3015 |
| 0.1538        | 67.0  | 3015 | 0.6025          | 0.3075 |
| 0.1606        | 68.0  | 3060 | 0.5758          | 0.3039 |
| 0.1522        | 69.0  | 3105 | 0.5860          | 0.2932 |
| 0.1521        | 70.0  | 3150 | 0.5896          | 0.2956 |
| 0.1592        | 71.0  | 3195 | 0.5738          | 0.3027 |
| 0.2245        | 72.0  | 3240 | 0.5782          | 0.3039 |
| 0.2185        | 73.0  | 3285 | 0.5722          | 0.3027 |
| 0.1597        | 74.0  | 3330 | 0.5891          | 0.3004 |
| 0.1713        | 75.0  | 3375 | 0.5650          | 0.3027 |
| 0.1464        | 76.0  | 3420 | 0.5860          | 0.3063 |
| 0.1551        | 77.0  | 3465 | 0.5755          | 0.3027 |
| 0.1509        | 78.0  | 3510 | 0.5895          | 0.2944 |
| 0.176         | 79.0  | 3555 | 0.5750          | 0.2992 |
| 0.1695        | 80.0  | 3600 | 0.5759          | 0.3004 |
| 0.1797        | 81.0  | 3645 | 0.5904          | 0.2992 |
| 0.1371        | 82.0  | 3690 | 0.5923          | 0.3015 |
| 0.1798        | 83.0  | 3735 | 0.5864          | 0.2992 |
| 0.1386        | 84.0  | 3780 | 0.5733          | 0.3004 |
| 0.2173        | 85.0  | 3825 | 0.5751          | 0.3004 |
| 0.151         | 86.0  | 3870 | 0.5711          | 0.2968 |
| 0.1579        | 87.0  | 3915 | 0.5750          | 0.2992 |
| 0.1328        | 88.0  | 3960 | 0.5764          | 0.2944 |
| 0.1657        | 89.0  | 4005 | 0.5769          | 0.3004 |
| 0.1353        | 90.0  | 4050 | 0.5715          | 0.2956 |
| 0.1982        | 91.0  | 4095 | 0.5754          | 0.2968 |
| 0.1687        | 92.0  | 4140 | 0.5725          | 0.2980 |
| 0.1842        | 93.0  | 4185 | 0.5750          | 0.2980 |
| 0.1893        | 94.0  | 4230 | 0.5789          | 0.2944 |
| 0.1744        | 95.0  | 4275 | 0.5750          | 0.3004 |
| 0.1745        | 96.0  | 4320 | 0.5794          | 0.2980 |
| 0.1665        | 97.0  | 4365 | 0.5755          | 0.3004 |
| 0.1569        | 98.0  | 4410 | 0.5763          | 0.2968 |
| 0.1449        | 99.0  | 4455 | 0.5779          | 0.2968 |
| 0.1469        | 100.0 | 4500 | 0.5774          | 0.2968 |
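The headline evaluation loss of 0.5550 corresponds to the epoch-37 checkpoint, which has the lowest validation loss in the table; selecting the reported checkpoint reduces to an argmin over that column. A minimal sketch, using a few (epoch, validation loss) rows copied from the table:

```python
# A few (epoch, validation_loss) rows copied from the training results table.
history = [
    (1, 0.5638),
    (34, 0.5557),
    (37, 0.5550),
    (61, 0.5566),
    (100, 0.5774),
]

# Best checkpoint = epoch with the lowest validation loss.
best_epoch, best_loss = min(history, key=lambda row: row[1])
```

This is why the reported loss (0.5550) differs from the final epoch's (0.5774): the card reports the best checkpoint, not the last one.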

Framework versions

  • Transformers 4.25.0.dev0
  • Pytorch 1.8.1+cu111
  • Datasets 2.7.1.dev0
  • Tokenizers 0.13.2