---
tags:
- automatic-speech-recognition
- gary109/AI_Light_Dance
- generated_from_trainer
datasets:
- ai_light_dance
metrics:
- wer
model-index:
- name: ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v2
  results: []
---

# ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v2

This model is a fine-tuned version of [gary109/ai-light-dance_drums_pretrain_wav2vec2-base-new](https://huggingface.co/gary109/ai-light-dance_drums_pretrain_wav2vec2-base-new) on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-SMT-DRUMS-V2+MDBDRUMS dataset.
It achieves the following results on the evaluation set:

- Loss: 0.5264
- Wer: 0.3635
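
As a CTC fine-tune of wav2vec2-base, the checkpoint can be loaded with the standard Transformers auto classes. The snippet below is a minimal inference sketch, not an official example from this card: the audio path is a placeholder, and it assumes the repository ships a matching processor config and that input audio is resampled to 16 kHz mono.

```python
# Minimal inference sketch (unofficial); assumes the repo includes a processor
# config and that audio is 16 kHz mono. "drums.wav" is a placeholder path.
import librosa
import torch
from transformers import AutoModelForCTC, AutoProcessor

model_id = "gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v2"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

waveform, _ = librosa.load("drums.wav", sr=16_000)  # resample to 16 kHz
inputs = processor(waveform, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])  # greedy CTC decoding
```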

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (restated as a `TrainingArguments` sketch after the list):

- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 100.0
- mixed_precision_training: Native AMP
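
These settings map almost one-to-one onto Hugging Face `TrainingArguments`; the sketch below restates them that way for reproducibility. The `output_dir` and the surrounding training script are assumptions, only the numeric values come from this card.

```python
# Hyperparameters above restated as TrainingArguments (a sketch; the
# output_dir is an assumption, only the numeric values come from this card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v2",
    learning_rate=4e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 4 * 2 = 8
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=100.0,
    fp16=True,  # "Native AMP" mixed precision
)
```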

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 4.6468 | 0.98 | 22 | 3.2315 | 1.0 |
| 1.5745 | 1.98 | 44 | 3.1603 | 1.0 |
| 1.465 | 2.98 | 66 | 2.2551 | 1.0 |
| 1.3168 | 3.98 | 88 | 1.8461 | 1.0 |
| 1.1359 | 4.98 | 110 | 1.4874 | 0.9797 |
| 0.9769 | 5.98 | 132 | 1.7359 | 0.5495 |
| 0.9019 | 6.98 | 154 | 1.5833 | 0.5268 |
| 0.8057 | 7.98 | 176 | 1.4892 | 0.5304 |
| 1.0845 | 8.98 | 198 | 1.3939 | 0.5197 |
| 0.7562 | 9.98 | 220 | 1.1238 | 0.5447 |
| 0.7259 | 10.98 | 242 | 1.2936 | 0.5006 |
| 0.7318 | 11.98 | 264 | 1.2763 | 0.4660 |
| 0.6452 | 12.98 | 286 | 1.2947 | 0.4779 |
| 0.6353 | 13.98 | 308 | 1.1925 | 0.4517 |
| 0.6463 | 14.98 | 330 | 0.8667 | 0.4100 |
| 0.5381 | 15.98 | 352 | 1.1243 | 0.3909 |
| 0.5637 | 16.98 | 374 | 0.8683 | 0.3754 |
| 0.6149 | 17.98 | 396 | 1.1040 | 0.3731 |
| 0.6138 | 18.98 | 418 | 1.1068 | 0.3850 |
| 0.7381 | 19.98 | 440 | 0.9203 | 0.3623 |
| 0.5064 | 20.98 | 462 | 0.8806 | 0.3540 |
| 0.4731 | 21.98 | 484 | 0.7259 | 0.3623 |
| 0.5232 | 22.98 | 506 | 0.7935 | 0.3516 |
| 0.4689 | 23.98 | 528 | 0.7771 | 0.3540 |
| 0.4902 | 24.98 | 550 | 0.6897 | 0.3909 |
| 0.4079 | 25.98 | 572 | 0.8030 | 0.3552 |
| 0.5045 | 26.98 | 594 | 0.6778 | 0.3790 |
| 0.4373 | 27.98 | 616 | 0.7456 | 0.3695 |
| 0.4366 | 28.98 | 638 | 0.7009 | 0.3433 |
| 0.3944 | 29.98 | 660 | 0.6841 | 0.3468 |
| 0.4206 | 30.98 | 682 | 0.7093 | 0.3373 |
| 0.3949 | 31.98 | 704 | 0.6901 | 0.3576 |
| 0.4416 | 32.98 | 726 | 0.6762 | 0.3397 |
| 0.4248 | 33.98 | 748 | 0.7196 | 0.3540 |
| 0.4214 | 34.98 | 770 | 0.6669 | 0.3254 |
| 0.416 | 35.98 | 792 | 0.6422 | 0.3445 |
| 0.3687 | 36.98 | 814 | 0.6345 | 0.3504 |
| 0.4119 | 37.98 | 836 | 0.6306 | 0.3385 |
| 0.359 | 38.98 | 858 | 0.6538 | 0.3576 |
| 0.359 | 39.98 | 880 | 0.6613 | 0.3349 |
| 0.3488 | 40.98 | 902 | 0.5976 | 0.3468 |
| 0.3543 | 41.98 | 924 | 0.6327 | 0.3433 |
| 0.3647 | 42.98 | 946 | 0.6208 | 0.3600 |
| 0.3529 | 43.98 | 968 | 0.6008 | 0.3492 |
| 0.3691 | 44.98 | 990 | 0.6065 | 0.3492 |
| 0.329 | 45.98 | 1012 | 0.6288 | 0.3373 |
| 0.3357 | 46.98 | 1034 | 0.5760 | 0.3480 |
| 0.3318 | 47.98 | 1056 | 0.5637 | 0.3564 |
| 0.3181 | 48.98 | 1078 | 0.5560 | 0.3468 |
| 0.3313 | 49.98 | 1100 | 0.5905 | 0.3337 |
| 0.3059 | 50.98 | 1122 | 0.5443 | 0.3278 |
| 0.3375 | 51.98 | 1144 | 0.5695 | 0.3576 |
| 0.3191 | 52.98 | 1166 | 0.5874 | 0.3385 |
| 0.3115 | 53.98 | 1188 | 0.5264 | 0.3635 |
| 0.3044 | 54.98 | 1210 | 0.5480 | 0.3433 |
| 0.3256 | 55.98 | 1232 | 0.5677 | 0.3385 |
| 0.2938 | 56.98 | 1254 | 0.5597 | 0.3445 |
| 0.2853 | 57.98 | 1276 | 0.5942 | 0.3373 |
| 0.3348 | 58.98 | 1298 | 0.5733 | 0.3421 |
| 0.3024 | 59.98 | 1320 | 0.5604 | 0.3433 |
| 0.2655 | 60.98 | 1342 | 0.5348 | 0.3468 |
| 0.3029 | 61.98 | 1364 | 0.5752 | 0.3206 |
| 0.3435 | 62.98 | 1386 | 0.5489 | 0.3063 |
| 0.3125 | 63.98 | 1408 | 0.5736 | 0.3075 |
| 0.263 | 64.98 | 1430 | 0.5505 | 0.3206 |
| 0.2665 | 65.98 | 1452 | 0.5391 | 0.3230 |
| 0.299 | 66.98 | 1474 | 0.5389 | 0.3135 |
| 0.2909 | 67.98 | 1496 | 0.5841 | 0.3099 |
| 0.2988 | 68.98 | 1518 | 0.5847 | 0.3004 |
| 0.2879 | 69.98 | 1540 | 0.5941 | 0.2968 |
| 0.2802 | 70.98 | 1562 | 0.6612 | 0.2920 |
| 0.2877 | 71.98 | 1584 | 0.5641 | 0.3051 |
| 0.2727 | 72.98 | 1606 | 0.6138 | 0.3063 |
| 0.2668 | 73.98 | 1628 | 0.6087 | 0.2920 |
| 0.2675 | 74.98 | 1650 | 0.5876 | 0.2932 |
| 0.264 | 75.98 | 1672 | 0.6043 | 0.2980 |
| 0.2352 | 76.98 | 1694 | 0.5829 | 0.2932 |
| 0.2494 | 77.98 | 1716 | 0.5775 | 0.3063 |
| 0.2621 | 78.98 | 1738 | 0.5676 | 0.2956 |
| 0.2788 | 79.98 | 1760 | 0.5864 | 0.2932 |
| 0.2615 | 80.98 | 1782 | 0.5754 | 0.3015 |
| 0.2542 | 81.98 | 1804 | 0.5651 | 0.3027 |
| 0.2641 | 82.98 | 1826 | 0.5731 | 0.3004 |
| 0.2532 | 83.98 | 1848 | 0.5782 | 0.2968 |
| 0.2645 | 84.98 | 1870 | 0.5718 | 0.3039 |
| 0.2296 | 85.98 | 1892 | 0.5628 | 0.3147 |
| 0.2394 | 86.98 | 1914 | 0.5920 | 0.3027 |
| 0.2636 | 87.98 | 1936 | 0.6085 | 0.2968 |
| 0.2371 | 88.98 | 1958 | 0.5809 | 0.3075 |
| 0.2364 | 89.98 | 1980 | 0.5927 | 0.3039 |
| 0.2812 | 90.98 | 2002 | 0.5713 | 0.3123 |
| 0.2141 | 91.98 | 2024 | 0.5743 | 0.3039 |
| 0.2919 | 92.98 | 2046 | 0.5837 | 0.3063 |
| 0.2288 | 93.98 | 2068 | 0.5860 | 0.3015 |
| 0.2585 | 94.98 | 2090 | 0.5776 | 0.3147 |
| 0.2529 | 95.98 | 2112 | 0.5625 | 0.3159 |
| 0.2343 | 96.98 | 2134 | 0.5700 | 0.3087 |
| 0.2567 | 97.98 | 2156 | 0.5729 | 0.3087 |
| 0.2448 | 98.98 | 2178 | 0.5728 | 0.3111 |
| 0.2501 | 99.98 | 2200 | 0.5744 | 0.3099 |

### Framework versions

- Transformers 4.25.0.dev0
- Pytorch 1.8.1+cu111
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2