---
tags:
- automatic-speech-recognition
- gary109/AI_Light_Dance
- generated_from_trainer
datasets:
- ai_light_dance
metrics:
- wer
model-index:
- name: ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-mdb-2
  results: []
---

# ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-mdb-2

This model is a fine-tuned version of gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-idmt-2 on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-MDB-2 dataset. It achieves the following results on the evaluation set:

- Loss: 0.4346
- Wer: 0.2242
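The reported WER of 0.2242 means roughly 22 word-level errors per 100 reference words. As a reminder of what the metric measures (the card's value comes from the standard `wer` metric, not this code), here is a minimal, illustrative sketch of Levenshtein-based word error rate:

```python
# Minimal sketch of word error rate (WER):
# WER = (substitutions + insertions + deletions) / number of reference words,
# i.e. word-level Levenshtein edit distance normalized by reference length.

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,        # deletion
                dp[i][j - 1] + 1,        # insertion
                dp[i - 1][j - 1] + cost, # substitution or match
            )
    return dp[len(ref)][len(hyp)] / len(ref)

# One deletion out of four reference words -> 0.25
print(wer("kick snare hihat kick", "kick snare kick"))  # 0.25
```

Note that a lower WER is better; 0.2242 corresponds to about 78% of reference words transcribed without error under this metric.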

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 30
- num_epochs: 100.0
- mixed_precision_training: Native AMP
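The `total_train_batch_size` above is not an independent setting: it is the per-device batch size multiplied by the gradient accumulation steps (and by the device count, assumed to be 1 here since none is listed). A quick sketch of that arithmetic:

```python
# Effective (total) train batch size implied by the hyperparameters above.
train_batch_size = 4             # per-device batch size
gradient_accumulation_steps = 4  # optimizer steps every 4 forward passes
num_devices = 1                  # assumption: single GPU (not stated in the card)

total_train_batch_size = (
    train_batch_size * gradient_accumulation_steps * num_devices
)
print(total_train_batch_size)  # 16, matching total_train_batch_size above
```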

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 22.7802       | 0.98  | 11   | 60.1549         | 0.9882 |
| 13.7635       | 1.98  | 22   | 18.1822         | 0.9985 |
| 3.4364        | 2.98  | 33   | 1.2339          | 0.7316 |
| 1.0479        | 3.98  | 44   | 0.8433          | 0.4086 |
| 0.739         | 4.98  | 55   | 0.7657          | 0.3097 |
| 0.6492        | 5.98  | 66   | 0.8034          | 0.2994 |
| 0.6044        | 6.98  | 77   | 0.6401          | 0.3333 |
| 0.5662        | 7.98  | 88   | 0.7298          | 0.2611 |
| 0.5321        | 8.98  | 99   | 0.8126          | 0.2979 |
| 0.5037        | 9.98  | 110  | 0.7135          | 0.2994 |
| 0.4823        | 10.98 | 121  | 0.5976          | 0.2655 |
| 0.4622        | 11.98 | 132  | 0.6875          | 0.2448 |
| 0.4761        | 12.98 | 143  | 0.6402          | 0.2463 |
| 0.4296        | 13.98 | 154  | 0.8217          | 0.2448 |
| 0.4655        | 14.98 | 165  | 0.7825          | 0.2552 |
| 0.4122        | 15.98 | 176  | 0.7121          | 0.2448 |
| 0.4234        | 16.98 | 187  | 0.8301          | 0.2670 |
| 0.441         | 17.98 | 198  | 0.7343          | 0.2640 |
| 0.4781        | 18.98 | 209  | 0.7388          | 0.2139 |
| 0.4006        | 19.98 | 220  | 0.6700          | 0.2522 |
| 0.42          | 20.98 | 231  | 0.5540          | 0.2493 |
| 0.4289        | 21.98 | 242  | 0.9950          | 0.2493 |
| 0.4014        | 22.98 | 253  | 0.7283          | 0.2522 |
| 0.3397        | 23.98 | 264  | 0.8327          | 0.2655 |
| 0.3879        | 24.98 | 275  | 0.9388          | 0.2906 |
| 0.3445        | 25.98 | 286  | 0.7623          | 0.2522 |
| 0.3933        | 26.98 | 297  | 0.9125          | 0.2419 |
| 0.3173        | 27.98 | 308  | 0.7447          | 0.2448 |
| 0.3734        | 28.98 | 319  | 0.6601          | 0.2935 |
| 0.3347        | 29.98 | 330  | 0.7022          | 0.2699 |
| 0.3564        | 30.98 | 341  | 0.7488          | 0.2920 |
| 0.3371        | 31.98 | 352  | 0.6413          | 0.2581 |
| 0.355         | 32.98 | 363  | 0.5131          | 0.2375 |
| 0.3648        | 33.98 | 374  | 0.5808          | 0.2286 |
| 0.3209        | 34.98 | 385  | 0.5392          | 0.2257 |
| 0.3522        | 35.98 | 396  | 0.4411          | 0.2227 |
| 0.3252        | 36.98 | 407  | 0.4693          | 0.2109 |
| 0.3216        | 37.98 | 418  | 0.4621          | 0.2065 |
| 0.3119        | 38.98 | 429  | 0.5094          | 0.2168 |
| 0.3247        | 39.98 | 440  | 0.4897          | 0.2316 |
| 0.3246        | 40.98 | 451  | 0.6471          | 0.2212 |
| 0.2997        | 41.98 | 462  | 0.5569          | 0.2153 |
| 0.2969        | 42.98 | 473  | 0.4766          | 0.2094 |
| 0.3202        | 43.98 | 484  | 0.4978          | 0.2316 |
| 0.3093        | 44.98 | 495  | 0.4776          | 0.2183 |
| 0.298         | 45.98 | 506  | 0.5008          | 0.2198 |
| 0.3151        | 46.98 | 517  | 0.4811          | 0.2080 |
| 0.2824        | 47.98 | 528  | 0.5011          | 0.2065 |
| 0.3089        | 48.98 | 539  | 0.5131          | 0.2139 |
| 0.3064        | 49.98 | 550  | 0.4749          | 0.2227 |
| 0.2734        | 50.98 | 561  | 0.5397          | 0.2080 |
| 0.2911        | 51.98 | 572  | 0.4975          | 0.2035 |
| 0.2889        | 52.98 | 583  | 0.4633          | 0.2168 |
| 0.2523        | 53.98 | 594  | 0.4589          | 0.2242 |
| 0.272         | 54.98 | 605  | 0.4856          | 0.2124 |
| 0.2733        | 55.98 | 616  | 0.4474          | 0.2242 |
| 0.2856        | 56.98 | 627  | 0.4534          | 0.2271 |
| 0.2402        | 57.98 | 638  | 0.4346          | 0.2242 |
| 0.2567        | 58.98 | 649  | 0.5014          | 0.2286 |
| 0.28          | 59.98 | 660  | 0.4428          | 0.2183 |
| 0.2541        | 60.98 | 671  | 0.4876          | 0.2227 |
| 0.2544        | 61.98 | 682  | 0.4705          | 0.2050 |
| 0.2786        | 62.98 | 693  | 0.4449          | 0.2021 |
| 0.2524        | 63.98 | 704  | 0.5585          | 0.2094 |
| 0.2524        | 64.98 | 715  | 0.5179          | 0.2109 |
| 0.2852        | 65.98 | 726  | 0.5063          | 0.2198 |
| 0.2393        | 66.98 | 737  | 0.4768          | 0.1991 |
| 0.2522        | 67.98 | 748  | 0.4473          | 0.1932 |
| 0.2768        | 68.98 | 759  | 0.4714          | 0.1991 |
| 0.2463        | 69.98 | 770  | 0.4948          | 0.1947 |
| 0.2379        | 70.98 | 781  | 0.4978          | 0.1932 |
| 0.2343        | 71.98 | 792  | 0.4526          | 0.1903 |
| 0.3377        | 72.98 | 803  | 0.4518          | 0.1962 |
| 0.2683        | 73.98 | 814  | 0.4457          | 0.2109 |
| 0.2371        | 74.98 | 825  | 0.4564          | 0.2021 |
| 0.2438        | 75.98 | 836  | 0.4876          | 0.2094 |
| 0.2408        | 76.98 | 847  | 0.4386          | 0.2021 |
| 0.2323        | 77.98 | 858  | 0.4513          | 0.1991 |
| 0.271         | 78.98 | 869  | 0.4874          | 0.2021 |
| 0.229         | 79.98 | 880  | 0.4882          | 0.2065 |
| 0.224         | 80.98 | 891  | 0.4981          | 0.1991 |
| 0.2442        | 81.98 | 902  | 0.5448          | 0.2021 |
| 0.2075        | 82.98 | 913  | 0.4626          | 0.1991 |
| 0.2314        | 83.98 | 924  | 0.4706          | 0.2065 |
| 0.2208        | 84.98 | 935  | 0.5073          | 0.2035 |
| 0.2547        | 85.98 | 946  | 0.4818          | 0.1962 |
| 0.2895        | 86.98 | 957  | 0.4931          | 0.1991 |
| 0.1988        | 87.98 | 968  | 0.4702          | 0.2006 |
| 0.2383        | 88.98 | 979  | 0.4682          | 0.1991 |
| 0.2332        | 89.98 | 990  | 0.4575          | 0.2065 |
| 0.1983        | 90.98 | 1001 | 0.4706          | 0.1991 |
| 0.2182        | 91.98 | 1012 | 0.4756          | 0.1991 |
| 0.2161        | 92.98 | 1023 | 0.4686          | 0.1962 |
| 0.2215        | 93.98 | 1034 | 0.4689          | 0.1932 |
| 0.2223        | 94.98 | 1045 | 0.4514          | 0.1888 |
| 0.2068        | 95.98 | 1056 | 0.4482          | 0.1888 |
| 0.2046        | 96.98 | 1067 | 0.4481          | 0.1858 |
| 0.2411        | 97.98 | 1078 | 0.4532          | 0.1903 |
| 0.2296        | 98.98 | 1089 | 0.4601          | 0.1932 |
| 0.2211        | 99.98 | 1100 | 0.4625          | 0.1947 |
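The headline Loss 0.4346 / Wer 0.2242 reported at the top matches the epoch-57.98 row, i.e. the row with the lowest validation loss, which suggests (an assumption, since the card does not say) that the best checkpoint was selected by eval loss rather than by WER; the lowest WER in the table (0.1858 at epoch 96.98) occurs at a slightly higher loss. A small sketch of that selection over a few rows sampled from the table:

```python
# Select the best row by validation loss from a handful of
# (epoch, val_loss, wer) rows taken from the table above.
rows = [
    (55.98, 0.4474, 0.2242),
    (56.98, 0.4534, 0.2271),
    (57.98, 0.4346, 0.2242),
    (96.98, 0.4481, 0.1858),  # lowest WER, but not lowest loss
    (99.98, 0.4625, 0.1947),  # final epoch
]
best = min(rows, key=lambda r: r[1])  # minimize validation loss
print(best)  # (57.98, 0.4346, 0.2242) -> the results reported at the top
```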

### Framework versions

- Transformers 4.25.0.dev0
- Pytorch 1.8.1+cu111
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2