---
tags:
- generated_from_trainer
datasets:
- ai_light_dance
model-index:
- name: ai-light-dance_drums_ft_pretrain_wav2vec2-base
  results: []
---

# ai-light-dance_drums_ft_pretrain_wav2vec2-base

This model is a fine-tuned version of [gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base](https://huggingface.co/gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base) on the ai_light_dance dataset. It achieves the following results on the evaluation set:

- Loss: 2.1612
- Wer: 0.6576
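Wer above is the word error rate: the word-level edit distance between the predicted and reference transcriptions, divided by the reference length. As a reference for interpreting the number, here is a minimal plain-Python sketch of the computation (evaluation pipelines typically use a library such as `evaluate` or `jiwer` instead):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words (Levenshtein).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("kick snare hat kick", "kick snare kick"))  # one deletion -> 0.25
```

A WER of 0.6576 means roughly two word-level errors for every three reference words, so the model's transcriptions still differ substantially from the references.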

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 200.0
- mixed_precision_training: Native AMP
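The effective batch size and the learning-rate trajectory follow directly from these values. As a sanity check, here is a small sketch (plain Python, assuming the usual Trainer definitions: `total = per_device * grad_accum`, and a linear schedule that ramps up over the warmup steps and then decays linearly to zero; the total of 1600 optimizer steps is taken from the results table below):

```python
train_batch_size = 8
gradient_accumulation_steps = 8
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 64, matching the value reported above

def lr_at(step, base_lr=4e-4, warmup=20, total=1600):
    """Linear schedule with warmup: 0 -> base_lr over `warmup` optimizer
    steps, then linear decay to 0 at step `total`."""
    if step < warmup:
        return base_lr * step / warmup
    return base_lr * max(0.0, (total - step) / (total - warmup))

print(lr_at(10))    # mid-warmup: 2e-4
print(lr_at(1600))  # end of training: 0.0
```

With only 20 warmup steps out of 1600, the schedule spends almost all of training in the linear-decay phase.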

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 0.9 | 8 | 2.3799 | 0.7797 |
| 0.9677 | 1.9 | 16 | 2.4062 | 0.7914 |
| 0.9433 | 2.9 | 24 | 2.6955 | 0.7683 |
| 1.0276 | 3.9 | 32 | 2.5258 | 0.7893 |
| 0.9923 | 4.9 | 40 | 2.4174 | 0.7824 |
| 0.9923 | 5.9 | 48 | 2.7146 | 0.7788 |
| 1.0927 | 6.9 | 56 | 2.9691 | 0.7884 |
| 1.0133 | 7.9 | 64 | 3.1105 | 0.7644 |
| 0.98 | 8.9 | 72 | 2.8638 | 0.7644 |
| 0.9688 | 9.9 | 80 | 2.7538 | 0.7582 |
| 0.9688 | 10.9 | 88 | 2.6010 | 0.7713 |
| 1.0453 | 11.9 | 96 | 2.6365 | 0.7483 |
| 0.9397 | 12.9 | 104 | 2.4740 | 0.7629 |
| 0.9466 | 13.9 | 112 | 2.6466 | 0.7617 |
| 0.9665 | 14.9 | 120 | 2.6643 | 0.7644 |
| 0.9665 | 15.9 | 128 | 2.4003 | 0.7408 |
| 1.0577 | 16.9 | 136 | 2.6307 | 0.7707 |
| 0.9154 | 17.9 | 144 | 2.6735 | 0.7492 |
| 0.9352 | 18.9 | 152 | 2.4403 | 0.7492 |
| 0.8999 | 19.9 | 160 | 2.7195 | 0.7650 |
| 0.8999 | 20.9 | 168 | 2.2873 | 0.7603 |
| 0.9935 | 21.9 | 176 | 2.8440 | 0.7513 |
| 0.885 | 22.9 | 184 | 2.7574 | 0.7486 |
| 0.8979 | 23.9 | 192 | 3.1471 | 0.7537 |
| 0.9074 | 24.9 | 200 | 3.0892 | 0.7288 |
| 0.9074 | 25.9 | 208 | 2.8641 | 0.7519 |
| 0.9498 | 26.9 | 216 | 2.8798 | 0.7399 |
| 0.8821 | 27.9 | 224 | 2.8115 | 0.7495 |
| 0.8968 | 28.9 | 232 | 2.9225 | 0.7447 |
| 0.8783 | 29.9 | 240 | 2.5414 | 0.7504 |
| 0.8783 | 30.9 | 248 | 2.3528 | 0.7378 |
| 0.9428 | 31.9 | 256 | 2.8073 | 0.7142 |
| 0.8184 | 32.9 | 264 | 2.5757 | 0.7192 |
| 0.9092 | 33.9 | 272 | 2.4403 | 0.7094 |
| 0.8749 | 34.9 | 280 | 2.6912 | 0.7219 |
| 0.8749 | 35.9 | 288 | 2.4073 | 0.7327 |
| 0.9235 | 36.9 | 296 | 2.4446 | 0.7267 |
| 0.8654 | 37.9 | 304 | 2.8211 | 0.7360 |
| 0.8428 | 38.9 | 312 | 2.4811 | 0.7243 |
| 0.8355 | 39.9 | 320 | 2.3575 | 0.7192 |
| 0.8355 | 40.9 | 328 | 2.3957 | 0.7139 |
| 0.8992 | 41.9 | 336 | 2.4373 | 0.7139 |
| 0.8221 | 42.9 | 344 | 2.4235 | 0.7127 |
| 0.8305 | 43.9 | 352 | 2.3405 | 0.7112 |
| 0.8328 | 44.9 | 360 | 2.3406 | 0.7216 |
| 0.8328 | 45.9 | 368 | 2.4469 | 0.7166 |
| 0.8611 | 46.9 | 376 | 2.4297 | 0.7157 |
| 0.8092 | 47.9 | 384 | 2.5868 | 0.7094 |
| 0.8173 | 48.9 | 392 | 2.2558 | 0.7004 |
| 0.7772 | 49.9 | 400 | 2.3598 | 0.7004 |
| 0.7772 | 50.9 | 408 | 2.3083 | 0.6881 |
| 0.8494 | 51.9 | 416 | 2.4431 | 0.7013 |
| 0.7997 | 52.9 | 424 | 2.3005 | 0.7112 |
| 0.7879 | 53.9 | 432 | 2.1985 | 0.7297 |
| 0.7694 | 54.9 | 440 | 2.3376 | 0.7082 |
| 0.7694 | 55.9 | 448 | 2.3716 | 0.7013 |
| 0.8397 | 56.9 | 456 | 2.3817 | 0.7115 |
| 0.7868 | 57.9 | 464 | 2.2577 | 0.7091 |
| 0.7311 | 58.9 | 472 | 2.3895 | 0.7127 |
| 0.7796 | 59.9 | 480 | 2.2760 | 0.7100 |
| 0.7796 | 60.9 | 488 | 2.5685 | 0.7073 |
| 0.8272 | 61.9 | 496 | 2.3881 | 0.7028 |
| 0.7639 | 62.9 | 504 | 2.3457 | 0.7085 |
| 0.789 | 63.9 | 512 | 2.3291 | 0.7007 |
| 0.7472 | 64.9 | 520 | 2.5174 | 0.7049 |
| 0.7472 | 65.9 | 528 | 2.3997 | 0.7154 |
| 0.8056 | 66.9 | 536 | 2.4574 | 0.7237 |
| 0.7752 | 67.9 | 544 | 2.4980 | 0.7004 |
| 0.7084 | 68.9 | 552 | 2.2370 | 0.7085 |
| 0.7824 | 69.9 | 560 | 2.3595 | 0.6971 |
| 0.7824 | 70.9 | 568 | 2.1996 | 0.7004 |
| 0.7776 | 71.9 | 576 | 2.2957 | 0.6902 |
| 0.7205 | 72.9 | 584 | 2.2436 | 0.6908 |
| 0.7074 | 73.9 | 592 | 2.2361 | 0.6932 |
| 0.7237 | 74.9 | 600 | 2.2078 | 0.6857 |
| 0.7237 | 75.9 | 608 | 2.2334 | 0.6905 |
| 0.7862 | 76.9 | 616 | 2.3565 | 0.6977 |
| 0.7299 | 77.9 | 624 | 2.1293 | 0.6779 |
| 0.6755 | 78.9 | 632 | 2.2524 | 0.6860 |
| 0.724 | 79.9 | 640 | 2.2069 | 0.6887 |
| 0.724 | 80.9 | 648 | 2.5267 | 0.6785 |
| 0.7878 | 81.9 | 656 | 2.6394 | 0.6824 |
| 0.6882 | 82.9 | 664 | 2.4648 | 0.6764 |
| 0.6996 | 83.9 | 672 | 2.4116 | 0.6890 |
| 0.7149 | 84.9 | 680 | 2.1044 | 0.6893 |
| 0.7149 | 85.9 | 688 | 2.1448 | 0.6917 |
| 0.7499 | 86.9 | 696 | 2.2603 | 0.6875 |
| 0.6881 | 87.9 | 704 | 2.1306 | 0.6815 |
| 0.6652 | 88.9 | 712 | 2.1952 | 0.6905 |
| 0.7093 | 89.9 | 720 | 2.3550 | 0.6767 |
| 0.7093 | 90.9 | 728 | 2.2610 | 0.6749 |
| 0.7439 | 91.9 | 736 | 2.1472 | 0.6857 |
| 0.6898 | 92.9 | 744 | 2.0744 | 0.6881 |
| 0.6734 | 93.9 | 752 | 2.0898 | 0.6929 |
| 0.6926 | 94.9 | 760 | 2.0896 | 0.6684 |
| 0.6926 | 95.9 | 768 | 2.1929 | 0.6812 |
| 0.7154 | 96.9 | 776 | 2.1538 | 0.6860 |
| 0.6493 | 97.9 | 784 | 2.1438 | 0.6815 |
| 0.6755 | 98.9 | 792 | 2.1561 | 0.6902 |
| 0.6667 | 99.9 | 800 | 2.0767 | 0.6908 |
| 0.6667 | 100.9 | 808 | 2.1064 | 0.6785 |
| 0.7016 | 101.9 | 816 | 2.2278 | 0.6767 |
| 0.6726 | 102.9 | 824 | 2.2616 | 0.6690 |
| 0.6725 | 103.9 | 832 | 2.1331 | 0.6878 |
| 0.6657 | 104.9 | 840 | 2.1497 | 0.6732 |
| 0.6657 | 105.9 | 848 | 2.1601 | 0.6738 |
| 0.6989 | 106.9 | 856 | 2.3191 | 0.6675 |
| 0.6658 | 107.9 | 864 | 2.3547 | 0.6788 |
| 0.6398 | 108.9 | 872 | 2.3368 | 0.6740 |
| 0.6465 | 109.9 | 880 | 2.1896 | 0.6806 |
| 0.6465 | 110.9 | 888 | 2.1210 | 0.6797 |
| 0.727 | 111.9 | 896 | 2.3508 | 0.6687 |
| 0.6409 | 112.9 | 904 | 2.3440 | 0.6752 |
| 0.6573 | 113.9 | 912 | 2.2695 | 0.6624 |
| 0.645 | 114.9 | 920 | 2.1471 | 0.6770 |
| 0.645 | 115.9 | 928 | 2.1867 | 0.6743 |
| 0.7103 | 116.9 | 936 | 2.2330 | 0.6702 |
| 0.6214 | 117.9 | 944 | 2.2174 | 0.6687 |
| 0.6134 | 118.9 | 952 | 2.1980 | 0.6621 |
| 0.6612 | 119.9 | 960 | 2.2891 | 0.6749 |
| 0.6612 | 120.9 | 968 | 2.2863 | 0.6642 |
| 0.688 | 121.9 | 976 | 2.3198 | 0.6669 |
| 0.6451 | 122.9 | 984 | 2.1696 | 0.6669 |
| 0.6308 | 123.9 | 992 | 2.1255 | 0.6597 |
| 0.6359 | 124.9 | 1000 | 2.2053 | 0.6570 |
| 0.6359 | 125.9 | 1008 | 2.1915 | 0.6582 |
| 0.6845 | 126.9 | 1016 | 2.1406 | 0.6657 |
| 0.6609 | 127.9 | 1024 | 2.1852 | 0.6752 |
| 0.6345 | 128.9 | 1032 | 2.1838 | 0.6621 |
| 0.6055 | 129.9 | 1040 | 2.1586 | 0.6702 |
| 0.6055 | 130.9 | 1048 | 2.1627 | 0.6681 |
| 0.6737 | 131.9 | 1056 | 2.2631 | 0.6761 |
| 0.6237 | 132.9 | 1064 | 2.2554 | 0.6621 |
| 0.6468 | 133.9 | 1072 | 2.2539 | 0.6669 |
| 0.5948 | 134.9 | 1080 | 2.2464 | 0.6516 |
| 0.5948 | 135.9 | 1088 | 2.3491 | 0.6621 |
| 0.6645 | 136.9 | 1096 | 2.2537 | 0.6621 |
| 0.6195 | 137.9 | 1104 | 2.3717 | 0.6666 |
| 0.6317 | 138.9 | 1112 | 2.2025 | 0.6552 |
| 0.6336 | 139.9 | 1120 | 2.1422 | 0.6624 |
| 0.6336 | 140.9 | 1128 | 2.1062 | 0.6606 |
| 0.664 | 141.9 | 1136 | 2.2254 | 0.6597 |
| 0.6047 | 142.9 | 1144 | 2.3226 | 0.6540 |
| 0.6173 | 143.9 | 1152 | 2.2279 | 0.6684 |
| 0.6466 | 144.9 | 1160 | 2.1866 | 0.6573 |
| 0.6466 | 145.9 | 1168 | 2.2489 | 0.6591 |
| 0.6585 | 146.9 | 1176 | 2.2274 | 0.6480 |
| 0.6244 | 147.9 | 1184 | 2.1959 | 0.6627 |
| 0.6527 | 148.9 | 1192 | 2.2115 | 0.6594 |
| 0.6247 | 149.9 | 1200 | 2.2805 | 0.6621 |
| 0.6247 | 150.9 | 1208 | 2.2129 | 0.6579 |
| 0.6614 | 151.9 | 1216 | 2.2385 | 0.6636 |
| 0.6309 | 152.9 | 1224 | 2.2757 | 0.6615 |
| 0.6501 | 153.9 | 1232 | 2.3266 | 0.6648 |
| 0.5869 | 154.9 | 1240 | 2.3361 | 0.6633 |
| 0.5869 | 155.9 | 1248 | 2.3452 | 0.6540 |
| 0.6676 | 156.9 | 1256 | 2.2800 | 0.6615 |
| 0.6494 | 157.9 | 1264 | 2.3058 | 0.6663 |
| 0.6017 | 158.9 | 1272 | 2.2906 | 0.6663 |
| 0.6266 | 159.9 | 1280 | 2.2316 | 0.6597 |
| 0.6266 | 160.9 | 1288 | 2.1886 | 0.6711 |
| 0.6704 | 161.9 | 1296 | 2.3184 | 0.6591 |
| 0.6239 | 162.9 | 1304 | 2.3544 | 0.6618 |
| 0.5997 | 163.9 | 1312 | 2.2984 | 0.6678 |
| 0.6228 | 164.9 | 1320 | 2.2930 | 0.6693 |
| 0.6228 | 165.9 | 1328 | 2.3272 | 0.6585 |
| 0.6683 | 166.9 | 1336 | 2.3457 | 0.6573 |
| 0.598 | 167.9 | 1344 | 2.2178 | 0.6639 |
| 0.6164 | 168.9 | 1352 | 2.1439 | 0.6543 |
| 0.5963 | 169.9 | 1360 | 2.1239 | 0.6513 |
| 0.5963 | 170.9 | 1368 | 2.1392 | 0.6594 |
| 0.6782 | 171.9 | 1376 | 2.1292 | 0.6579 |
| 0.5783 | 172.9 | 1384 | 2.1257 | 0.6597 |
| 0.6087 | 173.9 | 1392 | 2.1253 | 0.6594 |
| 0.6045 | 174.9 | 1400 | 2.1333 | 0.6561 |
| 0.6045 | 175.9 | 1408 | 2.1042 | 0.6507 |
| 0.6299 | 176.9 | 1416 | 2.1110 | 0.6570 |
| 0.6401 | 177.9 | 1424 | 2.1161 | 0.6612 |
| 0.622 | 178.9 | 1432 | 2.1684 | 0.6483 |
| 0.599 | 179.9 | 1440 | 2.1906 | 0.6552 |
| 0.599 | 180.9 | 1448 | 2.2258 | 0.6492 |
| 0.6516 | 181.9 | 1456 | 2.2038 | 0.6537 |
| 0.5907 | 182.9 | 1464 | 2.1949 | 0.6534 |
| 0.5979 | 183.9 | 1472 | 2.1962 | 0.6531 |
| 0.6064 | 184.9 | 1480 | 2.1943 | 0.6498 |
| 0.6064 | 185.9 | 1488 | 2.1708 | 0.6525 |
| 0.6363 | 186.9 | 1496 | 2.1660 | 0.6561 |
| 0.6257 | 187.9 | 1504 | 2.1741 | 0.6573 |
| 0.6128 | 188.9 | 1512 | 2.1726 | 0.6564 |
| 0.602 | 189.9 | 1520 | 2.1689 | 0.6555 |
| 0.602 | 190.9 | 1528 | 2.1702 | 0.6567 |
| 0.645 | 191.9 | 1536 | 2.1752 | 0.6591 |
| 0.5916 | 192.9 | 1544 | 2.1907 | 0.6561 |
| 0.5853 | 193.9 | 1552 | 2.1866 | 0.6546 |
| 0.5735 | 194.9 | 1560 | 2.1830 | 0.6555 |
| 0.5735 | 195.9 | 1568 | 2.1760 | 0.6564 |
| 0.6294 | 196.9 | 1576 | 2.1679 | 0.6579 |
| 0.6149 | 197.9 | 1584 | 2.1632 | 0.6576 |
| 0.5761 | 198.9 | 1592 | 2.1614 | 0.6573 |
| 0.6111 | 199.9 | 1600 | 2.1612 | 0.6576 |
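The headline numbers come from the final epoch, which is not the best row in the table: validation loss bottoms out earlier (2.0744 at step 744) and so does WER (0.6480 at step 1176). A small sketch of picking the best checkpoint by either metric, using a few (step, validation_loss, wer) tuples copied from the rows above; which checkpoint was actually kept for this repo is not stated in the card:

```python
# A few (step, validation_loss, wer) rows copied from the table above.
rows = [
    (744, 2.0744, 0.6881),   # lowest validation loss in the table
    (1176, 2.2274, 0.6480),  # lowest WER in the table
    (1600, 2.1612, 0.6576),  # final epoch (the reported result)
]

best_by_loss = min(rows, key=lambda r: r[1])
best_by_wer = min(rows, key=lambda r: r[2])
print(best_by_loss)  # (744, 2.0744, 0.6881)
print(best_by_wer)   # (1176, 2.2274, 0.6480)
```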

### Framework versions

- Transformers 4.24.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1