---
tags:
- generated_from_trainer
datasets:
- ai_light_dance
model-index:
- name: ai-light-dance_drums_ft_pretrain_wav2vec2-base
  results: []
---

# ai-light-dance_drums_ft_pretrain_wav2vec2-base

This model is a fine-tuned version of gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base on the ai_light_dance dataset. It achieves the following results on the evaluation set:

- Loss: 2.0090
- Wer: 0.6277
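The reported Wer is the word error rate: the word-level edit distance between the decoded output and the reference transcription, divided by the reference length. As a rough illustration of how such a score is computed (the actual evaluation uses the standard WER metric from the `datasets`/`evaluate` ecosystem; the drum-token strings below are made up for the example), a minimal sketch:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance over reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,        # deletion
                dp[i][j - 1] + 1,        # insertion
                dp[i - 1][j - 1] + cost, # substitution or match
            )
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("kick snare hat kick", "kick snare kick"))  # 1 deletion / 4 words = 0.25
```

A Wer of 0.6277 therefore means roughly 63 word-level errors per 100 reference words on the evaluation set.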

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 200.0
- mixed_precision_training: Native AMP
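Two of these values follow from the others: the total_train_batch_size of 64 is train_batch_size × gradient_accumulation_steps (8 × 8), and the linear scheduler warms the learning rate up over 20 optimizer steps, then decays it linearly to zero over the run (1600 optimizer steps in total, matching the final step in the training results). A sketch of that schedule, assuming the standard linear-warmup/linear-decay shape used by `transformers` (`get_linear_schedule_with_warmup`):

```python
LEARNING_RATE = 4e-4  # learning_rate above
WARMUP_STEPS = 20     # lr_scheduler_warmup_steps above
TOTAL_STEPS = 1600    # 200 epochs x 8 optimizer steps/epoch (see training results)

def linear_schedule_lr(step: int) -> float:
    """Learning rate at a given optimizer step: linear ramp up to
    LEARNING_RATE over WARMUP_STEPS, then linear decay to 0 at TOTAL_STEPS."""
    if step < WARMUP_STEPS:
        return LEARNING_RATE * step / WARMUP_STEPS
    return LEARNING_RATE * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

# Effective batch size: each optimizer step accumulates gradients
# over 8 forward passes of 8 examples each.
effective_batch = 8 * 8
assert effective_batch == 64  # matches total_train_batch_size
```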

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 0.9 | 8 | 2.0373 | 0.6779 |
| 0.6674 | 1.9 | 16 | 2.2593 | 0.6749 |
| 0.6653 | 2.9 | 24 | 2.0434 | 0.6669 |
| 0.7249 | 3.9 | 32 | 2.1790 | 0.6935 |
| 0.683 | 4.9 | 40 | 2.1210 | 0.6866 |
| 0.683 | 5.9 | 48 | 2.2408 | 0.6803 |
| 0.7518 | 6.9 | 56 | 2.2883 | 0.6902 |
| 0.686 | 7.9 | 64 | 2.1073 | 0.6818 |
| 0.6771 | 8.9 | 72 | 2.3409 | 0.6690 |
| 0.6593 | 9.9 | 80 | 2.4715 | 0.6603 |
| 0.6593 | 10.9 | 88 | 2.0975 | 0.6723 |
| 0.7433 | 11.9 | 96 | 2.0338 | 0.6729 |
| 0.6497 | 12.9 | 104 | 2.1415 | 0.6824 |
| 0.6497 | 13.9 | 112 | 2.1818 | 0.6654 |
| 0.6799 | 14.9 | 120 | 2.0864 | 0.6755 |
| 0.6799 | 15.9 | 128 | 2.4925 | 0.6797 |
| 0.7459 | 16.9 | 136 | 2.3355 | 0.6860 |
| 0.6576 | 17.9 | 144 | 2.2341 | 0.6705 |
| 0.6798 | 18.9 | 152 | 2.1420 | 0.6615 |
| 0.6479 | 19.9 | 160 | 2.4265 | 0.6755 |
| 0.6479 | 20.9 | 168 | 2.3000 | 0.6944 |
| 0.708 | 21.9 | 176 | 2.2466 | 0.6732 |
| 0.6596 | 22.9 | 184 | 2.4366 | 0.6776 |
| 0.643 | 23.9 | 192 | 2.0910 | 0.6513 |
| 0.6644 | 24.9 | 200 | 2.2660 | 0.6645 |
| 0.6644 | 25.9 | 208 | 2.1543 | 0.6764 |
| 0.6936 | 26.9 | 216 | 2.1505 | 0.6699 |
| 0.6338 | 27.9 | 224 | 2.2813 | 0.6708 |
| 0.6393 | 28.9 | 232 | 2.1040 | 0.6597 |
| 0.6378 | 29.9 | 240 | 2.2749 | 0.6740 |
| 0.6378 | 30.9 | 248 | 2.1098 | 0.6612 |
| 0.6829 | 31.9 | 256 | 2.1962 | 0.6513 |
| 0.6002 | 32.9 | 264 | 2.1311 | 0.6618 |
| 0.6656 | 33.9 | 272 | 2.2651 | 0.6510 |
| 0.633 | 34.9 | 280 | 2.2622 | 0.6513 |
| 0.633 | 35.9 | 288 | 2.2586 | 0.6621 |
| 0.6644 | 36.9 | 296 | 2.4158 | 0.6594 |
| 0.6235 | 37.9 | 304 | 2.4254 | 0.6477 |
| 0.6041 | 38.9 | 312 | 2.3081 | 0.6633 |
| 0.6215 | 39.9 | 320 | 2.4257 | 0.6498 |
| 0.6215 | 40.9 | 328 | 2.3012 | 0.6366 |
| 0.6684 | 41.9 | 336 | 2.2060 | 0.6585 |
| 0.6201 | 42.9 | 344 | 2.0308 | 0.6681 |
| 0.5957 | 43.9 | 352 | 2.1375 | 0.6576 |
| 0.6158 | 44.9 | 360 | 2.0826 | 0.6672 |
| 0.6158 | 45.9 | 368 | 2.1990 | 0.6585 |
| 0.6291 | 46.9 | 376 | 2.0542 | 0.6651 |
| 0.5924 | 47.9 | 384 | 2.0573 | 0.6540 |
| 0.6063 | 48.9 | 392 | 2.2484 | 0.6531 |
| 0.5984 | 49.9 | 400 | 2.0362 | 0.6606 |
| 0.5984 | 50.9 | 408 | 2.1028 | 0.6555 |
| 0.6309 | 51.9 | 416 | 2.2151 | 0.6591 |
| 0.5979 | 52.9 | 424 | 2.0955 | 0.6609 |
| 0.5941 | 53.9 | 432 | 2.2526 | 0.6732 |
| 0.5897 | 54.9 | 440 | 2.3852 | 0.6543 |
| 0.5897 | 55.9 | 448 | 2.0804 | 0.6564 |
| 0.6443 | 56.9 | 456 | 2.0027 | 0.6570 |
| 0.6055 | 57.9 | 464 | 2.1207 | 0.6636 |
| 0.5422 | 58.9 | 472 | 2.2516 | 0.6618 |
| 0.5879 | 59.9 | 480 | 2.0028 | 0.6549 |
| 0.5879 | 60.9 | 488 | 2.3659 | 0.6576 |
| 0.638 | 61.9 | 496 | 2.4999 | 0.6552 |
| 0.5874 | 62.9 | 504 | 2.3140 | 0.6483 |
| 0.5829 | 63.9 | 512 | 2.2087 | 0.6408 |
| 0.5632 | 64.9 | 520 | 2.1989 | 0.6534 |
| 0.5632 | 65.9 | 528 | 2.2446 | 0.6624 |
| 0.6143 | 66.9 | 536 | 2.1099 | 0.6420 |
| 0.5924 | 67.9 | 544 | 2.2372 | 0.6423 |
| 0.5239 | 68.9 | 552 | 2.3488 | 0.6453 |
| 0.5883 | 69.9 | 560 | 2.1961 | 0.6438 |
| 0.5883 | 70.9 | 568 | 2.2004 | 0.6309 |
| 0.5918 | 71.9 | 576 | 2.0202 | 0.6336 |
| 0.5602 | 72.9 | 584 | 2.0784 | 0.6372 |
| 0.5323 | 73.9 | 592 | 2.1598 | 0.6573 |
| 0.5584 | 74.9 | 600 | 2.1241 | 0.6351 |
| 0.5584 | 75.9 | 608 | 2.1136 | 0.6381 |
| 0.5979 | 76.9 | 616 | 2.1425 | 0.6330 |
| 0.5525 | 77.9 | 624 | 2.1256 | 0.6304 |
| 0.5197 | 78.9 | 632 | 2.0802 | 0.6312 |
| 0.5509 | 79.9 | 640 | 2.1101 | 0.6369 |
| 0.5509 | 80.9 | 648 | 2.0785 | 0.6348 |
| 0.6176 | 81.9 | 656 | 2.0631 | 0.6271 |
| 0.5294 | 82.9 | 664 | 2.1448 | 0.6357 |
| 0.5399 | 83.9 | 672 | 2.2993 | 0.6423 |
| 0.5507 | 84.9 | 680 | 2.2422 | 0.6441 |
| 0.5507 | 85.9 | 688 | 2.1183 | 0.6429 |
| 0.5813 | 86.9 | 696 | 2.1622 | 0.6408 |
| 0.5319 | 87.9 | 704 | 2.0703 | 0.6363 |
| 0.5247 | 88.9 | 712 | 2.1978 | 0.6411 |
| 0.5607 | 89.9 | 720 | 2.3071 | 0.6411 |
| 0.5607 | 90.9 | 728 | 2.2638 | 0.6304 |
| 0.5796 | 91.9 | 736 | 2.1073 | 0.6441 |
| 0.5521 | 92.9 | 744 | 2.0579 | 0.6456 |
| 0.5625 | 93.9 | 752 | 2.0664 | 0.6501 |
| 0.5901 | 94.9 | 760 | 2.0674 | 0.6327 |
| 0.5901 | 95.9 | 768 | 2.1852 | 0.6381 |
| 0.5974 | 96.9 | 776 | 2.2212 | 0.6387 |
| 0.5359 | 97.9 | 784 | 2.1028 | 0.6390 |
| 0.5643 | 98.9 | 792 | 2.1438 | 0.6516 |
| 0.5488 | 99.9 | 800 | 2.1104 | 0.6447 |
| 0.5488 | 100.9 | 808 | 2.1390 | 0.6399 |
| 0.5906 | 101.9 | 816 | 2.3833 | 0.6387 |
| 0.5735 | 102.9 | 824 | 2.4907 | 0.6304 |
| 0.5617 | 103.9 | 832 | 2.1177 | 0.6438 |
| 0.5547 | 104.9 | 840 | 2.0854 | 0.6366 |
| 0.5547 | 105.9 | 848 | 2.1921 | 0.6411 |
| 0.5805 | 106.9 | 856 | 2.2754 | 0.6312 |
| 0.5455 | 107.9 | 864 | 2.2802 | 0.6348 |
| 0.5342 | 108.9 | 872 | 2.3219 | 0.6324 |
| 0.5372 | 109.9 | 880 | 2.0900 | 0.6423 |
| 0.5372 | 110.9 | 888 | 1.9905 | 0.6351 |
| 0.6146 | 111.9 | 896 | 2.2073 | 0.6295 |
| 0.5517 | 112.9 | 904 | 2.2818 | 0.6387 |
| 0.5501 | 113.9 | 912 | 2.4256 | 0.6318 |
| 0.5469 | 114.9 | 920 | 2.2074 | 0.6411 |
| 0.5469 | 115.9 | 928 | 2.2370 | 0.6283 |
| 0.6065 | 116.9 | 936 | 2.2339 | 0.6268 |
| 0.5265 | 117.9 | 944 | 2.2718 | 0.6235 |
| 0.512 | 118.9 | 952 | 2.1963 | 0.6333 |
| 0.5571 | 119.9 | 960 | 2.2201 | 0.6348 |
| 0.5571 | 120.9 | 968 | 2.1106 | 0.6330 |
| 0.5778 | 121.9 | 976 | 2.3302 | 0.6306 |
| 0.539 | 122.9 | 984 | 2.3715 | 0.6274 |
| 0.5306 | 123.9 | 992 | 2.2417 | 0.6351 |
| 0.5271 | 124.9 | 1000 | 2.1695 | 0.6250 |
| 0.5271 | 125.9 | 1008 | 2.1912 | 0.6280 |
| 0.5766 | 126.9 | 1016 | 2.1122 | 0.6339 |
| 0.5483 | 127.9 | 1024 | 2.0696 | 0.6321 |
| 0.5414 | 128.9 | 1032 | 2.0935 | 0.6315 |
| 0.5125 | 129.9 | 1040 | 2.1693 | 0.6336 |
| 0.5125 | 130.9 | 1048 | 2.1351 | 0.6315 |
| 0.5733 | 131.9 | 1056 | 2.1570 | 0.6405 |
| 0.5285 | 132.9 | 1064 | 2.1997 | 0.6309 |
| 0.5426 | 133.9 | 1072 | 2.1216 | 0.6321 |
| 0.5018 | 134.9 | 1080 | 2.1742 | 0.6247 |
| 0.5018 | 135.9 | 1088 | 2.1208 | 0.6304 |
| 0.5611 | 136.9 | 1096 | 2.1228 | 0.6304 |
| 0.5258 | 137.9 | 1104 | 2.2256 | 0.6309 |
| 0.5364 | 138.9 | 1112 | 2.1623 | 0.6306 |
| 0.528 | 139.9 | 1120 | 2.0064 | 0.6289 |
| 0.528 | 140.9 | 1128 | 2.0472 | 0.6298 |
| 0.5637 | 141.9 | 1136 | 2.1907 | 0.6318 |
| 0.5051 | 142.9 | 1144 | 2.1570 | 0.6292 |
| 0.523 | 143.9 | 1152 | 2.0497 | 0.6423 |
| 0.5516 | 144.9 | 1160 | 2.0907 | 0.6324 |
| 0.5516 | 145.9 | 1168 | 2.1479 | 0.6309 |
| 0.5524 | 146.9 | 1176 | 2.0570 | 0.6274 |
| 0.5215 | 147.9 | 1184 | 2.1380 | 0.6339 |
| 0.5447 | 148.9 | 1192 | 2.2314 | 0.6304 |
| 0.521 | 149.9 | 1200 | 2.1473 | 0.6333 |
| 0.521 | 150.9 | 1208 | 2.1240 | 0.6292 |
| 0.5501 | 151.9 | 1216 | 2.1306 | 0.6217 |
| 0.5309 | 152.9 | 1224 | 2.1294 | 0.6315 |
| 0.5293 | 153.9 | 1232 | 2.2013 | 0.6286 |
| 0.4898 | 154.9 | 1240 | 2.2169 | 0.6292 |
| 0.4898 | 155.9 | 1248 | 2.2271 | 0.6238 |
| 0.559 | 156.9 | 1256 | 2.1804 | 0.6277 |
| 0.5451 | 157.9 | 1264 | 2.1884 | 0.6304 |
| 0.5072 | 158.9 | 1272 | 2.2299 | 0.6309 |
| 0.5259 | 159.9 | 1280 | 2.1661 | 0.6259 |
| 0.5259 | 160.9 | 1288 | 2.1579 | 0.6265 |
| 0.5609 | 161.9 | 1296 | 2.2086 | 0.6169 |
| 0.5168 | 162.9 | 1304 | 2.1466 | 0.6223 |
| 0.4984 | 163.9 | 1312 | 2.1418 | 0.6259 |
| 0.5254 | 164.9 | 1320 | 2.1172 | 0.6283 |
| 0.5254 | 165.9 | 1328 | 2.0919 | 0.6247 |
| 0.5685 | 166.9 | 1336 | 2.1055 | 0.6262 |
| 0.4952 | 167.9 | 1344 | 2.0839 | 0.6253 |
| 0.5024 | 168.9 | 1352 | 2.0244 | 0.6256 |
| 0.5028 | 169.9 | 1360 | 2.0158 | 0.6241 |
| 0.5028 | 170.9 | 1368 | 2.0097 | 0.6241 |
| 0.5731 | 171.9 | 1376 | 1.9885 | 0.6217 |
| 0.4829 | 172.9 | 1384 | 1.9992 | 0.6238 |
| 0.5101 | 173.9 | 1392 | 1.9918 | 0.6211 |
| 0.5058 | 174.9 | 1400 | 1.9633 | 0.6283 |
| 0.5058 | 175.9 | 1408 | 1.9551 | 0.6229 |
| 0.5182 | 176.9 | 1416 | 2.0169 | 0.6163 |
| 0.5443 | 177.9 | 1424 | 2.0160 | 0.6187 |
| 0.522 | 178.9 | 1432 | 2.0600 | 0.6178 |
| 0.5042 | 179.9 | 1440 | 2.0367 | 0.6259 |
| 0.5042 | 180.9 | 1448 | 2.0717 | 0.6187 |
| 0.5486 | 181.9 | 1456 | 2.0583 | 0.6253 |
| 0.4946 | 182.9 | 1464 | 2.0680 | 0.6262 |
| 0.5056 | 183.9 | 1472 | 2.0847 | 0.6220 |
| 0.513 | 184.9 | 1480 | 2.0797 | 0.6232 |
| 0.513 | 185.9 | 1488 | 2.0560 | 0.6226 |
| 0.5334 | 186.9 | 1496 | 2.0349 | 0.6289 |
| 0.5265 | 187.9 | 1504 | 2.0137 | 0.6277 |
| 0.5135 | 188.9 | 1512 | 2.0228 | 0.6259 |
| 0.5062 | 189.9 | 1520 | 2.0344 | 0.6244 |
| 0.5062 | 190.9 | 1528 | 2.0259 | 0.6253 |
| 0.5459 | 191.9 | 1536 | 2.0191 | 0.6262 |
| 0.4993 | 192.9 | 1544 | 2.0224 | 0.6250 |
| 0.4965 | 193.9 | 1552 | 2.0135 | 0.6274 |
| 0.4827 | 194.9 | 1560 | 2.0027 | 0.6289 |
| 0.4827 | 195.9 | 1568 | 2.0065 | 0.6265 |
| 0.5181 | 196.9 | 1576 | 2.0064 | 0.6271 |
| 0.518 | 197.9 | 1584 | 2.0078 | 0.6277 |
| 0.4807 | 198.9 | 1592 | 2.0084 | 0.6277 |
| 0.5078 | 199.9 | 1600 | 2.0090 | 0.6277 |

### Framework versions

- Transformers 4.24.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1