gary109 committed on
Commit 5973dfd
1 Parent(s): 8a7d735

update model card README.md

Files changed (1):
  1. README.md (+122 −111)
README.md CHANGED
@@ -1,7 +1,5 @@
 ---
 tags:
-- automatic-speech-recognition
-- gary109/AI_Light_Dance
 - generated_from_trainer
 datasets:
 - ai_light_dance
@@ -9,7 +7,20 @@ metrics:
 - wer
 model-index:
 - name: ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v3
-  results: []
+  results:
+  - task:
+      name: Automatic Speech Recognition
+      type: automatic-speech-recognition
+    dataset:
+      name: ai_light_dance
+      type: ai_light_dance
+      config: onset-idmt-smt-drums-v2+MDBDrums
+      split: train
+      args: onset-idmt-smt-drums-v2+MDBDrums
+    metrics:
+    - name: Wer
+      type: wer
+      value: 0.2967818831942789
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -17,10 +28,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v3
 
-This model is a fine-tuned version of [gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v3](https://huggingface.co/gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v3) on the GARY109/AI_LIGHT_DANCE - ONSET-IDMT-SMT-DRUMS-V2+MDBDRUMS dataset.
+This model is a fine-tuned version of [gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v3](https://huggingface.co/gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new-v3) on the ai_light_dance dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.5634
-- Wer: 0.3290
+- Loss: 0.5774
+- Wer: 0.2968
 
 ## Model description
 
@@ -39,12 +50,12 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 0.0003
-- train_batch_size: 4
-- eval_batch_size: 4
+- learning_rate: 0.0001
+- train_batch_size: 2
+- eval_batch_size: 2
 - seed: 42
-- gradient_accumulation_steps: 4
-- total_train_batch_size: 16
+- gradient_accumulation_steps: 2
+- total_train_batch_size: 4
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 100
@@ -55,106 +66,106 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Wer |
 |:-------------:|:-----:|:----:|:---------------:|:------:|
-| 0.5063 | 0.98 | 11 | 0.6068 | 0.4029 |
-| 0.3405 | 1.98 | 22 | 0.6141 | 0.4005 |
-| 0.3751 | 2.98 | 33 | 0.6488 | 0.3838 |
-| 0.3557 | 3.98 | 44 | 0.6436 | 0.3814 |
-| 0.3535 | 4.98 | 55 | 0.6866 | 0.3754 |
-| 0.3523 | 5.98 | 66 | 0.6454 | 0.3886 |
-| 0.3608 | 6.98 | 77 | 0.6975 | 0.3921 |
-| 0.3566 | 7.98 | 88 | 0.6382 | 0.3838 |
-| 0.3668 | 8.98 | 99 | 0.6750 | 0.3802 |
-| 0.3662 | 9.98 | 110 | 0.6258 | 0.3933 |
-| 0.3549 | 10.98 | 121 | 0.6575 | 0.3909 |
-| 0.3696 | 11.98 | 132 | 0.8053 | 0.3814 |
-| 0.3475 | 12.98 | 143 | 0.7944 | 0.3659 |
-| 0.3783 | 13.98 | 154 | 0.7665 | 0.3719 |
-| 0.3546 | 14.98 | 165 | 0.6892 | 0.3778 |
-| 0.3398 | 15.98 | 176 | 0.6846 | 0.3790 |
-| 0.7065 | 16.98 | 187 | 0.8373 | 0.3814 |
-| 0.3939 | 17.98 | 198 | 0.6791 | 0.3850 |
-| 0.3668 | 18.98 | 209 | 0.6238 | 0.4041 |
-| 0.3357 | 19.98 | 220 | 0.6239 | 0.3826 |
-| 0.3452 | 20.98 | 231 | 0.6344 | 0.3588 |
-| 0.3446 | 21.98 | 242 | 0.6696 | 0.3814 |
-| 0.3232 | 22.98 | 253 | 0.6717 | 0.3552 |
-| 0.3607 | 23.98 | 264 | 0.5785 | 0.3695 |
-| 0.3382 | 24.98 | 275 | 0.6849 | 0.3552 |
-| 0.3326 | 25.98 | 286 | 0.6569 | 0.3623 |
-| 0.3239 | 26.98 | 297 | 0.6621 | 0.3600 |
-| 0.3053 | 27.98 | 308 | 0.6053 | 0.3421 |
-| 0.3062 | 28.98 | 319 | 0.5786 | 0.3647 |
-| 0.2916 | 29.98 | 330 | 0.5802 | 0.3814 |
-| 0.3104 | 30.98 | 341 | 0.6721 | 0.3564 |
-| 0.3157 | 31.98 | 352 | 0.6086 | 0.3671 |
-| 0.3607 | 32.98 | 363 | 0.6838 | 0.3778 |
-| 0.3057 | 33.98 | 374 | 0.7922 | 0.3576 |
-| 0.2828 | 34.98 | 385 | 0.7821 | 0.3647 |
-| 0.311 | 35.98 | 396 | 0.7524 | 0.3504 |
-| 0.3311 | 36.98 | 407 | 0.7453 | 0.3433 |
-| 0.2871 | 37.98 | 418 | 0.7173 | 0.3373 |
-| 0.3008 | 38.98 | 429 | 0.7352 | 0.3409 |
-| 0.2821 | 39.98 | 440 | 0.6026 | 0.3421 |
-| 0.2845 | 40.98 | 451 | 0.6171 | 0.3266 |
-| 0.2758 | 41.98 | 462 | 0.5992 | 0.3325 |
-| 0.2838 | 42.98 | 473 | 0.6034 | 0.3302 |
-| 0.2783 | 43.98 | 484 | 0.6026 | 0.3361 |
-| 0.2889 | 44.98 | 495 | 0.5918 | 0.3337 |
-| 0.2871 | 45.98 | 506 | 0.6071 | 0.3409 |
-| 0.2708 | 46.98 | 517 | 0.6761 | 0.3385 |
-| 0.2975 | 47.98 | 528 | 0.6716 | 0.3302 |
-| 0.2709 | 48.98 | 539 | 0.6057 | 0.3468 |
-| 0.3136 | 49.98 | 550 | 0.5900 | 0.3409 |
-| 0.27 | 50.98 | 561 | 0.5971 | 0.3385 |
-| 0.2763 | 51.98 | 572 | 0.5792 | 0.3433 |
-| 0.2763 | 52.98 | 583 | 0.5660 | 0.3635 |
-| 0.2602 | 53.98 | 594 | 0.5739 | 0.3468 |
-| 0.2611 | 54.98 | 605 | 0.5796 | 0.3409 |
-| 0.2536 | 55.98 | 616 | 0.5740 | 0.3528 |
-| 0.2954 | 56.98 | 627 | 0.5749 | 0.3409 |
-| 0.2404 | 57.98 | 638 | 0.6084 | 0.3397 |
-| 0.2562 | 58.98 | 649 | 0.5996 | 0.3445 |
-| 0.2531 | 59.98 | 660 | 0.6330 | 0.3373 |
-| 0.2413 | 60.98 | 671 | 0.6049 | 0.3492 |
-| 0.2617 | 61.98 | 682 | 0.6037 | 0.3445 |
-| 0.2648 | 62.98 | 693 | 0.6109 | 0.3445 |
-| 0.2547 | 63.98 | 704 | 0.5955 | 0.3397 |
-| 0.2473 | 64.98 | 715 | 0.6020 | 0.3421 |
-| 0.2332 | 65.98 | 726 | 0.5810 | 0.3456 |
-| 0.2391 | 66.98 | 737 | 0.5788 | 0.3385 |
-| 0.2451 | 67.98 | 748 | 0.5795 | 0.3468 |
-| 0.2517 | 68.98 | 759 | 0.5896 | 0.3373 |
-| 0.2549 | 69.98 | 770 | 0.5835 | 0.3313 |
-| 0.2458 | 70.98 | 781 | 0.5930 | 0.3206 |
-| 0.237 | 71.98 | 792 | 0.5989 | 0.3218 |
-| 0.256 | 72.98 | 803 | 0.5663 | 0.3445 |
-| 0.2432 | 73.98 | 814 | 0.5931 | 0.3409 |
-| 0.2575 | 74.98 | 825 | 0.6022 | 0.3337 |
-| 0.2325 | 75.98 | 836 | 0.5875 | 0.3361 |
-| 0.2545 | 76.98 | 847 | 0.5987 | 0.3433 |
-| 0.2419 | 77.98 | 858 | 0.6004 | 0.3337 |
-| 0.2426 | 78.98 | 869 | 0.5892 | 0.3278 |
-| 0.2452 | 79.98 | 880 | 0.6037 | 0.3278 |
-| 0.237 | 80.98 | 891 | 0.5972 | 0.3349 |
-| 0.2692 | 81.98 | 902 | 0.5826 | 0.3361 |
-| 0.2445 | 82.98 | 913 | 0.5837 | 0.3278 |
-| 0.2418 | 83.98 | 924 | 0.5880 | 0.3278 |
-| 0.2297 | 84.98 | 935 | 0.5838 | 0.3325 |
-| 0.248 | 85.98 | 946 | 0.5876 | 0.3325 |
-| 0.2328 | 86.98 | 957 | 0.5844 | 0.3313 |
-| 0.238 | 87.98 | 968 | 0.5672 | 0.3278 |
-| 0.3792 | 88.98 | 979 | 0.5694 | 0.3290 |
-| 0.2353 | 89.98 | 990 | 0.5634 | 0.3290 |
-| 0.2352 | 90.98 | 1001 | 0.5751 | 0.3290 |
-| 0.2329 | 91.98 | 1012 | 0.5683 | 0.3313 |
-| 0.366 | 92.98 | 1023 | 0.5717 | 0.3302 |
-| 0.2401 | 93.98 | 1034 | 0.5696 | 0.3302 |
-| 0.2627 | 94.98 | 1045 | 0.5769 | 0.3290 |
-| 0.2279 | 95.98 | 1056 | 0.5720 | 0.3313 |
-| 0.2393 | 96.98 | 1067 | 0.5660 | 0.3325 |
-| 0.244 | 97.98 | 1078 | 0.5660 | 0.3325 |
-| 0.2424 | 98.98 | 1089 | 0.5683 | 0.3325 |
-| 0.229 | 99.98 | 1100 | 0.5707 | 0.3325 |
+| 0.1747 | 1.0 | 45 | 0.5638 | 0.3337 |
+| 0.2339 | 2.0 | 90 | 0.5785 | 0.3254 |
+| 0.2849 | 3.0 | 135 | 0.5586 | 0.3397 |
+| 0.2396 | 4.0 | 180 | 0.5868 | 0.3266 |
+| 0.2272 | 5.0 | 225 | 0.6052 | 0.3230 |
+| 0.2497 | 6.0 | 270 | 0.5913 | 0.3278 |
+| 0.2218 | 7.0 | 315 | 0.5926 | 0.3349 |
+| 0.2584 | 8.0 | 360 | 0.5617 | 0.3218 |
+| 0.2741 | 9.0 | 405 | 0.5901 | 0.3230 |
+| 0.2481 | 10.0 | 450 | 0.5860 | 0.3278 |
+| 0.2504 | 11.0 | 495 | 0.5991 | 0.3123 |
+| 0.2125 | 12.0 | 540 | 0.5992 | 0.3218 |
+| 0.2482 | 13.0 | 585 | 0.5756 | 0.3194 |
+| 0.2135 | 14.0 | 630 | 0.5836 | 0.3302 |
+| 0.2345 | 15.0 | 675 | 0.6347 | 0.3254 |
+| 0.1912 | 16.0 | 720 | 0.6160 | 0.3206 |
+| 0.2117 | 17.0 | 765 | 0.6268 | 0.3099 |
+| 0.2217 | 18.0 | 810 | 0.6873 | 0.3182 |
+| 0.2165 | 19.0 | 855 | 0.6721 | 0.3159 |
+| 0.207 | 20.0 | 900 | 0.6312 | 0.3206 |
+| 0.2263 | 21.0 | 945 | 0.6223 | 0.3290 |
+| 0.2015 | 22.0 | 990 | 0.6319 | 0.3182 |
+| 0.1997 | 23.0 | 1035 | 0.6527 | 0.3135 |
+| 0.2318 | 24.0 | 1080 | 0.5987 | 0.3278 |
+| 0.2196 | 25.0 | 1125 | 0.6269 | 0.3242 |
+| 0.2298 | 26.0 | 1170 | 0.5774 | 0.3254 |
+| 0.2117 | 27.0 | 1215 | 0.5938 | 0.3027 |
+| 0.2553 | 28.0 | 1260 | 0.5831 | 0.3123 |
+| 0.226 | 29.0 | 1305 | 0.6151 | 0.3099 |
+| 0.1635 | 30.0 | 1350 | 0.5622 | 0.3230 |
+| 0.5734 | 31.0 | 1395 | 0.6198 | 0.2920 |
+| 0.2196 | 32.0 | 1440 | 0.5779 | 0.3039 |
+| 0.2019 | 33.0 | 1485 | 0.5866 | 0.3111 |
+| 0.2222 | 34.0 | 1530 | 0.5557 | 0.3063 |
+| 0.2167 | 35.0 | 1575 | 0.5740 | 0.3206 |
+| 0.2011 | 36.0 | 1620 | 0.5598 | 0.3004 |
+| 0.2032 | 37.0 | 1665 | 0.5550 | 0.3147 |
+| 0.225 | 38.0 | 1710 | 0.5794 | 0.3099 |
+| 0.2068 | 39.0 | 1755 | 0.6223 | 0.3063 |
+| 0.2105 | 40.0 | 1800 | 0.5797 | 0.3039 |
+| 0.1968 | 41.0 | 1845 | 0.5681 | 0.2968 |
+| 0.224 | 42.0 | 1890 | 0.5742 | 0.3170 |
+| 0.2351 | 43.0 | 1935 | 0.5567 | 0.3111 |
+| 0.2121 | 44.0 | 1980 | 0.5893 | 0.3039 |
+| 0.1913 | 45.0 | 2025 | 0.6030 | 0.3027 |
+| 0.1636 | 46.0 | 2070 | 0.5812 | 0.3004 |
+| 0.2062 | 47.0 | 2115 | 0.6081 | 0.3004 |
+| 0.2031 | 48.0 | 2160 | 0.5610 | 0.3159 |
+| 0.1892 | 49.0 | 2205 | 0.5863 | 0.3147 |
+| 0.1712 | 50.0 | 2250 | 0.5943 | 0.3159 |
+| 0.1886 | 51.0 | 2295 | 0.5953 | 0.3051 |
+| 0.1748 | 52.0 | 2340 | 0.5761 | 0.3087 |
+| 0.1705 | 53.0 | 2385 | 0.6045 | 0.2872 |
+| 0.1794 | 54.0 | 2430 | 0.5731 | 0.3075 |
+| 0.1815 | 55.0 | 2475 | 0.5949 | 0.2849 |
+| 0.1571 | 56.0 | 2520 | 0.5663 | 0.2884 |
+| 0.1902 | 57.0 | 2565 | 0.5903 | 0.2956 |
+| 0.2057 | 58.0 | 2610 | 0.5820 | 0.2872 |
+| 0.1904 | 59.0 | 2655 | 0.5923 | 0.2896 |
+| 0.1677 | 60.0 | 2700 | 0.5769 | 0.3075 |
+| 0.1859 | 61.0 | 2745 | 0.5566 | 0.3147 |
+| 0.2382 | 62.0 | 2790 | 0.5849 | 0.3051 |
+| 0.1753 | 63.0 | 2835 | 0.5773 | 0.3075 |
+| 0.1651 | 64.0 | 2880 | 0.5877 | 0.3039 |
+| 0.1781 | 65.0 | 2925 | 0.5905 | 0.3027 |
+| 0.1582 | 66.0 | 2970 | 0.5800 | 0.3015 |
+| 0.1538 | 67.0 | 3015 | 0.6025 | 0.3075 |
+| 0.1606 | 68.0 | 3060 | 0.5758 | 0.3039 |
+| 0.1522 | 69.0 | 3105 | 0.5860 | 0.2932 |
+| 0.1521 | 70.0 | 3150 | 0.5896 | 0.2956 |
+| 0.1592 | 71.0 | 3195 | 0.5738 | 0.3027 |
+| 0.2245 | 72.0 | 3240 | 0.5782 | 0.3039 |
+| 0.2185 | 73.0 | 3285 | 0.5722 | 0.3027 |
+| 0.1597 | 74.0 | 3330 | 0.5891 | 0.3004 |
+| 0.1713 | 75.0 | 3375 | 0.5650 | 0.3027 |
+| 0.1464 | 76.0 | 3420 | 0.5860 | 0.3063 |
+| 0.1551 | 77.0 | 3465 | 0.5755 | 0.3027 |
+| 0.1509 | 78.0 | 3510 | 0.5895 | 0.2944 |
+| 0.176 | 79.0 | 3555 | 0.5750 | 0.2992 |
+| 0.1695 | 80.0 | 3600 | 0.5759 | 0.3004 |
+| 0.1797 | 81.0 | 3645 | 0.5904 | 0.2992 |
+| 0.1371 | 82.0 | 3690 | 0.5923 | 0.3015 |
+| 0.1798 | 83.0 | 3735 | 0.5864 | 0.2992 |
+| 0.1386 | 84.0 | 3780 | 0.5733 | 0.3004 |
+| 0.2173 | 85.0 | 3825 | 0.5751 | 0.3004 |
+| 0.151 | 86.0 | 3870 | 0.5711 | 0.2968 |
+| 0.1579 | 87.0 | 3915 | 0.5750 | 0.2992 |
+| 0.1328 | 88.0 | 3960 | 0.5764 | 0.2944 |
+| 0.1657 | 89.0 | 4005 | 0.5769 | 0.3004 |
+| 0.1353 | 90.0 | 4050 | 0.5715 | 0.2956 |
+| 0.1982 | 91.0 | 4095 | 0.5754 | 0.2968 |
+| 0.1687 | 92.0 | 4140 | 0.5725 | 0.2980 |
+| 0.1842 | 93.0 | 4185 | 0.5750 | 0.2980 |
+| 0.1893 | 94.0 | 4230 | 0.5789 | 0.2944 |
+| 0.1744 | 95.0 | 4275 | 0.5750 | 0.3004 |
+| 0.1745 | 96.0 | 4320 | 0.5794 | 0.2980 |
+| 0.1665 | 97.0 | 4365 | 0.5755 | 0.3004 |
+| 0.1569 | 98.0 | 4410 | 0.5763 | 0.2968 |
+| 0.1449 | 99.0 | 4455 | 0.5779 | 0.2968 |
+| 0.1469 | 100.0 | 4500 | 0.5774 | 0.2968 |
 
 
 ### Framework versions
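
The updated card reports Wer 0.2968 as its headline metric. The Trainer typically computes this with a metric library (e.g. `evaluate`/`jiwer`); as a reference for readers, here is a minimal self-contained sketch of word error rate via word-level edit distance. This is an illustration, not the evaluation code used for this run:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    prev = list(range(len(hyp) + 1))  # DP row for the empty reference prefix
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (r != h)))   # substitution (0 if match)
        prev = cur
    return prev[-1] / len(ref)
```

For example, on drum-transcription-style token sequences, `wer("kick snare hihat", "kick snare snare")` gives 1/3 (one substitution over three reference words).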
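
The hyperparameters above combine in two simple ways: the reported total_train_batch_size of 4 is just train_batch_size (2) × gradient_accumulation_steps (2), and `lr_scheduler_type: linear` with `lr_scheduler_warmup_steps: 100` means the learning rate ramps to its peak over 100 steps, then decays linearly to zero over the remaining steps (4500 total here: 100 epochs × 45 steps/epoch). A sketch of that schedule, assuming the standard linear-warmup/linear-decay behavior of transformers' scheduler:

```python
def linear_warmup_lr(step, peak_lr=1e-4, warmup_steps=100, total_steps=4500):
    """LR at a given step: linear warmup to peak_lr, then linear decay to 0."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0, total_steps - step) / (total_steps - warmup_steps)
```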
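
The optimizer line ("Adam with betas=(0.9,0.999) and epsilon=1e-08") uses PyTorch's defaults. For readers unfamiliar with what those constants control, here is a minimal scalar sketch of one Adam update with those values (illustrative only, not the framework's implementation):

```python
def adam_step(param, grad, m, v, t, lr=1e-4, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter; returns (new_param, m, v)."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment (variance) estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction, step t >= 1
    v_hat = v / (1 - beta2 ** t)
    return param - lr * m_hat / (v_hat ** 0.5 + eps), m, v
```

Note how epsilon only guards against division by zero; the betas set how quickly the moment estimates forget old gradients.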