gary109 committed
Commit 35dfb82
1 Parent(s): d3f6645

update model card README.md

Files changed (1)
  1. README.md +58 -30
README.md CHANGED
@@ -1,8 +1,6 @@
  ---
  license: apache-2.0
  tags:
- - automatic-speech-recognition
- - gary109/AI_Light_Dance
  - generated_from_trainer
  model-index:
  - name: ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1
@@ -14,10 +12,10 @@ should probably proofread and complete it, then remove this comment. -->
 
  # ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1
 
- This model is a fine-tuned version of [gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1](https://huggingface.co/gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1) on the GARY109/AI_LIGHT_DANCE - ONSET-SINGING3 dataset.
+ This model is a fine-tuned version of [gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1](https://huggingface.co/gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.5691
- - Wer: 0.3107
+ - Loss: 0.5592
+ - Wer: 0.2671
 
  ## Model description
 
@@ -36,7 +34,7 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - learning_rate: 1e-05
+ - learning_rate: 8e-06
  - train_batch_size: 4
  - eval_batch_size: 4
  - seed: 42
@@ -44,34 +42,64 @@ The following hyperparameters were used during training:
  - total_train_batch_size: 16
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - lr_scheduler_warmup_steps: 1000
- - num_epochs: 20.0
+ - lr_scheduler_warmup_steps: 5000
+ - num_epochs: 50.0
  - mixed_precision_training: Native AMP
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Wer |
- |:-------------:|:-----:|:----:|:---------------:|:------:|
- | 0.5352 | 1.0 | 288 | 0.6025 | 0.3327 |
- | 0.5365 | 2.0 | 576 | 0.6086 | 0.3480 |
- | 0.5359 | 3.0 | 864 | 0.6111 | 0.3363 |
- | 0.5395 | 4.0 | 1152 | 0.6082 | 0.3416 |
- | 0.5692 | 5.0 | 1440 | 0.5949 | 0.3318 |
- | 0.5592 | 6.0 | 1728 | 0.6046 | 0.3323 |
- | 0.5172 | 7.0 | 2016 | 0.5838 | 0.3185 |
- | 0.5108 | 8.0 | 2304 | 0.6066 | 0.3211 |
- | 0.4981 | 9.0 | 2592 | 0.5958 | 0.3164 |
- | 0.5193 | 10.0 | 2880 | 0.5889 | 0.3144 |
- | 0.4988 | 11.0 | 3168 | 0.5691 | 0.3107 |
- | 0.4966 | 12.0 | 3456 | 0.5908 | 0.3127 |
- | 0.4801 | 13.0 | 3744 | 0.5812 | 0.3110 |
- | 0.5025 | 14.0 | 4032 | 0.5805 | 0.3046 |
- | 0.5048 | 15.0 | 4320 | 0.5906 | 0.3134 |
- | 0.4772 | 16.0 | 4608 | 0.5693 | 0.3010 |
- | 0.4748 | 17.0 | 4896 | 0.5707 | 0.3028 |
- | 0.4745 | 18.0 | 5184 | 0.5709 | 0.3019 |
- | 0.4548 | 19.0 | 5472 | 0.5720 | 0.3004 |
- | 0.4619 | 20.0 | 5760 | 0.5729 | 0.3019 |
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:-----:|:-----:|:---------------:|:------:|
+ | 0.4853 | 1.0 | 288 | 0.5760 | 0.3098 |
+ | 0.48 | 2.0 | 576 | 0.5787 | 0.3085 |
+ | 0.4625 | 3.0 | 864 | 0.5925 | 0.3112 |
+ | 0.4704 | 4.0 | 1152 | 0.6065 | 0.3108 |
+ | 0.4854 | 5.0 | 1440 | 0.6036 | 0.3112 |
+ | 0.4918 | 6.0 | 1728 | 0.6007 | 0.3148 |
+ | 0.4549 | 7.0 | 2016 | 0.6039 | 0.3073 |
+ | 0.4546 | 8.0 | 2304 | 0.6129 | 0.3073 |
+ | 0.4404 | 9.0 | 2592 | 0.6062 | 0.3054 |
+ | 0.4681 | 10.0 | 2880 | 0.6063 | 0.3075 |
+ | 0.469 | 11.0 | 3168 | 0.5881 | 0.3031 |
+ | 0.4903 | 12.0 | 3456 | 0.5913 | 0.3047 |
+ | 0.4677 | 13.0 | 3744 | 0.5921 | 0.3055 |
+ | 0.502 | 14.0 | 4032 | 0.5905 | 0.3042 |
+ | 0.5028 | 15.0 | 4320 | 0.5989 | 0.3088 |
+ | 0.4706 | 16.0 | 4608 | 0.5665 | 0.3066 |
+ | 0.4839 | 17.0 | 4896 | 0.6003 | 0.3111 |
+ | 0.4733 | 18.0 | 5184 | 0.5937 | 0.3039 |
+ | 0.4544 | 19.0 | 5472 | 0.5903 | 0.3025 |
+ | 0.4616 | 20.0 | 5760 | 0.6064 | 0.2968 |
+ | 0.475 | 21.0 | 6048 | 0.5883 | 0.2960 |
+ | 0.4707 | 22.0 | 6336 | 0.5900 | 0.2888 |
+ | 0.4562 | 23.0 | 6624 | 0.5642 | 0.2956 |
+ | 0.455 | 24.0 | 6912 | 0.5732 | 0.2893 |
+ | 0.5011 | 25.0 | 7200 | 0.5612 | 0.2876 |
+ | 0.4658 | 26.0 | 7488 | 0.5631 | 0.2915 |
+ | 0.4423 | 27.0 | 7776 | 0.5668 | 0.2853 |
+ | 0.4287 | 28.0 | 8064 | 0.5664 | 0.2847 |
+ | 0.4634 | 29.0 | 8352 | 0.5687 | 0.2875 |
+ | 0.4413 | 30.0 | 8640 | 0.5684 | 0.2954 |
+ | 0.4385 | 31.0 | 8928 | 0.5602 | 0.2801 |
+ | 0.4557 | 32.0 | 9216 | 0.5637 | 0.2747 |
+ | 0.4344 | 33.0 | 9504 | 0.5690 | 0.2853 |
+ | 0.4264 | 34.0 | 9792 | 0.5653 | 0.2866 |
+ | 0.4395 | 35.0 | 10080 | 0.5764 | 0.2808 |
+ | 0.4278 | 36.0 | 10368 | 0.5758 | 0.2761 |
+ | 0.44 | 37.0 | 10656 | 0.5816 | 0.2770 |
+ | 0.4356 | 38.0 | 10944 | 0.5814 | 0.2784 |
+ | 0.487 | 39.0 | 11232 | 0.5694 | 0.2834 |
+ | 0.44 | 40.0 | 11520 | 0.5637 | 0.2747 |
+ | 0.4151 | 41.0 | 11808 | 0.5683 | 0.2763 |
+ | 0.4208 | 42.0 | 12096 | 0.5720 | 0.2732 |
+ | 0.4354 | 43.0 | 12384 | 0.5657 | 0.2771 |
+ | 0.4304 | 44.0 | 12672 | 0.5735 | 0.2724 |
+ | 0.3991 | 45.0 | 12960 | 0.5638 | 0.2688 |
+ | 0.4348 | 46.0 | 13248 | 0.5639 | 0.2699 |
+ | 0.4291 | 47.0 | 13536 | 0.5577 | 0.2682 |
+ | 0.4252 | 48.0 | 13824 | 0.5611 | 0.2680 |
+ | 0.4253 | 49.0 | 14112 | 0.5621 | 0.2682 |
+ | 0.4298 | 50.0 | 14400 | 0.5592 | 0.2671 |
 
 
  ### Framework versions
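
For readers who want to relate the updated card to actual training code, here is a minimal, hypothetical sketch of how the new hyperparameters (learning_rate 8e-06, per-device batch size 4, effective batch size 16, linear schedule with 5000 warmup steps, 50 epochs, native AMP) could be expressed as Hugging Face `TrainingArguments`. The `output_dir` value and `gradient_accumulation_steps=4` (inferred from train_batch_size 4 vs. total_train_batch_size 16) are assumptions, not taken from the commit.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the setup described in the updated card.
# gradient_accumulation_steps=4 is inferred, and output_dir is a placeholder.
# The Adam betas (0.9, 0.999) and epsilon 1e-08 listed in the card match the
# Trainer defaults, so they are not set explicitly here.
training_args = TrainingArguments(
    output_dir="ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1",
    learning_rate=8e-6,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=5000,
    num_train_epochs=50.0,
    fp16=True,  # "mixed_precision_training: Native AMP"
)
```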
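The card itself does not include a usage snippet; purely as an illustration, the checkpoint named in the card could be loaded for transcription with the `transformers` ASR pipeline roughly as follows. The audio file name is a placeholder, and 16 kHz mono input is assumed (the usual requirement for wav2vec2 XLSR models).

```python
from transformers import pipeline

# Load the fine-tuned checkpoint referenced in this model card.
asr = pipeline(
    "automatic-speech-recognition",
    model="gary109/ai-light-dance_singing3_ft_wav2vec2-large-xlsr-53-v1",
)

# "singing.wav" is a placeholder path; the pipeline returns a dict with "text".
print(asr("singing.wav")["text"])
```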