jadasdn committed 80a604b (parent: 560abf2): End of training

Files changed (1): README.md (+121 -0)

---
license: apache-2.0
base_model: jadasdn/wav2vec2-2
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-3
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-3

This model is a fine-tuned version of [jadasdn/wav2vec2-2](https://huggingface.co/jadasdn/wav2vec2-2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9689
- Wer: 0.3821
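
The checkpoint can be exercised end to end with the `transformers` automatic-speech-recognition pipeline. The snippet below is a minimal usage sketch, assuming this is a CTC-style wav2vec2 checkpoint published as `jadasdn/wav2vec2-3`; `sample.wav` is a placeholder for a 16 kHz mono audio file.

```python
from transformers import pipeline

# Minimal usage sketch (assumption: CTC-based wav2vec2 checkpoint).
# "sample.wav" is a placeholder path to a 16 kHz mono audio file.
asr = pipeline("automatic-speech-recognition", model="jadasdn/wav2vec2-3")
print(asr("sample.wav")["text"])
```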

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
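
As a rough guide, these settings map onto `transformers.TrainingArguments` as sketched below. This is an assumed reconstruction rather than the original training script, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Assumed mapping of the reported hyperparameters onto TrainingArguments;
# the actual training script was not published with this card.
training_args = TrainingArguments(
    output_dir="wav2vec2-3",           # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,                         # "Native AMP" mixed precision
)
```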

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.5494 | 0.5 | 500 | 0.4696 | 0.3879 |
| 0.4979 | 1.0 | 1000 | 0.5284 | 0.4156 |
| 0.438 | 1.5 | 1500 | 0.5127 | 0.4121 |
| 0.4577 | 2.0 | 2000 | 0.5421 | 0.4162 |
| 0.3528 | 2.5 | 2500 | 0.5667 | 0.4117 |
| 0.3685 | 3.0 | 3000 | 0.5354 | 0.4109 |
| 0.3047 | 3.5 | 3500 | 0.5899 | 0.4244 |
| 0.3103 | 4.0 | 4000 | 0.5587 | 0.4107 |
| 0.2626 | 4.5 | 4500 | 0.6265 | 0.4255 |
| 0.2735 | 5.0 | 5000 | 0.6153 | 0.4104 |
| 0.2244 | 5.5 | 5500 | 0.5916 | 0.4058 |
| 0.2418 | 6.0 | 6000 | 0.5921 | 0.4134 |
| 0.2013 | 6.5 | 6500 | 0.6632 | 0.4079 |
| 0.211 | 7.0 | 7000 | 0.6563 | 0.4206 |
| 0.182 | 7.5 | 7500 | 0.6546 | 0.4096 |
| 0.1912 | 8.0 | 8000 | 0.6429 | 0.4043 |
| 0.1621 | 8.5 | 8500 | 0.7105 | 0.4060 |
| 0.1695 | 9.0 | 9000 | 0.6985 | 0.4027 |
| 0.1512 | 9.5 | 9500 | 0.7729 | 0.4060 |
| 0.1521 | 10.0 | 10000 | 0.7261 | 0.4047 |
| 0.1355 | 10.5 | 10500 | 0.7665 | 0.4097 |
| 0.1385 | 11.0 | 11000 | 0.7366 | 0.4044 |
| 0.1275 | 11.5 | 11500 | 0.7494 | 0.4046 |
| 0.1278 | 12.0 | 12000 | 0.7756 | 0.4028 |
| 0.1165 | 12.5 | 12500 | 0.8292 | 0.4064 |
| 0.1176 | 13.0 | 13000 | 0.7808 | 0.4053 |
| 0.1091 | 13.5 | 13500 | 0.8157 | 0.4052 |
| 0.1092 | 14.0 | 14000 | 0.8295 | 0.4071 |
| 0.1024 | 14.5 | 14500 | 0.8617 | 0.4032 |
| 0.1005 | 15.0 | 15000 | 0.8066 | 0.4042 |
| 0.0932 | 15.5 | 15500 | 0.8588 | 0.4022 |
| 0.0932 | 16.0 | 16000 | 0.8429 | 0.4036 |
| 0.087 | 16.5 | 16500 | 0.8975 | 0.4037 |
| 0.0866 | 17.0 | 17000 | 0.8916 | 0.3976 |
| 0.0835 | 17.5 | 17500 | 0.8566 | 0.3955 |
| 0.0828 | 18.0 | 18000 | 0.8581 | 0.3993 |
| 0.0756 | 18.5 | 18500 | 0.8827 | 0.3962 |
| 0.0789 | 19.0 | 19000 | 0.9072 | 0.3957 |
| 0.071 | 19.5 | 19500 | 0.9349 | 0.3920 |
| 0.0719 | 20.0 | 20000 | 0.9064 | 0.3931 |
| 0.0698 | 20.5 | 20500 | 0.9208 | 0.3932 |
| 0.0671 | 21.0 | 21000 | 0.8935 | 0.3927 |
| 0.0656 | 21.5 | 21500 | 0.9271 | 0.3936 |
| 0.0612 | 22.0 | 22000 | 0.9792 | 0.3979 |
| 0.0576 | 22.5 | 22500 | 0.9530 | 0.3917 |
| 0.0588 | 23.0 | 23000 | 0.9617 | 0.3928 |
| 0.0532 | 23.5 | 23500 | 0.9754 | 0.3854 |
| 0.0563 | 24.0 | 24000 | 0.9559 | 0.3915 |
| 0.053 | 24.5 | 24500 | 0.9845 | 0.3894 |
| 0.0512 | 25.0 | 25000 | 0.9629 | 0.3876 |
| 0.0516 | 25.5 | 25500 | 0.9565 | 0.3873 |
| 0.0491 | 26.0 | 26000 | 0.9726 | 0.3877 |
| 0.0473 | 26.5 | 26500 | 0.9831 | 0.3877 |
| 0.0441 | 27.0 | 27000 | 0.9645 | 0.3839 |
| 0.042 | 27.5 | 27500 | 1.0040 | 0.3851 |
| 0.0458 | 28.0 | 28000 | 0.9685 | 0.3839 |
| 0.043 | 28.5 | 28500 | 0.9687 | 0.3841 |
| 0.0399 | 29.0 | 29000 | 0.9764 | 0.3831 |
| 0.0421 | 29.5 | 29500 | 0.9691 | 0.3828 |
| 0.041 | 30.0 | 30000 | 0.9689 | 0.3821 |
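
Wer here is word error rate (lower is better); the final checkpoint reaches 0.3821. Below is a minimal sketch of computing the same metric with the `evaluate` library on toy strings, assuming the card's Wer matches the standard "wer" implementation:

```python
import evaluate

# Word error rate on toy data: 1 substitution over 6 reference words.
wer = evaluate.load("wer")
score = wer.compute(
    predictions=["the cat sat on the mat"],
    references=["the cat sat on a mat"],
)
print(round(score, 4))  # 0.1667
```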


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
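
To check that a local environment matches these reported versions, a trivial sketch:

```python
import transformers, torch, datasets, tokenizers

# Compare the local environment against the versions reported above.
print(transformers.__version__)  # expected: 4.35.2
print(torch.__version__)         # expected: 2.1.0+cu118
print(datasets.__version__)      # expected: 2.15.0
print(tokenizers.__version__)    # expected: 0.15.0
```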