tuanio committed on
Commit
fbabb49
1 Parent(s): f1e9ece

End of training

Files changed (2)
  1. README.md +144 -0
  2. trainer_state.json +0 -0
README.md ADDED
@@ -0,0 +1,144 @@
---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: 1-char-based-freeze_cnn-dropout0.1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# 1-char-based-freeze_cnn-dropout0.1

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2454
- Wer: 0.1804

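
A minimal inference sketch, assuming the checkpoint is published under the hypothetical repo id `tuanio/1-char-based-freeze_cnn-dropout0.1` and fed 16 kHz mono audio (the sampling rate wav2vec2 models expect); adjust both to your setup.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repo id; replace with wherever this checkpoint is actually hosted.
model_id = "tuanio/1-char-based-freeze_cnn-dropout0.1"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id).eval()

# Load an audio file, downmix to mono, and resample to 16 kHz.
waveform, sample_rate = torchaudio.load("example.wav")
waveform = torchaudio.functional.resample(waveform.mean(dim=0), sample_rate, 16_000)

inputs = processor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding; the run name suggests a character-based vocabulary.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```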
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 12
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 48
- total_eval_batch_size: 16
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200000
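
Illustrative only: the hyperparameters listed above could roughly be expressed as a `transformers.TrainingArguments` object as sketched below. The per-device batch sizes of 12/4 across 4 GPUs give the listed totals of 48/16; the output directory and the 2500-step eval/save interval (read off the results table) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="1-char-based-freeze_cnn-dropout0.1",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=12,  # 12 x 4 devices = total train batch size 48
    per_device_eval_batch_size=4,    # 4 x 4 devices = total eval batch size 16
    seed=42,
    max_steps=200_000,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",     # assumption based on the 2500-step eval cadence in the table
    eval_steps=2500,
    save_steps=2500,
)
```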
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:------:|:---------------:|:------:|
| 5.3032 | 0.09 | 2500 | 7.0530 | 1.0 |
| 3.4521 | 0.18 | 5000 | 3.6019 | 1.0 |
| 3.3037 | 0.26 | 7500 | 3.4931 | 1.0 |
| 3.2012 | 0.35 | 10000 | 3.4193 | 1.0 |
| 2.3026 | 0.44 | 12500 | 1.9423 | 0.9873 |
| 1.4238 | 0.53 | 15000 | 0.8772 | 0.6695 |
| 1.1592 | 0.62 | 17500 | 0.6630 | 0.5011 |
| 0.861 | 0.7 | 20000 | 0.5460 | 0.4239 |
| 0.8123 | 0.79 | 22500 | 0.4794 | 0.3830 |
| 0.7568 | 0.88 | 25000 | 0.4369 | 0.3463 |
| 0.7182 | 0.97 | 27500 | 0.4111 | 0.3289 |
| 0.6896 | 1.06 | 30000 | 0.4041 | 0.3102 |
| 0.6655 | 1.14 | 32500 | 0.3933 | 0.2986 |
| 0.5738 | 1.23 | 35000 | 0.3676 | 0.2829 |
| 0.6361 | 1.32 | 37500 | 0.3533 | 0.2727 |
| 0.6142 | 1.41 | 40000 | 0.3545 | 0.2716 |
| 0.6346 | 1.5 | 42500 | 0.3428 | 0.2615 |
| 0.5739 | 1.58 | 45000 | 0.3470 | 0.2578 |
| 0.544 | 1.67 | 47500 | 0.3207 | 0.2490 |
| 0.5283 | 1.76 | 50000 | 0.3202 | 0.2424 |
| 0.5552 | 1.85 | 52500 | 0.3187 | 0.2379 |
| 0.5218 | 1.94 | 55000 | 0.3242 | 0.2383 |
| 0.4939 | 2.02 | 57500 | 0.3277 | 0.2418 |
| 0.5141 | 2.11 | 60000 | 0.3058 | 0.2329 |
| 0.5189 | 2.2 | 62500 | 0.3086 | 0.2273 |
| 0.4993 | 2.29 | 65000 | 0.3005 | 0.2245 |
| 0.5156 | 2.38 | 67500 | 0.2998 | 0.2223 |
| 0.4787 | 2.46 | 70000 | 0.2940 | 0.2173 |
| 0.5296 | 2.55 | 72500 | 0.3003 | 0.2225 |
| 0.4759 | 2.64 | 75000 | 0.2995 | 0.2144 |
| 0.485 | 2.73 | 77500 | 0.2882 | 0.2126 |
| 0.4888 | 2.82 | 80000 | 0.2893 | 0.2189 |
| 0.51 | 2.9 | 82500 | 0.2767 | 0.2046 |
| 0.4703 | 2.99 | 85000 | 0.2899 | 0.2124 |
| 0.4406 | 3.08 | 87500 | 0.2787 | 0.2068 |
| 0.4328 | 3.17 | 90000 | 0.2823 | 0.2070 |
| 0.4399 | 3.26 | 92500 | 0.2802 | 0.2058 |
| 0.4788 | 3.34 | 95000 | 0.2741 | 0.2084 |
| 0.4621 | 3.43 | 97500 | 0.2817 | 0.2038 |
| 0.523 | 3.52 | 100000 | 0.2735 | 0.2015 |
| 0.4689 | 3.61 | 102500 | 0.2631 | 0.1995 |
| 0.4502 | 3.7 | 105000 | 0.2689 | 0.1986 |
| 0.4402 | 3.78 | 107500 | 0.2726 | 0.1987 |
| 0.4189 | 3.87 | 110000 | 0.2724 | 0.1994 |
| 0.4526 | 3.96 | 112500 | 0.2596 | 0.1918 |
| 0.4755 | 4.05 | 115000 | 0.2583 | 0.1900 |
| 0.4374 | 4.14 | 117500 | 0.2590 | 0.1944 |
| 0.4155 | 4.23 | 120000 | 0.2695 | 0.1961 |
| 0.4463 | 4.31 | 122500 | 0.2605 | 0.1909 |
| 0.4007 | 4.4 | 125000 | 0.2529 | 0.1891 |
| 0.4156 | 4.49 | 127500 | 0.2568 | 0.1913 |
| 0.4124 | 4.58 | 130000 | 0.2559 | 0.1889 |
| 0.4085 | 4.67 | 132500 | 0.2610 | 0.1922 |
| 0.4474 | 4.75 | 135000 | 0.2588 | 0.1961 |
| 0.4098 | 4.84 | 137500 | 0.2512 | 0.1877 |
| 0.3941 | 4.93 | 140000 | 0.2549 | 0.1891 |
| 0.3917 | 5.02 | 142500 | 0.2544 | 0.1863 |
| 0.4324 | 5.11 | 145000 | 0.2564 | 0.1882 |
| 0.4255 | 5.19 | 147500 | 0.2536 | 0.1885 |
| 0.3894 | 5.28 | 150000 | 0.2538 | 0.1860 |
| 0.4108 | 5.37 | 152500 | 0.2539 | 0.1860 |
| 0.4312 | 5.46 | 155000 | 0.2526 | 0.1849 |
| 0.3786 | 5.55 | 157500 | 0.2504 | 0.1837 |
| 0.4033 | 5.63 | 160000 | 0.2516 | 0.1852 |
| 0.3973 | 5.72 | 162500 | 0.2570 | 0.1870 |
| 0.3994 | 5.81 | 165000 | 0.2499 | 0.1846 |
| 0.4183 | 5.9 | 167500 | 0.2489 | 0.1835 |
| 0.3826 | 5.99 | 170000 | 0.2468 | 0.1847 |
| 0.4103 | 6.07 | 172500 | 0.2477 | 0.1806 |
| 0.4291 | 6.16 | 175000 | 0.2492 | 0.1835 |
| 0.4417 | 6.25 | 177500 | 0.2464 | 0.1824 |
| 0.3962 | 6.34 | 180000 | 0.2476 | 0.1815 |
| 0.4633 | 6.43 | 182500 | 0.2447 | 0.1818 |
| 0.422 | 6.51 | 185000 | 0.2455 | 0.1802 |
| 0.4098 | 6.6 | 187500 | 0.2488 | 0.1814 |
| 0.4018 | 6.69 | 190000 | 0.2453 | 0.1804 |
| 0.4559 | 6.78 | 192500 | 0.2458 | 0.1823 |
| 0.4033 | 6.87 | 195000 | 0.2451 | 0.1794 |
| 0.3829 | 6.95 | 197500 | 0.2453 | 0.1804 |
| 0.3676 | 7.04 | 200000 | 0.2454 | 0.1804 |
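
For reference, the Wer column above is the word error rate. A minimal sketch of computing it with the `evaluate` library (the transcripts here are placeholder strings, not outputs of this model):

```python
import evaluate

# WER = (substitutions + insertions + deletions) / number of reference words
wer_metric = evaluate.load("wer")
predictions = ["hello wold", "this is a tost"]    # placeholder model transcripts
references = ["hello world", "this is a test"]    # placeholder ground-truth transcripts
print(wer_metric.compute(predictions=predictions, references=references))  # 2 errors / 6 words ~= 0.33
```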
### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1
- Datasets 2.14.5
- Tokenizers 0.14.1
trainer_state.json ADDED
The diff for this file is too large to render. See raw diff