namkyeong committed on
Commit 36917b1
1 Parent(s): 549b3d6

update model card README.md

Files changed (1)
  1. README.md +176 -0
README.md ADDED
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: facebook_wav2vec2-xls-r-300m_meet_tr_p_10_character
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# facebook_wav2vec2-xls-r-300m_meet_tr_p_10_character

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6125
- Cer: 0.1445

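The reported Cer is a character error rate: the character-level edit distance between predictions and references, normalized by reference length. As a quick illustration of how such a score can be computed, the sketch below uses the `cer` metric from the 🤗 `evaluate` library on made-up transcripts; the example strings are purely illustrative and are not taken from the training or evaluation data.

```python
# A minimal sketch of computing a character error rate (CER),
# assuming the `evaluate` and `jiwer` packages are installed.
import evaluate

cer_metric = evaluate.load("cer")

# Made-up reference/prediction pair for illustration only.
references = ["the meeting starts at ten"]
predictions = ["the meting starts at tenn"]

print(cer_metric.compute(predictions=predictions, references=references))
```
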
## Model description

More information needed

## Intended uses & limitations

More information needed

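Since the card does not yet document usage, here is a minimal, hedged inference sketch. It assumes the checkpoint is a standard wav2vec2 CTC model whose repository also ships a compatible `Wav2Vec2Processor`; the repository id `namkyeong/facebook_wav2vec2-xls-r-300m_meet_tr_p_10_character` and the audio file path are assumptions used only for illustration.

```python
# A minimal transcription sketch (not an official usage example); it assumes
# the repo contains a Wav2Vec2Processor and the input audio is 16 kHz mono.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "namkyeong/facebook_wav2vec2-xls-r-300m_meet_tr_p_10_character"  # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an example clip (placeholder path) and resample to 16 kHz if needed.
waveform, sample_rate = torchaudio.load("audio.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```
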
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 7
- eval_batch_size: 7
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3
- mixed_precision_training: Native AMP

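The hyperparameters above map naturally onto a 🤗 `Trainer` setup. The sketch below shows one way such a configuration could be expressed; the output directory and the commented-out model/dataset/collator/metric objects are placeholders, not details taken from the original training script.

```python
# A rough TrainingArguments sketch matching the listed hyperparameters;
# the directory name and the commented-out objects are placeholders.
from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="facebook_wav2vec2-xls-r-300m_meet_tr_p_10_character",
    learning_rate=1e-4,
    per_device_train_batch_size=7,
    per_device_eval_batch_size=7,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=3,
    fp16=True,  # "Native AMP" mixed-precision training
)

# trainer = Trainer(
#     model=model,                      # e.g. a Wav2Vec2ForCTC instance
#     args=training_args,
#     train_dataset=train_dataset,      # placeholder
#     eval_dataset=eval_dataset,        # placeholder
#     data_collator=data_collator,      # placeholder CTC padding collator
#     compute_metrics=compute_metrics,  # placeholder CER computation
# )
# trainer.train()
```
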
### Training results

| Training Loss | Epoch | Step | Cer | Validation Loss |
|:-------------:|:-----:|:-----:|:------:|:---------------:|
| 42.2505 | 0.03 | 300 | 1.0 | 49.1511 |
| 13.6348 | 0.05 | 600 | 1.0 | 5.7557 |
| 5.1009 | 0.08 | 900 | 1.0 | 5.2280 |
| 4.9349 | 0.1 | 1200 | 0.9830 | 5.0621 |
| 4.8724 | 0.13 | 1500 | 0.9740 | 4.9526 |
| 4.7966 | 0.15 | 1800 | 0.9595 | 4.9397 |
| 4.7368 | 0.18 | 2100 | 0.9560 | 4.7570 |
| 4.613 | 0.2 | 2400 | 0.9387 | 4.5097 |
| 4.1484 | 0.23 | 2700 | 0.6766 | 3.6760 |
| 3.4652 | 0.25 | 3000 | 0.5760 | 3.1567 |
| 3.0993 | 0.28 | 3300 | 0.5257 | 2.7454 |
| 2.8765 | 0.3 | 3600 | 0.4912 | 2.6302 |
| 2.6016 | 0.33 | 3900 | 0.4584 | 2.4138 |
| 2.4859 | 0.35 | 4200 | 0.4390 | 2.1625 |
| 2.3619 | 0.38 | 4500 | 0.4184 | 2.0083 |
| 2.2449 | 0.41 | 4800 | 0.4043 | 1.8993 |
| 2.2346 | 0.43 | 5100 | 0.3931 | 1.8297 |
| 2.0855 | 0.46 | 5400 | 0.3815 | 1.7839 |
| 2.0646 | 0.48 | 5700 | 0.3632 | 1.7346 |
| 1.9853 | 0.51 | 6000 | 0.3549 | 1.6617 |
| 1.9231 | 0.53 | 6300 | 0.3483 | 1.5524 |
| 1.8599 | 0.56 | 6600 | 0.3447 | 1.5524 |
| 1.8234 | 0.58 | 6900 | 0.3329 | 1.5756 |
| 1.8244 | 0.61 | 7200 | 0.3303 | 1.4651 |
| 1.7709 | 0.63 | 7500 | 0.3151 | 1.4303 |
| 1.7174 | 0.66 | 7800 | 0.3211 | 1.4079 |
| 1.7407 | 0.68 | 8100 | 0.3067 | 1.3984 |
| 1.7047 | 0.71 | 8400 | 0.3140 | 1.3826 |
| 1.6523 | 0.73 | 8700 | 0.2988 | 1.3349 |
| 1.6664 | 0.76 | 9000 | 0.2981 | 1.3117 |
| 1.6511 | 0.78 | 9300 | 0.2916 | 1.2932 |
| 1.5999 | 0.81 | 9600 | 0.2880 | 1.2659 |
| 1.6036 | 0.84 | 9900 | 0.2884 | 1.2867 |
| 1.5836 | 0.86 | 10200 | 0.2878 | 1.2605 |
| 1.5701 | 0.89 | 10500 | 0.2752 | 1.2592 |
| 1.5447 | 0.91 | 10800 | 0.2733 | 1.3209 |
| 1.537 | 0.94 | 11100 | 0.2694 | 1.2138 |
| 1.5456 | 0.96 | 11400 | 0.2727 | 1.2115 |
| 1.5075 | 0.99 | 11700 | 0.2658 | 1.1827 |
| 1.4665 | 1.01 | 12000 | 0.2643 | 1.1708 |
| 1.3995 | 1.04 | 12300 | 0.2590 | 1.1326 |
| 1.3602 | 1.06 | 12600 | 0.2613 | 1.1597 |
| 1.3734 | 1.09 | 12900 | 0.2619 | 1.1165 |
| 1.4103 | 1.11 | 13200 | 0.2607 | 1.1200 |
| 1.3684 | 1.14 | 13500 | 0.2516 | 1.1350 |
| 1.3654 | 1.16 | 13800 | 0.2516 | 1.0959 |
| 1.317 | 1.19 | 14100 | 0.2492 | 1.1201 |
| 1.3467 | 1.22 | 14400 | 0.2470 | 1.1138 |
| 1.3656 | 1.24 | 14700 | 0.2498 | 1.0575 |
| 1.3532 | 1.27 | 15000 | 0.2416 | 1.0771 |
| 1.3109 | 1.29 | 15300 | 0.2426 | 1.0389 |
| 1.2722 | 1.32 | 15600 | 0.2423 | 1.0465 |
| 1.2786 | 1.34 | 15900 | 0.2435 | 1.0547 |
| 1.3118 | 1.37 | 16200 | 0.2418 | 1.0417 |
| 1.2774 | 1.39 | 16500 | 0.2396 | 1.0232 |
| 1.2686 | 1.42 | 16800 | 0.2376 | 1.0082 |
| 1.2974 | 1.44 | 17100 | 0.2360 | 1.0424 |
| 1.2286 | 1.47 | 17400 | 0.2362 | 0.9912 |
| 1.2505 | 1.49 | 17700 | 0.2335 | 1.0350 |
| 1.2401 | 1.52 | 18000 | 0.2287 | 1.0426 |
| 1.2683 | 1.54 | 18300 | 0.2353 | 0.9930 |
| 1.2632 | 1.57 | 18600 | 0.2269 | 0.9945 |
| 1.2464 | 1.59 | 18900 | 0.2248 | 0.9810 |
| 1.2565 | 1.62 | 19200 | 0.2241 | 0.9859 |
| 1.2462 | 1.65 | 19500 | 0.2263 | 1.0128 |
| 1.244 | 1.67 | 19800 | 0.2256 | 1.0231 |
| 1.1923 | 1.7 | 20100 | 0.2189 | 0.9952 |
| 1.1993 | 1.72 | 20400 | 0.2191 | 0.9601 |
| 1.1992 | 1.75 | 20700 | 0.2171 | 0.9660 |
| 1.1902 | 1.77 | 21000 | 0.2165 | 0.9466 |
| 1.1929 | 1.8 | 21300 | 0.2176 | 0.9196 |
| 1.1703 | 1.82 | 21600 | 0.2137 | 0.9248 |
| 1.1667 | 1.85 | 21900 | 0.2131 | 0.9491 |
| 1.1401 | 1.87 | 22200 | 0.2128 | 0.9040 |
| 1.1689 | 1.9 | 22500 | 0.2113 | 0.9453 |
| 1.1515 | 1.92 | 22800 | 0.2116 | 0.9191 |
| 1.1553 | 1.95 | 23100 | 0.2095 | 0.9255 |
| 1.1657 | 1.97 | 23400 | 0.2102 | 0.9070 |
| 1.1371 | 2.0 | 23700 | 0.2123 | 0.9225 |
| 1.0175 | 2.03 | 24000 | 0.2090 | 0.9125 |
| 1.0356 | 2.05 | 24300 | 0.2060 | 0.8881 |
| 1.0307 | 2.08 | 24600 | 0.2037 | 0.9103 |
| 1.0044 | 2.1 | 24900 | 0.2057 | 0.8796 |
| 1.0662 | 2.13 | 25200 | 0.2022 | 0.8735 |
| 0.9837 | 2.15 | 25500 | 0.2025 | 0.8667 |
| 1.0106 | 2.18 | 25800 | 0.2024 | 0.8756 |
| 1.0179 | 2.2 | 26100 | 0.2038 | 0.8836 |
| 1.0049 | 2.23 | 26400 | 0.1987 | 0.8721 |
| 0.9742 | 2.25 | 26700 | 0.2005 | 0.8609 |
| 0.9918 | 2.28 | 27000 | 0.1985 | 0.8611 |
| 0.9956 | 2.3 | 27300 | 0.1994 | 0.8532 |
| 1.0048 | 2.33 | 27600 | 0.1963 | 0.8572 |
| 0.9873 | 2.35 | 27900 | 0.1954 | 0.8666 |
| 1.003 | 2.38 | 28200 | 0.1952 | 0.8549 |
| 0.9405 | 2.4 | 28500 | 0.1947 | 0.8589 |
| 0.9762 | 2.43 | 28800 | 0.1527 | 0.6396 |
| 0.9279 | 2.45 | 29100 | 0.1535 | 0.6245 |
| 0.954 | 2.48 | 29400 | 0.1502 | 0.6344 |
| 1.0043 | 2.5 | 29700 | 0.1517 | 0.6290 |
| 0.9289 | 2.53 | 30000 | 0.1507 | 0.6228 |
| 0.9303 | 2.56 | 30300 | 0.1504 | 0.6330 |
| 0.9703 | 2.58 | 30600 | 0.1508 | 0.6346 |
| 0.9247 | 2.61 | 30900 | 0.1486 | 0.6222 |
| 0.9083 | 2.63 | 31200 | 0.1480 | 0.6288 |
| 0.914 | 2.66 | 31500 | 0.1478 | 0.6279 |
| 0.9218 | 2.68 | 31800 | 0.1485 | 0.6299 |
| 0.9497 | 2.71 | 32100 | 0.1480 | 0.6266 |
| 0.9258 | 2.73 | 32400 | 0.1469 | 0.6207 |
| 0.9159 | 2.76 | 32700 | 0.1456 | 0.6097 |
| 0.9388 | 2.78 | 33000 | 0.1470 | 0.6127 |
| 0.9427 | 2.81 | 33300 | 0.1454 | 0.6156 |
| 0.9376 | 2.83 | 33600 | 0.1464 | 0.6152 |
| 0.8933 | 2.86 | 33900 | 0.1442 | 0.6144 |
| 0.9053 | 2.88 | 34200 | 0.1440 | 0.6092 |
| 0.9055 | 2.91 | 34500 | 0.1439 | 0.6147 |
| 0.9067 | 2.93 | 34800 | 0.1439 | 0.6146 |
| 0.9103 | 2.96 | 35100 | 0.1442 | 0.6135 |
| 0.9227 | 2.99 | 35400 | 0.1445 | 0.6125 |

### Framework versions

- Transformers 4.17.0
- Pytorch 1.10.0+cu113
- Datasets 1.18.3
- Tokenizers 0.15.2