bigmorning committed
Commit 54dcb70
1 parent: 5ea480e

Upload TFWhisperForConditionalGeneration

Files changed (4):
1. README.md +171 -0
2. config.json +64 -0
3. generation_config.json +117 -0
4. tf_model.h5 +3 -0
README.md ADDED
@@ -0,0 +1,171 @@
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_syl_cv12_pad_lob100_low__0115
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# whisper_syl_cv12_pad_lob100_low__0115

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0015
- Train Accuracy: 0.0362
- Train Wermet: 0.0015
- Validation Loss: 0.6863
- Validation Accuracy: 0.0236
- Validation Wermet: 0.2234
- Epoch: 114
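The card tracks a "Wermet" metric, which appears to be a word-error-rate-style measure (the exact implementation used during training is not specified in the card). As an illustration only, word error rate is conventionally computed as the word-level edit distance divided by the reference length:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count.

    Illustrative sketch; the training script's actual "wermet" metric may differ.
    Assumes a non-empty reference.
    """
    ref, hyp = reference.split(), hypothesis.split()
    # Single-row dynamic-programming table over hypothesis positions.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i  # prev holds the diagonal (previous row, previous column)
        for j, h in enumerate(hyp, 1):
            # min over deletion, insertion, and substitution/match
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[-1] / len(ref)
```

For example, `wer("a b c", "a x c")` is one substitution over three reference words, i.e. 1/3.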
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
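The optimizer above is AdamWeightDecay, i.e. Adam with decoupled weight decay. As a rough sketch of what one update step does with these hyperparameters (this is a scalar illustration, not the actual Keras/transformers optimizer class used in training):

```python
import math

def adamw_step(w, g, m, v, t, lr=1e-5, beta1=0.9, beta2=0.999,
               eps=1e-7, weight_decay=0.01):
    """One decoupled-weight-decay Adam update on a single scalar weight.

    w: weight, g: gradient, m/v: first/second moment estimates, t: step (1-based).
    Hyperparameter defaults mirror the card's optimizer settings.
    """
    m = beta1 * m + (1 - beta1) * g        # first-moment EMA
    v = beta2 * v + (1 - beta2) * g * g    # second-moment EMA
    m_hat = m / (1 - beta1 ** t)           # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Decay is applied directly to the weight, decoupled from the gradient term.
    w = w - lr * (m_hat / (math.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v
```

The decoupled form means the weight-decay term does not pass through the adaptive denominator, which is the distinguishing design choice of AdamW over Adam-with-L2.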
### Training results

| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 5.2930     | 0.0113         | 2.0658       | 3.9415          | 0.0117              | 0.9401            | 0     |
| 4.6215     | 0.0121         | 0.8917       | 3.7803          | 0.0120              | 0.9294            | 1     |
| 4.4086     | 0.0128         | 0.8403       | 3.6070          | 0.0124              | 0.9223            | 2     |
| 4.1842     | 0.0135         | 0.8337       | 3.4291          | 0.0128              | 0.8867            | 3     |
| 3.9981     | 0.0141         | 0.8182       | 3.3251          | 0.0131              | 0.8750            | 4     |
| 3.8531     | 0.0145         | 0.8058       | 3.2385          | 0.0133              | 0.8699            | 5     |
| 3.7345     | 0.0149         | 0.7925       | 3.1751          | 0.0134              | 0.8665            | 6     |
| 3.6307     | 0.0152         | 0.7851       | 3.1031          | 0.0136              | 0.8507            | 7     |
| 3.5437     | 0.0155         | 0.7717       | 3.0752          | 0.0138              | 0.8286            | 8     |
| 3.4649     | 0.0157         | 0.7651       | 3.0334          | 0.0139              | 0.8417            | 9     |
| 3.3926     | 0.0159         | 0.7531       | 3.0022          | 0.0139              | 0.8413            | 10    |
| 3.3262     | 0.0162         | 0.7462       | 2.9669          | 0.0140              | 0.8264            | 11    |
| 3.2625     | 0.0164         | 0.7367       | 2.9342          | 0.0141              | 0.8520            | 12    |
| 3.1979     | 0.0166         | 0.7231       | 2.9046          | 0.0144              | 0.8196            | 13    |
| 3.1319     | 0.0169         | 0.7133       | 2.8607          | 0.0145              | 0.8026            | 14    |
| 3.0616     | 0.0172         | 0.7007       | 2.8165          | 0.0146              | 0.7788            | 15    |
| 2.9792     | 0.0176         | 0.6816       | 2.7552          | 0.0149              | 0.7643            | 16    |
| 2.8905     | 0.0180         | 0.6641       | 2.6788          | 0.0151              | 0.7473            | 17    |
| 2.7749     | 0.0186         | 0.6424       | 2.5824          | 0.0155              | 0.7241            | 18    |
| 2.6263     | 0.0193         | 0.6159       | 2.4206          | 0.0161              | 0.7047            | 19    |
| 2.4352     | 0.0203         | 0.5829       | 2.2230          | 0.0168              | 0.6500            | 20    |
| 2.1941     | 0.0216         | 0.5411       | 2.0349          | 0.0175              | 0.5980            | 21    |
| 1.9184     | 0.0231         | 0.4922       | 1.7850          | 0.0184              | 0.5659            | 22    |
| 1.6174     | 0.0249         | 0.4371       | 1.5664          | 0.0192              | 0.5081            | 23    |
| 1.3542     | 0.0265         | 0.3851       | 1.3992          | 0.0199              | 0.4690            | 24    |
| 1.1499     | 0.0278         | 0.3408       | 1.2512          | 0.0205              | 0.4299            | 25    |
| 0.9878     | 0.0288         | 0.3029       | 1.1479          | 0.0209              | 0.4013            | 26    |
| 0.8600     | 0.0297         | 0.2735       | 1.0527          | 0.0213              | 0.3755            | 27    |
| 0.7516     | 0.0305         | 0.2441       | 0.9803          | 0.0216              | 0.3570            | 28    |
| 0.6626     | 0.0311         | 0.2197       | 0.9314          | 0.0219              | 0.3416            | 29    |
| 0.5863     | 0.0316         | 0.1993       | 0.8730          | 0.0221              | 0.3238            | 30    |
| 0.5187     | 0.0321         | 0.1775       | 0.8357          | 0.0223              | 0.3136            | 31    |
| 0.4608     | 0.0326         | 0.1610       | 0.8059          | 0.0224              | 0.3033            | 32    |
| 0.4087     | 0.0330         | 0.1467       | 0.7746          | 0.0226              | 0.2949            | 33    |
| 0.3642     | 0.0334         | 0.1298       | 0.7476          | 0.0227              | 0.2847            | 34    |
| 0.3221     | 0.0337         | 0.1168       | 0.7330          | 0.0228              | 0.2802            | 35    |
| 0.2837     | 0.0340         | 0.1030       | 0.7093          | 0.0229              | 0.2728            | 36    |
| 0.2509     | 0.0343         | 0.0882       | 0.6941          | 0.0229              | 0.2687            | 37    |
| 0.2209     | 0.0346         | 0.0747       | 0.6892          | 0.0230              | 0.2656            | 38    |
| 0.1934     | 0.0349         | 0.0670       | 0.6824          | 0.0230              | 0.2630            | 39    |
| 0.1688     | 0.0351         | 0.0542       | 0.6773          | 0.0230              | 0.2625            | 40    |
| 0.1469     | 0.0353         | 0.0429       | 0.6700          | 0.0231              | 0.2633            | 41    |
| 0.1268     | 0.0355         | 0.0365       | 0.6680          | 0.0231              | 0.2578            | 42    |
| 0.1086     | 0.0357         | 0.0284       | 0.6643          | 0.0231              | 0.2540            | 43    |
| 0.0920     | 0.0358         | 0.0221       | 0.6645          | 0.0231              | 0.2530            | 44    |
| 0.0783     | 0.0359         | 0.0169       | 0.6621          | 0.0232              | 0.2540            | 45    |
| 0.0667     | 0.0360         | 0.0121       | 0.6714          | 0.0232              | 0.2532            | 46    |
| 0.0563     | 0.0361         | 0.0094       | 0.6604          | 0.0232              | 0.2503            | 47    |
| 0.0477     | 0.0361         | 0.0072       | 0.6620          | 0.0232              | 0.2489            | 48    |
| 0.0397     | 0.0362         | 0.0055       | 0.6611          | 0.0232              | 0.2502            | 49    |
| 0.0330     | 0.0362         | 0.0045       | 0.6686          | 0.0232              | 0.2496            | 50    |
| 0.0283     | 0.0362         | 0.0033       | 0.6705          | 0.0232              | 0.2503            | 51    |
| 0.0242     | 0.0362         | 0.0034       | 0.6686          | 0.0232              | 0.2486            | 52    |
| 0.0212     | 0.0362         | 0.0031       | 0.6686          | 0.0232              | 0.2493            | 53    |
| 0.0197     | 0.0362         | 0.0028       | 0.6688          | 0.0232              | 0.2530            | 54    |
| 0.0226     | 0.0362         | 0.0041       | 0.6598          | 0.0233              | 0.2451            | 55    |
| 0.0158     | 0.0362         | 0.0024       | 0.6605          | 0.0233              | 0.2428            | 56    |
| 0.0115     | 0.0362         | 0.0018       | 0.6648          | 0.0233              | 0.2435            | 57    |
| 0.0094     | 0.0362         | 0.0017       | 0.6672          | 0.0233              | 0.2446            | 58    |
| 0.0081     | 0.0362         | 0.0018       | 0.6731          | 0.0233              | 0.2439            | 59    |
| 0.0071     | 0.0362         | 0.0017       | 0.6762          | 0.0233              | 0.2429            | 60    |
| 0.0062     | 0.0362         | 0.0017       | 0.6794          | 0.0233              | 0.2426            | 61    |
| 0.0055     | 0.0362         | 0.0017       | 0.6825          | 0.0233              | 0.2429            | 62    |
| 0.0048     | 0.0362         | 0.0017       | 0.6895          | 0.0233              | 0.2450            | 63    |
| 0.0042     | 0.0362         | 0.0019       | 0.6914          | 0.0233              | 0.2424            | 64    |
| 0.0037     | 0.0362         | 0.0018       | 0.6938          | 0.0233              | 0.2423            | 65    |
| 0.0224     | 0.0361         | 0.0080       | 0.6695          | 0.0234              | 0.2409            | 66    |
| 0.0127     | 0.0362         | 0.0037       | 0.6685          | 0.0234              | 0.2383            | 67    |
| 0.0065     | 0.0362         | 0.0017       | 0.6714          | 0.0234              | 0.2359            | 68    |
| 0.0045     | 0.0362         | 0.0017       | 0.6645          | 0.0234              | 0.2347            | 69    |
| 0.0034     | 0.0362         | 0.0016       | 0.6671          | 0.0234              | 0.2353            | 70    |
| 0.0028     | 0.0362         | 0.0014       | 0.6715          | 0.0234              | 0.2354            | 71    |
| 0.0024     | 0.0362         | 0.0014       | 0.6745          | 0.0234              | 0.2358            | 72    |
| 0.0022     | 0.0362         | 0.0014       | 0.6778          | 0.0234              | 0.2356            | 73    |
| 0.0020     | 0.0362         | 0.0013       | 0.6797          | 0.0234              | 0.2357            | 74    |
| 0.0018     | 0.0362         | 0.0014       | 0.6833          | 0.0234              | 0.2355            | 75    |
| 0.0016     | 0.0362         | 0.0013       | 0.6885          | 0.0234              | 0.2363            | 76    |
| 0.0068     | 0.0362         | 0.0035       | 0.7270          | 0.0232              | 0.2500            | 77    |
| 0.0131     | 0.0362         | 0.0076       | 0.6965          | 0.0234              | 0.2397            | 78    |
| 0.0054     | 0.0362         | 0.0088       | 0.6764          | 0.0235              | 0.2339            | 79    |
| 0.0029     | 0.0362         | 0.0041       | 0.6806          | 0.0235              | 0.2334            | 80    |
| 0.0019     | 0.0362         | 0.0039       | 0.6723          | 0.0235              | 0.2316            | 81    |
| 0.0016     | 0.0362         | 0.0028       | 0.6765          | 0.0235              | 0.2315            | 82    |
| 0.0014     | 0.0362         | 0.0025       | 0.6786          | 0.0235              | 0.2306            | 83    |
| 0.0013     | 0.0362         | 0.0023       | 0.6805          | 0.0235              | 0.2304            | 84    |
| 0.0012     | 0.0362         | 0.0022       | 0.6830          | 0.0235              | 0.2301            | 85    |
| 0.0011     | 0.0362         | 0.0022       | 0.6881          | 0.0235              | 0.2308            | 86    |
| 0.0010     | 0.0362         | 0.0022       | 0.6875          | 0.0235              | 0.2303            | 87    |
| 0.0009     | 0.0362         | 0.0022       | 0.6909          | 0.0235              | 0.2307            | 88    |
| 0.0008     | 0.0362         | 0.0020       | 0.6934          | 0.0235              | 0.2299            | 89    |
| 0.0007     | 0.0362         | 0.0022       | 0.6968          | 0.0235              | 0.2307            | 90    |
| 0.0007     | 0.0362         | 0.0020       | 0.7005          | 0.0235              | 0.2300            | 91    |
| 0.0006     | 0.0362         | 0.0021       | 0.7040          | 0.0235              | 0.2307            | 92    |
| 0.0006     | 0.0362         | 0.0020       | 0.7086          | 0.0235              | 0.2309            | 93    |
| 0.0005     | 0.0362         | 0.0020       | 0.7116          | 0.0235              | 0.2318            | 94    |
| 0.0005     | 0.0362         | 0.0018       | 0.7151          | 0.0235              | 0.2305            | 95    |
| 0.0111     | 0.0362         | 0.2014       | 0.7185          | 0.0234              | 0.2861            | 96    |
| 0.0069     | 0.0362         | 0.0051       | 0.7036          | 0.0235              | 0.2337            | 97    |
| 0.0028     | 0.0362         | 0.0015       | 0.6946          | 0.0235              | 0.2324            | 98    |
| 0.0023     | 0.0362         | 0.0018       | 0.6937          | 0.0235              | 0.2295            | 99    |
| 0.0017     | 0.0362         | 0.0013       | 0.6886          | 0.0235              | 0.2283            | 100   |
| 0.0010     | 0.0362         | 0.0008       | 0.6891          | 0.0236              | 0.2274            | 101   |
| 0.0009     | 0.0362         | 0.0013       | 0.6901          | 0.0236              | 0.2275            | 102   |
| 0.0008     | 0.0362         | 0.0015       | 0.6922          | 0.0236              | 0.2273            | 103   |
| 0.0006     | 0.0362         | 0.0015       | 0.6923          | 0.0236              | 0.2274            | 104   |
| 0.0008     | 0.0362         | 0.0014       | 0.6996          | 0.0235              | 0.2288            | 105   |
| 0.0006     | 0.0362         | 0.0014       | 0.6967          | 0.0236              | 0.2266            | 106   |
| 0.0005     | 0.0362         | 0.0013       | 0.6988          | 0.0236              | 0.2260            | 107   |
| 0.0004     | 0.0362         | 0.0027       | 0.7008          | 0.0236              | 0.2278            | 108   |
| 0.0004     | 0.0362         | 0.0017       | 0.7034          | 0.0236              | 0.2261            | 109   |
| 0.0004     | 0.0362         | 0.0018       | 0.7036          | 0.0236              | 0.2265            | 110   |
| 0.0004     | 0.0362         | 0.0015       | 0.7090          | 0.0236              | 0.2255            | 111   |
| 0.0112     | 0.0362         | 0.0059       | 0.7014          | 0.0235              | 0.2271            | 112   |
| 0.0034     | 0.0362         | 0.0023       | 0.6869          | 0.0236              | 0.2252            | 113   |
| 0.0015     | 0.0362         | 0.0015       | 0.6863          | 0.0236              | 0.2234            | 114   |

### Framework versions

- Transformers 4.33.0.dev0
- TensorFlow 2.13.0
- Tokenizers 0.13.3
config.json ADDED
@@ -0,0 +1,64 @@
{
  "_name_or_path": "openai/whisper-tiny",
  "activation_dropout": 0.0,
  "activation_function": "gelu",
  "apply_spec_augment": false,
  "architectures": [
    "WhisperForConditionalGeneration"
  ],
  "attention_dropout": 0.0,
  "begin_suppress_tokens": [
    50257
  ],
  "bos_token_id": 50257,
  "classifier_proj_size": 256,
  "d_model": 384,
  "decoder_attention_heads": 6,
  "decoder_ffn_dim": 1536,
  "decoder_layerdrop": 0.0,
  "decoder_layers": 4,
  "decoder_start_token_id": 50258,
  "dropout": 0.0,
  "encoder_attention_heads": 6,
  "encoder_ffn_dim": 1536,
  "encoder_layerdrop": 0.0,
  "encoder_layers": 4,
  "eos_token_id": 50257,
  "forced_decoder_ids": [
    [1, 50289],
    [2, 50359],
    [3, 50363]
  ],
  "init_std": 0.02,
  "is_encoder_decoder": true,
  "mask_feature_length": 10,
  "mask_feature_min_masks": 0,
  "mask_feature_prob": 0.0,
  "mask_time_length": 10,
  "mask_time_min_masks": 2,
  "mask_time_prob": 0.05,
  "max_length": 448,
  "max_source_positions": 1500,
  "max_target_positions": 448,
  "median_filter_width": 7,
  "model_type": "whisper",
  "num_hidden_layers": 4,
  "num_mel_bins": 80,
  "pad_token_id": -100,
  "scale_embedding": false,
  "suppress_tokens": [],
  "torch_dtype": "float32",
  "transformers_version": "4.33.0.dev0",
  "use_cache": true,
  "use_weighted_layer_sum": false,
  "vocab_size": 51865
}
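The config describes the whisper-tiny shape: 4 encoder and 4 decoder layers, hidden size 384, and 6 attention heads per layer. A quick illustrative sanity check (not part of the repo) is that `d_model` must divide evenly across the heads:

```python
# Subset of the config.json values above.
config = {
    "d_model": 384,
    "encoder_attention_heads": 6,
    "decoder_attention_heads": 6,
    "encoder_layers": 4,
    "decoder_layers": 4,
}

# Each attention head operates on d_model / num_heads dimensions.
assert config["d_model"] % config["encoder_attention_heads"] == 0
head_dim = config["d_model"] // config["encoder_attention_heads"]
print(head_dim)  # 64
```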
generation_config.json ADDED
@@ -0,0 +1,117 @@
{
  "_from_model_config": true,
  "begin_suppress_tokens": [
    220,
    50257
  ],
  "bos_token_id": 50257,
  "decoder_start_token_id": 50258,
  "eos_token_id": 50257,
  "forced_decoder_ids": [
    [1, 50259],
    [2, 50359],
    [3, 50363]
  ],
  "max_length": 448,
  "pad_token_id": 50257,
  "suppress_tokens": [
    1, 2, 7, 8, 9, 10, 14, 25, 26, 27, 28, 29, 31, 58, 59, 60, 61, 62, 63,
    90, 91, 92, 93, 359, 503, 522, 542, 873, 893, 902, 918, 922, 931, 1350,
    1853, 1982, 2460, 2627, 3246, 3253, 3268, 3536, 3846, 3961, 4183, 4667,
    6585, 6647, 7273, 9061, 9383, 10428, 10929, 11938, 12033, 12331, 12562,
    13793, 14157, 14635, 15265, 15618, 16553, 16604, 18362, 18956, 20075,
    21675, 22520, 26130, 26161, 26435, 28279, 29464, 31650, 32302, 32470,
    36865, 42863, 47425, 49870, 50254, 50258, 50358, 50359, 50360, 50361,
    50362
  ],
  "transformers_version": "4.33.0.dev0"
}
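`forced_decoder_ids` pins specific token ids at decode positions 1-3 (in Whisper these slots conventionally carry the language, task, and timestamp-mode tokens). A minimal sketch of how such forcing works at each decoding step (a hypothetical illustration, not the transformers logits-processor implementation):

```python
# Position -> forced token id, taken from forced_decoder_ids above.
FORCED = {1: 50259, 2: 50359, 3: 50363}

def apply_forcing(step: int, logits: list[float]) -> list[float]:
    """If this decode position has a forced token, mask every other logit
    to -inf so only the forced token can be sampled; otherwise pass through."""
    if step in FORCED:
        forced = FORCED[step]
        return [0.0 if i == forced else float("-inf") for i in range(len(logits))]
    return logits
```

Tokens in `suppress_tokens` are handled analogously, except their logits alone are set to -inf at every step rather than only at fixed positions.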
tf_model.h5 ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:17310c245554fcba112efb807d081dac1587ec46b96a5b122fe6057921f3b62d
size 151253960
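The tf_model.h5 entry is a Git LFS pointer file, not the weights themselves: each line is a `key value` pair (spec version, SHA-256 object id, byte size), and the real ~151 MB file is fetched on checkout. A minimal sketch of parsing such a pointer (the helper name is hypothetical):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:17310c245554fcba112efb807d081dac1587ec46b96a5b122fe6057921f3b62d
size 151253960"""

info = parse_lfs_pointer(pointer)
```

Here `info["size"]` gives the true payload size in bytes, which is why the diff for tf_model.h5 shows only 3 added lines.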